Project description
You'll be part of a dynamic team developing a cutting-edge Analytical Platform for one of the largest resorts and casino companies in Southeast Asia. This mission-critical application is designed to empower client stakeholders with deep insights into customer behavior, enabling data-driven decision-making and strategic business development.
The platform will play a pivotal role in shaping the company's future by:
- Enhancing customer engagement through data-driven personalization and targeted marketing strategies.
- Optimizing business operations by leveraging real-time analytics for revenue forecasting, operational efficiency, and risk management.
- Driving strategic growth by identifying emerging trends, uncovering new revenue opportunities, and improving competitive positioning in the market.
You'll be working with state-of-the-art open-source technologies such as Apache Kafka, Spark, Airflow, ClickHouse, MLflow, and PostgreSQL, collaborating with top-tier engineers, data scientists, and business analysts to deliver a high-performance, scalable, and secure platform.
If you're passionate about big data, advanced analytics, and AI-driven solutions, this is an exciting opportunity to contribute to a transformational project in a fast-paced, high-impact environment.
Requirements
- 7+ years of experience in data architecture, engineering, and database design.
- Expertise in data lakes, data warehousing, ETL/ELT processes, and big data technologies.
- Strong knowledge of Apache Kafka, NiFi, Spark, PostgreSQL, ClickHouse, Apache Iceberg, and Delta Lake.
- Experience with data modeling, schema design, and performance tuning for analytical workloads.
- In-depth understanding of data security, governance, access control, and compliance (e.g., GDPR, SOC 2).
- Familiarity with cloud services (AWS, GCP, Azure) for data storage, processing, and orchestration.
- Hands-on experience with Kubernetes, Docker, and infrastructure-as-code (Terraform, Ansible) for deployment and automation.
- Ability to optimize query performance and handle large-scale distributed data processing.
- Knowledge of real-time data processing and streaming architectures.
- Experience with Metabase, Grafana, or similar tools for data observability, monitoring, and analytics.
- Strong problem-solving skills and ability to design highly scalable, fault-tolerant solutions.
- Experience in collaborating with data scientists, engineers, and business analysts to ensure seamless data integration and usability.
- Upper-Intermediate or higher level of English and Ukrainian.
Responsibilities
- Design the architecture for the open-source-based data analytics platform.
- Develop scalable data models, data pipelines, and data lakes.
- Ensure integration of various data sources, including Kafka, NiFi, Apache Airflow, and Spark.
- Implement modern data platform components like Apache Iceberg, Delta Lake, ClickHouse, and PostgreSQL.
- Define and enforce data governance, security, and compliance best practices.
- Optimize data storage, access, and retrieval for performance and scalability.
- Collaborate with data scientists, engineers, and business analysts to ensure platform usability.
Benefits
- 35 absence days per year for work-life balance
- Udemy courses of your choice
- English courses with native speakers
- Regular soft-skills training
- Excellence Centers meetups
- Online and offline team-building events
- Business trips