Responsibilities:
- Develop, document, and test ELT/ETL solutions using industry-standard tools (Snowflake, Denodo Data Virtualization, Looker).
- Recommend process improvements to increase efficiency and reliability in ELT/ETL development.
- Extract data from multiple sources, consolidate disparate data into a common data model, and load it into a target database, application, or file using efficient ELT/ETL processes.
- Collaborate with Quality Assurance resources to debug ELT/ETL processes and ensure the timely delivery of products.
- Explore and learn new technologies and concepts to deliver the right solutions.
- Stay target- and results-oriented with a strong end-user focus.
- Communicate effectively, both orally and in writing, with the BI team and the user community.
Requirements:
- 5+ years of experience in ETL/ELT design and development, integrating data from heterogeneous OLTP systems and API solutions, and building scalable data warehouse solutions to support business intelligence and analytics.
- Demonstrated experience using Python for data engineering tasks, including transformation, advanced data manipulation, and large-scale data processing.
- Experience in data analysis and root-cause analysis, with proven problem-solving and analytical-thinking capabilities.
- Experience designing complex data pipelines that extract data from RDBMS, JSON, API, and flat-file sources (see the Python sketch after this list).
- Demonstrated expertise in SQL and PL/SQL programming, advanced mastery of business intelligence and data warehouse methodologies, and hands-on experience with one or more relational database systems and cloud-based database services such as Oracle, MySQL, Amazon RDS, Snowflake, and Amazon Redshift.
- Proven ability to analyze and optimize poorly performing queries and ETL/ELT mappings, providing actionable recommendations for performance tuning.
- Understanding of software engineering principles, skill in working on Unix/Linux/Windows operating systems, and experience with Agile methodologies.
- Proficiency with version control systems, including managing code repositories, branching, merging, and collaborating within a distributed development environment.
- Excellent English communication skills.
- Interest in business operations and a comprehensive understanding of how robust BI systems drive corporate profitability by enabling data-driven decision-making and strategic insights.
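
For illustration, the Python sketch below shows the kind of extract-transform-load task described above: pulling JSON from an API, normalizing it, and loading it into a staging table. This is a minimal sketch only; the endpoint, table name, and connection string are hypothetical placeholders.

```python
# Minimal ETL sketch: pull JSON from an API, normalize it, and load it into a
# staging table. The endpoint, table name, and connection string below are
# hypothetical placeholders, not real services.
import requests
import pandas as pd
from sqlalchemy import create_engine

API_URL = "https://api.example.com/v1/orders"  # hypothetical source API
TARGET_TABLE = "stg_orders"                    # hypothetical staging table

def extract(url: str) -> list[dict]:
    """Fetch raw JSON records, failing fast on HTTP errors."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()

def transform(records: list[dict]) -> pd.DataFrame:
    """Flatten nested JSON and coerce types expected by the target schema."""
    df = pd.json_normalize(records)
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    return df.dropna(subset=["order_id"]).drop_duplicates(subset=["order_id"])

def load(df: pd.DataFrame, table: str) -> None:
    """Append into the staging table; the DSN is a placeholder."""
    engine = create_engine("postgresql://user:pass@host:5432/dwh")
    df.to_sql(table, engine, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract(API_URL)), TARGET_TABLE)
```

The same extract/transform/load split extends naturally to flat-file and RDBMS sources by swapping out the extract step.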
Pluses:
- Experience developing ETL/ELT processes within Snowflake and implementing complex data transformations using built-in functions and SQL capabilities (a minimal sketch follows this list).
- Experience using Pentaho Data Integration (Kettle) / Ab Initio ETL tools for designing, developing, and optimizing data integration workflows.
- Experience designing and implementing cloud-based ETL solutions using Azure Data Factory, dbt, AWS Glue, AWS Lambda, and open-source tools.
- Experience with reporting/visualization tools (e.g., Looker) and job scheduler software.
- Experience in the telecom, eCommerce, or international mobile top-up domains.
- Education: BS/MS in Computer Science, Information Systems, or a related technical field, or equivalent industry experience.
- Preferred certifications: AWS Certified Solutions Architect, AWS Certified Data Engineer, Snowflake SnowPro Core.
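
To illustrate the Snowflake plus above, here is a minimal sketch of an in-warehouse transformation: a deduplicating MERGE from a staging table into a target, issued through the Snowflake Python connector. The account, credentials, and table names are hypothetical placeholders.

```python
# Sketch of a Snowflake ELT step: deduplicate a staging table into a target
# with a single MERGE. Account, credentials, and table names are placeholders.
import snowflake.connector

MERGE_SQL = """
MERGE INTO analytics.orders AS tgt
USING (
    SELECT *
    FROM staging.orders
    QUALIFY ROW_NUMBER() OVER (PARTITION BY order_id ORDER BY loaded_at DESC) = 1
) AS src
ON tgt.order_id = src.order_id
WHEN MATCHED THEN UPDATE SET
    tgt.amount = src.amount,
    tgt.updated_at = src.loaded_at
WHEN NOT MATCHED THEN INSERT (order_id, amount, updated_at)
    VALUES (src.order_id, src.amount, src.loaded_at)
"""

conn = snowflake.connector.connect(
    user="etl_user",            # placeholder credentials
    password="***",
    account="myorg-myaccount",  # placeholder account identifier
    warehouse="TRANSFORM_WH",
    database="PROD",
    schema="STAGING",
)
try:
    conn.cursor().execute(MERGE_SQL)
finally:
    conn.close()
```

QUALIFY with ROW_NUMBER() is a common Snowflake idiom for keeping only the latest record per key before merging into the target.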