Category: Technology
Location:
Summary
We are looking for a highly skilled Data Engineer to design, build, and maintain robust data pipelines and warehouse solutions. The ideal candidate has strong experience in ETL development, data modeling (star schema), and modern data platforms such as Microsoft Fabric or Databricks.
Key Responsibilities:
Design and implement ETL pipelines for large-scale data processing using Python and PySpark.
Develop and maintain data models and schemas optimized for analytics and reporting.
Collaborate with cross-functional teams to define data requirements and integration strategies.
Optimize data performance, reliability, and scalability in Fabric or Databricks environments.
Participate in data modernization or migration projects, ensuring seamless transitions and system integrity.
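To illustrate the kind of work the responsibilities above describe, here is a minimal sketch of a star-schema transformation in plain Python. All field names and records are hypothetical; a production pipeline for this role would use PySpark DataFrames rather than Python lists.

```python
# Hypothetical sketch: normalizing raw sales records into a star schema.
# Data and field names are illustrative only.

raw_sales = [
    {"order_id": 1, "customer": "Acme", "region": "APAC", "amount": 120.0},
    {"order_id": 2, "customer": "Acme", "region": "APAC", "amount": 75.5},
    {"order_id": 3, "customer": "Globex", "region": "EMEA", "amount": 300.0},
]

# Dimension table: one row per unique customer, keyed by a surrogate id.
customer_dim = {}
for row in raw_sales:
    key = (row["customer"], row["region"])
    if key not in customer_dim:
        customer_dim[key] = {
            "customer_key": len(customer_dim) + 1,
            "customer": row["customer"],
            "region": row["region"],
        }

# Fact table: measures plus foreign keys referencing the dimension rows.
fact_sales = [
    {
        "order_id": row["order_id"],
        "customer_key": customer_dim[(row["customer"], row["region"])]["customer_key"],
        "amount": row["amount"],
    }
    for row in raw_sales
]

print(len(customer_dim), len(fact_sales))  # → 2 3 (2 unique customers, 3 facts)
```

The same split — conformed dimensions keyed by surrogate ids, facts carrying measures and foreign keys — is what the star-schema design work in this role entails at warehouse scale.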
Qualifications:
4–5 years of experience as a Data Engineer or in a similar role.
Proven expertise in Python, PySpark, and Microsoft Fabric or Databricks.
Strong understanding of ETL processes, data warehousing, and star schema design.
Experience with data transformation, integration, and performance optimization.
A background in data modernization or migration initiatives is preferred.
Top 3 Non-Negotiables:
Python
PySpark
Microsoft Fabric or Databricks
JOB REQUIREMENTS
• Should be willing to accept a long-term work-from-home arrangement.
• Should be amenable to a permanent night shift schedule.
Benefits
• Full Philippine Statutory Benefits
• 13th Month Pay
• De Minimis Allowance
• Night Shift Differential Pay
• Paid Time Off (PTO)
• Health Insurance
• Life Insurance (maximum of PHP 3M coverage)
• Company-Provided Equipment