OCI is seeking experienced Data Engineers to design, develop, and manage data infrastructure for internal and client projects within our AI/ML & Data Insights team.
About OCI
We provide a collaborative environment focused on innovation, leveraging open-source software and strategic partnerships with Amazon and Google to build transformative technology solutions.
Job Responsibilities
- Design, build, and optimize large-scale data pipelines (Azure, AWS, GCP).
- Develop data ingestion and storage solutions.
- Implement scalable APIs and ensure system performance.
- Manage big data infrastructure and cloud deployments.
- Collaborate with developers, designers, and data scientists.
- Work in an Agile/DevOps environment.
Core Competencies
- ETL and data pipeline engineering (Databricks, SQL Server, Snowflake, BigQuery, Apache Spark).
- Proficiency in Go, Java, Python, or Scala.
- Proficiency in CI/CD pipelines and Infrastructure as Code (IaC) tools such as Terraform, CDK, and Ansible.
- Hands-on experience with event-driven architectures (Kafka, Pulsar).
- Strong knowledge of data warehousing, SQL/NoSQL databases, and cloud platforms.
- Experience with distributed computing, DevOps tools, and data governance.
- Familiarity with Delta Lake, Unity Catalog, Delta Sharing, and Delta Live Tables (DLT).
Preferred Qualifications
- Degree in Computer Science, Data Engineering, or a related field, or equivalent experience.
- Experience with AI/ML-driven data solutions and real-time data processing.
- Expertise in building scalable APIs and integrating with modern analytics tools (Power BI, Tableau, QuickSight).
- Cloud certifications (Databricks, AWS, Azure, GCP).