Design, build, and manage end-to-end pipelines for data ingestion, transformation, model deployment, and monitoring, and ensure system performance and scalability through proactive monitoring and incident response.
Requirements
- 10+ years of experience building and scaling complex software or data platforms
- Strong programming skills in one or more languages such as Python or Java
- Experience with cloud-native architectures (e.g., AWS, Azure) and containerization and orchestration technologies (e.g., Docker, Kubernetes)
- Deep understanding of data systems, including SQL and NoSQL databases, as well as modern data architectures
- Hands-on experience with large-scale data platforms and distributed processing technologies
- Strong grasp of software engineering fundamentals, including data structures, algorithms, and design patterns
- Experience with version control, CI/CD pipelines, and Agile development practices
- Understanding of data governance, security, and privacy considerations (e.g., encryption, data protection strategies)
- Excellent communication skills with the ability to collaborate across technical and non-technical teams
