This is a remote position.
Responsibilities
- Lead and manage teams working on large, complex, end-to-end data platforms and pipelines.
- Develop deep understanding of the structure, semantics, and business meaning of complex data landscapes.
- Own the technical architecture of the platform, designing scalable, secure, and cost-efficient cloud-based data solutions.
- Translate business requirements into technical strategies and execution plans for projects of any size.
- Create and communicate technical artifacts, including architecture diagrams, data models, data dictionaries, and technical design documentation.
- Act as the primary technical authority, unblocking the team and guiding complex technical decisions.
- Define delivery plans, estimates, and priorities; clearly communicate plans, progress, and risks to the team and to clients.
- Ensure high-quality delivery at both code and platform levels, enforcing engineering, data quality, and operational standards.
- Perform and oversee peer code reviews, promoting clean code, reusable patterns, testing, and strong Git practices.
- Drive continuous improvement initiatives to enhance platform reliability, efficiency, scalability, and delivery speed.
- Anticipate technical, delivery, and operational risks; escalate issues proactively and plan mitigations.
- Own responsibilities beyond development, including infrastructure, monitoring, production support, maintenance, and operational processes.
- Establish and enforce observability practices (monitoring, logging, alerting, SLOs/SLAs) and lead incident response when needed.
- Work across teams, providing architectural guidance and governance beyond the immediate team.
- Mentor and develop engineers, fostering technical excellence, ownership, and career growth.
- Participate in talent assessment decisions to ensure strong team composition.
Requirements
- 5+ years of professional experience in software or data engineering.
- Proven experience leading teams on complex IT or data platform projects.
- Strong experience with end-to-end big data pipelines (batch, streaming, or hybrid).
- Advanced proficiency in SQL (queries, performance tuning, indexing, partitioning).
- Strong programming skills in Python or similar languages, following clean and modular engineering practices.
- Deep understanding of cloud architecture, including compute, storage, networking, identity, and cost optimization.
- Expertise in data platform architecture, including modern paradigms such as the Lambda and Kappa architectures, microservices, and event-driven pipelines.
- Strong background in data modeling (dimensional, normalized, and data vault models).
- Ability to independently deliver and oversee tasks of any complexity.
- Excellent business and technical acumen, with the ability to connect technical decisions to business impact.
- Proven experience defining and managing CI/CD pipelines, automation, and release strategies.
- Strong experience debugging and resolving complex production issues in data platforms.
- Excellent communication skills, with confidence driving discussions and providing clear updates to clients and stakeholders.
- Experience working with agile methodologies (e.g., Scrum).
- Strong organizational skills, including prioritization, estimation, and management of competing priorities.
- Advanced English proficiency for client-facing communication and documentation.
- Experience defining organization-wide data quality frameworks, data contracts, and anomaly detection.
- Hands-on experience with data governance, lineage, cataloging, and access control.
- Familiarity with regulatory and compliance requirements (e.g., GDPR, HIPAA).
- Experience leading technology evaluations and Proofs of Concept (PoCs).
- Experience providing cross-team architectural governance or acting as a platform steward.
- Background influencing engineering culture at scale, including documentation, observability, and review practices.
- Must be based in Latin America.
