We’re seeking a DataOps Engineer who thrives at the intersection of data engineering, DevOps, and workflow orchestration. You’ll be instrumental in designing, automating, and optimizing data pipelines that power analytics, machine learning, and operational intelligence across the organization.
If you’re passionate about building resilient data systems, streamlining deployments, and enabling data teams to move faster with confidence—this role is for you.
Responsibilities
- Administer a large-scale MongoDB cluster using a combination of Bash, Python, and Linux system administration skills.
- Work with data engineers to design and maintain scalable, automated data pipelines using tools like Apache Airflow, dbt, and Terraform.
- Create abstractions around data workflows that enable development teams to build data products in a self-service manner.
- Implement CI/CD workflows for data infrastructure and analytics code.
- Monitor and optimize data workflows for performance, reliability, and cost-efficiency.
- Integrate cloud-native services (e.g., S3, Redshift, BigQuery, Databricks) into unified data workflows.
- Develop disaster recovery strategies and backup automation for critical data assets.
- Champion DataOps best practices across teams, including version control, testing, and observability.
- Participate in the team’s emergency on-call rotation to help ensure 24/7 uptime of our systems.
Requirements
- 3+ years of experience in data engineering, DevOps, or cloud infrastructure roles.
- Proficiency in automating administrative workflows using Bash and Python, with an emphasis on writing clean, maintainable code.
- Intermediate knowledge of Linux system administration.
- Basic proficiency in writing queries in both relational (SQL) and NoSQL paradigms.
- Experience administering big data query engines such as Hadoop, Apache Spark, or Google BigQuery.
- Experience with data orchestration tools (Airflow, Prefect, Dagster).
- Familiarity with cloud platforms (AWS, Azure, or GCP) and infrastructure-as-code tools (Terraform, CloudFormation).
- Strong understanding of data lake and data warehouse architectures.
- Experience working with containers.
About DNAnexus
Headquartered in Mountain View, California, with over 220 team members across the United States and Europe, DNAnexus is experiencing rapid growth and market adoption. With the support of leading investors, including Google Ventures and Blackstone, and trusted by hundreds of the world's biomedical leaders, the company is at the forefront of innovation with its precision health data cloud, driving scientific breakthroughs. If you are interested in joining our team, please apply today!
DNAnexus is the enterprise platform for precision health. We are on a mission to accelerate the development, approval, and delivery of personalized treatments.
Building on 15 years of bioinformatics innovation and genomics expertise, DNAnexus provides the cloud platform that centralizes and enriches multimodal omics data, supports an extensive suite of informatics use cases, and allows secure collaboration across the care continuum. DNAnexus powers a connected ecosystem trusted by the world’s precision health leaders. This flexible ecosystem makes omics and real-world data accessible, actionable, and secure, while unlocking insights that improve patient lives. For more information visit www.dnanexus.com or follow @DNAnexus on social media.
As we pursue these goals, it remains essential to us that we stay grounded in our values and adhere to the highest standards. At DNAnexus, you will be joining an ambitious, supportive, and driven team with diverse backgrounds but a shared passion to deliver on our mission to make personalized medicine a reality for all.