Dune is hiring a Data Engineer to build and operate reliable pipelines, model complex protocol mechanics, and scale curated datasets across chains. The role involves designing, implementing, and maintaining data pipelines; establishing architecture that ensures data quality and timely delivery; and partnering with product teams and customers to unlock user value and shorten time to insight.
Requirements
- Strong SQL skills and experience modeling large datasets in modern warehouses
- Proficiency in Python for pipeline development, tooling, and automation
- Proven track record operating robust pipelines and orchestration with tools such as Prefect, Airflow, or Elementary
- Solid computer science fundamentals, system design skills, and experience in public cloud and container orchestration (e.g., Kubernetes)
- Ability to analyze, debug, and resolve data pipeline and modeling issues independently in a remote, async environment
- Strong blockchain data intuition, including reading transactions, events, and traces
- Interest in query engine internals and performance optimization
- Experience with data lakes, large-scale data processing, and systems performance
- Familiarity with CI for data, observability, and cost optimization, and experience leveraging AI tools to accelerate development and operations
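To illustrate the "reading transactions, events, and traces" requirement, here is a minimal sketch of decoding a raw on-chain event by hand: an ERC-20 Transfer log parsed from its topics and data fields. The log values below are made up for illustration (only the Transfer signature hash is real); this is not Dune's tooling, just stdlib Python showing the kind of data intuition the role calls for.

```python
def decode_transfer(topics, data):
    """Decode an ERC-20 Transfer(address,address,uint256) event log."""
    # Indexed address parameters are left-padded to 32 bytes; the address
    # is the last 20 bytes (40 hex chars) of each topic.
    sender = "0x" + topics[1][-40:]
    receiver = "0x" + topics[2][-40:]
    # The non-indexed uint256 value sits in the data field as 32 bytes.
    value = int(data, 16)
    return sender, receiver, value

# Hypothetical raw log fields (hex strings as a JSON-RPC node returns them).
topics = [
    # topic0: keccak256("Transfer(address,address,uint256)")
    "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
    "0x" + "00" * 12 + "11" * 20,  # indexed `from`, zero-padded address
    "0x" + "00" * 12 + "22" * 20,  # indexed `to`, zero-padded address
]
data = "0x" + hex(1_000_000)[2:].zfill(64)  # value = 1,000,000 base units

sender, receiver, value = decode_transfer(topics, data)
```

In production this decoding is driven by contract ABIs rather than hand-written slicing, but the underlying layout (signature hash in topic0, indexed params in topics, everything else ABI-encoded in data) is the same.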
Benefits
- Competitive salary and equity package
- 5 weeks of PTO plus local public holidays
- Fully remote-first approach
- Private medical, dental, and vision insurance
- Paid parental leave
- Quarterly offsites in various locations
- Yearly travel allowance
- Allowance for at-home setup
