About the role
- Build scalable and reliable data pipelines that collect, transform, load, and curate data from internal systems
- Augment the data platform with data pipelines from select external systems
- Ensure high data quality for the pipelines you build, and make them auditable
- Drive data systems to be as near real-time as possible
- Support the design and deployment of a distributed data store that will be the central source of truth across the organization
- Build data connections to the company's internal IT systems
- Develop, customize, and configure self-service tools that help our data consumers extract and analyze data from our massive internal data store
- Evaluate new technologies and build prototypes for continuous improvements in data engineering
Requirements
- 5+ years of work experience in a relevant field (Data Engineer, DW Engineer, Software Engineer, etc.)
- Experience with data warehouse technologies and relevant data modeling best practices (Spark, Presto, Druid, etc.)
- Experience building data pipelines/ETL and familiarity with design principles
- Excellent SQL skills
- Experience with business requirements gathering for data sourcing
Job posted on January 12th, 2021
April 29th, 2021
About the company
Based in San Francisco, Kraken is the world’s largest global bitcoin exchange in euro volume and liquidity. Kraken’s clients also trade USD, CAD, ETH, XRP, LTC, and other digital currencies, on a platf...