Duties and Responsibilities:
- Build real-time data pipelines to ingest structured and unstructured data into the data warehouse
- Build ETL pipelines using Airflow to drive analytics, reporting, and machine learning
- Build and maintain data governance, data classification, and the data dictionary
- Continuously improve the A/B testing framework and reporting
- Support leadership and teams across the company with research on key business initiatives and challenges
Qualifications and Skills:
- 5+ years of experience with Python, Java, or Scala
- 5+ years of solid software engineering experience
- 5+ years of experience with ETL and Data Pipelines
- 5+ years of experience with Airflow, Hive, Presto, AWS, etc.
- Comfortable navigating complex topics and using data to make decisions
- Excellent communication, presentation, and interpersonal skills
Everyone is welcome at Ethos. We are an equal opportunity employer who values diversity and inclusion and looks for applicants who understand, embrace, and thrive in a multicultural world. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. Pursuant to the SF Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records.
About this role
October 13th, 2021
Job posted on January 2nd, 2021
About the company
Ethos provides modern, ethical life insurance to protect the life you're building and the people you love. Ethos is built for people who don't have time for fine print, extra doctor's appointments or hi...