We are looking for a Data Engineer II, Fandango (AWS/Redshift/PySpark), based in the United States. This role offers the opportunity to build and maintain robust data pipelines and systems that power critical business operations for a leading entertainment and digital media platform.
Requirements
- Bachelor’s degree in Computer Science, Computer Engineering, or a related technical field, or equivalent practical experience
- 5+ years of applied experience in data engineering, including data pipeline development, orchestration, data modeling, and data lake solutions
- Proficiency in Python and SQL, with experience writing reusable, efficient code for automation and analysis
- Experience with data warehousing and dimensional modeling
- Hands-on experience with large datasets, ETL frameworks, and workflow orchestration tools such as Apache Airflow / Amazon MWAA
- Expertise with AWS data management services (S3, Redshift, DynamoDB, Athena, EMR, Glue, Lambda)
- Familiarity with near real-time data processing and batch pipeline development
- Experience working in agile/scrum environments
- Strong collaboration, problem-solving skills, and the ability to learn new tools and methods quickly
Benefits
- Competitive salary
- Fully remote work flexibility
- Opportunity to work on high-impact projects in the entertainment and digital media space
- Collaborative, supportive environment with professional growth opportunities
- Exposure to large-scale cloud data platforms and cutting-edge technologies
- Inclusive and diverse workplace culture
