The Data Engineer will play a key role in optimising and managing operational workflows across projects to ensure efficiency and productivity. The ideal candidate will use advanced Python and Bash scripting skills to build, troubleshoot, and enhance workflows, while leveraging GCP/AWS services for deployment and automation.
Requirements
- Minimum 6 years of experience in operations with a track record of successfully managing workflows and processes
- Proficiency in Python for scripting and automation tasks
- Strong knowledge of Bash scripting
- Familiarity with pipeline workflow systems
- Knowledge of GCP or AWS (preferred but not mandatory)
- Demonstrated experience managing multiple projects simultaneously, ensuring quality and timely delivery
- Excellent verbal and written communication skills, with the ability to liaise effectively with wider teams and stakeholders
- Strong analytical skills with a proactive approach to identifying and resolving operational issues
Benefits
- Full-time employment
- Remote work arrangement
