Key Responsibilities
- Collaboration
- Collaborate with the Senior Data Engineer to design, develop, and maintain scalable data pipelines and ETL processes that support internal reporting and embedded analytics.
- Partner with data analysts to understand data needs and assist in creating dashboards, reports, and KPIs.
- Data Solutions
- Design and build data solutions to support reporting, analytics, and business insights for supply chain solutions.
- Ensure high data quality, integrity, and security across the different data solutions.
- Participate in building dashboards and visualisations to communicate insights to stakeholders clearly and effectively.
- Solutions Lifecycle
- Troubleshoot and resolve data-related issues, improving data flow efficiency and performance.
- Evaluate and recommend improvements in our data tools, architecture, and practices, focusing on long-term scalability and adaptability.
- Data Pipelines
- Support the design, build, and maintenance of advanced data models and data warehouses.
- Write, optimise, and review SQL queries for data extraction and transformation.
Skills, Knowledge and Expertise
Soft Skills
- Enjoy working collaboratively with data engineers and analysts to enhance data insights and analytics.
- Thrive in team environments, are open to feedback, and communicate well with technical and non-technical stakeholders.
- Passionate about solving both business and technical problems with a strong analytical mindset and attention to detail.
- Self-motivated and able to take the initiative in tackling incoming challenges.
- Work effectively under time constraints and manage competing priorities.
Technical Skills
- Experience in statistical analysis, predictive modelling, and data interpretation is beneficial.
- Strong visualisation and BI skills.
- Proficiency in Google Sheets and/or Excel.
- Solid SQL skills, with experience in optimising queries and managing complex datasets.
- Understanding of data warehouse concepts, with experience in data modelling and in designing and building data warehouses.
- Experience building ETL/ELT data pipelines and performing data manipulation.
- Familiarity with version control systems (e.g., Git).
- Experience with GCP BI components (BigQuery, Cloud Functions, and Compute Engine), dbt, data visualisation tools (e.g. ThoughtSpot, Looker, Power BI), GitHub, and Terraform.
- Ability to write basic Python scripts.
