Who Are Massive Rocket?
Who Are We Looking to Add to Our Team?
This role centres on moving, transforming, and operationalising data across systems, ensuring our internal platforms, client environments, and third-party tools remain connected, scalable, and reliable.
You’ll work across cloud infrastructure, ETL orchestration, APIs, warehousing, and platform integrations, helping create the technical foundations that support CRM, analytics, CDP, and customer engagement use cases.
What Will You Do?
- Design, build, and maintain scalable ETL/ELT pipelines across internal and external systems
- Develop and manage integrations between data warehouses, APIs, CDPs, CRM tools, and marketing platforms
- Build and optimise data ingestion frameworks across batch and real-time architectures
- Manage API integrations, webhook frameworks, file-based ingestion, and event-driven pipelines
- Work closely with engineering and platform teams to ensure data is structured, accessible, and reliable
- Improve observability, monitoring, alerting, and performance across data pipelines and platform integrations
- Ensure robust governance, security, and documentation standards across all integrations
- Troubleshoot data delivery issues, system failures, and performance bottlenecks
- Support the implementation of scalable architecture patterns that can be reused across multiple clients
What Makes You a Great Fit?
- Strong experience building and maintaining APIs, webhooks, and platform integrations
- Strong understanding of ETL/ELT design patterns and orchestration tools
- Experience with cloud infrastructure such as AWS, Azure, or Google Cloud
- Experience with tools such as dbt, Airflow, Fivetran, Stitch, Hightouch, Segment, or similar platforms
- Familiarity with event streaming or messaging systems such as Kafka, Kinesis, Pub/Sub, or SQS
- Experience working with multiple source systems, including CRM, CDP, product analytics, and transactional systems
- Strong SQL and Python skills with the ability to work across multiple data environments
- Experience building reusable frameworks, automation, and monitoring processes
- Strong problem-solving skills and comfort operating across complex technical environments
Preferred
- Experience working with reverse ETL, composable CDPs, and audience activation tooling
- Familiarity with infrastructure-as-code tools such as Terraform
- Experience with containerisation and orchestration technologies such as Docker or Kubernetes
- Experience working in a highly technical agency or consulting environment
