Work where work matters.
Elevate your career at Qodea, where innovation isn't just a buzzword; it's in our DNA.
We are a global technology group built for what's next, offering high-calibre professionals a platform for high-stakes work: the kind of work that defines an entire career. When you join us, you're not just taking on projects; you're solving problems that don't even have answers yet.
You will join the exclusive roster of talent that global leaders, including Google, Snap, Diageo, PayPal, and Jaguar Land Rover, call when deadlines seem impossible, when others have already tried and failed, and when the solution absolutely has to work.
Forget routine consultancy. You will operate where technology, design, and human behaviour meet to deliver tangible outcomes, fast. This is work that leaves a mark, work you’ll be proud to tell your friends about.
Qodea is built for what's next: an environment where your skills will evolve at the frontier of innovation and AI, ensuring continuous growth and development.
We look for people who embody:
Innovation to solve the hardest problems.
Accountability for every result.
Integrity always.
About The Role
We're looking for an experienced, hands-on Senior Scala Engineer to help drive the modernisation and evolution of our core data processing platforms. This role is key to enhancing and scaling high-traffic, mission-critical data pipelines on Google Cloud Platform (GCP).
You will be a vital contributor to our Scala-based Dataflow processing, focusing on implementation, performance, and reliability at scale. This is a hands-on technical role for someone who thrives on solving complex distributed systems problems and delivering high-quality, efficient code within a collaborative team.
- Implement and enhance sophisticated, high-throughput batch and streaming data pipelines using Scala and GCP Dataflow (Apache Beam).
- Contribute to the architectural design and technical roadmap for modernising our data ingestion and processing pipelines.
- Develop and implement performance optimisations, such as migrating services to use Cloud Functions to export data from BigQuery to Pub/Sub more efficiently.
- Collaborate with Staff Engineers to expand the functionality of our data export systems for external partners. This includes pulling data from BigQuery, creating formatted feed files, and exporting them to various external stores (e.g., FTP, GCS, S3).
- Support and contribute to the expansion of existing data integrations, including adding new fields to schemas and ensuring that data is propagated correctly through the entire pipeline.
- Actively contribute to the refactoring and modernisation of legacy Scala codebases, applying best practices in functional programming, testing, and observability.
- Collaborate closely with teams using Node.js APIs (for data ingestion) and Python (for data transformations), maintaining clear data contracts and robust integration patterns.
- Act as a strong technical communicator, contributing to technical approaches, breaking down complex problems, and engaging in rapid, iterative development cycles by asking questions quickly rather than working in isolation.
This role is designed for impact, and we believe our best work happens when we connect. While we operate a flexible model, we expect you to spend time on site (at our offices or a client location) for collaboration sessions, customer meetings, and internal workshops.
Requirements
What Success Looks Like
- Degree in Computer Science or a related technical discipline.
- Extensive software engineering experience, with a strong background in building and operating large-scale, high-traffic distributed systems in production.
- Deep expertise in Scala and its functional programming paradigms.
- Demonstrable, hands-on experience with GCP Dataflow (Apache Beam). Experience with other major streaming/batch frameworks (e.g., Apache Spark, Akka Streams) is also highly valuable.
- Strong proficiency in the GCP ecosystem, including critical services like BigQuery, Pub/Sub, and Cloud Functions.
- Solid understanding of data engineering principles, including data modelling, schema design, and data lifecycle management.
- Experience building and maintaining data export feeds to external systems (e.g., SFTP, GCS, S3).
- Proven ability to take ownership of complex technical features and deliver them from design to production.
- Excellent communication skills, with experience working in polyglot environments (interfacing with Node.js, Python, etc.) and a proactive, inquisitive approach to problem-solving.
- Ability to be available for key synchronous meetings and stand-ups between 4:00 and 6:00 PM UK time.
Nice to Have
- Familiarity with CI/CD best practices and infrastructure-as-code tools (e.g., Terraform, Docker, Kubernetes).
- Working knowledge of Node.js or Python for data-related tasks.
- Experience with other data stores (e.g., NoSQL, time-series databases) or data orchestration tools (e.g., Airflow).
Benefits
Culture and Environment
- We are a team of passionate people who genuinely care about what they do and the standard of work they produce.
- Collaborate with our two hubs in Portugal: Lisbon and Porto.
- A strong company culture that includes weekly meetings, company updates, team socials, and celebrations.
- In-house DE&I council and mental health first-aiders.
Time Off and Well-being
- 25 days’ annual leave, Juneteenth, your birthday off, and a paid office closure between Christmas and New Year's.
- Health insurance.
- 15 paid sickness and wellness days.
Growth and Development
- A generous learning and development budget and an annual leadership development programme.
Diversity and Inclusion
At Qodea, we champion diversity and inclusion. We believe that a career in IT should be open to everyone, regardless of race, ethnicity, gender, age, sexual orientation, disability, or neurotype. We value the unique talents and perspectives that each individual brings to our team, and we strive to create a fair and accessible hiring process for all.
