A LITTLE BIT ABOUT Boldr
- Boldr is the first global B-Corp dedicated to delivering world-class client experiences while creating access to dignified, meaningful work in communities around the world.
- We are a global team, united by our desire to connect diverse people with common values for boldr impact.
- We employ just over a thousand team members across five countries, and we aim to employ more than 5,000 people by 2027, if not sooner.
LET’S START WITH OUR VALUES
- Meaningful connections start with AUTHENTICITY
- We do our best work by being CURIOUS
- We grow by remaining DYNAMIC
- Our success combines AMBITIOUS VISION with OPERATIONAL EXCELLENCE
- At the heart of great partnerships we’ll always find EMPATHY
WHAT IS YOUR ROLE
As a Lead Data Engineer, you’ll design, build, and optimize data pipelines that power insights for both Boldr and our clients. You’ll help shape our data ecosystem, ensuring information flows securely and efficiently across platforms.
You’ll collaborate with analysts, engineers, and business stakeholders to turn raw data into meaningful outputs. You’ll also mentor junior team members and support best practices in data quality, security, and scalability.
WHY DO WE WANT YOU
We are looking for an impact-driven data engineer who is passionate about building scalable, reliable data solutions that help Boldr grow and deliver exceptional client outcomes. You enjoy mentoring and collaborating with colleagues, championing data best practices, and solving complex problems with curiosity and precision.
You thrive in fast-moving environments and take pride in your work, always looking for ways to improve processes, automate repetitive tasks, and contribute to the success of the team. Most importantly, you embody Boldr’s values: you are curious, dynamic, and authentic.
WHAT WILL YOU DO
Data Engineering & Pipeline Development
- Design, implement, and maintain scalable ETL/ELT pipelines to process both internal and client data.
- Ensure data is clean, accurate, and readily available for analysts and business stakeholders.
- Optimize pipelines for efficiency, performance, and reliability.
- Troubleshoot and resolve data pipeline and workflow issues promptly.
Data Architecture & Quality
- Contribute to enhancing the data platform, including database design, schema optimization, and data integration.
- Implement and maintain data quality checks, monitoring, and error handling.
- Document data sources, transformations, and standards to ensure clarity and maintainability.
Mentorship & Collaboration
- Mentor junior engineers, providing guidance on coding, problem-solving, and best practices.
- Conduct code reviews to ensure high-quality deliverables and adherence to standards.
- Collaborate with analysts, engineers, and business teams to understand requirements and provide actionable solutions.
- Facilitate cross-functional access to data and support stakeholders in using data effectively.
Process Improvement & Innovation
- Identify and implement process improvements to streamline data workflows.
- Automate repetitive tasks and optimize existing pipelines.
- Explore new technologies and tools to enhance data capabilities.
Stakeholder Engagement
- Partner with product, engineering, and business teams to translate business needs into data requirements.
- Support reporting and analytics initiatives to help stakeholders make data-driven decisions.
- Advocate for best practices in data governance, security, and compliance.
YOU ARE…
- Curious: You ask questions, explore new solutions, and stay updated on industry trends.
- Dynamic: You adapt to changing priorities and thrive in a fast-moving environment.
- Authentic: You communicate openly, provide honest feedback, and build trust.
- Collaborative: You work effectively across teams and support colleagues in achieving goals.
- Improvement-minded: You continuously look for ways to optimize processes and enhance systems.
YOU HAVE…
Experience
- 6+ years in data engineering or related roles.
- 2+ years mentoring or guiding junior engineers.
- Hands-on experience with ETL/ELT pipelines, data platforms, and tools such as Airflow, Snowflake, Athena, Glue, and Postgres.
- Strong programming skills (SQL, Python, or other relevant languages).
- Familiarity with cloud platforms (AWS, GCP, Azure), APIs, and data warehousing.
Skills & Attributes
- Strong problem-solving and analytical skills.
- Excellent communicator and collaborator.
- Able to balance hands-on work with process improvements.
- Passionate about mentoring and team growth.
- Continuously seeks opportunities to improve processes and solve problems.
