Lovelytics is a Databricks-focused consulting firm specializing in artificial intelligence, data, and analytics solutions. Since partnering with Databricks in 2019, Lovelytics has grown rapidly, expanding from 50 people to over 340 over the past three years. Lovelytics is a trusted partner for many of the most high-profile enterprise clients in Media & Entertainment, Manufacturing, Retail & CPG, Healthcare & Life Sciences, and Financial Services.
We’re looking for a Data Engineering Architect to design and lead the build-out of modern, scalable data platforms and pipelines for clients across industries. This role is deeply hands-on in data engineering architecture, helping clients modernize their data to improve business outcomes.
You’ll define technical strategies, architect data platforms, and guide delivery teams to implement best-in-class ingestion, transformation, and storage solutions on the cloud. You’ll also partner with sales and account teams to scope engagements, shape technical proposals, and showcase Lovelytics’ engineering expertise.
This role is open to remote candidates in the U.S. and Ontario, Canada. You’re also welcome to work from any of our offices in Arlington, VA; Chicago, IL; New York, NY; or Toronto!
Primary Responsibilities
- Formulate forward-looking data strategies aligned with client business objectives and industry best practices
- Design and oversee large-scale lakehouse and warehouse implementations on Databricks (must-have) and other cloud-native technologies
- Create solutions that seamlessly integrate on-premises and multi-cloud environments
- Architect batch and streaming ingestion, real-time processing, and ELT/ETL patterns
- Ensure security, privacy, compliance, and data quality at scale on client engagements
- Tackle intricate data engineering challenges and make strategic decisions to de-risk delivery
- Introduce emerging technologies and methodologies to keep client solutions at the cutting edge
- Drive performance, cost optimization, scalability, and maintainability across data engineering solutions
- Mentor engineers, review architectures and code, and guide teams through implementation
- Lead technical discovery, shape solution architectures, respond to RFPs, and deliver demos and proofs of concept for data engineering engagements
- Create technical blueprints and recommend tools, frameworks, and design patterns aligned to client needs
Required Qualifications
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
- 8+ years of experience in data engineering and architecture, including large-scale cloud deployments
- 4+ years in a client-facing role, preferably in a professional services firm
- Proven track record designing and implementing modern data lakehouses, warehouses, and pipelines in AWS, Azure, or GCP
- Expert knowledge of Databricks and Spark (required)
- Experience creating proofs of concept, technical presales presentations, and pricing for engagements
- Strong client-facing communication skills with the ability to influence technical and executive stakeholders