
Senior Data Engineer (GCP, BigQuery, Looker), PK [AS233]

Smart Working
Pakistan only


About Smart Working
At Smart Working, we believe your job should not only look right on paper but also feel right every day. This isn’t just another remote opportunity: it’s about finding where you truly belong, no matter where you are. From day one, you’re welcomed into a genuine community that values your growth and well-being.

Our mission is simple: to break down geographic barriers and connect skilled professionals with outstanding global teams and products for full-time, long-term roles. We help you discover meaningful work with teams that invest in your success, where you’re empowered to grow personally and professionally.

Join one of the highest-rated workplaces on Glassdoor and experience what it means to thrive in a truly remote-first world.

About the Role
We're looking for a Senior Data Engineer to become a cornerstone of our data platform team. This is a long-term, strategic role, not a short sprint. You'll be embedded in a collaborative engineering and analytics team, working across the full data lifecycle: ingestion, transformation, modelling, and surfacing insights through Looker. You'll work closely with stakeholders across commercial, product, and marketing to ensure data is reliable, scalable, and meaningful.

You'll be given real ownership. This is a role for someone who wants to shape standards, improve the architecture, and grow with a brand that takes its data seriously.

Responsibilities

  • Design, build, and maintain robust ETL/ELT pipelines that move data from source systems into Google BigQuery, ensuring reliability, scalability, and observability at every stage.
  • Develop and enforce data models and schema standards using best-practice SQL and dimensional modelling principles, with a focus on clarity, reuse, and performance.
  • Own the Google BigQuery environment, optimising queries, managing costs, enforcing data governance, and ensuring the platform scales alongside the business.
  • Build and maintain Looker explores, LookML models, and dashboards that translate complex datasets into clear, actionable business intelligence for non-technical stakeholders.
  • Work across the full Google Cloud Platform stack, including Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and Composer, to architect end-to-end data solutions.
  • Partner with analytics, engineering, and commercial teams to understand data requirements and translate business problems into scalable technical solutions.
  • Champion data quality and testing frameworks, implementing monitoring and alerting so that issues are caught early and resolved quickly.
  • Contribute to documentation, coding standards, and architectural decision records so the team can move fast with confidence.
  • Mentor junior data team members and set the bar for engineering rigour across the data function.
  • Stay current with developments in the modern data stack and proactively recommend tooling or process improvements where appropriate.

Requirements

  • 5+ years of experience in SQL and data modelling, with strong command of dimensional modelling, star schemas, and performance optimisation.
  • 3+ years working with Google BigQuery in a production environment.
  • 3+ years hands-on experience with Google Cloud Platform (Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, Composer).
  • 3+ years building and maintaining ETL/ELT pipelines at scale.
  • 1+ year working with Looker and LookML to deliver business-facing dashboards and data products.
  • Demonstrable experience leading at least one data project end-to-end, from scoping through to delivery.
  • Able to communicate clearly with non-technical stakeholders about data limitations, timelines, and trade-offs.
  • Comfortable making pragmatic architecture decisions in a cloud-native, modern data stack environment.

Nice to Have

  • Experience with dbt (Data Build Tool) for transformation layer management and testing.
  • Familiarity with orchestration tools such as Apache Airflow or Cloud Composer.
  • Python skills for pipeline scripting, data validation, or automation.
  • Background in retail, ecommerce, or fashion, with an understanding of how data flows across commercial and digital channels.
  • Exposure to real-time or streaming data pipelines using Pub/Sub or Dataflow.
  • Experience with Terraform or Infrastructure-as-Code practices in a GCP context.
  • Familiarity with data governance frameworks, cataloguing, and lineage tracking.

At Smart Working, you’ll never be just another remote hire.

Be a Smart Worker - valued, empowered, and part of a culture that celebrates integrity, excellence, and ambition.

If that sounds like your kind of place, we’d love to hear your story.

About the job

Job type: Full Time
Experience: 5 years minimum
Location requirements: Pakistan only
Hiring timezones: Pakistan +/- 0 hours