Badal Shrestha
@badalshrestha
I build secure, cloud-native data platforms as a senior data engineer.
What I'm looking for
I'm a Senior Data Engineer with 8 years of experience designing and optimizing scalable data platforms, specializing in Google Cloud Platform (GCP). I lead cross-functional teams to build high-performance, analytics-ready pipelines that improve decision-making.
I've architected end-to-end data workflows and migrated legacy systems to GCP-based lakehouse architectures, delivering measurable impact: a 30% improvement in pipeline efficiency and a 20% reduction in processing and infrastructure costs. I focus on robust governance, compliance, and reliable data modeling to keep platforms secure and enterprise-ready.
Hands-on engineering is central to my approach. I’ve optimized pipelines with Apache Beam on Dataflow, automated ETL/ELT using Cloud Composer (Airflow), and enabled real-time ingestion with Pub/Sub + Dataflow for faster operational insights.
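The streaming pattern above (Pub/Sub into Dataflow for real-time ingestion) can be sketched roughly as follows. The topic and table names are hypothetical, and the Beam wiring is kept inside `run()` so the parsing logic stands on its own; `apache-beam[gcp]` would be required to actually launch it.

```python
import json
from datetime import datetime, timezone


def parse_event(raw_bytes):
    """Decode a Pub/Sub message payload into a flat record for BigQuery.

    Hypothetical schema: each message is a JSON object with an
    'event_id', a 'value', and an ISO-8601 'ts' field.
    """
    record = json.loads(raw_bytes.decode("utf-8"))
    ts = datetime.fromisoformat(record["ts"]).astimezone(timezone.utc)
    return {
        "event_id": record["event_id"],
        "value": float(record["value"]),
        "event_time": ts.isoformat(),
    }


def run(argv=None):
    # Beam/Dataflow wiring; call run() to launch the streaming job.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    opts = PipelineOptions(argv, streaming=True)
    with beam.Pipeline(options=opts) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")
            | "Parse" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )
```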
I bring a strong balance of technical leadership and mentorship, improving deployment times, reducing pipeline downtime, and expanding self-service analytics with Looker. I'm passionate about enabling data-driven decision-making through cloud-native engineering that teams can trust.
Experience
Work history, roles, and key accomplishments
Designed a GCP-based ingestion framework feeding clinical, claims, and operational data from 70M+ patient records into a centralized data lake. Migrated legacy systems to BigQuery, cutting data processing time 30%, infrastructure costs 20%, and improving real-time pipeline throughput 35% while strengthening HIPAA-governed data access and streaming analytics.
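One governance step implied by the HIPAA-governed access above is pseudonymizing patient identifiers before records land in the data lake. A minimal sketch using a keyed hash, so the same patient maps to the same token across loads without the raw ID ever reaching the lake; the field names and key handling here are illustrative only.

```python
import hashlib
import hmac


def pseudonymize(record, phi_fields, secret_key):
    """Replace PHI identifier fields with keyed SHA-256 pseudonyms.

    An HMAC (rather than a bare hash) prevents dictionary attacks on
    low-entropy identifiers as long as the key stays in a secret manager.
    """
    out = dict(record)
    for field in phi_fields:
        if field in out and out[field] is not None:
            digest = hmac.new(secret_key, str(out[field]).encode("utf-8"), hashlib.sha256)
            out[field] = digest.hexdigest()
    return out
```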
Built and operated a GCP ingestion platform using GCS and BigQuery to deliver timely insights for medical teams. Led legacy migrations and pipeline automation with Dataflow/Composer, improving data retrieval 20% and processing performance 30%, while enabling near real-time streaming alerts via Pub/Sub and improving reliability by reducing pipeline failures 25%.
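The Dataflow/Composer automation mentioned above is orchestrated through Airflow DAGs; a minimal sketch of one is below. The DAG id, task names, and callables are hypothetical placeholders, not the production pipeline.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


# Task callables would live in the project package; these are placeholders.
def extract_to_gcs(**context):
    ...


def load_gcs_to_bigquery(**context):
    ...


default_args = {"retries": 2, "retry_delay": timedelta(minutes=5)}

with DAG(
    dag_id="claims_daily_load",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    extract = PythonOperator(task_id="extract_to_gcs", python_callable=extract_to_gcs)
    load = PythonOperator(task_id="load_to_bigquery", python_callable=load_gcs_to_bigquery)
    extract >> load
```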
Developed scalable AWS ETL pipelines on Glue and Redshift to process claims and underwriting data, reducing data processing time 30%. Migrated legacy ETL to AWS with S3/Glue and optimized transformations with Lambda, cutting operational costs 20% and improving efficiency by reducing manual intervention 40%.
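The Lambda-based transformation step above can be sketched as a handler that cleanses a batch of claims records. The schema and field names are illustrative; in the real pipeline the batch would arrive via an S3/Glue trigger, but here the event carries records inline to keep the sketch self-contained.

```python
import json

REQUIRED_FIELDS = ("claim_id", "amount", "status")  # illustrative schema


def normalize_claim(raw):
    """Coerce one raw claims record into the warehouse shape, or None if invalid."""
    if not all(raw.get(f) not in (None, "") for f in REQUIRED_FIELDS):
        return None
    return {
        "claim_id": str(raw["claim_id"]).strip(),
        "amount": round(float(raw["amount"]), 2),
        "status": str(raw["status"]).strip().upper(),
    }


def handler(event, context):
    """Lambda entry point: transform a batch of records delivered in the event."""
    cleaned = [normalize_claim(r) for r in event.get("records", [])]
    valid = [r for r in cleaned if r is not None]
    return {
        "statusCode": 200,
        "body": json.dumps({"processed": len(valid), "dropped": len(cleaned) - len(valid)}),
        "records": valid,
    }
```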
Built and optimized backend services and microservices in Python (Flask/FastAPI/Django), improving performance of hardware diagnostics tools by 30%. Developed Kafka-based data pipelines and REST APIs, improving data ingestion rate 40% and backend response times 35%, while implementing secure auth (OAuth2/JWT) and infrastructure automation with Terraform.
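The OAuth2/JWT auth mentioned above ultimately comes down to signing and verifying bearer tokens. In production a library such as PyJWT would handle this; the stdlib-only sketch below just shows the HS256 compact-token mechanics.

```python
import base64
import hashlib
import hmac
import json


def _b64url(data: bytes) -> str:
    # JWT uses unpadded URL-safe base64.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")


def sign_jwt(claims: dict, secret: bytes) -> str:
    """Produce a compact HS256 JWT: header.payload.signature."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}, separators=(",", ":")).encode())
    payload = _b64url(json.dumps(claims, separators=(",", ":")).encode())
    signing_input = f"{header}.{payload}".encode("ascii")
    sig = hmac.new(secret, signing_input, hashlib.sha256).digest()
    return f"{header}.{payload}.{_b64url(sig)}"


def verify_jwt(token: str, secret: bytes):
    """Return the claims if the signature checks out, else None."""
    try:
        header, payload, sig = token.split(".")
    except ValueError:
        return None
    signing_input = f"{header}.{payload}".encode("ascii")
    expected = _b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(expected, sig):  # constant-time comparison
        return None
    padded = payload + "=" * (-len(payload) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))
```

A real deployment would also check registered claims such as `exp` before trusting the payload.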
Education
Degrees, certifications, and relevant coursework
Midwestern State University
Master of Business Administration, Business Analytics
Earned an MBA with a focus on Business Analytics at Midwestern State University in Wichita Falls, Texas.
Tech stack
Software and tools used professionally
Amazon Redshift
Azure Synapse
Apache Spark
AWS Glue
Talend
Amazon Quicksight
AWS IAM
Google Cloud Platform
Amazon CloudWatch
Stackdriver
Amazon S3
Google Cloud Storage
GitLab
Kubernetes
Jenkins
dbt
MySQL
PostgreSQL
MongoDB
Cassandra
Hadoop
HBase
Gmail
Django
Databricks
Redis
Terraform
AWS CloudFormation
Java
Perl
AWS CloudTrail
TensorFlow
PyTorch
MLflow
scikit-learn
Kubeflow
Kafka
FastAPI
Prometheus
SQLAlchemy
Google Cloud Dataflow
Amazon Kinesis
Google Cloud Pub/Sub
Ansible
AWS Lambda
Azure SQL Database
pytest
OAuth2
Airflow
Apache Beam
SQL
