Nikhil Yadav
@nikhilyadav4
Senior Data Engineer specializing in scalable ETL, cloud data platforms, and Lakehouse architectures.
What I'm looking for
I am a Senior Data Engineer with over six years of experience building scalable ETL and ELT pipelines across cloud and on-prem environments. I specialize in Databricks, Snowflake, PySpark, dbt, and orchestration with Airflow to deliver reliable data platforms.
I've led large-scale migrations and built Lakehouse architectures, moving enterprise data warehouses from Teradata to Snowflake with zero data loss and minimal downtime. I have implemented CI/CD for DAGs and ETL workflows, improved pipeline throughput, and reduced execution times through performance tuning and parallelization.
My hands-on work spans Azure, AWS, and GCP, where I design data models, data marts, and dimensional schemas and integrate RDBMS and NoSQL sources for analytics and ML feature generation. I have strong experience in SQL tuning, SnowSQL, data governance, and implementing RBAC and security controls for production systems.
I collaborate closely with data scientists, analysts, and business stakeholders to translate requirements into robust pipelines, observability, and self-service BI solutions using Power BI, Tableau, and Databricks SQL. I focus on maintainability, automated testing, and operational monitoring to ensure data quality and timely delivery.
Experience
Work history, roles, and key accomplishments
Led design and implementation of scalable ETL/ELT pipelines and Lakehouse architectures using Databricks, Snowflake, and Azure, reducing processing times by up to 50% and improving pipeline reliability and observability.
Data Engineer
Sobeys
Jan 2021 - Nov 2024 (3 years 10 months)
Developed and optimized Spark/Delta Lake data pipelines and streaming solutions, and automated DAG templates in Astronomer Airflow, reducing onboarding effort by 30% and enabling real-time ingestion with Snowpipe.
Built Hadoop and Azure-based ETL solutions, migrated on-prem Teradata workloads to Snowflake and Azure Synapse, and developed MapReduce and Spark jobs to support analytics and fraud-detection use cases.
Education
Degrees, certifications, and relevant coursework
Gitam University
Bachelor of Engineering, Computer Science Engineering
Tech stack
Software and tools used professionally
Splunk
Apache Spark
AWS Glue
Talend
Microsoft Azure
Azure Storage
GitHub
GitLab
Kubernetes
Jenkins
GitHub Actions
GitLab CI
NumPy
Pandas
PySpark
AWS Data Pipeline
dbt
DB
Sqoop
MySQL
PostgreSQL
MongoDB
Cassandra
Hadoop
HBase
Gmail
Yarn
Databricks
Jira
JavaScript
Java
JSON
Perl
PowerShell
XML
scikit-learn
Kafka
Azure Monitor
Linux
Windows
Datadog
GraphQL
ws
Apache Hive
Serverless
Amazon RDS
SSO
Airflow
Time Analytics
GraphDB
Compiled
SQL
AWS KMS
Delta Lake
Trino
dbt Cloud
Bash
Transform
Jasper
Enhance
Make