Sneha Sabale
@snehasabale
I’m a Data Engineer specializing in scalable ELT pipelines, incremental loading, and Snowflake/dbt optimization.
What I'm looking for
I’m a Data Engineer with 4+ years of experience building enterprise data pipelines across traditional ETL (Informatica BDM, CDC) and modern ELT frameworks (dbt on Snowflake). I focus on delivering analytics-ready datasets with strong performance and reliable data quality.
At Philip Morris International, I designed scalable ELT pipelines with dbt on Snowflake using a layered architecture (staging, intermediate, marts). I built reusable dbt macros and modular transformation models to standardize logic, reduce code redundancy, and improve maintainability.
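A reusable dbt macro of the kind described above might look like this minimal sketch; the macro, model, and column names here are hypothetical illustrations, not taken from the actual project:

```sql
-- macros/to_usd.sql  (hypothetical macro: centralizes a currency conversion)
{% macro to_usd(amount_col, rate_col) %}
    round({{ amount_col }} * {{ rate_col }}, 2)
{% endmacro %}

-- usage in a staging model, e.g. models/staging/stg_sales.sql
select
    sale_id,
    {{ to_usd('local_amount', 'fx_rate') }} as amount_usd
from {{ source('pos', 'sales') }}
```

Centralizing the conversion in one macro means a change to the rounding or rate logic propagates to every model that calls it, which is what reduces redundancy across a layered project.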
I also optimized complex Snowflake SQL queries, improving performance by ~30–40%, and implemented incremental loading strategies for efficient processing of large datasets. I integrated CI/CD workflows using GitHub and Bitbucket, and developed dbt data quality tests and documentation for reliability and lineage transparency.
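An incremental dbt model of the kind mentioned here typically follows this standard pattern; the table and column names are illustrative, not from the actual pipelines:

```sql
-- models/marts/fct_orders.sql  (illustrative incremental model)
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_total,
    updated_at
from {{ ref('stg_orders') }}

{% if is_incremental() %}
  -- on incremental runs, only process rows newer than what is
  -- already in the target table, instead of a full rebuild
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

On the first run dbt builds the full table; on subsequent runs the `is_incremental()` filter restricts the scan to new or changed rows, which is what makes large Snowflake datasets cheap to refresh.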
In prior work at Bank of Baroda, I built ETL pipelines integrating Kafka, flat files, and relational sources into the Hadoop ecosystem, and implemented near real-time CDC using Informatica PowerExchange and Kafka. I was recognized as “Outstanding Performer – November 2025” (Accenture at PMI) for stabilizing data pipelines and proactively mitigating production risks.
Experience
Work history, roles, and key accomplishments
Designed and implemented scalable ELT pipelines using dbt on Snowflake for Philip Morris International, using layered architecture (staging, intermediate, marts) and reusable macros. Optimized Snowflake SQL for 30–40% performance gains, implemented incremental loading, and built dbt data quality tests and CI/CD deployments via GitHub and Bitbucket.
Data Engineer
Bank of Baroda
Jun 2023 - Feb 2025 (1 year 8 months)
Built and maintained ETL pipelines integrating Kafka, flat files, and relational sources into the Hadoop ecosystem. Implemented near real-time CDC ingestion using Informatica PowerExchange and Kafka, and optimized Hive/Impala queries to improve execution performance and resource utilization.
Associate Data Engineer
Bank of Baroda
Oct 2021 - May 2023 (1 year 7 months)
Configured Informatica CDC with Kafka for real-time data capture by registering source tables and mapping Kafka objects into ingestion workflows. Automated batch workflows with shell scripting to improve scheduling efficiency and supported system maintenance through upgrades, patching, and issue resolution.
Education
Degrees, certifications, and relevant coursework
Terna Engineering College
Bachelor of Engineering, Computer Engineering
2018 - 2021
Completed a B.E. in Computer Engineering at Terna Engineering College under Mumbai University.
Government Polytechnic, Mumbai
Diploma in Computer Engineering, Computer Engineering
2015 - 2018
Completed a Diploma in Computer Engineering at Government Polytechnic, Mumbai.
