Deepak Koundel
@deepakkoundel
Results-driven Data Engineer with 10 years of experience.
What I'm looking for
I am a results-driven Data Engineer with a decade of experience in designing and building scalable data processing pipelines. My expertise lies in optimizing distributed workloads using Spark, automating workflows with Airflow, and collaborating effectively across cross-functional teams. I have a strong background in banking and financial services, particularly in risk and treasury reporting.
At Deutsche Bank, I developed and maintained the Projection Engine, a data ingestion and processing framework that significantly improved data handling efficiency. My contributions included creating a Python-based Airflow DAG for orchestrating complex workflows and optimizing Spark job performance, which reduced execution time by over 40%. I take pride in mentoring team members and leading initiatives that enhance team proficiency and operational reliability.
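To make that concrete, here is a minimal sketch of what such an Airflow orchestration DAG can look like. It assumes Airflow 2.x, and the DAG id, task names, schedule, and spark-submit commands are placeholders for illustration, not the actual Projection Engine workflow:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "data-engineering",   # hypothetical owner
    "retries": 2,
    "retry_delay": timedelta(minutes=10),
}

with DAG(
    dag_id="projection_engine_daily",   # placeholder DAG id
    start_date=datetime(2023, 1, 1),
    schedule_interval="0 2 * * *",      # daily at 02:00
    catchup=False,
    default_args=default_args,
) as dag:
    ingest = BashOperator(
        task_id="ingest_source_data",
        bash_command="spark-submit --master yarn jobs/ingest.py",
    )
    project = BashOperator(
        task_id="run_projections",
        bash_command="spark-submit --master yarn jobs/projections.py",
    )
    load = BashOperator(
        task_id="load_oracle_and_hive",
        bash_command="spark-submit --master yarn jobs/load.py",
    )

    # Linear dependency chain: ingest, then transform, then load.
    ingest >> project >> load
```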
Previously, at Citibank, I played a key role in migrating legacy ETL pipelines to a modern big data stack, ensuring high-throughput processing and flexibility. My technical skills span languages and technologies including SQL, Scala, Python, and cloud platforms such as GCP. I am passionate about leveraging these skills to drive impactful data solutions in dynamic environments.
Experience
Work history, roles, and key accomplishments
Data Engineer
Deutsche Bank
Nov 2022 - Present (2 years 7 months)
Developed and maintained Projection Engine (PE), a data ingestion and processing framework that processes data using Spark on a Hadoop cluster and loads it into Oracle and Hive. Optimized Spark job performance, reducing execution time from 9.5 hours to 5.5 hours by tuning configurations such as parallelism, executor memory, and executor cores.
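As an illustration of the kind of tuning involved, here is a minimal PySpark sketch. The application name and the specific values for executor memory, cores, and parallelism are placeholders; the right settings depend on cluster size and data volume:

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("projection-engine")                   # placeholder app name
    .config("spark.executor.memory", "8g")          # memory per executor
    .config("spark.executor.cores", "4")            # cores per executor
    .config("spark.executor.instances", "20")       # number of executors
    .config("spark.sql.shuffle.partitions", "400")  # shuffle parallelism
    .config("spark.default.parallelism", "400")     # RDD-level parallelism
    .enableHiveSupport()                            # required to write Hive tables
    .getOrCreate()
)
```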
Data Engineer
Citibank
Oct 2018 - May 2022 (3 years 7 months)
Contributed to an enterprise-scale migration project, migrating legacy ETL pipelines from Ab Initio and Oracle to a modern big data stack using Spark, Scala, and HBase on Hadoop. Built a scalable Spark-based ingestion framework to handle complex data inputs and load structured outputs into HBase, ensuring flexibility, maintainability, and high-throughput processing.
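Below is a minimal Python sketch of that ingestion pattern: read raw input with Spark, shape it into a structured schema, and write each partition to HBase. The original framework was built in Scala; the happybase Thrift client, input path, table name, and column family used here are illustrative assumptions, not the actual implementation.

```python
import happybase
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("hbase-ingestion-sketch").getOrCreate()

# Hypothetical raw input; the real sources and schema differ.
raw = spark.read.json("/data/raw/transactions/")

structured = (
    raw.withColumn("ingest_date", F.current_date())
       .select("txn_id", "account_id", "amount", "ingest_date")
)

def write_partition(rows):
    # One HBase connection per partition keeps the connection count bounded.
    conn = happybase.Connection("hbase-thrift-host")   # assumed Thrift endpoint
    table = conn.table("transactions")                 # assumed table name
    for row in rows:
        table.put(
            str(row["txn_id"]).encode(),
            {
                b"cf:account_id": str(row["account_id"]).encode(),
                b"cf:amount": str(row["amount"]).encode(),
                b"cf:ingest_date": str(row["ingest_date"]).encode(),
            },
        )
    conn.close()

structured.foreachPartition(write_partition)
```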
ETL Developer
Bitwise India
Sep 2014 - Sep 2018 (4 years)
Designed and developed a generic Ab Initio application for data warehousing, enabling automated data extraction, transformation, and loading into a Teradata-based warehouse. Utilized Ab Initio components like Reformat, Rollup, Scan, Join, and Lookup to build scalable ETL processes.
Education
Degrees, certifications, and relevant coursework
Pune University
Bachelor of Engineering, Engineering
Completed a Bachelor of Engineering degree with a focus on core engineering principles and problem-solving methodologies.