Henry Izaar
@henryizaar
Senior Data Engineer with expertise in cloud data platforms.
What I'm looking for
I am a Senior Data Engineer with over 12 years of experience designing and optimizing cloud and on-premise data platforms. My expertise lies in building scalable ETL pipelines and real-time processing solutions with SQL, Python, Apache Spark, Kafka, and Flink. I have a proven track record of architecting data lakehouse solutions on AWS, Azure, and GCP, with a strong focus on delivering reliable data systems that power analytics, machine learning, and actionable business insights.
Throughout my career, I have led migrations of legacy data platforms to modern architectures, improving scalability and reliability. As Lead Data Integration Engineer at TechNova Solutions, I designed large-scale ETL pipelines and established data quality frameworks to ensure accuracy and consistency across our data. I am passionate about mentoring junior engineers and collaborating with cross-functional teams to deliver production-ready data models that drive business intelligence and reporting.
Experience
Work history, roles, and key accomplishments
Lead Data Integration Engineer
TechNova Solutions
Jan 2020 - Present (5 years 7 months)
Designed and implemented large-scale ETL pipelines using PySpark and Apache Airflow to process multi-terabyte datasets across cloud and on-premise environments. Led the migration of legacy data platforms to a modern data lakehouse architecture on AWS using Databricks and Delta Lake, improving scalability and reliability.
Senior Data Engineer
Databricks
Jan 2018 - Dec 2019 (1 year 11 months)
Developed scalable ETL pipelines using PySpark and Databricks notebooks to process large volumes of structured and semi-structured data. Designed and implemented data lake solutions on AWS integrating Delta Lake for efficient storage and versioned data management.
Data Engineer
Mavericks Labs
Sep 2013 - Dec 2017 (4 years 3 months)
Built and maintained ETL pipelines using Python and Apache Airflow to ingest and transform data from multiple APIs and databases. Designed relational and NoSQL data models in PostgreSQL and MongoDB to support analytics and application workflows.
Education
Degrees, certifications, and relevant coursework
National University of Science and Technology (NUST)
Bachelor of Science, Computer Science
Gained foundational knowledge in computer science principles and practices.