Jagadish Reddy
@jagadishreddy
Senior Data Engineer with extensive experience in Big Data technologies.
What I'm looking for
I am a Senior Data Engineer with over 9 years of experience in Information Technology, specializing in Hadoop and Big Data processing. My expertise lies in implementing complex Big Data projects using technologies such as Apache Hadoop, Spark, and various Azure services. I have successfully migrated SQL databases to Azure Data Lake and developed data engineering pipelines that enhance data accessibility and usability.
Throughout my career, I have worked with industry leaders like JP Morgan Chase and Alaska Airlines, where I designed and implemented data solutions that improved system uptime and data processing efficiency. My hands-on experience with AWS and Azure cloud infrastructures, combined with my proficiency in data modeling and performance tuning, allows me to deliver high-quality results in fast-paced environments. I am passionate about leveraging data to drive business insights and support strategic decision-making.
Experience
Work history, roles, and key accomplishments
Senior Data Engineer
JP Morgan Chase
Sep 2022 - Present (2 years 8 months)
Designed technical solutions aligned with business and IT strategies, and developed applications for data transformation across enterprise analytics platforms. Built an ETL framework for data migration to the Azure cloud and created a new data quality check framework in Python.
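The framework itself isn't reproduced here; as a rough illustration only, a basic check of the kind such a Python data quality framework might run (sketched with PySpark; the table path and column names are hypothetical) could look like this:

```python
# Illustrative sketch only: a null/duplicate check of the kind a Python data
# quality framework might run before loading a table to Azure.
# The storage path and column names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

def run_basic_checks(df, key_cols, not_null_cols):
    """Return a dict of simple data quality metrics for a DataFrame."""
    results = {"row_count": df.count()}
    # Count duplicate business keys
    results["duplicate_keys"] = (df.groupBy(*key_cols).count()
                                   .filter(F.col("count") > 1).count())
    # Count nulls in each required column
    for c in not_null_cols:
        results[f"nulls_{c}"] = df.filter(F.col(c).isNull()).count()
    return results

df = spark.read.parquet("abfss://raw@example.dfs.core.windows.net/customers")  # hypothetical path
print(run_basic_checks(df, key_cols=["customer_id"], not_null_cols=["customer_id", "email"]))
```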
Data Engineer
Alaska Airlines
Apr 2020 - Aug 2022 (2 years 4 months)
Developed Scala Spark pipelines to transform raw data into Parquet files and load data from Hive into Amazon RDS. Worked extensively with AWS services such as S3, Glue, EMR, and Lambda to process data for downstream customers. Created CI/CD pipelines using Jenkins and Rundeck for job scheduling.
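For illustration only, here is a condensed PySpark version of that pattern (the original pipelines were written in Scala; the paths, table names, and connection details below are placeholders, not the actual configuration):

```python
# Illustrative PySpark sketch: convert raw files to Parquet, then push a Hive
# table to Amazon RDS over JDBC. All paths and credentials are placeholders.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("raw-to-parquet")
         .enableHiveSupport()
         .getOrCreate())

# Raw CSV on S3 -> partitioned Parquet
raw = spark.read.option("header", "true").csv("s3://example-raw/bookings/")
raw.write.mode("overwrite").partitionBy("flight_date").parquet("s3://example-curated/bookings/")

# Hive table -> Amazon RDS (MySQL) via JDBC
(spark.table("curated.bookings")
      .write.format("jdbc")
      .option("url", "jdbc:mysql://example-rds.amazonaws.com:3306/analytics")
      .option("dbtable", "bookings")
      .option("user", "etl_user")
      .option("password", "***")
      .mode("append")
      .save())
```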
Data Engineer
ADP
Aug 2018 - Apr 2020 (1 year 8 months)
Designed the Hadoop architecture and set up a multi-node cluster, and tuned the performance of Azure Data Factory pipelines. Extracted data from various sources using Sqoop and built a Data Discovery Platform on Azure HDInsight components. Developed ETL operations using Pig Latin scripts and Python, and created workflows in Talend.
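The Pig Latin scripts themselves aren't shown; purely as a sketch, the same load-filter-aggregate-store pattern expressed in PySpark, with hypothetical paths and fields, would look roughly like this:

```python
# Illustrative only: the LOAD / FILTER / GROUP / STORE pattern a typical
# Pig Latin ETL script performs, expressed here in PySpark.
# Source path, schema, and output location are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("payroll-etl").getOrCreate()

events = spark.read.json("hdfs:///data/raw/payroll_events/")      # LOAD
active = events.filter(F.col("status") == "ACTIVE")               # FILTER
summary = (active.groupBy("client_id")                            # GROUP ... GENERATE
                 .agg(F.sum("amount").alias("total_amount"),
                      F.count("*").alias("event_count")))
summary.write.mode("overwrite").parquet("hdfs:///data/curated/payroll_summary/")  # STORE
```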
Junior Software Developer
Geospatial Solutions
Jun 2014 - Jun 2015 (1 year)
Imported data from various formats to HDFS and ingested data from RDBMS sources using Sqoop. Developed real-time data ingestion and analysis using Kafka and Spark Streaming. Managed and scheduled jobs on a Hadoop cluster using Oozie and built NiFi dataflows.
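As a rough sketch of that ingestion pattern, here is a minimal example using the current Structured Streaming API to read from Kafka (the original work used the DStream-based Spark Streaming API; the brokers, topic, and paths are placeholders):

```python
# Illustrative sketch of real-time ingestion from Kafka with Spark Structured
# Streaming. Broker addresses, topic name, and paths are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

stream = (spark.readStream
               .format("kafka")
               .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
               .option("subscribe", "sensor-events")
               .load())

# Kafka delivers key/value as binary; cast the value to string before parsing
parsed = stream.select(F.col("value").cast("string").alias("payload"),
                       F.col("timestamp"))

query = (parsed.writeStream
               .format("parquet")
               .option("path", "hdfs:///data/streaming/sensor_events/")
               .option("checkpointLocation", "hdfs:///checkpoints/sensor_events/")
               .outputMode("append")
               .start())
query.awaitTermination()
```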
Education
Degrees, certifications, and relevant coursework
Northwestern University
Master of Science, Information Systems
2015 - 2017
Pursued a Master of Science degree focusing on Information Systems. Studied advanced topics in data management, system analysis, and IT strategy.
Jawaharlal Nehru Technological University (JNTU)
Bachelor of Science, Computer Science
2010 - 2014
Completed a Bachelor of Science program in Computer Science. Gained foundational knowledge in programming languages, data structures, and algorithms.
Tech stack
Software and tools used professionally
Azure HDInsight
Azure Synapse
Apache Spark
AWS Glue
Apache Hive
Talend
Azure Storage
Azure Repos
Jenkins
Pandas
PySpark
DB
Sqoop
MySQL
MongoDB
Cassandra
Hadoop
HBase
Gmail
Yarn
Azure DevOps
Jira
Java
JSON
AWK
XML
Kafka
Amazon SQS
Zookeeper
Linux
Azure Active Directory
The Hive
Avro
AWS Lambda
Azure Functions
Amazon RDS
Azure SQL Database
Airflow
Apache Ranger
Amazon Web Services (AWS)
SQL
Amazon SageMaker
Rundeck
ADP