Sairam Patturi
@sairampatturi
Experienced Data Engineer with expertise in Big Data technologies.
What I'm looking for
With over 10 years of IT experience in Data Engineering, I have honed my skills in Big Data technologies, particularly within the Hadoop ecosystem. My expertise spans various industries, including banking, healthcare, and manufacturing, where I have designed and implemented robust data ingestion, processing, and analytics pipelines. I am proficient in tools such as Azure Databricks, Talend, and PySpark, which have enabled me to deliver efficient data solutions that meet complex business requirements.
Throughout my career, I have demonstrated a strong ability to translate business needs into scalable data architectures. My hands-on experience with Azure cloud services, including Azure Data Factory and Azure Data Lake Storage, has allowed me to lead data migration efforts and optimize ETL workflows. I take pride in my collaborative approach and my capacity to adapt to new technologies, ensuring that I remain at the forefront of the rapidly evolving data landscape.
Experience
Work history, roles, and key accomplishments
Senior Data Engineer
Strategic System Inc.
Jul 2025 - Present (2 months)
Developed robust ETL data ingestion pipelines from diverse sources into Azure Data Lake Storage (ADLS) using Azure Data Factory (ADF) and Databricks. Built scalable and reusable ETL/ELT workflows leveraging PySpark, Scala, SQL, and Talend to transform and process high-volume structured and semi-structured datasets efficiently.
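As an illustration of this kind of pipeline, below is a minimal PySpark sketch of an ADLS-to-Delta ingestion step. The storage account, container, path, and table names are hypothetical, and it assumes a Databricks runtime where a SparkSession named spark is already available; it is not the actual production code.

# Minimal illustrative sketch: land raw CSV files from ADLS into a Delta table.
# Storage account, container, path, and table names below are hypothetical.
from pyspark.sql import functions as F

raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/sales/2025/"

raw_df = (spark.read
          .option("header", "true")
          .csv(raw_path))

cleaned_df = (raw_df.dropDuplicates()
              .withColumn("ingested_at", F.current_timestamp()))

(cleaned_df.write
 .format("delta")
 .mode("append")
 .saveAsTable("bronze.sales_raw"))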
Senior Data Engineer
Wells Fargo
Oct 2021 - Jun 2025 (3 years 8 months)
Designed and implemented ETL data ingestion pipelines from various sources into Azure Data Lake Storage (ADLS) using Azure Data Factory (ADF) and Databricks. Developed scalable and reusable ETL/ELT workflows using PySpark, Scala, SQL, and Talend to transform and process high-volume structured and semi-structured datasets.
Data Engineer
Micron
Sep 2019 - Sep 2021 (2 years)
Designed and built ETL data ingestion pipelines from multiple sources using Azure Data Factory and Databricks. Led large-scale data migration and modernization projects to Snowflake and Databricks, leveraging Snowpipe, Delta Lake time travel, RBAC, and advanced data modeling techniques.
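To illustrate the Delta Lake time travel mentioned above, the sketch below shows one way a migration step could be reconciled against an earlier table version. The table name and version number are hypothetical, and a Spark session with Delta Lake support is assumed.

# Illustrative reconciliation check using Delta Lake time travel.
# The table name and version number are hypothetical.
current_df = spark.read.table("analytics.orders")
previous_df = spark.sql("SELECT * FROM analytics.orders VERSION AS OF 10")

# Quick sanity check: compare row counts between the current table
# and the earlier version captured before the migration step.
print(current_df.count(), previous_df.count())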
Software Engineer
CGI
Feb 2016 - Sep 2019 (3 years 7 months)
Involved in requirement analysis, specification documentation, and model design, and developed code for business cases. Responsible for importing data from RDBMS into the Hadoop Distributed File System (HDFS) using Sqoop, then analyzing the imported data using Spark and Scala.
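The original analysis was written in Spark with Scala; as a rough PySpark equivalent, the sketch below reads Sqoop-imported files from HDFS and aggregates them. The HDFS path, file format, and column names are hypothetical.

# Illustrative PySpark sketch: analyze RDBMS data previously imported to HDFS
# via Sqoop. The HDFS path, format, and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sqoop_import_analysis").getOrCreate()

orders = spark.read.parquet("hdfs:///user/etl/sqoop/orders/")

daily_totals = (orders.groupBy("order_date")
                .agg(F.sum("amount").alias("total_amount")))

daily_totals.show()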
Education
Degrees, certifications, and relevant coursework
Unknown
Bachelor of Technology, Computer Science and Engineering
Completed a Bachelor of Technology in Computer Science and Engineering. This program provided a strong foundation in computer science principles and engineering practices.
Tech stack
Software and tools used professionally
Amazon API Gateway
Amazon Redshift
Splunk
Azure Synapse
Apache Spark
AWS Glue
Talend
Amazon CloudWatch
Amazon S3
AWS Step Functions
GitHub
Jenkins
GitHub Actions
PySpark
dbt
DB
Sqoop
MySQL
PostgreSQL
MongoDB
Cassandra
Hadoop
Gmail
Yarn
Databricks
AWS CloudFormation
Visual Studio
PyCharm
Azure DevOps
Jira
Java
AWS CloudTrail
Kafka
Amazon SQS
Apache NiFi
Grafana
Amazon Kinesis
AWS Lambda
Azure Functions
Amazon RDS
SSO
Time Analytics
SQL
Apache Iceberg