I am looking for a Data Engineer role.
Nandini Wadaskar
@nandiniwadaskar
Senior Data Engineer | Snowflake SQL, Python, DBT, Databricks, Azure, AWS | Docker, Apache
What I'm looking for
Data Engineer | Expertise in Scalable Data Solutions, Automation, and Analytics
As a results-driven Data Engineer, I specialize in designing efficient ETL pipelines, automating workflows, and delivering actionable insights with SQL, Python, PySpark, and modern data engineering frameworks. My work helps businesses get more value from their data and make better-informed decisions.
Professional Highlights:
Data Integration: Led large-scale data migrations with tools like Alteryx, SSMS, and Azure SQL, ensuring accuracy and governance.
Workflow Automation: Reduced processing times by 70% through Python-based automation, boosting departmental efficiency.
Scalable Pipelines: Built robust pipelines with JAMS, Databricks, and Azure Data Factory, enabling seamless data transformation.
Data Visualization: Designed dashboards with Power BI, Tableau, and Looker, enhancing data accessibility and improving decision-making by 80%.
Technical Skills:
Data Engineering: Snowflake, DBT, Databricks, Azure, AWS, Airflow, Spark SQL, PySpark
Visualization: Power BI, Tableau, Looker, QuickSight
Automation & Scripting: Python, SQL, Alteryx, Bash
Other Tools: Docker, Streamlit, GitHub, Soda
Key Projects:
Credit Automation (SQL-Python): Developed a credit tracking system, reducing financial errors by 75%.
Government ID ML Project: Built a computer vision solution to extract Aadhaar card data using YOLOv8, OCR, and GCP Vertex AI.
YouTube Trends Pipeline: Analyzed trends using AWS Glue, Athena, and QuickSight, showcasing expertise in cloud data engineering.
Warehousing Model (Python): Created a pricing strategy optimizing warehouse profitability via cubic foot analysis.
Experience
Work history, roles, and key accomplishments
Data Engineer
Silicon Labs
Mar 2024 - Present (1 year 2 months)
- Migrated legacy systems to BMS by validating and correcting data with Alteryx and SQL, ensuring accuracy and quality.
- Processed 7M+ records and built data sources for dashboards using orchestration tools.
- Transitioned Alteryx workflows to Python pipelines orchestrated with JAMS to improve efficiency.
- Executed a POC with PySpark, creating jobs, dashboards, and scripts.
- Developed Python scripts for operations, reducing processing time by 70% across six major projects, including cost calculators and ETL workflows.
- Designed Azure pipelines for seamless data flow.
- Resolved 600+ tickets and delivered 100+ reports with an 86% SLA success rate.
Education
Degrees, certifications, and relevant coursework
Lovely Professional University
MCA, Data Science
2022 - 2024
Grade: 8.3
Tech stack
Software and tools used professionally
Postman
AWS CLI
Snowflake
Fivetran
Apache Spark
Tableau
Looker
Plotly.js
Amazon EC2
Microsoft Azure
GitHub
Jupyter
NumPy
Pandas
PySpark
MySQL Workbench
dbt
MySQL
PostgreSQL
Databricks
OpenCV
Jira
React
Python
HTML5
Java
C++
Visual Basic
Streamlit
Google Workspace
Microsoft Office 365
Confluence
Microsoft Excel
Azure SQL Database
Visual Studio Code
GitHub CLI
Git
Airflow
T-SQL
Amazon Web Services (AWS)
Microsoft Power BI
SciPy
Docker Machine
Alteryx Designer
Portfolio
tinyurl.com/3cdmns8n
