This role is for one of Weekday's clients.
Salary range: Rs 12,00,000 - Rs 14,00,000 (i.e., INR 12-14 LPA)
Min Experience: 5 years
Location: Remote (India)
Job Type: Full-time
We are looking for a skilled and detail-oriented Data Engineer to join our team. The ideal candidate will have strong experience across data engineering, data warehousing, and cloud platforms—particularly GCP. You'll play a key role in designing and managing efficient data pipelines, enabling scalable analytics, and supporting decision-making across the organization.
Requirements
Key Responsibilities
Data Engineering & Storage
- Write and optimize complex SQL queries, including joins and stored procedures, and run queries over certificate-authenticated database connections.
- Work with NoSQL databases such as Firestore, DynamoDB, or MongoDB.
- Design and maintain scalable data models and warehouses—experience with BigQuery is preferred; Redshift or Snowflake is also acceptable.
- Build and manage ETL/ELT pipelines using tools like Airflow, dbt, Kafka, or Spark.
- Script and automate data workflows using PySpark, Python, or Scala.
- Demonstrate strong hands-on experience with Google Cloud Platform (GCP) services.
Visualization & Analytics
- Develop dashboards and reporting solutions using tools like Google Looker Studio, LookML, Power BI, or Tableau.
- Collaborate with stakeholders to understand data needs and translate them into effective visualizations.
Good-to-Have
- Familiarity with Master Data Management (MDM) systems.
- Interest in or experience working with Web3 data and blockchain analytics.
Key Skills
- Languages & Scripting: Python, PySpark, Scala, SQL
- Databases: NoSQL (Firestore, DynamoDB, MongoDB), BigQuery, Redshift, Snowflake
- Data Tools: Airflow, dbt, Kafka, Spark
- Cloud: GCP (Google Cloud Platform)
- Visualization: Google Looker Studio, LookML, Power BI, Tableau
- Other: Data Warehousing, Data Modeling, ETL/ELT processes