Streamline Digital

Sr. Data Engineer - Snowflake


Sr. Data Engineer

Who We Are
At Streamline, we are experts in Enterprise Mobility, Product Engineering, and IT Transformation. We help organizations navigate the constantly evolving IT landscape, with a singular focus on arming our clients with the strategies, products, and solutions that transform their businesses. Streamline works closely with clients, takes pride in developing genuine relationships, and embraces open communication and collaboration. We become part of our clients' teams, working together to achieve short-term goals and enable long-term success. Our team is composed of world-class strategists, architects, engineers, and developers.

With our flagship product, iEnterprise, we are taking things to the next level: using our collective experience and customer input, we are building new enterprise mobility management products that reduce operational costs, prevent issues before they occur, and resolve them faster than traditional tools and approaches.

Role Summary

This is a full-time remote position. The Senior Data Engineer designs, builds, and optimizes data pipelines that move, transform, and load data into Snowflake using Azure services and serverless components. The role focuses on production-grade engineering: automating data quality checks, improving reliability, and continuously optimizing cloud infrastructure costs.

Role Responsibilities
  • Design, develop, and deploy Azure Functions and broader Azure data services to extract, transform, and load data into Snowflake data models and marts.
  • Implement automated data quality checks, monitoring, and alerting to ensure accuracy, completeness, and timeliness across all pipelines.
  • Optimize workloads to reduce cloud hosting costs, including right-sizing compute, tuning queries, and leveraging efficient storage and caching patterns.
  • Build and maintain ELT/ETL workflows and orchestration to integrate multiple internal and external data sources at scale.
  • Design data pipelines that support both near real-time streaming data ingestion and scheduled batch processing to meet diverse business requirements.
  • Collaborate with engineering and product teams to translate requirements into robust, secure, and highly available data solutions.

Qualifications & Skills
  • Strong expertise with the Azure data stack (e.g., Azure Functions, Azure Data Factory, Event/Service Bus, Azure Storage) and Snowflake for analytical workloads.
  • Proven experience designing and operating production data pipelines, including CI/CD, observability, and incident response for data systems.
  • Advanced SQL and performance tuning skills, with experience optimizing transformations and Snowflake queries for cost and speed.
  • Solid programming experience in Python or similar for building reusable ETL components, libraries, and automation.
  • Experience with streaming and batch ingestion patterns (e.g., Kafka, Spark, Databricks) feeding Snowflake.
  • Familiarity with BI and analytics tools (e.g., Power BI, Grafana) consuming Snowflake data models.
  • Background in DevOps practices, including containerization, CI/CD pipelines, and infrastructure-as-code for data platforms.
  • Experience with modern data transformation tools (e.g., dbt) and data observability platforms for monitoring data quality, lineage, and pipeline health.

Additional Requirements
  • Ability to adapt to a fast-paced and dynamic work environment.
  • Self-motivated and able to work independently with minimal supervision, taking initiative to drive projects forward.
  • Expert-level problem-solving skills with the ability to diagnose complex data pipeline issues and architect innovative solutions.
  • Proven ability to integrate and analyze disparate datasets from multiple sources to deliver high-value insights and drive business impact.
  • Strong attention to detail.
  • Proven ability to manage multiple priorities and deadlines.
  • Passionate about staying current with emerging data engineering technologies and best practices, driving innovation to enhance product capabilities and maintain competitive advantage.
  • Experience developing and architecting SaaS platforms with a focus on scalability, multi-tenancy, and cloud-native design patterns.

What We Offer

  • A challenging and rewarding role in a dynamic and international environment.
  • Opportunity to be part of a growing company with a strong commitment to innovation and excellence.
  • A supportive and collaborative team culture that values personal growth and development.
  • Competitive compensation and benefits package.

About the job

  • Job type: Full Time
  • Experience level: Mid-level
  • Location requirements: Remote (Work from Home)
  • Hiring timezones: India +/- 0 hours
