Himalayas

Data Architect #6632

1950Labs
United States only


Client Description

The client is a global organization in the tourism industry, offering river, ocean, and expedition cruises for passengers worldwide and operating a large fleet of vessels.

The company is currently undergoing an extensive cloud data modernization and unification program. We support them across data architecture, BI, migration, and data platform development.

A key focus area is the migration to Databricks Unity Catalog, including:

  • Migrating all data layers (landing, raw, prepared, reporting, services) from Hive Metastore to Unity Catalog
  • Migrating DLT (Delta Live Tables) and Python/SQL jobs into Databricks
  • Migrating pipelines in Azure Synapse/ADF
  • Rebuilding and adapting metadata frameworks
  • Standardizing access, lineage, governance, and overall Lakehouse structure
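One concrete change behind the migration listed above is that Unity Catalog replaces the Hive Metastore's two-level `schema.table` naming with a three-level `catalog.schema.table` namespace, which a migration framework typically handles with a small mapping step. A minimal sketch (the layer-to-catalog mapping and table names here are hypothetical, not the client's):

```python
# Sketch: map legacy two-level Hive Metastore names onto Unity Catalog's
# three-level namespace (catalog.schema.table), one catalog per data layer.
# The catalog names below are illustrative assumptions.

LAYER_CATALOGS = {
    "landing": "lakehouse_landing",
    "raw": "lakehouse_raw",
    "prepared": "lakehouse_prepared",
    "reporting": "lakehouse_reporting",
}

def to_unity_catalog_name(hive_name: str, layer: str) -> str:
    """Map 'schema.table' (Hive Metastore) to 'catalog.schema.table' (UC)."""
    schema, table = hive_name.split(".")
    return f"{LAYER_CATALOGS[layer]}.{schema}.{table}"

# Example: a raw-layer bookings table
print(to_unity_catalog_name("bookings.passenger_manifest", "raw"))
# -> lakehouse_raw.bookings.passenger_manifest
```

In practice this mapping would feed `CREATE TABLE ... AS SELECT` or `SYNC` statements per table; the point is that the rebuilt metadata framework must carry the extra catalog level everywhere.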

The client has very high technical expectations and is looking for top-level specialists capable of leading complex architectural initiatives.

Technical Requirements

  • Advanced knowledge of Microsoft Azure (data infrastructure, networking, authorization, cloud design)
  • Experience with Azure Synapse (especially Synapse Serverless and pipelines)
  • Strong expertise in Databricks (DLT, workflows, workspace administration)
  • Ability to design Data Lakehouse architectures (Medallion Architecture, Metadata-Driven ETL)
  • Strong Python skills, including code optimization
  • Strong SQL skills, including query optimization and SQL Server experience
  • Experience with Apache Spark (data processing workflows)
  • Experience building ETL/ELT processes and data warehouses
  • Experience with CI/CD processes (Azure DevOps, Git, branching strategies)
  • Experience implementing logging, monitoring, and optimization of data processes
  • Familiarity with Power BI and analytics workflows
  • Strong communication skills and documentation ability
  • Experience as a Lead Engineer (technical leadership, decision-making, stakeholder collaboration)

Scope of Responsibilities

  • Design and develop data architecture in Azure and Databricks environments
  • Participate in the Unity Catalog transformation (infrastructure, pipelines, frameworks, standards)
  • Migrate and modernize ETL/ELT processes (DLT, Python/SQL jobs, Synapse/ADF pipelines)
  • Design and implement Data Lakehouse solutions using Medallion architecture
  • Optimize data processing workflows (Python, SQL, Spark)
  • Build CI/CD processes and automation in Azure DevOps
  • Implement standards for logging, monitoring, and data quality
  • Lead the project from a technical perspective (Lead Engineer role)
  • Collaborate with business and technical stakeholders
  • Document solutions and mentor team members
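The metadata-driven ETL pattern named in the responsibilities above can be sketched as pipeline steps declared as data, with a small driver expanding them into an ordered plan across the Medallion layers. The config format and table names below are hypothetical, purely for illustration:

```python
# Minimal sketch of metadata-driven ETL across Medallion layers
# (bronze -> silver -> gold). The metadata schema is an assumption.

PIPELINE_METADATA = [
    {"source": "bronze.bookings", "target": "silver.bookings",
     "transform": "clean"},
    {"source": "silver.bookings", "target": "gold.revenue_by_route",
     "transform": "aggregate"},
]

def build_plan(metadata: list[dict]) -> list[str]:
    """Expand declarative step metadata into ordered, readable steps."""
    return [
        f"{step['transform']}: {step['source']} -> {step['target']}"
        for step in metadata
    ]

for step in build_plan(PIPELINE_METADATA):
    print(step)
```

In a real framework each step would dispatch to a parameterized Spark or DLT job rather than a formatted string; declaring steps as metadata is what lets new tables be onboarded without writing new pipeline code.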

About the job

Job type: Full Time

Hiring timezones: United States +/- 0 hours