Quantiphi is seeking a Senior Technical Architect - Databricks to solve enterprise data problems and develop solutions for migration, storage, and processing. As a Senior Technical Architect - Databricks at Quantiphi, you will have the opportunity to work with Fortune 500 companies and disruptive innovators in a research-driven environment with 60+ patents.
Requirements
- 12+ years of relevant experience building cloud-native, hybrid, or multi-cloud solutions, including Databricks and AWS.
- Experience with Databricks implementations, including developing data pipelines using PySpark.
- Experience with Databricks Workspaces, Notebooks, Delta Lake, and APIs.
- Expertise in building production-grade solutions using AWS (Redshift, S3, Glue, Lambda), PySpark, and Python, including data pipelines built with Glue, Airflow, or SageMaker.
- Hands-on experience working on large cloud-based migration workloads involving SQL and NoSQL databases.
- Experience migrating databases to AWS, with a strong ability to create roadmaps and architectures for executing migration workloads on AWS.
- Exposure to ETL tools and data warehouses.
- Experience in SQL and query optimization.
- Proficiency in SQL-based technologies (MySQL, Oracle, SQL Server, etc.).
- Experience in creation and maintenance of data dictionaries, metadata repositories, and data lineage documentation.
- Experience building and supporting large-scale systems in a production environment.
- Strong ability to mentor and manage teams of junior and senior data engineers and to lead end-to-end delivery of technical workloads.
- Experience using GitHub/CodeCommit for development activities.
- Experience designing and optimizing orchestration frameworks using Apache Airflow / Amazon MWAA including DAG optimization, dependency management, retries, monitoring, and operationalization.
- Experience with AWS services - S3, Redshift, Secrets Manager.
- Exposure to SageMaker Unified Studio (SMUS) and modern business catalog/governance implementations for metadata management, data discovery, and governed analytics access.
- Experience in implementing data integration projects using ETL.
- Experience using Airflow or AWS Step Functions for orchestration.
- Exposure to IaC tools like Terraform and to CI/CD tools.
- Prior experience migrating data from on-premises to the cloud and processing data at large scale.
- Experience implementing data lakes and data warehouses on the cloud.
- Experience implementing industry best practices.
Benefits
- Generous Paid Time Off
- 401k Matching
- Retirement Plan
- Visa Sponsorship
- Four Day Work Week
- Generous Parental Leave
- Tuition Reimbursement
- Relocation Assistance
