This position focuses on developing, implementing, and maintaining architecture solutions across a large enterprise data warehouse to support effective, efficient data management and enterprise-wide business intelligence and analytics.
Responsibilities
- Implement and optimize data pipeline architectures for sourcing, ingestion, transformation, and extraction processes, ensuring data integrity and compliance with organizational standards.
- Develop and maintain scalable database schemas, data models, and data warehouse structures; perform data mapping, schema evolution, and integration between source systems, staging areas, and data marts.
- Automate data extraction workflows and create comprehensive technical documentation for ETL/ELT procedures; collaborate with cross-functional teams to translate business requirements into technical specifications.
- Establish and enforce data governance standards, including data quality metrics, validation rules, and best practices for data warehouse design and architecture.
- Develop, test, and deploy ETL/ELT scripts using SQL, Python, Spark, or other relevant technologies; optimize code for performance and scalability.
- Tune data warehouse systems for query performance and batch processing efficiency; apply indexing, partitioning, and caching strategies.
- Perform advanced data analysis, validation, and profiling using SQL and scripting languages; develop data models, dashboards, and reports in collaboration with stakeholders.
- Conduct testing and validation of ETL workflows to ensure data loads meet SLAs and quality standards; document testing protocols and remediation steps.
- Troubleshoot production issues, perform root cause analysis, and implement corrective actions; validate data accuracy and consistency across systems.
Benefits
- Generous Paid Time Off
- 401(k) Matching
- Retirement Plan
- Relocation Assistance
