Qualifications:
- 9+ years of overall IT experience
- Minimum of 5+ years (preferred) managing Data Lakehouse environments; specific experience with Azure Databricks and Snowflake is a plus; DBT experience is nice to have.
- Hands-on experience with data warehousing, data lake/lakehouse solutions, data pipelines (ELT/ETL), SQL, Spark/PySpark, and DBT.
- Strong understanding of data modeling, SDLC, Agile, and DevOps principles.
- Bachelor’s degree in management/computer information systems, computer science, accounting information systems, or a related field.
Knowledge/Skills:
- Tools and Technologies: Azure Databricks, Apache Spark, Python, Databricks SQL, Unity Catalog, and Delta Live Tables.
- Understanding of cluster configuration and of compute and storage layers.
- Expertise with Snowflake architecture, including experience in its design, development, and evolution.
- Experience with system integration, data extraction and transformation, and the design of data quality controls.
- Familiarity with data science concepts, as well as MDM, business intelligence, and data warehouse design and implementation techniques.
- Extensive experience with the medallion architecture data management framework as well as Unity Catalog.
- Data modeling and information classification expertise at the enterprise level.
- Understanding of metamodels, taxonomies and ontologies, as well as of the challenges of applying structured techniques (data modeling) to less-structured sources.
- Ability to assess rapidly changing technologies and apply them to business needs.
- Ability to translate the contribution of information architecture to business outcomes into simple briefings for use by various data-and-analytics-related roles.