Client: A well-known global professional services firm. They are starting a new data platform project focused on building and optimizing data transformation workflows with Microsoft technologies.
Project description: The project involves architecting and optimizing data transformation workflows using Microsoft Fabric, Delta Lake, and Apache Spark. The role focuses on providing technical expertise on Azure's data services to ensure efficient, reliable, and scalable data processing. Key tasks include designing data pipelines, integrating with various Azure services (e.g., Data Lake, Synapse, Data Factory), advising on best practices for data modeling and performance, and ensuring data security and compliance.
Stack: Microsoft Fabric, Delta Lake, Apache Spark, Azure services (Data Lake, Synapse, Data Factory, Purview), Python, Scala, Spark SQL, PySpark, Azure SDKs/APIs.
Team structure: The role involves close collaboration with a team of full-stack developers.
English: Excellent communication skills are required.
