Datasite and its associated businesses help companies around the globe create economic value, from data rooms to AI deal sourcing and more. Here you'll find the finest technological pioneers: Datasite, Blueflame AI, Firmex, Grata, and Sherpany. Together, they define the future of business growth.
Apply for one position or as many as you like. Talent doesn't always go in one direction or fit in a single box. We're happy to see whatever your superpower is and find the best place for it to flourish.
Get started now; we look forward to meeting you.
Job Description:
Sherpany by Datasite is the leading Swiss meeting management solution, designed to meet the unique needs of board, board committee, and executive meetings. Our solution streamlines the entire meeting process, making meetings more productive and thus enhancing company performance. Our customers include well-known medium to large companies across all industries, such as Axpo, Raiffeisen Bank, and Calida Group. More than 400 companies already use Sherpany.
We’ve come a long way since 2011. Sherpany is now a team of 150 talented individuals, working from all around the world. Our culture is rooted in trust and responsibility, and we’re proud of the productive and healthy nature of our work environment.
About the role:
We’re looking for a hands-on data professional to support ongoing data pipeline and modeling work within our data warehouse. You’ll collaborate closely with the analytics team to maintain and improve existing workflows, data quality, dbt models, KPI definitions, and documentation practices.
Please note this is a temporary nine-month contract with the possibility of extension after this period.
Key Responsibilities
Maintain and optimize end-to-end ELT pipelines to ensure reliable, well-structured data flows into the warehouse.
Develop, test, and maintain scalable data models following modern data modeling and best practices.
Support data validation, testing, documentation, and governance across environments.
Contribute to metadata and lineage tracking, as well as standards for KPI modeling and data quality.
Requirements
Strong proficiency in SQL and Python (PySpark preferred).
Solid experience with dbt Core and GitHub-based workflows (pull requests, CI/CD).
Familiarity with the Azure data ecosystem and Delta Lake architecture.
Experience with ELT/ETL tools (Airbyte, ADF, or similar, in addition to dbt).
Detail-oriented, collaborative, and comfortable working in a remote setup.
Nice to Have
Experience building or maintaining Tableau data sources and dashboards.
Understanding of data governance, documentation, and cataloging practices (e.g., Collibra, dbt docs).
Exposure to master data management tools (Reltio).
Our company is committed to fostering a diverse and inclusive workforce where all individuals are respected and valued. We are an equal opportunity employer and make all employment decisions without regard to race, color, religion, sex, gender identity, sexual orientation, age, national origin, disability, protected veteran status, or any other protected characteristic. We encourage applications from candidates of all backgrounds and are dedicated to building teams that reflect the diversity of our communities.
