The role involves working with high-volume transactional datasets, improving data quality, and enabling data-driven enhancements in operational and analytical systems.
Responsibilities
- Perform data profiling, cleansing, quality checks, and exploratory analysis on complex transactional datasets
- Collaborate with product and technical teams to define and refine rules and logic used in data-driven workflows
- Translate findings into actionable insights and prepare reports for both technical teams and business stakeholders
- Use Python and Git for data pipeline management, workflow automation, and documentation
- Coordinate with engineering and data science teams, including integration of new data sources and tools (experience with GCP/BigQuery is a plus)
Requirements
- Master's degree in Data Analysis or a related discipline
- 5+ years of post-master's professional experience in data analysis
- Demonstrated expertise in data profiling, validation, and quality assurance
- Experience handling and analyzing structured transactional data
- Proficiency in Python and Git for data processing and collaboration
- Strong communication skills in English (written and verbal)
- Familiarity with GCP/BigQuery and/or financial sector data systems is a plus
English
- B2+ (Upper-Intermediate or higher)
Type of Work
- Remote
- Full-Time
Time zone
- Central European Time