About Block Labs
Block Labs is a leading force in the Web3 space, incubating, investing in, and accelerating top-tier fintech, crypto, and iGaming projects. With a mission to shape the future of decentralized technology, we partner with visionary startups to raise funding, refine product-market fit, and grow their audiences.
As we continue to expand, we are looking for an ambitious and self-driven individual to join our rapidly growing team!
About The Role
We are seeking a highly skilled Data Engineer to lead the design, implementation, and maintenance of our unified analytical data warehouse on ClickHouse. The ideal candidate is experienced in building scalable data pipelines, working with modern orchestration tools, and leveraging AWS services for data storage and processing. You will collaborate closely with the data team to ensure data accuracy, reliability, and accessibility.
Key Responsibilities
Design, develop, and maintain a unified analytical data warehouse (DWH) on ClickHouse to support business intelligence and advanced analytics.
Participate in DWH architecture design.
Analyze and research current system functionality.
Update documentation based on findings.
Write automated tests.
Develop and manage workflows and data orchestration using tools like Apache Airflow.
Ensure seamless integration of AWS services (e.g., Redshift, S3, Lambda) and ClickHouse for data storage and transformation.
Implement CI/CD pipelines for data workflows to ensure quality and agility in deployments.
Monitor, debug, and resolve issues related to data pipelines and systems.
Maintain documentation for data infrastructure, processes, and best practices.
Required Skills and Experience:
Technical Expertise:
Python: Proficiency in Python for building and optimizing data pipelines.
Data Orchestration: Hands-on experience with tools like Apache Airflow.
CI/CD: Familiarity with CI/CD tools (e.g., GitHub Actions, GitLab CI/CD) for automating data workflows.
Data Modeling: Expertise in designing efficient and scalable data models (e.g., star/snowflake schemas).
General Skills:
Proven experience in building and maintaining scalable data warehouses and reliable ETL/ELT pipelines.
Strong SQL skills for querying and performance optimization.
Ability to work in an agile environment and adapt to evolving priorities.
Excellent problem-solving skills with attention to detail.
Effective communication and collaboration skills to work with technical and non-technical stakeholders.
Preferred Qualifications:
Cloud Platforms: experience with AWS services, including but not limited to Redshift, S3, Lambda, and Glue.
ClickHouse: Practical experience in working with ClickHouse for data processing and analysis.
Experience building reports in Tableau or Power BI.
Experience with data governance, quality frameworks, and security best practices.
What kind of culture can I expect?
Mature, mission-driven, and low-ego. We value clarity over noise, outcomes over theatrics, and pace without chaos. If you’re one of the smartest minds in your craft and want to build with other experts, you’ll feel at home here.