Role Description
A Hakkoda Data Architect leads architectural discussions and design exercises to build and automate large-scale data migrations with our clients and Hakkoda teams in the US and Costa Rica. The qualified candidate must have technical experience analyzing system requirements and implementing migration methods for existing data. The successful candidate will have hands-on experience migrating critical databases across multiple database platforms and must be comfortable developing database migration solutions that ensure clients' information is stored effectively and securely.
The role will enable both leadership and individual growth and development. You'd be joining a team of passionate, forward-thinking engineers and data scientists who are charting a new course. You will find a culture rich in collaboration and personal development, and one that recognizes and pursues a zest for life.
What We Are Looking For
We are looking for people experienced with data architecture and the design and development of database mapping and migration processes. This person will have direct experience optimizing new and existing databases and data pipelines, and implementing advanced capabilities while ensuring data integrity and security.
Ideal candidates will have strong communication skills and the ability to guide clients and project team members, acting as a key point of contact for direction and expertise.
Qualifications:
- 6+ years of proven work experience in data warehousing/BI/analytics, preferably in a consulting capacity
- 3+ years as an architect
- 3+ years of experience working with cloud platforms
- Bachelor's degree (BA/BS) in Computer Science, Information Systems, Mathematics, MIS, or a related field
- Understanding of migration, DevOps, and ETL/ELT ingestion pipelines, with tools such as DataStage, Informatica, or Matillion
- Project management skills and experience working with Scrum and Agile development methods
- Ability to develop insights that are both actionable and measurable; you should be able to create and leverage metrics to influence business decisions
- Previous consulting experience managing and supporting large-scale technology programs
Nice to Have:
- At least 6-12 months of experience working with Snowflake
- Understanding of Snowflake design patterns and migration architectures is a major plus
- Understanding of Snowflake roles and user security
- Understanding of Snowflake capabilities such as Snowpipe
- SQL scripting
- Cloud experience on AWS (Azure and GCP are nice to have as well)
- Python scripting is a plus
Benefits will vary by country:
- Medical, dental, vision, and life insurance
- Paid parental leave
- Paid time off
- Work from home benefits for 100% remote roles
- Technical training and certifications
- Robust learning and development opportunities