🚀 What We Do
- We build modern Machine Learning systems for demand planning and budget forecasting.
- We develop scalable data infrastructure that improves high-level decision-making, tailored to each client.
- We deliver comprehensive Data Engineering and custom AI solutions to optimize cloud-based systems.
- We use Generative AI to help e-commerce platforms and retailers create higher-quality ads, faster.
🌟 Our Partnerships
- Amazon Web Services
- Google Cloud
- Astronomer
- Databricks
- Kaszek
- Product Minds
- H2O.ai
- Soda
🌟 Our Values
- 📊 We are Data Nerds
- 🤗 We are Open Team Players
- 🚀 We Take Ownership
- 🌟 We Have a Positive Mindset
Responsibilities 🤓
- Collaborate with the team to define goals and deliver custom data solutions.
- Innovate with new tools to improve infrastructure and processes.
- Design and implement ETL processes, optimize queries, and automate pipelines.
- Own projects end-to-end, working directly with clients.
- Mentor and support team members, helping them deepen their knowledge in Data Engineering and model design.
- Lead data pipeline development and guide technical decisions.
- Act as a liaison between the team and stakeholders, ensuring alignment.
- Foster team autonomy while ensuring smooth collaboration.
- Contribute to technical research and evaluate emerging solutions for future projects.
- Focus on code quality—review, document, test, and integrate CI/CD.
- Participate in candidate interviews and evaluations to help Mutt grow.
Required Skills 💻
- Experience leading and mentoring teams in Data projects, guiding technical decisions, and fostering team growth.
- Proven ability to collaborate with stakeholders, translate business needs into technical solutions, and align priorities across teams.
- Experience in Data Engineering, including building and optimizing data pipelines.
- Strong knowledge of SQL and Python (Pandas, NumPy, Jupyter).
- Experience working with a major cloud provider (AWS, GCP, or Azure).
- Knowledge of Docker.
- Experience with orchestration tools like Airflow or Prefect.
- Familiarity with ETL processes and automation.
Nice to Have Skills 😉
- Experience with stream processing tools like Kafka Streams, Kinesis, or Spark.
- Knowledge of Kubernetes.
- Solid command of English for understanding and communicating technical concepts (Design Documents, etc.).
🎁 Perks
- Remote-first culture – work from anywhere! 🌍
- AWS and Databricks certifications fully covered 🚀
- Birthday off + an extra vacation week (Mutt Week! 🏖️)
- Referral bonuses – help us grow the team and get rewarded!
- Maslow: monthly credits to spend in our benefits marketplace.
- ✈️🏝️ Annual Mutters' Trip – an unforgettable getaway with the team!