Responsible for:
- Creating, implementing, and maintaining cloud architecture for clients
- Migrating systems to hybrid or cloud-based solutions
- Troubleshooting and optimizing clients’ cloud infrastructure
- Working with developers to support and maintain application lifecycles
- Assisting with the design and implementation of new cloud architecture that follows best-practice guidelines
- Working with Project Managers and clients to provide recommendations and technology roadmaps to meet their business needs
- Actively participating in team meetings and cross-functional interactions
- Keeping team members and supervisors informed of progress and issues
- Actively contributing to client project status meetings
- Contributing to R&D projects to validate or invalidate new service offerings
- Mentoring and guiding members of the delivery team in a technical and non-technical capacity
- Remaining current with technology trends and new technologies
- Proposing standards for adopting and integrating the latest technologies
How you will be successful:
- Living and breathing the “cloud-first” approach
- Applying strong analytical thinking to solve complex business problems
- Obsessively delivering amazing customer experiences
- Becoming a subject matter expert in one or more cloud platforms within 9-12 months
- Building trusting relationships with team members and collaborating departments
- Comfortably pushing boundaries and technical limits (maintaining technical aptitude and knowledge of the industry)
- Always learning
What experience you need:
- 5+ years professional IT experience
- 2+ years professional AWS or Azure experience
- At least one professional-level AWS or Azure certification
- Experience with programming languages such as Python, R, Java, or Scala to build data pipelines, perform data analysis, and create machine learning models
- Experience with SQL and NoSQL databases such as PostgreSQL, MySQL, MongoDB, or Cassandra to store and query large datasets
- Data modeling skills: the ability to design conceptual, logical, and physical data models
- Understanding of dimensional modeling, star schemas, and data warehouses
- Experience with ETL (Extract, Transform, Load) tools such as Informatica, Talend, or Pentaho to integrate and move data between systems
- Experience with big data frameworks such as Hadoop, Spark, or Kafka for distributed data processing and building data lakes
- Experience with machine learning frameworks such as TensorFlow, PyTorch, Keras, or scikit-learn for building ML models
- Knowledge of data architecture patterns such as lambda and kappa architectures, and the ability to design scalable, flexible data pipelines