Job Overview
7+ years of demonstrated skill and experience in the design, installation, monitoring, and troubleshooting of messaging infrastructure such as RabbitMQ and Kafka, in both self-hosted and cloud environments, is required.
Job Description:
• Stand up and administer on-premises and cloud-native Kafka clusters.
• Architect and maintain a reference architecture for Kafka implementation standards.
• Provide expertise in Kafka brokers, ZooKeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Control Center.
• Ensure optimal performance, high availability, and stability of solutions.
• Provide expertise and hands-on experience with Kafka Connect and Schema Registry in very high-volume environments.
• Administer and operate the Kafka platform, including provisioning, access control lists (ACLs), and Kerberos and SSL configuration.
• Provide expertise and hands-on experience with Kafka connectors such as MQ, Elasticsearch, JDBC, FileStream, and JMS source connectors, as well as tasks, workers, converters, and transforms.
• Provide expertise and hands-on experience building custom connectors using Kafka core concepts and the Connect API.
• Participate in design and capacity review meetings to provide suggestions on Kafka usage.
• Participate in work planning and estimation
• Create topics, set up redundant clusters, deploy monitoring tools and alerts, and apply best practices.
• Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms.
• Automate provisioning using tools such as Docker, Jenkins, and GitLab.
• Set up Kafka security; monitor, prevent, and troubleshoot security-related issues.
• Perform data-related benchmarking, performance analysis, and tuning.
• Work competently in one or more environments highly integrated with an operating system.
• Ability to manage tasks independently and take ownership of responsibilities
• Ability to communicate highly complex technical information clearly to audiences at all levels.
• Experience integrating Kafka with other technology products
• Understanding of cloud data streaming technologies (Kafka/ksqlDB, StreamSets, AWS Kinesis, Google Cloud Pub/Sub, etc.), stream processing, and event-driven architectures.
• Knowledge of techniques for ingesting data into, and extracting data from, Kafka-based or similar pipelines.
• Working knowledge of the Kafka REST Proxy.
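As an illustration of the Kafka Connect and Schema Registry duties above, a JDBC source connector is typically registered with a JSON configuration along the lines of the sketch below. The connector name, connection URL, column name, and Schema Registry address are hypothetical placeholders, not a prescribed setup:

```json
{
  "name": "example-jdbc-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "tasks.max": "2",
    "connection.url": "jdbc:postgresql://db-host:5432/appdb",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "appdb-",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://schema-registry:8081"
  }
}
```

In a distributed Connect deployment, a config like this is submitted to the Connect REST API (`POST /connectors`); the `value.converter` settings are what tie the connector to Schema Registry.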
RabbitMQ
• Experience in the installation, configuration, and administration of RabbitMQ in UNIX/cloud environments.
• Strong knowledge of message-oriented middleware concepts, including messaging styles (asynchronous, pub-sub) and messaging APIs (JMS, STOMP, AMQP, REST).
• Set up exchanges, queues, and virtual hosts.
• Implementation experience with clustering, security, and high availability of RabbitMQ nodes.
• Monitor RabbitMQ alarms.
• Troubleshoot and migrate RabbitMQ deployments.
• Experience and knowledge of cloud-managed brokers (Amazon MQ).
• Expertise in deploying and maintaining RabbitMQ in cloud environments.
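To make the pub-sub, exchange, and queue concepts above concrete, here is a minimal, broker-free Python sketch of AMQP-style topic routing, where a routing key is dot-separated words, `*` matches exactly one word, and `#` matches zero or more words. All class and variable names are illustrative, not part of any RabbitMQ client API:

```python
class TopicExchange:
    """Toy model of an AMQP topic exchange: routes messages to bound queues."""

    def __init__(self):
        self.bindings = []  # list of (binding pattern, queue) pairs

    def bind(self, pattern, queue):
        self.bindings.append((pattern, queue))

    def publish(self, routing_key, message):
        # Deliver the message to every queue whose binding matches the key.
        for pattern, queue in self.bindings:
            if matches(pattern, routing_key):
                queue.append(message)


def matches(pattern, key):
    """AMQP topic match: '*' = one word, '#' = zero or more words."""
    p, k = pattern.split("."), key.split(".")

    def rec(i, j):
        if i == len(p):
            return j == len(k)
        if p[i] == "#":
            # '#' may consume any number of remaining words, including none.
            return any(rec(i + 1, j2) for j2 in range(j, len(k) + 1))
        if j == len(k):
            return False
        if p[i] == "*" or p[i] == k[j]:
            return rec(i + 1, j + 1)
        return False

    return rec(0, 0)


# Usage: two queues bound with different patterns.
logs, alerts = [], []
exchange = TopicExchange()
exchange.bind("kern.#", logs)        # everything from the kernel
exchange.bind("*.critical", alerts)  # critical messages from any source

exchange.publish("kern.critical", "disk failure")  # reaches both queues
exchange.publish("app.info", "started")            # reaches neither
```

A real deployment would use a client library such as pika against an actual broker; the point here is only how topic bindings decide which queues receive a published message.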
Good to have:
• Experience with common issues on RHEL servers; strong verbal and written communication skills.
• Good scripting skills (Python, etc.); experience with Java
• Strong skills in in-memory applications and data integration.
• Working knowledge of Ansible and automation.
• Knowledge of Kubernetes is a plus.
• Working knowledge of Cluster management
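As a sketch of the Ansible automation mentioned above, provisioning a broker host might use a short playbook like the one below. The group name, package name, and service name are assumptions for illustration, not a prescribed setup:

```yaml
# Illustrative playbook; host group and package/service names are assumptions.
- hosts: brokers
  become: true
  tasks:
    - name: Install the RabbitMQ server package
      ansible.builtin.package:
        name: rabbitmq-server
        state: present

    - name: Ensure the broker is running and enabled at boot
      ansible.builtin.service:
        name: rabbitmq-server
        state: started
        enabled: true
```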