This role is for one of Weekday's clients
Min Experience: 3 years
Job Type: full-time
About the Role
We are looking for a Senior Backend Engineer with hands-on experience building and maintaining high-performance distributed backend systems. This role is ideal for someone who thrives on solving complex streaming-data problems and has a strong foundation in Java, Apache Flink, and Kafka Streams. You’ll be a critical part of our engineering team, helping to design, implement, and scale the real-time data pipelines and backend services that power our core products.
As a key contributor, you will work closely with cross-functional teams including data engineering, product, and DevOps to build solutions that are robust, scalable, and reliable.
Key Responsibilities
- Design, develop, and maintain real-time data processing systems using Apache Flink and Kafka Streams
- Implement backend services and APIs in Java, ensuring code quality, test coverage, and performance
- Collaborate with product managers and architects to translate business requirements into scalable backend solutions
- Optimize Kafka streaming infrastructure for performance, scalability, and fault tolerance
- Ensure the integrity and timeliness of data flowing through distributed systems
- Monitor, troubleshoot, and resolve production issues related to data streaming and backend services
- Mentor junior developers and promote backend engineering best practices
- Write unit and integration tests to ensure system reliability and code quality
- Participate in code reviews, design discussions, and technical decision-making
Required Skills and Experience
- 3+ years of backend development experience with a strong focus on Java
- Solid hands-on experience with Apache Flink, Kafka Streams, and Kafka-based streaming architectures
- Strong understanding of distributed systems, event-driven architectures, and stream processing principles
- Experience in developing and maintaining RESTful APIs and microservices
- Familiarity with common backend frameworks, dependency injection, and build tools (Maven/Gradle)
- Good understanding of message serialization formats like Avro, Protobuf, or JSON
- Experience with CI/CD pipelines, Git, Docker, and cloud-based infrastructure (AWS, GCP, or Azure)
- Excellent problem-solving and debugging skills
- Strong communication skills and the ability to collaborate with cross-functional teams
Nice to Have
- Experience with time-series data or event sourcing architectures
- Exposure to monitoring tools such as Prometheus, Grafana, or the ELK stack
- Familiarity with container orchestration platforms (Kubernetes)
- Knowledge of SQL or NoSQL databases used alongside streaming platforms
- Contributions to open-source projects or a relevant GitHub portfolio