Key Responsibilities:
- Design, develop, and maintain real-time streaming solutions using Apache Kafka
- Build and manage Kafka producers, consumers, and Kafka Streams applications
- Implement data pipelines and event-driven architectures
- Ensure high availability, scalability, and fault tolerance of Kafka clusters
- Monitor, troubleshoot, and optimize Kafka performance
- Integrate Kafka with various data sources and downstream systems
- Work closely with data engineers, backend developers, and DevOps teams
- Apply security, data governance, and operational best practices in streaming systems
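
For illustration, the availability and security responsibilities above often come down to client configuration such as the following sketch. All broker addresses, file paths, and mechanism choices here are hypothetical placeholders, not details taken from this posting:

```properties
# Hypothetical configuration for a secured, fault-tolerant Kafka producer.
# Broker hostnames and truststore path below are placeholders.
bootstrap.servers=broker1.example.com:9093,broker2.example.com:9093
# Wait for acknowledgement from all in-sync replicas before considering a send successful.
acks=all
# Prevent duplicate records when the producer retries after a transient failure.
enable.idempotence=true
# Encrypt traffic in transit and authenticate the client to the cluster.
security.protocol=SASL_SSL
sasl.mechanism=SCRAM-SHA-512
ssl.truststore.location=/etc/kafka/secrets/truststore.jks
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
```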
Required Skills & Qualifications:
- Strong experience with Apache Kafka and its ecosystem (Kafka Streams, Connect, Schema Registry)
- Proficiency in Java, Scala, or Python
- Experience with distributed systems and microservices architecture
- Good understanding of messaging systems and event-driven design
- Knowledge of REST APIs and data integration techniques
- Experience with cloud platforms like AWS, Azure, or GCP
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes
- Strong problem-solving and debugging skills