Develop and optimize real-time data streaming applications using Apache Kafka. Collaborate on system integration, ensuring high availability and performance. Build and troubleshoot data pipelines for analytics and operations.
Responsibilities will include, but are not limited to:
- Design, develop, and maintain real-time data streaming applications using Apache Kafka.
- Collaborate with cross-functional teams to integrate Kafka solutions into existing systems.
- Monitor and optimize Kafka clusters to ensure high availability and performance.
- Implement data pipelines and streaming processes to support business analytics and operations.
- Troubleshoot and resolve issues related to data streaming and processing.
What You’ll Need:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Proven experience with Apache Kafka and real-time data streaming.
- Proficiency in programming languages such as Java, Scala, or Python.
- Familiarity with distributed systems and microservices architecture.
- Strong problem-solving skills and the ability to work collaboratively in a team environment.