Description

We are looking for a skilled and experienced Kafka Engineer to join our team. Your primary responsibility will be to manage and optimize Apache Kafka, a distributed event streaming platform, ensuring it integrates seamlessly with other systems and supports the organization's data workflows.

Key responsibilities include:

In-depth Kafka Expertise:

  • You will be expected to have strong, hands-on experience with Apache Kafka, including cluster setup, configuration, and day-to-day management.
  • You will manage Kafka topics, partitions, and replication to ensure high availability and fault tolerance.
  • A deep understanding of Kafka Producers and Consumers, and their interactions, is required to ensure efficient message streaming and data transfer.
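
The producer/consumer ordering guarantee mentioned above rests on key-based partitioning: all messages with the same key land on the same partition. A minimal sketch of that routing logic (a simplified model; real Kafka clients hash keys with murmur2, and the partition count here is a hypothetical example):

```python
# Simplified model of how a Kafka producer routes keyed messages to
# partitions. Real clients use a murmur2 hash; a deterministic
# sum-of-bytes stand-in keeps this sketch self-contained.

NUM_PARTITIONS = 3  # hypothetical partition count for the topic

def partition_for(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Map a message key to a partition. Messages sharing a key always
    land on the same partition, which preserves per-key ordering."""
    return sum(key.encode()) % num_partitions

# Events keyed by customer ID: each customer's events stay ordered
# relative to each other, because they go to the same partition.
events = [("cust-1", "created"), ("cust-2", "created"), ("cust-1", "updated")]
partitions: dict[int, list] = {}
for key, value in events:
    partitions.setdefault(partition_for(key), []).append((key, value))
```

Consumers in the same group are each assigned a subset of partitions, so per-key ordering survives end to end as long as keys are chosen consistently.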

Kafka Integration:

  • Collaborate with other teams to integrate Kafka with various applications and data systems, ensuring smooth data flow and efficient communication between platforms.
  • You will own the health of these integrations, ensuring Kafka interoperates reliably with the other systems in the data ecosystem.
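
Integrations of this kind are commonly built on Kafka Connect rather than custom glue code. A hedged sketch of a source-connector configuration (the connector name, database host, topic prefix, and column are illustrative placeholders; the property keys follow the Confluent JDBC source connector):

```
{
  "name": "orders-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "tasks.max": "1",
    "connection.url": "jdbc:postgresql://db.example.com:5432/orders",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "pg-"
  }
}
```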

ETL from IIDR to Snowflake:

  • Utilize Kafka to facilitate the Extract, Transform, and Load (ETL) process, specifically extracting large volumes of data from IBM InfoSphere Data Replication (IIDR) and loading it into Snowflake.
  • Ensure data consistency, integrity, and performance throughout the ETL pipeline.
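
The consumer side of such a pipeline can be sketched as a batch-and-flush loop. This is an illustrative sketch only: the record shape, transform, and batch size are hypothetical, and the Snowflake load is stubbed out (a production pipeline would more likely stage batches and issue COPY INTO, or use the Snowflake Kafka connector):

```python
import json

BATCH_SIZE = 500  # hypothetical batch size for Snowflake loads

def transform(record: dict) -> dict:
    """Illustrative transform: normalize an IIDR-style change record
    into the row shape of a hypothetical Snowflake target table."""
    return {
        "id": record["key"],
        "op": record.get("operation", "INSERT").upper(),
        "payload": json.dumps(record.get("after", {})),
    }

def load_batch(rows: list) -> None:
    # Placeholder: in production this would stage the rows and run a
    # COPY INTO against Snowflake, or use the Snowflake Python
    # connector. Stubbed so the sketch stays self-contained.
    pass

def run_pipeline(records) -> int:
    """Transform consumed records and load them in batches.
    Returns the number of rows loaded."""
    batch, loaded = [], 0
    for rec in records:
        batch.append(transform(rec))
        if len(batch) >= BATCH_SIZE:
            load_batch(batch)
            loaded += len(batch)
            batch = []
    if batch:  # flush the final partial batch
        load_batch(batch)
        loaded += len(batch)
    return loaded
```

Batching like this is what keeps the load side performant; consistency then hinges on committing consumer offsets only after a batch is durably loaded.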

Education

Any Graduate