Description

We are seeking a Kafka / Integrations Developer with strong expertise in building scalable, real-time event streaming and data processing applications. 

This role involves deep integration work using Kafka and AWS services, along with responsibilities around system architecture, security, performance optimization, and deployment automation. 

The ideal candidate will have experience in modern Java development and be adept at both infrastructure work and microservices integration.


Key Responsibilities

  • Architect, design, and develop scalable, real-time event streaming and data processing applications using Java 11/17+, Spring Boot, multithreading, and reactive programming.
  • Implement and optimize Kafka Connect connectors for integration with AWS services, including EventBridge, S3, DynamoDB, and RDS.
  • Design and implement event-driven architectures using Confluent Kafka and AWS EventBridge to support microservices and serverless patterns.
  • Configure, deploy, and monitor Kafka Connect source and sink connectors for various AWS services (EventBridge, Kinesis, DynamoDB, Lambda).
  • Develop real-time processing pipelines using Kafka Streams and ksqlDB.
  • Manage Kafka Schema Registry with support for Avro, JSON Schema, and Protobuf.
  • Optimize Kafka cluster performance by tuning partitions, monitoring consumer lag, and configuring tiered storage.
  • Implement Kafka security configurations such as RBAC, ACLs, SSL/TLS, and OAuth.
  • Automate the deployment of Kafka and EventBridge integrations using tools like Terraform, Kubernetes, and Helm.
  • Develop CI/CD pipelines for Kafka applications using Jenkins and GitHub Actions.
  • Implement observability solutions with Confluent Control Center, AWS CloudWatch, Prometheus, and Datadog.
  • Collaborate with Data Engineers, DevOps, and Cloud teams to ensure end-to-end integration success.
  • Provide technical leadership and mentorship to junior engineers.


Required Qualifications

  • Hands-on experience with Java (11/17+), Spring Boot, multithreading, and reactive programming.
  • Practical experience with Kafka connectors integrating AWS EventBridge, S3, DynamoDB, and RDS.
  • Strong understanding of Kafka performance tuning and security configurations (RBAC, ACLs, SSL/TLS, OAuth).
  • Ability to manage Kafka Schema Registry and work with serialization formats such as Avro.
  • Experience deploying scalable data solutions using Kafka in cloud-based environments.


Preferred Qualifications

  • Familiarity with JSON Schema and Protobuf in Kafka Schema Registry.
  • Experience with real-time processing tools such as Kafka Streams and ksqlDB.
  • Experience automating deployments with Terraform, Kubernetes, and Helm.
  • Proficiency with CI/CD tools such as Jenkins and GitHub Actions.
  • Experience with observability and monitoring platforms such as Confluent Control Center, AWS CloudWatch, Prometheus, or Datadog.
  • Background in mentoring or providing technical leadership within a team setting.

Education

Any Graduate