Description

Responsibilities:

  • Build and maintain search and data processing systems using Elastic, Spring Framework, Kafka, event-driven processing, and APIs
  • Optimize and scale search and data processing infrastructure to handle growing data volumes and user requests
  • Collaborate with cross-functional teams to define, design, and ship new features
  • Follow best practices for data indexing, search optimization, and data integrity
  • Monitor system performance, troubleshoot issues, and ensure high availability and reliability
  • Participate in code reviews and contribute to a high standard of code quality
  • Stay up to date with industry trends and technologies to ensure our systems remain current

Qualifications:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field
  • 5+ years of professional software engineering experience
  • Proven experience with Elasticsearch and other search technologies
  • Strong proficiency in Java and experience with Java batch processing
  • Hands-on experience with Kafka and distributed messaging systems
  • Solid understanding of data structures, algorithms, and software design principles
  • Experience with cloud platforms and infrastructure (e.g., AWS, GCP, Azure)
  • Experience with CI/CD pipelines, containerization and container platforms (Docker, Kubernetes, PCF, AKS), and microservices architecture
  • Excellent attention to detail
  • Strong communication skills and the ability to work collaboratively in a team environment

Education

Bachelor's or Master's degree