Job Description:

Job Responsibilities:
● Develop, test, and maintain scalable Python applications. 
● Design, implement, and optimize real-time data processing pipelines. 
● Collaborate with cross-functional teams to define, design, and ship new features.
● Build and manage event-driven architectures and streaming pipelines. 
● Troubleshoot and debug applications to optimize performance and reliability. 
● Optimize and maintain existing codebases, ensuring high performance and reliability. 
● Integrate third-party services and APIs into Python applications. 
● Conduct code reviews and contribute to improving the quality of code and best practices. 
● Keep software current with relevant industry trends and technologies.

Primary skills 
● 4-6 years of experience in Python development, including third-party frameworks such as FastAPI.
● Hands-on experience with stream processing frameworks (e.g., Apache Flink, Kafka Streams).
● Experience working with message brokers, with a good understanding of Kafka.
● Hands-on experience with asynchronous programming and event-driven systems.
● Familiarity with microservices architecture. 
● Solid understanding of data structures and algorithms.
● Understanding of containerization and orchestration tools such as Docker and Kubernetes.
● Experience with version control systems like Git. 
● Experience with relational and NoSQL databases. 
● BE/B.Tech degree in engineering.

Secondary Skills 
● Experience with RESTful APIs development and integration. 
● Exposure to CI/CD pipelines and automated testing. 
● Exposure to monitoring and observability tools such as Prometheus, Grafana, etc. 
● Knowledge of cloud platforms such as AWS, Azure, or GCP.

Education

Any Graduate