Description

Job Description:

Expert proficiency in the Python programming language

Expert proficiency in PySpark, including Spark SQL and other Spark APIs

Experience testing and debugging applications with Python testing frameworks such as pytest and PyUnit

In-depth knowledge of Python frameworks and libraries, such as Django or Flask

Experience with AWS cloud services such as S3, as well as Databricks and data lake storage

Experience with continuous integration/continuous deployment (CI/CD) pipelines and tools

Experience with data pipeline and orchestration tools such as Airflow, Kafka, and Jenkins

Ability to apply design principles for building scalable applications

Qualifications:

A bachelor’s degree in computer science, information technology, or a related field. A master's degree is preferred but not mandatory.

Strong knowledge of cloud concepts, architecture patterns, and best practices

Proven experience in designing, implementing, and maintaining AWS solutions

Effective communication and teamwork skills, including the ability to collaborate with cross-functional teams

Strong problem-solving and analytical abilities

Education

Bachelor’s degree in computer science