Job Description

Design, develop, and maintain ETL (Extract, Transform, Load) processes to integrate data from various sources.
Write efficient, scalable Python scripts to automate data processing tasks.
Develop and maintain data pipelines using tools such as Apache Airflow, AWS Glue, or similar.
Collaborate with software engineers to integrate data solutions into applications using TypeScript and other relevant technologies.
Optimize and tune data systems for performance and scalability.
Ensure data quality and integrity through rigorous testing and validation.

Qualifications

Proficiency in Python and PySpark (mandatory).
Experience designing, developing, and maintaining ETL processes and data pipelines using tools such as Apache Airflow or AWS Glue.
Knowledge of big data technologies such as Hadoop, Spark, or Kafka is a plus.
Excellent written and oral communication skills.
Experience with stakeholder management at different levels.

Education

Any graduate degree.