Description

Responsibilities:

  • Perform and oversee data loading operations.
  • Optimize data for efficient extraction and reporting.
  • Manage complex databases and perform the appropriate database administration tasks.
  • Design and implement ETL jobs.
  • Build, maintain, monitor, and orchestrate workflows or data pipelines (see the sketch after this list).
  • Ensure the high performance of data retrieval processes.
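For context, the day-to-day pipeline work described above typically resembles the following minimal Airflow DAG, written with the TaskFlow API of Airflow 2.4+. The DAG id, task names, schedule, and data are illustrative placeholders only, not part of any actual codebase.

```python
# A minimal, illustrative ETL DAG using Airflow 2.4+'s TaskFlow API.
# All names (example_etl, extract/transform/load) and the sample data
# are placeholders chosen for this sketch.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl():
    @task
    def extract() -> list[dict]:
        # Pull raw records from a source system (stubbed here).
        return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Apply a simple business rule to each record.
        return [{**r, "value": r["value"] * 2} for r in records]

    @task
    def load(records: list[dict]) -> None:
        # Persist the transformed records (stubbed as a log line).
        print(f"Loaded {len(records)} records")

    # Wire the tasks into an extract -> transform -> load dependency chain.
    load(transform(extract()))


example_etl()
```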

Minimum requirements:

  • Bachelor's or Master's degree in Computer Science or IT (or equivalent experience).
  • 3+ years of industry experience as an Apache Airflow developer (rare exceptions for highly skilled candidates).
  • Proficiency in Apache Airflow development.
  • Expertise in Python and its frameworks.
  • Strong understanding of data warehouse concepts and ETL tools (such as Informatica, Pentaho, and Apache Airflow).
  • Experience working in a SQL environment and with reporting tools (such as Power BI and Qlik).
  • Fluency in English to collaborate with engineering managers.

Preferred skills:

  • Familiarity with Apache Hadoop, HDFS, Hive, etc.
  • Excellent troubleshooting and debugging skills.
  • Ability to work independently as well as with multi-disciplinary teams.
  • Working knowledge of agile processes and methods.

Education

Bachelor's Degree