Description

Job Title: Airflow Developer

Duration: 24+ Months

Location: Remote

 

Responsibilities

•     Develop and maintain data pipelines and workflows using Apache Airflow.

•     Design DAGs (Directed Acyclic Graphs) to automate and orchestrate ETL/ELT processes (see the example sketch after this list).

•     Collaborate with data engineers, analysts, and product teams to understand data requirements.

•     Monitor and troubleshoot Airflow jobs to ensure reliability and performance.

•     Integrate with various data sources such as APIs, databases (SQL/NoSQL), cloud storage, and data warehouses (e.g., BigQuery, Redshift, Snowflake).

•     Implement logging, error handling, retry logic, and alerting within Airflow.

•     Participate in code reviews and contribute to documentation and testing best practices.

•     Optimize workflows for scalability and efficiency.
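
For illustration, a minimal sketch of the kind of DAG this role involves: a daily ETL pipeline with retry logic, logging, and a failure-alerting callback. The dag_id, task names, and the notify_on_failure helper are hypothetical placeholders, and the sketch assumes Airflow 2.4+ (for the schedule argument).

from datetime import datetime, timedelta
import logging

from airflow import DAG
from airflow.operators.python import PythonOperator

log = logging.getLogger(__name__)


def notify_on_failure(context):
    # Hypothetical alert hook: swap in Slack, PagerDuty, or email as needed.
    log.error("Task %s failed in DAG %s",
              context["task_instance"].task_id, context["dag"].dag_id)


def extract(**context):
    # Placeholder: pull records from an API, database, or cloud storage.
    return [{"id": 1, "value": 42}]


def transform(**context):
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{**r, "value": r["value"] * 2} for r in rows]


def load(**context):
    rows = context["ti"].xcom_pull(task_ids="transform")
    log.info("Loading %d rows into the warehouse", len(rows))


default_args = {
    "owner": "data-engineering",
    "retries": 3,                              # retry logic
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": notify_on_failure,  # alerting
}

with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task

In a production pipeline the transform and load steps would typically push work down to the warehouse (e.g., BigQuery, Redshift, Snowflake) rather than passing rows through XCom.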

 

Required Qualifications

•     10+ years of experience working with Apache Airflow in a production environment.

•     Strong proficiency in Python and SQL.

•     Experience building and maintaining ETL/ELT pipelines.

•     Familiarity with cloud platforms (AWS, GCP, or Azure).

•     Solid understanding of data modeling, data quality, and data governance.

•     Experience with version control systems like Git.

•     Comfortable working with CI/CD tools and workflow automation.