Description

  • Lead the automation and orchestration of complex data workflows using Python.
  • Design, develop, and maintain robust, fault-tolerant, and auditable data pipelines across on-prem Oracle systems.
  • Modernize and migrate legacy scheduling logic (Perl, PL/SQL, RunMyJobs) to Python-based Apache Airflow DAGs (an illustrative sketch follows this list).
  • Integrate with job schedulers (RunMyJobs, Autosys, etc.) and create modular, observable workflows.
  • Build and deploy custom Airflow operators/sensors for Oracle, REST API integrations, file transfers (SFTP/FTP), and external triggers.
  • Ensure traceability, data quality, and recoverability in all automated processes.
  • Implement strong error handling, alerting, retry, and monitoring mechanisms for production jobs.
  • Collaborate with DBAs and application teams to analyze job dependencies, critical paths, and data lineage.
  • Establish and maintain job execution logs, audit trails, and SLA monitoring dashboards.
  • Participate in code reviews, documentation, and onboarding of new jobs to the orchestration platform.
  • Apply knowledge of Oracle 19c, SQL, PL/SQL, and REST API integration to support enterprise data movement.
  • Utilize CI/CD tools (Git/Bitbucket, Jenkins) for version control and automated deployment.
  • Contribute to the modernization of data workflows, with attention to data governance and operational risk, especially in financial systems.
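
For context only, a migrated job of the kind described above might look like the minimal Airflow sketch below. All names (legacy_oracle_refresh, oracle_dwh, reporting.refresh_positions, the schedule, and the alert address) are hypothetical placeholders assumed for illustration and are not details of this role; the sketch assumes the apache-airflow-providers-oracle package and a pre-configured Oracle connection.

# Illustrative sketch only: a nightly PL/SQL refresh job migrated from a
# legacy scheduler to an Airflow DAG, with retries and failure alerting.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.oracle.operators.oracle import OracleOperator

default_args = {
    "owner": "data-eng",
    "retries": 3,                        # automatic retry on transient failures
    "retry_delay": timedelta(minutes=5),
    "email_on_failure": True,            # alerting hook; assumes SMTP is configured
    "email": ["data-ops@example.com"],   # placeholder alert address
}

with DAG(
    dag_id="legacy_oracle_refresh",      # hypothetical name for a migrated job
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",       # nightly slot mirroring the old scheduler
    catchup=False,
    default_args=default_args,
    tags=["oracle", "migration"],
) as dag:
    refresh_positions = OracleOperator(
        task_id="refresh_positions",
        oracle_conn_id="oracle_dwh",     # assumed pre-configured Airflow connection
        sql="BEGIN reporting.refresh_positions; END;",  # placeholder PL/SQL call
    )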

Education

Any Graduate