Description


Key Responsibilities:
• Develop, deploy, and maintain Python-based automation scripts to orchestrate jobs across Oracle 19c on-prem systems.
• Design and implement Airflow DAGs to manage complex interdependent ETL workflows.
• Migrate existing job logic from Perl, RunMyJob, and PL/SQL-based scheduling into modular, observable Airflow DAGs.
• Build custom Airflow operators/sensors for integration with Oracle, REST APIs, file drops (SFTP/FTP), and external triggers.
• Implement robust error handling, alerting, and retry mechanisms across job pipelines (an illustrative sketch follows this list).
• Collaborate with DBAs and application teams to understand job dependencies, critical paths, and data lineage.
• Establish job execution logs, audit trails, and SLA monitoring dashboards.
• Participate in code reviews, documentation, and the onboarding of new jobs into the orchestrator.
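
For illustration only, a minimal sketch of the kind of Airflow DAG this role would build, assuming Airflow 2.4+ with the Oracle provider installed. The connection ID, schedule, PL/SQL procedure name, and alert callback are placeholders, not references to any actual system.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.providers.oracle.hooks.oracle import OracleHook


    def notify_on_failure(context):
        # Placeholder alert hook; a real pipeline might page, email, or post to chat here.
        print(f"Task {context['task_instance'].task_id} failed")


    def load_daily_positions(**_):
        # Hypothetical job step: call a PL/SQL procedure through the Oracle hook.
        hook = OracleHook(oracle_conn_id="oracle_19c_onprem")  # assumed connection ID
        hook.run("BEGIN batch_pkg.load_daily_positions; END;")


    with DAG(
        dag_id="daily_positions_load",
        start_date=datetime(2024, 1, 1),
        schedule="0 6 * * *",
        catchup=False,
        default_args={
            "retries": 3,
            "retry_delay": timedelta(minutes=5),
            "on_failure_callback": notify_on_failure,
        },
    ) as dag:
        PythonOperator(task_id="load_positions", python_callable=load_daily_positions)

Retries, alerting, and observability live in the DAG definition rather than in the job logic itself, which is what makes migrated Perl/PL/SQL jobs easier to monitor once they are onboarded.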

Required Skills and Experience:
• 5+ years of Python development experience, with strong understanding of system/process automation.
• 2+ years of experience building production DAGs with Apache Airflow.
• Solid understanding of Oracle 19c database, SQL tuning, and PL/SQL concepts.
• Experience orchestrating jobs that move large volumes of data across enterprise systems.
• Familiarity with job schedulers (RunMyJob, Autosys, etc.) and how to replace/abstract them using orchestration tools.
• Strong debugging skills across logs, databases, and filesystems when investigating failed jobs or partial runs.
• Experience building REST API integrations, SFTP/file movement logic, and parameter-driven automation (see the sketch after this list).
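
As a rough sketch of the parameter-driven SFTP and REST integration pattern referenced above (hosts, paths, credentials, and the endpoint URL are placeholders; paramiko and requests are assumed to be available):

    import paramiko
    import requests


    def pull_file_and_notify(params: dict) -> None:
        # Fetch a file from an SFTP drop using caller-supplied parameters.
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(params["host"], username=params["user"], password=params["password"])
        try:
            sftp = client.open_sftp()
            sftp.get(params["remote_path"], params["local_path"])
            sftp.close()
        finally:
            client.close()

        # Notify a downstream REST service that the file has landed.
        resp = requests.post(
            params["notify_url"],
            json={"file": params["local_path"], "status": "received"},
            timeout=30,
        )
        resp.raise_for_status()


    pull_file_and_notify({
        "host": "sftp.example.com",
        "user": "batch_user",
        "password": "***",
        "remote_path": "/outbound/positions_20240101.csv",
        "local_path": "/data/inbound/positions_20240101.csv",
        "notify_url": "https://example.com/api/files/received",
    })

Because every host, path, and endpoint comes in through the parameter dictionary, the same routine can be reused across jobs and driven from Airflow DAG configuration rather than hard-coded scripts.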

Bonus / Preferred Experience:
• Prior experience modernizing legacy data workflows from Perl or PL/SQL stored procs.
• Hands-on knowledge of Git/Bitbucket, Jenkins, CI/CD pipelines for code-controlled job rollouts.
• Familiarity with financial data models (e.g., holdings, transactions, NAVs, tax lots).
• Basic understanding of data governance, audit, and operational risk in financial systems.

Education

Any Graduate