Description

  • Must have 14+ years of total experience
  • State project experience is preferred
  • Design and build data pipelines using Azure Data Factory or Synapse in the DAAP environment
  • Develop and manage dataflows for efficient data movement and transformation
  • Create and maintain Logic Apps to automate workflows and send alerts or email notifications
  • Use Azure Functions for event-driven processing and preparing data for the data hub
  • Write optimized PySpark and Scala notebooks to pull data from ServiceNow APIs, store it in the data lake, and load it into the SQL pool
  • Make API calls from notebooks to connect with systems like Avaya, Eureka, and Zammo
  • Implement recursive and iterative logic in notebooks to retrieve Google Classroom data, including admin, courses, students, and assignments
  • Transform and map data to meet business needs
  • Build end-to-end data processing workflows and orchestration
  • Set up triggers for automated pipeline execution based on schedules or events
  • Integrate with Azure DevOps for CI/CD and release management
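
The iterative retrieval pattern mentioned above (pulling paginated Google Classroom data such as courses, students, and assignments from a notebook) can be sketched in plain Python. This is a minimal illustration only: `fetch_page`, its stub data, and the resource names are hypothetical stand-ins, not a real Classroom or ServiceNow client API.

```python
def fetch_page(resource, page_token=None):
    """Hypothetical API call returning one page of items plus a next-page token."""
    # Stub data simulating a paginated, Classroom-style endpoint:
    # two pages of "courses", the second with no further page token.
    pages = {
        "courses": [(["math", "science"], "page-2"), (["history"], None)],
    }
    index = 0 if page_token is None else 1
    items, next_token = pages[resource][index]
    return items, next_token


def fetch_all(resource):
    """Iteratively follow page tokens until the API reports no more pages."""
    items, token = fetch_page(resource)
    results = list(items)
    while token is not None:
        items, token = fetch_page(resource, token)
        results.extend(items)
    return results
```

In a real notebook the same loop would wrap the actual REST call (with authentication and error handling), and the accumulated results would be written to the data lake before loading into the SQL pool.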

Education

Any Graduate