Description

  • Lead the migration of data assets from SAP HANA Data Warehouse to Azure Lakehouse architecture.
  • Design, develop, and maintain ETL/ELT pipelines using Azure Databricks, incorporating Delta Lake and Parquet file formats.
  • Collaborate with data architects and engineers to define and implement best practices for data modeling, partitioning, and storage.
  • Optimize query performance and cost management within Azure Data Lake Storage and Databricks environments.
  • Implement data quality frameworks and monitoring solutions to ensure data accuracy, completeness, and reliability.
  • Integrate existing NPower solutions on top of the data lake, enabling analytics and reporting capabilities.
  • Contribute to the migration plan for transitioning to Microsoft Fabric, leveraging existing Delta Lake and Parquet assets.
  • Document data workflows, architecture diagrams, and operational runbooks.
  • Mentor junior engineers and deliver technical knowledge-sharing sessions.

Required Qualifications:

  • Bachelor’s degree in Computer Science, Information Systems, or a related field, or equivalent experience.
  • 5+ years of experience in data engineering, with a focus on Azure data platform technologies.
  • Demonstrated expertise in Azure Databricks, including notebooks, jobs, and cluster management.
  • Hands-on experience with Delta Lake and Parquet file formats for optimized storage and performance.
  • Strong SQL skills and experience with data modeling and ETL/ELT pipeline development.
  • Familiarity with SAP HANA data warehouse environments and migration strategies.
  • Experience with Microsoft Fabric architecture and migration considerations.
  • Proficiency in programming languages such as Python or Scala.
  • Solid understanding of data governance, security, and compliance best practices.
  • Excellent communication and collaboration skills in a fast-paced environment.

Preferred Qualifications:

  • Azure certifications (e.g., Azure Data Engineer Associate, Azure Solutions Architect).
  • Experience with infrastructure-as-code tools (Terraform, ARM templates).
  • Knowledge of streaming data technologies (Azure Event Hubs, Kafka).
  • Experience with containerization (Docker) and orchestration (Kubernetes).

Education

Bachelor's degree