Description

Key Responsibilities:

  • Design and implement scalable data ingestion pipelines using Microsoft Fabric Data Factory (Dataflows) and OneLake.
  • Build and manage Lakehouse and Warehouse architectures leveraging Delta Lake, Spark Notebooks, and SQL Endpoints.
  • Define and enforce Medallion Architecture (Bronze, Silver, Gold) standards for structured data refinement and quality control (see the sketch after this list).
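
For illustration, the following is a minimal, hypothetical sketch of the kind of Bronze-to-Silver refinement step this role involves, as it might appear in a Fabric Spark notebook (PySpark). The table paths and column names (bronze_orders, silver_orders, order_id, order_ts, amount) are placeholders, not a prescribed implementation.

    # Hypothetical Bronze -> Silver refinement step in a Fabric Spark notebook.
    # Table paths and column names are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()  # supplied by the notebook runtime

    # Bronze: raw ingested records, landed as-is in Delta format.
    bronze = spark.read.format("delta").load("Tables/bronze_orders")

    # Silver: deduplicated, correctly typed, quality-filtered records.
    silver = (
        bronze
        .dropDuplicates(["order_id"])                        # drop replayed events
        .withColumn("order_ts", F.to_timestamp("order_ts"))  # enforce types
        .filter(F.col("amount") > 0)                         # basic quality rule
    )

    silver.write.format("delta").mode("overwrite").save("Tables/silver_orders")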

Requirements:

  • 10–12 years of professional experience in data engineering, with a minimum of one year of hands-on experience with Microsoft Fabric.
  • Proficiency in:
      • Languages: SQL (T-SQL), Python, Scala
      • Microsoft Fabric Components: Data Factory Dataflows, OneLake, Spark Notebooks, Lakehouse, Warehouse
      • Data Formats: Delta Lake, Parquet, CSV, JSON
  • Strong understanding of data modeling techniques: star schema, snowflake schema, and normalized/denormalized structures (see the star schema sketch after this list).
  • Experience with CI/CD practices and Infrastructure-as-Code (IaC) using Git, ARM templates, or Terraform.
  • Familiarity with data governance platforms such as Microsoft Purview.
  • Excellent problem-solving and analytical skills, with the ability to articulate complex technical concepts to diverse audiences.
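
As an illustration of the data modeling expectation, here is a hedged PySpark sketch of a Gold-layer star schema build: one dimension table and one fact table derived from a Silver table. All table and column names (silver_orders, dim_customer, fact_orders, customer_id, and so on) are hypothetical.

    # Hypothetical Gold-layer star schema build in PySpark: one dimension table
    # and one fact table derived from a Silver table. All names are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    silver = spark.read.format("delta").load("Tables/silver_orders")

    # Dimension: one row per customer, with a generated surrogate key.
    dim_customer = (
        silver
        .select("customer_id", "customer_name")
        .dropDuplicates(["customer_id"])
        .withColumn("customer_key", F.monotonically_increasing_id())
    )

    # Fact: measures plus a foreign key into the dimension.
    fact_orders = (
        silver
        .join(dim_customer.select("customer_id", "customer_key"), "customer_id")
        .select("order_id", "customer_key", "order_ts", "amount")
    )

    dim_customer.write.format("delta").mode("overwrite").save("Tables/dim_customer")
    fact_orders.write.format("delta").mode("overwrite").save("Tables/fact_orders")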

Education

Any Graduate