We are looking for an experienced Azure Architect to design and implement scalable, secure, and high-performance data solutions using the latest Azure technologies. This role demands hands-on, end-to-end experience across the Azure data stack, including Azure Data Factory (ADF), PySpark, Microsoft Fabric, Azure Data Lake, and Open Lakehouse architectures. You will work closely with cross-functional teams to architect modern data platforms that support advanced analytics, machine learning, and real-time data processing.
Required Skills & Experience:
Deep expertise in:
- Azure Data Factory (ADF)
- Azure Data Lake (Gen2)
- PySpark for big data transformations and analytics
- Microsoft Fabric
- Open Lakehouse architecture (e.g., Delta Lake, Apache Hudi, or Iceberg on Azure); see the sketch after this list
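To illustrate the kind of hands-on work this expertise implies, below is a minimal PySpark sketch that reads raw data from Azure Data Lake Storage Gen2 and writes a curated Delta Lake table. It assumes the delta-spark package is available (on Databricks or Microsoft Fabric the Delta configuration is already provided); the storage account, containers, and column names (examplelake, raw, curated, order_id, amount, order_date) are hypothetical placeholders, not part of this role's actual environment.

```python
# Minimal sketch: PySpark transformation from ADLS Gen2 raw zone to a Delta table.
# Paths, account names, and columns are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("lakehouse-transform")
    # Delta Lake extensions; on Databricks/Fabric these are preconfigured.
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Read raw CSV landed by ADF into the data lake (path is illustrative).
raw = spark.read.option("header", True).csv(
    "abfss://raw@examplelake.dfs.core.windows.net/sales/"
)

# Standard cleanup: typed columns, deduplication, load timestamp.
curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .withColumn("ingested_at", F.current_timestamp())
)

# Write to a Delta table in the curated zone, partitioned for query performance.
(
    curated.write.format("delta")
    .mode("overwrite")
    .partitionBy("order_date")
    .save("abfss://curated@examplelake.dfs.core.windows.net/sales/")
)
```

In practice, ADF or a Fabric pipeline would orchestrate a job like this as a notebook or Spark activity, passing paths and load dates in as parameters.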
- Experience building and optimizing large-scale data pipelines and data lakes.
- Familiarity with Databricks, Azure Synapse Analytics, Power BI, and SQL.
- Strong understanding of data governance, security, and compliance frameworks.
- Proficiency in scripting languages (e.g., Python, PowerShell) and CI/CD practices for data projects.
Responsibilities:
- Architect and implement end-to-end data pipelines and analytics solutions on Azure.
- Design scalable and efficient data integration workflows using ADF, PySpark, and other data engineering tools.
- Develop and operationalize data lakes and Open Lakehouse architecture for large-scale data management (see the incremental-load sketch after this list).
- Utilize Microsoft Fabric for unified data engineering and business intelligence experiences.
- Lead cloud architecture design sessions and document solutions aligned with best practices.
- Implement governance, security, and performance optimization for Azure data environments.
- Collaborate with data engineers, BI developers, and business stakeholders to meet analytics needs.
- Evaluate and introduce new Azure services that improve solution capabilities and performance.
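As a hedged illustration of the operational side of these responsibilities, the sketch below shows an incremental upsert into a Delta Lake table using the delta-spark MERGE API, a common pattern for keeping curated lakehouse tables in sync with changed source data without full reloads. The table paths and key column (order_id) are hypothetical and would be adapted to the actual platform (Databricks, Fabric, or Synapse Spark).

```python
# Minimal sketch: incremental upsert (MERGE) into a Delta table.
# Assumes a SparkSession with Delta Lake configured (e.g., Databricks or
# Fabric); paths and column names below are illustrative placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("incremental-upsert").getOrCreate()

curated_path = "abfss://curated@examplelake.dfs.core.windows.net/sales/"

# New or changed rows landed by ADF since the last run (hypothetical path).
updates = spark.read.format("delta").load(
    "abfss://raw@examplelake.dfs.core.windows.net/sales_changes/"
)

# Upsert: update matching keys, insert new ones, keeping the curated table
# consistent without rewriting unchanged data.
target = DeltaTable.forPath(spark, curated_path)
(
    target.alias("t")
    .merge(updates.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```

In a production pipeline, ADF or a Fabric pipeline would schedule this job and pass the change-data path and watermark as parameters.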