- Proven experience collaborating with Business & IT teams on requirement gathering and solution design.
- Extensive experience preparing architecture blueprints for scalable data and analytics platforms.
- Independently created high-level designs (HLDs) and low-level designs (LLDs) for BI and data engineering solutions, including data models, pipelines, and reporting layers.
Azure Data Engineering / Databricks / Modelling:
- 6+ years of hands-on experience with Azure Data Engineering services including Azure Data Factory (ADF), Synapse Analytics, and Data Lake Storage for orchestrating and managing end-to-end data pipelines.
- 4+ years of hands-on experience with Databricks, implementing ELT pipelines, notebooks, and ML-based data transformations across large-scale distributed datasets.
- Strong experience designing Gold-layer tables and reusable semantic models for BI consumption and downstream applications (a minimal sketch follows this list).
- Experience implementing CI/CD pipelines and monitoring frameworks for Azure-based data engineering workloads.
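As a rough illustration of the kind of ELT step referenced above, the PySpark sketch below aggregates a silver table into a Gold-layer Delta table for BI consumption. The table and column names are assumptions, and `spark` is the session a Databricks notebook provides; this is a sketch, not a prescribed implementation.

```python
from pyspark.sql import functions as F

# Illustrative names only: "silver.orders" and "gold.daily_sales" are assumptions.
silver_orders = spark.table("silver.orders")

# Aggregate order-level data into a daily, region-level Gold table.
daily_sales = (silver_orders
               .withColumn("order_date", F.to_date("order_timestamp"))
               .groupBy("order_date", "region")
               .agg(F.sum("amount").alias("total_sales"),
                    F.countDistinct("order_id").alias("order_count")))

# Persist as a Delta table in the Gold layer for BI / semantic model consumption.
(daily_sales.write
 .format("delta")
 .mode("overwrite")
 .saveAsTable("gold.daily_sales"))
```

In practice a step like this would typically run as a scheduled Databricks job or an ADF-triggered notebook, with the CI/CD and monitoring noted above around it.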
Microsoft Fabric:
- Hands-on experience building reporting and data engineering solutions using Microsoft Fabric.
- Worked with Direct Lake, OneLake, and Lakehouses to design and manage scalable, unified data architectures.
- Built and orchestrated end-to-end pipelines using Fabric Dataflows Gen2 and Notebooks for real-time and batch data movement (a minimal notebook sketch follows this list).
- Exposure to Fabric Copilots to accelerate data modeling, KPI generation, and report creation (good to have).
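The sketch below illustrates the notebook-based pipeline step mentioned above, assuming a Microsoft Fabric notebook with a default Lakehouse attached; the folder and table names are made-up examples, not a fixed design.

```python
from pyspark.sql import functions as F

# Assumes a Fabric notebook with a default Lakehouse attached;
# "Files/landing/customers" and "customers_raw" are illustrative names.
landing_path = "Files/landing/customers/*.csv"

customers = (spark.read
             .option("header", "true")
             .csv(landing_path)
             .withColumn("ingested_at", F.current_timestamp()))

# Saving as a managed Delta table makes the data available to Power BI
# through Direct Lake and the SQL analytics endpoint.
(customers.write
 .format("delta")
 .mode("append")
 .saveAsTable("customers_raw"))
```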
Power BI:
- Hands-on experience with Power BI Pro, Premium, and Paginated Reports.
- Proficient in DAX, Power Query, and the M language for building complex analytics logic.
- Experience setting up Dataflows, Incremental Refresh, Row-Level Security (RLS), and optimized data modeling (a refresh-orchestration sketch follows this list).
- Led multiple data modernization projects.
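As a sketch of the refresh orchestration referenced in the Dataflows/Incremental Refresh bullet, the snippet below calls the Power BI REST API to queue a dataset refresh. The workspace ID, dataset ID, and Azure AD access token are assumed to be supplied from the environment or an earlier authentication step; treat this as illustrative rather than a drop-in implementation.

```python
import os
import requests

# Assumed inputs: IDs and an AAD token provided by the environment
# (for example, acquired earlier with a service principal).
workspace_id = os.environ["PBI_WORKSPACE_ID"]
dataset_id = os.environ["PBI_DATASET_ID"]
token = os.environ["PBI_ACCESS_TOKEN"]

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}"
       f"/datasets/{dataset_id}/refreshes")

# Queue a refresh; the service responds with 202 Accepted when the request is queued.
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {token}"},
    json={"notifyOption": "MailOnFailure"},
)
resp.raise_for_status()
print("Refresh queued, status:", resp.status_code)
```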
Must Have Skills:
Communication Skills:
- Communicate effectively with internal and customer stakeholders.
- Comfortable communicating verbally, over email, and through instant messages.
Interpersonal Skills:
- Strong interpersonal skills to build and maintain productive relationships with team members.
- Provide constructive feedback during code reviews and be open to receiving feedback on your own code.
Problem-Solving and Analytical Thinking:
- Capability to troubleshoot and resolve issues efficiently.
- Analytical mindset with a structured approach to problem-solving.
Task/Work Updates:
- Prior experience working on Agile/Scrum projects with exposure to tools like Jira and Azure DevOps.
- Provides regular updates and is proactive and diligent in carrying out responsibilities.
Expected Outcome:
- We are looking for an experienced professional with 15+ years of experience in end-to-end BI solution delivery, strong customer engagement skills, and hands-on expertise in Azure Data Engineering, Databricks, Power BI, and Microsoft Fabric.
Primary Skills:
- 6+ years of recent experience working with Azure Data Engineering solutions, including ADF, Synapse, and orchestration of end-to-end data pipelines, across 8+ projects.
- 5+ years of experience in defining architecture blueprints for scalable data & analytics solutions, including cloud data platforms.
- 3+ years of hands-on experience with Databricks across at least 4 projects.
- 4+ years of experience with Power BI, having delivered 4+ enterprise-grade BI projects.
- Around 1 year of experience with Microsoft Fabric (Data Engineering in Fabric) on at least one customer project, contributing to an integrated dataflow and lakehouse-based architecture.
- 5+ years of experience translating business requirements into technical specifications and delivering end-to-end BI and data solutions.
- Leveraged Generative AI tools and Copilots (e.g., GitHub Copilot, Microsoft Fabric Copilot, M365 Copilot) to accelerate code development, enhance data insights, and streamline documentation.
- Applied Gen AI capabilities to tasks such as pipeline generation, DAX and SQL authoring, metadata extraction, and documentation automation.