Description

Job Purpose:

To work on implementing data modeling solutions
To design data flows and structures that reduce data redundancy and improve data movement among systems, defining data lineage
To work in the Azure data warehouse environment
To work with large data volumes in data integration
 

Experience

With overall experience of 10 to 15 years, the applicant must have a minimum of 8 to 11 years of core professional experience in data modeling for large data warehouses with multiple sources.

 

Technical Skills

Expertise in core data modeling principles and methods, including conceptual, logical and physical data models
Ability to use BI tools such as Power BI and Tableau to present insights
Experience in translating/mapping relational data models into XML and schemas
Expert knowledge of metadata management and of relational and data modeling tools such as ER Studio, Erwin or others
Hands-on experience in relational, dimensional and/or analytical data modeling (using RDBMS, dimensional, NoSQL, ETL and data ingestion protocols); a brief illustrative sketch follows this list
Very strong SQL query-writing skills
Expertise in performance tuning of SQL queries
Ability to analyze source systems and create source-to-target mappings
Ability to understand business use cases and create data models or joined datasets in the data warehouse
Preferred: experience in the banking domain and in building data models/marts for various banking functions
Good to have knowledge of:
Azure PowerShell or Python scripting for data transformation in ADF
SSIS, SSAS and BI tools such as Power BI
Azure PaaS components such as Azure Data Factory, Azure Databricks, Azure Data Lake, Azure Synapse (DWH), PolyBase, ExpressRoute tunneling, etc.
API integration
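
For illustration only, a minimal sketch of the kind of physical dimensional model this role works with is shown below. It is a generic star-schema fragment in T-SQL; all table and column names are hypothetical and are not taken from any actual system.

    -- Hypothetical star-schema fragment: one dimension and one fact table.
    -- Names and types are illustrative only.
    CREATE TABLE dim_customer (
        customer_key   INT           NOT NULL,   -- surrogate key
        customer_id    VARCHAR(20)   NOT NULL,   -- natural key from the source system
        customer_name  VARCHAR(100)  NOT NULL,
        segment        VARCHAR(50)   NULL
    );

    CREATE TABLE fact_transactions (
        transaction_key   BIGINT         NOT NULL,
        customer_key      INT            NOT NULL,  -- joins to dim_customer.customer_key
        transaction_date  DATE           NOT NULL,
        amount            DECIMAL(18,2)  NOT NULL
    );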
 

Responsibilities

Understand the existing data model, the existing data warehouse design and the functional domain subject areas of the data, and document both the as-is architecture and the proposed architecture
Understand the existing ETL processes and their various sources; analyze and document the best approach to designing the logical data model where required
Work with the development team to implement the proposed data model as a physical data model and to build data flows
Work with the development team to optimize the database structure, applying best-practice optimization methods
Analyze, document and implement re-use of data models for new initiatives
Interact with stakeholders, users and other IT teams to understand the ecosystem and analyze potential solutions
Work on user requirements and create queries that build consumption views for users over the existing DW data (see the sketch after this list)
Train and lead a small team of data engineers
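
As a purely illustrative example of such a consumption view (all schema, table and column names below are hypothetical and reuse the star-schema fragment sketched under Technical Skills):

    -- Hypothetical consumption view that joins the fact table to its dimension
    -- so business users can query monthly spend without writing joins themselves.
    CREATE VIEW consumption.v_monthly_customer_spend AS
    SELECT
        d.customer_id,
        d.segment,
        DATEPART(year,  f.transaction_date) AS txn_year,
        DATEPART(month, f.transaction_date) AS txn_month,
        SUM(f.amount)                       AS total_amount
    FROM fact_transactions AS f
    JOIN dim_customer AS d
        ON d.customer_key = f.customer_key
    GROUP BY
        d.customer_id,
        d.segment,
        DATEPART(year,  f.transaction_date),
        DATEPART(month, f.transaction_date);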
 

Qualifications

Bachelor's degree in Computer Science or equivalent
Should hold certifications in data modeling and data analysis
Good to have Azure Fundamentals and Azure Data Engineer certifications (AZ-900 or DP-200/DP-201)
 

Behavioral Competencies

Excellent problem-solving and time-management skills
Strong analytical thinking skills
Excellent communication skills; process-oriented, with a flexible execution mindset
Strategic thinking with a research and development mindset
Clear and effective communication
Ability to efficiently identify and resolve issues
Ability to identify, track and escalate risks in a timely manner

Education

Any Graduate