As a Data Designer you’ll process, organize, structure, or present data to make it meaningful for business decision-making and to position data-intensive projects and programs for success. You’ll be an integral part of a project, program, or team and perform a variety of tasks related to identifying business partners’ analytical needs and obtaining, collecting, organizing, and interpreting data. You’ll perform analysis, interpret results, and provide insights and recommendations, which may take the form of thematic maps, reports, charts, tables, and other analytic visualizations.
Must Have Skills:
Primary Skill: Data Modeling
Experience: 5+ years
Programming Languages: Python, SQL
Cloud: AWS
Data Integration: API, Databricks, Talend, Informatica
Data Modeling: Erwin (or experience with another data modeling tool)
Collaboration Effectiveness – demonstrated skill working directly with stakeholders, business partners, SMEs, systems peers, and cross-functional teams to elicit data requirements and design appropriate data models that align with business needs.
Translating current and future business requirements into conceptual, logical, and physical data model designs. Proficiency with standard data modeling tools (e.g. Erwin, ER Studio, Toad DM, PowerDesigner).
Structured execution of data analysis, data profiling, and data mapping tasks.
Ensuring referential integrity is maintained through data model design and data quality recommendations.
Ensuring data governance practices are followed during model creation to support data cataloging, taxonomy, data lineage, and data quality efforts.
Designing data models for multiple distinct patterns (e.g. relational, dimensional, hybrid) associated with data warehouse, BI, and large-scale data applications.
Creation of net new data models and extension of existing models across multiple DBMS platforms (e.g. Oracle, SQL Server, DB2, Snowflake, Teradata, NoSQL).
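For illustration only, the sketch below shows the kind of dimensional design and designed-in referential integrity described above: a hypothetical claims star schema in ANSI-style SQL. All table and column names are invented, and data types and constraint syntax would need adjustment for a specific platform (Oracle, SQL Server, Snowflake, etc.).

    -- Hypothetical star-schema fragment; names are illustrative only.
    -- Dimension table: one row per insurance policy.
    CREATE TABLE dim_policy (
        policy_key     INTEGER     NOT NULL PRIMARY KEY,
        policy_number  VARCHAR(20) NOT NULL,
        product_line   VARCHAR(50) NOT NULL,
        effective_date DATE        NOT NULL
    );

    -- Dimension table: one row per calendar date.
    CREATE TABLE dim_date (
        date_key       INTEGER     NOT NULL PRIMARY KEY,
        calendar_date  DATE        NOT NULL,
        fiscal_quarter VARCHAR(6)  NOT NULL
    );

    -- Fact table at the grain of one claim payment.
    -- Foreign key constraints carry the referential integrity designed into the model.
    CREATE TABLE fact_claim_payment (
        claim_payment_id BIGINT        NOT NULL PRIMARY KEY,
        policy_key       INTEGER       NOT NULL REFERENCES dim_policy (policy_key),
        payment_date_key INTEGER       NOT NULL REFERENCES dim_date (date_key),
        paid_amount      DECIMAL(12,2) NOT NULL
    );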
Ability to understand and explain complex data integration and extract/transform/load (ETL) processes.
Demonstrated experience identifying and resolving data model performance issues to optimize database performance and enhance overall system functionality.
Documenting and communicating data model designs and standards to ensure understanding and compliance with expectations for interoperability across the organization.
Fluency in Python, SQL, and Unix. High proficiency in writing complex SQL queries is required.
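As a hedged example of the level of SQL complexity expected, the query below runs against the hypothetical star schema sketched earlier and combines a common table expression with a window function to rank product lines by paid claims within each fiscal quarter; it is a sketch, not a production query.

    -- Rank product lines by total paid claims within each fiscal quarter.
    WITH quarterly_paid AS (
        SELECT d.fiscal_quarter,
               p.product_line,
               SUM(f.paid_amount) AS total_paid
        FROM fact_claim_payment f
        JOIN dim_policy p ON p.policy_key = f.policy_key
        JOIN dim_date   d ON d.date_key   = f.payment_date_key
        GROUP BY d.fiscal_quarter, p.product_line
    )
    SELECT fiscal_quarter,
           product_line,
           total_paid,
           RANK() OVER (PARTITION BY fiscal_quarter ORDER BY total_paid DESC) AS paid_rank
    FROM quarterly_paid
    ORDER BY fiscal_quarter, paid_rank;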
Experience with database version control and metadata management tools (e.g. Git, Liquibase).
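For context on database version control, a minimal Liquibase SQL-formatted changelog might look like the sketch below; the changeset author, ID, and column are invented for illustration, and ALTER TABLE syntax varies by platform.

    --liquibase formatted sql

    --changeset data.designer:add-fiscal-year-to-dim-date
    ALTER TABLE dim_date ADD COLUMN fiscal_year INTEGER;
    --rollback ALTER TABLE dim_date DROP COLUMN fiscal_year;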
Proven communication skills and the ability to interact effectively with people at all levels of business and technology organizations.
Experience working in an Agile environment using Lean, Kanban and Scrum practices.
Insurance and financial domain experience is highly desired.