Design, develop, and validate Data Warehousing and Data Analysis solutions.
Work with business users to establish requirements and achieve business objectives through data.
Decompose the business requirements into one or more reusable DW/ETL products.
Translate business requirements into Data/ETL Solutions.
Perform tasks and duties related to ad hoc analysis, design and processing.
Ensure effective communication of user requirements.
Required Skills
Understanding of Big Data Ecosystems is an advantage.
Working knowledge of Java / Python & REST API.
Understanding of Snowflake, Google Cloud Platform, or MS Azure would help.
Excellent communication and presentation skills; able to explain the complete data transformation and model-mapping process.
Understanding of Graph Databases.
Willingness to learn new data lake technologies and gain hands-on implementation experience.
Proven ability to take initiative and be innovative.
Effective time management skills.
Required Experience
Minimum of 2-3 years' experience with Data Warehousing tools and technologies (e.g. MySQL or a similar database; Pentaho, Java-based, or other similar ETL tools).
Experience in working with Java would be a plus.
Experience writing complex SQL statements to validate data and ETL code against data mappings and requirements, and performing extensive data analysis to identify defects.
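The SQL-based validation described above can be sketched as follows. This is a minimal illustration only: the table names, columns, and mapping (`src_orders` to `tgt_orders`) are hypothetical, and SQLite stands in for whatever warehouse database is actually in use.

```python
import sqlite3

# Hypothetical source and target tables standing in for a real source-to-target mapping.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src_orders (id INTEGER, amount REAL);
CREATE TABLE tgt_orders (id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.0);  -- one mismatched row, one missing row
""")

# Row-count reconciliation between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]

# Defect detection: rows missing from the target, or present with mismatched values.
defects = cur.execute("""
    SELECT s.id, s.amount, t.amount
    FROM src_orders s
    LEFT JOIN tgt_orders t ON s.id = t.id
    WHERE t.id IS NULL OR s.amount <> t.amount
""").fetchall()

print(src_count, tgt_count)  # 3 2
print(defects)               # [(2, 20.0, 25.0), (3, 30.0, None)]
```

In practice, queries like the `LEFT JOIN` above would be generated per mapped table from the data-mapping document, which is how validation scales across an ETL job rather than being hand-written per defect.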
Education Requirements
Bachelor’s Degree in Computer Science, Computer Engineering or a closely related field.