Roles and responsibilities:
· Understand the client’s requirements and provide effective, efficient solutions.
· Understand data transformation and translation requirements and select the right tools for the job.
· Understand data pipelines and modern approaches to automating them in the cloud; clearly document implementations so others can easily understand the requirements, implementation, and test conditions.
· Mentor the rest of the team and guide them through the development process.
· Perform data quality testing and assurance as part of designing, building, and implementing scalable data solutions in SQL.
· Apply deployment best practices and tools, with an understanding of automated unit testing.
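The data-quality responsibility above can be illustrated with a minimal, hypothetical sketch. It uses Python's built-in sqlite3 with an in-memory table purely for portability; the table name, columns, and check names are illustrative, not part of the role description, and a real warehouse (e.g. BigQuery) would use its own client and SQL dialect.

```python
import sqlite3

# Illustrative only: an in-memory SQLite table stands in for a warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, 19.99), (2, None), (2, 5.00)],  # includes a NULL amount and a duplicate id
)

def run_quality_checks(conn):
    """Return a dict mapping check name -> number of offending rows."""
    checks = {
        # Rows where the amount is missing
        "null_amount": "SELECT COUNT(*) FROM orders WHERE amount IS NULL",
        # Order ids that appear more than once
        "duplicate_order_id": (
            "SELECT COUNT(*) FROM (SELECT order_id FROM orders "
            "GROUP BY order_id HAVING COUNT(*) > 1)"
        ),
    }
    return {name: conn.execute(sql).fetchone()[0] for name, sql in checks.items()}

results = run_quality_checks(conn)
print(results)  # {'null_amount': 1, 'duplicate_order_id': 1}
```

Checks like these are typically wired into the pipeline's automated test stage so a failing count blocks deployment.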
Technical and Functional Skills:
· Bachelor’s Degree with 5+ years of experience, including 4+ years of hands-on experience developing Python scripts; strong knowledge of designing and developing data engineering applications in Python.
· 3+ years of hands-on experience with BigQuery.
· Expertise in data warehousing methodologies and modelling techniques.
· Solution design experience using GCP components.
· In-depth understanding of data warehouses and ETL tools.
· Strong experience writing SQL queries, joins, stored procedures, and user-defined functions.
· Sound knowledge of data architecture and design.
· Strong experience working with CI/CD and automating the deployment process.
· Experience setting up automated testing and QA.
· Experience working with Massively Parallel Processing (MPP) analytical data stores.
· Experience with Pub/Sub for handling streaming data is a plus.
· GCP Data Engineer or Solution Architect Certification is an added advantage.
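The SQL skills listed above (queries, joins, user-defined functions) can be sketched with a small, hypothetical example. It uses Python's built-in sqlite3, where a UDF is registered via `create_function`; warehouse engines such as BigQuery expose analogous `CREATE FUNCTION` syntax instead. The function and table names are invented for illustration.

```python
import sqlite3

def net_amount(gross, tax_rate):
    # Strip tax from a gross amount; propagate NULLs as None.
    if gross is None or tax_rate is None:
        return None
    return round(gross / (1 + tax_rate), 2)

conn = sqlite3.connect(":memory:")
# Register the Python function as a 2-argument SQL UDF named net_amount.
conn.create_function("net_amount", 2, net_amount)
conn.execute("CREATE TABLE sales (gross REAL, tax_rate REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", [(110.0, 0.10), (None, 0.10)])

rows = conn.execute("SELECT net_amount(gross, tax_rate) FROM sales").fetchall()
print(rows)  # [(100.0,), (None,)]
```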