Job Description:
Responsibilities:
• Experience across all phases of data integration work, including architecture, design, coding, and testing
• Architect the data warehouse and guide the team through implementation using Snowflake, Hadoop, or other big data technologies
• Strong scripting skills in Python
• Experience in performance tuning of Snowflake pipelines and the ability to troubleshoot issues
• Able to demonstrate proposed solutions, with excellent communication and presentation skills
Qualifications:
• Minimum of 3 years of experience designing and implementing fully operational solutions on Snowflake Data Warehouse or Hadoop
• Experience with Python and a major relational database.
• Good presentation and communication skills, both written and verbal
• Ability to solve problems and translate requirements into designs
• Work experience optimizing the performance of Spark jobs
Education: Any graduate