Description

  • Understand multiple data sources (XML, relational, CSV, TXT, Excel) and feed them into the Data Lake (a brief illustrative sketch follows this list).
  • Work with technical and business stakeholders to understand the data sources, ingestion, movement, structures, and processes in existing data warehouses in order to generate requirements and designs.
  • Mentor the team on big data tools and technologies, and help them with all technical aspects, including best practices.
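
For illustration only, below is a minimal sketch of the kind of multi-format ingestion described above, written in Python. The bucket name (example-data-lake), file names, and lake layout are hypothetical placeholders, not details of this role's actual environment.

```python
# Ingestion sketch: parse heterogeneous sources with pandas and land them
# in a GCS-backed data lake as Parquet. All names are hypothetical.
import pandas as pd
from google.cloud import storage

# Map each source file to the pandas reader that can parse it.
SOURCES = {
    "orders.csv": pd.read_csv,
    "clients.xlsx": pd.read_excel,   # requires openpyxl
    "claims.xml": pd.read_xml,       # requires lxml
}

def land_in_lake(bucket_name: str = "example-data-lake") -> None:
    bucket = storage.Client().bucket(bucket_name)
    for filename, reader in SOURCES.items():
        df = reader(filename)                       # parse the raw source
        parquet_name = filename.rsplit(".", 1)[0] + ".parquet"
        df.to_parquet(parquet_name, index=False)    # normalize to columnar form
        bucket.blob(f"raw/{parquet_name}").upload_from_filename(parquet_name)

if __name__ == "__main__":
    land_in_lake()
```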

Qualifications:

  • 5+ years of experience.
  • Minimum 3-5 years of experience in relevant technologies.
  • Minimum 8 years of experience in the IT industry.
  • Experience in healthcare preferred.

Tools:

  • Strong knowledge of GCP, Looker, SQL, Python, Java, and shell scripting.

Education:

  • Bachelor's or Master's degree in Computer Applications, or an equivalent engineering/IT degree from an accredited university.

Responsibilities and Skills:

  • Complete hands-on experience with all of the above tools and technologies.
  • Extract data from the Data Lake into the data warehouse, and vice versa.
  • Extract data incrementally or as a full load, as requirements dictate (see the sketch after this list).
  • Follow the configuration and change management process.
  • Ability to solve problems with data.
  • Experience working in cloud computing environments.
  • Tune solutions to improve performance and the end-user experience.
  • Contribute to group knowledge-sharing platforms and best practices.
  • High attention to data accuracy.
  • Ability to work in an agile team.
  • Critical thinking to ask questions, determine the best course of action, and offer solutions.
  • Effective analytical and decision-making skills.
  • Strong interpersonal skills to build relationships and communicate effectively with managers, co-workers, and business partners.
  • Teamwork skills within the department and on project teams.
  • Demonstrated ability to work effectively in a fast-paced, complex, and dynamic business environment.
  • Enjoys being challenged and solving complex problems on a daily basis.
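
As a sketch of the incremental-versus-full-load pattern referenced above: a common approach tracks a high-watermark timestamp and extracts only rows modified since the last successful run. The table and column names here (source.orders, updated_at) are hypothetical placeholders, not part of this role's actual schema.

```python
# Watermark-based extraction sketch: full load on the first run (no
# watermark yet), incremental afterwards. Names are hypothetical.
from datetime import datetime, timezone
from typing import Optional

def build_extract_query(table: str, watermark: Optional[datetime]) -> str:
    base = f"SELECT * FROM {table}"
    if watermark is None:
        return base  # first run: pull everything (full load)
    # later runs: pull only rows changed since the last successful extract
    return f"{base} WHERE updated_at > '{watermark.isoformat()}'"

print(build_extract_query("source.orders", None))  # full load
print(build_extract_query(
    "source.orders",
    datetime(2024, 1, 1, tzinfo=timezone.utc)))    # incremental
```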
