Description

Responsibilities

  • Analyze, model, and wrangle data and files to prepare them for use.
  • Build data pipelines to move data into Redshift.
  • Write Python or Spark scripts to produce data extracts.
  • Create views in Redshift to expose data to Tableau.
  • Conduct working sessions with customer and partner teams to define and validate solution requirements.
  • Develop a deployment model or approach to provision and implement the defined services.
  • Advise on and implement AWS best practices.
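As an illustration of the extract-scripting responsibility above, here is a minimal sketch in plain Python (no AWS dependencies; the record layout and field names are hypothetical) of flattening nested JSON records into a CSV-style extract:

```python
import csv
import io
import json

def flatten(record, parent_key="", sep="_"):
    """Recursively flatten a nested JSON object into a single-level dict."""
    items = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

def to_csv_extract(json_lines):
    """Turn newline-delimited JSON into a CSV extract string."""
    rows = [flatten(json.loads(line)) for line in json_lines if line.strip()]
    fieldnames = sorted({key for row in rows for key in row})
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Hypothetical sample input (newline-delimited JSON, as commonly landed in S3)
sample = [
    '{"id": 1, "user": {"name": "a", "region": "us-east-1"}}',
    '{"id": 2, "user": {"name": "b", "region": "eu-west-1"}}',
]
print(to_csv_extract(sample))
```

In a real pipeline the same transformation would typically run as a Glue job or a PySpark script, with the output written to S3 for a Redshift COPY.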

Required Skills

  • Good knowledge of AWS services such as EC2, S3, Lambda, Glue, Data Pipeline, Kinesis Data Firehose, Athena, Kinesis, etc.
  • Equally strong or better knowledge of Redshift.
  • Senior-level knowledge of Python and/or Spark, and of JSON.
  • Knowledge of Docker, SNS, and CloudWatch is also useful.
  • Good communication and analytical skills.

Required Experience

  • Hands-on data engineering experience with AWS Cloud.
  • Experience with data architecture, data governance, data quality, and data security.
  • Experience with BI technologies such as Tableau, OBIEE, QlikView, or Microsoft BI is a plus.
  • Experience with enterprise modeling and enterprise reporting solutions.
  • Expertise in SQL is a must; a minimum of 6 years of SQL experience is desired.
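To give a concrete, runnable sense of the SQL and view-creation work described above, here is a small sketch using sqlite3 as a stand-in for Redshift (table and column names are hypothetical; Redshift's SQL is Postgres-flavored, but the CREATE VIEW pattern is the same shape):

```python
import sqlite3

# In-memory database standing in for a Redshift cluster.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("us-east-1", 10.0), ("us-east-1", 5.0), ("eu-west-1", 7.5)],
)

# A view like this is what a Tableau workbook would typically point at,
# so the BI layer never queries the raw table directly.
conn.execute("""
    CREATE VIEW v_sales_by_region AS
    SELECT region, SUM(amount) AS total_amount
    FROM sales
    GROUP BY region
""")

result = conn.execute(
    "SELECT * FROM v_sales_by_region ORDER BY region"
).fetchall()
print(result)
```

Exposing aggregations through views keeps the reporting contract stable even when the underlying tables change.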

Education Requirements

  • Bachelor’s Degree in Computer Science, Computer Engineering or a closely related field.
