Description

Responsibilities

  • Acquire and interpret data, profile it, and analyze the results.
  • Identify, analyze, and interpret trends or patterns in complex data sets; troubleshoot data quality issues.
  • Develop and maintain scalable data pipelines.
  • Work closely with a team of frontend and backend engineers, product managers and analysts.
  • Implement business requirements, data quality rules, and improvements to data flows using tools such as Talend, Snowpipe, and scripting (a sketch of one such data quality rule follows this list).
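
For illustration only, here is a minimal Python sketch of the kind of data quality rule such a data flow might enforce, checking a BigQuery table for null and duplicate keys; the project, dataset, table, and column names are hypothetical.

    from google.cloud import bigquery

    # Hypothetical table; replace with a real project.dataset.table.
    TABLE = "my-project.analytics.orders"

    def run_quality_checks(client: bigquery.Client) -> dict:
        """Return violation counts for two simple data quality rules."""
        sql = f"""
            SELECT
              COUNTIF(order_id IS NULL)           AS null_order_ids,
              COUNT(*) - COUNT(DISTINCT order_id) AS duplicate_order_ids
            FROM `{TABLE}`
        """
        row = next(iter(client.query(sql).result()))  # single summary row
        return {"null_order_ids": row.null_order_ids,
                "duplicate_order_ids": row.duplicate_order_ids}

    if __name__ == "__main__":
        client = bigquery.Client()  # uses Application Default Credentials
        failed = {k: v for k, v in run_quality_checks(client).items() if v > 0}
        if failed:
            raise SystemExit(f"Data quality checks failed: {failed}")
        print("All data quality checks passed.")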

Required Skills

  • Strong background building custom data pipelines with GCP target data stores (Bigtable, GCS, BigQuery); see the sketch after this list.
  • Knowledge of VPCs, private/public subnets, network security groups, and firewalls.
  • Strong analytical, problem-solving, data analysis and research skills.
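
For illustration only, a minimal Python sketch of a GCS-to-BigQuery load of the kind referenced in the first bullet above; the bucket, dataset, and table names are hypothetical, and the schema is auto-detected for brevity.

    from google.cloud import bigquery

    # Hypothetical GCS path and destination table, for illustration only.
    GCS_URI  = "gs://example-bucket/exports/orders_*.csv"
    TABLE_ID = "my-project.analytics.orders"

    def load_csv_from_gcs() -> None:
        """Load CSV files from GCS into a BigQuery table."""
        client = bigquery.Client()
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,  # skip the header row
            autodetect=True,      # infer the schema from the data
            write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        )
        load_job = client.load_table_from_uri(GCS_URI, TABLE_ID, job_config=job_config)
        load_job.result()         # block until the load job completes
        table = client.get_table(TABLE_ID)
        print(f"Loaded into {TABLE_ID}; table now has {table.num_rows} rows.")

    if __name__ == "__main__":
        load_csv_from_gcs()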

Required Experience

  • Minimum 2 years of hands-on experience using GCP services (e.g., Dataproc, Pub/Sub, BigQuery, Dataflow, Cloud Run/Cloud Functions, GKE), with a strong background in Big Data technologies.
  • Prior experience migrating workloads from on-premises to cloud and with legacy modernization (desirable).
  • Minimum 2 years of hands-on experience with Google Cloud Platform (GCP), including Compute, DevOps, Storage, and Security components.
  • Must have hands-on experience with large data transfers into GCP target databases and with building high-throughput data pipelines.
  • Experience in SQL tuning to address performance and cost considerations (see the sketch below).
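
For illustration only, a minimal Python sketch of one common BigQuery tuning step: using a dry run to compare the bytes a query would scan (which drives cost) before and after selecting only the needed columns and filtering on a partition column; the table and column names are hypothetical.

    from google.cloud import bigquery

    client = bigquery.Client()

    # Hypothetical date-partitioned table and columns, for illustration only.
    UNTUNED = "SELECT * FROM `my-project.analytics.events`"
    TUNED = """
        SELECT user_id, event_name
        FROM `my-project.analytics.events`
        WHERE event_date BETWEEN '2024-01-01' AND '2024-01-07'  -- prune partitions
    """

    def estimate_bytes(sql: str) -> int:
        """Dry-run a query and return how many bytes it would process."""
        config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
        job = client.query(sql, job_config=config)  # no data is actually read
        return job.total_bytes_processed

    for label, sql in [("untuned", UNTUNED), ("tuned", TUNED)]:
        print(f"{label}: ~{estimate_bytes(sql) / 1e9:.2f} GB scanned")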

Education Requirements

  • Bachelor’s Degree in Computer Science, Computer Engineering or a closely related field.
