Description

Job Responsibilities:

  • Develop and implement data pipelines using Google Cloud Dataflow.
  • Optimize data processing workflows for performance and cost efficiency.
  • Collaborate with IBM UDMH Data Modeler and analysts to understand data requirements.
  • Ensure data quality and integrity throughout the pipeline.

Required Skills:

  • Google Cloud Dataflow
  • Python and Java (strong experience required)

Primary Skills:

  • BigQuery, SQL, ETL, Cloud Functions, Dataflow, Airflow
  • 10+ years of IT experience with strong expertise in BigQuery and SQL.
  • Hands-on experience in ETL and data migration.
  • Mastery of Google Cloud Platform (GCP) and related services, including Cloud Dataflow, Cloud Pub/Sub, Cloud Storage, and Cloud Composer.
  • Solid understanding of Git and Git workflows (GitHub, GitLab, Bitbucket).
  • Experience working in Agile environments and CI/CD pipelines.
  • Ability to collaborate with business stakeholders and product managers to prioritize and deliver tasks.
  • Handle administrative activities such as project setup, user access, and job scheduling.
  • Write and test scripts using Unix shell scripting and Oracle PL/SQL.
  • Conduct unit testing and validation.

Education

Any Graduate