We are seeking a skilled Data Engineer with proven experience in enterprise-level data migrations, particularly from Teradata to Google Cloud Platform (GCP). This role requires hands-on expertise in modern cloud data services, data modeling, and performance optimization.
Primary Responsibilities:
- Spearhead the end-to-end migration of data systems from Teradata to GCP BigQuery, maintaining high standards of data quality, performance, and reliability throughout the process.
- Build and maintain efficient ETL/ELT pipelines using tools such as Cloud Dataflow, Apache Beam, or Cloud Data Fusion.
- Work collaboratively with cloud engineers to uphold security, compliance, and cost-efficiency across GCP infrastructure.
- Optimize SQL queries and workloads in both Teradata and BigQuery to ensure high performance at scale.
- Collaborate with business stakeholders, analysts, and data architects to capture requirements and ensure accurate data transformation.
- Document technical workflows and implement testing procedures to validate successful data migration.
- Troubleshoot and resolve migration-related issues such as data mismatches, platform compatibility problems, and performance bottlenecks.
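To give a flavor of the validation work this role involves, here is a minimal sketch of a post-migration reconciliation check in Python: it compares row counts and an order-independent checksum between source and target extracts. The helper names and the in-memory tuple lists are hypothetical, for illustration only; in practice the rows would come from Teradata and BigQuery query results.

```python
import hashlib

def table_checksum(rows):
    """Order-independent checksum over rows (each row a tuple of values).

    Hashes each row individually and XORs the digests, so row order
    (which is not guaranteed across platforms) does not affect the result.
    """
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode("utf-8")).digest()
        acc ^= int.from_bytes(digest, "big")
    return acc

def reconcile(source_rows, target_rows):
    """Return simple migration-validation checks as a dict of booleans."""
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "checksum_match": table_checksum(source_rows) == table_checksum(target_rows),
    }

# Example: the target is a reordered copy of the source, so both checks pass.
src = [(1, "alice"), (2, "bob"), (3, "carol")]
tgt = [(3, "carol"), (1, "alice"), (2, "bob")]
print(reconcile(src, tgt))  # {'row_count_match': True, 'checksum_match': True}
```

Real migration validation would add per-column null counts, min/max aggregates, and sampling-based value comparisons, but the structure above is the core idea.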
Core Skill Requirements:
- Advanced, hands-on experience with Teradata, including complex SQL tuning and handling large-scale datasets.
- In-depth familiarity with GCP services, especially BigQuery, Cloud Storage, Pub/Sub, and Dataflow.
- Strong knowledge of ETL development using Python, SQL, and GCP-native tools.
- Sound understanding of data governance, data quality standards, and data modeling principles.
Nice to Have:
- Working experience with orchestration tools such as Cloud Composer or Apache Airflow.
- Familiarity with DevOps practices and CI/CD pipelines for automating data workflows.
- Awareness of cloud data security protocols and compliance frameworks such as HIPAA and GDPR.
- Google Cloud Professional Data Engineer certification is an added advantage.