Description

We are seeking a Data Engineer to design and implement a cloud-native data processing and API integration system. The solution will ingest identity data from upstream sources, detect record-level changes, and synchronize user metadata to a downstream system via API. This is a hands-on role focused on scalable data handling, automation, and fault-tolerant service deployment within Google Cloud Platform (GCP).

  • Solution Design & Development: Build modular Python applications that process identity data from files or APIs and sync it to target platforms.
  • Data Staging & Processing: Stage identity metadata in BigQuery using defined schemas and implement change detection logic (create/update/delete); a minimal sketch of this classification appears after this list.
  • API Integration: Design and implement logic to call RESTful APIs to maintain target user repositories (e.g., user attributes, roles); a hedged call-and-retry sketch follows this list.
  • Workflow Orchestration: Use GCP Pub/Sub, Composer, and/or Cloud Run to manage asynchronous workflows and ensure event-driven processing (see the Pub/Sub sketch below).
  • Infrastructure as Code: Deploy and manage services using Terraform with a focus on security, idempotency, and configuration as code.
  • Observability & Resilience: Implement logging, retry logic, and incident handling to ensure system reliability and traceability (the retry pattern is illustrated in the API sketch below).
  • Testing & Validation: Build automated test coverage for critical processing logic and API interactions.
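
The change detection called out above can be implemented by diffing the current staged snapshot against the previous one by primary key. A minimal Python sketch; the record shape, field names, and user IDs are illustrative assumptions, not part of this posting:

    from typing import Any

    def detect_changes(previous: dict[str, dict[str, Any]],
                       current: dict[str, dict[str, Any]]):
        # Diff two identity snapshots keyed by user ID and bucket the
        # differences into creates, updates, and deletes.
        creates = [rec for uid, rec in current.items() if uid not in previous]
        updates = [rec for uid, rec in current.items()
                   if uid in previous and rec != previous[uid]]
        deletes = [rec for uid, rec in previous.items() if uid not in current]
        return creates, updates, deletes

    # Example (hypothetical records): one role update and one new user.
    prev = {"u1": {"email": "a@example.com", "role": "viewer"}}
    curr = {"u1": {"email": "a@example.com", "role": "editor"},
            "u2": {"email": "b@example.com", "role": "viewer"}}
    creates, updates, deletes = detect_changes(prev, curr)

In practice the same comparison is often pushed down into BigQuery SQL over the staged tables; the in-memory version above only shows the classification logic.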

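The API synchronization and retry behavior described in the bullets above could look like the following sketch. The endpoint URL, payload shape, and bearer-token scheme are hypothetical stand-ins for the real target platform:

    import time
    import requests

    API_URL = "https://example.com/api/v1/users"  # hypothetical endpoint
    MAX_RETRIES = 3

    def sync_user(record: dict, token: str) -> None:
        # Upsert one user record, retrying transient failures with
        # exponential backoff (1s, 2s, 4s).
        for attempt in range(MAX_RETRIES):
            resp = requests.put(
                f"{API_URL}/{record['id']}",
                json=record,
                headers={"Authorization": f"Bearer {token}"},
                timeout=10,
            )
            if resp.status_code in (429, 500, 502, 503, 504):
                time.sleep(2 ** attempt)
                continue
            resp.raise_for_status()  # surface non-retryable errors
            return
        raise RuntimeError(f"user {record['id']}: gave up after {MAX_RETRIES} attempts")

A production version would also log each attempt and route exhausted records to a dead-letter queue for incident handling.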
 
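For the event-driven workflow bullet, detected changes are typically published to a Pub/Sub topic so downstream sync workers (for example, on Cloud Run) can process them asynchronously. A sketch using the google-cloud-pubsub client; the project and topic names are placeholders:

    import json
    from google.cloud import pubsub_v1  # pip install google-cloud-pubsub

    PROJECT_ID = "my-project"            # placeholder project
    TOPIC_ID = "identity-change-events"  # placeholder topic

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(PROJECT_ID, TOPIC_ID)

    def publish_change(action: str, record: dict) -> None:
        # Publish one change event; a subscriber performs the API sync.
        payload = json.dumps({"action": action, "record": record}).encode("utf-8")
        future = publisher.publish(topic_path, payload)
        future.result(timeout=30)  # wait for broker acknowledgement
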
Required Qualifications:

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or equivalent work experience
  • 6+ years in backend development or data engineering roles focused on identity, security, or metadata systems
  • Strong Python engineering skills for data processing and backend development
  • Advanced experience with GCP services: BigQuery, Cloud Run, Cloud Functions, Cloud Composer, Pub/Sub, Cloud Storage, Secret Manager, Cloud Scheduler
  • Experience interacting with REST APIs, including OAuth2 or token-based authentication
  • Terraform for cloud infrastructure automation
  • Proficiency with SQL for data transformation and validation
  • Strong understanding of CI/CD, containers (Docker), Git workflows
  • Comfortable working with structured metadata, user roles, and directory-style data
  • Able to work independently and meet delivery milestones
  • Strong documentation and debugging skills
  • Must adhere to enterprise security and change control practices

 
Preferred Qualifications:

  • Experience integrating with IAM or identity systems (e.g., LDAP, Okta, custom directories)
  • Background working in regulated or high-security environments
  • Experience handling large-scale user datasets (millions of records)
  • Familiarity with hybrid data processing (batch + streaming)
  • GCP Certifications

Education

Bachelor's or Master's degree
