Description

We are seeking a highly skilled Data Modeler with strong experience in Google Cloud Platform (GCP) to design and implement scalable, efficient, and secure data models supporting enterprise analytics, reporting, and data warehousing initiatives.

Key Responsibilities:
• Design and develop conceptual, logical, and physical data models aligned with business requirements.
• Implement data modeling best practices using GCP services such as BigQuery, Cloud SQL, Spanner, and Cloud Storage.
• Collaborate with data engineers, analysts, and business stakeholders to define data architecture and governance standards.
• Optimize data structures for performance, scalability, and cost-efficiency in GCP.
• Ensure data integrity, consistency, and compliance with security and privacy regulations.
• Document data models, metadata, and data dictionaries.

Required Skills:
• Strong expertise in data modeling techniques (e.g., star/snowflake schemas, normalization/denormalization).
• Hands-on experience with GCP services: BigQuery, Cloud SQL, Cloud Storage, Dataflow, and Pub/Sub.
• Proficiency in SQL and data profiling tools.
• Familiarity with ETL/ELT pipelines and data integration tools.
• Understanding of data governance, lineage, and cataloging (e.g., GCP Data Catalog).
• Excellent communication and documentation skills.

Preferred Qualifications:
• Google Cloud Professional Data Engineer or Professional Cloud Architect certification.
• Experience with NoSQL databases (e.g., Firestore, MongoDB).
• Knowledge of tools like dbt, Looker, or Tableau.
• Background in Big Data ecosystems (e.g., Hadoop, Apache Beam).

Minimum Years of Experience:
5+ years

Education

Bachelor's degree