Key Responsibilities:
· Design, develop, and manage scalable data architectures and pipelines on Google Cloud Platform (GCP).
· Build and optimize complex data models and ETL processes using BigQuery, LookML, Python, PySpark, and SQL (a minimal sketch of such a pipeline step follows this list).
· Utilize Dataflow and other GCP services to automate data processing and orchestration.
· Leverage Looker Studio to create insightful dashboards and reports that deliver business value.
· Collaborate with data scientists, analysts, and other stakeholders to gather requirements and translate them into technical solutions.
· Conduct data analysis to identify trends, patterns, and insights to support data-driven decision-making.
· Apply advanced data modeling techniques to ensure data quality, integrity, and consistency.
· Manage and optimize the entire data lifecycle, from ingestion to archiving and deletion.
· Develop and implement best practices for data governance, security, and compliance on GCP.
· Communicate complex data findings to non-technical stakeholders through storytelling and data visualization.
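For illustration only, the following is a minimal sketch of the kind of BigQuery-bound PySpark ETL step described above. The project, bucket, table, and column names are placeholders, and it assumes the spark-bigquery connector is available on the cluster.

# Minimal ETL sketch: extract from GCS, transform, load to BigQuery.
# All resource names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV files landed in a GCS bucket.
raw = (
    spark.read.option("header", True)
    .csv("gs://example-landing-bucket/orders/*.csv")
)

# Transform: deduplicate, normalize the timestamp, derive a net amount.
orders = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("net_amount", F.col("gross_amount") - F.col("discount"))
)

# Load: write to BigQuery via the spark-bigquery connector; a temporary
# GCS bucket is required for indirect writes.
(
    orders.write.format("bigquery")
    .option("table", "example-project.analytics.orders")
    .option("temporaryGcsBucket", "example-temp-bucket")
    .mode("append")
    .save()
)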
Qualifications:
· Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
· 8+ years of hands-on experience in data engineering, data modeling, and analytics on Google Cloud Platform (GCP).
· Proficient in BigQuery, LookML, Python, PySpark, and SQL for data processing and analysis.
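As a brief illustration of the BigQuery, Python, and SQL proficiency listed above, the sketch below runs a parameterized query with the google-cloud-bigquery client. The project, dataset, and table names are placeholders.

# Illustrative only: a parameterized BigQuery query from Python.
import datetime

from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project

sql = """
    SELECT customer_id, SUM(net_amount) AS revenue
    FROM `example-project.analytics.orders`
    WHERE order_ts >= @cutoff
    GROUP BY customer_id
    ORDER BY revenue DESC
    LIMIT 10
"""

# Bind the @cutoff query parameter to a concrete timestamp.
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter(
            "cutoff",
            "TIMESTAMP",
            datetime.datetime(2024, 1, 1, tzinfo=datetime.timezone.utc),
        )
    ]
)

# Execute and print the top customers by revenue.
for row in client.query(sql, job_config=job_config).result():
    print(row.customer_id, row.revenue)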