Job Description

 

Key Responsibilities

1. Solution Design & Architecture

Design and implement scalable, reliable data pipelines for both batch and real-time streaming use cases.

Architect cloud-native solutions leveraging GCP services to ensure high availability, performance, and cost efficiency.

Develop embedded analytics platforms that deliver actionable insights within operational workflows.

 

2. Data Stack Development

Construct optimized real-time data pipelines with built-in data quality metrics using GCP services.

Construct batch pipelines using the appropriate GCP services.

Design and optimize streaming data architectures to process high-volume, high-velocity data streams in real time.

 

3. Advanced Analytics & Data Science Enablement

Support machine learning and data science workflows including model development, deployment, and monitoring.

Facilitate predictive analytics solutions to drive data-driven decision-making.

 

4. Application Architecture Collaboration

Work closely with software development teams to align applications with data and analytics architecture.

Define and enforce architectural standards and best practices that meet business needs.

 

5. Technology Leadership & Innovation

Stay current with industry trends and emerging technologies in data engineering, cloud, and analytics.

Provide thought leadership and mentor teams on best practices in data architecture and cloud solutions.

Collaborate cross-functionally to deliver end-to-end data solutions.

 

Required Skills & Qualifications

Technical Expertise:

Strong data architecture skills and hands-on experience with Google Cloud Platform (GCP).

Deep experience with core GCP services such as BigQuery, Cloud Storage, Pub/Sub, Dataflow, Spanner, Bigtable, Cloud SQL, Dataproc, and Cloud Composer (managed Apache Airflow).

Proficiency in building and managing streaming data pipelines using Apache Kafka, Google Cloud Dataflow, and serverless technologies.

Strong programming skills in Python, Java, and SQL.

Experience with containerization and orchestration tools like Docker and Kubernetes on GCP.

Familiarity with infrastructure automation tools such as Terraform and Google Cloud Deployment Manager.

Expertise in data modeling, ETL/ELT pipelines, data migration, and data warehousing concepts.

Knowledge of big data frameworks like Hadoop and Spark is a plus.

Experience with embedded analytics platforms (e.g., Tableau Embedded, Power BI Embedded) and ML frameworks (e.g., TensorFlow, PyTorch) is desirable.

 

Application Architecture:

Strong understanding of microservices architecture, API management, and cloud-native application design.

 

Soft Skills:

Excellent analytical and problem-solving abilities.

Strong communication skills with the ability to engage and manage stakeholders effectively.

Ability to work in a fast-paced environment and manage multiple priorities.

Proven leadership in guiding technical teams and managing client relationships.

 

Education & Certification:

Bachelor’s degree in Computer Science, Information Technology, or related fields.

Google Professional Cloud Architect certification is required.

 

Preferred Experience:

Experience in solution architecture or related data roles.

Familiarity with data governance, security best practices, and compliance standards.

 
