Description

Key Skills: Scala, Google BigQuery, Data Pipelines, ETL/ELT, Data Engineering, Data Governance, Workflow Automation, Performance Optimization.

Roles and Responsibilities:

  • Design and implement robust, scalable data pipelines using Scala and BigQuery (a minimal illustrative sketch follows this list).
  • Develop ETL/ELT processes to ingest, transform, and store large volumes of structured and semi-structured data.
  • Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver high-quality solutions.
  • Optimize BigQuery queries and manage cost-effective data processing.
  • Ensure data quality, integrity, and governance across all data systems.
  • Automate data workflows and implement monitoring and alerting systems.
  • Participate in code reviews, architecture discussions, and performance tuning.
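
For illustration only, a minimal sketch of what one such pipeline step might look like in Scala, using the Google Cloud BigQuery Java client. It assumes the google-cloud-bigquery library on the classpath and Application Default Credentials; the project, dataset, and table names are hypothetical placeholders, and a production pipeline would add scheduling, retries, monitoring, and alerting as described above.

    // Minimal ELT sketch (assumptions noted above): aggregate raw events
    // inside BigQuery and write the result into a destination table.
    import com.google.cloud.bigquery.{BigQueryOptions, JobInfo, QueryJobConfiguration, TableId}

    object DailyEventCounts {
      def main(args: Array[String]): Unit = {
        // Uses Application Default Credentials and the default project.
        val bigquery = BigQueryOptions.getDefaultInstance.getService

        // The transform stays inside BigQuery (ELT): count events per user.
        // `my_project.raw.events` is a hypothetical source table.
        val sql =
          """SELECT user_id, COUNT(*) AS event_count
            |FROM `my_project.raw.events`
            |GROUP BY user_id""".stripMargin

        val queryConfig = QueryJobConfiguration
          .newBuilder(sql)
          .setDestinationTable(TableId.of("my_project", "analytics", "daily_event_counts"))
          .setWriteDisposition(JobInfo.WriteDisposition.WRITE_TRUNCATE)
          .build()

        // Runs the query job synchronously and returns result metadata.
        val result = bigquery.query(queryConfig)
        println(s"Wrote ${result.getTotalRows} aggregated rows")
      }
    }

In practice, a step like this would typically be orchestrated by a workflow tool and wrapped with data-quality checks, in line with the governance and monitoring responsibilities above.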

Experience Requirement:

  • 5-10 years of experience in data engineering with a strong focus on building production-grade data pipelines.
  • Solid hands-on experience developing data processing applications in Scala.
  • Proven expertise in using Google BigQuery for large-scale data processing and analytics.
  • Experience in optimizing query performance and managing data workflows.
  • Background in data governance, data quality assurance, and pipeline monitoring.

Education: Any Graduate
