Description

Responsibilities:

  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
  • Ensure data quality, integrity, and security across all data pipelines.
  • Optimize and tune data processes for performance and cost-efficiency.
  • Implement and manage data storage solutions on Google Cloud Platform.
  • Monitor data pipelines, troubleshoot issues, and implement fixes.
  • Stay up-to-date with industry trends and best practices in data engineering.
Requirements:

  • Bachelor's degree in Computer Science, Information Technology, or a related field.
  • Proven experience as a Data Engineer or in a similar role.
  • Strong proficiency in SQL and Python.
  • Knowledge of Google Cloud Platform (GCP) and its data services (e.g., BigQuery, Cloud Storage, Cloud Composer).
  • Experience with ETL tools and data integration techniques.
  • Familiarity with data warehousing concepts and technologies.
  • Excellent problem-solving skills and attention to detail.
  • Strong communication and collaboration skills.
  • Experience with other cloud platforms (e.g., AWS, Azure).
  • Knowledge of data modeling and database design.
  • Experience with big data technologies (e.g., Hadoop, Spark).
