Description

Job Summary:

  • Responsible for designing, developing, and optimizing data pipelines, ensuring the efficient storage, processing, and retrieval of large-scale datasets.
  • This role requires strong experience in ETL development, data modeling, performance tuning, and cloud-based data solutions.


Required Skills & Experience:

  • 5+ years of experience in Data Engineering.
  • Expertise in Snowflake, including architecture, SQL scripting, and performance tuning.
  • Strong SQL skills and experience with ETL/ELT development.
  • Experience with cloud platforms (e.g., AWS, Azure).
  • Proficiency in Python and Databricks for data processing.
  • Hands-on experience with dbt or similar data transformation tools.
  • Strong understanding of data warehousing principles, data lakes, and data governance.
  • Familiarity with CI/CD pipelines, version control (Git), and DevOps for data.

Education

Any Graduate