Description

Key Responsibilities:

  • Develop and maintain scalable, performant data pipelines to ingest, transform, and deliver data efficiently.
  • Leverage Snowflake for data engineering tasks, including ingestion, transformation, performance optimization, and access control.
  • Manage data exchange workflows using AWS S3, ensuring secure and efficient data loading/unloading processes.
  • Automate and orchestrate data tasks using SQL, Python, Bash, or similar scripting tools.
  • Apply best practices in ETL/ELT design and data modeling to ensure data consistency, reliability, and performance.
  • Collaborate with data analysts and BI teams to ensure clean, curated datasets are available for reporting and analytics.
  • Use CI/CD workflows and version control to maintain high-quality, production-ready data pipeline code.

Key Qualifications:

  • Strong hands-on experience with Snowflake, including performance tuning and security best practices.
  • Working knowledge of AWS S3 and secure data handling processes.
  • Proficiency in SQL and scripting (Python, Bash, or equivalent).
  • Experience with ETL/ELT patterns and a solid grasp of data modeling techniques.
  • Proven ability to collaborate with cross-functional teams, particularly analytics and BI stakeholders.
  • Exposure to CI/CD tools and workflows is a plus.

Education

Any Graduate