Description

Key Responsibilities:

  • Develop and maintain data pipelines and ETL processes using Snowflake, Streams & Tasks, and Snowpipe.
  • Leverage Snowpark to build scalable data transformations in Python (a minimal sketch of this pattern follows this list).
  • Implement Secure Data Sharing, Row-Level Security, and Dynamic Data Masking for governed data access (see the governance sketch after the Technical Skills list).
  • Create and manage materialized views, automatic clustering, and search optimization for performance tuning.
  • Collaborate with data scientists, analysts, and DevOps teams to deliver end-to-end data solutions.
  • Monitor query performance, troubleshoot issues, and recommend optimizations using Query Profile.
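
For context, a minimal sketch of the pipeline pattern described in the bullets above: a stream capturing changes on a raw table, consumed by a Snowpark transformation or, alternatively, by a scheduled Task. This is an illustration, not this team's actual pipeline; all object names (raw.orders, curated.daily_revenue, transform_wh) and connection parameters are hypothetical.

    # Minimal Snowpark sketch; all names and credentials are placeholders.
    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col, sum as sum_

    # In practice, credentials come from a secrets manager, not literals.
    session = Session.builder.configs({
        "account": "<account_identifier>",
        "user": "<user>",
        "password": "<password>",
        "warehouse": "transform_wh",  # hypothetical warehouse
        "database": "analytics",      # hypothetical database
        "schema": "curated",
    }).create()

    # Stream: capture inserts/updates on the raw orders table (CDC).
    session.sql(
        "CREATE STREAM IF NOT EXISTS raw.orders_stream ON TABLE raw.orders"
    ).collect()

    # Snowpark transformation: roll completed orders up to daily revenue.
    new_orders = session.table("raw.orders_stream")
    daily_revenue = (
        new_orders.filter(col("status") == "COMPLETE")
                  .group_by("order_date")
                  .agg(sum_("amount").alias("revenue"))
    )
    # Appending is DML, so it also advances the stream's offset.
    daily_revenue.write.mode("append").save_as_table("curated.daily_revenue")

    # Alternative: schedule the same aggregation as a Task that fires
    # only when the stream has unconsumed rows.
    session.sql("""
        CREATE TASK IF NOT EXISTS raw.load_daily_revenue
          WAREHOUSE = transform_wh
          SCHEDULE = '60 MINUTE'
          WHEN SYSTEM$STREAM_HAS_DATA('raw.orders_stream')
        AS
          INSERT INTO curated.daily_revenue
          SELECT order_date, SUM(amount) AS revenue
          FROM raw.orders_stream
          WHERE status = 'COMPLETE'
          GROUP BY order_date
    """).collect()
    session.sql("ALTER TASK raw.load_daily_revenue RESUME").collect()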

Technical Skills:

  • Strong expertise in Snowflake SQL and dimensional data modeling (star and snowflake schemas).
  • Hands-on with Snowpark, Streams/Tasks, and Secure Data Sharing.
  • Proficiency in Python or Java for data processing with Snowpark.
  • Experience with the cloud platforms on which Snowflake is hosted: AWS, Azure, or GCP.
  • Familiarity with CI/CD, Git, and orchestration/transformation tools (e.g., Airflow, dbt).
  • Working knowledge of data governance, data security, and compliance best practices.
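
For context on the governance items above, a hedged sketch of Dynamic Data Masking and Row-Level Security applied through a Snowpark session. The governance schema, policy names, role names, and the role-to-region mapping table (governance.region_map) are assumptions for illustration only.

    # Governance sketch; policy, schema, role, and table names are hypothetical.
    from snowflake.snowpark import Session

    session = Session.builder.configs({
        "account": "<account_identifier>",
        "user": "<user>",
        "password": "<password>",
    }).create()

    # Dynamic Data Masking: only PII_ADMIN sees raw email addresses.
    session.sql("""
        CREATE OR REPLACE MASKING POLICY governance.email_mask AS (val STRING)
          RETURNS STRING ->
          CASE WHEN CURRENT_ROLE() = 'PII_ADMIN' THEN val ELSE '***MASKED***' END
    """).collect()
    session.sql("""
        ALTER TABLE curated.customers MODIFY COLUMN email
          SET MASKING POLICY governance.email_mask
    """).collect()

    # Row-Level Security: expose rows only for regions mapped to the
    # caller's role in the (hypothetical) governance.region_map table.
    session.sql("""
        CREATE OR REPLACE ROW ACCESS POLICY governance.region_rls
          AS (sales_region STRING) RETURNS BOOLEAN ->
          CURRENT_ROLE() = 'SYSADMIN'
          OR EXISTS (
            SELECT 1 FROM governance.region_map m
            WHERE m.role_name = CURRENT_ROLE() AND m.region = sales_region
          )
    """).collect()
    session.sql("""
        ALTER TABLE curated.sales
          ADD ROW ACCESS POLICY governance.region_rls ON (sales_region)
    """).collect()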

Qualifications:

  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  • A Snowflake certification (e.g., SnowPro Core or SnowPro Advanced) is a strong plus.
