Description

Key Responsibilities

  • Design, develop, and manage Snowflake data warehouse solutions.
  • Establish best practices for Snowflake usage with tools like Airflow, DBT, and Spark.
  • Develop and deploy data pipeline frameworks using CI/CD and standard testing tools.
  • Monitor and tune query and data load performance.
  • Support QA and UAT phases by validating issues and identifying root causes.
  • Integrate Snowflake with internal platforms for data quality, cataloging, discovery, incident logging, and metrics.
  • Utilize Snowflake features such as data sharing, time travel, Snowpark, and workload optimization.
  • Ingest and manage structured and unstructured data across hybrid environments.

Required Qualifications

  • Bachelor’s degree in Computer Science, Software Engineering, Information Technology, or related field.
  • Minimum 10 years of experience in data development in complex, high-volume environments.
  • At least 7 years of experience with SQL/PL/SQL and complex query development.
  • Minimum 5 years of experience developing data solutions on Snowflake.
  • At least 3 years of experience with Python and libraries such as Pandas, NumPy, and PySpark.
  • At least 3 years of experience in hybrid data environments (on-prem and cloud).
  • Hands-on experience with Airflow (or similar tools like Dagster).
  • Snowflake SnowPro Core certification (mandatory).

Preferred Qualifications

  • Snowflake SnowPro Advanced Architect and Advanced Data Engineer certifications.
  • Experience with DBT.
  • Strong skills in performance tuning of SQL queries, Spark jobs, and stored procedures.
  • Understanding of E-R data models and advanced data warehouse concepts (e.g., Factless Fact Tables, Temporal/Bi-Temporal models).
  • Experience in functional and event-driven programming.
  • Familiarity with AWS, Kubernetes, and Docker.
  • Strong analytical and problem-solving skills.

Education

Bachelor’s degree