· Evaluate, re-architect, and optimize existing Snowflake virtual warehouses and compute resources to enhance cost efficiency and utilization.
· Develop and implement strategies for right-sizing virtual warehouses, managing auto-suspend/resume, and consolidating warehouses where appropriate.
· Configure and manage Snowflake Resource Monitors to prevent runaway costs and ensure adherence to budget constraints.
· Automate compute management and optimization workflows, primarily through the use of Snowflake Tasks and Snowpark Container Services (SPCS).
Qualifications:
· Bachelor's degree in Computer Science, Information Technology, Data Engineering, or a related quantitative field. Master's degree preferred.
· 7+ years of experience in data architecture and engineering roles, including at least 4 years of hands-on, in-depth experience with the Snowflake data platform.
· Proven track record of successfully leading and implementing Snowflake compute and cost optimization projects in a large-scale production environment.
· Expert-level proficiency in SQL, with extensive experience in query optimization and performance tuning.
· Deep understanding of Snowflake's unique architecture, including virtual warehouses, micro-partitions, caching mechanisms, clustering keys, and resource monitors.
· Strong experience with ETL/ELT processes and data pipeline optimization within Snowflake.
· Proficiency in at least one scripting/programming language (e.g., Python) for automation and data manipulation.
· Familiarity with cloud platforms (AWS, Azure, or GCP) where Snowflake is deployed, including cloud storage solutions (e.g., S3, Azure Blob Storage).
· Excellent analytical, problem-solving, and troubleshooting skills, particularly for complex performance issues.
· Strong communication, collaboration, and stakeholder management skills.
· Snowflake certification (e.g., SnowPro Advanced Architect, SnowPro Advanced Administrator) is highly desirable.