Description

Key Responsibilities

Technical

Design and implement modular, reusable DBT models for data transformation in Snowflake

Optimize Snowflake performance through clustering, partitioning, caching, and query tuning

Define and manage schema objects, including databases, schemas, tables, views, and stages

Build and maintain ELT pipelines using Snowflake-native features like Snowpipe, Streams, and Tasks
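
For illustration, a minimal sketch of the Streams-plus-Tasks part of this pattern, driven from the snowflake-connector-python client. All credentials and object names (RAW_ORDERS, ORDERS_STREAM, STG_ORDERS, ETL_WH) are hypothetical placeholders, not from this posting:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",  # placeholder credentials
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

# Capture row-level changes on the landing table.
cur.execute("CREATE STREAM IF NOT EXISTS ORDERS_STREAM ON TABLE RAW_ORDERS")

# A Task that merges new rows every 5 minutes, but only fires
# when the stream actually has data to consume.
cur.execute("""
    CREATE TASK IF NOT EXISTS LOAD_ORDERS
      WAREHOUSE = ETL_WH
      SCHEDULE  = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO STG_ORDERS SELECT * FROM ORDERS_STREAM
""")
cur.execute("ALTER TASK LOAD_ORDERS RESUME")  # tasks are created suspended
conn.close()
```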

Integrate Snowflake with external data sources and cloud storage (e.g., AWS S3, Azure Blob, GCP)

Optimize query performance using clustering keys, result caching, and materialized views
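
As a sketch of the optimization items above, the snippet below applies a clustering key and then checks whether it is helping; the table and column names are invented for the example:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",  # placeholder credentials
    warehouse="ETL_WH", database="ANALYTICS", schema="MARTS",
)
cur = conn.cursor()

# Cluster a large fact table on the columns most often used in range filters.
cur.execute("ALTER TABLE FACT_SALES CLUSTER BY (SALE_DATE, REGION_ID)")

# SYSTEM$CLUSTERING_INFORMATION reports average depth/overlap, which indicates
# whether the clustering key is actually pruning micro-partitions.
cur.execute(
    "SELECT SYSTEM$CLUSTERING_INFORMATION('FACT_SALES', '(SALE_DATE, REGION_ID)')"
)
print(cur.fetchone()[0])
conn.close()
```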

Monitor and tune warehouse performance and cost efficiency

Leverage advanced Snowflake features like Time Travel, Zero-Copy Cloning, and Data Sharing
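
A small hedged example of Time Travel and Zero-Copy Cloning together; the one-hour offset and the table names are assumptions:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",  # placeholder credentials
    warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()

# Time Travel: query the table as it looked an hour ago.
cur.execute("SELECT COUNT(*) FROM ORDERS AT(OFFSET => -3600)")
print(cur.fetchone()[0])

# Zero-copy clone: an instant, storage-free snapshot of that same point
# in time, useful for testing or recovery.
cur.execute("CREATE TABLE ORDERS_BACKUP CLONE ORDERS AT(OFFSET => -3600)")
conn.close()
```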

Explore and implement UDFs, external functions, and Snowpark where applicable
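
Where Snowpark applies, a UDF can be registered and used directly from Python. A minimal sketch against the Snowpark API; the session parameters, the RAW_PRODUCTS table, and the normalize_sku function are illustrative:

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, udf
from snowflake.snowpark.types import StringType

session = Session.builder.configs({
    "account": "my_account", "user": "etl_user", "password": "...",  # placeholders
    "warehouse": "ETL_WH", "database": "ANALYTICS", "schema": "PUBLIC",
}).create()

# Register a scalar UDF that Snowflake executes server-side.
@udf(name="normalize_sku", input_types=[StringType()],
     return_type=StringType(), replace=True)
def normalize_sku(sku: str) -> str:
    return sku.strip().upper() if sku else None

# Call the UDF from a Snowpark DataFrame expression.
df = session.table("RAW_PRODUCTS").select(normalize_sku(col("SKU")).alias("SKU"))
df.show()
```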

Ensure compliance with data governance and privacy standards

Automate workflows using orchestration tools (e.g., Airflow, Azure Data Factory)

Schedule and monitor data jobs using Snowflake Tasks and external schedulers
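
The two orchestration items above might look like the following in Airflow: a DAG that runs the dbt project and then its tests on a fixed schedule. The paths, schedule, and DAG id are assumptions (Airflow 2.4+ syntax):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",  # build the warehouse every morning
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )
    dbt_run >> dbt_test  # only test after a successful run
```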

Collaborate with data analysts, architects, and business stakeholders to translate requirements into scalable data solutions

Design and implement DBT projects from scratch, including folder structure, model layers (staging, intermediate, marts), and naming conventions

Use Git for version control of DBT projects

Design, build, and maintain modular DBT models for data transformation

Implement staging, intermediate, and mart layers following best practices

Use Jinja templating and macros to create reusable logic
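
Since dbt compiles Jinja-templated SQL before sending it to the warehouse, the rendering step can be illustrated standalone with the jinja2 library; the cents_to_dollars macro and column names are invented for the example:

```python
from jinja2 import Environment

env = Environment()
template = env.from_string("""
{%- macro cents_to_dollars(column) -%}
({{ column }} / 100.0)::numeric(16, 2)
{%- endmacro -%}
select
    order_id,
    {{ cents_to_dollars('amount_cents') }} as amount_usd
from {{ source_table }}
""")

# Rendering substitutes the macro call and variables into plain SQL,
# which is essentially what `dbt compile` does for a model.
print(template.render(source_table="stg_orders"))
```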

Define and manage tests (e.g., uniqueness, not-null, accepted values) within DBT

Monitor test results and resolve data quality issues proactively
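
Test results can be run and inspected programmatically via dbt Core's invocation API (dbt-core 1.5+). A sketch of the monitoring side; the "staging" selector is illustrative:

```python
from dbt.cli.main import dbtRunner, dbtRunnerResult

runner = dbtRunner()

# Run only the staging-layer tests.
res: dbtRunnerResult = runner.invoke(["test", "--select", "staging"])

if res.success:
    print("all selected tests passed")
else:
    # Each element is a node-level result carrying status and message.
    for r in res.result:
        if r.status == "fail":
            print(f"FAILED: {r.node.name} -> {r.message}")
```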

Implement CI/CD pipelines for DBT projects using Git, Bitbucket, and Jenkins

Ensure data governance, lineage, and documentation using tools like dbt docs and metadata tagging

Integrate Snowflake with cloud storage (e.g., GCP, Azure Blob, AWS S3) and orchestration tools (e.g., Airflow, Azure Data Factory)

Troubleshoot and resolve data quality issues and performance bottlenecks

Implement role-based access controls and data masking where required
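
A hedged sketch of column-level masking plus role-based grants; the EMAIL_MASK policy, roles, and object names are placeholders:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="admin_user", password="...",  # placeholder credentials
    warehouse="ADMIN_WH", database="ANALYTICS", schema="MARTS",
)
cur = conn.cursor()

# Mask email addresses for everyone except a privileged role.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS EMAIL_MASK AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val
           ELSE '***MASKED***' END
""")
cur.execute(
    "ALTER TABLE DIM_CUSTOMER MODIFY COLUMN EMAIL SET MASKING POLICY EMAIL_MASK"
)

# Role-based access: analysts can read the marts schema, nothing more.
cur.execute("GRANT USAGE ON SCHEMA ANALYTICS.MARTS TO ROLE ANALYST")
cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE ANALYST")
conn.close()
```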

Ensure compliance with data governance and privacy policies

Integrate DBT with orchestration tools (e.g., Airflow, Prefect)

Schedule and monitor DBT runs in production environments

Functional

Prior experience working with sources like SAP ECC and S/4HANA

Functional understanding of one of these SAP modules: Supply Chain, Finance (FICO), or Sales & Distribution

Prior experience pulling data from SAP sources

Required Skills

3-6 years of hands-on experience with Snowflake, including SnowSQL, Snowpipe, Streams, Tasks, and Time Travel

2 years of hands-on experience with DBT Core or Cloud in a production environment

Strong SQL skills and experience with data modelling (star/snowflake schema, normalization/denormalization)

Deep understanding of DBT features: materializations (table, view, incremental, ephemeral), macros, seeds, snapshots, tests, and documentation

Experience with cloud data warehouses (Snowflake)

Proficiency in Git, CI/CD, and workflow orchestration (e.g., Airflow, dbt Cloud)

Familiarity with Jinja templating, YAML configuration, and DBT project structure

Strong communication skills and the ability to work cross-functionally

Preferred Qualifications

SnowPro Core Certification or equivalent

Experience with Airflow Azure Data Factory or similar orchestration tools

Familiarity with data cataloging and lineage tools

Knowledge of data security, RBAC, and masking in Snowflake

Experience working in Agile/Scrum environments

Skills

Mandatory Skills: Snowflake, ANSI-SQL, DBT

Education

Any Graduate