Key Skills: Snowflake development, DBT (CLI & Cloud), ELT pipeline design, SQL scripting, data modeling, GitHub CI/CD integration, Snowpipe, performance tuning, data governance, troubleshooting, and strong communication skills.
Roles and Responsibilities:
- Design, develop, and maintain scalable data pipelines and ELT workflows using Snowflake SQL and DBT.
- Utilize SnowSQL CLI and Snowpipe for real-time and batch data loading, including the creation of custom functions and stored procedures.
- Implement orchestration with Snowflake Tasks, design schema models, and tune system performance for large-scale data environments.
- Build, deploy, and manage robust data models within Snowflake to support reporting and analytical solutions.
- Leverage DBT (CLI and Cloud) to script and manage complex ELT logic, applying best practices for version control using GitHub.
- Independently design and execute innovative ETL and reporting solutions that align with business and operational goals.
- Conduct issue triaging, pipeline debugging, and optimization to address data quality and processing gaps.
- Ensure technical designs adhere to data governance policies, security standards, and non-functional requirements (e.g., reliability, scalability, performance).
- Provide expert guidance on Snowflake features, optimization, security best practices, and cross-environment data movement strategies.
- Create and maintain comprehensive documentation for database objects, ETL processes, and data workflows.
- Collaborate with DevOps teams to implement CI/CD pipelines involving GitHub, DBT, and Snowflake integrations.
- Troubleshoot post-deployment production issues and deliver timely resolutions.
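To illustrate the Snowpipe loading and Task orchestration responsibilities above, a minimal Snowflake SQL sketch follows; all object names, the stage URL, and the warehouse are hypothetical placeholders, not part of any specific environment.

```sql
-- Hypothetical sketch: continuous loading with Snowpipe, then a scheduled
-- Snowflake Task for transformation. All names are illustrative.

-- External stage pointing at a placeholder cloud bucket.
CREATE STAGE IF NOT EXISTS raw_stage
  URL = 's3://example-bucket/events/'
  FILE_FORMAT = (TYPE = JSON);

-- Snowpipe: auto-ingest newly arriving files into a landing table.
CREATE PIPE IF NOT EXISTS events_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_events (payload, loaded_at)
  FROM (SELECT $1, CURRENT_TIMESTAMP() FROM @raw_stage);

-- Task: merge landed rows into a cleaned table on an hourly schedule.
CREATE TASK IF NOT EXISTS transform_events
  WAREHOUSE = transform_wh
  SCHEDULE = '60 MINUTE'
AS
  MERGE INTO events_clean t
  USING (SELECT payload:id::STRING AS id, payload FROM raw_events) s
    ON t.id = s.id
  WHEN MATCHED THEN UPDATE SET t.payload = s.payload
  WHEN NOT MATCHED THEN INSERT (id, payload) VALUES (s.id, s.payload);

-- Tasks are created suspended; resume to start the schedule.
ALTER TASK transform_events RESUME;
```

In practice the pipe would be wired to cloud event notifications (the AUTO_INGEST path), and the task could instead be triggered on stream data availability; this sketch only shows the basic object shapes.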
Experience Requirements:
- 5-8 years of experience in data engineering, with a strong focus on Snowflake and modern data architecture.
- Hands-on experience with Snowflake, including its architecture, SnowSQL, Snowpipe, stored procedures, schema design, and workload optimization.
- Extensive experience with DBT (CLI and Cloud), including scripting, transformation logic, and integration with GitHub for version control.
- Demonstrated success building and deploying large-scale ELT pipelines with Snowflake and DBT, optimized for performance and data quality.
- Proven track record in troubleshooting complex production data issues and resolving them with minimal downtime.
- Experience aligning data engineering practices with data governance and compliance standards.
- Familiarity with CI/CD pipelines in a cloud data environment, including deploying updates to production using GitHub Actions and DBT integrations.
- Strong ability to communicate technical details clearly across teams and stakeholders.
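The DBT scripting and transformation-logic experience described above typically centers on SQL models like the following minimal sketch of an incremental model; the model, source, and column names are hypothetical.

```sql
-- Hypothetical dbt model (e.g. models/events_daily.sql): incremental
-- materialization keyed on event_date. All names are illustrative.
{{ config(materialized='incremental', unique_key='event_date') }}

SELECT
    DATE_TRUNC('day', loaded_at) AS event_date,
    COUNT(*)                     AS event_count
FROM {{ source('raw', 'raw_events') }}
{% if is_incremental() %}
  -- On incremental runs, only process days newer than the target table.
  WHERE loaded_at > (SELECT MAX(event_date) FROM {{ this }})
{% endif %}
GROUP BY 1
```

Compiled and run via the DBT CLI (`dbt run`), a model like this is what version control in GitHub and CI/CD deployment operate on.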
Education: Any graduate or postgraduate degree.