Description

Key Responsibilities:

Architect and lead the design, development, and deployment of data warehouse solutions using Snowflake.

Develop scalable, efficient, and maintainable ETL/ELT pipelines using tools such as Informatica, Talend, Matillion, dbt, or custom Python/SQL-based pipelines.
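
For illustration only, a minimal dbt-style incremental model might look like the sketch below; the raw.orders source and its columns are hypothetical.

    -- models/stg_orders.sql (hypothetical dbt incremental model)
    -- Materializes only new rows from a raw landing table on incremental runs.
    {{ config(materialized='incremental', unique_key='order_id') }}

    select
        order_id,
        customer_id,
        order_ts,
        amount
    from {{ source('raw', 'orders') }}
    {% if is_incremental() %}
    -- On incremental runs, pick up only rows newer than what is already loaded.
    where order_ts > (select max(order_ts) from {{ this }})
    {% endif %}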

Work with business and analytics teams to understand data requirements and translate them into scalable solutions.

Oversee data ingestion from structured and semi-structured sources (e.g., JSON, Parquet, Avro).
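
As a sketch, assuming JSON files already land in an external stage (the stage, table, and field names below are hypothetical), ingestion into a VARIANT column and flattening can look like:

    -- Load raw JSON into a single VARIANT column.
    create or replace table raw_events (payload variant);

    copy into raw_events
      from @my_s3_stage/events/          -- assumes an external stage already exists
      file_format = (type = 'JSON');

    -- Flatten nested JSON into a relational shape for downstream models.
    select
        payload:event_id::string   as event_id,
        payload:user.id::number    as user_id,
        f.value:name::string       as item_name
    from raw_events,
         lateral flatten(input => payload:items) f;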

Optimize Snowflake performance using best practices in clustering keys, micro-partition pruning, and query tuning.
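
A brief sketch of what this can involve, using a hypothetical sales.orders table:

    -- Define a clustering key so micro-partitions line up with common filters;
    -- choose keys matching frequent WHERE/JOIN predicates.
    alter table sales.orders cluster by (order_date, region);

    -- Inspect how well the table is clustered on those columns.
    select system$clustering_information('sales.orders', '(order_date, region)');

    -- Verify pruning via the query profile (partitions scanned) after filtering
    -- on the clustering key.
    select count(*) from sales.orders where order_date >= '2024-01-01';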

Implement data governance, security, and access control policies in Snowflake.
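
A minimal sketch of role-based grants plus a dynamic data masking policy, with hypothetical role, schema, and column names:

    -- Role-based access control: grant an analyst role read access to a mart.
    create role if not exists analyst_role;
    grant usage on database analytics to role analyst_role;
    grant usage on schema analytics.marts to role analyst_role;
    grant select on all tables in schema analytics.marts to role analyst_role;

    -- Dynamic data masking: non-privileged roles see masked email addresses.
    create or replace masking policy mask_email as (val string) returns string ->
      case when current_role() in ('PII_ADMIN') then val else '***MASKED***' end;

    alter table analytics.marts.customers
      modify column email set masking policy mask_email;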

Collaborate with DevOps to manage infrastructure as code and CI/CD for data pipelines.

Mentor and lead a team of data engineers, perform code reviews, and enforce development standards.

Ensure data accuracy, consistency, and quality across all pipelines and systems.

Required Skills & Qualifications:

Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or a related field.

6+ years of data engineering experience, including at least 3 years of hands-on experience with Snowflake.

Strong experience building and maintaining ETL/ELT pipelines with tools such as Informatica, Matillion, dbt, Airflow, Talend, or custom scripting (Python/SQL).

Advanced SQL and performance tuning skills.

Strong knowledge of Snowflake architecture, including virtual warehouses, Streams, Tasks, Time Travel, and Data Sharing.
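
These features compose into change-data-capture patterns; a minimal sketch with hypothetical object and warehouse names:

    -- Capture row-level changes on a source table.
    create or replace stream orders_stream on table raw.orders;

    -- A scheduled task that loads captured changes into a target table;
    -- reading the stream in DML advances its offset.
    create or replace task merge_orders
      warehouse = etl_wh
      schedule = '5 MINUTE'
    when system$stream_has_data('ORDERS_STREAM')
    as
      insert into marts.orders_latest
      select order_id, customer_id, amount
      from orders_stream
      where metadata$action = 'INSERT';

    alter task merge_orders resume;   -- tasks are created suspended

    -- Time Travel: query the table as it was 30 minutes ago.
    select * from marts.orders_latest at(offset => -60*30);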

Experience integrating with cloud platforms (AWS, Azure, or GCP).
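
For instance, a hypothetical AWS-side integration might be wired up as follows (the role ARN and bucket are placeholders; Azure and GCP have analogous constructs):

    -- Delegate authentication to an IAM role rather than embedding credentials.
    create storage integration s3_int
      type = external_stage
      storage_provider = 'S3'
      enabled = true
      storage_aws_role_arn = 'arn:aws:iam::123456789012:role/snowflake-load'
      storage_allowed_locations = ('s3://my-bucket/landing/');

    -- Stage backed by the integration, ready for COPY INTO.
    create stage landing_stage
      url = 's3://my-bucket/landing/'
      storage_integration = s3_int
      file_format = (type = 'PARQUET');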

Familiarity with data modeling techniques (3NF, Dimensional, Data Vault).

Experience with source control (e.g., Git) and CI/CD for data engineering.

Strong understanding of data governance, data quality, and metadata management.

Education

Any Graduate