Description

Job Description:

 

10+ years of strong hands-on experience in lead data engineering using Snowflake, DBT (Data Build Tool), AWS Lambda, Python, and Fivetran.

10+ years of work experience in development, data warehousing, and end-to-end implementation of the Snowflake cloud data warehouse.

Experience with Snowflake Tasks, Streams, Dynamic Tables, SnowSQL, stored procedures, Snowpipe, error handling, storage and compute optimization, and advanced SQL commands.
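For illustration only, a minimal Python sketch (using the snowflake-connector-python package) of the Streams and Tasks pattern referenced above; the object names (raw.orders, curated.orders, ETL_WH, DEMO_DB) are hypothetical placeholders and credential handling is simplified.

```python
# Illustrative sketch only: create a Stream on a raw table and a Task that
# merges new change records on a schedule. All object names are hypothetical.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",
    database="DEMO_DB",
    schema="CURATED",
)

statements = [
    # Capture inserts/updates/deletes on the raw table as change records.
    "CREATE STREAM IF NOT EXISTS orders_stream ON TABLE raw.orders",
    # Merge changes every 5 minutes, but only when the stream has data.
    """
    CREATE TASK IF NOT EXISTS merge_orders_task
      WAREHOUSE = ETL_WH
      SCHEDULE  = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      MERGE INTO curated.orders AS tgt
      USING orders_stream AS src
        ON tgt.order_id = src.order_id
      WHEN MATCHED THEN UPDATE SET tgt.status = src.status
      WHEN NOT MATCHED THEN INSERT (order_id, status)
                           VALUES (src.order_id, src.status)
    """,
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK merge_orders_task RESUME",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```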

Experience in creating ETL pipelines to copy data from various sources, such as vendor APIs, flat files (CSV, JSON, XML), and database systems like SAP, JDE, or SQL Server, into the Snowflake platform using tools like AWS Lambda (Python) and Fivetran.
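As a sketch of the Lambda-based ingestion pattern described above (not a definitive implementation): pull JSON from a vendor API, land the raw payload in S3, then COPY it into Snowflake through an external stage. The API URL, bucket, stage, and table names are hypothetical, and the target table is assumed to have a single VARIANT column.

```python
# Hedged sketch of a Lambda ingestion handler: extract from a vendor API,
# land the file in S3, and COPY it into Snowflake. Names are placeholders.
import os
import urllib.request

import boto3
import snowflake.connector

S3_BUCKET = os.environ.get("LANDING_BUCKET", "my-landing-bucket")                     # placeholder
VENDOR_API_URL = os.environ.get("VENDOR_API_URL", "https://api.example.com/orders")   # placeholder


def lambda_handler(event, context):
    # 1. Extract: call the vendor API and read the JSON payload.
    with urllib.request.urlopen(VENDOR_API_URL, timeout=30) as resp:
        payload = resp.read()

    # 2. Land the raw payload in S3 so the load is replayable.
    key = f"vendor/orders/{context.aws_request_id}.json"
    boto3.client("s3").put_object(Bucket=S3_BUCKET, Key=key, Body=payload)

    # 3. Load: COPY the staged file into Snowflake via an external stage
    #    (assumes a stage named VENDOR_STAGE already points at the bucket).
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",
        database="DEMO_DB",
        schema="RAW",
    )
    try:
        conn.cursor().execute(
            f"COPY INTO raw_orders FROM @VENDOR_STAGE/vendor/orders/ "
            f"FILES = ('{context.aws_request_id}.json') "
            f"FILE_FORMAT = (TYPE = 'JSON')"
        )
    finally:
        conn.close()

    return {"status": "ok", "s3_key": key}
```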

Experience in writing DBT scripts to clean and load data into Snowflake.
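DBT transformations are normally written as SQL models; to keep these sketches in one language, the same idea is shown below as a dbt Python model on Snowflake (supported by dbt-snowflake via Snowpark). The source name ("vendor") and table ("raw_orders") are hypothetical and would need to be declared in a sources .yml file.

```python
# models/staging/stg_orders.py -- hypothetical dbt Python model
from snowflake.snowpark.functions import col


def model(dbt, session):
    # Materialize the cleaned result as a table in Snowflake.
    dbt.config(materialized="table")

    # dbt.source(...) returns a Snowpark DataFrame over the raw table.
    raw_orders = dbt.source("vendor", "raw_orders")

    # Basic cleaning: drop rows without a key and remove exact duplicates.
    return (
        raw_orders
        .filter(col("ORDER_ID").is_not_null())
        .drop_duplicates()
    )
```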

Experience in CI/CD.

Nice-to-have skills include Azure DevOps, Fivetran LDP (HVR), Terraform, and Git.

Must have strong analytical, troubleshooting, problem-solving, and communication skills.

Interacting with QA to address reported findings.

Working individually and as part of a team to achieve sprint goals.

Handling production support incident tickets based on severity.


 

Mandatory Skills

  • Snowflake, SQL
  • DBT (Data Build Tool)
  • Azure Data Factory (ADF)
  • AWS Lambda, AWS Step Functions, AWS S3
  • Python

Nice-to-Have Skills (Not Mandatory)

  • Azure DevOps, Git
  • Fivetran LDP (HVR)
  • Terraform
  • Any scheduling tool (modern data platform)
  • Handling tickets in ServiceNow

Education

Any Graduate