Description

Responsibilities:

  • Develop and maintain data pipelines using Snowflake and Azure Data Factory.
  • Collaborate with data architects, data engineers, and business analysts to understand data requirements and translate them into technical solutions.
  • Design and implement data transformation logic using SQL, Python, or other programming languages.
  • Optimize data pipelines for performance and scalability.
  • Implement data security and governance best practices.
  • Troubleshoot and resolve issues related to data pipelines and processing.
  • Document data pipelines, processes, and best practices.

Qualifications:

  • Strong experience with Snowflake's cloud data platform, including data modeling, performance tuning, and security.
  • Proficiency in Azure Data Factory for data integration and orchestration.
  • Experience with SQL and other programming languages for data manipulation and analysis.
  • Familiarity with cloud computing concepts and services, particularly Azure.
  • Strong problem-solving and analytical skills.
  • Excellent communication and collaboration skills.

Experience:

  • Typically 5+ years of experience in data engineering, ETL development, or a related field.
  • Experience working with data warehousing and data integration projects.
  • Experience designing and implementing solutions using Snowflake and ADF in a cloud environment.

Education:

  • Any graduate degree.