Description

  • Design and implement data pipelines using Snowflake, SQL, and ETL tools.
  • Develop and optimize complex SQL queries for data extraction and transformation.
  • Create and manage Snowflake objects such as databases, schemas, tables, views, and stored procedures.
  • Integrate Snowflake with various data sources and third-party tools.
  • Monitor and troubleshoot performance issues in Snowflake environments.
  • Collaborate with data engineers, analysts, and business stakeholders to understand data requirements.
  • Ensure data quality, security, and governance standards are met.
  • Automate data workflows and implement best practices for data management.
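To illustrate the kind of pipeline step described above, here is a minimal sketch in Python. The standard library's sqlite3 stands in for a Snowflake session, and all table and column names are hypothetical; a real pipeline would use the Snowflake connector and an orchestration tool.

```python
import sqlite3

# Hypothetical ELT step: stage raw records, then build a cleaned table.
# sqlite3 is a stand-in for a warehouse session; names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, amount TEXT, region TEXT);
    INSERT INTO raw_orders VALUES
        (1, '19.99', 'EMEA'),
        (2, '5.00',  'AMER'),
        (3, NULL,    'APAC');          -- bad record, filtered out below
""")

# Transform: cast amounts to numeric, drop rows failing a quality check.
conn.executescript("""
    CREATE TABLE clean_orders AS
    SELECT order_id, CAST(amount AS REAL) AS amount, region
    FROM raw_orders
    WHERE amount IS NOT NULL;
""")

rows = conn.execute(
    "SELECT region, SUM(amount) FROM clean_orders "
    "GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('AMER', 5.0), ('EMEA', 19.99)]
```

The same extract-transform-load shape applies whether the transformation runs in dbt, Matillion, or hand-written stored procedures.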

Qualifications:

  • Proficiency in Snowflake SQL and Snowflake architecture.
  • Experience with ETL/ELT tools (e.g., Informatica, Talend, dbt, Matillion).
  • Strong knowledge of cloud platforms (AWS, Azure, or GCP).
  • Familiarity with data modeling and data warehousing concepts.
  • Experience with Python, Java, or Shell scripting is a plus.
  • Understanding of data security, role-based access control, and data sharing in Snowflake.
  • Excellent problem-solving and communication skills.

Preferred Qualifications:

  • Snowflake certification (e.g., SnowPro Core).
  • Experience with CI/CD pipelines and DevOps practices.
  • Knowledge of BI tools like Tableau, Power BI, or Looker.

Required:

  • 5-15 years of relevant experience.
  • Experience with Agile-based development.
  • Strong problem-solving skills.
  • Proficiency in writing performant SQL queries and scripts that generate business insights and drive better organizational decision-making.
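As an example of the query-tuning skill this calls for, the sketch below uses a window function to find each customer's latest order in a single pass, rather than a per-row correlated subquery. sqlite3 again stands in for Snowflake, and the schema is hypothetical.

```python
import sqlite3

# Hypothetical performant-query pattern: ROW_NUMBER() over a partition
# replaces a correlated subquery for "latest order per customer".
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('acme', '2024-01-05', 100.0),
        ('acme', '2024-03-09', 250.0),
        ('bolt', '2024-02-11',  80.0);
""")

latest = conn.execute("""
    SELECT customer, order_date, amount
    FROM (
        SELECT customer, order_date, amount,
               ROW_NUMBER() OVER (
                   PARTITION BY customer ORDER BY order_date DESC
               ) AS rn
        FROM orders
    )
    WHERE rn = 1
    ORDER BY customer
""").fetchall()
print(latest)  # [('acme', '2024-03-09', 250.0), ('bolt', '2024-02-11', 80.0)]
```

The window-function form scans the table once; the correlated-subquery equivalent re-scans it for every row, which matters at warehouse scale.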

Education

Bachelor's degree