Description

The ideal candidate will be passionate about working with cutting-edge technologies to solve complex data engineering challenges.

Must Have Skills:

Snowflake, Azure, SQL

Required:
  • Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field (or equivalent experience).
  • Proven experience as a Data Engineer with expertise in Azure, Databricks, DBT, and Snowflake.
  • Strong experience with Azure Data Factory, Azure Databricks, Azure Data Lake, and other Azure cloud services for data integration and processing.
  • Proficiency with DBT for implementing data transformation workflows, creating models, and writing SQL-based scripts.
  • Expertise in working with Snowflake for data warehousing, including experience with schema design, performance tuning, and optimization.
  • Strong experience with Apache Spark and working in Databricks for large-scale data processing.
  • Solid programming skills in SQL (advanced), Python, and Scala for developing data pipelines and transformation logic.
  • Experience with ETL/ELT processes, data orchestration, and automating data workflows using Azure and DBT.
  • Knowledge of data governance, security, and best practices for cloud data architectures.
  • Familiarity with version control systems like Git, and experience in Agile environments.

Preferred:

  • DBT certifications or experience with advanced features such as DBT testing, macros, and hooks.
  • Azure, Databricks, or Snowflake certifications.
  • Experience with Snowflake performance tuning, including optimization of queries, schemas, and data partitioning.
  • Familiarity with CI/CD practices and experience building automated pipelines for data workflows.
  • Knowledge of cloud cost optimization in Azure and Snowflake for better resource utilization.

Education

Bachelor’s degree in Computer Science, Engineering, Information Technology, or a related field (or equivalent experience).