Description

The ideal candidate will have strong hands-on expertise in Snowflake, SQL, Python, and AWS, with a proven track record of building efficient data pipelines, optimizing queries, and ensuring data quality across enterprise data platforms.

Key Responsibilities

  • Translate logical and conceptual data models into optimized Snowflake database objects, adhering to governance and naming conventions.
  • Design and implement ETL/ELT processes for data ingestion, transformation, and loading.
  • Collaborate with data architects and technical leads to understand business requirements and design scalable data solutions.
  • Document table and view definitions in Alation and enforce data quality checks.
  • Optimize database performance through query tuning, indexing, and table structure improvements.
  • Support and enhance data applications deployed in AWS Cloud environments.
  • Contribute to CI/CD pipelines using tools such as Jenkins and GitHub.
  • Provide clear technical documentation and communicate effectively with both technical and non-technical stakeholders.

Required Qualifications

  • Strong hands-on experience with Snowflake, Oracle, AWS, and Python.
  • Advanced SQL proficiency and experience with relational databases (Oracle, Snowflake).
  • Solid understanding of data warehousing concepts and best practices.
  • Experience with ETL/ELT tools and frameworks.
  • Familiarity with DevOps and CI/CD tools (e.g., Jenkins, GitHub).
  • Understanding of data governance principles and data modeling practices.
  • Excellent analytical, troubleshooting, and problem-solving skills.
  • Strong communication and collaboration abilities.

Preferred Qualifications

  • Experience with Business Intelligence tools such as Tableau.
  • Exposure to data cataloging tools like Alation.
  • Knowledge of cloud-native data architecture and AWS services.

Education

Any Graduate