We are seeking a skilled Data Engineer with strong expertise in Snowflake and SQL to support data management and quality control initiatives. The individual will be responsible for building data quality frameworks, reviewing ETL processes, and working with Snowpark using Python, Java, or Scala. Ideal candidates will have experience in data profiling and managing data quality controls within large-scale data environments.
Job Responsibilities:
- Design and implement new data quality controls in Snowflake
- Review and provide feedback on data mappings and ETL processes
- Develop data pipelines and frameworks using Snowflake and Snowpark
- Collaborate with data analysts, business users, and developers to ensure data accuracy and integrity
- Participate in architecture and code reviews to ensure best practices
- Perform data profiling to identify data quality issues and recommend remediation strategies
- Document system and operational procedures for ongoing support and development
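The profiling and quality-control duties above can be illustrated with a minimal, pure-Python sketch (the column names and the 10% null-rate threshold are hypothetical; in practice such checks would run against Snowflake via SQL or Snowpark):

```python
# Minimal data-profiling sketch. Rows, columns, and thresholds are
# illustrative; production checks would execute inside Snowflake.
def profile(rows):
    """Return per-column null rate and distinct-value count."""
    stats = {}
    for col in rows[0].keys():
        values = [r[col] for r in rows]
        non_null = [v for v in values if v is not None]
        stats[col] = {
            "null_rate": 1 - len(non_null) / len(values),
            "distinct": len(set(non_null)),
        }
    return stats

rows = [
    {"customer_id": 1, "email": "a@x.com"},
    {"customer_id": 2, "email": None},
    {"customer_id": 3, "email": "a@x.com"},
]
stats = profile(rows)
# Flag columns whose null rate exceeds a (hypothetical) 10% threshold.
failing = [c for c, s in stats.items() if s["null_rate"] > 0.10]
```

The same shape of logic (null rates, distinct counts, threshold-based flags) translates directly into Snowflake SQL aggregates or Snowpark DataFrame operations.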
Required Skills:
- 5+ years of relevant experience in data engineering or data management
- Expertise in Snowflake and strong proficiency in SQL
- Familiarity with Snowpark (Snowflake's developer framework for Python, Java, and Scala)
- Experience with data profiling and data quality controls
- Ability to review and challenge existing data mapping and ETL pipelines
- Proficiency in Python or a similar scripting language
- Understanding of data management and governance concepts
Preferred Skills:
- Strong hands-on expertise in Python
- Industry experience in Banking, Telecom, or Pharmaceuticals
- Experience with ETL tools and business analysis
- Exposure to machine learning/artificial intelligence in the context of data management
Certifications:
Not specified (Snowflake or Data Engineering certifications are a plus)
Education:
Bachelor’s degree in Computer Science, Data Engineering, or a related field (or equivalent experience)