Job Description
Work with technology and business stakeholders to understand data requirements.
Exposure to data warehousing concepts; must have participated in multiple implementations building data hubs, data marts, etc.
Build scalable and reliable data pipelines to support data ingestion (batch and/or streaming) and transformation from multiple data sources such as flat files, SQL databases, AWS RDS, S3, etc., and help centralize the information in Snowflake (see the ingestion sketch after this list).
Hands-on experience in ingestion, transformation, and database technologies.
Expertise in SQL, including complex query writing, query organization, and optimization.
Create unit/integration tests and implement automated build and deployment.
Participate in code reviews to ensure adherence to standards and best practices.
Deploy, monitor, and maintain production systems.
Create and update user stories in the backlog and participate in Agile Scrum ceremonies.
Collaborate with the product owner, analysts, architects, QA, and other team members.
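
For illustration only, below is a minimal sketch in Snowflake SQL of the kind of S3-to-Snowflake ingestion the role involves; the stage, file format, table, and column names are hypothetical and the credentials are placeholders.

    -- Define a CSV file format (all object and column names here are invented).
    CREATE FILE FORMAT IF NOT EXISTS csv_format
      TYPE = CSV
      FIELD_OPTIONALLY_ENCLOSED_BY = '"'
      SKIP_HEADER = 1;

    -- External stage pointing at an example S3 location; credentials are placeholders.
    CREATE STAGE IF NOT EXISTS raw_orders_stage
      URL = 's3://example-bucket/orders/'
      CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')
      FILE_FORMAT = (FORMAT_NAME = 'csv_format');

    -- Landing table in a raw schema.
    CREATE TABLE IF NOT EXISTS raw.orders (
      order_id    NUMBER,
      customer_id NUMBER,
      order_ts    TIMESTAMP_NTZ,
      amount      NUMBER(12,2)
    );

    -- Load new files from the stage; Snowflake skips files it has already loaded.
    COPY INTO raw.orders
      FROM @raw_orders_stage
      ON_ERROR = 'ABORT_STATEMENT';

In practice, a load like this would typically be scheduled (for example via Control-M or Snowpipe) and followed by DBT transformations downstream.
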
The experience you will bring:
Minimum 2 years of hands-on ETL development experience using DBT.
Minimum 5 years of hands-on experience working with SQL and the Snowflake database.
Minimum 1 year of hands-on experience (not just training or a POC) using Git and Python.
Strong communication skills and the ability to work in collaborative environments.
What will make you stand out:
Hands-on experience in ELT development using Matillion (ingestion tool), DBT (transformation tool), and Snowflake (database technology).
Experience working with Azure DevOps and its Build and Release CI/CD pipelines.
Experience working with AWS and Control-M.
Experience coding complex transformations (not just extract/load mappings) in DBT, as sketched below.
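
As a rough illustration of a transformation that goes beyond a simple extract/load mapping, here is a minimal DBT incremental model written in Snowflake SQL; the model, source, and column names (stg_orders, stg_customers, order_ts, etc.) are invented for the example.

    -- models/marts/fct_orders_enriched.sql (hypothetical model name)
    {{ config(materialized='incremental', unique_key='order_id') }}

    with orders as (
        select * from {{ ref('stg_orders') }}
        {% if is_incremental() %}
        -- On incremental runs, only pick up rows newer than what is already loaded.
        where order_ts > (select max(order_ts) from {{ this }})
        {% endif %}
    ),

    customers as (
        select * from {{ ref('stg_customers') }}
    )

    select
        o.order_id,
        o.customer_id,
        c.region,
        o.order_ts,
        o.amount,
        case
            when o.amount >= 1000 then 'large'
            when o.amount >= 100  then 'medium'
            else 'small'
        end as order_size_band
    from orders o
    left join customers c
        on o.customer_id = c.customer_id

Tests and documentation for a model like this would typically live alongside it in a schema.yml file.
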
Typical Qualifications:
5+ years of data engineering experience
BS degree in IT, Computer Science, or Engineering.
Must Have:
Minimum 1 year of hands-on experience (not just training or a POC) using Git and Python.
Minimum 2 years of hands-on ETL development experience using DBT.
Minimum 2 years of hands-on experience working with SQL and the Snowflake database.
Any Graduate