The opportunity
We’re looking for candidates with strong technology and data understanding in the big data engineering space and a proven delivery capability. This is a fantastic opportunity to be part of a leading firm and of a growing Data and Analytics team.
Your key responsibilities
- Lead and architect the migration of the data analytics environment from Teradata to Snowflake, ensuring performance and reliability
- Develop and deploy big data pipelines in a cloud environment using the Snowflake cloud data warehouse
- Design and develop ETL, and migrate existing on-premises ETL routines to the cloud service
- Interact with senior leaders to understand their business goals and contribute to the delivery of the workstreams
- Design and optimize model code for faster execution
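As a concrete flavour of the migration work above, a minimal sketch of a Snowflake landing zone for extracts exported from an on-premises Teradata system might look like this (all object names here are hypothetical, and the file format and stage settings are illustrative assumptions, not a prescribed design):

```sql
-- Compute for the migration workload; auto-suspend limits credit burn.
CREATE WAREHOUSE IF NOT EXISTS etl_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND = 300
  AUTO_RESUME = TRUE;

-- Target database and a staging schema for migrated tables.
CREATE DATABASE IF NOT EXISTS analytics_db;
CREATE SCHEMA IF NOT EXISTS analytics_db.staging;

-- Stage holding CSV extracts exported from Teradata.
CREATE STAGE IF NOT EXISTS analytics_db.staging.td_extracts
  FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);

-- Bulk-load one extract; fail fast on bad rows during initial validation.
COPY INTO analytics_db.staging.sales
  FROM @analytics_db.staging.td_extracts/sales/
  ON_ERROR = 'ABORT_STATEMENT';
```

In practice the stage would typically point at cloud object storage (S3, Azure Blob, or GCS) holding the exported files, and per-table `COPY INTO` statements would be generated from the Teradata catalogue rather than written by hand.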
Skills and attributes for success
- Hands-on development experience in data warehousing and ETL
- Hands-on development experience in Snowflake
- Experience in Snowflake modelling: roles, schemas, and databases
- Experience integrating Snowflake with third-party ETL and dbt tools
- Experience with advanced Snowflake concepts, such as setting up resource monitors and performance tuning, is preferable
- Experience applying object-oriented and functional programming styles to real-world big data engineering problems using Java/Scala/Python
- Develop data pipelines to perform batch and real-time/stream analytics on structured and unstructured data
- Experience with data processing patterns and distributed computing, and in building applications for real-time and batch analytics
- Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake’s products and marketing
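To illustrate the resource-monitor skill mentioned above, a minimal sketch of a monthly credit cap might look like the following (the monitor and warehouse names, quota, and thresholds are all hypothetical):

```sql
-- Cap monthly spend: notify at 75% of quota, suspend warehouses at 100%.
CREATE OR REPLACE RESOURCE MONITOR monthly_quota
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 75 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

-- Attach the monitor to a (hypothetical) warehouse so the cap applies.
ALTER WAREHOUSE etl_wh SET RESOURCE_MONITOR = monthly_quota;
```

Monitors can be attached at the account level or per warehouse; per-warehouse monitors like the one above are a common way to ring-fence the cost of a single team's workload.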
To qualify for the role, you must
- Be a computer science graduate or equivalent with 3-7 years of industry experience
- Have working experience in an Agile-based delivery methodology (preferable)
- Have a flexible, proactive, and self-motivated working style with strong personal ownership of problem resolution
- Be an excellent communicator (written and verbal, formal and informal)
- Be a technical expert on all aspects of Snowflake
- Deploy Snowflake following best practices, including knowledge transfer so that customers are properly enabled and can extend the capabilities of Snowflake on their own
- Work hands-on with customers to demonstrate and communicate implementation best practices on Snowflake technology
- Maintain a deep understanding of competitive and complementary technologies and vendors, and how to position Snowflake in relation to them
- Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments
- Provide guidance on how to resolve customer-specific technical challenges
Ideally, you’ll also have
- A minimum of 5 years of experience as an architect on analytics solutions and around 2 years of experience with Snowflake
What we look for
- People with technical experience and enthusiasm to learn new things in this fast-moving environment