Key Skills: Snowflake, SQL, Python, AWS AppFlow, Apache Spark, Kafka, Data Warehousing, ETL/ELT, DevOps, CI/CD, Git, Jira, Performance Tuning, Data Integration, Technical Support, Documentation, iPaaS (Workato), Salesforce Data Cloud, Analytical Thinking, Communication, Team Leadership.
Roles & Responsibilities:
- Lead the design, development, and maintenance of advanced software applications using Snowflake and related data technologies.
- Collaborate with stakeholders, IT teams, project managers, and end-users (India and US) to gather and analyze software and data requirements.
- Design detailed data models, schemas, views, and stored procedures utilizing Snowflake features such as time travel, zero-copy cloning, and secure data sharing (illustrated in the first sketch after this list).
- Develop scalable and efficient data pipelines using Snowflake SQL, Snowpipe, and integration tools to ingest, transform, and deliver data from various sources (see the Snowpipe sketch after this list).
- Lead deployment activities across multiple environments, ensuring consistent and error-free releases.
- Seamlessly integrate software components into functional systems that are interoperable with existing infrastructure.
- Conduct unit testing and implement data validation and monitoring frameworks to ensure high data quality and performance.
- Debug and troubleshoot issues encountered during development and deployment, ensuring timely resolution and minimal downtime.
- Provide technical support and guidance to end-users by applying Snowflake optimization techniques such as clustering, partitioning, caching, and compression (see the clustering sketch after this list).
- Stay current with emerging technologies and Snowflake best practices, incorporating relevant advancements into ongoing projects.
- Mentor and guide junior developers to enhance team productivity and knowledge sharing.
- Create and maintain comprehensive documentation of processes, procedures, and technical specifications for development and deployment.
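
For candidates less familiar with the Snowflake features named above, a minimal sketch of time travel, zero-copy cloning, and secure data sharing (all database, schema, table, and account names here are hypothetical, not from this posting):

```sql
-- Zero-copy clone: an instant, metadata-only copy for development or testing.
CREATE TABLE raw.orders_dev CLONE raw.orders;

-- Time travel: query the table as it existed 30 minutes ago (offset in seconds).
SELECT * FROM raw.orders AT(OFFSET => -60 * 30);

-- Time travel also covers recovery: a dropped table can be restored
-- within the data retention window.
DROP TABLE raw.orders_dev;
UNDROP TABLE raw.orders_dev;

-- Secure data sharing: publish a secure view to a consumer account via a share.
CREATE SECURE VIEW raw.orders_summary AS
    SELECT order_date, COUNT(*) AS order_count
    FROM raw.orders
    GROUP BY order_date;

CREATE SHARE partner_share;
GRANT USAGE ON DATABASE sales_db TO SHARE partner_share;
GRANT USAGE ON SCHEMA sales_db.raw TO SHARE partner_share;
GRANT SELECT ON VIEW sales_db.raw.orders_summary TO SHARE partner_share;
ALTER SHARE partner_share ADD ACCOUNTS = consumer_org.consumer_account;
```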
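Likewise, a minimal Snowpipe ingestion sketch, assuming a pre-configured S3 storage integration (the bucket, integration, and object names are illustrative):

```sql
-- External stage over the landing bucket (storage integration assumed to exist).
CREATE STAGE raw.orders_stage
    URL = 's3://example-bucket/orders/'
    STORAGE_INTEGRATION = s3_int;

CREATE FILE FORMAT raw.orders_csv TYPE = CSV SKIP_HEADER = 1;

-- Snowpipe: loads new files automatically on S3 event notifications.
CREATE PIPE raw.orders_pipe AUTO_INGEST = TRUE AS
    COPY INTO raw.orders
    FROM @raw.orders_stage
    FILE_FORMAT = (FORMAT_NAME = 'raw.orders_csv');
```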
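Finally, a sketch of the optimization techniques referenced above. Clustering keys are the main user-controlled lever, since Snowflake manages micro-partitioning and compression automatically (object names again hypothetical):

```sql
-- Clustering key: improves micro-partition pruning for queries
-- that filter on these columns.
ALTER TABLE raw.orders CLUSTER BY (order_date, region);

-- Check clustering health (depth, overlap) for the chosen columns.
SELECT SYSTEM$CLUSTERING_INFORMATION('raw.orders', '(order_date, region)');

-- Caching: re-running an identical query while the underlying data is
-- unchanged is served from the result cache without warehouse compute.
SELECT region, SUM(amount) AS total FROM raw.orders GROUP BY region;
```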
Experience Requirement:
- 9-13 years of experience in data engineering, data warehousing, or data analysis roles.
- At least 3 years of hands-on experience with Snowflake including performance tuning, advanced SQL development, and Snowflake-specific features.
- Experience with data integration tools such as AWS AppFlow, Apache Spark, or Kafka.
- Proven track record of building robust ETL/ELT pipelines in a cloud-based environment (a minimal ELT sketch follows this list).
- Experience providing technical support to end-users during implementation and deployment phases.
- Strong background in debugging, performance optimization, and large-scale data handling.
- Experience leading small to mid-sized development teams or mentoring junior engineers.
- Familiarity with DevOps practices, CI/CD pipelines, version control (Git), and issue tracking tools (e.g., Jira).
- Exposure to iPaaS tools such as Workato or integration with platforms like Salesforce Data Cloud is a plus.
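
As a reference point for the ETL/ELT requirement, a minimal ELT upsert sketch using Snowflake's MERGE statement (table and column names are illustrative, not from this posting):

```sql
-- Incremental ELT upsert: merge staged rows into the curated table
-- on the business key.
MERGE INTO analytics.orders AS tgt
USING raw.orders_staging AS src
    ON tgt.order_id = src.order_id
WHEN MATCHED THEN UPDATE SET
    tgt.status     = src.status,
    tgt.updated_at = src.updated_at
WHEN NOT MATCHED THEN INSERT (order_id, status, updated_at)
    VALUES (src.order_id, src.status, src.updated_at);
```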
Education: M.B.A., B.Tech/M.Tech (Dual Degree), B.Tech, M.Tech, B.Sc., B.Com