Description

Responsibilities:

  • Participate in team stand-ups, design reviews, and sprint planning.
  • Provide technical support and implementation expertise in the Snowflake environment.
  • Ingest data from Big Data platforms (Hadoop/IOP) into Snowflake using industry best practices.
  • Develop Snowpark features and build data pipelines in Python (see the sketch after this list).
  • Interface with and contribute to open-source Snowflake libraries (e.g., the Python Connector).
  • Manage Snowflake environments including role-based access, virtual warehouses, tasks, streams, and Snowpipe.
  • Perform performance tuning, query optimization, and monitoring within Snowflake.
  • Maintain technical documentation and ensure compliance with data governance/security policies.
  • Analyze, profile, and ensure the quality of ingested data using Hadoop ecosystem tools.
  • Develop ETL/ELT workflows using PySpark, Hive, Impala, and UNIX shell scripting.
  • Conduct unit testing, mock data creation, and performance tuning.
  • Update documentation including Run Books and Deployment Plans.
  • Monitor production data loads, troubleshoot and track issues, and ensure successful load operations.
  • Conduct code reviews, develop reusable frameworks, and support code deployment activities.
  • Collaborate with Admin teams (Snowflake, Hadoop, SAS, ETL) for deployment and maintenance.
  • Participate in functional and technical meetings to continuously enhance skill sets.
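
For illustration only, below is a minimal Snowpark-for-Python pipeline sketch of the kind of work described above. The connection parameters and table names (RAW_CLAIMS, CURATED_CLAIMS) are hypothetical placeholders, not part of this role's actual environment.

    from snowflake.snowpark import Session
    from snowflake.snowpark.functions import col

    # Placeholder credentials; in practice these come from a secrets manager or config store.
    connection_parameters = {
        "account": "<account_identifier>",
        "user": "<user>",
        "password": "<password>",
        "role": "<role>",
        "warehouse": "<virtual_warehouse>",
        "database": "<database>",
        "schema": "<schema>",
    }

    session = Session.builder.configs(connection_parameters).create()

    # Read a raw table, apply a simple filter/projection, and persist a curated table.
    raw = session.table("RAW_CLAIMS")                    # hypothetical source table
    curated = (
        raw.filter(col("LOAD_STATUS") == "COMPLETE")     # keep fully loaded rows only
           .select("CLAIM_ID", "MEMBER_ID", "CLAIM_AMOUNT")
    )
    curated.write.mode("overwrite").save_as_table("CURATED_CLAIMS")

    session.close()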

Required Skills & Qualifications:

  • 4–6 years of experience with Cloud Databases, Snowflake, and Data Warehousing.
  • 2–3 years of hands-on Snowflake platform experience including Snowpipe, Snowpark, and data migration from Big Data environments.
  • Expertise in SnowSQL, PL/SQL, and SQL-, Python-, and Java-based stored procedures in Snowflake.
  • Experience in performance tuning, monitoring, and data security in Snowflake.
  • Knowledge of AWS platform services.
  • 8+ years of experience with Big Data technologies (Hadoop, Sqoop, PySpark, Hive, Impala, Kafka, etc.).
  • Extensive experience with UNIX shell scripting, Oracle SQL, HDFS, and StreamSets.
  • Strong ETL/ELT development background, especially in data integration and data transformation logic (see the PySpark sketch after this list).
  • Experience handling PHI/PII data and adhering to data governance policies.
  • Excellent written and verbal communication skills.
  • Familiarity with Agile and Waterfall methodologies.
  • Bachelor’s degree in Computer Science, Information Systems, or related field (or equivalent experience).
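
To illustrate the ETL/ELT background referenced above, here is a minimal PySpark sketch that reads a Hive table and lands Parquet output in a staging location for subsequent Snowflake ingestion (e.g., via an external stage with COPY INTO or Snowpipe). The database, table, and path names are hypothetical.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("claims_staging_for_snowflake")  # hypothetical job name
        .enableHiveSupport()
        .getOrCreate()
    )

    # Read from a Hive table, enforce types, and drop obviously bad rows.
    claims = spark.table("warehouse_db.claims_raw")      # hypothetical Hive table
    staged = (
        claims.where(F.col("claim_date").isNotNull())
              .withColumn("claim_amount", F.col("claim_amount").cast("decimal(18,2)"))
    )

    # Land Parquet in HDFS for downstream ingestion into Snowflake.
    staged.write.mode("overwrite").parquet("hdfs:///staging/claims_for_snowflake/")

    spark.stop()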

Desired Skills:

  • Snowflake certification is highly desirable.
  • Experience with Snowpipe, Streams, Tasks, and data masking in Snowflake (see the sketch after this list).
  • Security experience with SAML, OAuth, Kerberos, etc.
  • Experience with System Disaster Recovery Plans for Snowflake.
  • Leadership experience and ability to work both independently and in teams.
  • Familiarity with tools such as Visio, PowerPoint, Excel, and Word.
  • Strong analytical skills and ability to solve complex technical challenges.
  • Ability to identify patterns, drive continuous improvement, and deliver innovative solutions.
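
As a sketch of Streams and Tasks usage in Snowflake, the statements below create a stream on a curated table and a scheduled task that consumes it. They reuse the hypothetical session and table names from the earlier Snowpark sketch and are illustrative only.

    # Capture changes on the curated table.
    session.sql("""
        CREATE OR REPLACE STREAM CURATED_CLAIMS_STREAM
        ON TABLE CURATED_CLAIMS
    """).collect()

    # Scheduled task that runs only when the stream has new data.
    session.sql("""
        CREATE OR REPLACE TASK LOAD_CLAIMS_MART
          WAREHOUSE = TRANSFORM_WH
          SCHEDULE  = '15 MINUTE'
        WHEN SYSTEM$STREAM_HAS_DATA('CURATED_CLAIMS_STREAM')
        AS
          INSERT INTO CLAIMS_MART (CLAIM_ID, MEMBER_ID, CLAIM_AMOUNT)
          SELECT CLAIM_ID, MEMBER_ID, CLAIM_AMOUNT
          FROM CURATED_CLAIMS_STREAM
    """).collect()

    # New tasks start suspended; resume to activate the schedule.
    session.sql("ALTER TASK LOAD_CLAIMS_MART RESUME").collect()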

Education

Any Graduate