Description

Responsibilities:

  • Develop and optimize Snowflake-based data pipelines and SQL queries for proof-of-concept (POC) use cases.
  • Work with large volumes of structured and semi-structured data from various sources, including streaming data.
  • Collaborate closely with the lead architect and stakeholders to rapidly iterate and test solutions.
  • Contribute to the architectural design and scalability planning for broader team adoption.
  • Take ownership of tasks and proactively identify areas for improvement or automation.
  • Document technical decisions, patterns, and solutions.

Required Skills:

  • 5+ years of experience in SQL development and data engineering.
  • Strong hands-on experience with Snowflake data warehousing, including performance tuning and cost optimization.
  • Experience with streaming data pipelines using tools such as Snowpipe, Kafka, AWS Kinesis, or Apache NiFi.
  • Solid understanding of ELT/ETL development practices.
  • Ability to work independently with minimal supervision and proactively drive deliverables.
  • Strong problem-solving skills and initiative-driven mindset.

Preferred Qualifications:

  • Experience integrating Snowflake with cloud data lakes (e.g., AWS S3).
  • Familiarity with orchestration tools like Apache Airflow or Azure Data Factory (ADF).
  • Experience working in a POC/startup-like environment with quick iterations.
  • Strong communication and collaboration skills.

Education

Any Graduate