Description

Responsibilities:

  • Designs, implements, integrates, and tests complex infrastructure and software solutions.
  • Monitors database performance and applies optimization techniques to improve it.
  • Provides technical leadership and mentorship for other team members.
  • Reviews and critiques designs and implementations.
  • Writes high-quality, efficient, and maintainable code, often with a focus on solving complex technical challenges.
  • Facilitates collaboration within the working team and often engages with stakeholders and product owners.
  • Identifies and addresses performance bottlenecks, optimizing code and systems for improved speed and efficiency.
  • Maintains comprehensive technical documentation, including system architecture, code comments, and guides that aid in understanding and maintaining the software.
  • Effectively communicates technical concepts and project status to non-technical stakeholders.


Requirements:

  • Expert-level knowledge of Snowflake is a must.
  • Knowledge of other data warehouse products is an added advantage.
  • Degree in Computer Science, Information Systems, or equivalent experience.
  • 8+ years of related professional experience.
  • Advanced knowledge in multiple areas of:
      • data transformations, data pipelines, workflow automation, and scheduling systems
      • data-centric systems architecture
      • database systems, data warehouses, and distributed file storage and compute platforms


Experience with several of the technologies we currently use:

  • Cloud Providers: AWS, Azure.
  • Data Pipelines: Apache Airflow, dbt.
  • Data Tools: PySpark, Snowpark, AWS Glue, pandas, Databricks, SageMaker.
  • Databases: Snowflake, SQL Server, PostgreSQL.
  • Containerization: Kubernetes, Docker.
  • Languages: Python, C#, Golang, Bash, SQL.
  • CI/CD: GitLab, Azure DevOps.
  • SCM: Git, GitHub, GitLab.

Education

Degree in Computer Science, Information Systems, or equivalent experience.