Description

Key Responsibilities

  • Lead the design and development of scalable data solutions and pipelines.
  • Drive technical decisions, provide mentorship, and guide the team in complex problem spaces.
  • Collaborate with product managers and engineering leaders to refine business and technical requirements.
  • Write clean, maintainable, and efficient code using languages such as Python, Java, Scala, or Go.
  • Implement best practices for data validation, pipeline automation, and quality checks.
  • Work across multiple database systems, including SQL and NoSQL platforms.
  • Stay updated with the latest tech trends; contribute to internal and external tech communities.
  • Occasionally contribute hands-on to development and code reviews as a technical leader.

Required Skills:

  • Python – strong scripting and automation capabilities
  • SQL – advanced querying and database optimization skills
  • AWS Cloud – experience with core services (e.g., EC2, S3, Lambda)
  • Redshift – hands-on data warehousing and performance tuning
  • Snowflake – experience with data lake/warehouse architectures
  • Experience with data streaming platforms and real-time processing
  • Strong knowledge of data modeling, data lake architectures, and ETL/ELT tools
  • Experience working in large-scale data warehouse environments
  • Hands-on experience with cloud services (preferably AWS; GCP or Azure also considered)
  • Familiarity with CI/CD processes, DevOps tools, and version control systems (e.g., Git)
Education

Any Graduate