Description

What You Will Do -

  • Set up and manage Unity Catalog in Databricks to organize and secure data access across teams
  • Design and operationalize Feature Stores to support machine learning models in production
  • Build efficient data pipelines to process and serve features to ML workflows (a minimal sketch follows this list)
  • Collaborate across teams to integrate data solutions using Databricks, Azure Cosmos DB, and other Azure services
  • Monitor and optimize the performance of pipelines and feature stores
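
To make the pipeline and feature store responsibilities concrete, here is a minimal PySpark sketch assuming a Databricks workspace with Unity Catalog enabled and the databricks-feature-engineering client available; the catalog, schema, table, column, and group names (ml, features, customer_spend, ml-team) are illustrative placeholders, not part of this role's actual environment.

from pyspark.sql import SparkSession, functions as F
from databricks.feature_engineering import FeatureEngineeringClient

spark = SparkSession.builder.getOrCreate()

# Derive simple customer-level features from a hypothetical raw transactions table.
features_df = (
    spark.table("ml.raw.transactions")
    .groupBy("customer_id")
    .agg(
        F.count("*").alias("txn_count_30d"),
        F.sum("amount").alias("txn_amount_30d"),
    )
)

# Register the result as a feature table governed by Unity Catalog
# (hypothetical catalog.schema.table name).
fe = FeatureEngineeringClient()
fe.create_table(
    name="ml.features.customer_spend",
    primary_keys=["customer_id"],
    df=features_df,
    description="Rolling 30-day customer spend features",
)

# Grant read access to the ML team group through Unity Catalog.
spark.sql("GRANT SELECT ON TABLE ml.features.customer_spend TO `ml-team`")

Registering the table through the feature engineering client keeps feature lineage visible to downstream ML jobs, while the GRANT statement handles access control through Unity Catalog.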


What We Are Looking For -

  • Strong experience with Unity Catalog in Databricks for managing data assets and access control
  • Hands-on experience working with Databricks Feature Store or similar solutions
  • Knowledge of building and maintaining scalable ETL pipelines in Databricks
  • Familiarity with Azure services such as Azure Cosmos DB and Azure Container Registry (ACR)
  • Understanding of machine learning workflows and how feature stores fit into the training pipeline (see the sketch after this list)
  • Strong problem-solving skills and a collaborative mindset
  • Proficiency in Python and Spark for data engineering tasks
  • Experience with monitoring tools such as Splunk or Datadog to ensure system reliability
  • Familiarity with Azure Kubernetes Service (AKS) for deploying and managing containerized workloads
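
To illustrate where a feature store sits in an ML workflow, here is a minimal sketch that joins the hypothetical ml.features.customer_spend table from the earlier example to label data at training time; all table and column names are placeholders, and the Databricks feature engineering client is assumed to be installed.

from pyspark.sql import SparkSession
from databricks.feature_engineering import FeatureEngineeringClient, FeatureLookup

spark = SparkSession.builder.getOrCreate()
fe = FeatureEngineeringClient()

# Labels keyed by the same primary key as the feature table (illustrative names).
labels_df = spark.table("ml.raw.churn_labels").select("customer_id", "churned")

# Look up features by key at training time; lineage is tracked by the feature store.
training_set = fe.create_training_set(
    df=labels_df,
    feature_lookups=[
        FeatureLookup(
            table_name="ml.features.customer_spend",
            lookup_key="customer_id",
        )
    ],
    label="churned",
)

training_df = training_set.load_df()  # Spark DataFrame ready for model training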

Education

Any Graduate