Description

We are seeking a skilled and motivated Sr. DevOps Engineer with expertise in Databricks and Azure Cloud Services. The ideal candidate will have a strong background in automation, scripting, and cloud infrastructure, with a passion for optimizing data workflows and ensuring secure, scalable deployments.

Key Responsibilities

  • Manage and optimize Windows systems and Azure Cloud Services, including Azure Data Factory (ADF) and Azure Key Vault (AKV).
  • Automate tasks using scripting languages such as Bash, Python, and PowerShell.
  • Implement and maintain configuration management tools like Puppet, Chef, or Ansible.
  • Ensure secure operations by applying protocols and standards such as OAuth, SCIM, and TLS, along with identity management systems.
  • Develop and manage data orchestration workflows using platforms like Apache Airflow.
  • Collaborate with development teams to support ETL processes and software development methodologies.
  • Continuously explore and evaluate emerging technologies to enhance infrastructure and workflows.

Required Qualifications

  • Strong experience with Windows systems and Azure Cloud Services.
  • Proficiency in scripting languages (Bash, Python, PowerShell).
  • Hands-on experience with configuration management tools (Puppet, Chef, Ansible).
  • Knowledge of security protocols and identity management systems.
  • Experience with Apache Airflow or similar orchestration platforms.
  • Familiarity with ETL processes and software development methodologies.
  • Strong problem-solving skills and a proactive approach to technology evaluation.

Preferred Qualifications

  • Experience with container technologies such as Docker and Kubernetes.
  • Familiarity with Terraform for infrastructure as code.
  • Exposure to Red Hat Linux environments.
  • Experience with data tools such as Cube Cloud (aggregation), Alation (cataloging), Dataiku (self-service analytics), Solidatus (data lineage), and Anomalo (data quality).

Education

Any Graduate