Description

  • Ensure the reliability, efficiency, and scalability of data pipelines, platform capabilities, and connected platform tools within the ONE DX Data ecosystem.
  • Build and manage capabilities that allow decentralized data product teams to successfully engineer and deploy data pipelines within the platform.
  • Ensure platform capabilities allow for SLA management, total-cost-of-ownership transparency and optimization, end-to-end data observability, and data quality monitoring.
  • Ultimately ensure high data availability and reliability for the DX business. Work with data engineers, support engineers, application management, the data ecosystem architect, and the data platform product owner to create a lean, high-performance, and highly reliable data platform.
  • Configure and monitor data platform tools (e.g. dbt, Snowflake, API services, Microsoft analytics tool stack)
  • Build robust operations and monitoring systems to measure the health of pipelines, the health of the platform, and the availability of data.
  • Translate and implement SLA requirements into new platform capabilities
  • Enhance automation in metadata management, testing, data quality, data observability, and DevOps integration
  • Design and implement platform services/features across the whole data management stack, i.e. data ingestion (incl. real-time processing), data pipeline management and automation, and data consumption (classical analytics, data as a service, ML/AI usage, data as an API service)
  • Design and build lean, efficient data security layers (data masking, row access policies, etc.) that can satisfy the broad requirements of various data domains.
  • Manage the deployment of decentralized development work into reliable releases within the data platform, enabling continuous integration and deployment
  • Work with the platform architect and decentralized data product deployment teams to ensure high-quality, agile deployments of data pipelines, applying state-of-the-art DevOps methods such as trunk-based development
  • Support the data platform architect in continuously improving the data platform and its operations
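To give a flavor of the observability and SLA-monitoring responsibilities listed above, here is a minimal sketch of a data-freshness check in Python. The `PipelineRun` structure, pipeline names, and SLA thresholds are hypothetical illustrations, not part of this role description or any specific tool's API:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class PipelineRun:
    # Hypothetical record of a pipeline's most recent successful run.
    name: str
    last_success: datetime
    freshness_sla: timedelta  # maximum allowed age of the data

def check_freshness(runs, now=None):
    """Return the names of pipelines whose data is older than their SLA."""
    now = now or datetime.now(timezone.utc)
    return [r.name for r in runs if now - r.last_success > r.freshness_sla]

# Example: one healthy pipeline (2h old, 6h SLA) and one stale pipeline (3h old, 1h SLA).
now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
runs = [
    PipelineRun("orders_daily", now - timedelta(hours=2), timedelta(hours=6)),
    PipelineRun("labs_hourly", now - timedelta(hours=3), timedelta(hours=1)),
]
stale = check_freshness(runs, now=now)
```

In practice such checks are usually delegated to platform tooling (e.g. dbt source freshness tests or Airflow SLA callbacks) rather than hand-rolled, but the underlying idea is the same: compare last-success timestamps against per-pipeline SLAs and alert on violations.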

Qualifications:

  • Broad and certified knowledge of cloud-based data platforms (esp. Snowflake and Databricks)
  • High proficiency in managing and monitoring interconnected data pipelines (e.g. Airflow, dbt, Snowflake, Databricks, Microsoft Fabric)
  • High proficiency in scripting, automation, and building and managing API services
  • Hands-on experience implementing data observability, data quality measurement, logging, alerting, and metadata-based data contract management
  • Broad understanding of data governance and data security
  • Practical experience in defining and implementing incident management and support for data platforms
  • Practical experience in DevOps CI/CD integration with data platform tools (dbt, Snowflake, Databricks)
  • Ability to translate complex problems into simple technical solutions
  • Ability to abstract data problems into re-usable components/modules
  • Lifelong willingness to learn and explore new technologies in data, analytics, and AI
  • Strong orientation towards quality and results (attention to detail and standardization)
  • Practical experience working in agile environments
  • Good communication skills
  • Analytical and out-of-the-box thinking
  • Natural drive for innovation and optimization
  • Team-oriented, collaborative attitude in a multinational, hybrid team environment

Education/Experience:

  • Degree in computer science, information science, or data analytics
  • Expert knowledge and a minimum of 7 years of professional experience in data management and/or data operations in a corporate environment
  • Well-versed in cloud-based data management technologies (Snowflake, Databricks, Neo4j, dbt, Airflow, Microsoft Fabric; API management tools such as MuleSoft or SnapLogic)
  • Profound knowledge of and hands-on experience with building large-scale, reliable data ecosystems
  • Proficient English; an additional foreign language is preferred
  • Working experience with Agile methods within Microsoft DevOps
  • Experience working with enterprise data in the field of laboratory diagnostics or at a large global enterprise is preferred.
  • Experience in working with multinational teams and/or different countries and cultures

Education

Any Graduate