Description

  • Develop and maintain data pipelines within the ONE DX Data platform.
  • Design, build, optimize, monitor, and manage complex data flows for the SHS Lab Diagnostics business. Collaborate successfully with the DX IT DIAI team and other stakeholders in SHS DX.
  • Design and build re-usable data platform capabilities to enable data engineering teams.
  • Support creation of the data backbone for data-driven applications and analytics solutions, including AI/ML solutions.
  • Build data pipelines from raw data to data products utilizing the Data Vault framework.
  • Collaborate with various stakeholders by translating business requirements into data pipeline designs and implementing them.
  • Build and manage best-in-class data pipeline structures involving robust metadata capture and automated testing.
  • Design data pipelines following data-quality-by-design and data-observability-by-design principles.
  • Manage, monitor, and optimize data pipelines, and build highly efficient, automated data lineage from raw data to analytical consumption.
  • Work with the deployment team to ensure high-quality, agile deployments of data pipelines, applying state-of-the-art DevOps methods such as trunk-based development.
  • Collaborate with the data platform architect to continuously improve the data platform and its operations.
  • Design, build, and integrate data pipelines for data-driven applications, machine learning, and advanced analytics use cases.

Qualifications:

  • Proven knowledge (e.g., certification) of the Data Vault framework
  • High proficiency in designing, managing, monitoring, and administering data pipelines and data structures in a cloud-based technology stack
  • Ability to translate complex problems into simple technical solutions
  • Ability to abstract data problems into re-usable components/modules
  • Lifelong willingness to learn and explore new technologies in data, analytics, and AI
  • Strong orientation towards quality and results (attention to detail and standardization)
  • Practical experience working in agile environments
  • Good communication skills
  • Analytical and out-of-the-box thinking
  • Natural drive for innovation and optimization
  • Team-oriented, collaborative attitude in a multinational, hybrid team environment

Education/Experience

  • Degree in computer science, information science, or data analytics
  • Expert knowledge and a minimum of 5 years' professional experience in data pipeline modeling/implementation, data management, and data transformation in a corporate environment
  • Practical experience in automated data quality management and test automation
  • Well-versed in cloud-based data management technologies (Snowflake, Databricks, Neo4j, dbt)
  • Experience with processing, optimizing, and managing large data sets (multiple TB scale).
  • Profound knowledge and hands-on experience with advanced analytics/machine learning environments such as Python, Apache Spark, and Snowpark on Snowflake
  • Proficiency in English required; an additional foreign language preferred
  • Knowledge of data visualization tools such as Qlik or Power BI preferred
  • Working experience with Agile methods within Azure DevOps
  • Experience working with enterprise data in the field of laboratory diagnostics, or in a large global enterprise, preferred
  • Experience working with multinational teams across different countries and cultures

Education

Bachelor's degree