Description

Roles and Responsibilities:

  • Research and engineer repeatable and resilient ETL workflows using Databricks notebooks and Delta Live Tables for both batch and stream processing
  • Collaborate with business users to develop data products that align with business domain expectations
  • Work with DBAs to ingest data from cloud and on-prem transactional databases
  • Contribute to the development of the Data Architecture for NC DIT - Transportation by:
      • Following practices for keeping sensitive data secure
      • Streamlining the development of data products for use by data analysts and data scientists
      • Developing and maintaining documentation for data engineering processes
      • Ensuring data quality through testing and validation
      • Sharing insights and experiences with stakeholders and engineers throughout DIT - Transportation


SKILL MATRIX:

  1. Excellent interpersonal skills as well as written and verbal communication skills – Required 5 Years
  2. Able to write clean, easy-to-follow Databricks notebook code – Required 2 Years
  3. Deep knowledge of data engineering best practices, data warehouses, data lakes, and the Delta Lake architecture – Required 2 Years
  4. Good knowledge of Spark and Databricks SQL/PySpark – Required 2 Years
  5. Technical experience with Azure Databricks and cloud providers like AWS, Google Cloud, or Azure – Required 2 Years
  6. In-depth knowledge of OLTP and OLAP systems, Apache Spark, and streaming products like Azure Service Bus – Required 2 Years
  7. Good practical experience with Databricks Delta Live Tables – Required 2 Years
  8. Knowledge of object-oriented languages like C#, Java, or Python – Desired 7 Years

Education

Any Graduate