Description

Key Responsibilities:

  • Learn and understand a broad range of DHHS's data resources and their business application domains.
  • Drive digital transformation initiatives across the program.
  • Recognize and adopt best practices in data transformation: data integrity, test design, analysis, validation, and documentation.
  • Design, implement, and support a platform providing secured access to large datasets.
  • Own the design, development, and maintenance of ongoing metrics, reports, analyses, and dashboards to drive key business decisions.
  • Provide advanced data support to data engineering and ETL teams.
  • Extract and evaluate data to ensure accurate mapping for end-user consumption.
  • Collaborate with technical support teams and business process owners to ensure seamless operations service delivery.
  • Develop processes and procedures to enhance the quality and efficiency of quality control checks in the production environment.
  • Create and implement plans to improve data quality by identifying causes of errors or discrepancies and devising effective solutions.
  • Establish robust procedures for data collection and analysis.
  • Programmatically manipulate and analyze large datasets, build data pipelines, and automate tasks.
  • Design and maintain efficient data processing flows.
  • Support test readiness reviews, test planning work groups, and pre- and post-test briefings.
  • Communicate issues effectively in writing and orally to executive audiences, business stakeholders, and technical teams.
  • Support the technical and operational architecture of complex enterprise systems.
  • Tune application and query performance using profiling tools and SQL.
  • Keep up to date with advances in big data technologies and run pilots to design data architecture that scales with increasing data volume, using Azure, AWS, or other public cloud tools.
Skill | Required / Desired | Amount of Experience
Experience with Data Engineering | Required | 5 Years
Experience with SQL | Required | 8 Years
Experience with data modeling | Required | 4 Years
Experience with data warehousing | Required | 4 Years
Experience building ETL/ELT pipelines | Required | 4 Years
Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS | Required | 3 Years
Experience with AWS and/or Azure technologies like Redshift, S3, Lambda, Synapse, Azure DevOps | Desired | 2 Years
Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases) | Desired | 3 Years
Experience with big data technologies such as Spark, Databricks, Snowflake, Hadoop, or similar | Desired | 3 Years
Experience with DevOps and CI/CD (Git and automation tools) | Desired | 5 Years


Education

Any Graduate