Description

  • As part of our AI & D team, the Data Engineer will be responsible for leading development and validation activities for Big Data products and applications that run on large Hadoop and Teradata clusters.
  • The qualified engineer will bring technical leadership in developing and testing ETL processes, migrating applications to the cloud, and developing data validation tools used to perform quality assessments and measurements on the data sets that feed Big Data products. The engineer should have knowledge of the Client’s Wireless and Wireline data and systems.
  • The candidate will:
  • Lead the design, development, and testing of data ingestion pipelines, and perform end-to-end validation of the ETL process for the various datasets ingested into the big data platform.
  • Perform data migration and conversion validation activities on different applications and platforms.
  • Provide technical leadership on data profiling, discovery, and analysis; assess the suitability and coverage of data; and identify the data types, formats, and data quality issues that exist within a given data source.
  • Contribute to the development of transformation logic, interfaces, and reports as needed to meet project requirements.
  • Participate in discussions on technical architecture, data modeling, and ETL standards, and collaborate with Product Managers, Architects, and Senior Developers to establish the physical application framework (e.g., libraries, modules, execution environments).
  • Lead the design and development of a validation framework and integrated automated test suites to validate end-to-end data pipeline flow, data transformation rules, and data integrity (see the illustrative sketch after this list).
  • Develop tools to measure data quality and visualize anomaly patterns in source and processed data.
  • Assist the Manager in project planning and validation strategy development.
  • Provide support for user acceptance testing and production validation activities.
  • Provide technical recommendations for selecting data validation tools and recommend new technologies to improve the validation process.
  • Evaluate existing methodologies and processes and recommend improvements.
  • Work with stakeholders, Product Management, Data and Design, Architecture teams, and executives to call out issues and to guide and contribute to resolution discussions.
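
For illustration only (not an additional responsibility of the role): a minimal sketch of the kind of end-to-end validation check described above, written in Spark SQL/Scala as listed under Required Skills. The table names, key column, and tolerance are hypothetical placeholders.

  // Illustrative sketch only; assumes a Spark environment. The table names, key
  // column, and tolerance below are hypothetical placeholders.
  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.functions.col

  object IngestValidationSketch {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder().appName("ingest-validation-sketch").getOrCreate()

      // Hypothetical staging (source) and warehouse (target) tables for one dataset.
      val source = spark.table("staging.usage_events")
      val target = spark.table("warehouse.usage_events")

      // Reconciliation check: row counts should match after ingestion.
      val srcCount = source.count()
      val tgtCount = target.count()

      // Data-quality check: null rate on a required key column.
      val nullKeys = target.filter(col("subscriber_id").isNull).count()
      val nullRate = if (tgtCount == 0) 0.0 else nullKeys.toDouble / tgtCount

      println(s"source=$srcCount target=$tgtCount nullKeyRate=$nullRate")

      // Fail the validation run if counts diverge or the null rate exceeds tolerance.
      require(srcCount == tgtCount, "Row count mismatch between source and target")
      require(nullRate < 0.01, "Null rate on subscriber_id exceeds 1% tolerance")

      spark.stop()
    }
  }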


Required Skills:

  • 10+ years of Software development and testing experience.
  • 15 years of knowledge of the Client’s data domain, with a data architecture background.
  • Experience developing and testing ETL, real-time data processing, and analytics application systems.
  • Strong knowledge of Spark SQL and Scala development in a big data Hadoop environment, and/or BI/DW development experience.
  • Experience developing automated frameworks in a CI/CD environment.
  • Experience with cloud environments; GCP is a plus.
  • A solid understanding of common software development practices and tools.
  • Strong analytical skills, with a methodical approach to problem solving applied to the Big Data domain.
  • Good organizational skills and strong written and verbal communication skills.


Desired Skills:

  • Working experience on large client projects is a big plus.
  • Working experience on the Google Cloud Platform is a big plus.
  • Experience developing tools and utilities for monitoring and alerting.
  • Familiarity with project management and bug tracking tools, e.g., JIRA or a similar tool.


EDUCATION/CERTIFICATIONS:

  • Bachelor's Degree in Computer Science or Engineering.

Education

Bachelor's degree