Description

Responsibilities

  • Design, develop, test, deploy, support, and enhance data integration solutions that seamlessly connect and integrate the client’s enterprise systems with our Enterprise Data Platform.
  • Oversee the design of the data and technical architecture to ensure the integrity and accuracy of data.
  • Design, build, and maintain all business reporting and dashboards.
  • Drive innovation in data integration on the Apache Spark-based platform to ensure technology solutions leverage cutting-edge integration capabilities.
  • Facilitate requirements-gathering and process-mapping workshops; review business/functional requirement documents; and author technical design documents, testing plans, and scripts.
  • Assist with implementing standard operating procedures, facilitate review sessions with functional owners and end-user representatives, and leverage technical knowledge and expertise to drive improvements.

Required Skills

  • Good knowledge of Artificial Intelligence and Machine Learning.
  • Ability to understand requirements and provide technical solutions.
  • Strong grasp of algorithms and data structures.
  • Willingness to learn new skills and grow competencies.
  • Ability to work in a team-oriented, collaborative environment.

Required Experience

  • Minimum of 5 years of experience in data integration and pipeline development using Python/PySpark.
  • Minimum of 2 years of experience with data integration on AWS using Apache Spark, EMR, Glue, Kafka, Kinesis, and Lambda across S3, Redshift, RDS, and MongoDB/DynamoDB ecosystems.
  • Strong hands-on experience in Python development, including core Python and common Python libraries.
  • Strong real-world experience with PySpark in an AWS Cloud environment.
  • Strong analytical experience with databases, including writing complex queries, query optimization, debugging, user-defined functions, views, and indexes.
  • Strong experience with source control systems such as Git and Bitbucket, and with build and continuous integration tools such as Jenkins.
  • Experience in developing data processing tasks using PySpark.

Education Requirements

  • Bachelor’s Degree in Computer Science, Computer Engineering or a closely related field.
