Description

Responsibilities

  • Responsible for data analysis, data mining, design, development, and implementation on the Hadoop platform.
  • Develop, enhance, debug, support, maintain, and test software applications that support business units or supporting functions.
  • Implement complex real-time data processing algorithms in an optimized and efficient manner using Scala/Java.
  • Participate in the design, development and implementation of complex applications, often using new technologies.
  • Work with QA and performance testing teams to ensure software meets tollgate requirements.

Required Skills

  • Deep understanding of big data and Hadoop architecture.
  • Knowledge of at least one scripting language, such as Python, SQL, shell scripting, or Perl, is essential for this position.
  • Solid understanding of object-oriented programming (OOP) languages.
  • Excellent planning, project management, leadership and time management skills.
  • Ability to work independently.
  • Solid understanding of the Linux operating system is a plus.
  • Good communication skills.

Required Experience

  • Strong experience with Hadoop and related technologies such as MapReduce, HDFS, Spark, Kafka, and Hive.
  • Hands-on experience with Java, Scala, Python, and Unix shell scripting.
  • Must have working experience in core Java.
  • Solid experience with NoSQL databases such as HBase, MongoDB, and Cassandra.
  • Experience in designing and developing technical solutions with big data technologies.
  • Experience with analytical programming using Python, Scala, or R is desired.

Education Requirements

  • Bachelor’s Degree in Computer Science, Computer Engineering or a closely related field.
