Description

Responsibilities

  • Convert Hive/SQL queries into Spark transformations using Spark RDDs and PySpark.
  • Identify new business opportunities and optimization potential.
  • Design, develop, and support the implementation of microservices.
  • Perform exploratory and prescriptive analysis of structured and unstructured data.
  • Design and develop new data science and machine learning solutions, with a focus on text processing and natural language processing, e.g. information extraction and document classification.
  • Collaborate closely with product management, IT, and executive management.
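The Hive-to-Spark conversion work above follows a map/reduce-style transformation pattern. As a rough illustration only (the table, column names, and data are hypothetical), here is a plain-Python analogue of rewriting a Hive aggregation such as `SELECT dept, SUM(salary) FROM employees GROUP BY dept` as map and reduce-by-key steps; no Spark cluster is required to follow the idea.

```python
from collections import defaultdict

# Plain-Python analogue of an RDD-style pipeline. In PySpark the
# equivalent would be roughly:
#   rdd.map(lambda r: (r["dept"], r["salary"])) \
#      .reduceByKey(lambda a, b: a + b)
# (hypothetical data, for illustration only)

employees = [
    {"dept": "eng", "salary": 100},
    {"dept": "eng", "salary": 120},
    {"dept": "ops", "salary": 90},
]

# map step: project each row to a (key, value) pair
pairs = [(row["dept"], row["salary"]) for row in employees]

# reduceByKey step: combine all values that share a key
totals = defaultdict(int)
for dept, salary in pairs:
    totals[dept] += salary

print(dict(totals))  # {'eng': 220, 'ops': 90}
```

The same two-step shape (project to key/value pairs, then aggregate per key) covers most GROUP BY-style Hive queries when they are ported to Spark.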

Required Skills

  • Knowledge of Linux and continuous integration is a plus.
  • Expert knowledge of big data technologies, including but not limited to Python and Databricks.
  • Deep knowledge of machine learning, statistics, optimization, or a related field.
  • Excellent written and verbal communication skills, along with a strong desire to work in cross-functional teams.
  • Ability to learn and adapt quickly to emerging technologies.
  • Strong analytical and problem-solving skills.

Required Experience

  • 6-10 years of total IT development experience in all phases of the SDLC.
  • Hadoop/Java developer experience across all phases of Hadoop and HDFS development, with ETL/Informatica exposure.
  • Extensive, hands-on experience in requirements gathering, analysis, design, coding, code reviews, and unit and integration testing.
  • Hands-on experience with the Hadoop ecosystem and related big data technologies, including Spark, Kafka, HBase, Pig, Impala, Sqoop, Oozie, Flume, Mahout, Storm, Tableau, and Talend.
  • Experience working with SQL, PL/SQL, and NoSQL databases such as Microsoft SQL Server, Oracle, HBase, and Cassandra.
  • Experience importing and exporting data between HDFS and databases such as MySQL, Oracle, Netezza, Teradata, and DB2 using Sqoop and Talend.
  • Experience developing and scheduling ETL workflows in Hadoop using Oozie.
  • Experience using Tableau as a reporting tool.
  • Extensive experience collaborating with Enterprise Architects and infrastructure engineers to identify, design, and implement highly complex, end-to-end solutions.
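The Sqoop and Oozie experience listed above typically combines into workflows like the following minimal sketch: an Oozie workflow definition that runs a single Sqoop import as one ETL step. The workflow name, database host, table, and HDFS paths are hypothetical placeholders, and the property names (`${jobTracker}`, `${nameNode}`) are supplied via the usual job properties file.

```
<workflow-app name="orders-etl" xmlns="uri:oozie:workflow:0.5">
    <start to="sqoop-import"/>

    <!-- Import a (hypothetical) orders table from MySQL into HDFS -->
    <action name="sqoop-import">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>import --connect jdbc:mysql://dbhost/sales --table orders --target-dir /data/raw/orders -m 1</command>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>

    <kill name="fail">
        <message>Sqoop import failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

In practice such a workflow is wrapped in an Oozie coordinator for time- or data-triggered scheduling, with further actions (Hive, Spark) chained after the import.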

Education Requirements

  • Bachelor’s Degree in Computer Science, Computer Engineering or a closely related field.


 
