Responsibilities:
· Develop and integrate software applications using suitable development methodologies and standards, applying standard architectural patterns.
· Build libraries, user-defined functions, and frameworks around Hadoop.
· Develop user-defined functions (UDFs) to provide custom Hive, HDFS, Kafka, and Spark capabilities.
· Develop and execute test scripts independently or in concert with testing personnel.
· Collaborate with Business Analysts, Architects, and Senior Developers to establish the physical application framework (e.g., libraries, modules, execution environments).
· Perform end-to-end automation of the ETL process for the various datasets being ingested into the big data platform.
· Follow best practices and develop code according to the defined standards.
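As a concrete illustration of the UDF responsibility above: a classic Hive UDF is a Java class exposing an `evaluate` method. The sketch below shows that contract as plain Java so it stands alone; the class name and masking logic are illustrative only, and in a real deployment the class would extend `org.apache.hadoop.hive.ql.exec.UDF` (or implement `GenericUDF`), be packaged into a JAR, and be registered in Hive with `CREATE FUNCTION`.

```java
// Illustrative Hive-style UDF that masks all but the last four characters
// of a string (e.g., for a PII column). In Hive proper this class would
// extend org.apache.hadoop.hive.ql.exec.UDF; it is shown dependency-free
// here so the evaluate() contract is clear on its own.
public class MaskUDF {
    public String evaluate(String input) {
        if (input == null) {
            return null; // Hive passes NULL through; mirror that behavior
        }
        int keep = Math.min(4, input.length());
        StringBuilder masked = new StringBuilder();
        // Replace every character except the trailing `keep` with '*'
        for (int i = 0; i < input.length() - keep; i++) {
            masked.append('*');
        }
        masked.append(input.substring(input.length() - keep));
        return masked.toString();
    }
}
```

After registration, such a function would be invoked from HiveQL as, for example, `SELECT mask_udf(account_no) FROM customers;` (function and column names here are hypothetical).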
Required Tools/Languages:
Must Have
· Core Java, Spark or Scala, and Unix/Linux shell scripting
· Strong understanding of Hadoop internals - Cloudera or Hortonworks (HWX)
· Hive/Impala
· HBase
· Strong SQL
· Python
Education: Any graduate