Description

Job Duties:

- Troubleshooting Hadoop technologies including HDFS, MapReduce, YARN, Hive, Beeline, HBase, Accumulo, Tez, Sqoop, ZooKeeper, Spark, Kafka, Impala, and Storm.
- Deploying and maintaining Hadoop clusters; adding and removing nodes using cluster monitoring tools such as Ganglia, Nagios, or Cloudera Manager.
- Performing admin activities such as User Access Management (UAM), security management, data protection, node rebalancing, patching, cluster upgrades, service upgrades, service health checks, cluster disk and memory utilization monitoring, and resource pool allocation using Ambari in the Hortonworks environment.
- Handling DevOps activities: installing packages on multiple nodes with Ansible, verifying YAML scripts for Ansible, and harmonizing Python packages (see the playbook sketch after this list).
- Scheduling automated jobs using UC4.
- Designing, building, and configuring applications to fit a continuous integration environment.
- Maintaining security with Kerberos, Knox, and Ranger.
- Working with data delivery teams to set up new Hadoop users, including setting up Linux users, Kerberos principals, and UAT for HDFS, Pig, and MapReduce access for all groups (see the onboarding sketch after this list).
- Installing and supporting platform-level Hadoop infrastructure, including additional tools such as SAS, Informatica, Presto, and R.
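The Ansible package-installation duty can be illustrated with a short playbook. This is a minimal sketch only: the inventory group name (hadoop_workers) and the package names are illustrative assumptions, not details taken from this posting.

---
# Sketch: install OS-level and Python packages on every node in an assumed
# "hadoop_workers" inventory group. Package names are examples only.
- name: Install common packages on Hadoop cluster nodes
  hosts: hadoop_workers
  become: true
  tasks:
    - name: Ensure OS-level packages are present
      yum:
        name:
          - krb5-workstation   # Kerberos client tools
          - python3
        state: present

    - name: Keep a Python library pinned to one version on every node
      pip:
        name: requests==2.31.0   # example pin; real packages vary by project
        executable: pip3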
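The user-onboarding duty (Linux account, Kerberos principal, HDFS access) can likewise be sketched as a playbook. The host names, Kerberos realm, group names, and the kadmin_password variable are assumptions for illustration; in practice the password would come from a vault and the exact steps would follow site conventions.

---
# Sketch: onboard a new Hadoop user. The edge_nodes/kdc01 hosts, the
# EXAMPLE.COM realm, and kadmin_password are illustrative assumptions.
- name: Onboard a new Hadoop user
  hosts: edge_nodes
  become: true
  vars:
    new_user: analyst01
    kerberos_realm: EXAMPLE.COM
    # kadmin_password is assumed to be supplied securely (e.g. Ansible Vault)
  tasks:
    - name: Create the Linux account
      user:
        name: "{{ new_user }}"
        groups: hadoop
        append: true

    - name: Add a Kerberos principal (delegated to the KDC admin host)
      command: >
        kadmin -p admin/admin -w {{ kadmin_password }}
        -q "addprinc -randkey {{ new_user }}@{{ kerberos_realm }}"
      delegate_to: kdc01
      no_log: true

    - name: Create the user's HDFS home directory
      command: hdfs dfs -mkdir -p /user/{{ new_user }}
      become_user: hdfs

    - name: Set ownership of the HDFS home directory
      command: hdfs dfs -chown {{ new_user }}:hadoop /user/{{ new_user }}
      become_user: hdfs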

Work Location:

Various unanticipated work locations throughout the United States; relocation may be required, and candidates must be willing to relocate.

Minimum Requirements:

Education: Bachelor’s degree in Computer Science or a related Information Technology field.
