- Set up Hadoop clusters (Hortonworks/Cloudera), perform upgrades, and apply required configuration changes to Hadoop clusters. Handle the complete process of configuring and monitoring the clusters.
- Use Linux command-line tools to maintain Red Hat Linux servers.
- Perform various data operations in the Hadoop environment (see the HDFS operations sketch after this list).
- Interact with different application teams to provide hardware and architecture guidance, plan and estimate cluster capacity and storage (a worked estimate appears after this list), and create roadmaps for Hadoop cluster deployments.
- Provide support to teams using the Hadoop stack.
- Work with DataRobot, Unravel, Cloudbreak, AWS, Azure HDInsight, and Google Cloud Platform, and evaluate additional capabilities of these platforms that support the client’s Big Data vision.
- Work with Cloudera support: log issues in the Cloudera support portal and apply fixes per their recommendations.
- Work on different Hadoop ecosystem components such as Hive, HBase, Kafka, and HDFS, and set up disaster recovery for clusters to protect data against unexpected failures (a DR sketch follows this list).
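The routine data operations above are typically driven through the standard `hdfs dfs` CLI. A minimal sketch, assuming the `hdfs` client is on PATH with a valid cluster configuration; the paths and file names are hypothetical:

```python
# Routine HDFS data operations via the standard `hdfs dfs` CLI.
# Paths below are placeholders for illustration only.
import subprocess

def hdfs(*args: str) -> None:
    """Run an `hdfs dfs` subcommand and fail loudly on error."""
    subprocess.run(["hdfs", "dfs", *args], check=True)

hdfs("-mkdir", "-p", "/data/landing/2024-01-01")               # staging dir
hdfs("-put", "-f", "events.csv", "/data/landing/2024-01-01/")  # upload a file
hdfs("-du", "-h", "/data/landing")                             # usage report
hdfs("-chmod", "-R", "750", "/data/landing")                   # tighten perms
```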
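Capacity and storage estimation usually starts from a back-of-envelope formula: logical data volume, adjusted for compression, multiplied by the HDFS replication factor plus operating headroom. A sketch under assumed inputs; the ingest rate, retention, compression ratio, and headroom figures are illustrative, not from this posting:

```python
# Hypothetical HDFS raw-storage estimate for cluster capacity planning.

def raw_storage_tb(daily_ingest_tb: float,
                   retention_days: int,
                   replication: int = 3,           # default HDFS replication
                   compression_ratio: float = 2.0, # assumed codec savings
                   headroom: float = 0.25) -> float:
    """Raw disk (TB) needed across the cluster."""
    logical = daily_ingest_tb * retention_days / compression_ratio
    return logical * replication * (1 + headroom)

if __name__ == "__main__":
    # e.g. 0.5 TB/day retained for one year -> ~342 TB raw
    print(f"{raw_storage_tb(0.5, 365):.1f} TB raw")
```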
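For disaster recovery, one common pattern (not necessarily the one used here) is to snapshot a directory on the primary cluster and mirror it to a DR cluster with `distcp`. A hedged sketch; the NameNode URIs and paths are placeholders, and the source directory must already be snapshottable (`hdfs dfsadmin -allowSnapshot`):

```python
# Simple HDFS DR pass: snapshot on the primary cluster, then mirror
# the snapshot to a DR cluster with distcp. URIs/paths are hypothetical.
import subprocess
from datetime import datetime, timezone

SRC = "hdfs://prod-nn:8020/data/warehouse"
DST = "hdfs://dr-nn:8020/data/warehouse"

tag = datetime.now(timezone.utc).strftime("dr-%Y%m%d-%H%M")

# Freeze a consistent point-in-time view on the primary cluster.
subprocess.run(["hdfs", "dfs", "-createSnapshot", SRC, tag], check=True)

# Copy the snapshot contents; -update skips files already in sync.
subprocess.run(
    ["hadoop", "distcp", "-update", f"{SRC}/.snapshot/{tag}", DST],
    check=True,
)
```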
Requirements: Master’s degree in Computer Science, IT, Engineering, or a related field, with at least 6 months of experience.