Description

Minimum Qualifications:
·    Bachelor's degree in Information Systems, Engineering, Computer Science, or related field from an accredited university.
·    Intermediate experience in a Hadoop production environment.
·    Must have intermediate experience and expert knowledge with at least 4 of the following:
o  Hands on experience with Hadoop administration in Linux and virtual environments.
o  Well versed in installing and managing Hadoop distributions (Cloudera).
o  Expert knowledge of and hands-on experience with Hadoop ecosystem components, including HDFS, YARN, Hive, LLAP, Druid, Impala, Spark, Kafka, HBase, and Cloudera Workbench.
o  Thorough knowledge of the overall Hadoop architecture.
o  Experience using and troubleshooting open-source technologies, including configuration management and deployment.
o  Data lake and data warehouse design and development.
o  Experience reviewing existing database and Hadoop infrastructure and determining areas for improvement.
o  Implementing a software lifecycle methodology to ensure adherence to supported releases and roadmaps.
o  Configuring NameNode high availability.
o  Scheduling and performing backups across the Hadoop ecosystem.
o  Moving data into and out of Hadoop clusters.
o  Good hands-on scripting experience in a Linux environment.
o  Experience with project management concepts, tools (MS Project), and techniques.
o  A record of working effectively with application and infrastructure teams.
·    Strong ability to organize information, manage tasks, and use available tools to contribute effectively to the team and the organization.

Education

Bachelor's degree in Information Systems, Engineering, Computer Science, or a related field.