Job Description:

• Install, operate, and monitor big data platforms

• Ensure the reliability and normal operation of multiple core big data and online computing systems, resolving issues within agreed SLAs

• Evaluate infrastructure requirements for deployment and upgrade activities

• Develop and maintain technical documentation covering design, configuration, troubleshooting, backup, disaster recovery (DR), etc.

• Ensure that deployment and operation of the data platform comply with prevailing security, regulatory, and audit requirements

• Provide recommendations on data governance practices and in-depth optimization best practices

Requirements:

Primary skill set: DevOps and systems engineering on Unix

• Solid grounding in computer software fundamentals; understanding of the Linux operating system, storage, network I/O, and related principles

• Familiarity with one or more programming languages or automation tools, such as Python, Go, Java, shell scripting, or Ansible

• Experience with Cloudera, Informatica, or Denodo and their runtime components would be highly advantageous

• Relevant experience with distributed computing and big data systems (Hadoop/HDFS, Kubernetes, Docker, OpenStack, Spark, Flink, etc.) would be advantageous

• Strong data structure and system design skills are highly preferred

• Good knowledge of information security is highly preferred

Education:

Any Graduate