Description

Responsibilities

  • Work with various mission stakeholders, communicating system features and concepts to other users and developers.
  • Tune and manage the performance of SQL-based databases.
  • Interact with program management.
  • Assist other developers as needed.
  • Document, standardize, and automate deployment processes.
  • Debug various development and O&M processes and systems.
  • Support and troubleshoot scalability, HA, performance, monitoring, and backup/restore.

Required Skills

  • Development skills in ETL applications built with Apache Spark, Java, SQL/Pentaho, and Python.
  • Ability to learn new technologies and to evaluate competing technologies when solving a problem.
  • Extensive knowledge of databases such as Oracle, MySQL, and Redshift.
  • Data warehousing concepts and development experience with an RDBMS (Teradata, Oracle, SQL Server, etc.).
  • Strong interpersonal skills and a positive attitude.
  • Data warehousing, ETL, and Hadoop (Big Data) skills are essential.

Required Experience

  • ETL experience in Hadoop and databases, including moving data across platforms and partnering with Ops on deployments.
  • Java/Python experience is a must.
  • Extensive ETL and database experience with SQL-based databases.
  • 5 years' experience in DevOps, especially test automation and application deployment/monitoring.
  • Experience with OpenStack and/or AWS cloud services.
  • Experience with Linux systems (Ubuntu preferred).

Education Requirements

  • Bachelor’s Degree in Computer Science, Computer Engineering or a closely related field.
