Description

Job Duties:

Under limited supervision, develop complex software artifacts and systems using Big Data technologies; utilize Agile methodology; provide expertise throughout the SDLC; attend meetings with stakeholders to understand project requirements and the need for new systems or enhancements to existing systems; create object models and process flows with sequence diagrams to capture intended system behavior; create architecture and design data flows; design and develop data pipelines and ETL processes to load data from multiple sources into HDFS; develop scripts to process structured data; convert complex queries into Hive Query Language; design and develop software using the latest tools and frameworks; implement software that meets code quality standards; use test-driven development and behavior-driven development to build required software test scripts; improve the maintainability of software; support QA Analysts by validating defect reporting and tracking; perform design and code fixes to address defects; perform code reviews; mentor and assist less experienced team members; provide production support for ongoing issues; and use Windows, UNIX, Teradata, Oracle, SQL Server, Hadoop, Hive, Sqoop, Oozie, HDFS, HBase, Pig, MongoDB, Spark, Scala, SQL, NiFi, Kafka, Informatica, Ab Initio, PuTTY, shell scripting, Jenkins, Autosys, BMC Remedy, and related tools.

Minimum Qualifications

Education:

Master's degree in Computer Science, Computer/Electronic Engineering, or a related field of study (will accept an equivalent foreign degree).

Experience:

Two (2) years of experience as a Software Developer, Systems Analyst, Programmer, or in a related occupation.

Other Requirements:

Experience must include two (2) years of work with Oracle, Teradata, Hadoop, Informatica, and Ab Initio. Must be willing to relocate.
