Description

Skills & Job Description :


 

 

 

 

 

Data Engineering Focus (Core Function)

  • Design and develop ETL jobs using Ab-Initio (primary requirement).
  • Work with large-scale datasets and ensure efficient data processing.
  • Apply strong SQL skills to manipulate and query data effectively.
  • Understand and work with various data formats (e.g., JSON, Parquet, CSV).
  • Leverage AWS services such as Redshift, S3, and Lambda for cloud-based data solutions.

Application Development Focus

  • Perform high-complexity analysis, design, development, and unit testing of system-level applications.
  • Translate user requirements and design documents into functional software.
  • Resolve defects identified during testing cycles.
  • Apply solid knowledge of programming languages, application servers, and database servers.
  • Use modern languages (e.g., Python, SQL, Ab-Initio) to meet internal business needs.

Required Skills & Technologies:

  • ETL Tools: Ab-Initio (must-have)
  • Programming Languages: Python, SQL (must-have)
  • Cloud Platforms: AWS (S3, Lambda, Redshift, EMR) (must-have)
  • Data Handling: Strong SQL, data formats, high-volume data processing (must-have)
  • Development Practices: Proficiency with Software Development Lifecycle (SDLC)
  • Additional Knowledge (Nice to Have): J2EE, Java, EJB, ASP, PowerBuilder, C/C++, Visual Basic, Oracle, Sybase, MQ Series

Qualifications:

  • Bachelor’s degree in Computer Science, Information Systems, or a related field.
  • Professional certifications are a plus.
  • 8+ years of experience in software development and data engineering.
  • Proven track record in roles with similar scope and responsibility.

Education

Bachelor's degree