Job Description

Experience & Core Skills

  1. 10+ years hands-on software development experience
  2. Strong algorithms, data structures, OOP, design patterns
  3. Proficient in multi-threaded programming, troubleshooting, debugging, and performance analysis
  4. Google Cloud Platform development expertise


Mandatory Technical Skills

  1. SQL (BigQuery, Hive, Spark SQL)
  2. Spark job debugging & performance tuning
  3. Python, PySpark
  4. Big Data Architecture – Hadoop, Hive, SQL, Airflow
  5. Strong architecture skills for end-to-end data pipelines
  6. Unix & GitHub proficiency


Secondary/Preferred Skills

  1. Client & stakeholder management
  2. Offshore team & delivery leadership
  3. Relevant certifications (big data, Google Cloud Platform, etc.)


Key Responsibilities

  1. Design, develop, test, and maintain multi-tier software & large-scale data solutions
  2. Own full lifecycle: architecture → coding → testing → deployment
  3. Collaborate with engineers, Ops, and business stakeholders on scalable solutions
  4. Ensure code is high-quality, well-tested (unit tests in Go), reusable, and maintainable
  5. Deliver features/sub-systems on time and to spec
  6. Drive solutions with excellent problem-solving and system design thinking

Education

Any graduate degree