Job Description
Experience & Core Skills
- 10+ years hands-on software development experience
- Strong algorithms, data structures, OOP, design patterns
- Proficient in multi-threaded programming, troubleshooting, debugging, and analytics
- Google Cloud Platform development expertise
Mandatory Technical Skills
- SQL (BigQuery, Hive, Spark SQL)
- Spark job debugging & performance tuning
- Python, PySpark
- Big Data Architecture – Hadoop, Hive, SQL, Airflow
- Strong architecture skills for end-to-end data pipelines
- Unix & GitHub proficiency
Secondary/Preferred Skills
- Client & stakeholder management
- Offshore team & delivery leadership
- Relevant certifications (big data, GCP, etc.)
Key Responsibilities
- Design, develop, test, and maintain multi-tier software & large-scale data solutions
- Own full lifecycle: architecture → coding → testing → deployment
- Collaborate with engineers, Ops, and business stakeholders on scalable solutions
- Ensure code is high-quality, well-tested (unit tests in Go), reusable, and maintainable
- Deliver features/sub-systems on time and to spec
- Apply strong problem-solving and system-design thinking to drive solutions