Description

We are seeking a seasoned Solutions Architect with 15–20 years of experience designing and delivering enterprise-grade data and application solutions. The ideal candidate will have strong expertise across data engineering, cloud platforms (AWS), and full-stack development using Java (Spring Boot, microservices) and Python (including PySpark). This role demands a strategic mindset combined with hands-on capability to lead architectural efforts across cloud-native data and application ecosystems.

Key Responsibilities:

  • Design and implement end-to-end cloud-native solutions leveraging AWS, Java, and Python technologies.
  • Lead the architecture and development of data pipelines using PySpark and AWS Glue, and of backend systems using Java (Spring Boot) and microservices.
  • Collaborate with stakeholders to understand business requirements and translate them into scalable, secure, and high-performance technical solutions.
  • Architect data lakes, data warehouses, and real-time streaming platforms on AWS using services such as S3, Redshift, Glue, EMR, Lambda, and Kinesis.
  • Design and build microservices-based architectures using Java, Spring Boot, and containerization technologies (Docker, Kubernetes).
  • Drive best practices for CI/CD, DevOps, and infrastructure-as-code (Terraform, CloudFormation).
  • Lead design reviews, architecture evaluations, and performance tuning across systems.
  • Provide technical leadership, mentorship, and architectural governance to engineering teams.
  • Ensure solutions adhere to security, compliance, scalability, and cost-efficiency standards.

Requirements

Required Skills and Experience:

  • 15–20 years of experience in enterprise software architecture, with deep expertise in data engineering, cloud solutions, and backend development.
  • Strong hands-on skills in Python, PySpark, and Java (Spring Boot, microservices).
  • Proven experience designing solutions on AWS (e.g., S3, Glue, Lambda, Redshift, EMR, Step Functions, Kinesis).
  • Strong understanding of ETL/ELT pipelines, real-time data processing, and streaming architectures.
  • Experience designing and deploying microservices and RESTful APIs using Java.
  • Solid understanding of data modeling, data partitioning, performance tuning, and security best practices.
  • Experience with containerization (Docker) and orchestration platforms such as Kubernetes.
  • Strong knowledge of CI/CD tools, DevOps pipelines, and infrastructure automation.
  • Excellent communication and interpersonal skills, with the ability to engage both technical and business stakeholders.

Education

Bachelor's degree