Description

Responsibilities:

  • Develop and maintain data processing applications using Spark and Scala.
  • Collaborate with cross-functional teams to understand data requirements and design efficient solutions.
  • Implement test-driven development practices to enhance application reliability.
  • Deploy artifacts from lower to higher environments, ensuring smooth transitions.
  • Troubleshoot and debug Spark performance issues to ensure optimal data processing.
  • Work in an agile environment, contributing to sprint planning and development, and delivering high-quality solutions on time.
  • Provide essential support for production batches, addressing issues and providing fixes to meet critical business needs.

Skills/Competencies:
  • Strong knowledge of the Scala programming language.
  • Excellent problem-solving and analytical skills.
  • Proficiency in Spark, including the development and optimization of Spark applications.
  • Ability to troubleshoot and debug performance issues in Spark.
  • Understanding of design patterns and data structures for efficient data processing.
  • Familiarity with database concepts and SQL.
  • Java and Snowflake (Good to have).
  • Experience with test-driven development practices (Good to have).
  • Familiarity with Python (Good to have).
  • Knowledge of Databricks (Good to have).
  • Understanding of DevOps practices (Good to have).

Education

Any Graduate