Description

Key Skills: Scala, Java, Data Engineering, GCP, Google Cloud applications, Apache Spark, Apache Kafka

Roles and Responsibilities:

  • Engineer the data transformations and analysis for the Cash Equities Trading platform.
  • Serve as the technology SME for the real-time stream-processing paradigm.
  • Bring your experience in low-latency, high-throughput, auto-scaling platform design and implementation.
  • Implement an end-to-end platform service, clearly assessing operational and non-functional needs.
  • Mentor and coach the engineering and SME talent to realize their potential and build a high-performance team.
  • Manage complex end-to-end functional transformation modules, from planning and estimation to execution.
  • Improve platform standards by bringing new ideas and solutions to the table.

Skills Required:

  • 12+ years of experience in data engineering technology and tools.
  • Must have experience with Java/Scala-based implementations for enterprise-wide platforms.
  • Experience with Apache Beam, Google Dataflow, and Apache Kafka as a real-time stream-processing technology stack.
  • Complex stateful processing of events with partitioning for higher throughput.
  • Experience fine-tuning throughput and improving the performance of data pipelines.
  • Experience optimizing, querying, and managing analytical data stores.
  • Experience with alternative data engineering tools (e.g., Apache Flink, Apache Spark).
  • Ability to reason about your decisions and convince stakeholders and the wider technology team of them.
  • Set the highest standards of integrity and ethics, and lead by example on technology implementations.

Education:  Bachelor's Degree in related field