Description

Must Have:

  • Spark development experience with both batch and streaming workloads
  • Experience with PySpark and data engineering
  • ETL implementation and migration to Spark
  • Programming knowledge of Scala, Java, and Python
  • Experience with Kafka and Spark Streaming (DStream and Structured Streaming)
  • Experience using Jupyter notebooks or similar developer tools
  • Experience with Airflow or other workflow engines

Good to Have Skills:

  • Streaming with Flink and Kudu
  • Workflow automation and CI/CD
  • NiFi streaming and transformation
  • Informatica workflow migration

Education

Any Graduate