Description

Required Skills:

  • Big Data Frameworks: Extensive experience in PySpark, Spark-Scala, Hadoop, Hive, and HBase.

  • Programming: Proficiency in Python, SQL, and Linux shell scripting; familiarity with Scala is a plus.

  • Cloud Services: In-depth experience with AWS S3, EMR, RDS, and MWAA (Airflow).

  • Data Storage and Integration: Strong knowledge of Oracle, Redshift, and other relational and NoSQL databases.

  • Data Processing: Expertise in handling diverse file formats such as CSV, Parquet, JSON, and Avro.

Education

Any Graduate