Job Description:
Bachelor’s degree, or foreign equivalent, from an accredited institution is required.
At least 3 years of hands-on experience with distributed Hadoop frameworks, processing large volumes of data using Spark and the Hadoop ecosystem.
At least 3 years of experience with Spark/PySpark is required.
At least 3 years of strong experience with Scala is required.
At least 3 years of experience with Python is required.
At least 3 years of experience with SQL and any RDBMS is required.