Minimum 8 years of experience
Big Data Engineer Responsibilities:
Meeting with managers to determine the company’s Big Data needs.
PySpark, HDFS, and Hive are a must.
Experience with AWS or Snowflake.
Developing Hadoop systems.
Loading disparate data sets and performing pre-processing using Hive or Pig (a minimal sketch of this kind of work follows this list).
Finalizing the scope of the system and delivering Big Data solutions.
Training staff on data resource management.
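For illustration only, the following is a minimal PySpark sketch of the loading and pre-processing responsibility described above, assuming a Hive-enabled Spark deployment; the HDFS path, column names, and Hive table name are hypothetical placeholders, not part of this role's actual environment.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Build a Spark session with Hive support (assumes a configured Hive metastore).
spark = (
    SparkSession.builder
    .appName("raw-events-preprocessing")  # hypothetical job name
    .enableHiveSupport()
    .getOrCreate()
)

# Load a disparate raw data set from HDFS; path and schema are illustrative only.
raw = spark.read.option("header", "true").csv("hdfs:///data/raw/events/")

# Basic pre-processing: drop duplicates, normalize a timestamp, filter bad rows.
clean = (
    raw.dropDuplicates()
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .filter(F.col("event_id").isNotNull())
)

# Persist the cleaned data set to a Hive table for downstream use.
clean.write.mode("overwrite").saveAsTable("analytics.events_clean")

spark.stop()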
Big Data Engineer Requirements:
Bachelor’s degree in computer engineering or computer science.
Previous experience as a big data engineer.
In-depth knowledge of Hadoop, Spark (including PySpark), Python, and similar frameworks.
Knowledge of NoSQL databases (e.g., Redis, MongoDB) and relational (RDBMS) databases.
Excellent project management skills.
Good communication skills.
Ability to solve complex networking, data, and software issues.