• Installed and configured Hadoop clusters for application development, along with ecosystem tools such as Hive, Pig, Sqoop, HBase, and ZooKeeper.
• Developed ETL processes to load data into HDFS using Sqoop and to export results back to the RDBMS (see the Sqoop sketch after this list).
• Used Pig as an ETL tool to perform transformations, event joins, and pre-aggregations before storing the data in HDFS (see the Pig sketch after this list).
• Developed MapReduce programs to cleanse data in HDFS obtained from heterogeneous sources, making it suitable for ingestion into Hive schemas for analysis (see the MapReduce sketch after this list).
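
The following is a minimal Java sketch of how a Sqoop-based import/export step like the one above might be driven, here by invoking the Sqoop CLI through ProcessBuilder. The JDBC URL, credentials, table names, and HDFS paths are hypothetical placeholders, not details from the original work.

import java.io.IOException;

// Sketch: driving the Sqoop CLI from Java. All connection details,
// tables, and paths below are hypothetical placeholders.
public class SqoopEtlSketch {
    static void run(String... cmd) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(cmd).inheritIO().start();
        if (p.waitFor() != 0) {
            throw new IOException("command failed: " + String.join(" ", cmd));
        }
    }

    public static void main(String[] args) throws Exception {
        // Import an RDBMS table into HDFS with four parallel mappers.
        run("sqoop", "import",
            "--connect", "jdbc:mysql://dbhost/sales",
            "--username", "etl_user", "--password-file", "/user/etl/.pw",
            "--table", "orders",
            "--target-dir", "/data/raw/orders",
            "--num-mappers", "4");

        // Export aggregated results from HDFS back to an RDBMS table.
        run("sqoop", "export",
            "--connect", "jdbc:mysql://dbhost/sales",
            "--username", "etl_user", "--password-file", "/user/etl/.pw",
            "--table", "order_summaries",
            "--export-dir", "/data/out/order_summaries");
    }
}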
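Next, a sketch of the Pig step using Pig's embedded Java API (PigServer), covering a transformation, an event join, and a pre-aggregation before storing results in HDFS. The datasets, field names, and join/aggregation logic are assumed for illustration.

import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

// Sketch: embedded Pig performing a join and pre-aggregation.
// Paths, field names, and schemas are hypothetical placeholders.
public class PigEtlSketch {
    public static void main(String[] args) throws Exception {
        PigServer pig = new PigServer(ExecType.MAPREDUCE);

        // Load raw click and purchase events from HDFS.
        pig.registerQuery("clicks = LOAD '/data/raw/clicks' USING PigStorage(',') "
                + "AS (user_id:chararray, page:chararray, ts:long);");
        pig.registerQuery("purchases = LOAD '/data/raw/purchases' USING PigStorage(',') "
                + "AS (user_id:chararray, amount:double, ts:long);");

        // Event join on user_id, then pre-aggregate spend per user.
        pig.registerQuery("joined = JOIN clicks BY user_id, purchases BY user_id;");
        pig.registerQuery("grouped = GROUP joined BY clicks::user_id;");
        pig.registerQuery("spend = FOREACH grouped GENERATE group AS user_id, "
                + "SUM(joined.purchases::amount) AS total_spend;");

        // Store the aggregate back into HDFS for downstream use.
        pig.store("spend", "/data/out/user_spend");
    }
}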
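Finally, a sketch of the kind of map-only MapReduce cleansing job described above, using the standard org.apache.hadoop.mapreduce API. The expected field count, delimiters, and paths are assumptions made for the example.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Sketch: a map-only cleansing job that normalizes delimiters and drops
// malformed records so the output lines up with a Hive table schema.
public class CleanseJob {

    public static class CleanseMapper
            extends Mapper<LongWritable, Text, NullWritable, Text> {
        private static final int EXPECTED_FIELDS = 5; // assumed schema width
        private final Text out = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            // Normalize mixed delimiters from heterogeneous sources to tabs.
            String[] fields = value.toString().split("[,;|\\t]");
            if (fields.length != EXPECTED_FIELDS) {
                return; // drop malformed records
            }
            for (int i = 0; i < fields.length; i++) {
                fields[i] = fields[i].trim();
            }
            out.set(String.join("\t", fields));
            ctx.write(NullWritable.get(), out);
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "cleanse");
        job.setJarByClass(CleanseJob.class);
        job.setMapperClass(CleanseMapper.class);
        job.setNumReduceTasks(0); // map-only: no aggregation needed
        job.setOutputKeyClass(NullWritable.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path("/data/raw/events"));
        FileOutputFormat.setOutputPath(job, new Path("/data/clean/events"));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}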
Requirements: Master’s degree in Computer Science, IT, or a related field, with at least 12 months of experience.