Description

Job Description:
Mandatory skills: Databricks, PySpark, Python, AWS, SQL.
Technical Skills:
• Strong knowledge of and hands-on experience with Databricks on AWS
• Experience with AWS S3, Redshift, EC2, and Lambda services
• Extensive experience in developing and deploying big data pipelines
• Extensive experience in SQL and PySpark
• Strong hands-on SQL development skills and an in-depth understanding of SQL optimization and tuning techniques with Redshift
• Development in notebooks (e.g., Jupyter, Databricks, Zeppelin)
• Development experience with Spark
• Experience in a scripting language such as Python, plus any other programming language
Roles and Responsibilities:
• Candidates must have hands-on experience with Databricks on AWS
• Good development experience using Python/Scala, Spark SQL, and DataFrames (a minimal sketch of this kind of work follows this list)
• Hands-on experience with Databricks and data lakes, along with SQL knowledge, is a must
• Performance tuning, troubleshooting, and debugging of Apache Spark
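For illustration only, the snippet below is a minimal PySpark sketch of the kind of pipeline work described above: reading raw data from S3, aggregating it with Spark SQL/DataFrames in a Databricks-style notebook, and writing the result back to S3. The bucket names, paths, and column names are hypothetical placeholders, not part of this role's actual environment.

# Minimal PySpark sketch of an S3 -> Spark SQL -> S3 pipeline (all paths and columns are placeholders).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders-daily-summary").getOrCreate()

# Read raw CSV data from S3 (hypothetical bucket/path).
orders = spark.read.csv("s3://example-bucket/raw/orders/", header=True, inferSchema=True)

# Register a temporary view so the aggregation can be written in Spark SQL.
orders.createOrReplaceTempView("orders")
daily_summary = spark.sql("""
    SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS total_amount
    FROM orders
    GROUP BY order_date
""")

# Write the curated result back to S3 as Parquet, partitioned by date (hypothetical path).
daily_summary.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders_daily/"
)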
Process Skills:
• Agile – Scrum
Qualification:
• Bachelor of Engineering (Computer background preferred)

Education

Any Graduate