Description

  • Should build Databricks Notebooks developed in Spark SQL/PySpark
  • Should be able to create Unity Catalog tables
  • Should have strong hands-on experience with Delta Lake versioning concepts
  • Should be able to develop workflows in Databricks
  • Should be able to load data from Databricks to Snowflake
  • Should be able to pull data from on-premises systems into the Azure environment
  • Should be able to convert mainframe jobs to Databricks Notebooks
  • Will work in various unanticipated client locations throughout the U.S.  

Position Requirements:
Bachelor’s degree in Computer Science, Information Technology, Computer Applications, Engineering (any), or a related field.

Employer will accept any suitable combination of education, training, or experience determined to be equivalent to a U.S. bachelor’s degree by an accredited credentials evaluation service.

Special Requirements:
Position requires travel/relocation to various unanticipated client locations throughout the U.S., with expenses paid by the employer.

Education

Any Graduate