Description

Required Qualifications:

  • Bachelor’s degree, or equivalent additional years of experience
  • 5-7+ years of experience
  • Experience with Google Cloud Platform (GCP)
  • Experience with data engineering tools and technologies, such as Apache Spark (including PySpark), Hadoop, and Hive
  • Familiarity with cloud computing concepts, such as virtual machines, storage, and networking
  • Experience with cloud-based data engineering tools and services
  • Familiarity with GCP services, such as Google Cloud Storage, Cloud Functions, Pub/Sub, Cloud Scheduler, Cloud Run, BigQuery, and Cloud SQL
  • Experience with GCP APIs and SDKs
  • Familiarity with data modeling and database design
  • Strong SQL skills
  • Strong problem-solving and analytical skills
  • Excellent communication and teamwork skills
Top 3 Required Skills:

  • Knowledge of the GCP environment
  • Terraform development
  • Database modeling

Preferred Qualifications:

  • Experience writing Python scripts for data engineering tasks
  • Familiarity with Python libraries for data manipulation and analysis, such as NumPy, Pandas, and SciPy
  • Healthcare or insurance background
  • Experience with Hadoop and Hive

Education

Any Graduate