Description

Job Description:

  • 5+ years of experience in building out data pipelines using Python
  • 3+ years of experience working in AWS Cloud, especially services such as S3, EMR, Lambda, Glue, Athena, EventBridge, and Step Functions.
  • 3+ years of experience with Spark
  • Experience with workflow management tools like Airflow and messaging technologies like Kafka.
  • Cloud database experience using Snowflake and Redshift.
  • Experience with database and data warehouse design and implementation, as well as dimensional modeling, is required.
  • Good working knowledge of conceptual, logical, and physical data modeling concepts.
  • Ability to communicate status and challenges and align with the team.
  • Demonstrated ability to learn new skills and work as part of a team.

Education

Any Graduate