Required Skills & Experience:
- Experience with large-scale distributed data processing systems.
- Expertise in data modelling, testing, quality, access, and storage.
- Proficiency in Python and SQL, plus hands-on experience with Databricks and dbt.
- Experience implementing cloud data technologies (GCP, Azure, or AWS).
- Experience improving the data development lifecycle and shortening delivery lead times.
- Agile delivery experience.
Skills:
Python, PySpark, Azure Databricks, SQL