• AWS, IaC (Terraform), PostgreSQL, and SQL.
• 4-year college degree required; Bachelor's degree in an Information Technology field or related technical discipline preferred.
• 3+ years of experience with Python and PySpark, building scalable real-time streaming ETL applications and data warehouses.
• Advanced proficiency programming in PySpark and Python ETL modules is required.
• Experience working with and processing large data sets in a time-sensitive environment while minimizing errors.
• Proficient with AWS and AWS tools (S3, Glue, Lake Formation, Athena, Redshift).
• Experienced in maintaining infrastructure as code using Terraform.
• Advanced understanding of both SQL and NoSQL technologies such as MongoDB / DocumentDB.
• Hands-on experience working with Qlik (Attunity) Replicate.