Required Skills:
• 3+ years of experience with AWS services, including the AWS Console, S3, Lambda, Kinesis, Glue, and CloudWatch
• 3+ years of Kafka Streams experience
• 5+ years of data engineering, data pipeline development, and ETL experience using Python, SQL, and Snowflake
• Experience requesting, transforming, and ingesting data from REST and SOAP APIs
• Proficiency in Python, SQL, cloud databases, and ETL development processes and tools
• Strong understanding of traditional relational databases, data and dimensional modeling principles, and data normalization techniques
• Ability to initiate, drive, and manage projects with competing priorities
• Ability to communicate effectively with business leaders, IT leadership, and engineers
• A passion for data and for helping the business turn data into information and action
Bonus Skills:
• Experience with distributed data processing technologies like PySpark
• Experience with pipeline technologies like dbt, Apache Airflow, or Fivetran
• Experience with MPP technologies and databases
• Experience with data visualization tools like Tableau or Sigma
• Experience with containerization and orchestration tools like Docker or Kubernetes
• Experience with Azure data product offerings and platform
• Experience working with Salesforce and SAP data
• Experience using Terraform or other infrastructure-as-code tools
Required Education:
• Bachelor's degree in information systems, computer science, or related technical field