What you’ll do
Design and implement data engineering frameworks to scale the development and deployment of data pipelines across the D&A organization
Build complex batch and streaming pipelines to ingest data from upstream Equifax cloud systems (an illustrative sketch follows this list)
Enable new GCP services and build self-service capabilities to enhance the functionality of our data platform
Mentor junior engineers within the Data Platform team and across the broader D&A organization
Play an active role in setting engineering standards and best practices in EWS D&A
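To give a concrete flavor of the pipeline work described above, here is a minimal, illustrative Cloud Composer (Airflow) DAG that loads batch files from Cloud Storage into BigQuery. This is a sketch only, not a description of Equifax's actual pipelines; the DAG, bucket, dataset, and table names are hypothetical placeholders.

# Illustrative only: a minimal Cloud Composer (Airflow) batch-ingestion DAG; all resource names are hypothetical
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="example_daily_ingest",          # hypothetical DAG name
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Load the run date's files from a landing bucket into a raw BigQuery table
    load_raw_events = GCSToBigQueryOperator(
        task_id="load_raw_events",
        bucket="example-landing-bucket",                                  # hypothetical bucket
        source_objects=["events/{{ ds }}/*.parquet"],                     # one folder per run date
        destination_project_dataset_table="example-project.raw.events",   # hypothetical table
        source_format="PARQUET",
        write_disposition="WRITE_APPEND",
    )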
What experience you need
At least 7 years of experience in data engineering, data architecture, or a related field
Strong understanding of data engineering principles and best practices, including data modeling, data warehousing, and data integration
At least 3 years of experience working in a GCP big data environment
Experience building complex data pipelines and solutions using two or more of the following: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Functions (an illustrative streaming sketch follows this list)
Experience with Airflow or Cloud Composer
Proficiency in Python development
Professional experience with SQL
Proven ability to effectively communicate complex technical concepts to both technical and non-technical stakeholders
Bachelor's degree or higher in Computer Science, Information Systems, or a related field
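As an illustrative sketch of the Pub/Sub-to-BigQuery pattern referenced in the requirements above, a streaming Dataflow job written with the Apache Beam Python SDK might look roughly like this. It is not a definitive implementation; the project, subscription, table, and schema are hypothetical placeholders.

# Illustrative only: a minimal Apache Beam (Dataflow) streaming job; resource names and schema are hypothetical
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    # streaming=True marks the pipeline as unbounded; add --runner=DataflowRunner to run on Dataflow
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/example-sub")
            | "ParseJson" >> beam.Map(lambda message: json.loads(message.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:raw.events",
                schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )

if __name__ == "__main__":
    run()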
What could set you apart
GCP Professional Data Engineer or Professional Cloud Architect certification
Experience programming in Java
Experience scaling Airflow
Experience with dbt
DevOps/CI/CD experience (Jenkins and GitHub) and infrastructure-as-code (IaC) experience using Terraform to manage GCP infrastructure