As a Senior Data Engineer, you will play a pivotal role in designing, implementing, and optimizing data services and pipelines, ensuring they support the needs of the business. You will collaborate with cross-functional teams to enable operational data analysis, enhance eventing systems, and develop tools that empower our data and services teams. Your expertise will also be critical in driving full automation, establishing best practices, creating scalable architectures, and optimizing our data environments. This role is required to be hybrid three days per week in our San Francisco, Los Angeles, or Dallas office and will report to our Director of Data Engineering - Platform Engineering.
Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 8+ years of experience in data engineering, with a focus on building and optimizing data pipelines, data architecture, and eventing systems.
- Extensive experience with the AWS cloud platform and its data-related services.
- Proficiency in automation frameworks (e.g., Terraform, AWS CloudFormation).
- Proficiency in data lake and pipeline tools and frameworks (e.g., Databricks, Apache Kafka, Amazon Kinesis, AWS Glue).
- Proficiency in one or more programming languages (e.g., Python, Java).
- Strong understanding of SQL databases (e.g., Redshift, Snowflake, RDS) and NoSQL databases (e.g., DynamoDB), and experience enabling analytical queries on these platforms.
- Experience with search optimization, including schema design and query tuning.
- Experience in building CI/CD pipelines, testing frameworks (e.g., dbt), and best practices in data engineering.
- Strong problem-solving skills and a proactive approach to identifying and addressing issues.
- Ability to independently own and execute projects while effectively collaborating with the team to influence and shape the vision of the data engineering organization.
- Strong communication skills and the ability to mentor and guide junior engineers.