The Vulnerability Management Platforms Development Squad is seeking a highly skilled Data Engineer with deep expertise in PostgreSQL, Snowflake, or Elasticsearch. The ideal candidate will have advanced experience in data modeling, ETL processes, and building large-scale data solutions. A strong technical background in data engineering, along with expertise in streaming services (e.g., Kafka), is required for this role.
Key Responsibilities:
- Data Pipeline Development: Build, optimize, and manage data pipelines, orchestration tools, and governance frameworks to ensure efficient, high-quality data flow.
- Data Quality & Governance: Perform data quality checks, enforce governance standards, and apply quality rules to ensure data integrity.
- Real-time Data Processing: Utilize Python and streaming technologies (e.g., Kafka) for real-time data processing and analytics, with PostgreSQL as the source.
- Advanced SQL Development: Write complex SQL queries for data manipulation, analysis, and integration.
- Snowflake or Elasticsearch Architecture & Implementation: Design and implement large-scale data integration solutions using Snowflake or Elasticsearch.
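To illustrate the data quality responsibility above, here is a minimal, stdlib-only sketch of rule-based quality checks applied to records before they enter a pipeline. The field names, rules, and thresholds are hypothetical examples, not requirements of the role.

```python
from typing import Callable

# Hypothetical quality rules for illustration; the field names and
# thresholds below are assumptions, not part of the role description.
Rule = Callable[[dict], bool]

RULES: dict[str, Rule] = {
    "id_present": lambda r: r.get("id") is not None,
    "severity_in_range": lambda r: r.get("severity") in range(0, 11),
    "host_nonempty": lambda r: bool(r.get("host", "").strip()),
}

def check_record(record: dict) -> list[str]:
    """Return the names of the rules this record violates."""
    return [name for name, rule in RULES.items() if not rule(record)]

def partition(records: list[dict]) -> tuple[list[dict], list[tuple[dict, list[str]]]]:
    """Split records into clean rows and (row, violations) pairs."""
    clean, rejected = [], []
    for rec in records:
        violations = check_record(rec)
        if violations:
            rejected.append((rec, violations))
        else:
            clean.append(rec)
    return clean, rejected
```

In a production pipeline, the rejected rows would typically be routed to a quarantine table or dead-letter topic for review rather than silently dropped.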
Required Qualifications:
- Experience: 8+ years in IT with a focus on data engineering and architecture.
- Technical Expertise: In-depth knowledge of PostgreSQL, plus expertise in Snowflake or Elasticsearch, including each platform's architecture, functions, and data warehousing concepts.
- ETL & Data Modeling: Advanced skills in ETL processes, data modeling, and data warehousing.
- Programming: Proficiency in Python for data engineering tasks.
- Streaming Services: Experience with Kafka or similar real-time data streaming services.
- Communication: Strong analytical, architectural design, and communication skills for engaging with diverse technical stakeholders.
Preferred Qualifications:
- Experience in designing and implementing large-scale data integration solutions.
- Expertise in high-performance data engineering solutions across various cloud platforms.
- Familiarity with other big data tools and technologies is a plus.
Certifications (if any):
Relevant certifications in Data Engineering, Cloud, or Big Data technologies (e.g., Snowflake, Kafka) are a plus but not required.