Responsibilities:
Design, build, and maintain data pipelines.
Develop ETL processes for data integration.
Build and optimize data warehouses and lakes.
Ensure data quality and security.
Collaborate with data scientists and analysts.
Requirements:
7+ years of relevant experience.
Bachelor's degree in CS or related field.
Experience with SQL, Python, and big data technologies.
Proficiency in cloud platforms (AWS, GCP, Azure).
Strong problem-solving and communication skills.