Description

Key Responsibilities:

  • Design and build ETL/ELT pipelines for structured and unstructured data.
  • Develop and optimize data models for analytics and ML workflows.
  • Support API integration and collaborate on lightweight services exposing data assets.
  • Work with data scientists and ML engineers to productionize datasets and features.
  • Ensure data quality, scalability, and performance across systems.

Must-Have:

  • 5+ years of experience in data engineering or backend development.
  • Strong proficiency in Python, SQL, and distributed systems tools (e.g., Spark, Kafka, Airflow).
  • Experience with cloud platforms (AWS preferred) and data lake/data warehouse design.
  • Familiarity with APIs or event-driven architecture is a plus.

Nice-to-Have:

  • Exposure to ML pipelines, feature stores, or AI platforms.
  • Experience in financial services or regulated environments.
  • Understanding of data governance and security best practices.

Education

Any Graduate