Description

Responsibilities:

  • Design and develop scalable ETL/ELT pipelines to ingest and transform data from various sources.
  • Build and maintain data models and transformations using DBT in Snowflake.
  • Work closely with analysts and business users to deliver Power BI dashboards and datasets.
  • Integrate and automate data extraction from external APIs using Python, loading cleaned data into Snowflake (see the sketch after this list).
  • Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
  • Optimize and monitor data workflows in Azure Data Factory (ADF), AWS Glue, or similar tools.
  • Work with relational databases and ensure data integrity, quality, and performance.
  • Maintain documentation and follow best practices for data engineering workflows.
  • Support data migration or integration efforts with SAP BW systems (a plus, not mandatory).
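
A minimal sketch of the Python-based API extraction and Snowflake load described in the list above, assuming the requests and snowflake-connector-python packages; the endpoint, table, columns, and credentials are hypothetical placeholders, not a prescribed setup:

    import requests
    import snowflake.connector

    # Pull records from an external REST API (hypothetical endpoint).
    resp = requests.get("https://api.example.com/orders", timeout=30)
    resp.raise_for_status()
    orders = resp.json()

    # Load the extracted rows into a Snowflake staging table
    # (placeholder account, credentials, and table name).
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="...",
        warehouse="ETL_WH",
        database="RAW",
        schema="STAGING",
    )
    try:
        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO stg_orders (id, amount, created_at) VALUES (%s, %s, %s)",
                [(o["id"], o["amount"], o["created_at"]) for o in orders],
            )
        conn.commit()
    finally:
        conn.close()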

Required Skills:

  • Strong experience with Snowflake (data modeling, performance tuning, warehousing concepts).
  • Proficiency in DBT (Data Build Tool) for transformation pipelines.
  • Hands-on experience with Power BI (data modeling, DAX, visualization best practices) or similar tools such as SAP Analytics Cloud (SAC) or Tableau.
  • Solid knowledge of Python, especially for API integrations and data processing.
  • Experience with cloud-based data pipeline tools such as ADF or AWS Glue.
  • Good understanding of relational databases, SQL, and data modeling concepts.
  • Strong problem-solving skills and the ability to work independently or as part of a team.
  • Excellent communication and documentation skills.

Nice to Have:

  • Exposure to SAP BW data sources or their integration with Snowflake.
  • Experience with DevOps tools for data (CI/CD, version control in data workflows).
  • Familiarity with orchestration tools such as Airflow or Azure Data Factory triggers (a minimal Airflow sketch follows this list).
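
As a rough illustration of the orchestration item above, a minimal Airflow DAG wiring an extract step to a load step; this assumes Airflow 2.4+ with the TaskFlow API, and the DAG name and task bodies are hypothetical placeholders:

    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def api_to_snowflake():
        @task
        def extract():
            # Placeholder: call the external API here (e.g., with requests).
            return [{"id": 1, "amount": 9.99}]

        @task
        def load(rows):
            # Placeholder: write the rows to Snowflake here.
            print(f"loaded {len(rows)} rows")

        load(extract())

    api_to_snowflake()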

Education

Any graduate (any discipline).