Job Description

Responsibilities include, but are not limited to:

Data Pipeline Development:

Design, develop, and maintain scalable and efficient data pipelines.

Extract, transform, and load (ETL) data from various sources into our data warehouse.

Ensure data quality and integrity throughout the ETL process.

Data Architecture:

Collaborate with cross-functional tech leads and architects to design and optimize data models and database structures.

Implement best practices for end-to-end data pipeline management on the data lake.

Work on data warehousing solutions such as Azure ADF and Snowflake.

Data Integration:

Integrate third-party data sources and APIs to enrich our datasets.

Establish monitoring and exception-management processes across the end-to-end data pipeline build to ensure the integrity and reliability of data engineering solutions.

Implement data connectors and data ingestion processes.

Design and define new data integrations while maintaining existing ones.

Performance Optimization:

Monitor and optimize data pipelines and query performance.

Troubleshoot and resolve data-related issues in a timely manner.

Data Security and Compliance:

Ensure data security and compliance with relevant data protection regulations (e.g., GDPR, HIPAA).

Implement access controls and encryption mechanisms.

Collaboration:

Collaborate with analytics teams, product leads, and business product owners to define and build a best-in-class data ecosystem that drives business analytics capabilities.

Be part of an agile operating model alongside analytics and business teams to drive collective data and analytics capabilities.

Work alongside planning, master data, and other teams that need clean, connected data, and provision datasets as APIs or one-time extracts per need.

Support data consumers by providing access to clean and well-organized datasets.

Documentation:

Maintain documentation for data pipelines, schemas, and data dictionaries.

Document end-to-end data pipelines and ongoing enhancements to them.

Create and update documentation on data engineering processes and standards.

Qualifications

Bachelor's degree in Computer Science, Information Technology, or a related field. Master's degree preferred.

5 years of experience as a Data Engineer or in a similar role.

Proficiency in data modeling, ETL development, and data warehousing.

Strong programming skills in languages like Python, Java, or Scala.

Experience with data pipeline orchestration tools.

Knowledge of SQL and proficiency in working with relational and NoSQL databases.

Familiarity with cloud platforms (e.g., Azure, Snowflake) and associated data services.

Excellent problem-solving and communication skills.

Ability to work independently and as part of a cross-functional team.

Relevant certifications or additional qualifications are preferred.

If you are a highly motivated and detail-oriented Data Engineer with a passion for working with data, we encourage you to apply and join our performance apparel business.

Education

Bachelor's degree in Computer Science, Information Technology, or a related field.