Description

We are seeking a skilled Data Engineer to join our growing data team. The ideal candidate will design, build, and maintain robust data pipelines, ensure data quality, and enable analytics and data-driven decision-making across the organization. This is a fully remote role based in the USA.

Key Responsibilities:

Design, develop, and maintain scalable ETL/ELT pipelines for structured and unstructured data.

Work with data warehouses, data lakes, and cloud-based data storage solutions (AWS, Azure, or GCP).

Ensure data quality, integrity, and consistency across multiple data sources.

Collaborate with data analysts, data scientists, and business teams to understand data requirements.

Optimize data processing workflows for performance and scalability.

Implement best practices for data security and governance.

Troubleshoot and resolve data-related issues proactively.

Required Skills & Qualifications:

Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.

3+ years of experience as a Data Engineer or in a similar data-focused role.

Strong SQL skills and experience with relational and NoSQL databases.

Hands-on experience with ETL/ELT tools and data pipeline frameworks (e.g., Apache Airflow, Talend, Informatica).

Experience with cloud platforms such as AWS, Azure, or GCP.

Familiarity with big data technologies like Hadoop, Spark, or Kafka is a plus.

Strong problem-solving skills and ability to work independently in a remote setup.

Preferred Skills:

Experience with Python, Java, or Scala for data engineering tasks.

Knowledge of data modeling, data warehousing, and BI tools.

Experience with CI/CD pipelines for data workflows.

Familiarity with data governance and compliance standards.

Education

Bachelor's or Master's degree