Description

Responsibilities for Data Engineer

  • Design, develop, and maintain scalable and efficient data pipelines using Airflow and Shell scripts to support ETL/ELT processes.
  • Build and optimize data flows and integrations into Snowflake, ensuring high performance and low latency in data operations.
  • Collaborate with cross-functional teams to design data solutions that meet business needs and technical standards.
  • Develop and manage data integration workflows using Informatica Cloud, ensuring seamless integration with source and target systems.
  • Monitor and fine-tune pipeline performance to meet SLAs and ensure data quality.


Qualifications for Data Engineer

  • 5+ years of advanced SQL experience, including query authoring and hands-on work with a variety of relational databases.
  • 5+ years of work experience in data engineering roles focusing on Snowflake, Airflow, and Shell scripting.
  • Proficiency in writing and debugging Shell scripts for automation and data processing tasks.
  • ETL development experience in Python, Java, or Scala.
  • Knowledge of the AWS cloud platform and its integration with Snowflake is preferred.
  • Strong communication skills for working with both technical and non-technical stakeholders.
  • Experience working in a distributed team.
  • Hands-on experience with Informatica Cloud tools for data integration and transformation is a plus.
  • Familiarity with modern data architecture concepts like data lakehouse and data mesh is a plus.
  • Experience with dashboarding solutions such as Tableau is a plus.



Education

Any Graduate