Description

We are seeking a skilled and experienced Data Integration Specialist with a strong background in HVR and Talend. The ideal candidate will have at least 6 years of hands-on experience designing, implementing, and maintaining high-volume data replication and ETL/ELT pipelines across heterogeneous environments. You will play a key role in ensuring real-time and batch data flows are robust, secure, and efficient.


Key Responsibilities:
· Design and implement real-time data replication solutions using HVR across on-premises and cloud environments.
· Develop and maintain ETL/ELT pipelines using Talend Data Integration and Talend Big Data tools.
· Optimize data flows for performance, scalability, and reliability.
· Monitor, troubleshoot, and resolve issues related to data replication and ETL jobs.
· Work closely with DBAs, Data Engineers, and Business Analysts to meet data integration requirements.
· Document technical specifications, workflows, and best practices.
· Participate in data architecture and integration strategy planning.
· Ensure data quality, consistency, and security across all systems.
· Support data migration and transformation efforts during system upgrades or transitions.


Required Qualifications:
· Bachelor’s degree in Computer Science, Information Systems, or a related field.
· Minimum 5 years of hands-on experience with Talend (Studio, TAC, ESB, etc.).
· Minimum 3 years of experience with HVR for real-time replication.
· Strong understanding of data warehousing concepts, data lakes, and cloud platforms (AWS, Azure, or GCP).
· Proficiency with SQL, relational databases (e.g., Oracle, SQL Server, PostgreSQL), and data modeling.
· Experience with Linux/Unix scripting and job automation tools (e.g., AutoSys, cron).
· Familiarity with data governance and data quality best practices.


Preferred Skills:
· Knowledge of Big Data ecosystems (Hadoop, Spark).
· Experience with CI/CD tools and version control (Git, Jenkins).
· Familiarity with REST APIs and web services integration.
· Exposure to data security, encryption, and compliance standards (GDPR, HIPAA, etc.).

Education

Bachelor's degree