Key Skills: ETL, Hadoop, Hive, Big Data, GCP, AWS
Roles and Responsibilities:
- Lead the creation of efficient ETL workflows to extract, transform, and load data into Transaction Monitoring systems.
- Implement data validation and cleansing techniques to ensure high data quality and integrity (a minimal sketch of such a step follows this list).
- Collaborate with developers and architects to design scalable and sustainable solutions that meet business needs.
- Ensure compliance with industry standards and best practices in all engineering work.
- Troubleshoot and resolve technical issues to optimize data flows within Transaction Monitoring.
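To give a flavor of the validation and cleansing work described above, here is a minimal extract-validate-load sketch in PySpark. It is illustrative only: the table names (staging.raw_transactions, tm.monitored_transactions) and columns (txn_id, amount, currency) are hypothetical placeholders, not any actual Transaction Monitoring schema.

```python
# Minimal extract-validate-load sketch in PySpark.
# All table and column names are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("tm-etl-sketch")
    .enableHiveSupport()  # read/write Hive tables on the Hadoop cluster
    .getOrCreate()
)

# Extract: read raw transaction records from a staging Hive table.
raw = spark.table("staging.raw_transactions")

# Validate: keep only records with a key and a parseable amount.
validated = (
    raw
    .filter(F.col("txn_id").isNotNull())
    .filter(F.col("amount").cast("decimal(18,2)").isNotNull())
)

# Quarantine the rejects rather than silently dropping them,
# so data quality issues stay auditable.
rejected = raw.join(validated, on="txn_id", how="left_anti")
rejected.write.mode("append").saveAsTable("staging.rejected_transactions")

# Cleanse: deduplicate and normalize, then load into Transaction Monitoring.
cleaned = (
    validated
    .dropDuplicates(["txn_id"])
    .withColumn("currency", F.upper(F.trim(F.col("currency"))))
)
cleaned.write.mode("append").saveAsTable("tm.monitored_transactions")
```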
Skills Required:
- Strong expertise in ETL design and development, handling complex data pipelines and workflows
- In-depth knowledge of Hadoop ecosystem tools including HDFS, Hive, and MapReduce
- Hands-on experience with Big Data technologies for large-scale data processing and storage
- Proficient in HiveQL for querying and managing data in Hadoop-based data warehouses (see the sketch after this list)
- Familiarity with cloud platforms such as AWS and GCP, particularly for deploying data solutions
- Experience with data quality, validation, and cleansing to ensure accuracy and integrity
- Understanding of data governance, compliance requirements, and financial crime detection systems
- Strong problem-solving skills and ability to optimize performance of data flows
- Ability to collaborate with cross-functional technical teams, including architects and developers
- Exposure to agile methodologies, version control, and CI/CD tooling (e.g., Git, Jenkins) in data projects
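To illustrate the HiveQL proficiency called for above, here is a small sketch of a typical monitoring aggregation, run through PySpark's Hive support. The table and columns (tm.monitored_transactions, account_id, txn_ts, amount) are hypothetical, carried over from the earlier sketch, and the threshold is an arbitrary example value.

```python
# HiveQL sketch: flag accounts whose daily totals cross a threshold,
# a typical Transaction Monitoring aggregation. Names are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("tm-hiveql-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

flagged = spark.sql("""
    SELECT account_id,
           to_date(txn_ts)  AS txn_date,
           COUNT(*)         AS txn_count,
           SUM(amount)      AS total_amount
    FROM tm.monitored_transactions
    GROUP BY account_id, to_date(txn_ts)
    HAVING SUM(amount) > 10000  -- hypothetical reporting threshold
    ORDER BY total_amount DESC
""")

flagged.show(20)
```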
Education: Bachelor's degree in Computer Science, Engineering, or a related field