Description

An ETL Developer is needed to perform the following duties:

- Implemented and promoted ETL development best practices across multiple projects, leading to enhanced data quality and maintainability.
- Led and participated in projects requiring in-depth analysis of complex business processes and system standards, delivering high-quality data solutions.
- Developed and optimized shell scripts to enhance the efficiency and reliability of the notification process, reducing manual intervention and improving overall workflow automation.
- Integrated functionality within the notification shell script to dynamically attach relevant reports generated by Informatica jobs, ensuring accurate and up-to-date information delivery.
- Designed efficient database schemas and optimized queries, stored procedures, and views, resulting in improved data retrieval times.
- Collaborated with the MOVEit team to set up and manage the transfer of files from clients and vendors to AWS S3 buckets using SFTP.
- Implemented automated archiving solutions for files in AWS S3 buckets, enhancing data organization and accessibility.
- Designed and developed an Informatica job to automate file transfers between AWS S3 buckets and a Linux server, integrating seamlessly with existing data workflows.
- Created a shell script to move files from various directories in an AWS S3 bucket to a Linux server for processing, and back to S3 after job completion, using AWS CLI commands for efficient and reliable file transfers (see the illustrative script after this list).
- Configured and scheduled the Informatica job for automated execution, reducing manual intervention.
- Integrated comprehensive logging within the shell script to track file transfer processes and facilitate monitoring and troubleshooting.
- Implemented robust error handling to ensure data integrity and quick issue resolution during file transfers.
- Led the enhancement of SnapLogic pipelines to incorporate business logic for flagging data in a Redshift database based on different programs.
- Collaborated with business stakeholders to understand program requirements and implemented complex business logic within SnapLogic pipelines to generate the required flags.
- Worked closely with the Cognos reporting team to integrate the newly flagged data from Redshift into existing report templates, enabling business users to generate reports based on the flagged data.
- Provided on-call support for monitoring and maintaining SnapLogic pipelines and Informatica workflows, ensuring continuous and efficient data processing.
- Led the project team in providing production support during a major migration, ensuring outbound files were sent via SFTP without disruption.
- Managed the transition of outbound file transfer processes, ensuring secure and efficient operations post-migration.
- Monitored post-migration performance of outbound file transfer processes, ensuring efficiency and addressing any issues.
- Documented the migration process and provided training to team members, ensuring knowledge transfer and support continuity.
- Ensured post-processing actions moved files back to AWS S3, maintaining data consistency for downstream applications.
- Applied performance tuning techniques to enhance ETL job efficiency.
- Conducted thorough technical analysis of client business objectives, aligning ETL processes to meet their needs effectively.
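As an illustration of the S3-to-Linux transfer scripting described above, the following is a minimal shell sketch of that pattern: download inbound files from an S3 prefix to a Linux staging directory with the AWS CLI, log each step, and archive processed files back to S3. The bucket, prefixes, paths, and log location are hypothetical placeholders rather than details from this posting, and an actual production script would differ.

#!/usr/bin/env bash
# Illustrative sketch only: pull input files from S3 to a staging directory,
# hand them to the ETL job, then archive them back to S3.
# Bucket, prefixes, and paths below are placeholders.
set -euo pipefail

BUCKET="s3://example-etl-bucket"                      # hypothetical bucket
INBOUND_PREFIX="$BUCKET/inbound/"
ARCHIVE_PREFIX="$BUCKET/archive/$(date +%Y%m%d)/"
STAGING_DIR="/data/etl/staging"
LOG_FILE="/var/log/etl/s3_transfer_$(date +%Y%m%d_%H%M%S).log"

log() {
    # Timestamped logging to both console and log file
    echo "$(date '+%Y-%m-%d %H:%M:%S') $*" | tee -a "$LOG_FILE"
}

mkdir -p "$STAGING_DIR" "$(dirname "$LOG_FILE")"

log "Downloading files from $INBOUND_PREFIX to $STAGING_DIR"
if ! aws s3 cp "$INBOUND_PREFIX" "$STAGING_DIR/" --recursive >>"$LOG_FILE" 2>&1; then
    log "ERROR: download from S3 failed; aborting before the ETL run"
    exit 1
fi

# Placeholder for the Informatica/processing step that consumes the staged files
log "Files staged; ETL job would run here"

log "Archiving processed files back to $ARCHIVE_PREFIX"
if ! aws s3 mv "$STAGING_DIR/" "$ARCHIVE_PREFIX" --recursive >>"$LOG_FILE" 2>&1; then
    log "ERROR: archive upload to S3 failed; files remain in $STAGING_DIR"
    exit 1
fi

log "Transfer cycle completed successfully"

Wrapping each aws s3 call in an explicit check keeps a failed download from triggering the processing step, which mirrors the error-handling and logging duties listed above.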
- Implemented robust error handling and logging mechanisms within the shell script to track execution status, capture errors, and facilitate troubleshooting, ensuring reliable operation and easy maintenance.
- Developed, documented, and tested robust ETL interfaces using Informatica and SnapLogic, ensuring seamless data integration.
- Designed and developed complex workflows, sessions, and mappings in Informatica and SnapLogic, improving data flow efficiency.
- Developed an Informatica job to move files from an AWS S3 bucket to a Linux server.
- Provided comprehensive post-deployment support, resolving defects and maintaining ETL process stability in production environments.
- Evaluated functional requirements and mapping documents, troubleshooting and resolving development issues promptly.
- Collaborated with application development teams to design and implement effective ETL solutions that met program requirements.
- Documented technical specifications and project deliverables, ensuring clarity and accessibility for future reference and audits.
- Designed and executed comprehensive test cases, performing unit tests to validate ETL process functionality and performance.
- Provided production support as needed, troubleshooting and resolving ETL-related issues based on priority and impact.
- Developed and executed complex SQL and PL/SQL scripts to correct and update data in Oracle, Aurora, and Redshift databases according to business requirements (see the illustrative runner script after this list).
- Analyzed and translated business requirements into technical specifications for accurate data corrections.
- Applied expertise in Oracle, Aurora, and Redshift to ensure seamless integration and consistency across databases.
- Optimized SQL and PL/SQL queries for performance, ensuring efficient data correction without disrupting production operations.
- Conducted thorough testing and validation of data correction scripts in a staging environment before deploying to production.
- Maintained comprehensive documentation of data correction scripts, including version control and change logs.
- Provided ongoing support and troubleshooting for data correction activities, ensuring continuous data quality.
- Analyzed data at source and target systems, identifying and resolving data quality and integrity issues within specified timelines.
- Continuously tuned Informatica mappings, enhancing data processing efficiency and reducing job execution times.
- Mentored junior ETL developers, sharing best practices, technical expertise, and project knowledge to foster team growth.
- Maintained clear communication with stakeholders, providing project updates and addressing concerns to ensure alignment with business goals.

Bachelor's Degree required in Computer Science, Computer Engineering, Information Systems, or Healthcare Informatics.
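To illustrate the data-correction workflow above, here is a rough shell sketch of a runner that applies a correction SQL script to a staging database first and proceeds to production only if the staging run succeeds, logging both runs. The hosts, database names, user, and script path are hypothetical; psql is shown only because Redshift accepts PostgreSQL-protocol clients, and the real scripts and tooling would depend on the environment.

#!/usr/bin/env bash
# Illustrative sketch only: run a data-correction SQL script against staging,
# then against production only if the staging run succeeds.
# Hosts, databases, user, and the script path are placeholders.
set -euo pipefail

SQL_SCRIPT="corrections/fix_member_flags.sql"    # hypothetical correction script
LOG_FILE="logs/data_correction_$(date +%Y%m%d_%H%M%S).log"
mkdir -p "$(dirname "$LOG_FILE")"

run_correction() {
    local env_name="$1" host="$2" db="$3"
    echo "$(date '+%F %T') Running $SQL_SCRIPT against $env_name ($host/$db)" | tee -a "$LOG_FILE"
    # ON_ERROR_STOP makes psql exit non-zero on the first SQL error,
    # so a failed staging run blocks the production run below.
    PGPASSWORD="${REDSHIFT_PASSWORD:?set REDSHIFT_PASSWORD}" \
    psql -h "$host" -p 5439 -U etl_user -d "$db" \
         -v ON_ERROR_STOP=1 -f "$SQL_SCRIPT" >>"$LOG_FILE" 2>&1
    echo "$(date '+%F %T') $env_name run completed" | tee -a "$LOG_FILE"
}

run_correction "staging"    "staging-cluster.example.com" "analytics_stg"
run_correction "production" "prod-cluster.example.com"    "analytics"

Because the script runs with set -e, any SQL error surfaced by ON_ERROR_STOP halts execution before the production call, matching the staging-first validation described above.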

Education

Any Graduate