Design and develop complex ETL workflows using Informatica PowerCenter to integrate data from various sources into Oracle databases. Administer, configure, and maintain Snowflake data warehousing environments in a multi-cloud setup, ensuring optimal performance and reliability. Collaborate with data engineering and data analytics teams to design and implement data pipelines and ETL processes within Snowflake. Monitor system health and performance using Snowflake's built-in tools and third-party monitoring solutions, proactively addressing issues to maintain uptime. Implement security best practices, including role-based access control, encryption, and auditing, to protect sensitive data. Automate routine tasks using Snowflake's scripting capabilities and external scheduling tools to improve operational efficiency. Implement advanced transformations and mappings to ensure data accuracy, consistency, and quality. Optimize Informatica workflows and SQL queries for enhanced performance and efficiency. Utilize Informatica PowerCenter's error handling and recovery mechanisms to maintain data integrity during ETL processes.
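As an illustration of the role-based access control duty above, the sketch below builds the Snowflake GRANT statements for a read-only analyst role. All role, database, and schema names are hypothetical; actually executing these statements would require a Snowflake connection (e.g. via snowflake-connector-python) and valid credentials, so this sketch only constructs the SQL text.

```python
# Minimal sketch: scripting Snowflake role-based access control (RBAC).
# Role, database, and schema names are hypothetical examples.

def build_rbac_grants(role: str, database: str, schema: str) -> list[str]:
    """Return the GRANT statements that set up a read-only analyst role."""
    fq_schema = f"{database}.{schema}"
    return [
        f"CREATE ROLE IF NOT EXISTS {role}",
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role}",
        f"GRANT USAGE ON SCHEMA {fq_schema} TO ROLE {role}",
        # Covers existing tables plus any created later in the schema.
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {fq_schema} TO ROLE {role}",
        f"GRANT SELECT ON FUTURE TABLES IN SCHEMA {fq_schema} TO ROLE {role}",
    ]

if __name__ == "__main__":
    for stmt in build_rbac_grants("ANALYST_RO", "SALES_DB", "PUBLIC"):
        print(stmt + ";")
```

Generating the statements as text keeps the access model reviewable and versionable in Git before it is applied, which fits the auditing practice the role describes.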
Must possess 1 year of experience in the job offered or in a related role. Must also possess experience with AWS Cloud, Azure, Databricks, S3, Spark, Python 3.6, Big Data, Snowflake, GBQ, GCP, CI/CD, Kubernetes, Git, Jenkins, JIRA, Airflow, PySpark, Azure Analysis Services, SSAS, Tableau, Hadoop, Hive, HDFS, Oozie, Shell, HBase, Kafka, Scala, Elasticsearch, and Splunk.
Any Graduate