Talend and Big Data Developer
• Around 8-12 years of industry experience in application development.
• Collaborate with stakeholders to understand data integration requirements, and design and implement Talend ETL processes.
• Hands-on experience with Talend Data Integration; must build code that is generic, dynamic, and reusable.
• Experience with Apache Kafka for real-time data streaming.
• Develop and maintain data processing frameworks using Scala or Python.
• Proven experience working with Apache Spark, Hive, and Oracle.
• Troubleshoot issues and implement logging and error-handling mechanisms at both the individual job level and the project level.
• Must have knowledge of performance-tuning techniques in Talend and of parallel execution of jobs.
• Experience in loading various data warehouses and data marts using Talend ETL technologies.
• Hands-on experience working with structured file formats such as JSON, XML, Excel, and CSV, as well as extracting data from databases such as Oracle and DB2.
• Must have extensive knowledge of using APIs in Talend to fetch data and to load files to shared locations.
• Highly proficient in writing complex yet efficient SQL.
• Extensively worked on PL/SQL packages, procedures, functions, triggers, views, materialized views, partitions, and exception handling for retrieving, manipulating, validating, and migrating complex data sets in Oracle.
• Must have experience with data modelling and warehousing concepts such as star schema, snowflake schema, OLAP, OLTP, fact tables for measurements, and dimension tables.
• Must have knowledge of Unix commands for file operations.
• Good to have knowledge of batch scheduling and creation of JIL files using AutoSys.
• Good to have experience working with Git, Bitbucket, and Confluence.
• Good to have knowledge of big data concepts.
• Good to have knowledge of Java, Python, or Spark.
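The warehousing concepts listed above can be illustrated with a minimal star-schema sketch: a fact table holding measurements joined to a dimension table holding descriptive attributes. This is an illustrative example only; the `sales_fact` and `product_dim` tables and their columns are hypothetical, and SQLite stands in for a real warehouse database.

```python
import sqlite3

# Build an in-memory database with one dimension table and one fact table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE product_dim (product_id INTEGER PRIMARY KEY, product_name TEXT)")
cur.execute("CREATE TABLE sales_fact (sale_id INTEGER PRIMARY KEY, product_id INTEGER, amount REAL)")

# Hypothetical sample data: dimension rows describe products,
# fact rows record measured sales amounts keyed to the dimension.
cur.executemany("INSERT INTO product_dim VALUES (?, ?)", [(1, "Widget"), (2, "Gadget")])
cur.executemany("INSERT INTO sales_fact VALUES (?, ?, ?)",
                [(1, 1, 10.0), (2, 1, 15.0), (3, 2, 7.5)])

# A typical star-schema query: aggregate a fact-table measure,
# grouped by a dimension attribute.
cur.execute("""
    SELECT d.product_name, SUM(f.amount) AS total_sales
    FROM sales_fact f
    JOIN product_dim d ON f.product_id = d.product_id
    GROUP BY d.product_name
    ORDER BY d.product_name
""")
rows = cur.fetchall()
print(rows)  # [('Gadget', 7.5), ('Widget', 25.0)]
conn.close()
```

The same fact/dimension join pattern underlies the OLAP-style aggregations a Talend job would load such tables to support.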
Qualification: Bachelor of Computer Science, or any graduate degree.