Mandatory skills: Big Data, Python, Spark, Databricks, SQL, Azure, CI/CD, Data Warehouse
Coding in Python is mandatory during the interview.
Job Description:
- Build transformation and load processes that move Kafka-streamed data into a Snowflake database.
- Analyze layouts and SQL design requirements.
- Define metadata for identifying and ingesting source files.
- Create and update source-to-target mapping lineage, transformation rules, and data definitions.
- Identify PII details and conform with standard naming conventions.
- Coordinate data integration, conformity, quality, integrity, and consolidation efforts.
- Analyze source-to-target field-level mappings for data sources.
- Design and implement transformation rules within the transformation framework.
- Provide support for resolving data quality issues.
- Coordinate with technical teams, SMEs, and architects on enhancements and change requests.
- Provide training and create detailed documentation, implementation plans, and project trackers.
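As an illustration of the interview-level Python expected here, the sketch below shows a minimal source-to-target field mapping with a PII masking rule. All names (SOURCE_TO_TARGET, PII_FIELDS, mask_pii, transform_record) and the mapping itself are hypothetical examples, not part of this role's actual framework; a real pipeline would consume records from Kafka and load the transformed output into Snowflake via a connector.

```python
# Hypothetical source-to-target field mapping (lineage definition).
SOURCE_TO_TARGET = {
    "cust_nm": "customer_name",
    "cust_email": "customer_email",
    "ord_amt": "order_amount",
}

# Target fields flagged as PII in the (assumed) metadata.
PII_FIELDS = {"customer_email"}

def mask_pii(value: str) -> str:
    """Mask all but the first character of an email's local part."""
    local, _, domain = value.partition("@")
    return (local[:1] + "***@" + domain) if domain else "***"

def transform_record(source: dict) -> dict:
    """Apply the source-to-target mapping and PII masking rules."""
    target = {}
    for src_field, tgt_field in SOURCE_TO_TARGET.items():
        value = source.get(src_field)
        if tgt_field in PII_FIELDS and isinstance(value, str):
            value = mask_pii(value)
        target[tgt_field] = value
    return target

record = {"cust_nm": "Ada", "cust_email": "ada@example.com", "ord_amt": 42.5}
print(transform_record(record))
# → {'customer_name': 'Ada', 'customer_email': 'a***@example.com', 'order_amount': 42.5}
```

In practice the mapping and PII flags would be driven by the metadata definitions described above rather than hard-coded.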
Education: Any Graduate