Job Description
Implement monitoring and alerting solutions to provide insight into overall platform health.
As an integral part of the Data Platform team, onboard various data sources by creating ETL pipelines.
Provide resolutions and/or workarounds for data pipeline-related queries and issues as appropriate.
Ensure that the ingestion pipelines feeding the Data Lake and Data Warehouses are up and running.
Assist end users of the Data Platform with query debugging and optimisation.
Collaborate with different teams to understand and resolve data availability and consistency issues.
Requirements
· 2+ years of experience in technical/application support for ETL applications.
· Ability to read and write SQL, and understanding of at least one relational database such as MySQL, Oracle, Postgres, or SQL Server.
· Good knowledge of Python programming.
· Comfortable with Linux, with the ability to write small scripts in Bash/Python and to work with log files and Unix processes.
· Prior experience working with cloud services, preferably AWS.
· Ability to learn complex new things quickly
· Willingness to occasionally provide support during weekends, early mornings, or late nights.
· Team player with the ability to work under pressure and good time management skills.
· Excellent written and verbal communication skills
· Troubleshooting and root cause analysis (RCA) skills.
· Prior experience with big data technologies such as Spark, Hadoop, Kafka, and Airflow is preferred.
Any Graduate