Job Description

Mandatory Skills: AWS, Python, SQL, Spark, Airflow, Snowflake
 

Responsibilities:
 

Create and manage cloud resources in AWS
 

Ingest data from different sources that expose it through a variety of technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems. Implement data ingestion and processing using Big Data technologies
 
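To illustrate the kind of ingestion work involved, the following is a minimal PySpark sketch that pulls a table from an RDBMS over JDBC and lands it in S3 as Parquet. The connection URL, table name, credentials, and bucket are hypothetical placeholders, not part of this role's actual stack.

```python
# A minimal ingestion sketch; all connection details and paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("rdbms-ingestion-sketch").getOrCreate()

# Read a source table over JDBC (the JDBC driver jar must be on the Spark classpath).
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://source-db:5432/sales")  # hypothetical source DB
    .option("dbtable", "public.orders")                       # hypothetical table
    .option("user", "etl_user")
    .option("password", "***")  # in practice, pull credentials from a secrets manager
    .load()
)

# Land the raw data in S3 as Parquet, partitioned by ingestion date.
(
    orders.withColumn("ingest_date", F.current_date())
    .write.mode("append")
    .partitionBy("ingest_date")
    .parquet("s3://example-data-lake/raw/orders/")  # hypothetical bucket/prefix
)
```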

Process and transform data using technologies such as Spark and cloud services. You will need to understand the relevant business logic and implement it in the language supported by the underlying data platform
 
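As a rough illustration of implementing a piece of business logic on the platform, the sketch below applies an assumed rule (daily revenue per customer from completed orders) to the DataFrame from the ingestion sketch above; the column names and the rule itself are made up for the example.

```python
# A minimal Spark transformation sketch; columns and business rule are assumptions.
from pyspark.sql import functions as F

# "orders" is assumed to be the DataFrame produced by the ingestion step above.
daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")  # hypothetical business rule
    .groupBy("customer_id", F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("daily_revenue"))
)

# Write the curated result for downstream consumers (path is a placeholder).
daily_revenue.write.mode("overwrite").parquet("s3://example-data-lake/curated/daily_revenue/")
```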

Develop automated data quality checks to ensure the right data enters the platform and to verify the results of calculations
 
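A hand-rolled example of what such automated checks might look like is sketched below; the key column, thresholds, and failure behaviour are assumptions, and a dedicated library (e.g. Great Expectations or Deequ) could serve the same purpose.

```python
# A minimal data quality check sketch; column names and thresholds are assumptions.
from pyspark.sql import functions as F

def run_quality_checks(df, key_column="order_id", min_rows=1):
    """Fail the batch if it is empty, has null keys, or has duplicate keys."""
    errors = []

    total = df.count()
    if total < min_rows:
        errors.append(f"expected at least {min_rows} rows, got {total}")

    null_keys = df.filter(F.col(key_column).isNull()).count()
    if null_keys:
        errors.append(f"{null_keys} rows have a null {key_column}")

    non_null = df.filter(F.col(key_column).isNotNull())
    if non_null.select(key_column).distinct().count() != non_null.count():
        errors.append(f"duplicate values found in {key_column}")

    if errors:
        raise ValueError("data quality checks failed: " + "; ".join(errors))

# e.g. validate the ingested batch before it is published downstream
run_quality_checks(orders, key_column="order_id")
```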

Develop infrastructure to collect, transform, combine, and publish/distribute customer data.
 
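Since Airflow is among the mandatory skills, a collect-transform-publish pipeline of this kind might be orchestrated roughly as below. This assumes the Airflow 2.x TaskFlow API; the schedule, task bodies, and the Snowflake publish step are placeholders.

```python
# A minimal orchestration sketch using the Airflow 2.x TaskFlow API (assumed).
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def customer_data_pipeline():
    @task
    def collect():
        # e.g. submit the Spark ingestion job (EMR, Glue, or similar)
        ...

    @task
    def transform():
        # e.g. run the Spark business-logic transformations
        ...

    @task
    def publish():
        # e.g. load the curated output into Snowflake
        ...

    # Run the three steps strictly in sequence.
    collect() >> transform() >> publish()


customer_data_pipeline()
```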

Define process improvement opportunities to optimize data collection, insights, and displays.
 

Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
 

Identify and interpret trends and patterns from complex data sets
 

Construct a framework using data visualization tools and techniques to present consolidated, actionable analytical results to relevant stakeholders.
 

Participate as a key contributor in regular Scrum ceremonies with the agile teams
 

Demonstrate proficiency in developing queries, writing reports, and presenting findings
 

Mentor junior team members and introduce industry best practices

Education

Any Graduate