Description

Relevant Experience (in yrs):- 8+ years
Technical/Functional Skills:-
1) Strong proficiency in SQL and database technologies
2) Familiarity with data warehousing and data modelling concepts
3) Hands-on experience with Google Cloud Platform (GCP) tools such as BigQuery, Cloud Storage, Dataproc, Dataflow, Pub/Sub, and Bigtable
4) Experience with big data technologies such as Hadoop, Spark, Hive, and Kafka
5) Experience with cloud data warehouses such as Amazon Redshift or Google BigQuery, and with ETL orchestration tools such as Apache Airflow
6) Knowledge of workflow orchestration with Airflow and of CI/CD pipelines
7) Exposure to data visualization tools such as Looker, Power BI, and Tableau

Roles & Responsibilities:-
• Design, develop, and maintain data architectures and infrastructure
• Collaborate with cross-functional teams to understand complex data requirements and deliver efficient solutions
• Design, build, and maintain data pipelines to support data ingestion, ETL, and storage
• Monitor, troubleshoot, and improve the performance and reliability of data systems
• Monitor BigQuery usage and manage cost-effective scaling of resources
• Develop and maintain data models to support analytics and reporting

Qualification:- Bachelor of Computer Science

Education

Any Graduate