Description

Mandatory skills: Big Data, Python, Spark, Databricks, SQL, Azure, Terraform, CI/CD, Data Warehouse

Coding in Python is mandatory during the interview.

 

Basic Qualifications for consideration:

  • 8+ years' experience building large-scale big data applications
  • Bachelor's degree in Computer Science or a related field
  • Provide technical leadership in developing data solutions and building frameworks.
  • Expertise in processing large volumes of data using data processing tools and Big Data platforms.
  • Experience building data lakes, enterprise data warehouses (EDW), and data applications on Azure, AWS, or GCP
  • Hands-on experience with the cloud data stack (Azure preferred)
  • Understanding of cluster and parallel architectures, plus experience with high-scale or distributed RDBMS and SQL
  • Hands-on experience with major programming/scripting languages such as Java
  • Java experience with OOP concepts and multithreading
  • Experience deploying code in containers is nice to have.
  • Conduct code reviews and strive to improve software engineering quality.
  • Hands-on experience with production rollouts and infrastructure configuration
  • Demonstrable experience successfully delivering big data projects using Kafka and Spark
  • Exposure to NoSQL databases such as Cassandra, HBase, DynamoDB, and Elasticsearch
  • Experience working with PCI data and collaborating with data scientists is a plus.
  • In-depth knowledge of design principles and patterns
  • Able to tune big data solutions to improve performance.

 

Preferred Skills, Experience and Education:

  • Exposure to Big Data tools and solutions is a strong plus
  • Exposure to relational modeling, dimensional modeling, and modeling of unstructured data
  • Bachelor's degree in Computer Science or a related field
  • Experience with design and architecture reviews
  • Experience in the banking/financial domain

Education

Any Graduate