Job Description

What you will do

You would:

  • Develop and maintain applications built with enterprise Python and distributed technologies.
  • Collaborate with developers, product managers, business analysts, and business users to conceptualize, estimate, and develop new software applications and enhancements.
  • Assist in developing and documenting software objectives, deliverables, and specifications in collaboration with internal users and departments.
  • Collaborate with the QA team to define test cases and metrics, and resolve questions about test results.
  • Assist in the design and implementation process for new products; research and create POCs for possible solutions.
  • Develop components based on business and/or application requirements.
  • Create unit tests in accordance with team policies and procedures.
  • Advise and mentor team members in specialized technical areas, and fulfill administrative duties as defined by the support process.
  • Create value-adds that contribute to cost-optimized, scalable, reliable, and secure solutions.

Qualifications

  • Bachelor’s degree or equivalent in computer science
  • 7+ years’ experience with Python, Big Data, Spark, Kafka, SQL, Angular, AWS, and microservices
  • Preferred knowledge/experience in the following technologies:
    • Big Data ecosystems: Hadoop, Spark, Kafka
    • Python
    • Streaming and batch analytics processes
  • Experience with the following tools: Eclipse, Maven, Gradle, DB tools, and Bitbucket/JIRA/Confluence
  • Ability to develop SOA services, with good knowledge of REST APIs and microservice architectures
  • Solid knowledge of web architectural and design patterns
  • Knowledge of JavaScript UI frameworks (Backbone, Angular, React, etc.) is desirable
  • Understands software security practices, including user authentication and authorization, data validation, and common DoS and SQL injection techniques
  • Familiar with profiling, code coverage, logging, common IDEs, and other development tools
  • Languages and technologies: JSP and Servlets, JavaScript, XML, HTML, Python, and Bash
  • Familiar with Agile methodologies (Scrum); strong communication skills (verbal and written)
  • Ability to work within tight deadlines and effectively prioritize and execute tasks in a high-pressure environment.
  • Demonstrated verbal and written communication skills, and ability to interface with business, analytics, and IT organizations
  • Ability to work effectively in a short-cycle, team-oriented environment, managing multiple priorities and tasks
  • Ability to identify non-obvious solutions to complex problems
  • Behavioral Attributes
    • Team player with excellent interpersonal collaboration skills
    • Strong verbal and written communication
    • Possess a can-do attitude and high energy to overcome challenges
    • Self-motivated, self-directed, and passionate

Nice-to-have qualifications:

  • Technical 
    • Big Data/Python/Spark/cloud certifications
    • Strong technical knowledge of SQL and data analysis, with relational database modeling principles and techniques
    • Working knowledge of cloud data technologies, such as AWS compute, storage, and messaging services, RDS/Redshift, or Snowflake
    • Strong skills in writing SQL queries (Oracle/SQL Server/warehouse/NoSQL) and experience with Databricks
    • Basic UNIX/Linux knowledge/exposure
    • Knowledge of data warehousing concepts
    • Proven expertise in ETL/ELT tooling, such as Pentaho, Integration Services, Informatica, Data Pipeline, or Glue
  • Non-Technical
    • Functional knowledge of CRM, Marketing, Loyalty
    • Excellent analytical and problem-solving skills
    • Ability to diagnose and troubleshoot problems quickly
    • Strong time management skills with accountability
    • Ability to take full ownership of tasks and projects

Education

Any Graduate