Consult on complex initiatives with broad impact and large-scale planning for Software Engineering.
Review and analyze complex, multifaceted, larger-scale, or longer-term Software Engineering challenges that require in-depth evaluation of multiple factors, including intangible or unprecedented ones.
Contribute to the resolution of complex and multifaceted situations requiring a solid understanding of the function, policies, procedures, and compliance requirements needed to meet deliverables. Strategically collaborate and consult with client personnel.
Required Qualifications:
5 plus years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work or consulting experience, training, military experience, education.
8 plus years of hands-on experience as a Data Architect working on innovative, scalable, data-heavy, on-demand enterprise applications and guiding data analysts & engineers
5 plus years of experience with:
Hands-on work on greenfield solutions, leading data tracks and building data sourcing and integration from scratch
Working closely with partners in Infosec, Infrastructure, and Operations to establish data & system integration best practices
Identifying, managing and remediating IT risk and tech debt
7 plus years of hands-on experience building and delivering self-serve solutions leveraging various approved on-prem and cloud capabilities
Split time between leading design, implementation, and software engineers, and performing the hands-on role described below.
Additional Skills:
7 plus years of backend and ETL experience consuming, producing, and integrating with enterprise data providers in batch or on-demand
5 plus years of experience with Data Analytics & Reporting using Power BI, SAS Viya, Tableau
5 plus years of experience with cloud-based data solutions like OpenShift Data Foundation (ODF), OpenShift Data Access (ODA), BigQuery, Snowflake, Talend Cloud, Databricks
5 plus years of experience with ETL tools like Alteryx, Talend, Xceptor, Apache Camel, Kafka Streams
4 plus years of experience with data virtualization using tools like Dremio, BigQuery Omni, Red Hat Virtualization
7 plus years of experience with REST APIs, Apigee, Kafka, and JSON to consume and produce data topics
7 plus years of experience with data modeling, metadata management, data governance, data lineage and data dictionary
7 plus years of experience with MS SQL, MySQL, PostgreSQL, MongoDB, Teradata, Apache Spark & deployment tools like Liquibase
3 plus years of experience with Java 8 or later, Spring Framework, Spring Boot, and Microservices
3 plus years of experience with Docker, Kubernetes, OpenShift containers
3 plus years of experience with Cloud Providers like OpenShift (OCP), GCP, Azure
3 plus years of experience with CI/CD tools like Jenkins, Harness, UCD, GitHub, Maven, Gradle
3 plus years of experience with DevSecOps tools like SonarQube, GitLab, Checkmarx, Black Duck
3 plus years of experience with logging and monitoring tools like ELK Stack, Splunk, AppDynamics
5 plus years of experience with scripting languages such as Python, Bash, Shell
3 plus years of experience with test frameworks using Jasmine, Karma, Selenium, JUnit, JMeter, RestAssured, Postman
3 plus years of experience working in an Agile environment using Scrum/Kanban
3 plus years of experience working with Jira and Confluence