Job Description
- Participate in the design and implementation of new security systems that support the investigation software suite used by our loss prevention agents, including recommending solutions
- Analyze, design, develop and implement RESTful services and APIs
- Responsible for designing, building, and managing real-time data streaming pipelines using Apache Kafka, including setting up Kafka clusters, configuring Kafka Streams, and ensuring efficient data processing within these pipelines
- Working to integrate data from various sources and deliver it to downstream applications in real time.
- Evolve and optimize enterprise-grade Kafka topologies as organizational utilization grows.
- Address performance and scalability challenges posed by new or changing Kafka producers and consumers.
- Implement solutions to monitor Kafka components to address any Kafka messaging issues proactively.
- Troubleshoot new security installations to ensure the systems function accurately and satisfy quality and performance standards.
- Balance the pursuit of outstanding architecture against the realities of live customers and the need to ship software.
- Collaborate with product management and engineering leadership to understand business requirements and plan products and features.
- Work as a team to design, develop, test, deploy, maintain, and improve software.
- As a Senior Engineer, share knowledge and support peer code review efforts.
- Be a model of best practices for junior-level engineers.
- Conduct code reviews for fellow team members, as required.
- Create unit tests to help ensure code quality throughout the application's life cycle.
- Analyze and improve the efficiency, scalability, and stability of existing and new systems and resources
- Improve code quality by tracking, reducing, and avoiding technical debt.
- Comfortable deploying service-oriented/microservice-based architectures
- Ability to create and deploy event-driven architectures using messaging systems/service buses with technologies such as Confluent Kafka.
Required Qualifications
- Bachelor's degree in technology or information systems or equivalent experience
- 8+ years of experience on software engineering teams
- 7+ years of experience in C# and the .NET Framework
- 4+ years of experience in .NET Core
- 4+ years of Azure cloud experience
- 4+ years of experience developing and scaling distributed systems
- Experience using Confluent Kafka
- Demonstrated ability to work both independently and within cross-functional project teams effectively
- Experienced in compliance best practices for technology platforms
- Able to adapt quickly to changing requirements and priorities
- Experience scaling and deploying applications in the public cloud using Azure, message services, and Docker (all required)
Preferred/Desired Qualifications
- 4+ years of experience with a microservices architecture
- Experience with software development lifecycle (SDLC) and Agile Methodologies
- Experience scaling and deploying applications in the public cloud using Kubernetes
- A can-do demeanor and ability to positively impact our culture