Requirements:
5+ years of experience building data pipelines in Python or Java
2+ years of experience working with Azure cloud services
Experience with data processing platforms, such as Azure Data Factory
Must be fluent in SQL for data analysis
Expertise with transactional database engines, such as SQL Server
Experience with data lakes, data marts, and data warehouses
Must have Asset Management industry domain experience, ideally supporting Fixed Income products
Excellent analytical and problem-solving skills with the ability to think quickly and offer alternatives both independently and within teams.
Experience working in an Agile environment with a Scrum Master/Product Owner, and the ability to deliver on commitments
Ability to communicate status and challenges to the team
Demonstrated ability to learn new skills and work as part of a team
Experience with Spark
Experience working with Hadoop or other big data platforms
Exposure to deploying code through CI/CD pipelines
Good exposure to container technologies, such as Docker and ECS
Working experience in a Linux-based environment
Direct experience supporting multiple business units on foundational data work, and a sound understanding of capital markets within Fixed Income
Knowledge of Jira, Confluence, the SAFe development methodology, and DevOps practices
Proven ability to work quickly in a dynamic environment.
Bachelor's degree in Computer Science or a related field.
Responsibilities:
Build and maintain new and existing applications in preparation for a large-scale architectural migration within an Agile function.
Align with the Product Owner and Scrum Master in assessing business needs and transforming them into scalable applications.
Build and maintain code to manage data received from heterogeneous sources, including web-based sources, internal/external databases, and flat files in a variety of formats (binary, ASCII).
Help build the new enterprise data warehouse and maintain the existing one.
Design and support effective storage and retrieval of very large internal and external data sets, and think ahead about the convergence strategy with our cloud migration.
Assess the impact of scaling up and scaling out, and ensure sustained data management and data delivery performance.
Build interfaces to support new and evolving applications and to accommodate new data sources and data types.