Minimum of 12 years of practical experience in one or more of the following areas: strategic Data Architecture and Engineering, Cloud Data Modernization, Data Migration, and Event-Driven Architecture.
Financial services/capital markets experience is strongly preferred but not mandatory.
Experience architecting and designing large data platforms, data warehouses, data lakes, streaming applications using Kafka or similar technologies, and data ingestion, integration, and distribution pipelines.
Practical experience using the latest technologies for data ingestion, integration, transformation, storage, mining/warehousing, big data analytics, and visualization.
Deep understanding of traditional data architecture practices (data warehousing, data hubs, MDM, ODS, etc.) as well as the transition to next-gen platform architectures such as distributed data lakes and data mesh on the cloud.
Practitioner experience in multiple data technologies such as:
MPP Data Warehouses (Snowflake, BigQuery, Amazon Redshift, etc.)
Implementation/migration experience with Databricks Unity Catalog to support a medallion-architecture data platform.
Cloud Data Platforms (AWS, Azure, Google Cloud Platform)
Data Integration Tools (Informatica, Talend, etc.)
Scripting Languages (Python, PySpark, Java)
Traditional RDBMS (Teradata, Netezza, MS SQL Server, Oracle, MySQL, PostgreSQL)
Prior consulting experience.
Undergraduate or master’s degree in a quantitative field (e.g., engineering, computer science, business, economics, finance, statistics, or analytics).
Ability to prioritize efforts across multiple projects and manage competing deadlines with stakeholders.
Ability to work independently in ambiguous, loosely defined environments.
Ability to remain flexible and meet tight deadlines.
Excellent verbal and written communication and presentation skills, including the production and presentation of management-ready materials.