In early 2013 the <bank that I work for>’s technology and application infrastructure was at a crossroads. The bank had avoided many of the mortgage-related missteps of its North American counterparts and experienced healthy growth in each of its business units. Until then the bank had relied on a number of vendor products to support its core business processes. The near-instant deployment and integration of these vendor products were instrumental to its success and allowed it to quickly establish itself as a player in the various Fixed Income and Equity markets in which it actively participated.
As the bank’s market share and the number of markets it actively traded grew, so did the number of vendor products providing pricing and risk management capabilities for those markets. The growing number of siloed solutions soon highlighted several challenges that needed to be addressed for the bank’s internal risk management capabilities to continue to scale with its external business activities. Relying on different vendor products for trading and risk management generally means using different analytics, different ways of bootstrapping discount curves, and inevitably differing views on what a traded security is worth at a point in time. This made it increasingly difficult for the bank to aggregate and combine outputs from these systems, and hard to evolve its valuation methodology as market conventions changed. There were also the more obvious challenges of duplicated trades, curves, and market data, and the mitigating controls that had to be put in place to keep them in sync.
This bank is certainly not the first to find itself at this crossroads, and like many of its predecessors it decided to build up its in-house technology capabilities in order to produce valuations using consistent, internally built analytics. What is generally known to be a lengthy multi-year effort was delivered here in 19 successive weekly sprints and went live on March 7th, 2014. The team tasked with delivering this platform relied on proven technologies and rapid application development techniques to deliver the first phase of the project in a very short time. The deliverables in this first phase included:
Unified Quant Library – a quantitative library written in C++ that provides fast and consistent access to analytical models and consolidates previously disjoint implementations
Derived Market Data Management – a fully bi-temporal data store and compute grid for storing and bootstrapping closing and intraday curves
Risk and Valuation Management – a scalable compute grid for generating intraday or end-of-day valuations and risk using multiple valuation modes
Streaming Quotes/Prices – market data distribution capable of sustaining a throughput of 400K transactions per second, used to aggregate internal and external market data
User tools – in Excel and as a C# GUI, giving end users transparency into every aspect of a valuation
The team was able to deliver these components in a short period of time without cutting corners on maintainability, security, monitoring, scalability, or performance. For example, beyond the many user-facing graphical interfaces and Excel plugins, the dev team built a highly scalable monitoring and metrics infrastructure as well as a real-time dashboard for easy access to these metrics.
This web-based UI displays current rates and historical trends for every API call exposed by the application infrastructure and is an invaluable tool for identifying and troubleshooting potential issues in the system before any business processes are impacted.