As the world’s second largest reinsurer, Swiss Re must take into account a broad set of risk factors from across the globe. For the past 10 years, we have calculated risk measures such as value at risk (VaR) and expected shortfall using the Internal Capital Adequacy Model (ICAM), a core risk model built with MATLAB®. As we continued to expand ICAM’s capabilities over the years, however, it became increasingly hard to manage the complexity. Numerous interdependencies made it difficult to fully understand how the model worked.
This year, to make ICAM easier to understand, update, and maintain, we completed a major overhaul. We made the revamped ICAM core available as a production IT system—the Integrated Risk Analytics and Modeling Platform (IRAMP)—and sped up risk modeling calculations by executing them on a computer cluster. MATLAB, MATLAB Production Server™, and MATLAB Distributed Computing Server™ enabled us to achieve both these objectives without having to develop custom IT infrastructure.
Applying Object-Oriented Programming to Improve ICAM Transparency and Maintainability
ICAM is designed to enable risk reporters to understand the aggregate effect of approximately 300,000 risk factors on the company’s total economic balance sheet. Categories include interest rates, equity prices, real estate prices, credit spreads, and claims inflation, as well as operational risks, natural disasters, and mortality trends.
In rewriting ICAM, we wanted to make it easier for risk reporters to see how these factors affected risk measures. One of our most effective changes was to apply more object-oriented programming principles in writing the MATLAB code. Today’s version of ICAM has more than 75,000 lines of MATLAB code—all under version control—comprising 400 data classes and 250 classes for risk factors and loss functions. The graphics and object classes in our code have enabled us to increase the number of user interfaces in ICAM and to control them in a maintainable way (Figure 1).
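The split between data classes and behavior classes described above can be sketched in a few lines. The class and field names below are hypothetical and the shock model is purely illustrative; ICAM's actual MATLAB class hierarchy is not shown in the article, so this is a minimal Python sketch of the design idea: immutable data objects on one side, risk-factor classes that know how to simulate themselves on the other.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
import random

@dataclass(frozen=True)
class MarketData:
    """Data class: an immutable value container, analogous to ICAM's data classes."""
    name: str
    current_level: float
    volatility: float

class RiskFactor(ABC):
    """Behavior class: each risk factor knows how to simulate its own realizations."""
    def __init__(self, data: MarketData):
        self.data = data

    @abstractmethod
    def simulate(self, rng: random.Random) -> float:
        ...

class EquityPriceFactor(RiskFactor):
    def simulate(self, rng: random.Random) -> float:
        # Lognormal shock around the current level (illustrative, not Swiss Re's model).
        return self.data.current_level * rng.lognormvariate(0.0, self.data.volatility)

data = MarketData("Equity index", 11000.0, 0.2)
factor = EquityPriceFactor(data)
sample = factor.simulate(random.Random(42))
```

Keeping data and behavior in separate class hierarchies is one way such a codebase stays navigable: a new risk category means a new subclass, not a change to existing code.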
Building an Enterprise Application for Risk Analysis
Calculating VaR, expected shortfall, and other risk measures with a one-year horizon from 300,000 risk factors is a computationally intensive process, involving Monte Carlo simulations in which 1,000,000 realizations are generated for each risk factor. We are using Statistics and Machine Learning Toolbox™ for regression, generalized linear models, and data compression and preparation, as well as for Monte Carlo simulations with random samples drawn from a variety of distributions.
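To make the two risk measures concrete: from a set of simulated loss realizations, VaR at level α is the empirical α-quantile of the losses, and expected shortfall is the mean loss beyond that quantile. The sketch below is a generic Python illustration with a stand-in loss distribution, not Swiss Re's model or MATLAB code:

```python
import random

def var_and_expected_shortfall(losses, alpha=0.99):
    """Empirical value at risk (the alpha-quantile of losses) and
    expected shortfall (the mean of losses at or beyond that quantile)."""
    ordered = sorted(losses)
    idx = int(alpha * len(ordered))
    var = ordered[idx]
    tail = ordered[idx:]
    es = sum(tail) / len(tail)
    return var, es

# Illustrative: 1,000,000 realizations for a single (hypothetical) risk factor.
rng = random.Random(0)
losses = [abs(rng.gauss(0.0, 1.0)) for _ in range(1_000_000)]
var99, es99 = var_and_expected_shortfall(losses, alpha=0.99)
```

By construction, expected shortfall is at least as large as VaR at the same confidence level, since it averages only the losses in the tail beyond the VaR threshold.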
We employed a three-part strategy for building an enterprise IT application to manage the lengthy compute times this process requires. First, we set up a computing cluster to support parallel computations with Parallel Computing Toolbox™ and MATLAB Distributed Computing Server. Second, we broke the process down into multiple distinct workflows, including validation, preprocessing, calculation, and evaluation. Third, we used MATLAB Production Server to establish a production IT framework that risk reporters could use to execute multiple workflows on the computing cluster.
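The first part of that strategy, splitting the Monte Carlo work across many workers, can be sketched with Python's standard thread pool. This is only an analogy: the production system distributes MATLAB jobs across a cluster via Parallel Computing Toolbox and MATLAB Distributed Computing Server, whereas the sketch below runs local threads, and the function names are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor
import random

def simulate_chunk(seed, n):
    """One worker's share of the Monte Carlo realizations."""
    rng = random.Random(seed)
    return [abs(rng.gauss(0.0, 1.0)) for _ in range(n)]

def run_distributed(total_realizations, workers=4):
    # Split the work evenly across workers; on a real cluster a scheduler
    # would distribute these jobs across the available worker processes.
    per_worker = total_realizations // workers
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(simulate_chunk, seed, per_worker)
                   for seed in range(workers)]
        return [x for f in futures for x in f.result()]

losses = run_distributed(100_000, workers=4)
```

Seeding each chunk independently keeps the runs reproducible regardless of how many workers the scheduler assigns, which matters when the same calculation must give consistent results in development and production.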
We maintain two environments for developing and maintaining ICAM, one for production and one for development and training. The computing cluster in our production environment includes 165 workers. Our development and training environment has a similar computing cluster with 111 workers (Figure 2). After validating our ICAM application in the development and training environment, we prepared it for deployment in the production environment by compiling it into a standalone component using MATLAB Compiler SDK™.
Workers in the cluster are allocated as needed to complete workflows initiated by risk reporters. Each workflow is initiated from the IRAMP web interface and orchestrated by MATLAB Production Server. To begin the process, for example, risk reporters initiate the validate workflow, which verifies that the input data is internally consistent. Next, they kick off the preprocessing workflow, which transforms the raw input data into a format ready for use by the risk model. In the calculate workflow, all the Monte Carlo simulations are performed. This workflow requires the largest number of workers and the most time to complete. The results are stored as a snapshot in a 200 GB file on a shared file system. In the evaluate workflow, the risk reporters use a MATLAB application that we created to query results from the snapshot and perform what-if analyses.
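The validate → preprocess → calculate → evaluate chain, with the calculate step persisting a snapshot that the evaluate step later queries, can be sketched as follows. All function bodies, the JSON snapshot format, and the placeholder computation are hypothetical simplifications; the article describes the workflow structure, not its internals.

```python
import json
import os
import tempfile

def validate(raw):
    """Check that the input data is internally consistent (illustrative check)."""
    if any(v is None for v in raw.values()):
        raise ValueError("missing values in input data")
    return raw

def preprocess(raw):
    """Transform raw inputs into a model-ready format (here: normalized keys)."""
    return {k.lower(): float(v) for k, v in raw.items()}

def calculate(model_input, snapshot_path):
    """Run the (stand-in) simulation and persist results as a snapshot on disk."""
    results = {k: v * 2 for k, v in model_input.items()}  # placeholder computation
    with open(snapshot_path, "w") as f:
        json.dump(results, f)
    return snapshot_path

def evaluate(snapshot_path, key):
    """Query results from the stored snapshot, as in the evaluate workflow."""
    with open(snapshot_path) as f:
        return json.load(f)[key]

raw = {"EquitySpot": "100", "RatesLevel": "0.02"}
snap = os.path.join(tempfile.gettempdir(), "icam_snapshot_demo.json")
path = calculate(preprocess(validate(raw)), snap)
value = evaluate(path, "equityspot")  # → 200.0
```

Separating the expensive calculate step from the interactive evaluate step is the key design point: the snapshot is computed once on the cluster, then queried many times for what-if analyses without rerunning the simulations.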
From Desktop to Cluster to Cloud
The overhaul of ICAM and the development of IRAMP have been well received by risk reporters because the system is more transparent from end to end. MATLAB provides a powerful and efficient development environment, and by using MATLAB Production Server and MATLAB Distributed Computing Server in both the development and production environments, we ensure consistent results and increased stability in production.
We are now working with MathWorks engineers to migrate the IRAMP system to an external cloud-based platform such as Microsoft® Azure®. This will give us a larger-scale, more flexible system, allowing us to reduce costs by scaling down during periods of low demand and to reduce wait times by scaling up during periods of high demand.