The Bank of England’s Forecasting Platform: COMPASS, MAPS, EASE, and the Suite of Models
Economic forecasting is an important function of Monetary Analysis at the Bank of England: it is an input into the Monetary Policy Committee’s (MPC’s) deliberations and a means of communicating its decisions in the "Inflation Report". Those forecasts are ultimately a matter of committee judgement, but that judgement is in part informed by model-based forecasts and analysis produced by Bank staff. In 2011, Bank staff completed a project to replace the Bank’s macroeconomic forecasting platform with a new one, incorporating the flexibility to use multiple models within a new IT infrastructure built with MATLAB at its heart. In this session, Matt discusses the Bank of England’s use of MATLAB in building the MPC’s judgemental forecasts and in producing model-based analysis for the MPC.
Matt Waldron, Bank of England
New Developments in MATLAB for Computational Finance
In this presentation, Kevin identifies new developments in MATLAB with a computational finance focus. He highlights new features in derivatives pricing, econometrics, and portfolio optimisation, and shows how those features build on improvements to core MATLAB and to its statistics and optimisation functionality. Kevin also discusses approaches for accessing third-party trading and analytics systems. The presentation concludes with thoughts from a computational finance perspective about future directions in MATLAB.
Machine Learning and Applications in Finance
This presentation provides a high-level introduction to machine learning as a generic, data-driven approach to finding structure in data. It gives an overview of methods for unsupervised and supervised learning problems. Key concepts and aspects of these techniques are examined and explained using intuitive graphical examples. A selection of illustrative applications of unsupervised and supervised learning methods to financial data is then used to motivate the use of machine learning in finance and discuss potential benefits. The talk concludes by highlighting some practical considerations of working with machine learning techniques and large financial data sets.
Model Calibration in MATLAB
Insurance companies are required by regulators to model their 1-in-200 worst-case losses. Building the models themselves is one step in the process. A further step is the calibration of those models and the continuous recalibration as market conditions change.
In this presentation, Sam discusses Prudential's migration from a Microsoft® Excel®-based calibration process to one that is fully automated in MATLAB and can be rapidly rerun with up-to-date data and a repeatable audit trail. The process includes loading from a data feed, configuration via a GUI, distribution fitting, stochastic modelling, and direct updates to the company's database of market risk records.
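A 1-in-200 worst-case loss corresponds to the 99.5th percentile of the loss distribution. The process described above is built in MATLAB; purely as an illustrative sketch of the fit-then-quantile step, the same idea is shown below in Python with SciPy, using synthetic data and an assumed lognormal form (both are illustrative choices, not Prudential's actual models):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic "historical" losses (assumed lognormal purely for illustration).
losses = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)

# Fit a lognormal distribution to the data, with location fixed at zero.
shape, loc, scale = stats.lognorm.fit(losses, floc=0)

# 1-in-200 worst case = 99.5th percentile of the fitted distribution.
var_995 = stats.lognorm.ppf(0.995, shape, loc=loc, scale=scale)
print(f"Fitted sigma: {shape:.3f}, 1-in-200 loss: {var_995:.3f}")
```

In practice the choice of distribution family, the data feed, and the audit-trail requirements drive the design far more than the two fitting lines do, which is why automating the surrounding process matters.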
An Object-Based Policy Simulator for a Systems Strategy Project
HSBC is a global bank with complex, geographically distributed operations. In addition to this operational complexity, HSBC needs to comply with multiple regulatory frameworks simultaneously. As part of a larger programme of continuous improvement, HSBC is investigating ways of introducing a greater level of automation within the global risk rating system. In order to allow the project team to systematically test a large number of policy permutations and dynamics, HSBC decided to create a synthetic prototype, or policy simulator, in object-oriented MATLAB.
In this presentation, Seth focuses on management of the customer rating process and systems that embody the associated risk measurement policies and workflows. He also highlights the power of MATLAB in its object-oriented mode for facilitating problem-solving in the early phases of a systems project. This has enabled HSBC to respond quickly to changes in policy ideas, explore different what-if scenarios, and generate deployment-friendly code ahead of the formal implementation phase, accelerating the total cycle time of business projects.
Quantitative Investment: Research and Implementation with MATLAB
In this presentation, Ed describes how quantitative researchers at Fulcrum Asset Management harness MATLAB for the development and implementation of systematic trading strategies. Strategy development includes the thorough testing of trading signals for robustness, profitability, and diversification. Implementation includes the deployment of models in a production environment and their integration with Fulcrum’s compliance engine and algorithmic trading system.
Modelling Hedge Fund Returns Using a State-Space Framework
Linear factor models constitute one of the workhorses of financial time-series econometrics, commonly used for a variety of purposes including quantitative risk management, style analysis, and performance measurement and attribution. They are, however, limited by normality, independence, and linearity assumptions. Using recent enhancements to MATLAB toolboxes, we look at how state-space models can be used to overcome some of these issues, specifically in the context of modelling hedge fund returns. More generally, state-space models provide a flexible and intuitive approach to forecasting, testing, and simulating many different types of economic and financial market models, in a framework that helps bridge the gap between frequentist and Bayesian statistical approaches. These broader applications are also briefly discussed.
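As a concrete example of the kind of model the session covers, consider a one-factor model whose beta is allowed to drift over time, estimated with a scalar Kalman filter. The sketch below is in Python rather than the MATLAB toolboxes discussed in the talk, and the data, noise levels, and random-walk state equation are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 500
factor = rng.normal(size=T)                       # observed factor returns
true_beta = 0.5 + 0.5 * np.sin(np.linspace(0, 3 * np.pi, T))
returns = true_beta * factor + 0.1 * rng.normal(size=T)

# Scalar Kalman filter for the latent state beta_t:
#   state:       beta_t = beta_{t-1} + eta_t,   eta ~ N(0, q)
#   observation: r_t    = beta_t * f_t + eps_t, eps ~ N(0, r)
q, r = 1e-3, 0.1 ** 2
beta_hat, p = 0.0, 1.0                            # initial state mean/variance
filtered = np.empty(T)
for t in range(T):
    p += q                                        # predict
    h = factor[t]                                 # time-varying obs. matrix
    s = h * p * h + r                             # innovation variance
    k = p * h / s                                 # Kalman gain
    beta_hat += k * (returns[t] - h * beta_hat)   # update state estimate
    p *= (1.0 - k * h)                            # update state variance
    filtered[t] = beta_hat

rmse = float(np.sqrt(np.mean((filtered[50:] - true_beta[50:]) ** 2)))
print(f"RMSE of filtered beta vs truth: {rmse:.3f}")
```

The same state-space machinery extends naturally to multiple factors, smoothing, and parameter estimation by maximum likelihood, which is where toolbox support becomes valuable.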
Calibration and Simulation of Multifactor Interest Rate Models for Risk Applications
In this master class, Kevin highlights a number of modelling, optimisation, and acceleration techniques in MATLAB by using the example of Monte Carlo simulation of interest rate models for counterparty credit risk analysis. He shows how single-factor and multifactor models can be calibrated to both current market data and historical data using optimisation solvers. Calibrated models are simulated and counterparty credit risk measures are computed for a portfolio of interest rate instruments. He concludes by discussing approaches for speeding up performance using parallel computing.
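As a rough illustration of the calibrate-then-simulate workflow (not the session's actual code), the sketch below calibrates a single-factor Vasicek model to synthetic historical rates by ordinary least squares on the discretised dynamics, then Monte Carlo simulates one-year paths; it is written in Python, and every parameter is an assumption:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "historical" short rates from a Vasicek model
# dr = a*(b - r)*dt + sigma*dW  (all parameters are illustrative).
a_true, b_true, sigma_true, dt = 0.5, 0.03, 0.01, 1 / 252
r = np.empty(5000)
r[0] = 0.02
for t in range(1, len(r)):
    r[t] = (r[t - 1] + a_true * (b_true - r[t - 1]) * dt
            + sigma_true * np.sqrt(dt) * rng.normal())

# Calibrate by OLS on the discretised dynamics: r_t = c0 + c1*r_{t-1} + e.
x, y = r[:-1], r[1:]
c1, c0 = np.polyfit(x, y, 1)
a_hat = (1 - c1) / dt
b_hat = c0 / (1 - c1)
resid = y - (c0 + c1 * x)
sigma_hat = resid.std(ddof=2) / np.sqrt(dt)

# Monte Carlo: simulate one-year paths from the calibrated model and take
# the 97.5% quantile of the terminal rate as a simple exposure-style measure.
n_paths, n_steps = 10_000, 252
paths = np.full(n_paths, r[-1])
for _ in range(n_steps):
    paths += (a_hat * (b_hat - paths) * dt
              + sigma_hat * np.sqrt(dt) * rng.normal(size=n_paths))
q975 = float(np.quantile(paths, 0.975))
print(f"a={a_hat:.2f} b={b_hat:.4f} sigma={sigma_hat:.4f} q97.5={q975:.4f}")
```

A production calibration would typically fit to market instruments such as swaptions and caps with an optimisation solver rather than OLS on history, and multifactor models add correlated shocks to the same simulation loop.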
Developing Multilayered MATLAB Applications
Application developers face numerous challenges including managing code complexity, supporting both ad hoc and production use cases, scaling performance, and facilitating software reuse. In this master class, David discusses practical techniques for developing multilayered MATLAB applications that separate data access, data structures, algorithms, and user interaction in order to address these challenges. He includes examples from trading and risk.
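The layering described above can be made concrete with a toy sketch. This one is in Python rather than MATLAB, and the class and function names are invented for illustration; the point is only the separation of concerns:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class PriceSeries:                 # data-structure layer
    symbol: str
    prices: List[float]

class CsvPriceSource:              # data-access layer (stubbed data)
    def load(self, symbol: str) -> PriceSeries:
        return PriceSeries(symbol, [100.0, 101.0, 99.5, 102.0])

def max_drawdown(series: PriceSeries) -> float:   # algorithm layer
    peak, worst = series.prices[0], 0.0
    for p in series.prices:
        peak = max(peak, p)
        worst = max(worst, (peak - p) / peak)
    return worst

def report(symbol: str, source: CsvPriceSource) -> str:  # user-facing layer
    dd = max_drawdown(source.load(symbol))
    return f"{symbol}: max drawdown {dd:.2%}"

print(report("ABC", CsvPriceSource()))
```

Because each layer depends only on the one below it, the data source can be swapped (file, database, market feed) without touching the algorithm or reporting code, which is what makes serving both ad hoc and production use from one codebase practical.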
Machine Learning with MATLAB
As the size and complexity of financial datasets increase, analysts are making use of machine learning techniques to understand trends and build predictive models in applications such as trading, risk, asset management, and fraud detection.
In this master class, Kirsty outlines some financial examples where a machine learning approach can be applied. She discusses a variety of techniques for making sense of large and complex financial datasets, including:
- Data exploration and visualisation
- Classification, clustering, and regression modelling
- Converging on and tuning the best predictive model for the data
Finally, Kirsty highlights common pitfalls that are encountered in a machine learning approach, and discusses methods of counteracting them.
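One of the most common of those pitfalls is overfitting: a model tuned and scored on the same data will always favour excess complexity. A minimal sketch of the standard counter-measure, selecting model complexity on held-out data, is shown below in Python with synthetic data (degrees, sample sizes, and noise level are all arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic data: a cubic relationship plus noise.
x = rng.uniform(-1, 1, size=200)
y = x ** 3 - x + 0.1 * rng.normal(size=200)

# Hold out a validation set -- scoring on the training data itself
# is a classic pitfall that rewards overfitting.
x_tr, y_tr = x[:150], y[:150]
x_va, y_va = x[150:], y[150:]

def val_mse(degree: int) -> float:
    """Fit a polynomial on the training set, score it on the held-out set."""
    coef = np.polyfit(x_tr, y_tr, degree)
    return float(np.mean((np.polyval(coef, x_va) - y_va) ** 2))

scores = {d: val_mse(d) for d in range(1, 10)}
best = min(scores, key=scores.get)
print(f"Best polynomial degree on held-out data: {best}")
```

Cross-validation generalises the single train/validation split used here and gives a less noisy estimate of out-of-sample error, at the cost of extra fitting passes.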
Scaling Up MATLAB Analytics
In this master class, Marta discusses transitioning the execution of MATLAB code from desktop to high-performance computing and production environments. She covers:
- Accelerating MATLAB code by parallel execution on multicore machines, grids, and the Amazon Cloud
- Packaging and sharing MATLAB code to accelerate the development cycle, with examples of:
  - Sharing applications via MATLAB apps
  - Distributing standalone applications
  - Integrating MATLAB analytics with other environments (e.g. spreadsheets, custom enterprise applications, or web services)
After reviewing the related technologies, Marta highlights criteria that help determine the best route for scaling up, including language utilisation, scheduling, data interfaces, latency, throughput, and robustness.
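The first of those routes, parallel execution, applies whenever the work splits into independent tasks, as in a parameter sweep or Monte Carlo run. MATLAB's parfor expresses this directly; the Python sketch below shows the same map-over-independent-tasks pattern (the worker function is a trivial stand-in):

```python
import math
from concurrent.futures import ThreadPoolExecutor

def simulate(seed: int) -> float:
    # Stand-in for an independent, self-contained unit of work,
    # e.g. one Monte Carlo replication.
    return math.sin(seed) ** 2

seeds = range(100)

# Serial baseline.
serial = [simulate(s) for s in seeds]

# Parallel map over independent tasks (threads used here for portability;
# process pools and MATLAB's parfor follow the same pattern).
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(simulate, seeds))

print(f"Results agree: {serial == parallel}")
```

The key property is that tasks share no state, so the parallel results match the serial baseline exactly and workers can be moved from local cores to a grid or cloud without changing the pattern.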