In December 2017, the Prudential Regulation Authority published a consultation paper setting out themes around effective model risk management, following initial communications issued as part of the 2017 annual concurrent stress test.
The consultation paper outlines four principles addressing the need for a comprehensive governance framework: clear identification of models and their purpose; an appropriate governance structure; and defined roles for stakeholders, model developers, model owners, and control functions. For model life-cycle management, key themes include development, validation, independent review, the use of judgement, implementation, and the use of models supported by adequate documentation, IT systems, and appropriate levels of reporting to senior management.
In this talk, Diederick will discuss the four principles on model risk management for stress testing outlined in the consultation paper, as well as offer thoughts on model risk management and governance structures more generally.
In 2018, financial institutions face significant demands to generate a multitude of models for managing risk and generating returns across diverse activities and markets. Furthermore, internal reviewers and external regulators are scrutinizing the associated model governance and management processes ever more closely. In this context, overall time to implementation has become a point of competitive advantage.
This talk describes how MATLAB® is being used to reduce end-to-end model development time and risk through:
- Supporting smooth data preparation and modelling workflows from exploration to automation
- Enabling the practical application of machine learning and optimisation techniques
- Combining automatically generated model documentation with expert insights and feedback
- Bundling data, models, and documentation for internal and external review
- Simplifying the deployment of models into production
- Orchestrating the ongoing validation of models in service
Giles Spungin will discuss recent work on improving and extending HSBC’s Pillar 2 operational risk model from a desktop-based process to a dynamically scalable cloud-based solution. He will examine how the model was scaled, first in serial by translating the legacy model into MATLAB® and then in parallel by running the application on the cloud, reducing the model execution time from days to hours. He will provide his thoughts on the opportunities that cloud implementation can offer financial institutions.
Swiss Re is one of the world's largest reinsurers.
For a decade, Swiss Re has used MATLAB® to implement its internal risk model, ICAM (Internal Capital Adequacy Model). Dynamic and increasingly complex internal and regulatory requirements create a challenging development environment, where MATLAB proved to be the perfect development platform to quickly react to changing requirements. In 2017, Swiss Re concluded a major project to overhaul its internal risk model with the key goals being transparency, flexibility for future developments, speed, and precision of risk measures.
This presentation demonstrates how MATLAB is used within Swiss Re's internal risk model, including organizing and processing data with the table data structure in MATLAB; running fast algorithms, in some cases accelerated by MATLAB Distributed Computing Server™; building graphical user interfaces; and visualizing data.
The aim of this presentation is to introduce online lecture construction using recent developments in MATLAB®, such as live scripts, video recording, and graphical applications. Its focus is on advanced quantitative topics, such as stochastic modelling and Monte Carlo simulation, and it is intended for an audience with an interest in financial applications. The merits of the proposed blend of theory and practice are highlighted by means of various examples.
Simulated macro-stress scenarios have become an important part of the analytical toolkit central banks and regulators use to assess vulnerabilities on the balance sheets of financial institutions. Jaromir from GPM Network presents a MATLAB® based toolkit that integrates and streamlines the main tasks, from macroeconomic shock identification through macro-financial feedback, to aggregate balance sheets and regulatory indicators. The toolkit is used in our technical assistance projects in different parts of the world.
This talk considers improved financial forecasting in possibly nonlinear dynamic settings with high dimension and many predictors (“big data” environments). To overcome the curse of dimensionality and manage data and model complexity, deep learning algorithms are examined. In an application to forecasting equity returns, the proposed approach captures nonlinear dynamics between equities to enhance forecast performance, offering a significant improvement over current univariate and multivariate models in trading simulations.
Robust, realistic, and efficient modelling of profitability and risk are essential to maintaining a competitive edge in the global reinsurance market. In this talk, Paul Bassan describes how and why Aspen Re perform frequency severity simulations and how they are implemented, serving hundreds of users with seconds-level responsiveness. He describes Aspen Re’s MATLAB® based development, debug and test processes, as well as their MATLAB Production Server™ infrastructure deployments using Excel® over Citrix.
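Aspen Re's implementation details are proprietary, but the frequency-severity approach itself can be sketched. The following is a minimal, hypothetical illustration (in Python rather than MATLAB, for brevity), assuming Poisson-distributed loss counts and lognormal severities; the function name and parameter values are invented for this example:

```python
import numpy as np

def simulate_aggregate_losses(lam, mu, sigma, n_sims, seed=0):
    """Compound frequency-severity Monte Carlo: the number of losses per
    simulated year is Poisson(lam); each loss is Lognormal(mu, sigma)."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam, size=n_sims)           # loss counts per year
    totals = np.zeros(n_sims)
    for i, n in enumerate(counts):
        totals[i] = rng.lognormal(mu, sigma, size=n).sum()  # aggregate loss
    return totals

losses = simulate_aggregate_losses(lam=3.0, mu=10.0, sigma=1.2, n_sims=50_000)
var_99 = np.quantile(losses, 0.99)   # 99th-percentile aggregate annual loss
```

In practice such simulations are vectorized and parallelized to reach the seconds-level responsiveness the talk describes; the per-year loop here is kept only for clarity.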
Machine and deep learning are increasingly prominent technologies, causing a media sensation, challenging many technical disciplines, invigorating financial quantitative modelling, and driving FinTech. Learn how to get started quickly with machine learning and deep learning techniques in MATLAB® and how such techniques can support time-series forecasting, stock classification, and risk management model selection workflows. The talk also outlines how machine and deep learning enable image, computer vision, and audio processing capabilities in MATLAB, relevant to cheque and currency identification, cashpoint monitoring, and market abuse regulatory compliance (e.g. voice recordings).
This presentation discusses and demonstrates key new features for risk management, time-series modelling, instrument pricing, and financial data connectivity. Highlights include recent foundational capabilities, such as the table and timetable data types, and key applications, such as credit scorecard modelling, value-at-risk and expected shortfall backtesting, instrument pricing with stochastic volatility, and a demonstration of time-series modelling with the Econometric Modeler app.
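To make the value-at-risk and expected shortfall concepts concrete, here is a minimal historical-simulation sketch (in Python rather than the MATLAB workflow the talk covers; the data and function name are illustrative only):

```python
import numpy as np

def historical_var_es(returns, alpha=0.95):
    """Historical-simulation VaR and expected shortfall (ES) at level alpha.
    Losses are negated returns; VaR is the alpha-quantile of losses and
    ES is the mean loss at or beyond VaR."""
    losses = -np.asarray(returns)
    var = np.quantile(losses, alpha)
    es = losses[losses >= var].mean()
    return var, es

rng = np.random.default_rng(42)
rets = rng.normal(0.0005, 0.01, size=2500)   # synthetic daily returns
var95, es95 = historical_var_es(rets, alpha=0.95)
```

By construction ES is never smaller than VaR at the same level, which is one of the sanity checks a backtesting workflow would exercise.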
As the size and complexity of your MATLAB® application increases, you want to make sure your software projects are well structured, ensuring, for example, that users can run your code without encountering unexpected behaviour or errors. In this talk, you will learn about relevant advanced MATLAB software development capabilities, including error handling, object-oriented programming (OOP), unit testing, version control, and change tracking.
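As a language-neutral illustration of the unit-testing and error-handling ideas (sketched here with Python's unittest rather than the MATLAB testing framework the talk covers; the discount-factor function is a made-up example):

```python
import math
import unittest

def discount_factor(rate, years):
    """Continuous-compounding discount factor, with defensive input checks."""
    if rate < 0 or years < 0:
        raise ValueError("rate and years must be non-negative")
    return math.exp(-rate * years)

class TestDiscountFactor(unittest.TestCase):
    def test_zero_rate_gives_unity(self):
        self.assertEqual(discount_factor(0.0, 5.0), 1.0)

    def test_negative_inputs_raise(self):
        with self.assertRaises(ValueError):
            discount_factor(-0.01, 5.0)

suite = unittest.TestLoader().loadTestsFromTestCase(TestDiscountFactor)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The pattern is the same in any framework: validate inputs with explicit errors, then pin down both the happy path and the failure path with automated tests that run on every change.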
Sentiment scores, derived from text, such as newsfeeds and social media, offer important information to determine portfolio positions and trading signals, while text analytics can offer opportunities to identify misconduct. However, a document’s sentiment is often a weak signal surrounded by a large amount of noise. Extracting that signal requires a variety of techniques for working with data both in text and numeric formats, as well as machine learning techniques for automating the sentiment scoring process on large quantities of data.
Learn how to use text analytics capabilities in MATLAB® to build your own sentiment analysis tools. This presentation covers the entire sentiment scoring workflow, including importing social media feed data into MATLAB, preprocessing and cleaning up the raw text, converting text to a numeric format, and applying machine learning techniques to derive sentiment scores.
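The workflow described, importing raw text, preprocessing it, converting it to a numeric representation, and applying machine learning to score sentiment, can be sketched generically. Below is a minimal, hypothetical illustration in Python (not the MATLAB approach the talk covers), with a tiny hand-labelled corpus standing in for a social-media feed:

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase and split raw text into word tokens, dropping punctuation."""
    return re.findall(r"[a-z']+", text.lower())

def train_counts(docs):
    """Count token occurrences per class from (text, label) training pairs."""
    counts = {"pos": Counter(), "neg": Counter()}
    for text, label in docs:
        counts[label].update(tokenize(text))
    vocab = set(counts["pos"]) | set(counts["neg"])
    return counts, vocab

def sentiment_score(text, counts, vocab):
    """Naive-Bayes log-odds score with Laplace smoothing:
    positive => bullish, negative => bearish."""
    pos_total = sum(counts["pos"].values())
    neg_total = sum(counts["neg"].values())
    score = 0.0
    for tok in tokenize(text):
        p = (counts["pos"][tok] + 1) / (pos_total + len(vocab))
        n = (counts["neg"][tok] + 1) / (neg_total + len(vocab))
        score += math.log(p / n)
    return score

# Tiny hand-labelled corpus standing in for a real social-media feed.
train = [
    ("strong earnings beat expectations", "pos"),
    ("record profit and upbeat guidance", "pos"),
    ("shares plunge on weak outlook", "neg"),
    ("profit warning and heavy losses", "neg"),
]
counts, vocab = train_counts(train)
bullish = sentiment_score("earnings beat with upbeat guidance", counts, vocab)
bearish = sentiment_score("weak outlook and profit warning", counts, vocab)
```

A production pipeline would add the noise-handling steps the abstract mentions (stop-word removal, normalization, much larger labelled datasets), but the import → preprocess → vectorize → score shape stays the same.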
This presentation describes best practices for how you can make MATLAB® applications enterprise-ready, scale models appropriately no matter your IT infrastructure, and incorporate analytics into production systems. It outlines how MATLAB can interact with data lakes and data streaming environments.