The evolution of modern portfolio theory, and in particular derivatives valuation, has proceeded through many small steps and occasional large epiphanies. In his talk, Emanuel Derman points out what he sees as some of the intellectual breakthroughs and failures, from the idea of dependency, the description of diffusion, the definition of risk, the principle of no riskless arbitrage, optimization, diversification, Sharpe ratios, replication, Black-Scholes-Merton, Feynman-Kac, calibration, the invention of implied volatility, the Smile, and behavioral finance, to the infinite regress of financial modeling.
Big data, cloud computing, text analytics, deep learning, and artificial intelligence are creating disruptive opportunities across industries and have led to startup FinTech companies challenging established financial firms' businesses. Learn from experts as they debate trends; introduce how MATLAB® has evolved over the past two years to help companies remain agile while innovating at a rapid pace; announce new products and capabilities released with R2017b; and answer audience questions.
Financial engineers from MathWorks and quants from industry
In today’s world, there is an overabundance of data, generated from many different sources. Big data represents an opportunity for quantitative analysts and data scientists to impact the way organizations make informed business decisions.
Machine learning techniques are often used for financial analysis tasks such as time series analysis, forecasting, risk classification, estimating default probabilities, and data mining. However, implementing and comparing modeling techniques to choose the best method can be challenging, especially with big data. MATLAB® minimizes these challenges by providing several built-in tools for quick prototyping and scaling, without bringing the data into memory.
In this session, Heather Gorr introduces techniques for gaining insight into big data on a Spark™ enabled Hadoop® cluster, determining the best algorithm for your problem, evaluating model performance, and deploying machine learning models.
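As a rough sketch of this workflow (not material from the session itself; the file paths, variable names, and predictor choices below are hypothetical):

```matlab
% Point a datastore at files on the cluster and work on them as a tall
% table, so nothing is pulled into memory until explicitly gathered.
ds = datastore('hdfs:///data/trades/*.csv');   % hypothetical HDFS location
tt = tall(ds);

X = tt{:, {'Volume','Spread','Volatility'}};   % made-up numeric predictors
y = tt.Defaulted;                              % made-up binary response

% fitclinear supports tall arrays, so the classifier is trained without
% bringing the full dataset into memory
mdl = fitclinear(X, y);
[labels, scores] = predict(mdl, X);
gather(scores(1:5, :))                         % triggers deferred evaluation
```

The same tall-array code runs unchanged whether the datastore points at local files or at a Spark-enabled Hadoop cluster.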
11:30 a.m.–12:10 p.m.
Attilio Meucci introduces the ARPM Lab®, a multimedia platform to learn how engineering and data science apply to quantitative asset management. Attilio demonstrates the ARPM Lab, including how theory, videos, and MATLAB® exercises are used to learn historical and current state-of-the-art practices for quantitative asset management.
Sentiment scores, derived from text such as newsfeeds and social media, provide important information for determining portfolio positions. However, a document’s sentiment is often a weak signal surrounded by a large amount of noise. Extracting that signal requires a variety of techniques for working with data both in text and numeric formats, as well as machine learning techniques for automating the sentiment scoring process on large amounts of data.
Learn how to use text analytics capabilities in MATLAB® to build your own sentiment analysis tools. This presentation covers the entire sentiment scoring workflow, including getting social media feed data into MATLAB, preprocessing and cleaning up the raw text, converting text to a numeric format, and applying machine learning techniques, such as word2vec, to derive sentiment scores.
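As a minimal sketch of the preprocessing steps described above, using Text Analytics Toolbox (the strings and variable names are made-up stand-ins, not material from the session):

```matlab
% Raw text -> cleaned tokens -> numeric representation
str  = ["Shares rally on strong earnings"; "Outlook cut amid weak demand"];

docs = tokenizedDocument(lower(str));    % tokenize normalized text
docs = erasePunctuation(docs);
docs = removeStopWords(docs);

bag  = bagOfWords(docs);                 % word-count representation
M    = full(bag.Counts);                 % feed this matrix to any classifier

% On a real corpus, a word embedding gives denser features instead:
% emb = trainWordEmbedding(docs, 'Dimension', 50);
% v   = word2vec(emb, "rally");
```

The numeric matrix (or embedding vectors) then becomes the input to whichever machine learning model produces the sentiment scores.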
Dr. Kissell provides an overview of quantitative sports analytics using MATLAB®. He presents models and methodologies from his book, Optimal Sports Math, Statistics, and Fantasy and provides techniques and MATLAB functions to solve an array of sports-related problems.
These approaches and techniques can be used by the entire sports community (students, professionals, fantasy gamers, and casual sports fans) to objectively analyze and rank teams, predict winning teams and win probabilities, evaluate player skill and forecast future performance, compute the probability that a team will beat a sports line, and compete in fantasy sports. Professional sports teams can also employ these models in the player selection process, in determining game-to-game match-ups, and in salary negotiations and salary cap problems.
Key items addressed in the presentation include:
Artificial intelligence systems in finance have exploded over the last few years. This is especially true in risk modeling, where financial institutions are being hit with massive losses and fines. Most institutions incur losses not because they are unable to build complex models, but because they cannot demonstrate to regulators that those models are versatile enough to adapt to market volatility. With this in mind, MathWorks has built Statistics and Machine Learning Toolbox™, with specialized apps that facilitate quickly building machine learning models such as logistic regression, boosted and bagged trees, and regression-style classifiers. In addition, MathWorks has built Risk Management Toolbox™ with the goal of giving you complete, transparent, out-of-the-box models for areas such as credit and market risk. This session provides an intensive overview of the current state of the art in financial risk modeling using high-performance computing tools on big data.
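For instance, one way a logistic regression credit model might be prototyped (fitglm is a standard Statistics and Machine Learning Toolbox function; the data file and variable names here are made up for illustration):

```matlab
% Hypothetical credit-scoring sketch: fit a logistic regression by
% specifying a binomial response distribution
data = readtable('loans.csv');      % assumed columns: LTV, Income, Defaulted
mdl  = fitglm(data, 'Defaulted ~ LTV + Income', ...
              'Distribution', 'binomial');

pd = predict(mdl, data);            % predicted default probabilities in [0,1]
```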
The emergence of big data in finance has shifted the alpha focus away from being faster to being smarter and more efficient than the competition. Access to alternative data sources is considered a key input to such a process. During his talk, Peter Hafez will provide an overview of the changing investment landscape and the “winning formula” for successful quant investing. He will also cover some of his team's latest research in the space.
Even today, when CROs need to see results, they ask for reports. Using high-performance models combined with intuitive interactive visualization, CROs and other business users can have all the results at their fingertips with no delays.
In this presentation, Timo Salminen from Model IT demonstrates how MATLAB® can be used both for building modular high-performance risk models and modular interactive dashboards of those models. While there are multiple big data visualization tools available, interacting with complex models often requires custom built solutions.
Timo Salminen explores multiple use cases, such as stress testing, economic capital calculation, and stochastic portfolio optimization with complex liabilities, non-normal distributions, and derivative overlays.
Petter Kolm and Gordon Ritter describe a novel approach to the study of multiperiod portfolio selection problems with time-varying alphas, trading costs, and constraints. They show that, to each multiperiod portfolio optimization problem, one may associate a “dual” Bayesian dynamic model. The dual model is constructed so that the most likely sequence of hidden states is the trading path which optimizes expected utility of the portfolio. The existence of such a model has numerous implications, both theoretical and computational. Sophisticated computational tools developed for Bayesian state estimation can be brought to bear on the problem, and the intuitive theoretical structure attained by recasting the problem as a hidden state estimation problem allows for easy generalization to other problems in finance. Time permitting, they will discuss several applications to this approach.
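As a schematic sketch of this duality (the notation here is ours, not from the talk): writing $x_{1:T}$ for the trading path and $u_t$ for the per-period contribution to expected utility (alpha minus trading costs and risk penalty), the two problems are related by

```latex
% Multiperiod problem: choose the trading path maximizing total expected utility
\hat{x}_{1:T} \;=\; \arg\max_{x_{1:T}} \; \sum_{t=1}^{T} u_t(x_t, x_{t-1})

% Dual Bayesian dynamic model: state transitions weighted by utility
p(x_t \mid x_{t-1}) \;\propto\; \exp\!\bigl\{\, u_t(x_t, x_{t-1}) \,\bigr\}
```

so that the most likely (MAP) sequence of hidden states in the dual model is, by construction, the utility-optimal trading path, and standard Bayesian state-estimation machinery can compute it.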
11:30 a.m.–12:10 p.m.
Big data represents an opportunity to impact the bottom line of organizations. By building advanced analytics and predictive models on top of these large data repositories, organizations can vastly improve business decisions. Tasks that stand to gain the most from such improvements include predicting fraud, determining trading opportunities, improving the customer experience and journey, and even forecasting which employees are at risk of switching companies.
In this session, Ian McKenna introduces ways of working with big data and rapidly deploying analytics into production.
Observing the gap between user tools that researchers create in MATLAB® and production systems that IT professionals build in general-purpose programming languages such as Java®, C++, and Python®, we propose an open paradigm for integrating MATLAB with other systems and languages, enabled by MATLAB features such as MATLAB Production Server™, web services, JSON support, and object-oriented programming.
To demonstrate the process, we will discuss how we built an asset management system using MATLAB integrated with our Java based production environment. The system is used for multi-factor regression on multi-asset fund-of-funds portfolios with complex factor selection rules.
To build such a mixed system, we need to carefully handle business object models, security, data caching, exception handling, logging, etc. We also propose a micro-service architecture with MATLAB as one of the core components to enable scalable MATLAB based computation from other systems.
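As a minimal sketch of a MATLAB entry point in such a mixed architecture (the function name and request fields are hypothetical; jsondecode and jsonencode are built-in MATLAB functions):

```matlab
% A MATLAB Production Server style handler that exchanges JSON strings
% with a Java caller in a micro-service setup
function responseJson = priceRequestHandler(requestJson)
    req = jsondecode(requestJson);        % struct, e.g. fields Portfolio, AsOf

    % ... domain logic here: factor regression, pricing, etc. ...
    result.status    = "ok";
    result.portfolio = req.Portfolio;     % echo back the requested portfolio

    responseJson = jsonencode(result);    % JSON string back to the Java side
end
```

Keeping the interface to plain JSON strings is one way to let the Java side treat the MATLAB service like any other micro-service, independent of MATLAB data types.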
MATLAB® has evolved from an ad hoc analysis environment to a full development platform, enabling financial models and analytics developed in MATLAB to be seamlessly and efficiently deployed to financial institutions. In this talk, Peter Orr describes how to use MATLAB to develop and deploy a multi-market fixed-income model (with a Microsoft® Excel® front end) that simultaneously captures U.S. Treasury, corporate bond, mortgage, municipal, and other markets for risk management and performance analysis.
Scotiabank has implemented a novel web-based analytics platform that is being used within a large international bank treasury department for asset liability management, earnings optimization, interest rate risk analytics, and balance sheet simulation. The in-house built analytics engine integrates MATLAB Production Server™ and MATLAB Distributed Computing Server™ together with standard SQL, Python®, and web technologies to provide on-demand analytics to multiple concurrent users.
Since the financial crisis, xVA has rapidly become a standard for assessing risk associated with counterparties. Many regulations (IFRS 13, Basel III) require CVA as part of their reporting frameworks. In his talk, Andy will provide an overview of xVA pricing and risk management, the main computational challenges faced when implementing it, and how the MATLAB Numerix Interface can help organizations overcome them while simplifying the management of these exotic instruments.
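For context, one common textbook formulation of unilateral CVA (not specific to the Numerix implementation) integrates discounted expected exposure against the counterparty's default probability:

```latex
\mathrm{CVA}
  \;=\; (1 - R) \int_0^T \mathrm{EE}^{*}(t)\, \mathrm{d}PD(t)
  \;\approx\; (1 - R) \sum_{i=1}^{n} \mathrm{EE}^{*}(t_i)\,
              \bigl[ PD(t_i) - PD(t_{i-1}) \bigr]
```

where $R$ is the recovery rate, $\mathrm{EE}^{*}(t)$ the discounted expected exposure at time $t$, and $PD(t)$ the cumulative default probability. The expected exposure profile is what typically requires large-scale Monte Carlo simulation, which is the main computational burden.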
New hardware, distributed computing, and a plethora of machine learning frameworks have led to multiple systems with very similar, but just-different-enough, in-memory formats that make sharing data between these systems difficult. At best, it is still extremely inefficient to move data between infrastructure pieces because of multiple serialization and deserialization steps. Apache Arrow™ is a solution to this problem. Arrow enables data infrastructure to use the same memory layout and share data in the most efficient way possible, including support for streaming as well as batch data. In this talk, Phillip Cloud discusses the motivation for Arrow, the in-memory format, and architecture details, and presents real-world example use cases.
10:50 a.m.–12:10 p.m.
This master class introduces MATLAB® functionality for risk management. Through examples carried out in MATLAB, you will learn how to import data, perform data analysis and visualization, create simple risk models and perform simulations on them, and share your models and calculations with others in the form of Microsoft® Excel® add-ins. This master class provides you with the tools you need to get started with MATLAB in your own work. It is also perfect for those interested in learning about new functionality designed to make you more productive when working in MATLAB.
This session is based on a Master Class delivered at the GARP 15th Annual Risk Management Convention in 2014.
To maintain a competitive advantage, it is critical to have the right tools in place that are flexible, allow you to iterate rapidly, and scale easily.
In this session, Siddharth Sundar demonstrates how MATLAB® is an ideal ETL platform for exploratory analysis, quick data transformation and cleansing, and scaling to any size data with nearly no code changes. New MATLAB capabilities, ranging from built-in functions and data containers to external interfaces, will be introduced to save time and simplify complex data management tasks.
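As a small illustrative cleansing snippet of the kind this session covers (the file and variable names are made up; timetable, retime, and fillmissing are built-in MATLAB functions):

```matlab
% Load irregular tick data and regularize it for downstream analysis
T = readtable('ticks.csv');                   % hypothetical file
T.Time = datetime(T.Time);                    % parse timestamps
TT = table2timetable(T, 'RowTimes', 'Time');

% Resample to regular 5-minute bars, averaging observations in each bar
TT = retime(TT, 'regular', 'mean', 'TimeStep', minutes(5));

% Carry the last observation forward through gaps
TT = fillmissing(TT, 'previous');
```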
MATLAB® is often used for solving financial engineering and scientific problems. As the size and complexity of your application increases, it becomes more challenging to manage your development process.
MATLAB provides advanced software development capabilities, including error handling, object-oriented programming (OOP), and unit testing as well as workflows for deploying production applications.
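For example, a minimal function-based unit test with the matlab.unittest framework might look like this (the function under test, presentValue, is a hypothetical example, not code from the talk):

```matlab
% pricingTests.m -- run with: runtests('pricingTests')
function tests = pricingTests
    tests = functiontests(localfunctions);    % collect local test functions
end

function testZeroRate(testCase)
    % With a zero discount rate, PV should equal the cash flow itself
    verifyEqual(testCase, presentValue(100, 0, 1), 100, 'AbsTol', 1e-12);
end
```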
Sean de Wolski provides best practices and tips for developing complex applications with MATLAB, including: