Wednesday, March 31, 2010
Computational workshop - third day (a week later)
This post concludes my day-to-day review of the workshop on Computational Methods that took place at Fields last week.
As could be inferred from his informal presentation on GPUs the previous day, Mike Giles does heavy-duty numerical computation with a vengeance. He described a multilevel Monte Carlo method, implemented on different levels of resolution in order to achieve a prescribed accuracy, and showed how to apply it to challenging exotic option pricing and sensitivity calculations, as well as a stochastic PDE example.
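For readers who have not met the multilevel idea before, here is a bare-bones sketch of my own (a toy European call under geometric Brownian motion, not Mike's code, and with a fixed number of samples per level instead of the variance-driven allocation a real implementation would use): the price is written as a telescoping sum of level corrections, and at each level the fine and coarse paths share the same Brownian increments so that the corrections have small variance.

```python
import numpy as np

def mlmc_level(l, N, S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0, M=2):
    """Estimate E[P_l - P_{l-1}] for a European call under GBM, coupling the
    fine and coarse Euler paths through the same Brownian increments."""
    rng = np.random.default_rng(l)
    nf = M ** l                              # fine-grid time steps
    hf = T / nf
    dW = rng.normal(0.0, np.sqrt(hf), size=(N, nf))

    Sf = np.full(N, S0)                      # fine path
    for i in range(nf):
        Sf += r * Sf * hf + sigma * Sf * dW[:, i]
    Pf = np.exp(-r * T) * np.maximum(Sf - K, 0.0)
    if l == 0:
        return Pf.mean()

    nc = nf // M                             # coarse path: sum M fine increments
    hc = T / nc
    Sc = np.full(N, S0)
    for i in range(nc):
        Sc += r * Sc * hc + sigma * Sc * dW[:, i*M:(i+1)*M].sum(axis=1)
    Pc = np.exp(-r * T) * np.maximum(Sc - K, 0.0)
    return (Pf - Pc).mean()

# Telescoping sum over levels gives the multilevel estimator:
# E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}].
price = sum(mlmc_level(l, 20_000) for l in range(6))
print(f"MLMC price estimate: {price:.3f}")
```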
Peter Forsyth started by explaining what GMWBs (guaranteed minimum withdrawal benefits) are and introduced a penalty method to price them numerically by solving the associated HJB equations. Interestingly, his results were not sensitive to the size of the penalty term, which silences a common criticism of such methods. Even more interestingly, we learned at the end of the talk that policyholders in Hamilton tend to exercise their guarantees optimally (well, at least in one historical example), thereby wreaking havoc on insurance companies, which routinely underprice such policies betting on consumers' sub-optimality.
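To give a flavour of what a penalty method does, here is a bare-bones sketch for a plain American put (my own toy example, not Peter's GMWB solver; the grid sizes and the penalty parameter rho are purely illustrative): the early exercise constraint V >= payoff is enforced by a large penalty term that is switched on wherever the constraint would be violated, and the resulting nonlinear system is solved by a simple penalty iteration at each time step.

```python
import numpy as np

def american_put_penalty(K=100.0, r=0.05, sigma=0.2, T=1.0,
                         S_max=300.0, M=300, N=100, rho=1e6, tol=1e-8):
    """Implicit finite differences for an American put, with early exercise
    enforced by a penalty term rho * max(payoff - V, 0) and a penalty
    iteration at each time step (a toy version of the general idea)."""
    S = np.linspace(0.0, S_max, M + 1)
    dt = T / N
    g = np.maximum(K - S, 0.0)                       # payoff / obstacle
    V = g.copy()

    i = np.arange(1, M)                              # interior nodes, S_i = i*dS
    a = 0.5 * dt * (sigma**2 * i**2 - r * i)         # sub-diagonal coefficient
    b = -dt * (sigma**2 * i**2 + r)                  # diagonal coefficient
    c = 0.5 * dt * (sigma**2 * i**2 + r * i)         # super-diagonal coefficient

    for _ in range(N):                               # march backwards in time
        V_old = V.copy()
        for _ in range(50):                          # penalty (Picard) iteration
            pen = rho * dt * (V[1:M] < g[1:M])       # active where constraint binds
            A = np.zeros((M - 1, M - 1))
            np.fill_diagonal(A, 1.0 - b + pen)
            np.fill_diagonal(A[1:], -a[1:])
            np.fill_diagonal(A[:, 1:], -c[:-1])
            rhs = V_old[1:M] + pen * g[1:M]
            rhs[0] += a[0] * K                       # boundary: V(0, t) = K
            V_new = np.linalg.solve(A, rhs)
            done = np.max(np.abs(V_new - V[1:M])) < tol
            V[1:M] = V_new
            if done:
                break
        V[0], V[M] = K, 0.0
    return S, V

S, V = american_put_penalty()
print(f"value at S = 100: {np.interp(100.0, S, V):.3f}")
```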
Nizar Touzi described his probabilistic approach for solving fully nonlinear PDEs. As is well known, probabilistic methods can be used to solve second-order linear parabolic PDEs via the Feynman-Kac formula, whose expectation can then be approximated by Monte Carlo. For quasi-linear PDEs, a similar approach leads to a numerical scheme for solving BSDEs. What Nizar showed is how the scheme can be generalized to the fully nonlinear case, leading to a scheme that involves both Monte Carlo and finite differences, but without relying directly on BSDEs.
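The linear building block is worth spelling out, since everything else is layered on top of it: by Feynman-Kac, the solution of u_t + mu(x) u_x + (1/2) sigma(x)^2 u_xx = 0 with terminal condition u(T,.) = g is u(t,x) = E[g(X_T) | X_t = x] for the diffusion dX = mu dt + sigma dW, and that expectation is exactly what plain Monte Carlo approximates. A minimal sketch (my own illustration, with a toy heat-equation example whose exact answer is known):

```python
import numpy as np

def feynman_kac(x0, t, T, mu, sigma, g, n_paths=100_000, n_steps=200, seed=0):
    """Approximate u(t, x0) = E[g(X_T) | X_t = x0], the Feynman-Kac
    representation of the linear parabolic PDE
        u_t + mu(x) u_x + 0.5 sigma(x)^2 u_xx = 0,  u(T, .) = g,
    by Euler-Maruyama simulation of dX = mu(X) dt + sigma(X) dW."""
    rng = np.random.default_rng(seed)
    dt = (T - t) / n_steps
    X = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        X += mu(X) * dt + sigma(X) * rng.normal(0.0, np.sqrt(dt), n_paths)
    payoff = g(X)
    return payoff.mean(), payoff.std(ddof=1) / np.sqrt(n_paths)

# Heat equation u_t + 0.5 u_xx = 0 with g(x) = x^2: exact solution x^2 + (T - t).
u, se = feynman_kac(1.0, 0.0, 1.0,
                    mu=lambda x: 0.0 * x, sigma=lambda x: 1.0 + 0.0 * x,
                    g=lambda x: x**2)
print(f"Monte Carlo estimate {u:.4f} +/- {1.96*se:.4f}  (exact 2.0)")
```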
Philip Protter brought the workshop to a conclusion with a talk on absolutely continuous compensators. Now, these are objects that show up naturally in the study of totally inaccessible stopping times (such as those used in the majority of reduced-form credit risk models), so the motivation for studying them is perfectly understandable. The same cannot be said for the machinery necessary to appreciate the results... As another participant observed, it is likely that about 2 people in the audience knew what the Lévy system of a Hunt process is (don't count me as one of them). In the end, the message I got from the talk is that any good old totally inaccessible stopping time that has any chance of making it into a financial model will likely have an absolutely continuous compensator - but it is nice to know that there are people proving such results. Before I finish: I used to say that Philip is the Woody Allen of mathematics, but the analogy is no longer valid: Woody is not that funny anymore, whereas Philip is in great form.
Tuesday, March 30, 2010
Computational workshop - second day (a week later)
During the panel discussion on the first day of the workshop, Jim Gatheral pointed out that "for some reason" exotic derivatives have gone out of fashion during the past two years, with the result that a whole bunch of talented quants had to divert their brain power to something else, in particular algorithmic trading. Naturally, the workshop organizers were aware of that and algorithmic trading was the predominant topic of the second day. I suspect this was one of the reasons for changing the original name of the workshop from "numerical" to "computational" methods in finance - a very appropriate generalization.
The first talk by Ciamac Moallemi allowed me for the first time to properly understand the dynamics of a limit order book, mainly because he used a deterministic fluid model, thereby developing a good deal of intuition before actually solving the optimal execution problem in such a model.
In impeccably characteristic style, Jim Gatheral followed with a review of the most well-known optimal execution models in the literature, adding personal touches along the way (such as re-deriving closed-form optimal policies by Euler-Lagrange instead of HJB) and leading up to his own new results about the possibility of price manipulation. As anyone who has read his volatility surface book knows, Jim is not a fan of unnecessarily complicated notation, but he is not afraid of deep mathematics either. As a result, everything that he writes looks very simple, but moving from one equation to the next can require unexpected effort (if you really want to understand what is going on, that is).
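To make this concrete, here is the kind of closed-form policy being re-derived. I am assuming the benchmark here is the Almgren-Chriss model with linear temporary impact, which any such review would cover: the mean-variance optimal liquidation schedule is x(t) = X sinh(kappa(T - t)) / sinh(kappa T) with kappa = sqrt(lambda sigma^2 / eta). A quick sketch with purely illustrative parameters:

```python
import numpy as np

def almgren_chriss_schedule(X, T, sigma, eta, lam, n=20):
    """Remaining inventory x(t) on the optimal liquidation path in the
    continuous-time Almgren-Chriss model: linear temporary impact eta,
    volatility sigma, risk aversion lam.  As lam -> 0 the schedule tends
    to the straight-line (TWAP) strategy."""
    kappa = np.sqrt(lam * sigma**2 / eta)
    t = np.linspace(0.0, T, n + 1)
    return t, X * np.sinh(kappa * (T - t)) / np.sinh(kappa * T)

# Illustrative numbers only: unwind 1,000,000 shares over one trading day.
t, x = almgren_chriss_schedule(X=1_000_000, T=1.0, sigma=0.3, eta=1e-6, lam=1e-4)
for ti, xi in zip(t[::5], x[::5]):
    print(f"t = {ti:.2f}   remaining = {xi:>12,.0f}")
```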
Petter Kolm shifted the focus of the discussion by concentrating on the "buy-side" of optimal execution, that is, incorporating the views of a portfolio manager into the execution problem. Since these bring different objectives (maximizing returns over a long time horizon, improving diversification, etc.), he showed that traditional execution policies are generally sub-optimal. In particular, he prescribed that rebalancing should be done more often than traditionally suggested, and predicted a corresponding growth in the research and practice of multi-period portfolio optimization.
Chris Rogers reverted back to numerical methods and introduced a way to solve optimal stopping problems using convex approximations to a convex value function. As another participant commented during the break, you might not be clever enough to exactly reproduce what Chris does, but his presentations always make perfectly clear what is going on at each step of the way. He is also very good at identifying fatal flaws in other people's work (I personally tremble whenever he is listening to my talks), but unlike other acute critics, he applies the same level of rigor to his own work. As a result, he can be disconcertingly honest in pointing out the weaknesses of his proposed methods. My impression is that it is too early to pass judgment on this particular method, but at least all the advantages and disadvantages were clearly spelled out.
In an informal presentation after lunch, Mike Giles gave a fascinating overview of the use of GPUs (graphics processing units) in quantitative finance. It was highly entertaining to hear that banks in Canary Wharf had exhausted the energy capacity available to power their computers (with any new power generation being committed to the 2012 Olympic Games), so that they faced the choice of either migrating to less power-hungry GPUs (and having to rewrite all their code) or building new data centers on the periphery of London.
In the context of portfolio credit risk, Kay Giesecke showed how to use importance sampling to efficiently simulate rare events and calculate several related risk measures. Liming Feng concluded the day by introducing a discrete Hilbert transform to calculate option prices for several exotic options (Bermudans, lookbacks, etc.) in a variety of Lévy models with surprisingly good error estimates. Apologies for being brief on these last two talks, but I'm running out of time for this post.
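For completeness, here is the importance sampling idea in its simplest textbook form (a generic illustration of mine, nothing to do with Kay's actual portfolio credit model): to estimate a rare-event probability, sample from a distribution shifted towards the rare event and undo the bias with the likelihood ratio.

```python
import numpy as np
from scipy.stats import norm

def tail_prob_is(a=4.0, n=100_000, seed=0):
    """Estimate the rare-event probability P(Z > a) for Z ~ N(0,1) by
    importance sampling: draw from the shifted proposal N(a, 1) and reweight
    each sample by the likelihood ratio of target to proposal density."""
    rng = np.random.default_rng(seed)
    y = rng.normal(a, 1.0, n)                    # proposal centred on the rare set
    w = (y > a) * norm.pdf(y) / norm.pdf(y, loc=a)
    return w.mean(), w.std(ddof=1) / np.sqrt(n)

est, se = tail_prob_is()
print(f"IS estimate {est:.3e} +/- {1.96*se:.1e}   (exact {norm.sf(4.0):.3e})")
# Plain Monte Carlo with the same number of samples would typically see only
# a handful of exceedances of a = 4, if any.
```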
Monday, March 29, 2010
Computational workshop - first day (a week later)
As promised, here is a day-by-day look back at the workshop on Computational Methods in Finance that took place at Fields last week.
The majority of the talks on the first day consisted of what I would call "traditional" numerical methods (for lack of a better name), namely, the thorough investigation of very clever ways to obtain numerical solutions to challenging problems.
Ralf Korn opened the workshop with a very elegant overview of binomial methods. To tackle the tricky problem of matching the tree parameters to a complicated correlation structure in high dimensions, he proposed a universal decoupling method based on a diagonalization of the correlation matrix, which reduces the matching problem to several uncorrelated one-dimensional trees, for which well-refined methods already exist.
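The decoupling step, as I understood it, boils down to the following piece of linear algebra (a minimal sketch with made-up numbers, leaving out the actual tree construction): diagonalize the covariance matrix of the log-returns, build an independent one-dimensional tree for each eigen-coordinate, and map back to asset prices at every node.

```python
import numpy as np

# Illustrative volatilities and correlations for three assets.
sigma = np.array([0.20, 0.30, 0.25])
corr = np.array([[1.0, 0.5, 0.3],
                 [0.5, 1.0, 0.4],
                 [0.3, 0.4, 1.0]])
cov = np.outer(sigma, sigma) * corr

# cov = G diag(eigvals) G^T, so Y = G^T X has uncorrelated components whose
# variances are the eigenvalues: each component gets its own 1-D binomial tree,
# and asset log-returns at any node are recovered via X = G Y.
eigvals, G = np.linalg.eigh(cov)
print("variances of the decoupled factors:", np.round(eigvals, 4))
print("transformed covariance (should be diagonal):")
print(np.round(G.T @ cov @ G, 6))
```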
Kumar Muthuraman followed with a moving boundary method for solving free-boundary problems: start with a blatantly suboptimal boundary (often suggested naturally by the problem at hand) and systematically improve it in the region where the associated variational inequality is violated, until the true free boundary is well approximated.
John Chadam graciously stood in for Garud Iyengar, who could not attend, and explained how integral equation methods can lead to very detailed results for the American put option problem, in particular the fact that the exercise boundary might fail to be convex when dividends are higher than the risk-free interest rate.
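For context, the starting point of these methods is the classical early exercise premium representation; for an American put on an asset paying a continuous dividend yield q it reads (quoting from memory, so treat the exact formula with some caution)

```latex
P(S,t) = p(S,t)
  + \int_t^T \Big[ rK\,e^{-r(u-t)}\,\Phi\!\big(-d_2(S,B(u),u-t)\big)
  - qS\,e^{-q(u-t)}\,\Phi\!\big(-d_1(S,B(u),u-t)\big) \Big]\,du,
\qquad
d_{1,2}(x,y,\tau) = \frac{\ln(x/y) + (r - q \pm \tfrac{1}{2}\sigma^2)\,\tau}{\sigma\sqrt{\tau}},
```

where p is the European put price, Phi the standard normal cdf and B(u) the early exercise boundary. Imposing the value-matching condition P(B(t),t) = K - B(t) turns this into an integral equation for B, from which fine properties of the boundary - like the loss of convexity when q > r - can be extracted.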
In the context of portfolio optimization, Lorenzo Garlappi showed how a careful decomposition of the state variables into an observable component and a random error, combined with a Taylor expansion of the value function (expressed in monetary terms), can lead to very accurate numerical approximations.
After lunch, a panel consisting of Jim Gatheral, Chris Rogers, Ernst Eberlein and Jeremy Staum discussed current challenges in quantitative finance, especially in light of the crisis of 2008.
Jeremy Staum then followed with a comprehensive set of proposals for the use of repeated simulations in finance. It is hard to do justice to his far-reaching talk in just one line, so I'll just mention the key concepts of Database Monte Carlo and Metamodeling, while hoping to have a chance to study them further in the future.
Birgit Rudloff concluded with perhaps the most technically demanding talk of the day (for me at least), investigating the effect of transaction costs on risk measures. My mind went into a swirl right at the beginning of her talk, since many familiar concepts, such as arbitrage, portfolio value and risk measure, immediately became special "scalar" cases of their multi-dimensional generalizations (a consequence of not being permitted to calculate the value of a portfolio as a simple scalar product at each point in time, due to the presence of transaction costs). I partially recovered some sanity with the 2-dimensional example at the end, where nice pictures can be drawn on the board, but by that point everyone was rightly ready for a glass of wine at the reception that followed.
Sunday, March 28, 2010
Operational Risk Forum
We kicked off the series of Industrial-Academic Forums with the topic of Operational Risk this past Friday and Saturday. Since I knew very little about the subject, the Forum was an opportunity to learn quite a lot. But aside from my personal experience, I heard from most participants that this was a very well organized, timely and productive forum.
For lack of expertise, instead of reviewing the talks individually, I will list the things that I learned from the workshop as a whole, in no particular order, except perhaps by the number of times they were mentioned in the talks.
- because banks don't make money out of operational risk (as opposed to market and credit risk, each related to core commercial activities, with a lot of money to be made by beating the competition in the corresponding markets), research in the area has been mostly driven by regulation rather than market pressure;
- as a consequence, most of the effort focuses on how to calculate the exact quantity specified by Basel II, namely the 99.9% VaR of operational risk losses, which are primarily modeled using the LDA (loss distribution approach);
- the LDA consists of modeling the losses in each business line as compound Poisson processes (thereby modeling the frequency and the severity of the losses separately) and then bringing the business lines together under some dependence assumption (say, copulas). The severity of losses is modeled with a heavy-tailed distribution, with the log-normal(10,2) being everyone's darling, closely followed by the Pareto and something called the "g-and-h" (google it!) - see the toy simulation after this list;
- clever numerical methods and analytic approximations compete to produce the final shape of the loss distribution and to calculate its 99.9% VaR;
- it is really, really hard to estimate the parameters of such heavy-tailed distributions, and there is not nearly enough data to accurately distinguish between them. As a consequence, regulators (and just about everyone else) have their work cut out for them in trying to validate the models used by different banks. Much care is needed to ensure a "level playing field" in the process;
- VaR is a particularly boneheaded choice of risk measure in this case, especially when trying to measure the "diversification effect" (the parameter "delta" in Basel II), which can jump from positive to negative (i.e., adverse diversification) after slightly changing the marginal distribution of losses, without even touching their dependence structure;
- it would be really nice to have a way to model operational risk from first principles, like a structural model in credit risk, rather than simply trying to fit (highly unstable) statistical models to (largely unavailable) data;
- at the end of the day, research in operational risk should lead to better risk management rather than better measurement. As one participant put it: "operational risk groups should be seen more like think-tanks than data centres".
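Here is the toy simulation promised above - a single business line with made-up parameters (on average 25 losses a year, log-normal(10,2) severities), nothing resembling an actual bank's model - just to show how quickly the LDA recipe can be written down, and how noisy the resulting 99.9% quantile is.

```python
import numpy as np

def lda_var(lam=25, mu=10.0, sig=2.0, n_years=100_000, alpha=0.999, seed=0):
    """Toy LDA for one business line: the annual loss is a compound Poisson sum
    with Poisson(lam) frequency and log-normal(mu, sig) severities; the capital
    charge is the alpha-quantile (VaR) of the simulated annual loss."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam, n_years)                      # losses per year
    annual = np.array([rng.lognormal(mu, sig, n).sum() for n in counts])
    return np.quantile(annual, alpha)

for seed in range(3):
    print(f"seed {seed}: simulated 99.9% VaR = {lda_var(seed=seed):,.0f}")
# The estimates move around considerably from seed to seed: with a tail this
# heavy, even 100,000 simulated years is not enough data - which is precisely
# the estimation problem lamented in the talks.
```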
As I said, a wonderful learning experience.
Friday, March 26, 2010
Hard choices
I was afflicted by an embarrassment of riches yesterday, having to choose between attending the 6-hour guest lectures by Kay Giesecke in one of the graduate courses at Fields or attending the AIMS/Phimac talk by Santiago Moreno-Bromberg at McMaster.
In the end I applied a least-effort criterion and stayed at McMaster, especially because this was a heavy week, with Thursday sandwiched between a 3-day workshop and a 2-day forum.
Santiago's talk was nothing less than fascinating, touching on many important topics (optimal design of contracts, risk sharing, principal-agent problems with multiple agents AND multiple principals, etc). It is a perfect example of a recent but unmistakable trend of mathematical finance research expanding beyond financial markets proper and moving in the direction of mainstream economics.
ps: I'm sure that Kay's course was also lovely (and the credit risk students at Fields are a very lucky bunch for having him here), just can't comment because I wasn't there. Maybe this is a good point for some of my secret readers to post a comment :)
Wednesday, March 24, 2010
So much to blog, so little time
The Computational Methods workshop has come and gone and I haven't had time to post a single entry about it yet. I promise to do a day-by-day retrospective in the next few days, but now I'm off to give a talk to the U of T Math Union.
Sunday, March 21, 2010
Computational Workshop starts tomorrow
The second major workshop of the thematic program starts tomorrow at Fields and I thought I should provide a little bit of history like I did for the first workshop.
Paul Glasserman was a supporter of the thematic program from the very beginning. In fact, he was part of a handful of people who gave us valuable advice when we were writing our Letter of Intent back in 2005. After that, I contacted him again in January 2008 asking if he would like to chair the scientific committee for one of the workshops. By October 2008, with the addition of Mark Broadie as co-chair and Stathis Tompaidis and David Saunders as members, we had the committee formed and ready to work. They sent me a complete list of speakers in May 2009, at which point all the hard work was done and we could start publicizing the workshop.
ABM talk
Leigh Tesfatsion gave a colloquium talk at McMaster yesterday on the use of agent-based models (ABM) as the right mathematics for the social sciences.
I met Leigh in May 2009, during a Perimeter Institute workshop on The Economic Crisis and Its Implications for the Science of Economics. Of all the ambitious ideas put forward during the workshop (including things like gauge theory and economic preferences, evolutionary biology foundations for behavioral finance, and even something called 'particle economics'), her presentation on Agent-Based Computational Economics (ACE) struck me as the most promising way to achieve real progress in the understanding of economic crises.
Since then, I have started my own little ABM project related to systemic risk in banking networks, so it was a great pleasure to host Leigh for a day and discuss future research avenues, not to mention that her talk was every bit as inspiring as the one she delivered at PI last year.
Wednesday, March 17, 2010
York day
The visitors seminar series this week had back-to-back talks by Tom Salisbury and Fouad Marri, both from York University.
Tom spoke about the hot topic of retirement income products with embedded guarantees. In the end he chided himself for spending too much time on a relaxed introduction and not enough on the mathematical result that he wanted to show, but the talk went so smoothly (as his talks usually do) that my sense is that nobody else shared the complaint.
Fouad showed how to explicitly compute the moment generating function of compound Poisson processes when the jump sizes are correlated with the jump times according to a specific class of copula functions. As pointed out by Sebastian Jaimungal in the audience, such processes can be useful to model insurance claims arising from earthquakes, since typically their magnitude tends to increase the longer you wait.
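Out of curiosity, here is how easy it is to simulate a process of that flavour. This is purely my own illustration: I link each waiting time to the corresponding claim size with a Gaussian copula, which need not be the copula family for which Fouad's explicit MGF results hold, but it produces the qualitative effect that claims arriving after longer quiet periods tend to be larger.

```python
import numpy as np
from scipy.stats import norm, expon, lognorm

def dependent_claims(T=10.0, lam=2.0, rho=0.6, mu=0.0, s=0.5, seed=0):
    """Simulate claim times and cumulative claim amounts when each exponential
    inter-arrival time and its claim size are linked by a Gaussian copula with
    correlation rho (longer waits tend to produce larger claims)."""
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho], [rho, 1.0]]
    t, running, times, totals = 0.0, 0.0, [], []
    while True:
        u = norm.cdf(rng.multivariate_normal([0.0, 0.0], cov))  # dependent uniforms
        t += expon.ppf(u[0], scale=1.0 / lam)                   # exp(lam) waiting time
        if t > T:
            break
        running += lognorm.ppf(u[1], s=s, scale=np.exp(mu))     # log-normal claim size
        times.append(t)
        totals.append(running)
    return np.array(times), np.array(totals)

times, totals = dependent_claims()
print(f"{len(times)} claims by T = 10, aggregate amount {totals[-1]:.2f}")
```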
Wednesday, March 10, 2010
Statistical arbitrage done right
Rudra Jena gave a talk in the visitor seminar series yesterday about misspecified stochastic volatility models and the corresponding (statistical) arbitrage opportunities. This generalizes the famous work of El Karoui, Jeanblanc and Shreve on misspecified Black-Scholes models and leads to several interesting conclusions. For example, Rudra showed that, in general, the popular Risk Reversal and Butterfly Spread strategies, commonly used by traders to benefit from perceived misspecification in the correlation between volatility and stock price and in the vol of vol, are not profit-maximizing.
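For readers who don't know the Black-Scholes version of the story, it can be compressed into one formula (stated from memory, modulo sign conventions about who is long the option): if you buy an option at the price implied by a model volatility sigma_model, delta-hedge with the model's delta, and the realized volatility turns out to be sigma_true, then the terminal profit and loss is

```latex
\mathrm{P\&L}_T \;=\; \tfrac{1}{2}\int_0^T e^{r(T-u)}
  \left(\sigma_{\mathrm{true}}^2(u) - \sigma_{\mathrm{model}}^2\right)
  S_u^2\,\Gamma^{\mathrm{model}}_u \,\mathrm{d}u ,
```

so a long-gamma position makes money whenever realized variance exceeds the model's. Rudra's talk plays the analogous game when the misspecified quantities are the parameters of a stochastic volatility model.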
Thursday, March 4, 2010
Riccati rules
Ken Kim spoke about the stability of Riccati equations in the visitors seminar series this week. This is truly beautiful stuff, relating dynamical systems concepts like equilibrium points and stable manifolds to probabilistic properties of the corresponding affine diffusion.
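To fix ideas for readers who have not met these Riccati equations: in the simplest affine example, the CIR short rate dr = kappa(theta - r) dt + sigma sqrt(r) dW, the bond price has the exponential-affine form P(t,T) = A(tau) exp(-B(tau) r) with tau = T - t, and plugging this ansatz into the pricing PDE gives

```latex
B'(\tau) \;=\; 1 - \kappa B(\tau) - \tfrac{1}{2}\sigma^{2} B(\tau)^{2}, \quad B(0)=0,
\qquad
\big(\log A\big)'(\tau) \;=\; -\kappa\theta\, B(\tau), \quad A(0)=1.
```

In multi-factor affine models one gets coupled systems of such quadratic ODEs, and it is the behaviour of their solutions near equilibrium points and stable manifolds that carries the probabilistic information.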
Interestingly enough, Tom Hurd was asking me just the other week if I knew of anybody looking at the stability of large-dimensional systems of Riccati equations, since they arise naturally in his new model for systemic risk. Though their motivations are different, I'm sure Tom and Ken will have lots to talk about until the end of the thematic program.