## Friday, December 17, 2010

This week marks one year since I started this blog. The original intent was to report on the various activities that took place during the thematic program we had at the Fields Institute during the first half of 2010.

When the program ended in July, the people in charge of the communications strategy at Fields asked me to continue the blog by posting about topics related to the program, such as conferences or other important developments in the area. The hope was to create a focal point for people who participated in the program, both for historical purposes and to generate discussions and commentary.

Inevitably, I restricted my posts to activities that I attended, which gave the blog more of a personal flavor than I would have liked. That is why I keep encouraging my colleagues to become regular contributors too. So far I have received many promises and no actual posts, but I'm keeping my expectations high.

In the meantime I'll keep writing, acutely aware of the advice in Ecclesiastes 12:12:

"Of making many books there is no end; and much study is a weariness of the flesh."

## Saturday, December 11, 2010

### The value of being a leader - ETH edition

I gave a talk at ETH this week on what I call the priority option. This was my first visit to both ETH and Zurich, and I was thoroughly impressed by the city, the school, and the math department. For example, here is a sample of what was going on in just one afternoon:

Their Financial and Insurance Mathematics group is of course well known to anyone working in the area. In fact, I would be hard pressed to find a higher concentration of talent dedicated to mathematical finance under the same roof anywhere else in the world. And speaking of roofs, they also have gigantically high ceilings:

P.S. Just in case you were wondering, this was *not* the room where I gave my talk :)

## Monday, December 6, 2010

### Nobel week in Stockholm

I'm visiting Tomas Bjork in Stockholm for a few days this week, where I gave a talk about chaos models for interest rates at KTH (paper is available here).

It has been a truly remarkable visit for many reasons: having a traditional Swedish Christmas dinner, meeting the mathematician Lars Svensson (the other Lars Svensson is an economist who also works on interest rate theory, and I thought Martino and I had a problem with common names), seeing the famous Vasa warship, etc.

But it is hard to top the excitement of being in Stockholm during a week full of Nobel lectures, leading up to the prize ceremony on Nobel day, December 10th.

## Sunday, December 5, 2010

### Research in Options Conference

Financial mathematicians around the world know that Rio de Janeiro is the place to be around this time of the year, when IMPA, the premier math institute in Brazil, hosts the aptly named Research in Options conference. To be entirely accurate, the first conference in this series, which took place in 2006, was generically called Mathematics and Finance: from Theory to Practice, but thanks to Bruno Dupire's creative genius, the subsequent events are simply known as RiO 2007, 2008, 2009 and now 2010.

The core program is a mixture of high-caliber invited speakers, contributed talks with a lot of local flavor, and short courses aimed at graduate students and practitioners. Apart from that, the warm atmosphere of the conference provides an ideal venue for informal (and sometimes heated) discussions on the abundance of important practical and theoretical problems that have flooded our research area in these turbulent times. I like to think of it as the Brazilian response to Davos, with much better views and drinks, complete with our own chair lifts, which clearly put the Swiss ones to shame.

Here is a partial visual record of the event:

## Thursday, November 25, 2010

### Légion d'Honneur

I had a rare glimpse of the French establishment today when I attended the ceremony in which Monique Jeanblanc became a Chevalier de l'Ordre National de la Légion d'Honneur.

Being a member of the Order is the highest decoration in France, fully deserved by Monique. The nice touch is that, although new members are appointed directly by the President of the Republic, the formal induction can be performed by a current member with some personal connections with the inductee, in this case Nicole El Karoui, herself a Chevalier and a lifelong friend and colleague of Monique's.

## Tuesday, November 23, 2010

### When computer nerds try to communicate

Apropos of nothing, except for the fact that I'm under a couple of deadlines and having to type a lot in LaTeX and transfer many Unix files, it never ceases to amuse me how computer experts really don't have any language skills.

My all-time favorite irritant from LaTeX is:

"Missing $ inserted."

by which the lovely folks who created LaTeX simply mean that the compiler found an unmatched dollar sign, as opposed to the existential puzzle it evokes.
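For the uninitiated, here is a minimal file (my own example) that provokes the message, since an underscore is only legal in math mode:

```latex
\documentclass{article}
\begin{document}
% The subscript below appears outside math mode, so the compiler
% "helpfully" inserts the dollar sign it decides is missing and
% reports: "Missing $ inserted."
Let x_1 be a sequence.
\end{document}
```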

Then of course there is Unix saying that something or other cannot be "canonicalized", which even this blog editor knows is not a word. Memo to Unix creators: the noun is *canon*, the adjective is *canonical*, and the verb is *canonize*.

## Saturday, November 20, 2010

### End of bubbles course

I gave the 4th and final lecture for my Cours Bachelier on bubbles last Friday, following the lecture that I reviewed here (where you can find links to the first two).

The central topic for this lecture was the local martingale approach of Jarrow, Protter and Shimbo (and previous references therein), where bubbles are characterized under the NFLVR (no free lunch with vanishing risk) condition. This is the currently accepted way to express no arbitrage in modern mathematical finance language and is significantly weaker than an equilibrium condition, since it does not require any notion of optimality or market clearing. As such, the results for bubbles in this setting generalize those related to rational bubbles, which are typically derived in equilibrium. On the other hand, it would be nice to use a similar mathematical framework (i.e. semimartingales) to address the models for irrational bubbles that were discussed in the previous lectures. In my view, there is still plenty of low-hanging fruit for mathematicians to pick in this field.
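The textbook example of a bubble in this framework is the inverse three-dimensional Bessel process, a strict local martingale. A quick Monte Carlo sketch (my own toy illustration, not material from the lecture) shows its expectation dropping below its initial value:

```python
import numpy as np

rng = np.random.default_rng(42)

# S_t = 1/|x + W_t| with |x| = 1 and W a 3-d Brownian motion is a strict
# local martingale: a "bubble" in the local martingale framework, since
# E[S_t] < S_0 = 1 even though S is a (local) martingale.
n_paths, t = 100_000, 1.0
x = np.array([1.0, 0.0, 0.0])
w = rng.normal(scale=np.sqrt(t), size=(n_paths, 3))
s_t = 1.0 / np.linalg.norm(x + w, axis=1)

estimate = s_t.mean()
print(f"S_0 = 1.0, Monte Carlo estimate of E[S_1] ~ {estimate:.3f}")
```

The gap between the current price and the conditional expectation of the future price is exactly what the Jarrow-Protter-Shimbo approach identifies as the bubble component.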

The last part of the lecture (and the course) was dedicated to a brief review of the statistical tests proposed to detect bubbles in real data. Since I'm not a statistician, I mostly followed this review, with special emphasis on the work on volatility bounds.
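To make the volatility-bound idea concrete, here is a toy Shiller-style calculation (my own illustration, with made-up parameters, not one of the tests covered in the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# With a constant discount factor b and i.i.d. dividends, the rational price
# p_t = E_t[sum_k b^k d_{t+k}] is constant, while the "ex-post rational"
# price p*_t, computed from realized dividends, fluctuates. Efficient pricing
# implies Var(p) <= Var(p*), so price volatility in excess of that of p* is
# evidence against the fundamental-value model.
b, n = 0.95, 50_000
d = rng.lognormal(sigma=0.3, size=n)          # dividend stream
p_star = np.zeros(n)
for t in range(n - 2, -1, -1):                # p*_t = b * (d_{t+1} + p*_{t+1})
    p_star[t] = b * (d[t + 1] + p_star[t + 1])
p_star = p_star[: n - n // 10]                # drop the terminal-condition bias
p_rational = b * d.mean() / (1 - b)           # constant rational price

print(f"Var(p) = 0 (constant), Var(p*) ~ {p_star.var():.2f}, price level ~ {p_rational:.1f}")
```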

I then concluded with a breakneck-speed tour of famous bubbles throughout history, including the tulip mania, which may or may not have been a bubble, the Mississippi and South Sea bubbles, which might have been just a huge (yet failed) macroeconomic experiment, and the crash of 1929, which almost certainly was a bubble, even under the most optimistic definitions of fundamental values.

All references (and possibly slides and lecture notes in the near future) can be found here.

## Saturday, November 6, 2010

### Bubbles course - part III

I gave the third lecture of my Cours Bachelier at IHP yesterday, which followed the lectures that I described here and here.

I first reviewed the work of Frank Allen and Douglas Gale on the role that financial intermediation has in creating bubbles and crises. The idea is that when investors buy assets with borrowed money, the possibility of defaulting on the original loans makes the equilibrium price higher than it would be if they had to buy with their own money. Essentially, the loan plays the role of a "call option" whose convex payoff drives the asset price up. The key insight is that this type of bubble can be created by uncertainty in the amount of credit itself, rather than in the real economy. They show that in some cases a crisis can occur even when credit is expanding, essentially because it didn't expand as much as necessary to keep fueling the overinflated asset price. In my view this is a neat toy model for the Minsky story about financial instability.
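The call-option analogy can be seen in a two-state toy calculation (the numbers are mine, not Allen and Gale's):

```python
import numpy as np

# An asset pays 0.5 or 1.5 with equal probability, so its fundamental value
# is E[S] = 1.0. An investor who buys at price p entirely with borrowed money
# repays min(p, S) thanks to limited liability, so her expected profit is
# E[max(S - p, 0)] -- exactly a call-option payoff on the asset.
payoffs = np.array([0.5, 1.5])
probs = np.array([0.5, 0.5])
fundamental = float(probs @ payoffs)

def expected_profit(p: float) -> float:
    """Expected profit of a fully levered buyer who can default."""
    return float(probs @ np.maximum(payoffs - p, 0.0))

# Even at p = 1.2 > fundamental, the levered buyer still expects a gain,
# so prices above fundamental value can clear the market.
print(fundamental, expected_profit(1.2))  # 1.0 0.15
```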

For the second part of the lecture, I focused on bubbles caused by heterogeneous beliefs. The essential idea is simple: when short sales are possible, the views of optimists are balanced by those of pessimists and asset prices reflect fundamental value, whereas if there are short-sale constraints, pessimists sit on the sidelines, prices reflect the views of optimists, and a bubble ensues. This mechanism was described for discrete-time models by Harrison and Kreps in a paper on speculative behavior.

One possible source for heterogeneous beliefs is overconfidence, as proposed in a continuous-time model by Scheinkman and Xiong, which I reviewed towards the end of the lecture.

All references for the course can be found here.

## Friday, November 5, 2010

### Monique Jeanblanc's influence in architecture

The mathematical finance group at Evry is so influential that they convinced the local architects to build a *càdlàg* elevator:

## Wednesday, November 3, 2010

### Grasselli x Grasselli

People at the Groupe parisien Bachelier must have a sense of humor, as they scheduled both Martino Grasselli and myself to speak on the same morning this week :)

This should give irrefutable evidence that we are not the same person, or are we ?

## Tuesday, November 2, 2010

### Bubbles course - part II

Following up on this post, I gave the second lecture of my graduate course on asset price bubbles on October 22nd.

I started by discussing how rumors (say, mean-reverting shocks unrelated to fundamentals) can be incorporated into an asset price in a way that is compatible with the fact that observed prices are not *very* forecastable, as explained by Robert Shiller in a paper on social dynamics. I then further explained the idea that prices can deviate from fundamentals because of irrational traders, using a famous paper by Brad DeLong, Andrei Shleifer, Larry Summers and Robert Waldmann on noise trader risk. I concluded with a discussion of the equally famous paper by Andrei Shleifer and Robert Vishny on the limits of arbitrage when investors use other people's money and are subject to performance reviews.

The point is that these "market inefficiencies" can lead to asset prices that deviate a lot from their fundamental values in a way that is different from the rational bubbles discussed in the first lecture. In particular, these inefficiency bubbles are not necessarily "growing bubbles", and therefore not subject to the type of transversality conditions that are typically used to rule out rational bubbles.

Specific mechanisms for the formation of these inefficiency bubbles will be discussed in the next lecture.

All references for the course can be found here.

## Saturday, October 30, 2010

### Stock loans in Harry Potter land

I ended my short visit to the UK by giving a talk on stock loans at the Nomura Seminar Series in Oxford yesterday.

There could not be a better place for me to present about this particular paper. It extends previous results by Xunyu Zhou to incomplete markets, using techniques developed by Vicky Henderson and Thaleia Zariphopoulou, all three of them members of the impressive Oxford-Man Institute. Alas, by a series of unfortunate events, they were all out of the country and could not attend my talk, but I maintain that the place was ideal, if only because it gave me the chance to take pictures of some Harry Potter locations.

## Thursday, October 28, 2010

### Chaos at Imperial College

No, no, I don't mean that they run a particularly disorganized college.

It is just that I presented the most recent version of my work on calibration of chaotic models for interest rates there yesterday. The slides for the talk can be found here, but I have not figured out a way to upload the movies to my website in a concise manner yet, so if you are interested you can write me an e-mail.

As I mentioned in the talk, presenting about chaotic models at Imperial College is a bit like giving a talk at the Sistine Chapel on recent advances in fresco technique. Most of the fundamental work on axiomatic positive interest rates was done by Lane Hughston and his collaborators, so it was a real treat to show them my modest contribution to the subject.

Many new and old friends were in attendance, including my PhD supervisor Ray Streater, as well as my recent PhD student Tsunehiro Tsujimoto, but somehow I missed the opportunity to register this tri-generational encounter with a picture.

## Sunday, October 24, 2010

### Games and options at Paris Nord

So there is irrefutable evidence that the number of readers of this blog is strictly larger than zero: Jean-Michel Courtault told me he got wind that I was spending time in Paris through a colleague who follows the blog and invited yours truly to give a talk at Université Paris Nord.

I used the opportunity to brush up on some old work I did on real options and game theory. The intersection between the two approaches is actually quite tricky, especially in incomplete markets. But I'm getting close to completing two papers on the topic, one in discrete and the other in continuous time, and will post them when they are ready. In the meantime, the slides for the talk can be found here.

## Saturday, October 23, 2010

### Bubbles course

As part of my sabbatical duties here in Paris, I agreed to give a graduate course on asset price bubbles.

The course started on October 15th, and in the first lecture I discussed the theory of "rational bubbles", that is to say, bubbles that can arise in a well-defined rational expectations model simply by being a possible solution of the corresponding homogeneous Euler equation (apart from the usual "fundamental value" term, which is a particular solution associated with the inhomogeneous equation with dividends).

The catch is that the expected value of such rational bubbles must grow exponentially, so they are typically ruled out by some "transversality" condition (such as constraints in the total wealth in the economy). Nevertheless, they are a good place to start studying bubbles, if only because the "irrational" ones are much more difficult to characterize.
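For the record, the bookkeeping behind this goes as follows (standard textbook notation with a constant discount rate, my own summary rather than the lecture's):

```latex
% Euler equation for the price p_t given dividends d_t:
p_t \;=\; \frac{1}{1+r}\,\mathbb{E}_t\!\left[\,p_{t+1} + d_{t+1}\,\right].
% Particular solution: the fundamental value
f_t \;=\; \sum_{k=1}^{\infty} \frac{\mathbb{E}_t\!\left[d_{t+k}\right]}{(1+r)^k}.
% Homogeneous solution: any "bubble" b_t satisfying
\mathbb{E}_t\!\left[b_{t+1}\right] \;=\; (1+r)\,b_t,
% so that p_t = f_t + b_t also solves the Euler equation, with
% \mathbb{E}_0[b_t] = (1+r)^t\, b_0 growing exponentially -- precisely the
% growth that transversality conditions are designed to rule out.
```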

I'll be using this page to post references for the course as it progresses. Eventually there might be slides or even notes available, but don't create any expectation (rational or otherwise) about them.

## Tuesday, October 19, 2010

### Mathematical finance at the old Paris Bourse

It turns out that most things related to mathematical finance in Paris are named after Louis Bachelier. Apart from the Bachelier seminars that I mentioned in this post, there is also the Institut Louis Bachelier, a high-level research center housed at the Palais Brongniart, the original home of the Bourse de Paris.

The name and location couldn't be more perfect, given that Bachelier is the undisputed founder of mathematical finance and used the Bourse as a practical example for his *Théorie de la Spéculation*.

In any case, I went there yesterday for two back-to-back talks (no link available). The first was by Lorenzo Bergomi, who spoke about issues related to estimating asset volatility and correlation in asynchronous markets, and the second was by Emmanuel Gobet, who gave a comprehensive overview of expansion techniques for calculating expectations (e.g. option prices) with several examples drawn from popular models for stock prices.

## Wednesday, October 13, 2010

### The Theory of Interstellar Trade

When I switched from physics to finance, one of the first things I was told was that economics papers have an abnormally long review process. It is said that the CIR paper, for example, was in circulation for 5 years before it was finally published.

But this Krugman paper clearly holds the record: it was written when he was a young assistant professor in 1978 and has just appeared in *Economic Inquiry* this month.

Besides combining two of my passions, the paper is pricelessly funny. Favorite quote so far: "the reader must, of course, be careful not to confuse relative generality with general relativity".

I'll read it carefully and try to come up with a follow-up taking into account what happened in financial markets in the last 32 years!

## Monday, October 11, 2010

### Steven Shreve in Paris

Steve gave a talk at Ecole Polytechnique today describing a model for the optimal way to purchase a large number of shares of an asset against a limit-order sell book of arbitrary shape. This generalizes the recent but already well-known work of Obizhaeva and Wang using clever stochastic analysis arguments.

If you missed his talk today you can still watch it tomorrow at the Petits Déjeuners de la Finance.

After that he will be en route to London where he is going to deliver one of his trademark courses on stochastic calculus. Everyone who has ever attended one of his talks knows why banking types pay big money for his lectures.

## Saturday, October 9, 2010

### Seminaire Bachelier

The different math finance research groups in Paris (from Ecole Polytechnique, Université Paris Dauphine, CREST, Université de Paris VI-VII, Université d'Evry, and Université de Marne-la-Vallée) jointly organize a series of seminars at the Institut Henri Poincaré, at the heart of the Latin Quarter.

For visitors like me, it is a wonderful opportunity to meet students and researchers from all over Paris under the same roof.

Once a year, they dedicate an entire day to talks by PhD students, in many cases giving them the first chance to present their work in front of a friendly but rigorous audience.

I spent the day yesterday attending one of these *journées des doctorants* and was immensely impressed by the quality of the talks and the results presented. It also forced me to step up my French.

## Friday, September 10, 2010

### Systemic Risk Conference in Paris

I'm a little late posting this, but last week I attended a short and sweet one-day conference on systemic risk organized by the Chaires FBF (Fédération Bancaire Française) in Paris. I had already seen the work of Rama Cont, Stéphane Crépey and Andrea Minca at three separate activities at Fields earlier this year, so for me the highlight of the day was the work of Stefano Battiston, who showed a possible mechanism by which increased diversification of risk in a network of banks can lead to *increased* systemic risk.

## Sunday, August 29, 2010

### Summer School in Paris

After a month of relative peace and quiet, I attended my first official scientific activity in Paris, the Third SMAI European Summer School in Financial Mathematics, organized by the good folks at Ecole Polytechnique. The format consisted of senior speakers giving a series of lectures each, interlaced with contributed talks by some of the junior participants.

The general theme of the school was stochastic volatility and its many asymptotics. The main speakers were Huyen Pham, who gave an inspiring, albeit technically demanding, overview of large deviations and applications to finance (notably stochastic volatility in short maturity, but also importance sampling and risk management), Roger Lee, the discoverer of the eponymous moment formula, and Jean-Pierre Fouque, whose lectures were based on a completely new version of his famous book on fast mean-reversion asymptotics.

## Saturday, July 24, 2010

### La vie en rose

In case my readers wonder, I have started my sabbatical in Paris, which explains the sudden silence while I got my feet on the ground. I'll be reporting on whatever quantitative finance activities I happen to come across during my stay here. But since Paris is pretty empty in the summer, I have nothing to report so far.

Talking about reports, I am keeping busy by writing the final report for the thematic program, a version of which might appear as a front-end article in Quantitative Finance.

I'll have more to say about it when it is ready, but for now I'll leave you with the view from my desk at my apartment:

*Pas mal*, hein?

## Friday, July 9, 2010

### IJTAF special issues

The International Journal of Theoretical and Applied Finance will publish a series of special issues dedicated to the workshops that took place during our thematic program. Since I'll be a guest editor for each of the issues, I thought I should give a status update.

For the first one, covering the Workshop on Foundations of Mathematical Finance, we received 8 papers, which are now in different stages of the refereeing process. If all goes well this will be finalized early in the summer and will be ready to go into production.

For the second one, dedicated to the Workshop on Computational Methods in Finance, we received 3 papers (currently under review) and 3 more authors have promised to submit. It is hard to say when the entire review process will be finished, but my hope is to have it ready by the middle of next Fall.

The organizers of the Workshop on Financial Econometrics declined the offer of a special issue, so there is nothing to report on it.

Finally, today is the deadline for paper submissions for the special issue dedicated to the Workshop on Financial Derivatives and Risk Management. So far we have received 2 papers, with 6 more promised. Given the time it took to review the papers for the first special issue, I estimate that this one will not be ready until early next year.

As you can see, I'll have a lot of editorial fun for the next little while.

## Wednesday, June 30, 2010

### This is the end, my only friend, the end

So after 6 months, 4 workshops, 5 industrial-academic forums, 3 public lecture series, 4 graduate courses, 2 seminar series, and 2 international conferences, our thematic program is coming to an end today.

I counted around 180 invited talks during the program, not including the talks at IME and Bachelier, which, although part of the program, took place outside Fields. The slides and audio for the majority of the talks can be found here.

Throughout the program, I experimented with posting comments on this blog and had great fun along the way. At the request of Fields, I will keep the blog up and report on any further developments that might arise in connection with the activities that took place here in the past 6 months. I plan to write about future meetings, papers, sites and people directly or indirectly connected with the program. Hopefully other participants will also become regular contributors to this blog (just send me a message and I'll add you as an author).

In any case, this is your blogger extraordinaire signing off for now, until I find something else to write about. In the meantime, let me thank you all for reading and please keep in touch !

## Sunday, June 27, 2010

### Most exciting Bachelier Congress ever

The 6th World Congress of the Bachelier Finance Society ended yesterday in Toronto, featuring excellent plenary and contributed talks, more than 500 enthusiastic participants, and an abundance of interesting discussions all around. What with the World Cup, the G20 summit and even an earthquake to boot, there was certainly no shortage of excitement during the Congress.

These are a few pictures I took from different vantage points at the Hilton, showing some of the protesters and the tight security around the hotel. What a week !

## Monday, June 21, 2010

### Last Industrial-Academic Forum

The forum on Financial Engineering and Insurance Mathematics took place today at Fields, bringing to conclusion the series of Industrial-Academic Forums that we had during the thematic program.

This was also the last program activity hosted at the Fields building, since the Bachelier Congress that starts tomorrow will take place at the Hilton. I was walking around the downtown core near the hotel and snapped these pictures of the ridiculous fence built up as part of the G20 security:

Welcome to Toronto !!

## Monday, June 14, 2010

### The end is nigh

Not trying to be alarmist or anything - just observing that these are going to be the last two weeks of the thematic program, culminating with the IME Congress from Thursday to Saturday, the Forum on Financial Engineering and Insurance Mathematics next Monday, and the Bachelier Congress from Tuesday to Saturday next week.

Time flies !

## Saturday, June 12, 2010

### Fourth and final graduate course

Dan Rosen delivered an intense graduate course on risk management this week, concluding the series of courses offered during the thematic program. Such courses are a key difference between the program at Fields and similar activities at other places, as they provide an element of continuity for students, postdocs and long-term participants, besides being a great resource for content and information about important aspects of the program.

## Tuesday, June 8, 2010

### Stochastic control fest ends

Nizar Touzi concluded his graduate course today. For the past 6 weeks he delivered an intense and comprehensive exposition of stochastic control, dynamic programming, HJB equations, viscosity solutions, BSDEs, and their applications to finance, complete with guest lectures by Bruno Bouchard, Agnes Tourin and Mete Soner.

Several admin distractions (including a trip to Paris) prevented me from attending the lectures, but I look forward to the monograph that Fields will publish based on his lecture notes.

ps: we also had a celebratory barbecue for all thematic program participants, successfully put together by the Social Activities Committee - Walid Mnif and Arash Fahim. Thank you guys.

## Friday, June 4, 2010

### Quantitative Finance Seminar - Season finale

The 2009-10 season of the Quantitative Finance Seminar Series came to a glorious end this week with two heavyweight talks by Freddy Delbaen and Mete Soner, both from ETH (Zurich).

Freddy gave an insider account of convex risk measures (he is a co-author of the paper that started it all), with a particular emphasis on the requirement of time-consistency, which in the Brownian setting leads to a natural connection with BSDEs and the associated semi-linear PDEs.

Mete then explored yet another application of BSDEs (this time related to fully nonlinear PDEs) in the context of (super-)hedging a claim in markets with uncertain volatility, which provides the motivation to consider probability measures that are not necessarily absolutely continuous with respect to one another.

I did warn it was heavyweight, didn't I ?
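For readers curious about the BSDE connection: in the Brownian setting, a time-consistent dynamic convex risk measure can, under suitable conditions, be represented through a backward SDE with a convex driver $g$. Schematically (this is the standard $g$-expectation formulation, not necessarily the exact setting of either talk):

```latex
\rho_t(X) = Y_t, \qquad
-\,dY_t = g(t, Z_t)\,dt - Z_t\,dW_t, \qquad Y_T = -X .
```

The associated semi-linear PDE then drops out of the nonlinear Feynman-Kac formula.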

## Wednesday, June 2, 2010

### Diffusive visitors seminars

We had two talks in the visitors seminar yesterday, both related to diffusions. Pavel Gapeev from LSE spoke about how to construct Lévy-driven processes by applying an invertible state space transformation to a solvable diffusion (say a CIR process), whereas Jean-François Renaud from Waterloo described a simple yet far-reaching method for discretizing a positive diffusion (say a CIR process). It is nice when back-to-back talks have a commonality like that, so the mind doesn't have to be jogged from one place to another.
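For the curious, here is a sketch of one standard positivity-respecting discretization of a CIR-type diffusion, the full-truncation Euler scheme (a common textbook method; I'm not claiming this is the scheme Jean-François presented):

```python
import math
import random

def simulate_cir(x0, kappa, theta, sigma, T, n, seed=None):
    """Full-truncation Euler scheme for the CIR process
    dX_t = kappa*(theta - X_t) dt + sigma*sqrt(X_t) dW_t.
    Taking the positive part X^+ in both drift and diffusion keeps the
    square root well defined even if the discretized path dips below zero."""
    rng = random.Random(seed)
    dt = T / n
    path = [x0]
    x = x0
    for _ in range(n):
        xp = max(x, 0.0)  # full truncation
        x = x + kappa * (theta - xp) * dt \
              + sigma * math.sqrt(xp) * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

# Example: one path of a mean-reverting variance process
vol_path = simulate_cir(x0=0.04, kappa=1.5, theta=0.04, sigma=0.3, T=1.0, n=250, seed=42)
```

The scheme is biased for coarse grids but converges as the step shrinks, and it never asks for the square root of a negative number.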

## Monday, May 31, 2010

### PSI

As I predicted, with so much going on, I'm falling seriously behind with my blogging duties. But I have good excuses. After a gorgeous weekend dedicated to serious gardening and landscaping, I went to the Perimeter Institute today to serve as an external examiner for their PSI program (yes, the acronym spells out the Greek letter generally used to represent a wave function in quantum mechanics... they are clever that way). As I mentioned here, the Institute hosted a very successful workshop on the foundations of economics last year, and I'm always glad to see them involved in financial math, which was the subject of the essay I examined today, so thanks to John Berlinski for the invitation and for the ride to Waterloo.

I'll be back to Fields dedicated blogging tomorrow though.

## Wednesday, May 26, 2010

### Derivatives workshop: opening remarks

It is hard to believe, but we have arrived at the last workshop in our thematic program. I used my opening remarks on Monday to reminisce about the formation of the scientific committee for the workshop. Back in 2008 we already had Peter Carr, Darrell Duffie and Roger Lee as confirmed members (Rama Cont joined the group later that year), but still lacked a chair for the committee. I realized at the time that the broad scope of the workshop would make it a Bachelier Congress in miniature and decided to invite Lane Hughston, who together with Mark Davis was in the midst of organizing the 2008 congress in London, to chair the scientific committee. I now joke that the expensive dinner (at Indigo on One Aldwych) that I used as a pretext to extend the invitation to him provided the best return on capital of my life, resulting in the spectacular array of speakers and participants that we have here this week.

I'll be using my commuting time in the train to review the talks day-by-day in the next few blog posts, but I apologize in advance if I fall a bit behind with the blogging, since there are just too many interesting interactions to be had this week.

## Saturday, May 22, 2010

### It's all about networking

The Workshop on Financial Networks and Risk Assessment took place at Fields this week as part of the Mitacs International Focus Period on Advances in Network Analysis and its Applications, following the Industrial-Academic Forum on Systemic Stability and Liquidity at the beginning of the week.

Back in 2009, when the cool thing was to hold summits, Mitacs organized a very stimulating event on systemic risk. It was at that point that Michael Lynch and I had the idea of organizing a joint event for both the upcoming Mitacs Focus period and Fields thematic program.

Our idea was to mimic the format of the summit itself, with concentrated research talks by academics and practitioners sparking off the discussion during the first two days in the Forum, followed by a more relaxed set of mini-courses, lectures and problem solving sessions during the three-day workshop. I think we managed to convey the idea to the organizing committees of the two activities, who in turn did a great job in producing the end result we saw this week. Talk about the power of networking !

## Wednesday, May 19, 2010

### Systemic risk forum

We held the long awaited Forum on Systemic Stability and Liquidity this week at Fields. This was one of the activities that I was looking forward to the most in the thematic program, not only because it relates directly to the financial crisis, but because it brought to Fields a diverse group of speakers, including some renowned economists, who are hard to find in a typical mathematics conference.

Wei Xiong opened the proceedings with a talk incorporating financing choices and diverse beliefs into asset pricing and associated booms and crashes. This naturally delighted me because of my recent obsession with finding mathematical formulations for the Minsky story about asset bubbles and fragility of financial systems.

Jennifer Huang followed with a simple model explaining market failures in liquidity provision which allowed her to analyze the effects of several different policy interventions, most of them used in the aftermath of the crisis of 2008.

Rama Cont concluded the morning with his network model for banking systems and the corresponding measures for the default impact of any single bank and the associated systemic risk that it poses on the network. I was particularly interested in this talk, since one of my PhD students, Omneia Ismail, is currently working on an agent-based extension of this model.

Hao Zhou, the first of three speakers from the Fed, presented an intriguing way of measuring systemic risk by constructing a hypothetical insurance contract covering the losses of the entire banking sector. Although such a contract would never be sold by any individual insurance company (for obvious reasons), the marginal contribution of each bank to the hypothetical premium is an indication of the risk it poses to the system.
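To make the allocation idea concrete, here is a toy Monte Carlo sketch (entirely my own illustration: the loss scenarios, attachment point and Euler-style allocation are made-up stand-ins for Zhou's actual methodology):

```python
def systemic_premium_contributions(loss_scenarios, attachment):
    """Price a hypothetical insurance contract paying the banking sector's
    aggregate loss above an attachment point, and allocate the premium to
    individual banks via the expected loss each contributes in the
    scenarios where the contract pays out (an Euler-style allocation)."""
    n_banks = len(loss_scenarios[0])
    premium = 0.0
    contrib = [0.0] * n_banks
    for scenario in loss_scenarios:
        total = sum(scenario)
        if total > attachment:
            premium += total - attachment
            for i, loss in enumerate(scenario):
                contrib[i] += loss
    m = len(loss_scenarios)
    return premium / m, [c / m for c in contrib]

# Two equally likely scenarios, attachment point of 1.0:
premium, per_bank = systemic_premium_contributions([[1.0, 1.0], [0.0, 0.0]], 1.0)
```

A bank's number is then read as the share of the hypothetical premium it is responsible for; banks whose losses cluster in the bad scenarios get the larger share.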

In a different direction, Kay Giesecke proposed to link systemic risk directly with the probability of default of a large fraction of the financial sector and suggested a clever two-step maximum likelihood method to estimate this probability from default events occurring in the entire economy.

The first day ended with a panel discussion featuring Aaron Brown, Michael Gordy, Joseph Langsam and David Rowe, which provided unique insights into key factors leading up to the crisis and a lively discussion on policy, legislation and general banking practices.

Frank Milne started the second day with a *tour de force* on roughly sixty years of attempts to introduce illiquidity into traditional economic models (read general equilibrium *a la* Arrow-Debreu). He knows all the tricks in this trade, and anyone working in this area should bounce their ideas off him to make sure that they are neither reinventing the wheel nor heading towards a dead end. What got him really excited towards the end of his talk was the idea of financial war games, which sits squarely within the agent-based models for financial systems that Omneia and I are working on.

Viral Acharya proposed yet another way to measure systemic risk, based on the degree of under-capitalization of a financial institution. This leads to the concept of Marginal Expected Shortfall, which I still think should have the word "systemic" in it, just so that they can properly call it a MESS (this was my piece of advice to Robert Engle when I heard him talk about it in the econometrics workshop last month).

Itay Goldstein presented a game-theoretical model for credit freezes, where the lending actions of banks reinforce the probability of successful outcomes for the loans of other banks. He showed how this can lead to different equilibria: efficient no-lending (when the economic outlook is sufficiently bad that no bank would recover its loans), efficient lending (when the economic outlook is so good that all loans are fully recovered no matter what), and an interesting regime where both lending and no-lending are possible equilibria, and credit can freeze due to lack of coordination between banks. He then analyzed the effect of different policy interventions according to this model.

The last two talks of the forum were dedicated to the concept of risk appetite. Jon Danielsson provided the theoretical background by explaining that risk appetite is the result of risk preferences and beliefs under constraints, and therefore can vary wildly in the market as constraints move from non-binding to binding and back again. Erkko Etula provided empirical evidence for exactly this type of phenomenon using constraints on the funding liquidity of US financial intermediaries. The associated changes in risk appetite can then be used to forecast changes in exchange rates between the US dollar and most major currencies.
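The coordination logic behind the multiple equilibria in Goldstein's talk can be caricatured in a few lines (my own stylization, not his model): suppose a bank's loan is repaid with probability theta + beta*l, where l is the fraction of other banks that lend, and a bank lends iff that probability covers its required threshold c.

```python
def credit_freeze_equilibria(theta, beta, c):
    """Symmetric pure-strategy equilibria of a toy lending game.
    theta: baseline repayment probability, beta: strength of the lending
    complementarity, c: repayment probability a bank requires to lend."""
    eq = set()
    if theta + beta >= c:  # if everyone lends, lending is a best response
        eq.add("lending")
    if theta < c:          # if no one lends, staying out is a best response
        eq.add("no-lending")
    return eq
```

For an intermediate outlook (say theta=0.3, beta=0.5, c=0.6) both equilibria coexist, which is exactly the region where a credit freeze can happen through failed coordination alone.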

## Monday, May 10, 2010

### Back in town

I was in Paris last week, hence the lack of posting in this blog. Here is what I missed:

- a talk by Eddie Ng in the visitors seminar series

- the second week of lectures in Nizar Touzi's graduate course

- the crash of 2:40 in the NYSE. Ok, this didn't really have anything to do with our thematic program, or did it ? See what I wrote about algorithmic trading here and make up your own mind.

## Saturday, May 1, 2010

### How I became a quant

Don't worry, I didn't really become a quant - I'm still a poor math professor. But this was the title of a panel we had at Fields this Friday, organized jointly by the IAFE and our thematic program.

The panel was preceded by a Program Directors luncheon bringing together for the first time representatives from the local financial math programs and the major Canadian banks to discuss topics of common interest, such as curriculum, scope, duration and format of the different programs.

After the panel we held a Recruiters Reception, where students could network with potential employers in quantitative finance.

An altogether successful and productive day, not to mention that I got to share a ride to the airport with two of the panelists, Raphael Douady and Marco Avellaneda, who are among the most interesting people in this business.

## Friday, April 30, 2010

### Hopf algebras, BSDEs, booms, crashes, and all that

I spent most of the week preparing for my trip to Europe today, so I only kept an eye on what was going on in the thematic program at a distance, hence this all-encompassing but not too detailed blog post.

Anke Wiese spoke in the visitors seminar on Tuesday about approximations of nonlinear systems of SDEs using more general schemes than the ones based on a stochastic Taylor expansion. She showed how a sinh-log series leads to truncation errors which are always better than those of a conventional Taylor series, as well as providing an opportunity to use Hopf algebras in finance, which is altogether very cool.

Nizar Touzi started his concentrated graduate course on stochastic control and BSDEs on Wednesday. He is going to teach for 6 hours every Wednesday for the next six weeks, so this promises to be one for the books (incidentally, he is typing up his lecture notes, so it might even turn into a book...)

The April edition of the Quantitative Finance Seminar series had Jorge Sobehart from Citigroup talking about booms, crashes and market behavior. I had to miss it because of a faculty meeting at McMaster, which is a pity, since his talk fits right into my recent obsession with modeling bubbles in financial markets. But I guess that is where the audio recordings provided by Fields will come in handy.

## Tuesday, April 27, 2010

### Econometrics workshop

The third workshop of our thematic program took place at Fields last week. The focus was Financial Econometrics, and the organizers, led by Yacine Ait-Sahalia, treated us to a stellar lineup of 32 speakers spread over two days of talks, including the third lecture in the Distinguished Lecture Series by Darrell Duffie.

As a result, each speaker (apart from Darrell) had to give a 20-minute talk, which forced them to focus on the important results and cut back on introductory remarks or generalities. I can't think of a better way to provide a broad overview of a very dynamic area of research. The downside of course is that there is no way I can review every single talk as I did for the other two workshops, so I'll restrict myself to the following highlights:

- Robert Engle described a measure of the systemic risk associated with an individual bank. This was developed at NYU, is updated regularly, and is made available on the web for investors and regulators. I pointed out to him that he should change the name of the measure from Marginal Expected Shortfall to **M**arginal **E**xpected **S**ystemic **S**hortfall, since MESS is a much better description of what goes on in the financial system. He can thank me for that when he gets his next Nobel Prize.

- Andy Lo proposed different levels of uncertainty, extending the well known distinction made by Knight between risk and uncertainty to a finer classification. His application of the idea to a physical problem (the harmonic oscillator) was entertaining, since ideas usually flow from physics to economics, but rarely the other way around.

- Jean Jacod gave an impeccable presentation using only chalk and blackboard and got a round of applause when he said that he never managed to learn PowerPoint (that alone was worth attending the entire workshop).

## Monday, April 26, 2010

### Darrell Duffie, the Dark Lord

And by that I don't mean that Darrell replaced Lord Voldemort as the most powerful dark wizard of all time. What I do mean is that he is the person who knows the most about Dark Markets, as he demonstrated during the Distinguished Lecture Series at Fields last week.

In the first lecture Darrell described how over-the-counter markets differ from centralized ones, in particular with respect to the transfer of capital, which tends to be slow in the former, resulting in asset prices which can show a persistent deviation from "fundamentals". He also remarked that prices for the same asset at the same time can show a large dispersion, since agents trade bilaterally, with no access to information that can reveal a unique "fair" price at the time of trade. By way of examples, he showed intriguing evidence from the time signature of prices for treasury bonds (that is, how they vary in time near the moment of issuance), as well as cross-sectional dispersion in prices. Towards the end of the lecture, he commented on the benefits of clearing houses for derivative contracts.

Having laid out the intuition for OTC markets, Darrell used his second lecture to explain an idealized mathematical model for a continuum of agents meeting for bilateral trades at random times according to a given intensity. Through heavy use of an infinite population, the law of large numbers, and independence, he was able to derive an evolution equation (a version of the Boltzmann equation) for the "types" of agents in the population. Since at equilibrium bids and types are in a one-to-one correspondence, this evolution equation describes how information "percolates" through the population via an infinite series of double auctions.
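In the published information-percolation work (with Gaston Giroux and Gustavo Manso), the simplest version of this evolution equation for the cross-sectional distribution $\mu_t$ of information types reads, if I recall correctly:

```latex
\frac{d\mu_t}{dt} \;=\; \lambda \left( \mu_t * \mu_t - \mu_t \right),
```

where $*$ denotes convolution (two matched agents pool their information, so types add) and $\lambda$ is the meeting intensity; the lectures covered richer variants.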

In the third and final lecture, which took place during the Financial Econometrics workshop, Darrell focused on the interbank market for Fed Funds and used a logit model to describe the probability of a transaction occurring between two banks, fitting the model to a data set comprising 225 million observations for 8000 banks in 2005.
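For readers who have never fitted one: a logit model puts P(transaction) = 1/(1+e^{-(b + w·x)}) and maximizes the log-likelihood. A bare-bones pure-Python fit by gradient ascent (illustrative only; Duffie's specification, covariates and estimation at that scale are of course far more sophisticated):

```python
import math

def fit_logit(X, y, lr=0.1, steps=2000):
    """Logistic regression by batch gradient ascent on the log-likelihood.
    X: list of feature vectors, y: list of 0/1 outcomes.
    Returns (intercept, weights)."""
    n = len(X)
    b, w = 0.0, [0.0] * len(X[0])
    for _ in range(steps):
        gb, gw = 0.0, [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))
            gb += yi - p                # gradient of log-likelihood w.r.t. b
            for j, xj in enumerate(xi):
                gw[j] += (yi - p) * xj  # ... and w.r.t. w_j
        b += lr * gb / n
        w = [wj + lr * gj / n for wj, gj in zip(w, gw)]
    return b, w

# Toy data: a transaction becomes more likely as the single covariate grows
b, w = fit_logit([[0.0], [1.0], [2.0], [3.0]], [0, 0, 1, 1])
```

With 225 million observations one would of course use a specialized solver rather than this toy loop, but the likelihood being maximized is the same.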

## Friday, April 23, 2010

### The Frittelli lectures

Marco Frittelli was here this week and gave two guest lectures in my graduate course. In his first lecture he reviewed the theory of risk measures, with particular emphasis on their representation in terms of the bi-conjugate functional given by the Fenchel-Moreau theorem. This leads naturally to the study of dual systems (other than the classical setting with bounded random variables and their dual set "ba"), and in particular Orlicz spaces. From there it was just a small leap into Banach lattices and the study of order continuous functionals. In the end it all came back to risk measures, for example by showing that the Fatou property is nothing but continuity from below, etc.

The focus of his second lecture was utility maximization, where he showed that allowing for generally unbounded price processes and contingent claims naturally leads again to Orlicz spaces, in the sense that admissible portfolios have losses controlled by random variables that are compatible with the given utility function, and therefore belong to a specific Orlicz space associated with it. From there it doesn't take long to realize that the indifference price of a claim gives rise to a risk measure on an Orlicz space.

In short, the two lectures were a prime example of elegance and internal consistency, and a great way to conclude the graduate course. Apart from that, Marco also discovered the best cappuccino in Toronto. But since this information is valuable, you have to contact me personally to obtain it :)

## Thursday, April 22, 2010

### Goldman in my mind

Xianhua Peng spoke about default clustering and CDO valuation in the visitors seminar talk this week. I was able to follow the part of the presentation in which he reviewed the key literature on the subject, pointing out the main disadvantages of many approaches, in particular the difficulty of fitting the marginal default rates of each name in a CDO while at the same time generating correlations strong enough to match the observed tranche prices. He then introduced his own model, in collaboration with Steve Kou, who was his PhD advisor at Columbia, based on cumulative default intensities.

At that point the talk got a bit too technical for me, so when he mentioned Goldman Sachs I let my mind drift. For the record, I think the SEC case is weak and was pushed in a hurry to create momentum for FinReg reform. While not necessarily a bad thing if it achieves the purpose of passing reform, I think this strategy creates the wrong type of outrage.

## Sunday, April 18, 2010

### Credit-Hybrid risk Forum

The third in our series of Industrial-Academic forums took place at the end of last week at Fields. I didn't attend many of the talks, because of a combination of train delays, family appointments and being busy looking after Robert Merton, so this post is not as detailed as I would normally have liked.

A quick glance at the program reveals that counterparty risk in general, and CVA in particular, was the prominent theme of the forum, followed closely by game options. This made for a very diverse forum, since the former is mostly grounded in practical day-to-day considerations for both banks and regulators (case in point: how to deal with so-called "wrong-way risk", which doesn't strike me as a groundbreaking theoretical question, but somehow gets everybody else excited), whereas the latter is as theoretical as it gets (case in point: as Jan Kallsen showed, game options incorporate both European and American options as special cases, and pricing and hedging them quickly leads one to think deeply about the limitations of arbitrage and replication, utility-based approaches, and so on).

A third pillar for the forum was the joint modeling of equity and credit markets, which sits between the other two in the theory-practice spectrum. For example, I missed Claudio Albanese's talk but understood from the comments and discussion that he is agnostic about models, but extremely concerned about the computational paradigms (third and fourth level BLAS, whatever that means) that are necessary to calibrate them to all available credit and equity derivatives. On the other hand, Tom Hurd, Julien Turc and Rafael Mendoza-Arriaga all have subtly different ways to jointly model credit and equity, ultimately relying on a deeper understanding of the capital structure of a company.

## Saturday, April 17, 2010

### Merton and me

Robert Merton delivered the Nathan and Beatrice Keyfitz Lectures in Mathematics and the Social Sciences this past Thursday to a delighted audience of about 300 people who packed the large auditorium at the Bahen Center. He touched on many important aspects of the financial crisis, including the structural challenges posed by the intrinsic "put option" embedded in any risky loan, the role of composition when seemingly prudent actions taken by individuals in isolation lead to large systemic risks, and the limitations of mathematical models, which in his view cannot be judged separately from their users and applications. Contrary to populist clamors for "common sense", he said that the crisis accentuates the need for *more* quantitative research in finance, since none of these problems will go away by a magical return to "simplicity" in the financial world. Needless to say, I agree with all that, otherwise we would not have gone ahead with a thematic program on this subject.

Apart from the lecture itself, I had the privilege to host him at Fields during the afternoon and to take him out for dinner afterward together with other distinguished guests associated with the Institute. We were unanimously impressed by how engaging he was, not only by being ready to share personal experiences from the vantage point he occupied for the past 40 years, but also by being genuinely interested in the research ideas that we timidly put forward for discussion.

At least for me, it was undoubtedly the high moment of the program so far, the kind of stuff that makes it all worthwhile.

## Wednesday, April 14, 2010

### Monte Carlo for PDEs on steroids

Fresh from his PhD defense - itself a memorable event that took place at Fields last week, with a star-filled examining committee consisting of two professors transplanted from France, one from Iran and two from Canada - Arash Fahim used the visitors seminar this week to describe the contents of his thesis to the rest of us.

Together with Nizar Touzi and others, he developed a probabilistic scheme to numerically approximate the solution of a certain type of fully nonlinear PDE. This generalizes the well-known use of Monte Carlo to solve linear parabolic PDEs through the Feynman-Kac formula, as well as the semi-linear case, which corresponds to approximating the solution of a BSDE. In the fully nonlinear case one is led to consider the so-called 2BSDEs, as well as clever ways to approximate derivatives inside expectations, leading to what Nizar likes to call a Monte Carlo/Finite Differences scheme.
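The linear case at the bottom of this hierarchy is easy to sketch. The toy example below (my own illustration, not taken from the thesis) estimates u(t,x) = E[g(X_T) | X_t = x] for dX = σ dW, which by the Feynman-Kac formula solves the heat-type PDE u_t + ½σ²u_xx = 0 with terminal condition g:

```python
import math
import random

def feynman_kac_mc(x, t, T, sigma, payoff, n_paths=100_000, seed=0):
    """Monte Carlo estimate of u(t, x) = E[payoff(X_T) | X_t = x] for
    dX = sigma dW.  By the Feynman-Kac formula, this u solves the linear
    parabolic PDE u_t + 0.5 * sigma**2 * u_xx = 0 with u(T, .) = payoff."""
    rng = random.Random(seed)
    tau = T - t
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        total += payoff(x + sigma * math.sqrt(tau) * z)
    return total / n_paths

# For payoff(x) = x**2 the exact solution is u(t, x) = x**2 + sigma**2 * (T - t).
estimate = feynman_kac_mc(1.0, 0.0, 1.0, 0.5, lambda y: y * y)
exact = 1.0 + 0.25
```

The fully nonlinear schemes in the thesis replace the plain average with cleverly weighted expectations that also recover the derivatives appearing in the PDE.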

Arash showed several convincing numerical examples, but it is clear that this is a vast area with plenty of room for a lot more work.

## Tuesday, April 13, 2010

### Energy forum

I'm a bit late in commenting on the forum on Commodities, energy markets and emissions trading that took place at Fields last week. Like the Operational Risk forum last month, this was another great learning opportunity for an outsider like myself. So in the same spirit as my blog post on that forum, here is a summary of what I learned from this one:

- commodity markets are fertile ground for exotic derivatives, because physical constraints (say the rate at which you can pump oil in and out of a tanker) influence the design and valuation of even the most basic contracts;

- most things (contracts, hedging strategies, etc) depend on spreads, so correlations (between dates, locations, type of fuel, etc) play an essential role. It becomes important to know when one can "Margrabe the world away";

- different players in the market (say producers and retailers) have different likes and dislikes (say towards spikes in prices and their persistence), so the story behind the change of measure from physical to risk-neutral probabilities is an elaborate one (for example, both the long-term level and the speed of mean reversion might change from one measure to another);

- enforcing consistency constraints while modeling and fitting implied volatility curves and surfaces across time is as difficult a problem for commodities as it is for interest rates, and perhaps might benefit from common techniques;

- there exist reasonably advanced models for carbon emission markets and they perform relatively well when compared to actual data for the first phase of the EU-ETS market, whereas a lot needs to be done to model newer features in these markets (multi-periods, banking of allowances, CERs, etc)

- electricity markets are very developed and well understood both in theory and practice. As a consequence, linking them to carbon emission markets in a unified way might shed some light.
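For reference, "Margrabing the world away" rests on the classical Margrabe (1978) exchange-option formula, where only the volatility of the spread matters. Here is a minimal sketch (the function name is mine):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def margrabe(s1, s2, sigma1, sigma2, rho, T):
    """Margrabe (1978) price of the option to exchange asset 2 for asset 1
    at maturity T, i.e. a claim paying max(S1_T - S2_T, 0).  Only the
    volatility of the spread enters, which is why strong correlations can
    make the option nearly worthless ('Margrabe the world away')."""
    sigma = math.sqrt(sigma1**2 + sigma2**2 - 2.0 * rho * sigma1 * sigma2)
    if sigma == 0.0:
        return max(s1 - s2, 0.0)
    d1 = (math.log(s1 / s2) + 0.5 * sigma**2 * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return s1 * norm_cdf(d1) - s2 * norm_cdf(d2)
```

Cranking the correlation up towards one drives the spread volatility, and with it the option value, towards zero.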

The forum concluded with an extremely lively panel (sadly not available in audio recording). I tried my best to steer the discussion towards the benefits and flaws of carbon emission markets, but what really caught the attention of most panelists and people in the audience was the relation between academia and industry. The highlight for me was Nicole El Karoui standing up at the back of the room and delivering a passionate defense (in French) of the independence of academic research. She said that although she learned important lessons from the markets (for example the importance of model robustness, something that academics seldom think about), we need to keep a healthy distance and reject the notion that markets are always right. Bravo, Nicole!

## Friday, April 9, 2010

### Coxeter Lectures

Nicole El Karoui delivered the Coxeter Lecture Series for our thematic program this week. The theme of the lectures was backward stochastic differential equations (BSDEs), a vast and deep topic to which she has made groundbreaking contributions over the past couple of decades.

The first lecture was an overview of the main definitions, results and classical applications to finance, along the lines of her well-known 1997 paper with Peng and Quenez. It went about half an hour overtime, but she is so passionate about the subject that nobody minded much.

I could not attend the second lecture (I was organizing a "help your child with math" event at my son's school the same evening), but was told by other participants that it covered the more mathematical aspects of the theory, showing what goes on under the hood when one tries to prove things like existence, uniqueness and stability of solutions of BSDEs.

In the third and final lecture, Nicole concluded a discussion of numerical schemes for approximating solutions of BSDEs by showing what kind of errors need to be controlled, together with a few convergence results. She then switched gears to finance again and described how certain BSDEs can be viewed as dynamic risk measures - a challenging new focus of intense research in the area.
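To give a flavor of what such numerical schemes look like, here is a minimal toy example of my own (not one of the schemes Nicole presented): an explicit backward Euler iteration for a BSDE whose driver depends only on Y, on a recombining binomial approximation of Brownian motion.

```python
import math

def bsde_backward_euler(g, driver, T=1.0, n_steps=100):
    """Explicit backward Euler for the simple BSDE
        Y_t = g(W_T) + int_t^T f(Y_s) ds - int_t^T Z_s dW_s
    on a recombining binomial approximation of Brownian motion: the
    conditional expectation at a node is the average of its two
    successors, and the driver f is added explicitly with weight dt."""
    dt = T / n_steps
    sdt = math.sqrt(dt)
    # terminal layer: node k at level n sits at W = (2k - n) * sqrt(dt)
    y = [g((2 * k - n_steps) * sdt) for k in range(n_steps + 1)]
    for level in range(n_steps - 1, -1, -1):
        y = [0.5 * (y[k] + y[k + 1]) + driver(0.5 * (y[k] + y[k + 1])) * dt
             for k in range(level + 1)]
    return y[0]

# With the linear driver f(y) = -r*y the solution is the discounted
# expectation Y_0 = exp(-r*T) * E[max(W_T, 0)] = exp(-r*T) / sqrt(2*pi).
r = 0.1
y0 = bsde_backward_euler(lambda x: max(x, 0.0), lambda y: -r * y)
exact = math.exp(-r) / math.sqrt(2.0 * math.pi)
```

The error analysis Nicole discussed is precisely about controlling the two approximations stacked here: the discretization of the driver integral and the approximation of the conditional expectations.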

## Wednesday, April 7, 2010

### All you ever wanted to know about Levy processes

Well, maybe not all, but pretty close...

Alexey Kuznetsov gave the visitors seminar this week on the theme above. He showed how to efficiently compute the probability distributions for all sorts of functionals (minimum, maximum, first passage time, last maximum before a first passage time, time of the last maximum before a first passage time, etc...) for a class of Levy processes that he says is as natural as CGMY, but with much nicer analytic properties.

I knew of Alexey's fascination with Levy processes (among other things) ever since his first postdoc at McMaster several years ago. He has become a world expert on computing the kind of functionals mentioned above, engaging in collaborations with probability heavyweights such as Andreas E. Kyprianou. A pleasure to watch.

## Thursday, April 1, 2010

### Quantitative Finance Seminars - March edition

We celebrated the end of the first half of our thematic program with the March installment of the Quantitative Finance Seminar Series.

Dilip Madan is thinking hard about the unintended consequences of the limited liability paradigm, the very foundation of publicly traded firms, in the presence of financial practices that lead to unbounded exposures to risk (think naked short sales by hedge funds). As he explained, this implicitly leads to the appearance of a "taxpayer put", which can be extremely valuable (of the order of hundreds of billions of dollars for major US banks) and yet is entirely absent from traditional valuations of financial institutions. His proposal is to introduce a capital requirement that offsets the perverse incentives of this implicit put option. As FinReg marches through the US Congress, nothing could be more topical.
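For a stylized picture of such an implicit put, consider the classical Merton structural-model put on the firm's assets struck at the face value of its debt (my own illustrative stand-in, NOT Madan's capital-requirement construction):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def taxpayer_put(asset_value, debt_face, sigma, r, T):
    """Black-Scholes put on the firm's assets struck at the face value of
    its debt -- the classical Merton structural-model put, used here as a
    stylized stand-in for the implicit guarantee that limited liability
    extracts from taxpayers.  (Illustrative only.)"""
    d1 = (math.log(asset_value / debt_face)
          + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return (debt_face * math.exp(-r * T) * norm_cdf(-d2)
            - asset_value * norm_cdf(-d1))
```

Note how the put value grows with asset volatility: exactly the perverse incentive that a corrective capital requirement would need to offset.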

Stan Uryasev based his entire talk on what he called a "quadrangle", with risk, deviation, regret and error in each of the four corners and the arrows linking them representing specific procedures to go from one corner to another (things like minimization, taking expectations, etc). Just in case you find this too abstract, he showed how it all works for VaR and CVaR, and also advertised his own company, which runs funds based on the algorithm he described. Later during dinner I observed that one of his funds had over 100% return in 2008, which was better than Renaissance, which posted about 80% return that year. Dilip then remarked that this means very little, since *everyone* made a lot of money in 2008. Ok...
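The VaR/CVaR corner of the quadrangle is easy to make concrete. The sketch below (my own toy example, with invented names) computes empirical CVaR through the Rockafellar-Uryasev minimization formula:

```python
import math
import random

def cvar(losses, alpha=0.95):
    """Empirical CVaR via the Rockafellar-Uryasev representation
        CVaR_alpha(L) = min_c { c + E[(L - c)^+] / (1 - alpha) },
    whose minimizer c is VaR_alpha.  We evaluate the objective at the
    empirical alpha-quantile rather than running a numerical search."""
    n = len(losses)
    c = sorted(losses)[int(math.floor(alpha * n))]   # empirical VaR
    return c + sum(max(l - c, 0.0) for l in losses) / ((1.0 - alpha) * n)

# Sanity check on simulated standard-normal losses: the true CVaR at
# alpha = 0.95 is phi(z_0.95)/0.05, roughly 2.06.
rng = random.Random(0)
sample = [rng.gauss(0.0, 1.0) for _ in range(20_000)]
cvar_95 = cvar(sample, alpha=0.95)
```

The same objective, viewed as a function to minimize jointly over c and portfolio weights, is what makes CVaR so convenient for the optimization procedures sitting on the quadrangle's arrows.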

## Wednesday, March 31, 2010

### Computational workshop - final day (a week later)

This post concludes my day-to-day review of the workshop on Computational Methods that took place at Fields last week.

As could be inferred from his informal presentation on GPUs the previous day, Mike Giles does heavy-duty numerical computations with a vengeance. He described a Monte Carlo method implemented on different levels of resolution in order to achieve a prescribed accuracy and showed how to apply it to challenging exotic option pricing and sensitivity calculations, as well as a stochastic PDE example.
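The multilevel idea can be sketched in a few lines. The example below is a stripped-down version of my own (fixed sample sizes per level rather than Giles' optimal allocation, and plain Euler stepping for GBM):

```python
import math
import random

def mlmc_estimate(payoff, s0, r, sigma, T, levels=5, n_samples=2000, seed=0):
    """Stripped-down multilevel Monte Carlo for E[payoff(S_T)] under
    geometric Brownian motion with Euler time stepping.  Level l uses
    2**l steps, and the telescoping sum
        E[P_L] = E[P_0] + sum_{l=1}^{L} E[P_l - P_{l-1}]
    is estimated with coarse and fine paths driven by the SAME Brownian
    increments, so the correction terms have small variance."""
    rng = random.Random(seed)

    def coupled_terminal(n_fine):
        # one fine path (n_fine steps) and the coarse path (n_fine // 2
        # steps) built from the same increments, two at a time
        dt = T / n_fine
        s_fine, s_coarse, dw_pair = s0, s0, 0.0
        for i in range(n_fine):
            dw = rng.gauss(0.0, math.sqrt(dt))
            s_fine += s_fine * (r * dt + sigma * dw)
            dw_pair += dw
            if i % 2 == 1:
                s_coarse += s_coarse * (r * 2 * dt + sigma * dw_pair)
                dw_pair = 0.0
        return s_fine, s_coarse

    estimate = 0.0
    for level in range(levels + 1):
        acc = 0.0
        for _ in range(n_samples):
            if level == 0:
                dw = rng.gauss(0.0, math.sqrt(T))
                acc += payoff(s0 * (1.0 + r * T + sigma * dw))
            else:
                s_fine, s_coarse = coupled_terminal(2 ** level)
                acc += payoff(s_fine) - payoff(s_coarse)
        estimate += acc / n_samples
    return estimate

# Rough sanity check against the Black-Scholes call value (about 10.45
# for these parameters):
est = mlmc_estimate(lambda s: max(s - 100.0, 0.0), 100.0, 0.05, 0.2, 1.0)
```

The point of the real method is that most samples are spent on the cheap coarse levels, with only a handful of expensive fine paths needed for the small corrections.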

Peter Forsyth started by explaining what GMWBs (guaranteed minimum withdrawal benefits) are and introduced a penalty method to price them numerically by solving the associated HJB equations. Interestingly, his results were not sensitive to the size of the penalty term, which silences a common criticism of such methods. Even more interestingly, we learned at the end of the talk that policyholders in Hamilton tend to exercise their guarantees optimally (well, at least in one historical example), thereby wreaking havoc on insurance companies, which routinely underprice such policies betting on consumers' sub-optimality.

Nizar Touzi described his probabilistic approach for solving fully nonlinear PDEs. As is well known, probabilistic methods can be used to solve second order linear parabolic PDEs via the Feynman-Kac formula, which can then be approximated by Monte Carlo. For quasi-linear PDEs, a similar approach leads to a numerical scheme for solving BSDEs. What Nizar showed is how the scheme can be generalized to the fully nonlinear case, leading to a scheme that involves both Monte Carlo and finite differences, but without relying directly on BSDEs.

Philip Protter brought the workshop to a conclusion with a talk on absolutely continuous compensators. Now, these are objects that show up naturally in the study of totally inaccessible stopping times (such as those used in the majority of reduced-form credit risk models), so the motivation for studying them is perfectly understandable. Not so the machinery necessary to appreciate the results... As another participant observed, it is likely that about 2 people in the audience knew what the Levy system of a Hunt process is (don't count me as one of them). In the end, the message I got from the talk is that any good old totally inaccessible stopping time that has any chance of making it into a financial model will likely have an absolutely continuous compensator - but it is nice to know that there are people proving such results. Before I finish: I used to say that Philip is the Woody Allen of mathematics, but the analogy is no longer valid: Woody is not that funny anymore, whereas Philip is in great form.

## Tuesday, March 30, 2010

### Computational workshop - second day (a week later)

During the panel discussion on the first day of the workshop, Jim Gatheral pointed out that "for some reason" exotic derivatives have gone out of fashion during the past two years, with the result that a whole bunch of talented quants had to divert their brain power to something else, in particular algorithmic trading. Naturally, the workshop organizers were aware of that and algorithmic trading was the predominant topic of the second day. I suspect this was one of the reasons for changing the original name of the workshop from "numerical" to "computational" methods in finance - a very appropriate generalization.

The first talk by Ciamac Moallemi allowed me for the first time to properly understand the dynamics of a limit order book, mainly because he used a deterministic fluid model, thereby developing a good deal of intuition before actually solving the optimal execution problem in such a model.

In impeccably characteristic style, Jim Gatheral followed with a review of the most well-known optimal execution models in the literature, adding personal touches along the way (such as re-deriving closed-form optimal policies by Euler-Lagrange instead of HJB) and leading up to his own new results about the possibility of price manipulation. As anyone who has read his volatility surface book knows, Jim is not a fan of unnecessarily complicated notation, but not afraid of deep mathematics either. As a result, everything that he writes manages to be both readable and rigorous.
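To illustrate the kind of closed form such Euler-Lagrange derivations produce, here is the classical Almgren-Chriss liquidation trajectory (a textbook example of my own, with invented names, not Jim's new manipulation results):

```python
import math

def liquidation_schedule(x0, kappa, T, n_points=11):
    """Closed-form optimal trajectory in the classical Almgren-Chriss
    model: minimizing temporary-impact cost plus a running variance
    penalty yields the Euler-Lagrange ODE x'' = kappa**2 * x, whose
    solution with x(0) = x0 and x(T) = 0 is
        x(t) = x0 * sinh(kappa*(T - t)) / sinh(kappa*T),
    where kappa**2 = (risk aversion * variance) / impact coefficient."""
    times = [i * T / (n_points - 1) for i in range(n_points)]
    return [x0 * math.sinh(kappa * (T - t)) / math.sinh(kappa * T)
            for t in times]

# Liquidate a million shares over one day with kappa = 1.5:
schedule = liquidation_schedule(1_000_000.0, 1.5, 1.0)
```

Larger kappa (more risk aversion or cheaper impact) front-loads the selling; kappa near zero recovers the straight-line TWAP schedule.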

Petter Kolm shifted the focus of the discussion by concentrating on the "buy-side" of optimal execution, that is, incorporating the views of a portfolio manager into the execution problem. Since these bring different objectives (maximizing returns over a long time horizon, improving diversification, etc), he showed that traditional execution policies are generally sub-optimal. In particular, he prescribed that rebalancing should be done more often than traditionally suggested, and predicted a corresponding growth in research and practice of multi-period portfolio optimization.

Chris Rogers reverted back to numerical methods and introduced a way to solve optimal stopping problems using convex approximations to a convex value function. As another participant commented during the break, you might not be clever enough to exactly reproduce what Chris does, but his presentations always make perfectly clear what is going on each step of the way. He is also very good at identifying fatal flaws in other people's work (I personally tremble whenever he is listening to my talks), but unlike other acute critics, he also applies the same level of rigor to his own work. As a result, he can be disconcertingly honest in pointing out weaknesses of his proposed methods. My impression is that it is too early to pass judgment on this particular method, but at least all the advantages and disadvantages were clearly spelled out.

In an informal presentation after lunch, Mike Giles gave a fascinating overview of the use of GPU (graphic processing units) in quantitative finance. It was highly entertaining to hear that banks in Canary Wharf had exhausted the energy capacity available to power up their computer (with any new power generation being committed to the 2012 Olympic Games), so that they face the choice of either migrating to less power-hungry GPUs (and having to rewrite all their code) or build new data centers in the periphery of London.

In the context of portfolio credit risk, Kay Giesecke showed how to use importance sampling to efficiently simulate rare events and calculate several related risk measure. Liming Feng concluded the day by introducing a discrete Hilbert transform to calculate option prices for several exotic options (bermuda, loockbacks, etc) in a variety of Levy models with surprisingly good error estimates. Apologies for being brief on these last two talks, by I'm running out of time for this post.

The first talk by Ciamac Moallemi allowed me for the first time to properly understand the dynamics of a limit order book, mainly because he used a deterministic fluid model, thereby developing a good deal of intuition before actually solving the optimal execution problem in such model.

In impeccably characteristic style, Jim Gatheral followed with a review of the most well-known optimal execution models in the literature, adding personal touches along the way (such as re-deriving closed-form optimal policies by Euller-Lagrange instead of HJB) and leading up to his own new results about the possibility of price manipulation. As anyone who ready his volatility surface book, Jim is not a fan of unnecessarily complicated notation, but not afraid of deep mathematics either. As a result, everything that he writes

*looks*very simple, but moving from one equation to the next can require unexpected effort (if you really want to understand what is going on, that is).Petter Kolm shifted the focus of the discussion by concentration on the "buy-side" of optimal execution, that is, incorporation the views of a portfolio manager into the execution problem. Since these bring different objectives (maximizing returns over a long time horizon, improve diversification, etc), he showed that traditional execution policies are generally sub-optimal. In particular, he prescribed that rebalancing should be done more often than traditionally suggested, and predicted a corresponding growth in research and practice of multi-period portfolio optimization.

Chris Rogers reverted back to numerical methods and introduced a way to solve optimal stopping problems using convex approximations to a convex value function. As another participant commented during the break, you might not be clever enough to exactly reproduce what Chris does, but his presentations always make perfectly clear what is going on each step of the way. He is also very good at identifying fatal flaws in other people's work (I personally tremble whenever he is listening to my talks), but unlike other acute critics, he also applies the same level of rigor to his own work. As a result, he can be disconcertingly honest in pointing out weaknesses of his proposed methods. My impression is that it is too early to pass judgment on this particular method, but at least all the advantages and disadvantages were clearly spelled out.

In an informal presentation after lunch, Mike Giles gave a fascinating overview of the use of GPU (graphic processing units) in quantitative finance. It was highly entertaining to hear that banks in Canary Wharf had exhausted the energy capacity available to power up their computer (with any new power generation being committed to the 2012 Olympic Games), so that they face the choice of either migrating to less power-hungry GPUs (and having to rewrite all their code) or build new data centers in the periphery of London.

In the context of portfolio credit risk, Kay Giesecke showed how to use importance sampling to efficiently simulate rare events and calculate several related risk measure. Liming Feng concluded the day by introducing a discrete Hilbert transform to calculate option prices for several exotic options (bermuda, loockbacks, etc) in a variety of Levy models with surprisingly good error estimates. Apologies for being brief on these last two talks, by I'm running out of time for this post.

## Monday, March 29, 2010

### Computational workshop - first day (a week later)

As promised, here is a day-by-day look back at the workshop on Computational Methods in Finance that took place at Fields last week.

The majority of the talks on the first day consisted of what I would call "traditional" numerical methods (for lack of a better name), namely, the thorough investigation of very clever ways to obtain numerical solutions to challenging problems.

Ralf Korn opened the workshop with a very elegant overview of binomial methods. To tackle the tricky problem of matching the tree parameters to a complicated correlation structure in high dimensions, he proposed a universal decoupling method based on a diagonalization of the correlation matrix, which reduces the matching problem to several uncorrelated one-dimensional trees, for which well-refined methods already exist.
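Though I only saw the talk, the decoupling idea itself is easy to sketch. Here is a minimal illustration (my own reconstruction with a made-up 3-asset correlation matrix, not Korn's code or parameters):

```python
import numpy as np

# Illustrative 3-asset correlation matrix (made-up numbers).
corr = np.array([[1.0, 0.6, 0.3],
                 [0.6, 1.0, 0.5],
                 [0.3, 0.5, 1.0]])

# Diagonalize: corr = V @ diag(w) @ V.T with V orthogonal,
# since corr is symmetric positive definite.
w, V = np.linalg.eigh(corr)

# In the rotated coordinates y = V.T @ x the covariance is diagonal,
# so the components are uncorrelated and each one can be matched
# with its own standard one-dimensional tree.
rotated = V.T @ corr @ V
print(np.round(rotated, 12))
```

The eigenvalues `w` give the variances that each one-dimensional tree has to match in the rotated coordinates.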

Kumar Muthuraman followed with a moving boundary method to solve free-boundary problems: start with a blatantly suboptimal boundary (often suggested naturally by the problem at hand) and systematically improve it in the region where the associated variational inequality is violated until the true free boundary is well approximated.

John Chadan gracefully replaced Garud Iyengar, who could not attend, and explained how integral equation methods can lead to very detailed results for the American put option problem, in particular the fact that the exercise boundary might fail to be convex when dividends are higher than the risk-free interest rate.

In the context of portfolio optimization, Lorenzo Garlappi showed how a careful decomposition of the state variables into an observable component and a random error, combined with a Taylor expansion of the value function (expressed in monetary terms), can lead to very accurate numerical approximations.

After lunch, a panel consisting of Jim Gatheral, Chris Rogers, Ernst Eberlein and Jeremy Staum discussed current challenges in quantitative finance, especially in light of the crisis of 2008.

Jeremy Staum then followed with a comprehensive set of proposals for the use of repeated simulations in finance. It is hard to do justice to his far-reaching talk in just one line, so I'll just mention the key concepts of Database Monte Carlo and Metamodeling, while hoping to have more chances to study this in the future.

Birgit Rudloff concluded with perhaps the most technically demanding talk of the day (for me at least), investigating the effect of transaction costs on risk measures. My mind went into a swirl right at the beginning of her talk, since many of the familiar concepts such as arbitrage, portfolio value, and risk measure immediately became special "scalar" cases of their multi-dimensional generalizations (a consequence of not being permitted to calculate the value of a portfolio as a simple scalar product at each point in time due to the presence of transaction costs). I partially recovered some sanity with the 2-dimensional example at the end, where nice pictures can be drawn on the board, but by that point everyone was rightly ready for a glass of wine at the reception that followed.

## Sunday, March 28, 2010

### Operational Risk Forum

We kicked off the series of Industrial-Academic Forums with the topic of Operational Risk this past Friday and Saturday. Since I knew very little about the subject, the Forum was an opportunity to learn quite a lot. But aside from my personal experience, I heard from most participants that this was a very well organized, timely and productive forum.

For lack of expertise, instead of reviewing the talks individually, I will list the things that I learned from the workshop as a whole, in no particular order, except perhaps for the number of times they were mentioned in the talks.

- because banks don't make money out of operational risk (as opposed to market and credit risk, each related to core commercial activities, with a lot of money to be made by beating the competition in the corresponding markets), research in the area has been mostly driven by regulation rather than market pressure;

- as a consequence, most of the effort focuses on how to calculate the exact quantity specified by Basel II, namely the 99.9% VaR of operational risk losses, which are primarily modeled using the LDA (loss distribution approach);

- LDA consists of modeling the losses in each business line as compound Poisson processes (therefore modeling the frequency and the severity of the losses separately) and then bringing business lines together under some dependence assumption (say copulas). The severity of losses is modeled with a heavy-tailed distribution, with log-normal(10,2) being everyone's darling, closely followed by Pareto and something called "g and h" (google it!);

- clever numerical methods and analytic approximations compete to produce the final shape of the loss distribution and calculate its VaR 99.9%.

- it is really really hard to estimate the parameters of such heavy-tailed distributions, and there is not nearly enough data to accurately distinguish between them. As a consequence, regulators (and just about everyone else) have their work cut out for them in trying to validate the models used by different banks. Much care is needed to ensure a "level playing field" in the process;

- VaR is a particularly boneheaded choice of risk measure in this case, especially when trying to measure the "diversification effect" (the parameter "delta" in Basel II), which can jump from positive to negative (i.e. adverse diversification) by slightly changing the marginal distribution of losses, without even touching their dependence structure;

- it would be really nice to have a way to model operational risk from first principles, like a structural model in credit risk, rather than simply trying to fit (highly unstable) statistical models to (largely unavailable) data;

- at the end of the day, research in operational risk should lead to better risk management rather than measurement. As one participant put it: "operational risk groups should be seen more like think-tanks than data centres";
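The LDA recipe sketched in the bullets above is easy to prototype. Here is a minimal Monte Carlo version for a single business line; the log-normal(10,2) severity is the one mentioned in the talks, while the Poisson frequency and the number of simulated years are purely my own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(42)
lam, mu, sigma = 25.0, 10.0, 2.0   # annual frequency; severity parameters
n_years = 200_000                  # number of simulated years of losses

# Frequency: number of losses in each year; severity: size of each loss.
counts = rng.poisson(lam, size=n_years)
severities = rng.lognormal(mu, sigma, size=counts.sum())

# Sum each year's losses to get the aggregate (compound Poisson) loss.
year = np.repeat(np.arange(n_years), counts)
agg = np.bincount(year, weights=severities, minlength=n_years)

# The Basel II quantity: the 99.9% quantile of the annual aggregate loss.
var_999 = np.quantile(agg, 0.999)
print(f"mean annual loss: {agg.mean():,.0f}, VaR 99.9%: {var_999:,.0f}")
```

Re-running this with a different seed moves the 99.9% quantile noticeably, which is a small-scale illustration of the estimation difficulty mentioned above: the tail is driven by a handful of huge simulated losses.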

As I said, a wonderful learning experience.

## Friday, March 26, 2010

### Hard choices

I was afflicted by an embarrassment of riches yesterday, having to choose between attending the 6-hour guest lectures by Kay Giesecke in one of the graduate courses at Fields and attending the AIMS/Phimac talk by Santiago Moreno-Bromberg at McMaster.

In the end I applied a least-effort criterion and stayed at McMaster, especially because this was a heavy week, with Thursday sandwiched between a 3-day workshop and a 2-day forum.

Santiago's talk was nothing less than fascinating, touching on many important topics (optimal design of contracts, risk sharing, principal-agent problems with multiple agents AND multiple principals, etc.). It is a perfect example of a recent but unmistakable trend of mathematical finance research expanding beyond strictly financial markets and moving in the direction of mainstream economics.

ps: I'm sure that Kay's course was also lovely (and the credit risk students at Fields are a very lucky bunch for having him here), just can't comment because I wasn't there. Maybe this is a good point for some of my secret readers to post a comment :)

## Wednesday, March 24, 2010

### So much to blog, so little time

The Computational Methods workshop has come and gone and I didn't have time to post a single entry about it yet. I promise to do a day-by-day retrospective in the next few days, but now I'm off to give a talk to the U of T Math Union.

## Sunday, March 21, 2010

### Computational Workshop starts tomorrow

The second major workshop of the thematic program starts tomorrow at Fields and I thought I should provide a little bit of history like I did for the first workshop.

Paul Glasserman was a supporter of the thematic program from the very beginning. In fact he was part of a handful of people who gave us valuable advice when we were writing our Letter of Intent back in 2005. After that I contacted him again in January 2008 asking if he would like to chair the scientific committee for one of the workshops. By October 2008, with the addition of Mark Broadie as co-chair and Stathis Tompaidis and David Saunders as members, we had the committee formed and ready to work. They sent me a complete list of speakers in May 2009, at which point all the hard work was done and we could start publicizing the workshop.

### ABM talk

Leigh Tesfatsion gave a colloquium talk at McMaster yesterday on the use of agent-based models (ABM) as the right mathematics for the social sciences.

I met Leigh in May 2009, during a Perimeter Institute workshop on *The Economic Crises and Its Implications for The Science of Economics*. Of all the ambitious ideas put forward during the workshop (including things like gauge theory and economic preferences, evolutionary biology foundations for behavioral finance, and even something called 'particle economics'), her presentation on Agent-Based Computational Economics (ACE) struck me as the most promising way to achieve real progress in the understanding of economic crises.

Since then, I have started my own little ABM project related to systemic risk in banking networks, so it was a great pleasure to host Leigh for a day and discuss future research avenues, not to mention that her talk was every bit as inspiring as the one she delivered at PI last year.

## Wednesday, March 17, 2010

### York day

The visitors seminar series this week had back-to-back talks by Tom Salisbury and Fouad Marri, both from York University.

Tom spoke about the hot topic of retirement income products with embedded guarantees. In the end he complained to himself about spending too much time on a relaxed introduction and not enough on the mathematical result that he wanted to show, but the talk went so smoothly (as his talks usually do) that my sense is that nobody else shared the complaint.

Fouad showed how to explicitly compute the moment generating function of compound Poisson processes when the jump sizes are correlated with the jump times according to a specific class of copula functions. As pointed out by Sebastian Jaimungal in the audience, such processes can be useful to model insurance claims arising from earthquakes, since typically their magnitude tends to increase the longer you wait.
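For reference, the classical independent case that Fouad's result generalizes has the well-known closed form M_S(t) = exp(lam*T*(M_Y(t) - 1)) for the MGF of a compound Poisson sum, which a quick Monte Carlo check confirms (the parameters here are my own illustrative choices, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(7)
lam, T, t = 3.0, 1.0, 0.2
rate = 2.0                         # Exponential(rate) jump sizes

# Closed form, using M_Y(t) = rate / (rate - t) for t < rate.
closed = np.exp(lam * T * (rate / (rate - t) - 1.0))

# Monte Carlo: S = sum of N ~ Poisson(lam*T) exponential jumps,
# drawn independently of the jump times (the classical case).
n = 400_000
counts = rng.poisson(lam * T, size=n)
jumps = rng.exponential(1.0 / rate, size=counts.sum())
S = np.bincount(np.repeat(np.arange(n), counts), weights=jumps,
                minlength=n)
mc = np.exp(t * S).mean()
print(round(closed, 4), round(mc, 4))
```

The interest of the copula-dependent model in the talk is precisely that this simple factorized formula no longer applies, and the MGF has to be computed from the joint law of jump sizes and jump times.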

## Sunday, March 14, 2010

## Wednesday, March 10, 2010

### Statistical arbitrage done right

Rudra Jena gave a talk in the visitor seminar series yesterday about misspecified stochastic volatility models and their corresponding (statistical) arbitrage opportunities. This generalizes the famous work of El Karoui, Jeanblanc and Shreve for misspecified Black-Scholes models and leads to several interesting conclusions. For example, Rudra showed that, in general, the popular Risk Reversal and Butterfly Spread strategies, commonly used by traders to benefit from perceived misspecification in the correlation between volatility and stock price and in the vol of vol, are *not* profit maximizing.

## Thursday, March 4, 2010

### Riccati rules

Ken Kim spoke about the stability of Riccati equations in the visitors seminar series this week. This is truly beautiful stuff relating dynamical systems concepts like equilibrium points and stable manifolds to probabilistic properties of the corresponding affine diffusion.

Interestingly enough, Tom Hurd was asking me just the other week if I knew of anybody looking at the stability of large-dimensional systems of Riccati equations, since they arise naturally in his new model for systemic risk. Though their motivations are different, I'm sure Tom and Ken will have lots to talk about until the end of the thematic program.

## Thursday, February 25, 2010

### A day to remember

Days don't come much more memorable than yesterday.

Stan Pliska concluded his guest lectures with the end of the Harrison-Pliska story and a comprehensive overview of Optimal Portfolio Theory, from Markowitz to Merton and beyond, complete with empirical results illustrating the advantages and disadvantages of the different approaches.

Later in the day, Raphael Douady, himself an olympian, used his talk in the Quantitative Finance Seminar Series to explain how to evaluate the risk of a hedge fund using a herculean risk measure based on nonlinear factor models - the StressVaR.

And of course Team Canada beat Russia for the first time in 50 years!

## Tuesday, February 23, 2010

### Power play

Coinciding with an Olympics-related family discussion that we had yesterday on the meaning of the expression power play, we had a long and informative talk today by Vladimir Vinogradov about the Power-Variance Family - a rich class of processes that can be used to model equity prices.

And on a completely unrelated note, via Brad DeLong, I can't avoid noticing that John Cochrane is sounding weirder and weirder by the day. In my formative years I attempted to read his *Asset Pricing* book and gave up, thinking it was simply badly written, but it is becoming obvious that the confusing and obscure writing style is merely a reflection of a muddled world view.

## Monday, February 22, 2010

### The return of excitement

And by that I mean: finally there is snow on the ground again!

More generally, many new visitors arrived today. Sebastiano Silla will be working with me on insurance problems and will participate in the program until June; Rudra Jena will work with Tom Hurd on systemic risk and visit for a month; and Vladimir Vinogradov will give a talk in the Visitor's seminar tomorrow. So it definitely felt like the program was in full swing again after a relatively quiet month.

## Thursday, February 18, 2010

### Straight from the horse's mouth

Stan Pliska gave the first of two guest lectures in my graduate course yesterday by covering an enormous amount of material in a commented tour of option pricing history from the Middle Ages to the 1980s.

As I mentioned in my September 09 Fields Notes article, I always knew that the Harrison and Pliska paper had opened the flood gates for the use of sophisticated stochastic analysis in finance, thereby more or less starting the discipline of mathematical finance as we know it. What I didn't know was how deliberate this was, to the extent that Stan, Mike Harrison and Rick Durrett (another hero of mine for completely different reasons stemming from the time I was working in statistical mechanics) had formed a reading group in 1980 specifically to study the work done in semimartingales and stochastic integration by the French school of probability (a la Dellacherie and Meyer).

It was delightful to hear all this and much more from Stan, and we look forward to his second lecture next week.

## Tuesday, February 16, 2010

### Demo or die

Publish or perish used to be the standard for academic promotion and still describes the career path in the majority of disciplines. In some areas however, most notably computer science, it is not enough to publish papers with the solution of a problem - one also needs to demonstrate that the solution actually works in the real world.

I think computational finance is also one of these areas, and borrowing a phrase from the performing arts, I would like to propose that its standard should be demo or die.

All this came to my mind at today's seminar talk by Vladimir Surkov, who together with Sebastian Jaimungal and Ken Jackson developed a method for calculating option prices called Fourier Space-Time stepping (FST for short). Now these guys can publish and demo!
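To see why the "demo" part is plausible, here is a toy reconstruction of the Fourier space-time stepping idea in the Black-Scholes special case, where a single step from maturity to today can be checked against the closed-form price. This is my own sketch with made-up grid sizes and parameters, not the authors' FST code:

```python
import numpy as np
from math import erf, exp, log, sqrt

# European call under Black-Scholes. In log-price x, one FST-style
# step from maturity to time 0 is v0 = IFFT( FFT(payoff) * exp(psi*T) ),
# where psi is the characteristic exponent plus the discounting term.
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0

N, L = 4096, 6.0
x = np.linspace(-L, L, N, endpoint=False)            # log-moneyness grid
w = 2.0 * np.pi * np.fft.fftfreq(N, d=x[1] - x[0])   # angular frequencies

payoff = np.maximum(S0 * np.exp(x) - K, 0.0)         # call payoff at T
psi = 1j * w * (r - 0.5 * sigma**2) - 0.5 * (sigma * w) ** 2 - r
v0 = np.real(np.fft.ifft(np.fft.fft(payoff) * np.exp(psi * T)))

price = v0[N // 2]                                   # value at S = S0

# Closed-form Black-Scholes price for comparison.
Phi = lambda d: 0.5 * (1.0 + erf(d / sqrt(2.0)))
d1 = (log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
bs = S0 * Phi(d1) - K * exp(-r * T) * Phi(d1 - sigma * sqrt(T))
print(round(price, 4), round(bs, 4))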
