
9 posts from June 2011

24 June 2011

PRMIA on Data and Analytics

Final presentation at the PRMIA event yesterday was by Clifford Rossi and was entitled "The Brave New World of Data & Analytics Following the Crisis: A Risk Manager's Perspective".

Clifford got his presentation going with a humorous and self-deprecating start by suggesting that his past employment history could in fact be the missing "leading indicator" for predicting organisations in crisis, having worked at Citigroup, WaMu, Countrywide, Freddie Mac and Fannie Mae. One of the other professors present said he hoped Clifford wouldn't do the same to academia (University of Maryland beware, maybe!).

Clifford said that the crisis had laid bare the inadequacy of, and underinvestment in, data and risk technology in the financial services sector. He suggested that the OFR (Office of Financial Research) had the potential to be a game changer in correcting this issue and in helping the role of the CRO to gain in stature.

He gave an example of a project at one of the GSEs he had worked at called "Project Enterprise", which was to replace 40-year-old mainframe-based systems (systems that for instance only had 3 digits to identify a transaction). He noted that this project had recently been killed, having cost around $500M. With history like this, it is not surprising that enterprise risk data warehousing capabilities were viewed as black holes without much payoff prior to the crisis. In fact it was only due to Basel that data management projects in risk received any attention from senior management, in his view.

During the recent stress test process (SCAP) the regulators found just how woeful these systems were, as the banks struggled to produce the scenario results in a timely manner. Clifford said that many banks struggled to produce a consistent view of risk even for one asset type, and that in many cases corporate acquisitions had exacerbated this lack of consistency in obtaining accurate, timely exposure data. He said that the mortgage processing fiasco showed the inadequacy of these types of systems (echoing something I heard at another event about mortgage tagging information being completely "free-format", without even designated fields for "City" and "State" for instance).

Data integrity was another key issue that Clifford discussed, here talking about the lack of historical performance data leading to myopia in dealing with new products, and poor definitions of product leading to risk assessments based on the originator rather than on the characteristics of the product. (Side note: I remember prior to the crisis the credit derivatives department at one UK bank requisitioning all new server hardware to price new CDO-squared deals, given it was supposedly so profitable - it was at that point that maybe I should have known something was brewing...) Clifford also outlined some further data challenges, such as the changing statistical relationship between Debt to Income ratio and mortgage defaults once incomes were self-declared on mortgages.

Moving on to consider analytics and models, Clifford outlined a lot of the concerns covered by the Modellers' Manifesto, such as the lack of qualitative judgement and over-reliance on the quantitative, efficiency and automation superseding risk management, limited capability to stress test on a regular basis, regime change, poor model validation, and cognitive biases reinforced by backward-looking statistical analysis. He made the additional point that, in relation to the OFR, they should concentrate on getting good data in place before spending resources on building models.

In terms of focus going forward, Clifford said that liquidity, counterparty and credit risk management were not well understood. Possibly echoing Ricardo Rebonato's ideas, he suggested that leading indicators need to be integrated into risk modelling to provide the early warning systems we need. He advocated that there was more to do on integrating risk views across lines of business, counterparties and between the banking and trading book.

Whilst being a proponent of the OFR's potential to mandate better analytics and data management, he warned (sensibly in my view) that we should not think that the solution to future crises is simply to set up a massive data collection and modelling entity (see earlier post on the proposed ECB data utility).

Clifford thinks that Dodd-Frank has the potential to do for the CRO role what Sarbanes-Oxley did in elevating the CFO role. He wants risk managers to take the opportunity presented in this post-crisis period to lead the way in promoting good judgement based on sound management of data and analytics. He warned that senior management buy-in to risk management was essential and could be forced through by regulatory edict.

This last and closing point is where I think the role of risk management (as opposed to risk reporting) faces its biggest challenge: how can a risk manager be supported in preventing a senior business manager from pursuing an overly risky new business opportunity based on what "might" happen in the future? We human beings don't think about uncertainty very clearly, and the lack of a resulting negative outcome will be seen by many to invalidate the concerns put forward before a decision was made. Risk management will become known as the "business prevention" department and not regarded as the key role it should be.

23 June 2011

SIFMA declines...

I almost forgot to mention that I went along to the SIFMA event (previously known as the SIA show) last week to take a look around. For those of you not familiar with the SIFMA/SIA event, it was the biggest financial services technology event I had ever attended or exhibited at, taking up 5 massive floors at the Hilton NY. Everyone used to go there, and indeed that was one of the reasons (the only reason?) to go along. Now the event seems to be dying a slow death, something I was going to write about but Adam Honore of Aite and Melanie Rodier beat me to it.

22 June 2011

PRMIA on Systemic Risk Part #2 - plus the OFR

Lewis Alexander (ex-US Treasury) carried on the theme of systemic risk at the PRMIA seminar "Risk, Regulation and Financial Technology & Identifying the Next Crisis". He started by saying that whilst systemic risk was a risk to the economy and industry as a whole, systemic risk was also relevant to the risks (such as market or credit) that a risk manager at an individual institution needs to assess.

Lewis said that there had really only been three systemic crises over the past century or so (1907, 1933 and 2008), with obviously many more disruptions in markets that should not be described as systemic. This points to one problem in assessing systemic risk: crises are rare events, so there is little data to analyse. He also warned that the way the system responds to small shocks should not be taken as a proxy for how it responds to large ones, that the relationship between asset prices and systemic risk is a complex one, and that reporting (mainly accounting but also in risk) had not kept up with financial markets innovation.

Lewis said that "stress test" methods can help to identify vunerable institutions but that this method of looking at systemic risk does not deal with the propogation of risk from one institution to another. He said that network analysis can help to assess propogation but the weakness with these methods was the lack of counterparty data. Liquidity methods also suffer from a lack of data. He said that "Leading Indicators" (see past post on Bubble Indices) tell us little of what creates systemic risk.

He mentioned the use of CoVaR (based on VaR) for systemic risk, the use of CDS pricing to theoretically "insure" the industry against crisis, and a "Merton Model" approach to estimate potential losses due to default for a group of banks. He said that all of these models were good comparators, but not good as indicators.
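For the CoVaR measure he mentioned, the published approach (Adrian and Brunnermeier) is to estimate, via quantile regression, the VaR of the system conditional on an individual institution sitting at its own VaR. A rough sketch of that estimation on made-up return series is below - my illustration of the published technique, not anything shown at the event.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 1000
    inst = 0.02 * rng.standard_t(df=4, size=n)                  # made-up returns of one institution
    system = 0.5 * inst + 0.015 * rng.standard_t(df=4, size=n)  # system returns, partly driven by it

    q = 0.05  # the 5% quantile of returns corresponds to a 95% VaR

    # Quantile regression of system returns on the institution's returns at the 5% quantile
    res = sm.QuantReg(system, sm.add_constant(inst)).fit(q=q)
    intercept, slope = res.params

    var_inst = np.quantile(inst, q)         # the institution's own VaR (as a return quantile)
    median_inst = np.quantile(inst, 0.5)

    covar = intercept + slope * var_inst             # system VaR given the institution at its VaR
    covar_median = intercept + slope * median_inst   # system VaR given the institution at its median
    delta_covar = covar - covar_median               # the institution's marginal systemic contribution

    print(f"CoVaR: {covar:.4f}  delta-CoVaR: {delta_covar:.4f}")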

Given the previous talk on systemic risk, Lewis switched his focus to what can be done, with the main focus for him being data, where we need:

  • Robust data on both asset and counterparty exposures
  • Data on leverage through the system
  • Data on the depth of liquidity to assess the vulnerability of assets to fire sales

A final few points from his talk:

  • Dodd-Frank will help given new reporting mandates e.g. swap data repositories being invaluable sources of data for regulators
  • Could we use the payments/settlement system to provide yet more insight into what is going on by sensibly tagging transactional flows (DTCC take note apparently!)
  • SEC registration of a new financial product could help to enforce what is reported and how, and act as a limit on what products can be sold
  • Lewis said that up to 5,000 attributes are needed to describe any financial transaction so it can be done
  • As he became involved in the FSOC and the formation of the OFR he thought initially that collecting all the data needed was impossible, but his view has changed on this with modern technology and processing power.
  • The above said, he thought that until standards (such as the LEI) were in place it did not make sense for the OFR to start collecting data
  • A member of the audience suggested that if data could be published in a standard form, it would be "Google to the rescue" in terms of doing aggregation across the industry without centralising the data in one store. (Maybe Google plans to usurp Microsoft Excel as the de facto trading and risk management system for the industry?)

Lewis gave a very good and interesting talk. I think some of his ideas on the OFR were good, but given the state of the data infrastructure that I have observed at many large institutions I would be worried that he is being optimistic on how quickly the industry is able to pull all the data together, however standardised. I think the industry will get there (particularly if mandated), but given the legacy of past systems and infrastructure it will still take a good amount of time to achieve.

PRMIA on Systemic Risk Part #1

I attended a PRMIA seminar this morning at the offices of Ernst & Young with the rather long title of "Risk, Regulation and Financial Technology & Identifying the Next Crisis".

First up was Matthew Richardson of NYU Stern with a presentation entitled "Identifying the Next Crisis". The focus of his presentation was on systemic risk, which he defined as the risk that financial institutions lose the ability to intermediate (i.e. continue to provide services) due to an aggregate capital shortfall. He presented a precise definition of the systemic risk of a firm as:

        Expected real social costs in a crisis per dollar of capital shortage
    x  Expected capital shortfall of the firm in a crisis
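The second term - the expected capital shortfall of the firm in a crisis - is made operational by Matthew's NYU Stern colleagues in their published SRISK measure, which combines a firm's leverage with the expected loss of its market equity in a broad market decline. A minimal sketch of that published formula with made-up inputs (my illustration, not necessarily the exact calculation presented) is:

    def srisk(debt, market_equity, lrmes, k=0.08):
        """SRISK-style expected capital shortfall of a firm in a crisis.

        debt          : book value of liabilities
        market_equity : current market capitalisation
        lrmes         : long-run marginal expected shortfall, i.e. the expected
                        fractional loss of equity value in a broad market crisis
        k             : prudential capital ratio (8% here, purely illustrative)
        """
        crisis_equity = (1.0 - lrmes) * market_equity   # equity value expected after the crisis
        return k * debt - (1.0 - k) * crisis_equity     # shortfall versus required capital

    # Made-up example: $900bn of debt, $100bn market cap, 60% expected equity loss in a crisis
    print(srisk(debt=900, market_equity=100, lrmes=0.60))  # positive => an expected shortfall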

Matthew explained that there are three approaches to estimating systemic risk contribution:

  1. Statistical approach based on public data
  2. Stress tests
  3. Market approach based on insurance against capital losses in a crisis

He explained that the methods his team have used have had some statistical success against data from the past crisis in showing those organisations in crisis early. I found his presentation reasonably dry (more regression analysis etc) but I thought the following were worth a mention:

  • Crisis Insurance - Approach 3 on getting firms to insure themselves against capital shortfalls in a crisis sounded interesting, but ended up with the insurer being the regulator (not enough capital to insure privately) and the beneficiary being the regulator. So effectively this was a tax on the systemically significant institutions, where the involvement of the private insurers was mainly to do with price discovery (i.e. setting the right level of premium, effectively the tax, for each institution)
  • Short-term Indicators - Many of the approaches we have currently (VaR etc) are short term indicators and so in good times do not inhibit market behaviour as would be desired by the regulators. A good illustration was given of how short term volatility was very much lower than long term prior to the crisis and how these merged to similar levels once the crisis hit.
  • Regulatory Loopholes - He put forward that this crisis was a result not of monetary policy but of large complex financial institutions exploiting loopholes in regulation. The AIG Quarterly Filings of Feb 2008 showed that $379 billion of the $527 billion of CDS were with clients that were explicitly seeking regulatory capital relief (i.e. get the CDS in place and your capital requirement dropped to zero). He also explained how Fannie Mae and Freddie Mac were used by banks to simply "rubber stamp" mortgage pools and magically reduce the capital required down from 4% to 1.6% (an 8% capital charge against a 50% risk weight for whole mortgages gives 4%; against the 20% risk weight for agency-guaranteed paper it gives 1.6%).
  • Where to look - He said that "like water flows downhill, capital flows to its most levered point". He said to look for which parts of the financial sector are treated differently under Dodd-Frank, Basel III etc, and that the key candidates were 1) shadow banking and 2) government guarantees. You should also look for those asset classes that get preferred risk weights for a given level of risk.

As often seems to be the case, I found the side comments more interesting than the main body of the presentation, but Matthew's presentation showed that a lot of work is being done on systemic risk identification and measurement in academia.

 

 

20 June 2011

A glass of red and contrary ideas on Triple-A risk

I enjoyed myself at the drinks reception after the NYU-Poly event. Nothing new in that I guess for those of you that know me well and like me find it difficult to resist a glass or two of red wine. Whilst attempting to circulate (I am almost 2 metres tall, so rather than "circulate" I think a more appropriate word might unfortunately be "intimidate"), I struck up a conversation with an interesting gentleman by the name of Per Kurowski.

Per is a former director of the World Bank and has some contrary and interesting ideas on the financial crisis and our current methods of regulation. His first point is that financial crises rarely start with assets that are perceived as "risky", which I think is a pretty self-evident point but one that I had not previously registered. His second line of argument is that our current regulation biases our banks away from "riskier" assets and hence away from just the kinds of organisations that are a) needed for employment creation and b) do not cause crises.

Per argues that many of the big institutions are near triple-A rated and hence benefit from being able to leverage up cheaply (at low interest rates, since they are triple-A), and are then biased by lower capital requirements to use this leveraged funding to invest in yet more triple-A assets (SPVs/other institutions such as themselves). Hence you get the double-whammy of cheap funding and biased capital requirements, which naturally leads to potential distortions in anything perceived as triple-A, and a bias away from riskier assets and the risk-takers that the world economy needs.
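The size of the bias is easy to see with the standardised Basel risk weights (my numbers, not Per's): an 8% capital charge against the 20% risk weight enjoyed by highly rated exposures works out at 1.6% capital, i.e. over 60 times leverage, whereas an unrated small business carries the full 100% risk weight and 8% capital, roughly 12 times leverage.

    def max_leverage(risk_weight, base_capital_ratio=0.08):
        """Implied maximum leverage for a given standardised risk weight."""
        capital_per_unit = base_capital_ratio * risk_weight  # capital needed per unit of exposure
        return 1.0 / capital_per_unit

    # Illustrative Basel standardised risk weights
    print(max_leverage(0.20))  # AAA to AA- rated exposure: 1.6% capital -> 62.5x leverage
    print(max_leverage(1.00))  # unrated small business:    8.0% capital -> 12.5x leverage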

Per expands upon these arguments in his blog and on YouTube.

Removing the punchbowl at NYU-Poly

A few choice quotes from the rest of the day at NYU-Poly:

  • "The difference between economists and meteorologists is that meteorologists can at least agree on what happended yesterday"
  • "A bubble can only be identified from a trend when the bubble bursts"
  • "Capital flows from strange places to strange destinations in today's financial markets"
  • "In a Basel III world, the stock price of Morgan Stanley would rise if its investment banking division were sold off"
  • "Basel III is a good attempt at managing systemic risk"
  • "Hedge Funds are the risk takers of the future"
  • "Hedge Funds have the partnership mentality that the commercial banks have lost and should regain"
  • "CCPs should not compete on risk management"
  • "Economists are trained to predict everything except the future"
  • "Dodd Frank was a missed opportunity to consolidate the many regulators in the United States"
  • "Washing D.C. is all about turf and theatre"
  • "Insolvency and liquidity risk are not clearly separable"
  • "Beware the Golden Rule. He who makes the Gold makes the Rule"
  • "Systemic risk is not the sum of individual institutional risk"
  • "As Chuck Prince said "As long as the music is playing, you’ve got to get up and dance""
  • "Systemic risk management only works when we all stop dancing"
  • "Regulation should remove the punchbowl just when the party is getting started"

18 June 2011

Regulation - Putting out the fire once you know where the fire is - NYU-Poly

The first panel session at NYU-Poly after Nassim Taleb concerned itself with the increasing competition between banks and insurers, which I didn't think reached any great conclusions as to where things are heading, but did give background for why banks and insurers are increasingly offering the same services (disintermediation, regulation and industry structural changes being the main reasons). One of the presenters also said that actuarial methods may provide a useful framework for unhedgeable risks taken by banks. I must acknowledge that my attention span was also challenged during this session by a very early start (up pre-6am) and a distinct lack of caffeine (later rectified many times over).

Second panel session up was entitled "The Future of Financial Regulation" and proved a lot more interesting to me given that I think I learned a few new things. Main presenter was Allen Ferrell from Harvard Law School. Main point I took away from this presentation was that regulation should focus more on the resolution of financial distress after it has occurred at an institution (ex-post) rather than on rules and regulations to prevent it before it happens (ex-ante).

I found this argument quite appealing since to a large degree it avoids provisioning for the "unknown unknowns" through more and more rules and increases in capital. The reduction in ex-ante rules would also reduce the gaming of the rules that would inevitably occur, and shareholders knowing that they would be penalised, and penalised quickly, following financial distress would encourage them to become more interested in the levels of risk being taken on their behalf. I guess one of the main issues for the above is how such a level of financial distress would be defined and enforced in order to act as a trigger for, say, automatic conversion of debt to equity. Anyway, on with what Allen Ferrell had to say:

Allen said that if a financial institution had had the foresight to see the financial crisis coming, then looking across the industry there would have been a great variation in the amount of capital needed to survive the crisis. I guess the implication here was that higher levels of capital across the industry will help, but they are unlikely to be enough for some organisations in the crisis to come.

After the crisis had hit, he said that financing from the repo market dried up as repo haircuts exploded, and he said that this was like the modern-day equivalent of a bank run (where a solvent bank faces difficulty due to having to sell good assets cheaply to satisfy demands to return cash deposits).
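The repo haircut is effectively the margin a lender demands, so the balance sheet a dealer can fund is its equity divided by the haircut. A minimal sketch with illustrative numbers (mine, not Allen's) shows why exploding haircuts behave like a run:

    def assets_supportable(equity, haircut):
        """Assets a dealer can fund in the repo market for a given equity cushion and haircut.

        With haircut h, each $1 of assets needs $h of the dealer's own money,
        so the maximum balance sheet is equity / h.
        """
        return equity / haircut

    equity = 10.0                                # illustrative dealer equity, in $bn
    before = assets_supportable(equity, 0.02)    # 2% haircut  -> $500bn of assets
    after = assets_supportable(equity, 0.20)     # 20% haircut -> $50bn of assets
    print(f"Assets to be sold or refinanced: {before - after:.0f}bn")  # 450bn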

Allen said that leverage and "debt overhang" made it much less likely that a financial institution would raise more equity capital following the crisis, since doing so implied a transfer of wealth from the stockholders to the bondholders. More on this important point later.

He put forward that it was not yet clear whether the 2007-8 crisis was mainly due to insolvency or due to a bank run. He argued that it was some combination of both, and referred back to the recent re-assessment of the Great Depression being caused not by a run on (solvent) banks but rather by flight of retail investors away from insolvent banks.

He concluded that much of the action for any future crisis will have to take place after any new crisis hits (ex-post), partly due to his assessment of the disconnect between equity capital needed (the current focus of things like Basel III) prior to a crisis and an institution's financial health following a crisis.

Allen suggested that contingent capital, i.e. debt capital that automatically converts into equity based on some market trigger, might be very helpful in dealing with a financial crisis. Such a conversion, being agreed up front, would happen earlier than a recapitalisation negotiated during a crisis and would automatically dilute existing stockholders. Overall this was a thought provoking talk and the panel discussion afterwards was interesting too. One of the panelists commented that he looked for high leverage and high ratios of CEO to CRO compensation as his measure of where to look for the next set of risky institutions. The panel also seemed to agree that, with the benefit of hindsight, allowing Lehman to fail and the resultant drying up of the money markets was a mistake, and that more consistency was needed in bankruptcy and distress resolution.

17 June 2011

Taleb and Model Fragility - NYU-Poly

I went along to spend a day in Brooklyn yesterday at NYU-Poly, now the engineering school of NYU containing the Department of Finance and Risk Engineering. The event, called "The Post Crisis World of Finance", was sponsored by Capco.

First up was Nassim Taleb (he of Black Swan fame). His presentation was entitled "A Simple Heuristic to Assess Tail Exposure and Model Error". It was the first time I had seen Nassim talk and, like many of us, he was an interesting mix of seeming nervousness and confidence whilst presenting. He started by saying that given the success and apparent accessibility to the public of his Black Swan book, he had a deficit to make up in unreadability in this presentation and his future books.

Nassim recommenced his on-going battle with proponents of Value at Risk (see earlier posts on VaR) and economists in general. He said that economics continues to be marred by the lack of any stochastic component within the models that most economists use and develop. He restated his view that economists change the world to fit their choice of model, rather than the other way round. He mentioned "The Bed of Procrustes" from Greek mythology, in which a man made his visitors fit his bed to perfection by either stretching them or cutting their limbs (good analogy, but also a good plug for his latest book too I guess).

He categorized the most common errors in economic models as follows:

  1. Linear risks/errors - these were rare but show themselves early in testing
  2. Missing variables - rare and usually gave rise to small effects (as an aside he mentioned that good models should not have too many variables)
  3. Missing 2nd order effects - very common, harder to detect and potentially very harmful

He gave a few real-life examples of error 3 above, such as how a 10% increase in traffic on the roads could result in doubling journey times whilst a 10% reduction would deliver very little benefit. He targeted Heathrow airport in London, saying that landing there was an exercise in understanding a convex function, in which you never arrive 2 hours early but arriving 2 hours later than scheduled is relatively common.

He described the effects of convexity firstly in "English" (his words):

"Don't try to cross a river that is on average 4ft deep"

and secondly in "French" (again his words - maybe a dig at Anglo-Saxon mathematical comprehension or in praise of French mathematics/mathematicians? Probably both?):

"A convex function of an average is not the average of a convex function"

Nassim then progressed to show the fragility of VaR models and their sensitivity to estimates of volatility. He showed that a 10% estimation error in volatility could produce a massive error in the VaR level calculated. His arguments here on model fragility reflected a lot of what he had proposed a while back on the conversion of debt to equity in order to reduce the fragility of the world's economy (see post).

His heuristic measure mentioned in the title was then described, which is to perturb some input variable such as volatility by say 15%, 20% and 25%. If the 20% result is much worse than the average of the 15% and 25% ones then you have a fragile system and should be very wary of the results and conclusions you draw from your model. He acknowledged that this was only a heuristic, but said that with complex systems/models a simple heuristic like this was both pragmatic and insightful. Overall he gave a very entertaining talk with something of practical value at the end.
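As I understood it, the heuristic can be sketched in a few lines. Here it is applied to the probability of a large loss under a simple normal model with an estimated volatility - entirely my own illustrative example, not Nassim's code - with the 15/20/25% perturbations he mentioned. A large gap between the middle result and the average of the two either side signals strong curvature, i.e. a model output that is fragile to estimation error in that input.

    from statistics import NormalDist

    def prob_large_loss(volatility, loss_threshold=0.10):
        """Probability of a one-period loss beyond the threshold under a normal model."""
        return NormalDist(mu=0.0, sigma=volatility).cdf(-loss_threshold)

    def fragility_gap(model, base_input, bumps=(0.15, 0.20, 0.25)):
        """Perturb an input by 15/20/25% and compare the middle result with the
        average of the outer two. A big gap (of either sign) indicates strong
        curvature, so conclusions drawn from the model should be treated with care."""
        lo, mid, hi = (model(base_input * (1.0 + b)) for b in bumps)
        return mid - 0.5 * (lo + hi)

    sigma_hat = 0.02  # estimated daily volatility (illustrative)
    print(prob_large_loss(sigma_hat))          # tail probability at the estimate
    print(prob_large_loss(1.10 * sigma_hat))   # a 10% error in sigma moves it by roughly 10x
    print(fragility_gap(prob_large_loss, sigma_hat))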

08 June 2011

IKEA and Market Risk Management – Choice is a worrying thing!

Risk management and data control remain at the top of the agenda at many financial institutions. Many have said that the recent crisis highlighted the need for more consistent, transparent, high quality data management, which I totally agree with (but working for Xenomorph, I would I guess!). Although the crisis started in 2007, it would seem that many organizations still do not have the data management infrastructure in place to achieve better risk management.

I moved apartment last week and had to face the terrifying prospect of visiting IKEA to buy some new furniture. On walking through the endless corridors of furniture ideas I wondered whether the people at major financial institutions feel as I did: I knew I needed two wardrobes, I knew the dimensions of the rooms, I knew how many drawers I wanted. Then I got to the wardrobes showroom, sat in front of the “Create your own wardrobe” IKEA software and the nightmare started. How many solutions are there to solve your problems? And how many solutions, once you get to know of their existence, make you aware of a problem you didn’t know you had? That’s how I spent 2 days at IKEA choosing my furniture and still I wonder whether in the end I got the right solution for my needs.

Coming back to risk management, I imagine the same dilemma may be faced by financial institutions looking to implement a data management solution. How many software providers are out there? What data model do they use? Are they flexible enough to satisfy evolving requirements? How can we achieve an integrated data management approach? Will they support all kinds of asset classes, even the most complex?

In these times of new regulations where time goes fast and budget is tight, selection processes have become more scrupulous. 

As often happens in life, when we need a plumber for example, or a new dentist, we look for positive recommendations, people willing to endorse the efficiency and reliability of the service. So, with this in mind, please take a look at the case study we put together with Rabobank International, who have been using our TimeScape analytics and data management system at their risk department since 2002 for consolidated data management. More client stories are also available on our website here: www.xenomorph.com/casestudies

I hope that many of you will benefit from reading the case study and for any questions (on IKEA wardrobes too!), please get in touch...

 

