
9 posts from July 2009

17 July 2009

Heavyweight Data Management...

...I am very concerned that I have previously missed an important requirement for data management solutions - a heavyweight one, judging by this great discussion on one of the Microsoft forums.

15 July 2009

Regulatory moves and moods

Seems that the latest EU and Basel Committee proposals on banking regulation cannot make everyone happy (now there's a surprise...). Whilst many seem very happy at the incremental nature of the proposals to increase capital requirements for securitisations and proprietary trading, some of those in the Glass-Steagall/banking utility camp are less than impressed. I am with the incremental camp myself, but have to acknowledge that the sceptics are not short of ammunition when saying that we are heading back to the future... meanwhile, over in hedge fund land, London is currently in a very bad mood with the EU...

Debt hides volatility from Taleb

Nassim Nicholas Taleb and one of his colleagues are back in the FT today with an article on the "evils" of debt and why the only solution to the economic system's woes is (start the fanfare, this is scary stuff!) the "immediate, forcible and systematic conversion of debt to equity". The main points of the article are that:

  • Debt and leverage lead directly to fragility in the system whereas equity is robust at absorbing extreme variations in the system.
  • The economic system is experiencing more extreme events (more "Black Swans") than ever before rendering mainstream economic forecasting useless.
  • Debt hides volatility as a loan does not vary outside of default whereas an equity investment has volatility but its risk is more visible and as a result more manageable.

I think the last point on debt hiding volatility is quite profound - on a personal basis I would put it into the category of things that you already know, but which become clearer when expressed in a different way, usually (in my case!) by somebody else. Its implications are illustrated particularly well in the following extract from the text:

"Thus debt is the province of both the overconfident borrower who underestimates large deviations, and of the investor who wants to be deluded by hiding risks."

The article is dramatic (as is usual with Taleb, see post) and short on detail of how such a fundamental conversion of debt to equity should happen from a practical point of view. It is nonetheless thought-provoking, particularly around the use of flawed economic models to get us out of a crisis that the underlying maths helped to get us into, and the consequent proposal that we shouldn't try to model and control the risks of the system but instead endorse equity as the defensive, stabilising shock-absorber of choice. Maybe I should call my insurance broker, I think I need to increase my cover...

09 July 2009

Tick Size Harmony...

...in a rare show of co-operation (I wonder what carrot or (regulatory) stick is motivating this?), European exchanges and MTFs seem to have agreed on standardising tick sizes (or at least to have two standards rather than twenty five!). Extract from an article on AutomatedTrader:

"From the perspective of each trading venue, strong incentives exist to undercut others in terms of tick sizes, which is not in the interest of market efficiency or the users and end investors. This might, in turn, lead to excessively reduced tick sizes in the market. Excessively granular tick sizes in securities can have a detrimental effect to market depth (i.e. to liquidity). An excessive granularity of tick sizes could lead to significantly increased costs for the many users of each exchange throughout the value chain; and have spillover costs for the derivatives exchanges' clients."

08 July 2009

Das's Dazzling Derivatives

Satyajit Das adds an interesting contribution to the debate on OTC derivatives and the drive towards central counterparty (CCP) clearing in his article in the FT today (see earlier post for background). The opening paragraph sets the tone:

'US and European Union proposals for over-the-counter derivative regulations are consistent with H.L. Mencken's proposition that "there is always a well-known solution to every human problem - neat, plausible and wrong".'

Main points from the article:

  • A single CCP would certainly qualify as "too big to fail"
  • The success of a CCP depends on collateral, and collateral valuations may underestimate risk and value since they are usually based on historical volatility (see the sketch after this list)
  • Cross-margining exposes the CCP to correlation risks in offset methodologies
  • A CCP depends on valuing contracts, which in turn depends upon liquid markets
  • CCP margining requirements may communicate market stress to more participants and in turn create more stress
  • Regulators are missing the point with CCP and should look at addressing the core issue of innovation and complexity hiding excessive profits in derivatives
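
As an aside on the collateral point above, here is a minimal sketch (my own toy numbers and a hypothetical margin rule, not any actual CCP methodology) of how a margin level calibrated to recent historical volatility can be overwhelmed when volatility jumps:

```python
# Minimal sketch of a margin rule based on historical volatility (hypothetical
# rule and toy numbers - not any real CCP's methodology).
import numpy as np

rng = np.random.default_rng(2)
calm = 0.005 * rng.standard_normal(250)      # 250 calm days of ~0.5% daily returns
stressed = 0.03 * rng.standard_normal(10)    # volatility then jumps to ~3% a day

# Hypothetical rule: margin = 3 standard deviations of a 2-day move,
# estimated only from the calm lookback window.
hist_vol = calm.std()
margin = 3 * hist_vol * np.sqrt(2)

# Worst observed 2-day move once volatility has jumped (overlapping sums,
# simple returns - good enough for illustration).
worst_2day = np.abs(stressed[:-1] + stressed[1:]).max()

print("margin set from calm history  : %.2f%% of notional" % (100 * margin))
print("worst 2-day move when stressed: %.2f%% of notional" % (100 * worst_2day))
```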

As a related aside, probably also worth taking a look at the following article on the return of securitisation.

03 July 2009

Lessons for Risk Management - Wilmott and Rowe

Great event organised by PRMIA and IAFE last night at Goldman's London offices with a long title:

 "A Little Thought Goes A Long Way and Lessons for Risk Management from the Current Crisis".

The event was moderated by Giovanni Bellossi of FGS Capital, and featured speaking slots by Paul Wilmott and David Rowe of Sungard. Here are my notes on the evening - please forgive any inaccuracies, and please persevere through some of the techy quant stuff, as the speakers' general points are well worth understanding.

  • Giovanni quoted Nassim Taleb on how VAR is invalid and how mainstream financial mathematics should be banned (or words to that effect, see earlier post on Taleb)
  • He added that whilst what Taleb says cannot be ignored, despite the current crisis and its causes we should not "throw the baby out with the bathwater", and that Taleb "...is not only able to recognise a cow but also knows how to milk one."

  • Giovanni said that financial mathematics has much to offer, and that whilst VAR is simply a number, one of its great benefits has been to make one measure of risk simple and compelling enough to get traders and risk managers talking.

Paul Wilmott then took the floor and put forward his thoughts:

On Taleb and the Black-Scholes Model

  • Paul mentioned that he and Taleb were great friends, and whilst he agreed with much of what Taleb says he has areas of disagreement, particularly over the use of the Gaussian distribution in finance and its implications for "fat tail" events
  • Paul Googled "Taleb" and found more entries for Taleb than for Stephen Hawking, which shows how much attention has come Taleb's way due to the "Black Swan" debate
  • He thinks that he and Taleb are the "Marmite of finance" (for those of you not in the UK who do not know Marmite, it is a sandwich spread that you either love or hate, never anything in between)
  • He suggested that every quant needs a much more fundamental and practically grounded understanding of financial mathematics.
  • Paul referred to some work (mentioned by Giovanni) that Peter Carr of Bloomberg had done on discrete daily hedging, which showed that this option replication technique could remove up to 85% of the risk, and that all quants should know about the remaining 15% error term when trying to calculate an option price to the Nth decimal place (see the sketch after this list).
  • He described how in the past he had set up a volatility arbitrage hedge fund, wanting to improve upon the flawed assumption of the Black-Scholes (B-S) model that volatility is constant and to build the world's best volatility model for option pricing.
  • Paul said that he did build the world's best volatility model (?!), but soon found it took too long to calculate, so he reverted to B-S and has become an unfashionable fan of the model and its assumptions.
  • He added that many of the variants on B-S to overcome its limitations have made the model worse and harder to calibrate.
  • In part due to Taleb's opinions on the fat tails of distributions, B-S and other models are now very unpopular, but Paul claims that not many people have actually bothered to robustly test the B-S model or take a practical, evidence-based approach such as that adopted by Peter Carr.
  • Paul then showed some example charts and said that with a limited number of opportunities for regular time-period hedging it was not valid to use risk-neutral pricing, whereas if the same number of hedges could be used optimally (implying at irregular time periods) then risk-neutral pricing was valid and hedging could be more effective. He emphasised that this was the kind of practical stuff that a quant should know, and that quants should know less about esoteric, complex financial mathematics.
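
For those interested, here is a minimal Monte Carlo sketch (my own illustration under textbook Black-Scholes assumptions, not Peter Carr's analysis) of the point about discrete hedging: daily delta hedging of a short European call removes most of the risk of the naked position, but a measurable residual error term remains. Parameters are illustrative:

```python
# Residual risk of discrete daily delta hedging of a short European call
# (illustrative Black-Scholes world, toy parameters).
import numpy as np
from scipy.stats import norm

S0, K, T, r, sigma = 100.0, 100.0, 1.0, 0.0, 0.2
n_steps, n_paths = 252, 20000
dt = T / n_steps
rng = np.random.default_rng(0)

def bs_call(S, K, tau, r, sigma):
    """Black-Scholes call price and delta."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * tau) / (sigma * np.sqrt(tau))
    d2 = d1 - sigma * np.sqrt(tau)
    return S * norm.cdf(d1) - K * np.exp(-r * tau) * norm.cdf(d2), norm.cdf(d1)

S = np.full(n_paths, S0)
price0, delta = bs_call(S, K, T, r, sigma)
cash = price0 - delta * S                      # sell the call, buy delta shares
for i in range(1, n_steps + 1):
    S = S * np.exp((r - 0.5 * sigma ** 2) * dt
                   + sigma * np.sqrt(dt) * rng.standard_normal(n_paths))
    if i < n_steps:
        _, new_delta = bs_call(S, K, T - i * dt, r, sigma)
    else:
        new_delta = np.zeros(n_paths)           # unwind the hedge at expiry
    cash = cash * np.exp(r * dt) - (new_delta - delta) * S
    delta = new_delta

payoff = np.maximum(S - K, 0.0)
hedged_pnl = cash - payoff                      # P&L of the daily-hedged short call
unhedged_pnl = price0 * np.exp(r * T) - payoff  # P&L of the same short call, unhedged
print("std of unhedged P&L     : %.2f" % unhedged_pnl.std())
print("std of daily-hedged P&L : %.2f" % hedged_pnl.std())
print("risk removed by hedging : %.0f%%"
      % (100 * (1 - hedged_pnl.std() / unhedged_pnl.std())))
```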

Correlation

  • Paul said that of all of the issues that need addressing in mathematical finance, the one that he has very few answers on is correlation.
  • He showed that even basic questions about correlation are poorly understood, even by quants. One question he asks: if two asset prices both start at 100 and their returns have a correlation of 1 (perfect correlation), what is the price of the second asset after a year if the first moves to 200? The answer is not necessarily 200 - he showed how assets could diverge in overall direction but still have a correlation of +1, or rise together with a perfect negative correlation of -1 (see the sketch after this list).
  • Paul illustrated how correlation was a very blunt measure that is mis-used by people to summarise the highly complex and historically unstable relationships between assets driven for example by industry sector success (leading to +ve correlation) or competitive success (leading to -ve correlation)
  • As a result, he said that financial products whose value depends on correlation should not be transacted in any great size, and moved on to the example of CDOs, where a CDO with 1,000 underlying mortgages is modelled with roughly half a million pairwise correlations (1,000 × 999 ÷ 2 = 499,500), all assumed to be 0.6. His main point was to ask why this assumption should be made at all.
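
Here is the sketch referred to above (my own toy example, not Paul's slides), showing how little a correlation of returns tells you about where prices end up: a pair with correlation of exactly +1 can finish the year far apart, and a pair with correlation of exactly -1 can both rise:

```python
# Two toy examples of why correlation of returns says nothing about drift.
import numpy as np

rng = np.random.default_rng(1)
r1 = 0.001 + 0.01 * rng.standard_normal(252)  # daily returns of asset 1 (upward drift)

# Asset 2: returns are a positive linear function of r1, so correlation is exactly +1,
# but the negative drift term pushes its price down while asset 1 trends up.
r2 = 0.5 * r1 - 0.002

# Asset 3: returns are a negative linear function of r1, so correlation is exactly -1,
# but the added drift means both prices rise over the year.
r3 = -r1 + 0.004

p1, p2, p3 = (100 * np.cumprod(1 + r) for r in (r1, r2, r3))

print("corr(r1, r2) = %+.2f, final prices %.1f vs %.1f"
      % (np.corrcoef(r1, r2)[0, 1], p1[-1], p2[-1]))
print("corr(r1, r3) = %+.2f, final prices %.1f vs %.1f"
      % (np.corrcoef(r1, r3)[0, 1], p1[-1], p3[-1]))
```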

Sensitivity to Parameters

  • His main point here was that a constant should not be varied - otherwise it is not a "constant" - focussing in particular on the volatility used in the B-S model and the calculation of vega as prices are moving.
  • Paul added that sensitivity measures may only apply locally and as such may look comparable from one situation to another, but quants need to understand how outputs respond over a wider range of inputs, and not be inhibited by accepted practices and beliefs (see the sketch below).
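
To illustrate the "local" point, here is a minimal sketch (my own example, not from the talk) comparing the vega-based estimate of a Black-Scholes call price change with the actual change for progressively larger shifts in volatility - the linear estimate is fine for small moves and badly wrong for large ones. Parameters are illustrative:

```python
# Local sensitivity (vega) vs actual repricing for larger volatility moves.
import numpy as np
from scipy.stats import norm

def bs_call(S, K, tau, r, sigma):
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * tau) / (sigma * np.sqrt(tau))
    d2 = d1 - sigma * np.sqrt(tau)
    return S * norm.cdf(d1) - K * np.exp(-r * tau) * norm.cdf(d2)

def bs_vega(S, K, tau, r, sigma):
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * tau) / (sigma * np.sqrt(tau))
    return S * np.sqrt(tau) * norm.pdf(d1)

S, K, tau, r, sigma = 100.0, 120.0, 0.25, 0.02, 0.20   # out-of-the-money call
for dvol in (0.01, 0.05, 0.20):
    linear = bs_vega(S, K, tau, r, sigma) * dvol        # vega (local) estimate
    actual = bs_call(S, K, tau, r, sigma + dvol) - bs_call(S, K, tau, r, sigma)
    print("vol +%2.0f pts: vega estimate %6.3f vs actual repricing %6.3f"
          % (100 * dvol, linear, actual))
```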

Complexity

  • Models need to be robust and transparent, and quants should aim for the mathematical sweet spot.
  • Paul put forward the following analogy: when driving an old car over a long distance, you knew that the car was likely to break down at least once, but you also knew that you could probably fix it. Contrast this with driving a modern sports supercar and finding that it has (unexpectedly?) broken down - you don't know how to fix it, you do not complete your journey and it costs you an inordinate amount of money to put things right...

Self-Referential Feedback

  • Paul described here how the hedging of derivatives contracts in the underlying markets can cause price movements in underlying markets that cause derivatives contracts to re-price that cause more hedging in the underlying markets...
  • He was critical of credit derivative pricing as being too complex and too "mathsy" (...but had to admit that he had also endorsed some of this work at the time)

Calibration

  • Paul said that model parameter calibration is the devil's work...
  • He referred us to inverse problems in mathematics as background to this issue in mathematical finance.
  • He emphasised how markets and price behaviour are fickle, driven by human opinions and behaviours
  • He said that ongoing, regular re-calibration of a model is very, very likely to mean that the model is wrong (his particular example was the calibration of a model he hates in which volatility is a function of the underlying price and time).

David Rowe, Sungard's specialist spokesman on risk management, then took over from Paul and set out his five topics for discussion:

  • Statistical Entropy - fundamentally that information can only be extracted from data, with the emphasis on extraction of information (from that already in the data) rather than creation of new information.
  • Structural Imagination - that we need to be aware of how the market assumptions we make are themselves a model and that we need to spend more time on thinking about what could happen outside our current understanding or market experience.
  • Self-Referential Feedback - the feedback loops in pricing, risk management and economics
  • Complexity and Dark Risk - when you add (untested) complexity of a model to limited data sets you get a recipe for disaster.
  • Alternate Means of Valuation - when the primary means of valuing a security is not available (illiquid markets anyone?), what is the secondary means of calculating value?

Some further notes from David's talk:

  • A AAA rating should imply a failure roughly once every 10,000 years (an implied annual default probability of around 0.01%), with some super senior CDO tranches being rated as better than AAA - David pointed out that even as recently as the early 1990s there were problems in the US housing market indicating that AAA did not mean what it was taken to mean.
  • On structural imagination, David said that quants and risk managers must look for unrepresented variables in a model and track them early to monitor their effects
  • On feedback he cited an example where increased returns drove product innovation which drove up (CDO) volumes, which caused underwriting standards to fall, that allowed further complexity, that then led to unreliable risk estimation which then led to more product innovation... and so on.
  • He suggested that quants adopt a "second means of valuation" mantra, in a similar way to credit specialists who, when assessing credit, always ask "what is the second means of repayment?" (e.g. a lien on a house) should the primary means (mortgage payments) go away.
  • David showed a nice classification from an IASB paper on financial instruments:

Level 1: fair values measured using quoted prices in active markets for the same instrument.

Level 2: fair values measured using quoted prices in active markets for similar instruments or using other valuation techniques for which all significant inputs are based on observable market data

Level 3: fair values measured using valuation techniques for which any significant input is not based on observable market data

David additionally proposed the interesting category of "Level ?" for some products, and said that obviously more attention needs to be spent on Level 2 and 3 instruments under conditions of reduced (non-existent?) market liquidity.

Summary Session:

Paul and David then answered some questions from the audience:

  • Paul said that some risk managers lacked the imagination necessary for good risk management, being confined in standard procedures, beliefs and ways of doing things. He wants risk managers who are good at thinking laterally.
  • Paul said that risk management was often an afterthought, not part of the trading process.
  • David said that VAR has proven useful despite its weaknesses, in his opinion preventing failures from non-extreme events regardless of the recent extremes
  • David said that, in answer to Taleb's criticism of using history in modelling, it quite frankly is all we have to go on. He quoted Mark Twain:

"History does not repeat itself but it does rhyme"

The talks were interesting, and even on points that have been discussed elsewhere both speakers had some interesting slants and good analogies. But maybe I am biased, as the wine afterwards wasn't bad either!...


02 July 2009

Best execution 2009 - July 1st 2009

A few summary points from the Best Execution Europe 2009 event, courtesy of Incisive Media, which I attended yesterday morning.

The event started with a presentation by Michael Fridrich, Legal and Policy Affairs Officer of the European Commission:

  • From what Michael was saying, it seems to me that the EU is using April's G20 declaration on financial stability as a remit to regulate in many areas (not all of which relate to the current crisis, see last paragraph in this post)
  • He said that the EU is currently working on removing national options/discretions with respect to financial markets in order to create a single EU rule book, combining this with stronger powers for supervisors, including much harsher sanctions against offending institutions
  • They are also reviewing the necessary information provided to investors in OTCs, even if the investors qualify as "professional investors" under Mifid.
  • The EU is currently reviewing Mifid and the Market Abuse Directive (called "MAD" which is at least humorous...)
  • The EU is also, unsurprisingly, looking at the regulation of Credit Ratings Agencies (CRAs) given their involvement in rating CDOs and other structured products

So in summary it was a civil servant PR exercise with few surprises, other than that we are going to regulate anything that moves. On to a panel debate on "build vs. buy" for execution management software. I will try and put my obvious vendor bias to one side in summarising this one:

  • The panel summarised that this decision was about the usual issues of time to market and what constitutes an institution's core IP
  • A senior IT manager from JPMorgan said they both build and buy - but given the size of their organisation and the need to innovate they do build a lot
  • The COO of Majedie Asset Management said that "build" was "20th Century" and that IT should now focus on "assembly"
  • He added that if IT leads a procurement process, he finds this tends to lead to more proprietary solutions than if the business is managing it.
  • He summarised that business people should have the mandate to define inputs/outputs to a requirement and that IT were not qualified to do this.
  • Putting it more controversially, he suggested that IT people should work for IT companies
  • The JPMorgan guy responded that "assembly" of external components can lead to excessive staffing to manage all the plumbing, and that building in-house could produce a more generic and targeted platform that would need less management
  • The moderator summarised the build vs. buy decision as one of balancing time to market against how bespoke a solution is, alongside weighing the risks: for buying, 1) integration risk and 2) vendor risk; for building, 1) delivery risk and 2) key man risk

The debate on this was pretty standard, but the guy from Majedie was at least controversial in what he was saying (including at one point that "investment management does not scale"). I assume he is trading simple products and as such is able to outsource more than the JPMorgan manager. My own slant is that more vendor products need to be designed to integrate easily with the IPR of a financial institution i.e. less "black box".

Tom Middleton of Citi then did a presentation on (equity) market liquidity and market fragmentation:

  • He started by saying that Smart Order Routing (SOR) was like "putting Humpty-Dumpty back together again" from all the sources of liquidity now available under Mifid.
  • Being no expert in SOR, I was excited (?) to learn a new term, "finding Icebergs" - apparently an "Iceberg" is a large non-public ("dark") order posted alongside a much smaller public order.
  • He said that market fragmentation will increase further but there will be less trading venues as the market consolidates.
  • New algorithms will be developed more specifically for trading on dark pools of liquidity
  • Clearing and settlement costs are still high across Europe, which limits the use of small-size orders in trading, but trading volumes will continue to grow
  • The drive to ever-lower latency will also continue
  • Usage of SOR will grow

Tom's presentation was then followed by a panel debate on Smart Order Routing:

  • A manager from Baader said that the German market area of Europe was not yet very sophisticated, with most German clients specifying exactly where a trade should be executed, hence nullifying the need for SOR.
  • Deutsche Bank (DB) mentioned that having both US and EU operations had helped them get SOR in place for the EU quicker given their US experience.
  • UBS and Baader both said that algo trading and SOR are increasingly integrated and will merge, with the algo defining what and how to trade and the SOR component determining where
  • DB said that a "tipping point" towards usage of SOR in the EU will occur when more than 20% of trading occurs away from the primary exchanges.
  • DB said that 60% of US liquidity was due to algorithmic trading and that there were now no EU barriers to this happening in European markets and bringing with it increased liquidity, although issues such as not having a consolidated market tape for trading made things more difficult
  • Neonet said that clearing and settlement costs were still a barrier to widescale SOR adoption.
  • IGNIS Asset Management said that SOR was a "high touch" service for them, requiring SOR vendors to be very responsive and client focussed. In selecting SOR vendors they were concerned with data privacy and also with having a real-time reporting facility to see how orders were being filled.

And finally (at least before I had to leave) there was a presentation by Richard Semark of UBS on Transaction Cost Analysis (TCA):

  • He was surprised to find that there were not many presentations around on TCA
  • TCA vendors are behind the times and are not up to date with current developments
  • Historically TCA was about what had happened (about 3-4 months ago!)
  • Mifid has driven fund managers and traders to talk more and TCA is a key part of this conversation
  • It is hard to look bad against traditional TCA measures such as VWAP if a stock is always rising or always falling, and this can hide a lack of performance and "value add" (see the sketch after this list)
  • Using "Dark" for non-displayed liquidity has been a publicity disaster for the electronic trading industry
  • Much Smart Order Routing (SOR) is still based on static tables of trading venues that are updated on a monthly or quarterly basis
  • Market share by volume of a venue is not necessarily correlated with obtaining the best prices in the market
  • TCA should be based upon a dynamic benchmark that responds to the market and the trades done, not a static one
  • Trade performance is not linear with trade size, which is an incorrect assumption in much of TCA
  • Trade risk (variability in outcomes) deserves more focus
  • Portfolio TCA is much more complicated where the trading of a single stock cannot be looked at in isolation of its effects on the whole portfolio
  • Real-Time TCA is becoming ever more important to clients since it allows them to understand more of what is going wrong/right with filling an order
  • TCA providers are not doing a good job for clients, not using the right data or answering the right questions for clients
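
Here is the sketch referred to above on the VWAP point (my own toy numbers, not Richard's): a buy order worked evenly through a steadily rising day looks fine against interval VWAP but has paid well over the arrival price, which is exactly the sort of cost a static, trend-insensitive benchmark can hide:

```python
# Toy example: VWAP slippage vs arrival-price slippage for a buy order
# worked evenly through a steadily rising day (illustrative numbers only).
import numpy as np

prices = np.linspace(100.0, 104.0, 100)   # steadily rising intraday prices
volumes = np.full(100, 1000.0)            # flat market volume profile

vwap = (prices * volumes).sum() / volumes.sum()
arrival = prices[0]
avg_fill = prices.mean()                  # order filled evenly across the day

print("average fill price : %.2f" % avg_fill)
print("slippage vs VWAP   : %+.0f bps" % (10000 * (avg_fill - vwap) / vwap))
print("slippage vs arrival: %+.0f bps" % (10000 * (avg_fill - arrival) / arrival))
```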

Not sure who the TCA providers he refers to are, but maybe I should find out to see what they offer...

Over The Counter Arguments

George Soros has waded back into the current saga concerning OTC derivatives in his article last week in the FT. The main part of the article focusses on financial markets reform, but ends with a vehement attack on derivatives, building upon some of his earlier ideas (see post) and seemingly going much further:

"Finally, I have strong views on the regulation of derivatives. The prevailing opinion is that they ought to be traded on regulated exchanges. That is not enough. The issuance and trading of derivatives ought to be as strictly regulated as stocks. Regulators ought to insist that derivatives be homogenous, standardised and transparent."

He ends by saying that "CDS are instruments of destruction that ought to be outlawed". The extent to which Mr Soros attracts press/political attention is probably something the OTC markets should worry about, although it would seem his views are already consistent with those of many involved in influencing US financial markets policy - take for instance the submission by Christopher Whalen to the US Senate on OTC Derivatives:

"Simply stated, the supra-normal returns paid to the dealers in the closed OTC derivatives market are effectively a tax on other market participants, especially investors who trade on open, public exchanges and markets."

Fortunately however there are also some more balanced views around - I found the following post on the "(in)efficient frontiers" blog, which references the earlier Senate submission by Richard Bookstaber on OTCs. Mr Bookstaber starts by saying that derivatives can improve financial markets, allowing investors to shape returns, exactly meet contingencies and package risk. He also puts forward a very clear summary of how participants have, over recent years, used derivatives to game the system: to avoid tax, to avoid investment mandates, to speculate and to hide risk-taking.

So back to the Soros article: there was a letter in response a few days later from a partner at the legal firm Ashurst's, saying that unfortunately risk does not conform to a standard. On this I agree - standardising contracts can lead to increased complexity. There was a recent example given by a swaps dealer at JPMorgan, who said that a corporate with particular cashflows to be hedged does not want to be dealing with the basis risk and admin of using standardised contracts - the corporate treasurer wants something that matches the exposure they have and takes it away, end of story. Again this is an example of derivatives "risk" not being just about the product type, but also about which institution is holding the contract and what they are using it for (see earlier post).

Not sure, however, how much the Ashurst's partner who wrote the response letter is worried about lucrative legal fees for OTC derivative contracts dying off if Soros-like standardisation occurs - it is a world of vested interests at the moment, never more vested than in a crisis...

Risk in the Hands of the Holder?

Given the ongoing debate about "too big to fail" and whether we should head back to the days of the Glass-Steagall Act, here is a slightly different slant on the problem of systemic risk put forward in an article by Avinash D. Persaud.

In the article, Avinash makes the very good point that increasing capital requirements across the board is not the only response that regulators should consider, and that the risk of a financial product cannot be determined in isolation of who is holding it:

"At the heart of modern regulation is the erroneous view that risk is a quantifiable property of an asset. But risk isn't singular. There are credit, liquidity, and market risks, for instance—and different parts of the financial system have different capacities to hedge each. Thus, risk has as much to do with who is holding an asset as with what that asset is. The notion—popular in the U.S. Congress—that there are "safe" instruments to be promoted and "risky" ones to be banned is deceptive."

Obviously the last point is very relevant to the OTC markets at the moment. Avinash suggests that capital requirements should be tailored to the type of organisation holding a risk and that organisation's ability to hedge it, and outlines past mistakes made by regulators:

"By requiring banks to set aside more capital for credit risks than nonbanks must, regulators unintentionally encouraged banks to shift their credit risks to those who wanted the extra yield but had limited ability to hedge this type of risk. By not requiring banks to put aside capital for maturity mismatches, they encouraged banks to take on liquidity risks they couldn't offset. Moreover, by supporting mark-to-market asset valuations (which make institutions value holdings at their current price) and short-term solvency requirements, regulators discouraged insurers and pension funds from taking the very liquidity risks they are best suited for."

On banks and credit risk, for those interested there is a good example of regulatory arbitrage for credit risk described in the following article. Fundamentally I think the paragraph above illustrates some of the reasons why it is right to worry about rushing in new regulation too quickly - certainly things need to change, but when dealing with large and complex systems (i.e. in this case financial markets) changes should be introduced incrementally in order to understand how the system responds.

Given the political imperative to "do something", regulators find it all too tempting to stick their noses in everywhere, even in areas that did not lead us to the current crisis - take for instance the regulatory initiatives over the past year on short selling, hedge fund regulation and more recently the dangers of "dark pools" (at least dark pools sound scary I guess?). Where will the next "bogey man" appear on the regulators' radar and what will be the unintended consequences of government pressure on regulators to keep us all "safe"?

