
7 posts from January 2012

27 January 2012

PRMIA - Operational Risk, Big Data and Human Behaviour

I attended the Challenges and Innovations in Operational Risk Management event last night, which was surprisingly interesting. I say surprising since I must admit to some prejudice against learning about operational risk, which has for me the unfortunate historical reputation of being on the dull side.

Definition of Operational Risk

Michael Duffy (IBM GRC Strategy Leader, ex-CEO of OpenPages) was asked by the moderator to define Operational Risk. Michael answered that he assumed most folks attending already knew the definition (fair comment, the auditorium was full of risk managers...), but that he sees it in practice as the definition of policies, the controls to enforce those policies and the ongoing monitoring of the performance of the controls. Michael suggested that many were looking to move the scope and remit of Operational Risk into business performance improvement, but that clients are not there yet on this more advanced aspect.

Vik Panwar (Financial Services Industry Lead, SAS) added that Operational Risk was there to mitigate the risks of unexpected future events (getting into the territory of Donald Rumsfeld's Unknown Unknowns, which I never tire of, particularly after a glass of wine).

Rajeev Lakra (Director of Operational Risk Management, GE Treasury) took his definition from Basel II: Operational Risk is the risk of loss resulting from inadequate or failed internal processes, people and systems, or from external events. Coming from GE, he said that he thought of best-practice Operational Risk as similar to another GE initiative, the use of Six Sigma for improving process management. Raj said that his operational risks were mainly concerned with trade execution, covering data quality errors, human error and settlement errors.

Beyond Box Ticking for Operational Risk

Raj said that Operational Risk is treated seriously at GE, with the Head of Operational Risk reporting to the CRO and dedicated Operational Risk leaders in each business division.

Michael suggested that the "regulators force us to do it" motive for Operational Risk had diminished given some of the operational failures during the financial crisis and recent "rogue trader" events, with the majority of institutions post-2008 having created risk committees at "C" level and being much more aware of tail events and of the reputational harm that can damage shareholder value.

Vik said that Operational Risk is concerned primarily with "tail events", which by definition are not limited in size and therefore should be treated seriously. Pragmatically, he suggested that "the regulators need it" should only be used as an excuse if there was no other way to get people to pay attention, but that getting them to understand the importance of it was far more powerful.

The "What's in it for you" Approach to Operational Risk

Raj emphasised that it was possible to make the benefits of operational risk real to people in their everyday jobs, by explaining to operators/managers that if they get frustrated with failures/problems in the working day, then wouldn't it be great if these problems/losses were recorded so that they could justify a process change to senior management. He emphasised that this was a big cultural challenge at GE.

Michael suggested that his clients in financial markets had gone through risk assessment, controls and recording of losses, but had not yet progressed to the use of Operational Risk to improve business performance.

Duplication of Effort

A key point that all the panelists discussed was the overlap at many organisations between Operational Risk, Audit and Compliance. They said that the testing of the controls used for each overlapped considerably, but was based neither on a common nomenclature nor on common systems. For instance, Vik pointed out that many of the tests on controls in Sarbanes-Oxley compliance were re-usable in an Operational Risk context, but that this was not yet happening. Vik said that this pointed to the need for a comprehensive GRC platform rather than many siloed platforms.

Michael said that regulators want an integrated view, but that no institution has an integrated nomenclature as yet. He recounted that one client sent branches 12 different control tests that needed to be filled in for head office, which was a waste of resources and confusing/demotivating for staff. Raj said that the integration of Audit and Operational Risk at GE had proved to be a very difficult process. All agreed that senior management need to get involved and that a five-year vision of how things should be incrementally integrated needs to be put in place.

Audience Questions:

Is business process risk different to business product risk? Michael said that Operational Risk certainly does, and should, cover both internal processes and also the risks produced by the introduction of a new financial product (is it well understood, for instance, and do clients understand what they are being sold?). He added that Operational Risk encompassed both the quantitative (the statistical number of failures, for instance) and the qualitative, for which statistics are either not available or not relevant to the risk.

Are there any surrogate measures for Operational Risk? Here a member of the audience was relaying senior management comments and frustration over the stereotyped red/amber/green traffic-light approach to reporting on operational risk. Michael mentioned the Operational Riskdata eXchange Association (ORX), where a number of financial institutions anonymously share operational risk loss data with a view to using this data to build better models and measures of operational risk. Apparently this has been going on since 2003 and the participants already have a shared taxonomy for Operational Risk. (My only comment on having a single measure of "operational riskiness" is: do you really want a "single number" approach to make things simple for C-level managers to understand, or should the C-levels be willing to understand more of the detail behind the number?)
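
(As a rough illustration of what "better models and measures" built from pooled loss data might look like, here is a minimal Python sketch of a standard frequency/severity "Loss Distribution Approach" simulation. The Poisson and lognormal parameters are purely illustrative assumptions and have nothing to do with actual ORX data.)

```python
import numpy as np

def simulate_annual_losses(freq_lambda, sev_mu, sev_sigma, n_years=100_000, seed=42):
    """Monte Carlo Loss Distribution Approach: Poisson event frequency per year,
    lognormal loss severity per event. Parameters are illustrative assumptions,
    not calibrated to any real (e.g. ORX) loss data."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(freq_lambda, size=n_years)  # loss events in each simulated year
    return np.array([rng.lognormal(sev_mu, sev_sigma, n).sum() for n in counts])

annual = simulate_annual_losses(freq_lambda=25, sev_mu=10.0, sev_sigma=2.0)

# The 99.9% quantile of the simulated annual aggregate loss distribution - the
# kind of "single number" measure of operational riskiness discussed above.
print(f"Illustrative 99.9% annual loss quantile: {np.quantile(annual, 0.999):,.0f}")
```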

Is "Rogue Trading" Operational Risk? Michael said that it definitely was, and that obviously each institution must control and monitor its trading policies to ensure they were being followed. The panel proposed that Operational Risk applied to trading activity could be a good application of "Big Data" (much hyped by industry journalists lately) to understand typical trading patterns and understand unusual trading patterns and behaviours. (Outside of bulk tick-data analysis this is one of the first sensible applications of Big Data so far that I have heard suggested so far given how much journalists seem to be in love with the "bigness" of it all without any business context to why you actually would invest in it...sorry, mini-rant there for a moment...)

Summary

A good event with an interesting panel; the GE speaker had lots of practical insight and the vendor speakers were knowledgeable without toeing the marketing line too much. Operational Risk seems to be growing up in its linkage into and across market, credit and liquidity risk. The panel agreed, however, that it is very early days for the discipline and that a lot more needs to be done.

Given the role of human behaviour in all aspects of the recent financial crisis, in my view Operational Risk has a lot to offer but also a lot to learn, not least in that I think it should market itself more aggressively as the field of risk management that encompasses the study and understanding of human behaviour. Maybe there is a new career path looming for anthropologists in financial risk management...

20 January 2012

The Volcker Rule - aka one man's trade is another man's hedge

One of the PRMIA folks in New York kindly recommended this paper on the Volcker Rule, in which Darrell Duffie criticises this proposed new US regulation designed to drastically reduce proprietary ("own account") trading at banks.

As with all complex systems like financial markets, the more prescriptive the regulations become, the harder it is to "lock down" the principles that were originally intended. In this case the rules (due July 2012) make an exception to the proprietary trading ban where the bank is involved in "market-making", but Darrell suggests that the basis for deciding which types of trades are "market-making" and which are pure "proprietary trading" is problematic, as there will always be trades that are part of the "market-making" process (i.e. providing immediacy of execution to customers) but that are not directly and immediately associated with actual customer trading requests.

He suggests that the consequences of the Volcker Rule as currently drafted will be higher bid-offer spreads, higher financing costs and reduced liquidity in the short term, and a movement of liquidity to unregulated entities in the medium term, possibly further increasing systemic risk rather than reducing it. Seems like another example of "one man's trade is another man's hedge" combined with "the law of unintended consequences". The latter law doesn't give me a lot of confidence about the Dodd-Frank regulations (of which the Volcker Rule forms part); 2,319 pages of regulation probably have a lot more unintended consequences to come.

19 January 2012

In quiet praise of introverts

Corporate (and social) America does lots of things very well - positivity, enthusiasm and a lack of (English?) cynicism being some of the best attributes in my view - but other things are not so good, such as long "townhall" conference calls with 30 people on the call and only 3 people taking part, and the seeming need to continue talking when it is already evident to you and many listening that you don't know what you are talking about. With these things in mind, I think the article "The Rise of the New Groupthink" in the New York Times is worth a read, as it challenges some of the mainstream practices around corporate collaboration and teaming, and comes out in quiet praise of the creative power of introverts. Seems like Dilbert's cubicle still has its merits in these days of open-plan offices and desk sharing.

18 January 2012

The financial crisis and Andrew Lo's reading list

I spotted this in the FT recently - for those of you diligent enough to want to read more about the possible causes of, and solutions to, the (ongoing) financial crisis, Andrew Lo may have saved us all a lot of time with his 21-book review of the financial crisis. Andrew reviews 10 books by academics, 10 by journalists and one by former Treasury Secretary Henry Paulson.

Andrew finds a wide range of opinions on the causes of and solutions to the crisis, which I guess in part reflects that, regardless of the economic/technical causes, human nature is both at the heart of the crisis and evidently also at the heart of its analysis. He regards the differences in opinion as quite healthy, in that they will be a catalyst for more research and investigation. I also like the way Andrew starts his review with a description of how people's views of the same events they have lived through can be entirely different, something that I have always found interesting (and difficult!).

A quote from Napoleon (that I am in danger of over-using) seems appropriate to Andrew's review: "History is the version of past events that people have decided to agree upon", but maybe Churchill wins in this context with: "History will be kind to me for I intend to write it." Maybe we should all get writing now before it is too late...

13 January 2012

Latest from the EDM Council

Click here for an executive summary of what the EDM Council is up to on regulation, the LEI, the Semantics Repository etc. Due credit to the Council for getting Bloomberg on board - it sounds increasingly like Bloomberg may have decided to treat the topic seriously, as opposed to assuming that having a terminal solves everything.

12 January 2012

Pandit on Comparing Apples and Risk

For someone who has been criticised a lot over recent years, Vikram Pandit, CEO of Citigroup, seems to have come up with an interesting risk management idea in his latest article in the FT. Vikram proposes that regulators put together a standard, multi-asset "benchmark" portfolio on which all financial institutions would have to provide risk numbers, enabling regulators to understand more about the risk management capabilities of each institution while avoiding any detailed disclosure of the portfolio actually held by each firm.

I guess a key thing would be that such numbers would have to be disclosed to the regulator away from public view, since we all know that otherwise the numbers would converge and all the banks would be doing the same thing (or at least copying each other's numbers?). Reminds me of a great talk at the RiskMinds event a few years back, praising diversity of approach and criticising regulators for effectively forcing everyone to do the same thing.
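
(A minimal Python sketch of the idea, under made-up numbers: the regulator fixes the benchmark portfolio weights, each firm applies its own internal risk parameters to that same portfolio, and the regulator looks at the dispersion in the reported VaR figures. The weights, volatilities, correlation and firm names below are all hypothetical, and a simple parametric VaR stands in for whatever internal model each firm would actually run.)

```python
import numpy as np
from scipy.stats import norm

# Hypothetical regulator-specified benchmark portfolio: fixed weights across
# four asset classes (e.g. equities, rates, credit, FX) - numbers are made up.
weights = np.array([0.40, 0.30, 0.20, 0.10])

def parametric_var(weights, vols, corr, confidence=0.99):
    """One-day parametric VaR (as a fraction of portfolio value) of the benchmark
    portfolio under a firm's own volatility/correlation estimates."""
    cov = np.outer(vols, vols) * corr
    port_vol = np.sqrt(weights @ cov @ weights)
    return norm.ppf(confidence) * port_vol

# Three hypothetical firms with different daily volatility estimates for the
# SAME portfolio - the dispersion in reported VaR is what the regulator learns.
corr = np.full((4, 4), 0.3) + 0.7 * np.eye(4)
firms = {
    "Firm A": np.array([0.012, 0.006, 0.008, 0.010]),
    "Firm B": np.array([0.015, 0.005, 0.010, 0.009]),
    "Firm C": np.array([0.010, 0.007, 0.007, 0.012]),
}
for name, vols in firms.items():
    print(f"{name}: 1-day 99% VaR = {parametric_var(weights, vols, corr):.3%}")
```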

11 January 2012

FaceBank

Thought-provoking post by Alex Bray on finextra.com, about how internet banking sites are becoming outdated just like physical "branches" of banks did, and how they need to integrate more tightly with social networking sites (what doesn't these days?). The power of the network continues to rise, and it seems like FaceBank is becoming a reality (see the first part of my "tongue in cheek" Wilmott article from a few years back).

