79 posts categorized "Analytics Management"

24 November 2014

PRMIA Risk Year in Review 2014

PRMIA put on their Risk Year in Review event at the New York Life Insurance Company on Thursday. Some of the main points from the panel, starting with trade:

  • The world continues to polarize between "open" and "closed" societies with associated attitudes towards trade and international exposure.
  • US growth at around 3% is better than the rest of the world, but this progress is not yet being felt by much of the population.
  • This is against an economic background of Japan, Europe and China all struggling to maintain "healthy" growth (if at all).
  • Looking back at the financial crisis of 2008/9, it was the WTO rules in place that kept markets open and prevented isolationist and closed policies from really taking hold - although such populist inward-looking policies are still a major issue and risk for the global economy today.
  • There have, however, been some recent examples of optimistic progress on world trade.
  • The US Government is divided and needs to get back to pragmatic decision making.
  • The Federal Reserve currently believes that external factors/the rest of the world are not major risks to growth in the US economy.

James Church of sponsor FINCAD then did a brief presentation on their recent experience and a recent survey of their clients in the area of valuation and risk management in financial markets:

  • Risk management is now considered a source of competitive advantage by many institutions
  • 63% of survey respondents are currently involved in replacing risk systems
  • James gave the example of Alex Lurye, saying that risk is a differentiator
  • Aggregate view of risk is still difficult due to siloed systems (hello BCBS239)
  • Risk aggregation also needs consistency of modelling assumptions, data and analytics all together if you are to avoid adding apples to pears
  • Institutions now need more flexibility in building curves post-crisis with OIS/Libor discounting (see FINCAD white paper, and the short sketch after this list)
    • 70% of survey respondents are involved in changes to curve basis
  • Many new calculations to be considered in collateralization given the move to central clearing
  • 62% of survey respondents are investing in better risk management processes, so not just technology but people and process as well
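
To make the curve-building point above a little more concrete, here is a minimal dual-curve sketch in F# (the language featured in the F# in Finance posts below): forward rates are projected off a Libor curve, and the resulting cashflows are discounted off a separate OIS curve. The flat zero rates and annual accrual are my own illustrative assumptions, not FINCAD's methodology - real implementations bootstrap both curves from market instruments.

    // Minimal dual-curve sketch: project forwards off a Libor curve,
    // discount the resulting cashflows off an OIS curve.
    // Flat continuously-compounded zero rates are an illustrative assumption.
    let df rate t = exp (-rate * t)    // discount factor from a flat zero rate

    let liborZero = 0.030              // projection (Libor) curve at 3.0%
    let oisZero   = 0.025              // discounting (OIS) curve at 2.5%

    // Simply-compounded forward rate implied by the projection curve
    let forwardRate zero t1 t2 = (df zero t1 / df zero t2 - 1.0) / (t2 - t1)

    // PV of a unit-notional floating leg with annual accrual over n years,
    // projected on Libor but discounted on OIS
    let floatingLegPV n =
        [ 1 .. n ]
        |> List.sumBy (fun i ->
            let t1, t2 = float (i - 1), float i
            forwardRate liborZero t1 t2 * (t2 - t1) * df oisZero t2)

    printfn "5y floating leg PV: %.6f" (floatingLegPV 5)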

James was followed by a discussion on market/risk events this year:

  • Predictions are hard, but 50 years ago Isaac Asimov made 10 predictions for 2014, and 8 of them have come true
  • Bonds and the Dollar are still up but yields are low - a result of the relatively poor performance of other currencies and the domestic strength of the US economy. The US is firmly post-crisis economically, and markets are anticipating both oil independence and future interest rate movements.
  • Employment level movements are no longer a predictor of interest rate moves; the balance of payments now matters more
  • October 15th saw a 40bp movement in yields in 3 hours (a 7 standard deviation move - see the quick sanity check after this list) - this was more about positioning/liquidity risk in the absence of news, and an illustration of how regulation has moved power from banks to hedge funds
  • Risk On/Off - trading correlation is very difficult - a rising oil price used to mean rising demand, but with the 30% dive in price over the past 6 months that correlation has changed
  • On the movie Interstellar: on one planet an astronaut sees a huge mountain, but another sees that it is a wave larger than anything seen before - it all depends on forming your own view of the same information, and on what you perceive or understand as risk
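
As a quick sanity check on that October 15th bullet (my arithmetic, not the panel's): if a 40bp move registers as 7 standard deviations, the implied standard deviation for that window is only around 5.7bp.

    // Back-of-envelope check: a 40bp move described as 7 sigma
    // implies a standard deviation of roughly 5.7bp for that window.
    let moveBp = 40.0
    let sigmas = 7.0
    printfn "Implied sigma: %.1f bp" (moveBp / sigmas)   // ~5.7 bp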

Some points on macroeconomics:

  • Modest slowdown this quarter
  • Unemployment to drop to 5.2% in 2015 from 5.8%
  • CS see the Fed hiking rates in mid-2015 followed by 3 further hikes
    • The market does not yet agree, seeing a move in Q3 2015
  • Downside risks are inflation, slow US growth and anaemic wage growth
  • Upside risks - the oil price boost to spending, with the cost of gas falling from 3.2% to 2.4% of disposable income

Time for some audience questions/discussions:

  • One audience member asked the panel for thoughts on the high price of US Treasuries
  • Quantitative Easing (QE) was (understandably) targeted as having distorting effects
  • Treasury yields have been a proxy for the risk-free rate in the past, but the volatility in this rate due to QE has a profound effect on equity valuations (a quick illustration follows this list)
  • Replacing maturing bonds with lower yielding instruments is painful
  • The Fed is concerned not to appear to lose control of interest rates, nor does it want to kill the fixed income markets, so rate rises will be slow.
  • One of the panelists said that all this had a human dimension, not just a markets one, citing effectively non-existent interest rates alongside negative equity still present in Florida, no incentive to save so money heads into stocks (which is risky), and low rates being of little benefit to senior citizens.
  • Taper talk last year saw a massive sell-off of emerging market currencies - one problem in assessing this is defining which economies are emerging markets - but the key point is that current account deficits/surpluses matter, something the US escapes as the world's reserve currency but emerging markets do not.
  • The emerging market boom of the past was really a commodities boom; the US still leads the world's economies, and current challenges may expose the limits of authoritarian capitalism
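
On the equity valuation point above, a one-line Gordon growth model is enough to show the sensitivity - the numbers below are mine and purely illustrative, but a 50bp drop in the discount rate lifts the model price by around 14%.

    // Gordon growth model: price = dividend / (discountRate - growth).
    // Illustrative numbers only: a 50bp move in the discount rate
    // shifts the model price by around 14%.
    let price dividend discountRate growth = dividend / (discountRate - growth)

    printfn "r = 6.0%%: %.2f" (price 2.0 0.060 0.02)   // 50.00
    printfn "r = 5.5%%: %.2f" (price 2.0 0.055 0.02)   // 57.14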

The discussion moved onto central clearing/collateral:

  • Interest rate assets for collateral purposes are currently expensive
  • Regulation may exacerbate volatility with unintended consequences
  • $4.5T of collateral set aside currently set to rise to $12-13T
  • The risk is that sovereign nations will target the production of "AAA" securities for collateral use that are not truly AAA
  • Banks will not be the place for risk, the shadow banking system will
  • Futures markets may be under-collateralized and a source of future risk

One audience member was interested in downside risks for the US and couldn't understand why anyone was pessimistic given the stock market performance and other measures. The panel put forward the following as possible reasons behind a potential slow down:

  • Income inequality means the benefits are not felt throughout the economy
  • Corporations are making more and more money but without a proportionate increase in jobs
  • Wages are flat and senior citizens are struggling
  • (The financial district is not representative of the rest of the economy in the US however surprising that may be to folks in Manhattan)
  • The rest of the US does not have jobs that make them think the future is going to get better

Other points:

  • Banks have badly underperformed the S&P
  • Regulation is a burden on the US economy that is holding US growth back
  • Republicans and Democrats need to co-operate much more
  • House prices need more oversight
  • Currently $1.2T in student loans, and students are not expecting to earn more than their parents
  • Top 10 oil producers are all pumping full out
    • The Saudis are refusing to cut production
    • Venezuela funding policies from oil
    • Russia desperately generating dollars from oil
    • Will the US oil bonanza break OPEC - will they be able to co-ordinate effectively given their conflicting interests?

Summary - overall a good event with a fair amount of economics to sum up the risks for 2014 and on into 2015. The food and wine afterwards were tolerably good too!

 

05 November 2014

Data Management Summit NYC from the A-Team

The A-Team put on another good event at DMS New York yesterday. Lots of good stuff was talked about, and here are a few takeaways that I remember, after a photo of Ludwig D'Angelo of JPMorgan:

[Photo: Ludwig D'Angelo of JPMorgan]

  • Data Utilities - One of the presenters said that "Data Utility" was a really overused term, second only to "Big Data". My comment would be that a lot of the managed services folks seem to want to talk about "Data Utilities", preferring that term to what they actually are - maybe because they perceive it as better marketing, and/or maybe because they hope to be anointed/appointed (how, I don't know) as an industry "Data Utility". Anyway, for me they fail to address the issue of client-specific data and its management very well, much to the detriment of their argument imho - although SmartStream did say that client data can be mixed into the data services they offer.
  • Andrew Gets Literaturally Physical - Andrew Delaney of the A-Team expressed a preference for "physical" books when talking about why the A-Team also prints the Regulatory Data Handbook 2 as well as making it available online. I have to agree that holding a book still beats my Kindle experience, but maybe I am just getting old. Andrew should check out this YouTube video on how the book was first introduced...
  • FIBO - The Financial Instrument Business Ontology (FIBO) was discussed in the context of trying to establish industry standards for data. As ever, I suspect the usage of words like "Ontology" leaves a lot of business folks looking for the nearest double shot of espresso, but that aside, it seems like the EDM Council are making some progress on developing this standard. The main point from the event was that industry adoption is key. I found some of the comments during the day a bit schizophrenic, in that some said that the regulators should not mandate standards (i.e. leave it to industry adoption and principles) but then in the next breath discussed the benefits (or otherwise) of the LEI (ok, not mandated but specific and coming from the regulators). Certainly the industry needs "help" (is that a strong enough word?) to get standards in place.
  • Data Quality - Lots on data quality with assessing the business value of data quality initiatives being a key point. On the same subject, Predrag of element-22 announced that the EDM Council will soon be announcing adoption of the Data Quality Index, which could be used to correlate data quality with operational KPIs for the business. 
  • Regulation (doh!) - It wouldn't be a data management event without lots of discussion on regulation - a key point being that even those regulations that are not directly/explicitly about data still imply that data management is key (take CVA calcs for example) - and on a related note it was suggested that BCBS239 should be considered as a more general data management template for any business objective.
  • Entity Hierarchies/LEI - Ludwig D'Angelo of JPMorgan gave a great talk and said that vendors were missing a massive opportunity in delivering good hierarchy datasets to clients, and that the effort expended on this at firms was enormous. Ludwig said that the lack of hierarchies in the Legal Entity Identifier (LEI) is a gap that the private sector could and should fill. Ludwig also seemed initially to be thrown when one of the audience suggested that there were multiple "golden copies" of hierarchies needed, since definitions of ownership can differ depending on which department you are in (the old battle of risk and finance departments again). Good discussion later of how regulation was driving all systems to be much more entity-centric rather than portfolio-centric, emphasising the importance of getting entity hierarchies right.
  • DCAM - John Bottega did a great presentation on the Data Management Capability Model (DCAM). John asked Predrag of element-22 to speak about DCAM, and he said that unlike previous models (DMM), this framework would not only assess where you are in data management but also show you where you need to go. DCAM covers data management strategy / operations / quality / business case / data architecture / tech architecture / governance / program. From what I could see it looked like a great framework - it appeared like common sense and obvious, but that is in itself difficult to achieve, so good effort I think. Element-22 will offer an online service around DCAM that will also allow anonymous benchmarking of data management capabilities as more institutions get involved (update: the service is called pellustro).
  • BCBS239 - Big thanks to John M. Fleming of BNY Mellon and Srikant Ganesan of Risk Focus for taking part in the panel with me. There was less focus on spreadsheet use and abuse on this panel, unlike the London panel from last month. John had some very practical ideas, such as the use of Wikis to publish/gather data dictionary information, and that with a large legacy infrastructure you are better off documenting differences in definitions across systems rather than trying to change the world from day one. Echoing some of the points from DMS London, it was thought that making the use of internal data standards part of project sign-off was very pragmatic data governance, but also that some systems should be marked/assessed as obsolete/declining and hence blocked from any additional usage in new project work. Bit of a plug for some of our recent work on data validation and exception management (a minimal sketch of that validation pattern follows this list), but the panel said that BCBS239 needs to encompass audit/lineage on calculations/derived data/rules in addition to just the raw data
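
For anyone wondering what the validation/exception pattern mentioned in the last bullet looks like in practice, here is a minimal, hypothetical sketch - the record shape, rule names and thresholds are all my own illustrations, not any product's API. The key idea is that rules flag exceptions for review rather than silently fixing data, so audit and lineage are preserved.

    // Minimal validation/exception sketch: rules flag exceptions for review
    // rather than silently fixing data, preserving audit and lineage.
    // Record shape, rule names and thresholds are illustrative only.
    type PricePoint = { Instrument: string; Price: float; PrevPrice: float }

    let rules : (string * (PricePoint -> bool)) list =
        [ ("NonPositivePrice", (fun p -> p.Price <= 0.0))
          ("LargeDayMove",     (fun p -> abs (p.Price / p.PrevPrice - 1.0) > 0.10)) ]

    // Run every rule over every point, collecting (instrument, rule) exceptions
    let validate points =
        [ for p in points do
            for (name, failsRule) in rules do
                if failsRule p then yield (p.Instrument, name) ]

    validate [ { Instrument = "XYZ Corp"; Price = 101.0; PrevPrice = 100.0 }
               { Instrument = "ABC Inc";  Price = -1.0;  PrevPrice = 5.0 } ]
    |> List.iter (fun (inst, rule) -> printfn "%s failed %s" inst rule)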

You can get more on the day by taking a look at my feed via @TheLongSentance, and at what others were saying on #DMSNYC.

 

16 October 2014

TabbForum MarketTech 2014: Game of Smarts

A great afternoon event put on by TabbFORUM in New York yesterday, with a number of panels and one-on-one interviews (see agenda). You can see some of what went on at the event via the hashtag #TabbTech or via the @XenomorphNews feed.

[Photo: "Death of Legacy" Panel Discussion]

13 October 2014

A-Team DMS London Event and BCBS239 Panel

Good day at the A-Team's DMS London event last Wednesday. The day started with Tom Dalglish doing a pretty passable impression of a stand-up comedian in the morning keynote to open the day - not exactly an easy thing to do if 1) you are asked to do it very much at the last minute and 2) this is data management, not the subject most comedians would immediately reach for. So due kudos to Tom, and some of the comments he made about technology architects and technology builders were funny and resonated with the audience, such as this quote coming from a technologist: "How can I give you the requirements, I haven't finished the code yet?" (I think we have all been there on that one a few times in our careers...).

You can find some of the main points from the various panels via @XenomorphNews or more generally at #dmslondon (you could also find out a bit via my twitter account @TheLongSentance so long as you don't mind the odd photograph and a few bits of personal baggage now and again).

BCBS239 Panel - I took part in the panel on BCBS239 on risk data aggregation and reporting, something which I have written about before, and obviously a prime example of how regulation is influencing (dictating?) financial markets institutions to take data management seriously. Dennis Slattery of EDMWorks moderated the panel, and on the panel with me was Sally Hinds of DCMS, and Mikael Soboen, head of risk systems at BNP Paribas. 

[Photo: BCBS239 Panel at DMS London]

Dennis started by outlining the four pillars of BCBS239:

  • Pillar 1 - "Overarching Governance and Infrastructure"
  • Pillar 2 - "Risk aggregation" capabilities
  • Pillar 3 - "Risk reporting" capabilities
  • Pillar 4 - "Supervisory review, tools and cooperation"

Regulatory Chicken - Dennis started by asking the panel whether BCBS239 was another game of regulatory "chicken", where the approach of "principles" means 1) the banks do the minimum and wait for the regulators to inspect and tell them specifically what they have to do, and 2) the regulators don't really want to be more specific beyond principles because they themselves are unsure of what is needed and want to learn from what different banks have done. The general consensus from the panel debate was that firms were not doing as much as they could, but that banks needed to show at least that they had a program in place and running by the January 2016 deadline or face big issues with the regulators (so the game of regulatory chicken is "on", it seems). Mikael Soboen added that he was unsure whether his regulator would have the time to conduct the BCBS239 review given the workload that the regulators currently faced.

The End of Spreadsheets? - Dennis asked whether BCBS239 and the requirement for clear data lineage sounded the bell for the end of spreadsheet usage at banks. I said not - I personally feel that a lot of folks in technology underestimate how difficult using software is for many business users, and tools like spreadsheets that make manipulating data easy will have a role for the foreseeable future. I suggested that spreadsheets are a great ad hoc reporting and analysis tool, and things mainly go wrong when they are used as a personal, "siloed" desktop database.

BCBS239 does not itself preclude the usage of spreadsheets and end user computing, but rather like a lot of regulation says that their usage must be taken seriously - in my view there is a tendency for some in IT to regard spreadsheets as someone else's problem, which is understandable but problematic for any CDO. Also there are approaches to spreadsheet usage that can help maintain data lineage, such as what Microsoft offers with web provision of spreadsheet dashboards using PowerView and PowerBI (used in our TimeScape MarketPlace offering), folks such as Cluster7 with their "closed circuit TV" for spreadsheet monitoring, and indeed Xenomorph with our SpreadSheet Inside approach of including centralised spreadsheet-like calculations as a supported data type within the audited data management process.

Data Dictionary - Mikael said that one responsibility he had was to represent the investment bank within the wider data dictionary initiatives due to BCBS239 at the retail bank, and said that this was challenging given the different terminology sometimes used. 

Is BCBS239 a Project or Data Governance? - The panel thought that the best approach was to use BCBS239 as a framework for compliance with current regulation and regulation to come, subject obviously to having the budget to do so. There were some general comments on how the data management needs of the front office and risk were converging. Standards such as FIBO were also discussed, with the feedback being that they are desirable but that it is early days, and their immaturity means they are often used for specific areas such as modeling counterparty data.

Overall a good panel (I hope!) with a good amount of audience questions and participation. Again, you can find some of the main points from the various panels via @XenomorphNews or more generally at #dmslondon (you could also find out a bit via my twitter account @TheLongSentance so long as you don't mind the odd photograph and a few bits of personal baggage now and again).

[Photo: A bit of fun - Brian looking up to Ron Wilbraham at DMS London]

11 September 2014

A-Team DMS Awards 2014 - Xenomorph on the Cloud

Voting in the A-Team’s DMS Data Management Awards closes on the 26th of September, so if you haven't already, please vote for Xenomorph!

Xenomorph on the Cloud - First of a few lookbacks at what we have been doing over the past year, starting with a short animation about one of our major initiatives this year: cloud provision of data management and a new venture into cloud-based data publishing with the TimeScape MarketPlace.

So it would be fantastic if you could support Xenomorph by voting here

Thank you!

05 August 2014

A-Team DMS Data Management Awards 2014

Very pleased to announce that we have been nominated again this year in the A-Team’s DMS Data Management Awards. The categories we’ve been selected for are: 

  • Best Sell-Side Enterprise Data Management Platform
  • Best Buy-Side EDM Platform
  • Best EDM Platform (Portfolio Pricing & Valuations)
  • Best Risk Data Aggregation Platform
  • Best Analytics Platform.

Last year we were delighted to win the Best Risk Data Management/Analytics Platform award – even more so as the awards are voted for by our clients and industry peers.

So if you would like to support us again this year the voting is open now:

http://referencedatareview.hs-sites.com/data-management-summit-awards-2014-survey

and runs through to the 26th September. The award winners will be announced at A-Team’s Data Management Summit, at the America Square Conference Centre in London on October 8th.

01 July 2014

Cloud, data and analytics in London - thanks for coming along!

We had over 60 folks along to our event at the Merchant Taylors' Hall last week in London. Thanks to all who attended and all who helped with the organization of the event, and sorry to miss those of you that couldn't come along this time.

Some photos from the event are below starting with Brad Sevenko of Microsoft (Director, Capital Markets Technology Strategy) in the foreground with a few of the speakers doing some last minute adjustments at the front of the room before the guests arrived:

[Photo]

 

Rupesh Khendry of Microsoft (Head of World-Wide Capital Markets Solutions) started off the presentations at the event, introducing Microsoft's capital markets technology strategy to a packed audience:

[Photo]

 

After a presentation by Virginie O'Shea of Aite Group on Cloud adoption in capital markets, Antonio Zurlo (below) of Microsoft (Senior Program Manager) gave a quick introduction to the services available through the Microsoft Azure cloud and then moved on to more detail around Microsoft Power BI:

[Photo]

 

After Antonio, yours truly (Brian Sentance, CEO, Xenomorph) gave a presentation on what we have been building with Microsoft over the past 18 months: the TimeScape MarketPlace. At this point in the presentation I was giving some introductory background on the challenges of regulatory compliance and the pros and cons between point solutions and having a more general data framework in place:

[Photo]

 

The event ended with some networking and further discussions. Big thanks to those who came forward to speak with me afterwards, great to get some early feedback.

[Photo]

 

24 June 2014

Cloud, data and analytics in London. Tomorrow Wednesday 25th June.

One day to go until our TimeScape MarketPlace breakfast briefing "Financial Markets Data and Analytics. Everywhere You Need Them" at Merchant Taylor's Hall tomorrow, Wednesday June 25th. With over ninety people registered so far it should be a great event, so if you can make it please register and come along - it would be great to see you there.

19 June 2014

Cloud, data and analytics in London. Next Wednesday June 25th.

Less than one week to go until our TimeScape MarketPlace breakfast briefing "Financial Markets Data and Analytics. Everywhere You Need Them" at Merchant Taylor's Hall on Wednesday June 25th. 

Come and join Xenomorph, Aite Group and Microsoft for breakfast and hear Virginie O'Shea of the analyst firm Aite Group offering some great insights from financial institutions into their adoption of cloud technology, applying it to address risk management, data management and regulatory reporting challenges.

Microsoft will be showing how their new Power BI can radically change and accelerate the integration of data for business and IT staff alike, regardless of what kind of data it is, what format it is stored in or where it is located.

And Xenomorph will be demonstrating the TimeScape MarketPlace, our new cloud-based data mashup service for publishing and consuming financial markets data and analytics. 

In the meantime, please take a look at the event and register if you can come along, it would be great to see you there.

18 June 2014

New Client - Mizuho Securities USA

Very pleased to announce that Mizuho Securities USA has completed a successful implementation of TimeScape, you can see the press release here and more detail is available in this article on Inside Reference Data. Big thank you to all those involved in making this happen, both at Mizuho and on the Xenomorph team.

11 June 2014

Financial Markets Data and Analytics. Everywhere London Needs Them.

Pleased to announce that our TimeScape MarketPlace event "Financial Markets Data and Analytics. Everywhere You Need Them" is coming to London, at Merchant Taylor's Hall on Wednesday June 25th. 

Come and join Xenomorph, Aite Group and Microsoft for breakfast and hear Virginie O'Shea of the analyst firm Aite Group offering some great insights from financial institutions into their adoption of cloud technology, applying it to address risk management, data management and regulatory reporting challenges.

Microsoft will be showing how their new Power BI can radically change and accelerate the integration of data for business and IT staff alike, regardless of what kind of data it is, what format it is stored in or where it is located.

And Xenomorph will be demonstrating the TimeScape MarketPlace, our new cloud-based data mashup service for publishing and consuming financial markets data and analytics. 

In the meantime, please take a look at the event and register if you can come along, it would be great to see you there.

14 May 2014

Clients and Partners. Everywhere You Need Them.

Quick thank you to the clients and partners who took some time out of their working day to attend our breakfast briefing, "Financial Markets Data and Analytics. Everywhere You Need Them." at Microsoft's Times Square offices last Friday morning. Not particularly great weather here in Manhattan, so it was great to see around 60 folks turn up...

[Photo]

 
Rupesh Khendry of Microsoft (Head of World-Wide Capital Markets Solutions) started the event and set out the agenda for the morning. Rupesh described the expense of data within financial markets, and the difficulties experienced by risk managers in pulling together all the data and analytics they need...

[Photo]
 
...and following Rupesh was Antonio Zurlo (below) of Microsoft (Senior Program Manager), who explained the fundamentals of Microsoft Azure and what services and infrastructure it offers, including public cloud, virtual private cloud and hybrid cloud architectures. Antonio also described a key usage pattern for HPC/grid on Azure being used to "burst to the cloud" when on-premise infrastructure needs to be extended for end/intra-day risk calcs...
[Photo]
 
Sang Lee (below) of Aite Group (Managing Partner) then delivered his presentation "Floating in the Capital Markets Cloud: Moving Beyond Data Storage". Sang's main findings from the survey of 20 financial institutions were that concerns about security and SLAs relating to cloud usage remain, but even those that were concerned about this also said they were planning to start a cloud project within the next 24 months. Cloud technology seems to be becoming more acceptable of late, and Sang said this seems to be due to regulation, cost pressures and the desire to offer better services to clients. Sang confirmed that HPC/Grid with "burst to the cloud" is a common usage pattern and that "Data as a Service" is becoming more popular...
[Photo]
 
Fred Veasley (below) of Microsoft (Tech Solutions Professional) was next, introducing Microsoft Power BI and Office 365. Fred explained how Power BI extends the capabilities of Excel with data search (finding and retrieving published data sources both within an organization and over the web), its integration capabilities with standard databases, NoSQL databases, data standards such as OData and new APIs/sources of data such as Facebook. Once downloaded, the data can be shaped and merged with other datasets (for instance combining data from positions databases/systems with analytics and data from the cloud), and kept up to date automatically. In addition to Power BI, Power View enables great visualizations and interactive dashboards to be created, and once finalized these can be deployed centrally via web pages down to end users...
[Photo]
 
After Fred, Brian Sentance (below), CEO of Xenomorph, explained the origins of the TimeScape MarketPlace. Based on some discussions with Microsoft about 18 months back, the idea was effectively firstly to get TimeScape running in the Microsoft Azure cloud, secondly to turn the data management capabilities of TimeScape "upside-down" by using it as a means to upload and publish data to the cloud, and thirdly to provide one-to-many access to multiple sources of data via web interfaces and key delivery tools such as Microsoft Power BI. Put another way, without any local software or hardware infrastructure, both business users and IT staff can access multiple data sources in the same format and using the same data model wherever the data is needed. In addition to .NET and Java interfaces to the TimeScape MarketPlace via OData, web API delivery into F#, Python, R and MATLAB are all in development...
[Photo]
 
...and in addition to downloading data via Power BI, Brian also demonstrated how you could build on the data using "Power View" to create powerful analytical dashboard functionality that could be built and tested in Excel, then deployed centrally within a browser for access by users outside of Excel. He added that partners were one of the key aspects of the platform, and introduced the TimeScape MarketPlace Partner Program to get data, analytics and model vendors, software and service vendors involved and building on the platform. Andrew Tognela (below) of Microsoft (Worldwide Managing Director) closed the presentations...
[Photo]

02 May 2014

7 days to go - Financial Markets Data and Analytics. Everywhere You Need Them.

Quick reminder that there are just 7 days left to register for Xenomorph's breakfast briefing event at Microsoft's Times Square offices on Friday May 9th, "Financial Markets Data and Analytics. Everywhere You Need Them."

With 90 registrants so far it looks to be a great event, with presentations from Sang Lee of Aite Group on the adoption of cloud technology in financial markets, Microsoft showing the self-service (aka easy!) data integration capabilities of Microsoft Power BI for Excel, and an introduction to the TimeScape MarketPlace, Xenomorph's new cloud-based data mashup service for publishing and consuming financial markets data and analytics.

Hope to see you there and have a great weekend!

 

30 April 2014

Xenomorph Releases TimeScape Data Validation Dashboard

Very pleased to announce the general availability of the TimeScape Data Validation Dashboard, which we announced this morning. You can find out more here. Big thank you to all the staff and clients involved, who have helped us to put this together over the past year.

15 April 2014

Financial Markets Data and Analytics. Everywhere You Need Them.

Very pleased to announce that Xenomorph will be hosting an event, "Financial Markets Data and Analytics. Everywhere You Need Them.", at Microsoft's Times Square New York offices on May 9th.

This breakfast briefing includes Sang Lee of the analyst firm Aite Group offering some great insights from financial institutions into their adoption of cloud technology, applying it to address risk management, data management and regulatory reporting challenges.

Microsoft will be showing how their new Power BI can radically change and accelerate the integration of data for business and IT staff alike, regardless of what kind of data it is, what format it is stored in or where it is located.

And Xenomorph will be introducing the TimeScape MarketPlace, our new cloud-based data mashup service for publishing and consuming financial markets data and analytics. More background and updates on MarketPlace in coming weeks.

In the meantime, please take a look at the event and register if you can come along, it would be great to see you there.

07 April 2014

When Big Data is not Big Understanding

Good article from Tim Harford (he of the enjoyable "Undercover Economist" books) in the FT last week called "Big data: are we making a big mistake?". Tim injects some healthy realism into the hype of Big Data without dismissing its importance and potential benefits. The article talks about the four claims often made when talking about Big Data:

  1. Data analysis often produces uncannily accurate results
  2. Capturing all of the data makes statistical sampling obsolete
  3. Statistical correlation is all you need - no need to understand causation
  4. Enough data means that scientific or statistical models aren't needed

Now models can have their own problems, but I can see where he is coming from, for instance 3. and 4. above seem to be in direct contradiction. I particularly like the comment later in the article that "causality won't be discarded, but it is being knocked off its pedestal as the primary fountain of meaning."

Also I liked the definition, from one of the academics mentioned, of a big data set as one where "N = All" - and the point that having "all" the data is an incorrect assumption behind some of the Big Data analysis put forward. Large data sets can mean that sample error is low, but sample bias is still a potentially big problem - for example, everyone on Twitter is probably not representative of the human race in general.
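
A tiny simulation (mine, with made-up numbers) shows the point: increasing N shrinks the noise around the sample mean, but if the sample is drawn from a biased subgroup the estimate converges ever more confidently on the wrong answer.

    // Sampling error shrinks with N; sampling bias does not.
    // The true population mean is 0.0, but we only observe a biased
    // subgroup whose values are centred on +0.3 (think: Twitter users).
    let rng = System.Random 42

    let biasedSample n =
        List.init n (fun _ -> (rng.NextDouble() - 0.5) + 0.3)

    for n in [ 100; 10000; 1000000 ] do
        printfn "N = %7d  sample mean = %.4f (true mean 0.0)" n (List.average (biasedSample n))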

So I will now press save on this blog post, publish it on Twitter and help reinforce the impression that Big Data is a hot topic... which it is, but not for everyone, I guess, is the point.

 

 

24 March 2014

#DMSLondon - The Hobgoblin of Little Minds: Risk and Regulation as Drivers

The second panel of the day was "Regulation and Risk as Data Management Drivers" - you can find the A-Team's write up here. Some of my thoughts/notes can be found below:

  • Ian Webster of Axioma, responding to a question about whether consistency was the Holy Grail of data management, said that a consistent view is not possible for data used in risk and regulation - there are many regulations with many different requirements, and so unnecessary data consistency is "the hobgoblin of little minds", delaying progress towards the real goals of data management.
  • James of Lombard Risk suggested that firms should seek competitive advantage from regulatory compliance rather than just compliance alone - seeking the carrot and not just avoiding the stick.
  • Ian said he thought too many firms dealt with regulatory compliance in a tactical manner and asked if regulation and risk were truly related? He suggested that risk levels might remain unchanged even if regulation demanded a great deal more reporting.
  • Marcelle von Wendland said she thought that regulation added cost only, and that firms must focus on risk management and margin.
  • James said that "regulatory risk" was a category of risk all in itself, alongside its mainstream contemporaries.
  • Ian added that risk and finance think about risk differently and this didn't help in promoting consistency of ideas in discussions about risk management.
  • James said that the legacy of systems in financial markets was a hindrance in complying with new regulation, and mentioned the example of the relatively young energy industry where STP was much easier to implement.
  • Laurent of Bloomberg said that young, emerging markets like energy were greenfield and as such easier to implement systems but that they did not have any experience or culture around data governance.
  • Marcelle said that the G20 initiatives around trade reporting at least promoted some consistency and allowed issues to be identified at last.
  • Ian said in response that he was unconvinced about politically driven regulation, questioning its effectiveness and motivations.
  • Ian raised the issues of the assumptions behind VaR and said that the current stress tests were overdone.
  • Marcelle agreed that a single number for VaR or some other measure meant that other useful information had potentially been ignored/thrown away.
  • General consensus across the panel that fines were not enough and that restricting business activities might be a more effective stick for the regulators.
  • James referenced the risk data aggregation paper from the Basel Committee and suggested that data should be captured once, cleaned once and used many times.
  • Ian disagreed with James, in that he thought capture once, clean once and use many times was not practically possible, and that this goal was one of the main causes of failure within the data management industry over the past 10 years.
  • The panel ended with Ian saying that we should not just solve for the last crisis; the underlying causes of crises are similar and mostly around asset price bubbles, so in order to reduce risk in the system 1) let's make data more transparent and 2) do what we can to avoid bubbles with better indices and risk measures.

[Photo: Regulation panel]

 

12 March 2014

S&P Capital IQ Risk Event #2 - Enterprise or Risk Data Strategy?

Christian Nilsson of S&P CIQ followed up Richard Burtsal's talk with a presentation on data management for risk, containing many interesting questions for those considering data for risk management needs. Christian started his talk by taking a time machine back to 2006, and asking what were the issues then in Enterprise Data Management:

  1. There is no current crisis - we have other priorities (we now know what happened there)
  2. The business case is still too fuzzy (regulation took care of this issue)
  3. Dealing with the politics of implementation (silos are still around, but cost and regulation are weakening politics as a defence?)
  4. Understanding data dependencies (understanding this throughout the value chain, but still not clear today?)
  5. The risk of doing it wrong (there are risks that you will do data management wrong given all the external parties and sources involved, but what is the risk of not doing it?)

Christian then moved on to say the current regulatory focus is on clearer roadmaps for financial institutions, citing Basel II/III, Dodd-Frank/Volcker Rule in the US, challenges in valuation from the IASB and IFRS, fund management challenges with UCITS, AIFMD, EMIR, MiFID and MiFIR, and Solvency II in the insurance industry. He coined the phrase "Regulation Goes Hollywood", with regulation like UCITS I, II, III, IV, V, VII having more versions than a set of Rocky movies.

He then touched upon some of the main motivations behind the BCBS 239 document and said that regulation had three main themes at the moment:

  1. Higher Capital and Liquidity Ratios
  2. Restrictions on Trading Activities
  3. Structural Changes ("ring fence" retail, global operations move to being capitalized local subsidiaries)

Some further observations were on what the implications will be of the effective "loss" of globalization within financial markets, and also on what can now be considered a risk-free asset (do such things still exist?). Christian then gave some stats on risk as a driver of data and technology spend, with $20-50B to be spent over the next 2-3 years (seems a wide range - nothing like a consensus from analysts, I guess!).

The talk then moved on to what role data and data management plays within regulatory compliance, with for example:

  • LEI - Legal Entity Identifiers play out throughout most regulation, as a means to enable automated processing and as a way to understand and aggregate exposures.
  • Dodd-Frank - Data management plays a role within OTC processing and STP in general.
  • Solvency II - This regulation for insurers places emphasis on data quality/data lineage within capital reserve requirements.
  • Basel III - Risk aggregation and counterparty credit risk are two areas of key focus.

Christian outlined the small budget of the regulators relative to the biggest banks (a topic discussed in previous posts, how society wants stronger, more effective regulation but then isn't prepared to pay for it directly - although I would add we all pay for it indirectly but that is another story, in part illustrated in the document this post talks about).

In addition to the well-known term "regulatory arbitrage", dealing with different regulations in different jurisdictions, Christian also mentioned the increasingly used term "substituted compliance", where a global company tries to optimise which jurisdictions it and its subsidiaries comply within, with the aim of avoiding compliance in more difficult regimes through compliance within others.

I think Christian outlined the "data management dichotomy" within financial markets very well:

  1. Regulation requires data that is complete, accurate and appropriate
  2. Industry standards of data management and data are poorly regulated, and there is weak industry leadership in this area.

(not sure if it was quite at this point, but certainly some of the audience questions were about whether the data vendors themselves should be regulated, which was entertaining).

He also outlined the opportunity from regulation in that it could be used as a catalyst for efficiency, STP and cost base reduction.

Obviously "Big Data" (I keep telling myself to drop the quotes, but old habits die hard) is hard to avoid, and Christian mentioned that IBM say that 90% of the world's data has been created in the last 2 years. He described the opportunities of the "3 V's" of Volume, Variety, Velocity and "Dark Data" (exploiting underused data with new technology - "Dark" and "Deep" are getting more and more use of late). No mention directly in his presentation, but throughout there was the implied extension of the "3 V's" to "5 V's", with Veracity (aka quality) and Value (aka we could do this, but is it worth it?). Related to the "Value" point, Christian brought out the debate about what data you capture, analyse and store, but also what you deliberately discard, which is a point worth more consideration than it gets (e.g. one major data vendor I know did not store its real-time tick data, and now buys its tick data history from an institution who thought it would be a good idea to store the data long before the data vendor thought of it).

I will close this post taking a couple of summary lists directly from his presentation, the first being the top areas of focus for risk managers:

  • Counterparty Risk
  • Integrating risk into the Pre-trade process
  • Risk Aggregation across the firm
  • Risk Transparency
  • Cross Asset Risk Reporting
  • Cost Management/displacement

The second list outlines the main challenges:

  • Getting complete view of risk from multiple systems
  • Lack of front to back integration of systems
  • Data Mapping
  • Data availability of history
  • Lack of Instrument coverage
  • Inability to source from single vendor
  • Growing volumes of data

Christian's presentation then put forward a lot of practical ideas about how best to meet these challenges (I particularly liked the risk data warehouse parts, but I am unsurprisingly biased). In summary, if you get the chance then see or take a read of Christian's presentation; I thought it was a very thoughtful document with some interesting ideas and advice put forward.

10 March 2014

S&P Capital IQ Risk Event #1 - Managed Services

Attended a good event at S&P Capital IQ's offices on Tuesday morning last week in London, built around the BCBS 239 document on risk aggregation and reporting (see the earlier PRMIA event on this topic too). A partner vendor of S&P CIQ, Tech Mahindra, started the morning with Richard Burtsal's presentation on "Delivering an Enterprise Data Strategy". Tech Mahindra recently acquired a data management platform from UBS Asset Management and are offering a managed data service based on it (see A-Team article).

Richard said that he wasn't going to "sell" in his presentation (always a worrying admission from one of us data management vendors, it usually means entirely the opposite). That small criticism aside, Richard gave a solid update on the state of the industry and obviously on what Tech Mahindra are offering, and added that:

  • For every $1 spent directly on market data, the total cost of that data goes up by a factor of 6 by the time the data is actually used 
  • 33% of rejected trades are caused by incorrect reference data
  • 60% of staff manipulate, report on or support data on a daily basis (I wonder what the other 40% actually do then? Be good to get the Tower Group report this came from to find out maybe?)
  • 25% of reference data management effort is wasted due to duplication and inefficiencies
  • In their work with UBS Asset Management they had jointly shown that the costs of data management were reduced by 25-30% using a managed service (sounds worth verifying what the "before" situation was I guess, but interesting/impressive).
  • Clients were pushing for much faster instrument setup and a reduction in time from the 1-2 weeks setup in some systems.

There were a few questions from the audience during Richard's talk. The first asked about the differences between data management on the buy-side and data management on the sell-side. Richard said that his experience was that the buy-side managed fewer instruments (<500,000) but with greater depth of data, and the sell-side held more instruments (10M+) but with less depth of data (not sure that completely reflects my experience, but sounds worth a survey maybe).

The second question was why is the utility model for data management going to succeed right now, when previous attempts over the past 10 years had failed? Richard responded that he thought Tech Mahindra would succeed due to:

  • Tech Mahindra are data-vendor agnostic (I assume aimed at Markit-Cadis and Bloomberg-PolarLake)
  • Tech Mahindra own all their own IP (hmm, not really so sure this is a good reason or even a differentiator, but I guess it is aimed at managed services that are not run by the firm that develops the data management system?)

I think the answers to this second question need thinking through more clearly. To be fair, Richard had already stated the 25% cost reduction as one benefit, and various folks have said that the technology is ripe for these kinds of offerings now, but all the same the response needs to be more fully developed to convince many, I think (I remain undecided personally; it would be good to have some more evidence to back this up). One of the S&P CIQ team added that what he thinks clients want is "Utility of Delivery" and not "Utility of Content", which I thought was a sensible comment and one that I will be revisiting in the coming months.

On a related note as to why managed services now, another audience member asked how client-specific data was managed within a utility or managed service model, and Richard said that client-specific data was often managed at the client, but that they can upload and integrate client-generated data into the managed service offering. I think this is a very key issue within the debate about managed services and utilities. I mean, I get the point the data utility proponents make that certain datasets are simple "facts" and as such are either right or wrong, and hence commoditisable, but much of the data is subjective, and all of the data needs validating together in the context of its intended use in my view. I guess I kind of lose myself in looping arguments about why data utility vendors aren't ultimately wanting to be the next Thomson Reuters or Bloomberg (not that that is not a laudable aim, but it is not going to change the world or indeed financial markets data provision very much).

 

 

03 March 2014

See you at the A-Team Data Management Summit this week!

Xenomorph is sponsoring the networking reception at the A-Team DMS event in London this week, and if you are attending then I wanted to extend a cordial invite to you to attend the drinks and networking reception at the end of day at 5:30pm on Thursday.

In preparation for Thursday’s agenda, the blog links below are a quick reminder of some of the main highlights from last September’s DMS:

I will also be speaking on the 2pm panel “Reporting for the C-Suite: Data Management for Enterprise & Risk Analytics”. So if you like what you have heard during the day, come along to the drinks and firm up your understanding with further discussion with like-minded individuals. Alternatively, if you find your brain is so full by then of enterprise data architecture, managed services, analytics, risk and regulation that you can hardly speak, come along and allow your cerebellum to relax and make sense of it all with your favourite beverage in hand. Either way you will leave the event more informed than when you went in... well, that’s my excuse and I am sticking with it!

Hope to see you there!

11 December 2013

Aqumin visual landscapes for TimeScape

Very pleased that our partnership with Aqumin and their AlphaVision visual landscapes was announced this week (see press release from Monday). Further background and visuals can be found at the following link, and for those of you that like instant gratification please find a sample visual below showing some analysis of the S&P500.

[Visual: AlphaVision landscape of the S&P 500]

06 December 2013

F# in Finance New York Style

Quick plug for the New York version of the F# in Finance event taking place next Wednesday December 11th, following on from the recent event in London. Don Syme of Microsoft Research will be demonstrating access to market data using F# and TimeScape. Hope to see you there!

27 November 2013

Putting the F# in Finance with TimeScape

Quick thank you to Don Syme of Microsoft Research for including a demonstration of F# connecting to TimeScape, running on the Windows Azure cloud, in the F# in Finance event this week in London. F# is a functional language that is developing a large following in finance due to its applicability to mathematical problems, the ease of development with F# and its performance. You can find some testimonials on the language here.

Don has implemented a proof-of-concept F# type provider for TimeScape. If that doesn't mean much to you, then a practical example below will help, showing how the financial instrument data in TimeScape is exposed at runtime into the F# programming environment. I guess the key point is just how easy it looks to code with data, since effectively you get guided through what is (and is not!) available as you are coding (sorry if I sound impressed, I spent a reasonable amount of time writing mathematical C code using vi in the mid 90's - so any young uber-geeks reading this, please make allowances as I am getting old(er)...). Example steps are shown below:

Referencing the Xenomorph TimeScape type provider and creating a data context: 

[Screenshot]

Connecting to a TimeScape database:

[Screenshot]

Looking at categories (classes) of financial instrument available:

[Screenshot]

Choosing an item (instrument) in a category by name:

[Screenshot]

Looking at the properties associated with an item:

[Screenshot]

The intellisense-like behaviour above is similar to what TimeScape's Query Explorer offers, and it is great to see this implemented in an external run-time programming language such as F#. Don additionally made the point that each instrument only displays the data it individually has available, making it easy to understand what data you have to work with. This functionality is based on F#'s ability to make each item uniquely nameable, and optionally to assign each item (instrument) a unique type, where all the category properties (defined at the category schema level) that are not available for the item are hidden.
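
For readers who have not met type providers before, the same mechanism is available in the open-source FSharp.Data library. The snippet below uses its JsonProvider - not the TimeScape provider itself, which is only shown in the screenshots above - to illustrate how a provider turns a data sample into compile-time types and intellisense:

    // Illustration of the type provider mechanism using the open-source
    // FSharp.Data library (not the TimeScape provider itself).
    #r "nuget: FSharp.Data"
    open FSharp.Data

    // The provider infers types from the sample at compile time, so
    // Symbol and Price appear in intellisense much as TimeScape's
    // categories and properties do in the screenshots above.
    type Quote = JsonProvider<""" { "symbol": "XYZ", "price": 101.25 } """>

    let q = Quote.Parse(""" { "symbol": "ABC", "price": 99.50 } """)
    printfn "%s trades at %M" q.Symbol q.Price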

The next event for F# in Finance will take place in New York on Wednesday 11th of December 2013, so I hope to see you there. We are currently working on a beta program for this functionality, to be available early in the New Year, so please get in touch if this is of interest via info@xenomorph.com.

 

19 November 2013

i2i Logic launch customer engagement platform based on TimeScape

An exciting departure from Xenomorph's typical focus on data management for risk in capital markets, but one of our partners, i2i Logic, has just announced the launch of their customer engagement platform for institutional and commercial banks based on Xenomorph's TimeScape. The i2i Logic team have a background in commercial banking, and have put together a platform that allows much greater interaction with a corporate client that a bank is trying to engage with.

Hosted in the cloud, and delivered to sales staff through an easy and powerful tablet app, the system enables bank sales staff to produce analysis and reports that are very specific to a particular client, based upon predictive analytics and models applied to market, fundamentals and operational data, initially supplied by S&P Capital IQ. This allows the bank and the corporate to discuss and understand where the corporate is when benchmarked against peers in a variety of metrics current across financial and operational performance, and to provide insight on where the bank's services may be able to assist in the profitability, efficiency and future growth of the corporate client.

Put another way, it sounds like the corporate customers of commercial banks are in not much better a position than us individuals dealing with retail banks, in that currently the offerings from the banks are not that engaging, are generic and are very hard to differentiate. Sounds like the i2i Logic team are on to something, so I wish them well in trying to move the industry's expectations of customer service and engagement, and would like to thank them for choosing TimeScape as the analytics and data management platform behind their solution.

 

 

04 November 2013

Risk Data Aggregation and Risk Reporting from PRMIA

Another good event from PRMIA at the Harmonie Club here in NYC last week, entitled Risk Data Aggregation and Risk Reporting - Progress and Challenges for Risk Management. Abraham Thomas of Citi and PRMIA introduced the evening, setting the scene by referring to the BCBS document Principles for effective risk data aggregation and risk reporting, with its 14 principles to be implemented by January 2016 for G-SIBs (Globally Systemically Important Banks) and December 2016 for D-SIBs (Domestically Systemically Important Banks).

The event was sponsored by SAP, and they were represented by Dr Michael Adam on the panel, who gave a presentation around risk data management and the problems of having data siloed across many different systems. Maybe unsurprisingly, Michael's presentation had a distinct "in-memory" focus to it, with Michael emphasizing the data analysis speed that is now possible using technologies such as SAP's in-memory database offering "Hana".

Following the presentation, the panel discussion started with a debate involving Dilip Krishna of Deloitte and Stephanie Losi of the Federal Reserve Bank of New York. They discussed whether the BCBS document and compliance with it should become a project in itself or part of existing initiatives to comply with data intensive regulations such as CCAR and CVA etc. Stephanie is on the board of the BCBS committee for risk data aggregation and she said that the document should be a guide and not a check list. There seemed to be general agreement on the panel that data architectures should be put together not with a view to compliance with one specific regulation but more as a framework to deal with all regulation to come, a more generalized approach.

Dilip said that whilst technology and data integration are issues, people are the biggest issue in getting a solid data architecture in place. There was an audience question about how different departments need different views of risk and how these were to be reconciled/facilitated. Stephanie said that data security and control of who can see what is an issue, and Dilip agreed and added that enterprise risk views need to be seen by many, which was a security issue to be resolved.

Don Wesnofske of PRMIA and Dell said that data quality was another key issue in risk. Dilip agreed and added that the front office needs to be involved in this (data management projects are not just for the back office in isolation) and that data quality was one of a number of needs that compete for resources/budget at many banks at the moment. Coming back to his people theme, Dilip also said that data quality needed intuition to be carried out successfully.

An audience question from Dan Rodriguez (of PRMIA and Credit Suisse) asked whether regulation was granting an advantage to "Too Big To Fail" organisations, in that only they have the resources to cope with the ever-increasing demands of the regulators, to the detriment of smaller financial institutions. The panel did not completely agree with Dan's premise, arguing that smaller organizations were more agile and did not have the legacy and complexity of the larger institutions, so there was probably a sweet spot between large and small from a regulatory compliance perspective (I guess it was interesting that the panel did not deny that regulation was at least affecting the size of financial institutions in some way...)

Again focussing on where resources should be deployed, the panel debated trade-offs such as those between accuracy and consistency. The Legal Entity Identifier (LEI) initiative was thought of as a great start in establishing standards for data aggregation, and the panel encouraged regulators to look at doing more. One audience question was around the different and inconsistent treatment of gross notional and trade accounts. Dilip said that yes this was an issue, but came back to Stephanie's point that what is needed is a single risk data platform that is flexible enough to be used across multiple business and compliance projects.  Don said that he suggests four "views" on risk:

  • Risk Taking
  • Risk Management
  • Risk Measurement
  • Risk Regulation

Stephanie added that organisations should focus on the measures that are most appropriate to their business activity.

The next audience question asked whether the panel thought that the projects driven by regulation had a negative return. Dilip said that his experience was yes, they do have negative returns but this was simply a cost of being in business. Unsurprisingly maybe, Stephanie took a different view advocating the benefits side coming out of some of the regulatory projects that drove improvements in data management.

The final audience question was whether the panel thought it was possible to reconcile all of the regulatory initiatives like Dodd-Frank, Basel III and EMIR with operational risk. Don took a data angle to this question, talking about the benefits of big data technologies applied across all relevant data sets, and noting that any data was now potentially valuable and could be retained. Dilip thought that the costs of data retention were continually going down as data volumes go up, but that there were costs in capturing the data needed for operational risk and other applications. Dilip said that when compared globally across many industries, financial markets were way behind the data capabilities of many sectors, and that finance was more "Tiny Data" than "Big Data", and again he came back to the fact that people were getting in the way of better data management. Michael said that many banks and market data vendors are dealing with data in the 10's of TeraBytes range, whereas the amount of data in the world was around 8-900 PetaBytes (I thought we were already just over into ZetaBytes but what are a few hundred PetaBytes between friends...).

Abraham closed off the evening, firstly by asking the audience if they thought the 2016 deadline would be achieved by their organisation. Only 3 people out of around 50+ said yes. Not sure if this was simply people's reticence to put their hand up, but when Abraham asked, one key concern for many was that the target would change by then - my guess is that we are probably back into the territory of the banks not implementing a regulation because it is too vague, and the regulators not being too prescriptive because they want feedback too. So a big game of chicken results, with the banks weighing up the costs/fines of non-compliance against the costs of implementing something big that they can't be sure will be acceptable to the regulators. Abraham then asked the panel for closing remarks: Don said that data architecture was key; Stephanie suggested getting the strategic aims in place but implementing iteratively towards these aims; Dilip said that deciding your goal first was vital; and Michael advised building a roadmap for data in risk.

