28 posts categorized "TimeScape"

15 April 2014

Financial Markets Data and Analytics. Everywhere You Need Them.

Very pleased to announce that Xenomorph will be hosting an event, "Financial Markets Data and Analytics. Everywhere You Need Them.", at Microsoft's Times Square New York offices on May 9th.

This breakfast briefing includes Sang Lee of the analyst firm Aite Group offering some great insights from financial institutions on how they are adopting cloud technology to address risk management, data management and regulatory reporting challenges.

Microsoft will be showing how their new Power BI can radically change and accelerate the integration of data for business and IT staff alike, regardless of what kind of data it is, what format it is stored in or where it is located.

And Xenomorph will be introducing the TimeScape MarketPlace, our new cloud-based data mashup service for publishing and consuming financial markets data and analytics. More background and updates on MarketPlace in coming weeks.

In the meantime, please take a look at the event and register if you can come along - it would be great to see you there.

11 December 2013

Aqumin visual landscapes for TimeScape

Very pleased that our partnership with Aqumin and their AlphaVision visual landscapes was announced this week (see Monday's press release). Further background and visuals can be found at the following link, and for those of you who like instant gratification, please find a sample visual below showing some analysis of the S&P 500.

[Image: AlphaVision visual landscape analysis of the S&P 500]

06 December 2013

F# in Finance New York Style

Quick plug for the New York version of the F# in Finance event, taking place next Wednesday, December 11th, following on from the recent event in London. Don Syme of Microsoft Research will be demonstrating access to market data using F# and TimeScape. Hope to see you there!

27 November 2013

Putting the F# in Finance with TimeScape

Quick thank you to Don Syme of Microsoft Research for including a demonstration of F# connecting to TimeScape running on the Windows Azure cloud at the F# in Finance event this week in London. F# is a functional language that is developing a large following in finance due to its applicability to mathematical problems, the ease of development it offers and its performance. You can find some testimonials on the language here.

Don has implemented a proof-of-concept F# type provider for TimeScape. If that doesn't mean much to you, then a practical example below will help, showing how the financial instrument data in TimeScape is exposed at runtime into the F# programming environment. I guess the key point is just how easy it looks to code with data, since effectively you get guided through what is (and is not!) available as you are coding (sorry if I sound impressed, I spent a reasonable amount of time writing mathematical C code using vi in the mid 90's - so any young uber-geeks reading this, please make allowances as I am getting old(er)...). Example steps are shown below:

Referencing the Xenomorph TimeScape type provider and creating a data context: 

[Screenshot]

Connecting to a TimeScape database:

[Screenshot]

Looking at categories (classes) of financial instrument available:

[Screenshot]

Choosing an item (instrument) in a category by name:

[Screenshot]

Looking at the properties associated with an item:

[Screenshot]

The intellisense-like behaviour above is similar to what TimeScape's Query Explorer offers, and it is great to see this implemented in an external run-time programming language such as F#. Don additionally made the point that each instrument only displays the data it individually has available, making it easy to understand what data you have to work with. This functionality is based on F#'s ability to make each item uniquely nameable and, optionally, to assign each item (instrument) a unique type, in which all the category properties (defined at the category schema level) that are not available for the item are hidden.
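To give a flavour of what the steps above look like as code rather than screenshots, here is a rough sketch of a consuming F# script. To be clear, this is illustrative only - the assembly, namespace, connection string and member names below are simplified placeholders, not the exact API of Don's proof-of-concept provider:

    // Illustrative sketch only - assembly, namespace and member names are
    // placeholders, not the exact API of the proof-of-concept provider.
    #r "Xenomorph.TimeScape.TypeProvider.dll"   // reference the type provider

    open Xenomorph.TimeScape                    // placeholder namespace

    // Connect to a TimeScape database and create a data context
    type Ts = TimeScapeProvider<"Server=demo.timescape.local;Database=Markets">
    let ctx = Ts.GetDataContext()

    // Categories (classes) of financial instrument appear as properties,
    // so you are guided through what is (and is not) available as you type
    let equities = ctx.Equities

    // Choose an item (instrument) in a category by name
    let vod = equities.``VODAFONE GROUP PLC``

    // Only the properties this particular instrument has are exposed
    printfn "Close: %f  Currency: %s" vod.ClosePrice vod.Currency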

The next F# in Finance event will take place in New York on Wednesday 11th December 2013, so hope to see you there. We are currently working on a beta program for this functionality, to be available early in the New Year, so please get in touch via info@xenomorph.com if this is of interest.

 

19 November 2013

i2i Logic launch customer engagement platform based on TimeScape

An exciting departure from Xenomorph's typical focus on data management for risk in capital markets: one of our partners, i2i Logic, has just announced the launch of their customer engagement platform for institutional and commercial banks, based on Xenomorph's TimeScape. The i2i Logic team have a background in commercial banking, and have put together a platform that allows much greater interaction with the corporate clients a bank is trying to engage with.

Hosted in the cloud and delivered to sales staff through an easy and powerful tablet app, the system enables bank sales staff to produce analysis and reports that are very specific to a particular client, based upon predictive analytics and models applied to market, fundamentals and operational data, initially supplied by S&P Capital IQ. This allows the bank and the corporate to discuss and understand where the corporate stands when benchmarked against its peers across a variety of financial and operational performance metrics, and gives the bank insight into where its services may be able to assist in the profitability, efficiency and future growth of the corporate client.

Put another way, it sounds like the corporate customers of commercial banks are in not much better a position than we individuals are with retail banks, in that the current offerings from the banks are not very engaging, are generic and are hard to differentiate. It sounds like the i2i Logic team are on to something, so I wish them well in trying to move the industry's expectations of customer service and engagement, and would like to thank them for choosing TimeScape as the analytics and data management platform behind their solution.

 

 

21 October 2013

S&P Capital IQ Data Integration for TimeScape

Very pleased to announce our new data integration for TimeScape with S&P Capital IQ - see the press release.

18 September 2013

Vote for Xenomorph! - A-Team DMS Data Management Awards

Pleased to say that Xenomorph has been nominated in three categories in the A-Team DMS Data Management Awards. The categories are: 

  • Best Sell-Side Enterprise Data Management Platform
  • Best Buy-Side Enterprise Data Management Platform
  • Best Risk Data Management Analytics Platform

Please vote for Xenomorph by going to this link. Many thanks!

25 June 2013

Matthew Berry on the Libor/OIS curve debate

Guest post today from Matthew Berry of Bedrock Valuation Advisors, discussing Libor vs OIS based rate benchmarks. Curves and curve management are a big focus for Xenomorph's clients and partners, so great that Matthew can shed some further light on the current debate and its implications:

New Benchmark Proposal’s Significant Implications for Data Management

During the 2008 financial crisis, problems posed by discounting future cash flows using Libor rather than the overnight index swap (OIS) rate became apparent. In response, many market participants have modified systems and processes to discount cash flows using OIS, but Libor remains the benchmark rate for hundreds of trillions of dollars worth of financial contracts. More recently, regulators in the U.S. and U.K. have won enforcement actions against several contributors to Libor, alleging that these banks manipulated the benchmark by contributing rates that were not representative of the market, and which benefitted the banks’ derivative books of business.

In response to these allegations, the CFTC in the U.S. and the Financial Conduct Authority (FCA) in the U.K. have proposed changes to how financial contracts are benchmarked and how banks manage their submissions to benchmark fixings. These proposals have significant implications for data management.

The U.S. and U.K. responses to benchmark manipulation

In April 2013, CFTC Chairman Gary Gensler delivered a speech in London in which he suggested that Libor should be retired as a benchmark. Among the evidence he cited to justify this suggestion:

  • Liquidity in the unsecured inter-dealer market has largely dried up.
  • The risk implied by contributed Libor rates has historically not agreed with the risk implied by credit default swap rates. The Libor submissions were often stale and did not change, even if the entity’s CDS spread changed significantly. Gensler provided a graph to demonstrate this.

Gensler proposed to replace Libor with either the OIS rate or the rate paid on general collateral repos. These instruments are more liquid and their prices are more readily observable in the market. He proposed a period of transition during which Libor is phased out while OIS or the GC repo rate is phased in.

In the U.K., the Wheatley Report provided a broad and detailed review of practices within banks that submit rates to the Libor administrator. This report found a number of deficiencies in the benchmark submission and calculation process, including:

  • The lack of an oversight structure to monitor systems and controls at contributing banks and the Libor administrator.
  • Insufficient use of transacted or otherwise observable prices in the Libor submission and calculation process.

The Wheatley Report called for banks and benchmark administrators to put in place rigorous controls that scrutinize benchmark submissions both pre- and post-publication. The report also called for banks to store an historical record of their benchmark submissions, and for benchmarks to be calculated using a hierarchy of prices, with preference given to transacted prices, then prices quoted in the market, then management’s estimates.

Implications for data management

The suggestions for improving benchmarks made by Gensler and the Wheatley Report have far-reaching implications for data management.

If Libor and its replacement are run in parallel for a time, users of these benchmark rates will need to store and properly reference two different fixings and forward curves. Without sufficiently robust technology, this transition period will create operational, financial and reputational risk given the potential for users to inadvertently reference the wrong rate. If Gensler’s call to retire Libor is successful, existing contracts may need to be repapered to reference the new benchmark. This will be a significant undertaking. Users of benchmarks who store transaction details and reference rates in electronic form and manage this data using an enterprise data management platform will mitigate risk and enjoy a lower cost to transition.

Within the submitting banks and the benchmark administrator, controls must be implemented that scrutinize benchmark submissions both pre- and post-publication. These controls should be exceptions-based and easily scripted so that monitoring rules and tolerances can be adapted to changing market conditions. Banks must also have in place technology that defines the submission procedure and automatically selects the optimal benchmark submission. If transacted prices are available, these should be submitted. If not, quotes from established market participants should be submitted. If these are not available, management should be alerted that it must estimate the benchmark rate, and the decision-making process around that estimate should be documented.
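As a rough illustration of this selection logic (and only an illustration - the types and rates below are hypothetical, not any particular bank's implementation), the submission waterfall might be sketched in F# as follows:

    // Hypothetical sketch of the submission hierarchy described above:
    // prefer transacted prices, then quotes from established market
    // participants, otherwise escalate to management for a documented estimate.
    type Submission =
        | FromTransactions of rate: float
        | FromQuotes of rate: float
        | EstimateRequired of reason: string

    let selectSubmission (transactedRates: float list) (quotedRates: float list) =
        match transactedRates, quotedRates with
        | (_ :: _ as trades), _ ->
            // transacted prices available: base the submission on those
            FromTransactions (List.average trades)
        | [], (_ :: _ as quotes) ->
            // no transactions: fall back to observable quotes
            FromQuotes (List.average quotes)
        | [], [] ->
            // nothing observable: alert management and document the estimate
            EstimateRequired "no transacted or quoted prices available"

    // Example: no trades today, two dealer quotes of 0.51% and 0.53%
    let todaysSubmission = selectSubmission [] [0.0051; 0.0053]   // FromQuotes 0.0052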

Conclusion

These improvements to the benchmark calculation process will, in Gensler’s words, “promote market integrity, as well as financial stability.” Firms that effectively utilize data management technology, such as Xenomorph's TimeScape, to implement these changes will manage the transition to a new benchmark regime at a lower cost and with a higher likelihood of success.

 


30 April 2013

Last day of voting in the FTF News Awards!

Today is the last day of voting in the FTF News Technology Innovation Awards 2013. Our TimeScape solution has been nominated under the Best Enterprise Data Management (EDM) category. If you can spare a minute of your time today, then Xenomorph would very much appreciate your vote!

10 April 2013

Xenomorph short-listed for the FTF News Awards 2013

Xenomorph has been short-listed in the FTF News Technology Innovation Awards 2013, under the Best Enterprise Data Management (EDM) category. The short-list includes:

BEST ENTERPRISE DATA MANAGEMENT SOLUTION

  • Eagle Investment Systems – Eagle Data Management
  • GoldenSource – GoldenSource EDM
  • Markit  - Markit EDM
  • SmartStream/Euroclear Partnership
  • Xenomorph - TimeScape

If you can spare a few moments of your time, then Xenomorph would very much appreciate your vote!

05 April 2013

Feel democracy in action with Xenomorph (and vote for us!)

In these times of dysfunctional political parties, external troikas and technocratic governments, why not exercise your democratic rights in a positive way by voting for Xenomorph in the Best Enterprise Data Management category of the Inside Reference Data Awards 2013? You know it makes democratic sense.

 

25 March 2013

Data Management for Risk at Mediobanca

Very pleased to announce today that Mediobanca, the leading investment bank in Italy, has selected TimeScape as its data management system. You can see the press release here.

14 February 2013

Analytics Strategy from Numerix

Good post from Jim Jockle over at Numerix - the main theme is around having an "analytics" strategy in place in addition to (and probably as part of) a "Big Data" strategy. It fits strongly with Xenomorph's ideas on having both data management and analytics management in place (a few posts on this in the past, try this one from a few years back) - analytics generate the most valuable data of all, yet the data generated by analytics, and the input data that supports analytics, is largely ignored as being too business-focussed for many data management vendors and too low-level for many risk management system vendors. Into this gap in functionality falls the risk manager (supported by many spreadsheets!), who has to spend too much time organizing and validating data, and too little time on risk management itself.

Within risk management, I think it comes down to having the appropriate technical layers in place for data management, analytics/pricing management and risk model management. OK, it is a greatly simplified representation of the architecture needed (apologies to any techies reading this), but the majority of financial institutions do not have these distinct layers in place - layers that would each provide easy "business user" access, allowing risk managers to get to the "detail" of the data when regulators, auditors and clients demand it. Regulators are finally waking up to the data issue (see Basel on data aggregation for instance), but more work is needed to pull analytics into the technical architecture/strategy conversation, and not just confine regulatory discussions of pricing analytics to model risk.

22 January 2013

Chartis Research - Data Management for Risk White Paper

New whitepaper on data management for risk from the analysts Chartis Research, including a section on how Xenomorph's TimeScape solution addresses some of the key issues identified.

12 December 2012

Numerix and Xenomorph partner in risk management

Just a quick post to highlight Xenomorph's Numerix partnership announcement that went out earlier this week. In summary we have done some great work with Numerix on combining their ability to price and risk manage very complex trades with TimeScape's ability to manage all the data such types of instruments need.

The integration is a great demonstration of the flexibility of TimeScape's data model (see recent post and LinkedIn discussion) and addresses some of the issues discussed and illustrated in an earlier post on data management for risk. Quick thank you to the clients involved in testing and using the integration, to the Numerix team for their assistance on this and to my New York colleagues who led the TimeScape integration work.

11 October 2012

Bankenes Sikringsfond Selects Xenomorph's TimeScape for Faster Data Analysis and High-Quality Decision Support

Just a quick note to say that we have signed a new client, Bankenes Sikringsfond, the Norwegian Banks’ Guarantee Fund. They will be using TimeScape to fulfill requirements for a centralised analytics and data management platform. The press release is available here for those of you who are interested.

17 July 2012

Charting, Heatmaps and Reports for TimeScape, plus new Query Explorer

We have a great new software release out today for TimeScape, Xenomorph's analytics and data management solution, more details of which you can find here. For some additional background to this release then please take a read below.

For many users of Xenomorph's TimeScape, our Excel interface has been a great way of extending and expanding the data analysis capabilities of Excel, by moving the burden of both the data and the calculation out of each spreadsheet and into TimeScape. As I have mentioned before, spreadsheets are fantastic end-user tools for ad-hoc reporting and analysis, but problems arise when their very usefulness and ease of use cause people to use them as standalone desktop-based databases. The four hundred or so functions available in TimeScape for Excel, plus Excel access to our TimeScape QL+ Query Language, have enabled much simpler and more powerful spreadsheets to be built, simply because Excel is used as a presentation layer with the hard work being done centrally in TimeScape.

Many people like using spreadsheets; however, many users equally do not, and prefer more application-based functionality. Taking this feedback on board has previously driven us to look at innovative ways of extending data management, such as embedding spreadsheet-like calculations inside TimeScape and taking them out of spreadsheets with our SpreadSheet Inside technology. With this latest release of TimeScape, we are providing much of the ease of use, analysis and reporting power of spreadsheets, but doing so in a more consistent and centralised manner. Charts can now be set up as default views on data so that you can quickly eyeball different properties and data sources for issues. New Heatmaps allow users to view large colour-coded datasets and zoom in quickly on areas of interest for more analysis. Plus our enhanced Reporting functionality allows greater ease of use and customisation when sharing data analysis with other users and departments.

Additionally, the new Query Explorer front end really shows off what is possible with TimeScape QL+, allowing users to build and test queries in the context of easily configurable data rules for things such as data source preferences, missing data and proxy instruments. The new auto-complete feature is also very useful when building queries, automatically displaying all the properties and methods available at each point in the query, including user-defined analytics and calculations. It also displays complex and folded data in an easy manner, enabling faster understanding and analysis of more complex datasets such as historical volatility surfaces.

30 May 2012

TabbFORUM Video: Data Is Revenue

Video interview with Paul Rowady of the Tabb Group, primarily about how data management can break out from being just a back office function and become a source of competitive advantage in both the front office and in risk management.

For those of you with a curious mind, the perseverance to watch the video until the end, and possibly not such advanced years as me and Paul, the lead singer of Midnight Oil that he refers to at the close of the video is Peter Garrett, who looks like this:

[Photo: Peter Garrett]

Whereas I look like this:

[Photo: the author]

See, completely different. Obviously Peter has a great choice in hairstyle though...

04 April 2012

NoSQL - the benefit of being specific

NoSQL is, in my view, an unfortunate name for the loose family of non-relational database technologies associated with "Big Data". NotRelational might be a better description (catchy eh? thought not...), but either way I don't like the negatives in both of these titles, partly due to aesthetics and partly because they could be taken to imply that these technologies are critical of the SQL and relational technology that we have all been using for years. For those of you who are relatively new to NoSQL (which is most of us), then this link contains a great introduction. Also, if you can put up with a slightly annoying reporter, then the Cloudera CEO is worth a listen to on YouTube.

In my view NoSQL databases are complementary to relational technology, and as many have said, relational tech and tabular data are not going away any time soon. Ironically, some of the NoSQL technologies need more standardised query languages to gain wider acceptance, and there will be no prizes for guessing which existing query language will be used for ideas in putting these new languages together (at this point, as an example, I will now say SPARQL - not that this should be taken to mean that I know a lot about it, but that has never stopped me before...)

Going back into the distant history of Xenomorph and our XDB database technology: when we started in 1995, the fact that we used a proprietary database technology was sometimes a mixed blessing in sales. The XDB database technology we had at the time was based around answering a specific question, which was "give me all of the history for this attribute of this instrument as quickly as possible".

The risk managers and traders loved the performance aspects of our object/time series database - I remember one client with a historical VaR calc that we got running in around 30 minutes on a laptop PC, when it had been taking 12 hours in an RDBMS on a (then quite meaty) Sun Sparc box. It was a great example of how specific database technology designed for specific problems could offer performance that was not possible from more generic relational technology. The use of the database for these problems was never intended as a replacement for relational databases dealing with relational-type "set-based" problems though; it was complementary technology designed for very specific problem sets.
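To make the workload concrete, here is a small illustrative sketch of that kind of calculation - a basic historical simulation VaR over one instrument's price history, assuming the full history has already been retrieved in a single call (the access pattern described above). The function and the numbers are purely illustrative:

    // Illustrative only: a basic historical simulation VaR over one
    // instrument's full price history, pulled back in a single call.
    let historicalVaR (confidence: float) (prices: float[]) =
        // daily log returns from the price history
        let returns =
            prices
            |> Array.pairwise
            |> Array.map (fun (p0, p1) -> log (p1 / p0))
        // sort returns ascending and read off the loss at the chosen percentile
        let sorted = Array.sort returns
        let index = int (float sorted.Length * (1.0 - confidence))
        -sorted.[index]

    // Example: 99% one-day VaR from a small synthetic price history
    let history = [| 100.0; 101.2; 99.8; 100.5; 98.9; 100.1; 101.0 |]
    let var99 = historicalVaR 0.99 history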

The technologists were much more reserved; some were more accepting and knew of products such as FAME around then, but some were sceptical over the use of non-standard DBMS tech. Looking back, I think this attitude was partly down to a desire to build their own vector/time series store, but also, understandably (if incorrectly), a concern that our proprietary database would require specialist database admin skills. Not that the mainstream RDBMS systems were cheap or simple to maintain back then (Oracle DBA anyone?), but many proprietary database systems with proprietary languages can require expensive and on-going specialist consultant support even today.

The feedback from our clients and sales prospects - that our database performance was liked, but that the proprietary database admin aspects were sometimes a sales objection - caused us to take a look at hosting some of our vector database structures in Microsoft SQL Server. A long time back we had already implemented a layer within our analytics and data management system where we could replace our XDB database with other databases, most notably FAME. You can see a simple overview of the architecture in the diagram below, where other non-XDB databases (and datafeeds) can be "plugged in" to our TimeScape system without affecting the APIs or indeed the object data model being used by the client:

[Diagram: TimeScape Data Unification Layer]

Using this layer, we then worked with the Microsoft UK SQL team to implement/host some of our vector database structures inside of Microsoft SQL Server. As a result, we ended up with a database engine that maintained the performance aspects of our proprietary database, but offered clients a standards-based DBMS for maintaining and managing the database. This is going back a few years, but we tested this database at Microsoft with a 12TB database (since this was then the largest disk they had available), but still this contained 500 billion tick data records which even today could be considered "Big" (if indeed I fully understand "Big" these days?). So you can see some of the technical effort we put into getting non-mainstream database technology to be more acceptable to an audience adopting a "SQL is everything" mantra.
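For those curious about the general shape of such a layer, the sketch below gives the idea - client code written against one time series interface, with different stores plugged in behind it. The interface and names are simplified for illustration and are not the actual TimeScape API:

    // Simplified sketch of the data unification layer idea: client code
    // depends on one interface, and XDB, FAME or SQL Server backed stores can
    // be swapped in behind it without changing the client-facing API.
    // Names are illustrative, not the actual TimeScape API.
    open System

    type ITimeSeriesStore =
        // full history of one attribute of one instrument, oldest first
        abstract GetHistory : instrument: string * attribute: string -> (DateTime * float)[]

    // one possible backend: a simple in-memory store standing in for a real one
    type InMemoryStore (data: Map<string * string, (DateTime * float)[]>) =
        interface ITimeSeriesStore with
            member _.GetHistory (instrument, attribute) =
                match data.TryFind (instrument, attribute) with
                | Some history -> history
                | None -> [||]

    // client code sees only the interface, so the backend is interchangeable
    let latestClose (store: ITimeSeriesStore) instrument =
        store.GetHistory (instrument, "ClosePrice")
        |> Array.tryLast
        |> Option.map snd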

Fast forward to 2012, and the explosion of interest in "Big Data" (I guess I should drop the quotes soon?) and in NoSQL databases. It finally seems that due to the usage of these technologies on internet data problems that no relational database could address, the technology community has much more willingness to accept non-RDBMS technology where the problem being addressed warrants it - I guess for me and Xenomorph it has been a long (and mostly enjoyable) journey from 1995 to 2012, and it is great to see a more open-minded approach being taken towards database technology and the recognition of the benefits of specific databases for (some) specific problems. Hopefully some good news on TimeScape and NoSQL technologies to follow in coming months - this is an exciting time to be involved in analytics and data management in financial markets and this tech couldn't come a moment too soon given the new reporting requirements being requested by regulators.

 

 

 

22 September 2011

Internal model approval, risk management and regulatory compliance

Achieving regulatory approval can be challenging when we consider that regulators are concerned not only about the risk calculation methodology in place, but also about the quality, consistency and auditability of the data feeding the risk systems used for regulatory reporting.

The data management project at LBBW (Landesbank Baden-Württemberg), for example, was initiated to support LBBW’s internal model for market risk calculations, combined with the additional aim of enabling risk, back office and accountancy departments to have transparent access to high quality and consistent data.

This required a consolidated approach to the management of data in order to support future business plans and successful growth, and we worked with LBBW to provide a centralised analytics and data management platform which could enhance risk management, deliver validated market data based upon consistent validation processes and ensure regulatory compliance.

More information on the joint project at LBBW can be found in the case study, available on our website. Any questions, drop us a line!

 

 

 

04 May 2011

More formal management of instrument valuation needed

Xenomorph has today released its white paper “Instrument Valuation Management: management of derivative and fixed income valuations in a multi-asset, multi-model, multi-datasource and multi-timeframe environment”.

The white paper expands on the “Rates, Curves and Surfaces – Golden Copy Management of Complex Datasets” white paper Xenomorph published recently (see earlier post) and describes how, despite the increasing importance of instrument valuation to investment, trading and risk management decisions, valuation management is not yet formally and fully addressed within data management strategies and remains a big concern for financial institutions.

Too often, says Xenomorph, valuations (and the analytics used to process input and calculate output data) fall between traditional data management providers and pricing model vendors. This leads to the over-use of tactical desktop spreadsheets, where data “escapes” the control of the data management system, leading to increased operational risk.

Whilst instrument valuation is certainly not the primary cause of the recent financial crisis, the lack of high quality, transparent valuations of many complex securities resulted in market uncertainty and in the failure of many risk models fed by untrustworthy valuations.

“A deeper understanding of financial products reduces operational risk and promotes quality, consistency and auditability, ensuring regulatory compliance”, says Brian Sentance, CEO Xenomorph. “Clients’ requirements have evolved and portfolio managers, traders and risk managers recognize that it is no longer sufficient to treat valuation as an external, black-box process offered by pricing service providers”, he adds.

Nowadays, regulators, auditors, clients and investors demand even more drill-down to the underlying details of an instrument’s valuation. It is therefore important to implement an integrated, consistent analytics and data management strategy which cuts across different departments and glues together reference and market data, pricing and analytics models, for transparent, high quality, independent valuation management.

“Our TimeScape solution provides a valuation environment which offers rapid and timely support for even the most complex instruments, allowing our clients to check easily the external valuation numbers, based on their choice of model and data providers”, says Sentance. “Otherwise, what is the point of good data management if the valuations and the analytics used are not based on the same data management infrastructure principles?”

For those who are interested, the white paper is available here.

 

24 February 2011

Rates, curves and derived data management remains a neglected area following the crisis

Xenomorph has released its white paper 'Rates, Curves and Surfaces – Golden Copy Management of Complex Datasets'. The white paper describes how, despite the increasing interest in risk management and tighter regulations following the crisis, the management of complex datasets - such as prices, rates, curves and surfaces - remains an underrated issue in the industry, and one that can undermine the effectiveness of an enterprise-wide data management strategy.

In the wake of the crisis, siloed data management, poor data quality, lack of audit trail and transparency have become some of the most talked about topics in financial markets. People have started looking at new approaches to tackle the data quality issue that found many companies unprepared after Lehman Brothers' collapse. Regulators – both nationally and internationally – strive hard to dictate parameters and guidelines.

In light of this, there seems to be a general consensus on the need for financial institutions to implement data management projects that are able to integrate both market and reference data. However, whilst having a good data management strategy in place is vital, the industry also needs to recognize the importance of model and derived data management.

Rates, curves and derived data management is too often a neglected function within financial institutions. What is the point of having an excellent data management infrastructure for reference and market data if ultimately instrument valuations and risk reports are run off spreadsheets using ad-hoc sources of data?

In this evolving environment, financial institutions are becoming aware of the implications of a poor risk management strategy but are still finding it difficult to overcome the political resistance across departments to implementing centralised standard datasets for valuations and risk.

The principles of data quality, consistency and auditability found in traditional data management functions need to be applied to the management of model and derived data too. If financial institutions do not address this issue, how will they be able to deal with the ever-increasing requests from regulators, auditors and clients to explain how a value or risk report was arrived at?

For those who are interested, the white paper is available here.

04 February 2010

"Cut and Paste" Valuation Services

You can talk about more robust modelling, more stringent scenario testing and even moving everything onto an exchange, but unless we move the principles of good data management (in my view: consistency, security and quality of all types of data) into the front office then we will continue to get front-office mis-marking as described in this article in the FT.

Thanks to Ralph Baxter from Cluster7 for highlighting this article for me. Those of you interested in this topic of operational risk and spreadsheet mis-use should maybe go along to EuSpRiG this year, and take a look at a paper Xenomorph presented at a previous conference.

20 May 2009

OTC Valuation by SGSS

Given all the recent attention that OTC derivatives have received (see the Geithner letter), a topical update on the work we have done with Societe Generale Securities Services (SGSS) on OTC and structured product valuation services has been written up in Securities Industry News. The work involved extensive integration with Misys Summit, where our TimeScape data and analytics management system is used to provide a "Golden Copy" of market, reference and derived data for the derivative products being valued. The section on TimeScape says:

"The Summit FT solutions are integrated with SGSS' market data software tool TimeScape, licensed from London's Xenomorph in November 2007. This produces a "golden copy" of end-of-day prices from 15 different information suppliers. The unit also processes information related to 70 different currencies and 5,000 volatility surfaces, which give three-dimensional views of how much and fast a security can move up or down. With Summit's product, each surface can include between 200 and 500 data points."

From talking to some of the SGSS team at our recent user group, the thing they most seem to value about TimeScape is its ease of use in describing and managing any kind of product, allowing product and market data specialists to use and customise the system without the need for specialist technology knowledge. This echoes some of the things that were said about TimeScape after a demo to Lab49 last year.

08 April 2009

High Performance Spreadsheets

Another article about the operational risk generated by the usage of spreadsheets within the financial markets (see earlier posts) appeared in the April issue of Waters Magazine.
 
The article highlights how widely spreadsheets are used within financial institutions and suggests that the current regulatory requirements for more transparency and ad-hoc risk management might push the proliferation of spreadsheets even further. The article also refers to the progress and improvements made by Microsoft in recent versions of Excel to increase the security of spreadsheets.
 
Xenomorph has worked closely with Microsoft on hosting its time series database within SQL Server 2008. The case study we have written together describes how SQL Server 2008 offers integration with Office Excel 2007 so that, whilst the spreadsheet is still the end-user viewing tool, operational risk is reduced by using Excel 2007 as an analytics and reporting tool and not as a means of storing data.
 
Our TimeScape solution offers more than 700 easy-to-use add-in functions for Office Excel 2007, and we are currently working on the use of Excel Services, part of Microsoft Office SharePoint Server 2007, to further enhance the centralized approach to spreadsheets.
 
If you are interested in how Xenomorph solves the problem of spreadsheet management, then take a look at our (newly updated) website. There we explain how to solve the problem and how Xenomorph Spreadsheet Inside technology can bring unstructured spreadsheet data and complex calculations into a centralized data management system, increasing transparency and reducing operational risk.

Xenomorph: analytics and data management

About Xenomorph

Xenomorph is the leading provider of analytics and data management solutions to the financial markets. Risk, trading, quant research and IT staff use Xenomorph’s TimeScape analytics and data management solution at investment banks, hedge funds and asset management institutions across the world’s main financial centres.



