All organisational data falls into one of two categories: transactional data and reference data. In the context of regulatory reporting by financial institutions, the importance of transactional data is self-evident. After all, it is actual transactions that largely determine a financial institution's exposure to credit, market and liquidity risk.

However, reference data is important too. Also referred to as static data, reference data includes product and security descriptions, counterparties, calendars, and external market and price data. Reference data is descriptive in nature, shared across trades and reused in transactions. It will be affected by new regulations, especially in terms of additional data capture requirements (e.g. those outlined in Solvency II, Basel III and FATCA, the Foreign Account Tax Compliance Act).

Well-managed reference data can also provide a platform for business growth and competitiveness. Good reference data is a powerful foundation for business intelligence, providing insight into products and setting the stage for maximum return.

The financial services industry has long led other sectors in process automation. That said, of the two types of data, reference data is often more difficult to automate and traditionally requires more manual intervention. Yet every manual process in the management of data is costly, limited in scale and ties down employees who could otherwise be engaged in value-adding activities.

As the business grows, so does the quantity of data. And as new regulations such as Solvency II, Basel III and FATCA kick in, so does the quantity of information that is fed to the risk data warehouse and used to perform internal risk management and generate regulatory reports. Manual data management can only go so far before the volume of data becomes overwhelming.

Remember that the time for filing regulatory reports is limited – often as little as 20 days after the end of the period in question. Manual intervention also increases the risk of errors. In contrast, strategic automation of reference and transaction data capture, analysis and modelling allows for a real-time view of business exposure.

This reduces the likelihood of downstream inconsistencies. For instance, if certain characteristics of a counterparty, instrument or client change, they will be immediately reflected in the risk data warehouse and the subsequent risk reports. The following are some of the areas where low-quality reference data and poor reference data management can have a direct capital and cost impact on the business.

Regulatory Capital – Taking Basel III as an example, the incorrect classification of assets due to wrong or incomplete reference data (e.g. missing product categories, missing ratings, wrong counterparty information, etc.) can necessitate the retention of additional capital on the balance sheet, diverting capital from the business's core profit-making activities.
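To make the mechanism concrete, here is a minimal sketch of how a missing rating can inflate the capital requirement: when no rating is on file, a conservative fallback risk weight applies. The risk weights, the 8% capital ratio and the rating labels below are simplified assumptions for illustration, not actual Basel III calibrations.

```python
# Illustrative only: a gap in reference data (no rating on file)
# forces a punitive fallback risk weight, tying up extra capital.
RISK_WEIGHTS = {"AAA": 0.20, "A": 0.50, "BBB": 1.00}  # hypothetical mapping
FALLBACK_WEIGHT = 1.50   # conservative weight applied when the rating is missing
CAPITAL_RATIO = 0.08     # minimum capital as a share of risk-weighted assets

def capital_required(exposure, rating):
    """Capital to hold against an exposure, given its (possibly missing) rating."""
    weight = RISK_WEIGHTS.get(rating, FALLBACK_WEIGHT)
    return exposure * weight * CAPITAL_RATIO

exposure = 100_000_000  # $100m exposure (hypothetical)
with_rating = capital_required(exposure, "A")      # 4,000,000
without_rating = capital_required(exposure, None)  # 12,000,000
print(f"Extra capital tied up: ${without_rating - with_rating:,.0f}")
# prints "Extra capital tied up: $8,000,000"
```

The point of the sketch is that the capital cost of the data gap is a multiple of the correctly classified figure, even though the underlying exposure has not changed at all.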

For instance, the counterparty credit risk charge in the P&L depends on each counterparty's probability of default. Inconsistent or inaccurate reference data will lead not only to incorrect risk reporting but also to inaccurate P&L calculation. In large financial institutions where capital and P&L run into the billions of dollars, the cost of such seemingly minuscule errors in the risk data warehouse can be colossal.
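The scale of the effect can be seen with a simplified expected-loss calculation (EL = PD × LGD × EAD). All figures in the sketch below are hypothetical; it illustrates the arithmetic, not any institution's actual charge methodology.

```python
# Illustrative only: how a small reference-data error in a counterparty's
# probability of default (PD) scales with a large exposure.

def expected_loss(pd_, lgd, ead):
    """Expected loss for a single counterparty exposure: PD x LGD x EAD."""
    return pd_ * lgd * ead

ead = 5_000_000_000   # exposure at default: $5bn (hypothetical)
lgd = 0.45            # loss given default: 45% (hypothetical)

correct_pd = 0.010    # 1.0% PD from a clean, current rating feed
stale_pd = 0.008      # 0.8% PD from a stale or mis-mapped rating

error = expected_loss(correct_pd, lgd, ead) - expected_loss(stale_pd, lgd, ead)
print(f"Understated expected loss: ${error:,.0f}")
# prints "Understated expected loss: $4,500,000"
```

A 0.2 percentage-point slip in one reference attribute understates the loss estimate by millions, which then flows straight into the risk reports and the P&L.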

Product setup – Poor management of reference data can inhibit an organisation’s ability to create new products quickly. In the financial services industry where the speed of responding to changes in market conditions can make a large difference in revenues, developing robust, risk-assessed products on short notice is vital.

STP – Straight-through processing (STP) is a big part of banking today. From the smallest to the largest banks, there is hardly a financial institution without some form of STP. The business benefits are clear: faster processing, less manual intervention and substantial cost savings. But the success rate of STP depends on the quality and consistency of transaction and reference data.

For instance, the absence of an account number, BIC, Fedwire routing number or IBAN would prevent an outgoing or incoming bank transaction from being automatically applied to the beneficiary account. Each failed trade or transaction has a cost, the largest component being the staff hours required to manually repair it.
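The gating logic is simple to picture. Below is a minimal sketch (not any specific bank's system; the field names are hypothetical) of the kind of reference-data check that decides whether a payment goes straight through or falls into a manual repair queue:

```python
# Hypothetical sketch: a payment missing a routing identifier is
# diverted to manual repair instead of being applied automatically.
REQUIRED_ROUTING_FIELDS = ("account_number", "bic")  # illustrative field names

def missing_fields(payment):
    """Return the routing fields that are absent or empty on a payment."""
    return [f for f in REQUIRED_ROUTING_FIELDS if not payment.get(f)]

def route(payment):
    """Send a payment straight through, or flag it for manual repair."""
    gaps = missing_fields(payment)
    return "STP" if not gaps else f"REPAIR: missing {', '.join(gaps)}"

print(route({"account_number": "12345678", "bic": "ESSESESS"}))  # STP
print(route({"account_number": "12345678", "bic": ""}))          # REPAIR: missing bic
```

Every payment that lands in the repair branch is one that a person has to touch, which is exactly where the cost of poor reference data accumulates.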

Author's Bio: 

Graz Sweden AB provides financial services players with the most cost-effective way to access, manage, and analyze their data. Using the flexible data management platform HINC, Graz’s data warehouse infrastructure helps manage tens of thousands of investment portfolios for several institutions including 9 insurance companies, 120 banks and the largest fund manager in Scandinavia. For more information, visit www.graz.se