Sponsored by: Informatica
The implications of regulatory reform for global financial markets have never been greater. In response to the global financial crisis, the Basel Committee issued several principles for devising a sound risk management system, with specific reference to data aggregation. Efforts to improve the ability of banks to aggregate risk data are aimed primarily at restoring the strength and viability of global markets, enabling firms to better withstand future episodes of economic distress, a notion that is widely accepted by banking institutions.
Sponsored by: MISYS
Until relatively recently, credit limit management was viewed as a task on either the trading or banking book side of an institution that did not require a specific aggregation layer to become bank-wide. Limits were managed at a counterparty or company level, and by region or industry sector. However, the subprime mortgage crisis exposed fundamental weaknesses in the credit risk and limit management frameworks at many financial institutions. Major banks found themselves unable to determine their aggregate exposures, or to report the necessary information on a timely basis and in an acceptable format. This situation was further exacerbated by the Eurozone sovereign debt crisis that soon followed, which blurred the lines between developed economies, traditionally regarded as safe investments, and developing economies that were deemed riskier.
Sponsored by: Thomson Reuters
The Basel Committee on Banking Supervision (BCBS) has proposed a range of principles for effective risk data aggregation and reporting, which all systemically important banks must implement by 2016. The effective implementation of these principles is expected to bolster risk management procedures and improve the decision-making capabilities of banks, thereby strengthening financial stability. It is widely held that the principles set out by the BCBS are a necessary and positive response to the shortfalls the financial crisis exposed in banks' risk reporting frameworks and utilisation of data. Despite the huge volumes of risk data continuously generated at large financial institutions, the crisis made clear that these banks were unable to aggregate that data across all business units and geographies in a timely manner.
Sponsored by: Calypso
Until recently, liquidity risk was not a primary factor taken into consideration by banks when developing business strategies. To an extent, this was understandable, as banks have traditionally traded relatively simple securities in highly liquid markets. However, incentivised by advances in financial engineering and increasing competitive pressure over the past 10 years, banks have begun to trade more heavily in complex financial instruments and in more opaque markets.
Sponsored by: QlikView
Prior to the financial crisis of 2008, data governance was largely seen as a ceremonial process. However, the crisis forced financial institutions across the world to conduct comprehensive reviews of their data management strategies in order to improve their risk profiles and avoid miscalculations that could prove detrimental to the business. To this end, these institutions have sought to establish strategies that exert adequate control over, and properly govern, their data management processes.