This is the third post in the series on Targeting a New Operating Model; in this article I will discuss Principle 3 – To Reinforce Control and Command.
Many corporations are inhibited in their ability to fully grasp their risk exposure by outdated or poorly integrated IT systems. As a result of multiple mergers and acquisitions, many banks have disjointed IT estates in which business systems are duplicated or excluded from overall data aggregation.
These disconnected and disparate systems make it harder to identify the information necessary to understand an institution’s broad spectrum of risk exposure.
To refocus and sharpen Control and Command across the enterprise, there are two key directives that need to be addressed:
Principle 3: To Reinforce Control and Command
- Focus on improving detection and reducing excess management
- Focus on quality, not quantity of controls
Focus on Improving Detection and Reducing Excess Management
Historically, risk frameworks have been designed around economic, regulatory and liquidity requirements, and by definition were more reactive than proactive. The key design principle of these systems was measurement rather than detection.
These frameworks were architected around fragmented controls and a silo-based approach to risk. The emphasis was on generating volumes of data, with detection mattering less. Risk systems would groan under the amount of data they had to consume, swamping the organisation with wave after wave of metrics and generating white noise and static that the organisation was unable to hear, let alone act on.
In addition, recent systemic failures across the industry have highlighted that many organisations do not possess a single, holistic perspective on the detection of key market threats and risks. As a result, systemic failures that led to massive losses were not detected in good time.
What is required is a new approach to risk management that requires less data and simpler output. The key is to produce improved detection of key threats and market risks, with expertise and domain knowledge applied in the analysis of the data.
Weak signals will need to be converted into strong indicators through improved connectivity of information and the triangulation of outcomes from independent sources. Only then can an organisation be confident that it is in a position to reduce excess risk management signals and focus on improving risk detection.
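As a minimal illustrative sketch of this triangulation idea (all names, sources and thresholds here are hypothetical assumptions, not taken from any specific risk platform), a weak signal is promoted to a strong indicator only when independent sources corroborate it:

```python
from dataclasses import dataclass

@dataclass
class Signal:
    source: str      # the independent system that raised the signal
    risk_type: str   # e.g. "counterparty", "liquidity"
    strength: float  # 0.0 (pure noise) .. 1.0 (certain)

def triangulate(signals, min_sources=2, threshold=0.5):
    """Promote a risk to a strong indicator only when several
    independent sources corroborate it above a strength threshold."""
    by_risk = {}
    for s in signals:
        if s.strength >= threshold:
            by_risk.setdefault(s.risk_type, set()).add(s.source)
    return {risk for risk, sources in by_risk.items()
            if len(sources) >= min_sources}

signals = [
    Signal("market-data", "liquidity", 0.6),
    Signal("settlement", "liquidity", 0.7),
    Signal("market-data", "counterparty", 0.4),  # weak and uncorroborated
]
print(triangulate(signals))  # {'liquidity'}
```

The design choice is that corroboration across *independent* sources, not raw signal volume, is what raises confidence; a single loud source never promotes a risk on its own.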
Focus on Quality, not Quantity of Controls
To date, the risk management approach at many banks has relied on developing excessive numbers of controls to imply comprehensive coverage.
This provides a false sense of security: it merely papers over the gaps in the risk management framework, obscuring systemic failures and key market losses until it is too late. Furthermore, the abundance of controls produces copious, sizeable risk reports that are generated but rarely viewed.
To refocus and sharpen the risk management approach, a “control of controls” function is required. This function would measure the quality of existing controls by conducting stress tests, with a view to rationalising over-controlled and immature control areas and optimising both the number and cost of controls across a specific business area.
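A “control of controls” assessment could be sketched as below. This is illustrative only: the control names, pass-rate threshold and bucket labels are assumptions, not a prescribed methodology.

```python
def assess_controls(stress_results, min_pass_rate=0.8):
    """Classify controls by their stress-test pass rate.

    stress_results maps control name -> list of booleans (one per
    stress-test run).  Controls are split into 'mature' and
    'immature' so over-controlled areas can be rationalised.
    Illustrative sketch; the 0.8 threshold is an assumption."""
    report = {"mature": [], "immature": []}
    for control, runs in stress_results.items():
        pass_rate = sum(runs) / len(runs) if runs else 0.0
        bucket = "mature" if pass_rate >= min_pass_rate else "immature"
        report[bucket].append((control, round(pass_rate, 2)))
    return report

results = {
    "limit-breach-check": [True, True, True, True, False],  # 4/5 pass
    "manual-eod-recon":   [True, False, False, True],       # 2/4 pass
}
print(assess_controls(results))
```

In practice the output would feed a review cycle: immature controls are strengthened or retired, and clusters of mature controls covering the same risk are candidates for consolidation, reducing both the number and the cost of controls.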
The risk architecture needs to avoid garbage-in, garbage-out scenarios. The underlying data needs to be mapped, managed and utilised against a central set of reference data, master data, positions and accounting data. Common and specific classifications need to exist throughout.
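The mapping of disparate source-system labels onto one common classification might be sketched as follows. All labels and categories here are hypothetical; the point is that anything a source system emits that cannot be mapped to the central reference data is rejected rather than silently aggregated, which is one way to keep the garbage out:

```python
# Hypothetical sketch of central reference data and per-system aliases.
REFERENCE_DATA = {          # central, shared classification
    "GOVT_BOND": "rates",
    "IRS": "rates",
    "EQ_CASH": "equities",
}
SOURCE_ALIASES = {          # each source system's labels, mapped once
    "UK Gilt": "GOVT_BOND",
    "Interest Rate Swap": "IRS",
    "Ordinary Share": "EQ_CASH",
}

def classify(position_label):
    """Resolve a source-system label to the common classification,
    refusing unmapped labels instead of passing them through."""
    key = SOURCE_ALIASES.get(position_label)
    if key is None:
        raise ValueError(f"unmapped label: {position_label!r}")
    return REFERENCE_DATA[key]

print(classify("UK Gilt"))  # rates
```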
Only by striving for a cross-product architecture spanning silos, geographical regions and asset classes can a scalable risk architecture and methodology be defined. The focus on quality, not quantity of controls will need to be built from the bottom up in order to create a true control-of-controls risk function.
Email: ian@IanAlderton.com
Tel: +44 (0) 7702 777770