A pair of detailed papers from financial regulators, published in the wake of fines for regulatory breaches, reveals the creaking, manual risk processes at the heart of two major banks – along with a sustained reliance on spreadsheets.
Metro Bank’s risk processes suffered from “multiple gaps in the controls framework at every stage of the process, from data sourcing through to report generation”, a blistering paper from the Bank of England reveals.
A £5.3 million fine levied against the challenger bank shortly before Christmas 2021 was widely reported – as was the bank’s failure in how it classified its loan book, which triggered a share price collapse in 2019.
But the extent to which Metro Bank’s controls were a porous mess was only revealed in a 55-page “final notice” from the Bank of England’s Prudential Regulation Authority (PRA) published on December 21, 2021.
In a truly damning string of revelations the final notice reveals that in January 2017 the bank’s Regulatory Reporting Team (RRT) had “only one permanent member of staff (who was relatively junior)”; that its analysis of Risk Weighted Assets (RWA) was “largely manual… [and had] created key-person dependencies on a small number of individuals familiar with spreadsheets that were not scalable” and that “Metro Bank’s front-end data capture and systems did not allow it to capture all relevant information that the firm needed.”
(The bank has had a leadership shakeup since, appointing former Chief Risk Officer Dan Frumkin as its CEO in February 2020, and – as the PRA final notice reveals – has spent over £15 million remediating the issues.)
Regulators get tough
Metro Bank’s modest fine reflected a lack of previous issues. The bank made “significant steps to remediate the issues identified”; implemented an “extensive remediation programme to remedy the issues underlying the systems and controls which led to the RWA adjustment; [carried out] significant leadership and cultural changes; [and engaged] external consultants to identify issues, root causes and remedial actions” the PRA said.
As a detailed analysis of where the bank went wrong, however, the final notice deserves a close read – and it casts fresh light on a string of serious failures in basic controls at high-profile banks, as well as their reliance on spreadsheets.
The paper’s publication follows a £264 million penalty and criminal conviction for NatWest for money-laundering and a £46.5 million fine for Standard Chartered for misreporting its liquidity position and “failing to be open and cooperative” with the regulator. Both were levied in December 2021, as regulators show their teeth.
See also: The era of control loops for compliance
(Shockingly in the NatWest case, which related to failures between November 2012 and June 2016, the bank was waving through the deposit of black bags stuffed full of so much cash that they left a branch’s floor-to-ceiling safes overflowing. As a Financial Conduct Authority report in December 2021 noted: “A member of staff later told FCA investigators that they often found that the weight of the cash was too great for the bin liners which would then break. Staff would have to move the cash into stronger hessian sacks to prevent cash falling out.”)
Another PRA “final notice” published December 17, 2021 for Standard Chartered, meanwhile, emphasises spreadsheet errors that in one case amounted to $10 billion, as the bank “failed to ensure that key systems and controls supporting its regulatory reporting framework were adequate to meet the PRA’s Expectations”.
Banks’ use of spreadsheets
For anyone still under the misapprehension that all-singing, shiny new technology has transformed critical risk processes, the PRA’s final notice for Standard Chartered is an excellent corrective.
Here’s how it describes one process at Standard Chartered: “SCB’s… reporting process operated in the following way. First, the GT Fermat [vendor software] system would generate initial raw data versions of the USD FSA047/048 [liquidity reporting] returns. Next, GFS [The “Group Financial Services” team, based in India] would manually apply a set of adjustments… to reflect activity not automatically recorded by the GT Fermat system.
“Finally, after carrying out daily data controls testing (which included checking the accuracy of the data in the GT Fermat system against the data included in the FSA047/048 returns by way of reconciliation checks, validation and plausibility checks, and other data quality ‘sense checks’) GFS would send to GLRR [a London-based regulatory team] spreadsheets containing the adjusted returns, together with the original raw data and the list of adjustments it had made. GLRR would then review the data and use a further liquidity monitoring tool called the Liquidity Metric Monitor (“LMM”) spreadsheet to calculate and review a set of metrics derived from the FSA047/048 returns, including the USD Gap 2 Metric. GLRR would analyse the line-by-line movements in the returns and in the USD Gap 2 Metric and produce a management information file to track the variances.
“GLRR would prepare commentary to accompany the submission of the FSA047/048 returns. GLRR would then carry out a final verification of the data (known as a “four-eyes check”) by spot-checking individual cells in the working document and the management information document to ensure that the data was the same.
“Finally, GLRR would submit the data to the PRA via Gabriel” [The FCA’s online system].
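The workflow the notice describes – raw system output, manually applied adjustments, a reconciliation check that the adjustments tie out, then a “four-eyes” spot-check of individual cells – can be sketched in a few lines of code. This is a purely hypothetical illustration of that control pattern, not SCB’s actual tooling; all function names, line items, and figures below are invented.

```python
# Hypothetical sketch of the adjust-reconcile-verify pattern described in
# the PRA notice. All names and figures are invented for illustration.

def apply_adjustments(raw: dict, adjustments: list) -> dict:
    """Apply manual adjustments (line_item, delta) to raw return data."""
    adjusted = dict(raw)
    for line_item, delta in adjustments:
        adjusted[line_item] = adjusted.get(line_item, 0) + delta
    return adjusted

def reconcile(raw: dict, adjustments: list, adjusted: dict) -> list:
    """Reconciliation check: raw data plus adjustments must equal the
    adjusted return. Returns the line items that fail to tie out."""
    expected = apply_adjustments(raw, adjustments)
    return [k for k in set(expected) | set(adjusted)
            if expected.get(k, 0) != adjusted.get(k, 0)]

def four_eyes_spot_check(working: dict, mi_file: dict, cells: list) -> list:
    """Spot-check individual 'cells' across two documents, returning
    any cells where the two copies disagree."""
    return [c for c in cells if working.get(c) != mi_file.get(c)]

# Example: a fat-finger error in the adjusted return is caught by the
# reconciliation check (invented numbers).
raw = {"wholesale_outflows": 120_000, "retail_inflows": 45_000}
adjustments = [("wholesale_outflows", -10_000)]
adjusted_ok = {"wholesale_outflows": 110_000, "retail_inflows": 45_000}
adjusted_bad = {"wholesale_outflows": 110_000_000, "retail_inflows": 45_000}

assert reconcile(raw, adjustments, adjusted_ok) == []
assert reconcile(raw, adjustments, adjusted_bad) == ["wholesale_outflows"]
```

The fragility the regulator highlights is visible even in this toy version: every step depends on the adjustments list being complete and the spot-check sampling the right cells – there is no structural guarantee that an unchecked cell is correct.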
In the not-so-distant past, many large enterprises were reluctant to invest in cybersecurity, which was seen as a cost centre and little more. An avalanche of ransomware attacks has increasingly put paid to that attitude. In the financial services sector, however, risk and compliance seems in some corners to be “last year’s cybersecurity”, with many companies still reluctant to invest properly in resourcing teams adequately.
To what extent can emerging regulatory technology help banks move on from the kind of approaches above? We’re keen to hear the views of both vendors and risk professionals. Get in touch.