The AI future is upon us but how can financial regulation adapt and learn?

By Ben Arram on Thursday 12 January 2023


Many financial firms are using AI already but careful consideration is still needed, writes Ben Arram, practice lead at Bovill.


The Bank of England, PRA and FCA recently combined forces to produce a discussion paper on the use and regulation of artificial intelligence (AI) and machine learning in financial services (DP5/22). 

This is not the first time regulators have extended their reach in order to meet their supervisory objectives.

What’s apparent is that AI is already deeply embedded in UK financial services. Around three-quarters of finance firms are either currently using AI in live environments, or are testing its application behind the scenes. 

Such widespread adoption of AI has significant implications for financial services, which is why the regulators are being so proactive and are keen to remain ahead of the curve. 

Any fears that they could be ‘asleep at the wheel’ are undoubtedly wide of the mark; they are active and on the front foot.

Regulating financial services means regulating AI

The discussion paper does raise an important issue: should AI be regulated at all?

The answer to that is clear. We have already seen how technology can cause systemic financial issues. 

The Global Financial Crisis was sparked in no small part by many institutions using similar, yet poorly understood, algorithms to those of their peers, meaning that small adjustments quickly snowballed into large systemic ones. 

Organisations thought that by adopting complex software solutions they were doing things differently, and giving themselves an advantage over competitors. Unfortunately for everyone, they were wrong.

Some have argued that, since 2007, AI has been implemented to address this issue by adding greater sophistication, adaptability, and an ability to learn. 

However, there are still big questions to be answered about how this can be properly tested. Moreover, this approach risks leading firms to the same, false sense of security.

There is certainly a need to make sure that AI is not being used in such a way that it creates systemic risks. Faced with pressure to increase the competitiveness of UK financial services, regulators need to balance protecting consumers with allowing innovation and disruption to take place. 

Regulators are also keen to harness the attributes of AI themselves. This includes exploring the social good that AI has to offer. 

For example, it can be a powerful tool in spotting patterns of behaviour that demonstrate a risk that a customer is vulnerable. AI can then be used to tailor financial services to those vulnerable consumers, mitigating the potential for harm and ensuring better consumer outcomes.

Principles first

The FCA is already using AI as part of its supervisory and monitoring toolkit. The regulator doesn’t necessarily need to understand every line of code firms are using, just what can potentially go wrong and what the early warning signs are. This is analogous to the approach it takes in other areas of regulation, such as the Senior Managers and Certification Regime (SM&CR), the Consumer Duty and Operational Resilience. 

These all set out the parameters of a principles-based approach, which gives companies plenty of guidance on the conduct expected of them, as well as on the enforcement actions that will follow if the rules are not adhered to.

It is increasingly up to firms to self-monitor and keep within the rules. Equally, the regulators cannot and will not rely on this alone. 

Regulators will observe the metrics without looking at every algorithm. When issues become apparent, scrutiny will be tightened, for example through ‘Dear CEO’ letters or section 166 skilled person reviews.

Many firms are using AI already. What they need to be clear about is where and how AI affects their operations. Firms need to think carefully about whether, in implementing AI, they are unwittingly excluding or disadvantaging certain customer segments or individuals. 

In any case, the Consumer Duty, which will take effect from July 2023, requires that firms consider their distribution chains in detail. 

They will have to be able to show the FCA that they are ensuring good consumer outcomes across the entirety of their distribution chain. 

This includes all stages of product manufacturing and distribution, as well as service execution and ongoing consumer support. Firms should therefore also be thinking about how AI impacts all these aspects.

It is true that the future is here already. It is also true that much of the regulation to keep AI on track is already in place, too.


The views and opinions expressed are not necessarily those of AltFi.
