Charles Randell, chair of the Financial Conduct Authority and the Payment Systems Regulator, recently warned that a trio of ongoing trends powering disruptive finance could, if left unchecked, turn nasty.
The UK could be at risk of turning into an ‘algocracy’ where algorithms and not people or regulators have power over the financial system, according to Charles Randell, chair of the Financial Conduct Authority (FCA).
Speaking earlier this month at an event organised by Reuters, Randell offered a bullish outlook for disruptive finance in the UK but identified three rapidly evolving factors that could combine to create a dystopian financial future: the accelerating adoption of Big Data, machine learning and behavioural science.
“Some have said that in the future we will live not in a democracy, where the citizens decide how we are governed, nor in a bureaucracy, where officials like me decide, but in an algocracy, where algorithms decide,” he said.
“We need to anticipate the fundamental questions which Big Data, artificial intelligence and behavioural science present, and make sure that we innovate ethically to shape the answers,” he added.
An algocracy, he says, could exacerbate social exclusion and worsen access to financial services: as firms pursue profit maximisation through increasingly sophisticated Big Data, artificial intelligence and behavioural science, their systems will single out the most profitable, and the most risky, customers.
In a world that is moving towards greater connectivity of data through the ‘internet of things’ - with cars, homes and electrical appliances all now networked into the digital grid - Randell says datasets could become vastly more powerful.
“Rapid advances in the ability to store data cheaply have created enormous and detailed datasets about many different aspects of our lives. The largest of these datasets are held and controlled by a small number of big corporations,” he said.
Using artificial intelligence and machine learning systems, he says, these large corporations can mine Big Data sets for patterns more effectively than ever before.
“Whereas in the past firms could only target broad groups of consumers, these patterns can now be turned into conclusions about us as individuals. They can make predictions about our future behaviour, and then decide which products and services we should be offered and on which terms.”
Through behavioural science, these firms are also rapidly improving their understanding of human behaviour, enabling them to target marketing with ‘nudges’ that exploit our decision-making biases, informed by the Big Data sets that record our actions.
“Some nudges may be in consumers’ interests, as with auto-enrolment for pensions, but there is the potential for them to be used against our interests too,” he said.
Randell concedes that while these trends are, unchecked, a cause for concern, we should remain optimistic about advances in the use of data science within financial services.
“This rapidly evolving technology has already brought huge benefits to society. This includes smarter ways of detecting financial crime and market abuse, cheaper and faster transactions and greater access to affordable financial advice and guidance.”
“The UK fintech industry is world leading and bursting with new ideas. But there is no room for complacency. Technological innovation in financial services brings together two of the UK’s greatest assets and gives us the opportunity to lead the world in fintech,” he added.