The FCA’s review into the use of AI across retail financial services is a timely and welcome intervention. We already find ourselves in a position where AI is rapidly moving from experimentation to implementation – in all walks of life. Financial services is absolutely no exception.
While it’s hugely exciting and presents a wealth of opportunity, it also brings significant risks. As an industry, we have to be alive to this challenge – and to the reality of how quickly this technology is advancing. AI is playing chess while many industries are still playing checkers, so any review has to be as forward-looking as possible.
It’s great to see that the regulator is looking to manage both the risks and opportunities of AI through existing frameworks – including Consumer Duty. It’s yet another proof point for those still hoping the Duty will be rolled back: far from retreating from it, the regulator has put it at the heart of its review and overall approach to AI.
Current applications
Across the market, we are already seeing some interesting developments. Looking at a key part of Consumer Duty – vulnerable customer management – there have been some good applications. Perhaps the best example is in contact centre environments, leveraging AI tools to analyse language, sentiment and behaviour to help monitor whether agents are identifying vulnerabilities. Another use is looking at correlations within transactional data to identify patterns in financial behaviour.
AI used for language interpretation is typically trained on a broad base of speech and is not specific to vulnerable customers. This is a key differentiator when considering using AI to identify vulnerabilities from a firm’s existing data sets: it is very difficult to train and verify models for customer vulnerability when the granular data does not exist.
Getting the house in order
It is now apparent that training frontline staff to identify all types, traits and severity of vulnerability – and to recall all the support options – is near impossible.
While valuable, AI for call centre analytics only helps with customers who phone in, and financial transaction data is only useful for identifying financial vulnerabilities – it is a very poor guide to health and lifestyle issues. These tools can’t be the extent of a firm’s attempt to manage customer vulnerability.
After all, each only focuses on one channel and one subset of customers, when really we need to look across our entire customer base. We shouldn’t be waiting for customers to tell us of their vulnerabilities either. Delivering the good outcomes required by Consumer Duty means we need to know who our vulnerable customers are and understand what outcomes they are receiving. Consumer Duty is not restricted to the customers who are easiest to assess; it applies to all consumers.
Furthermore, we shouldn’t be looking at AI as a silver bullet – particularly when it comes to customer vulnerability. We need to get our house in order first and that starts with the right technology, processes and data infrastructure to identify and monitor vulnerability in a consistent and objective way.
It means being able to record characteristics, circumstances, severity, coping mechanisms, support needs and outcomes – not as an open text field in a CRM or a line in a call log, but as structured data that can be monitored and analysed. How else can firms track how vulnerability evolves over the lifecycle of a product, ensure support remains appropriate or report on this information?
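To make the distinction concrete, here is a minimal sketch of what a structured vulnerability record could look like, as opposed to a free-text note. All field and class names here are hypothetical illustrations, not a specification from any firm or regulator; the point is simply that dated, typed records can be filtered, aggregated and reported on, where free text cannot.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class Severity(Enum):
    LOW = 1
    MODERATE = 2
    HIGH = 3

@dataclass
class VulnerabilityRecord:
    """One assessment at a point in time, so changes over the
    product lifecycle can be tracked rather than lost in call logs."""
    customer_id: str
    assessed_on: date
    characteristics: list[str]            # e.g. ["low literacy"]
    circumstances: list[str]              # e.g. ["recent bereavement"]
    severity: Severity
    coping_mechanisms: list[str]
    support_needs: list[str]              # e.g. ["phone support"]
    outcomes: list[str] = field(default_factory=list)

# Two dated assessments of the same (fictional) customer: because the
# data is structured, a firm can query it directly instead of
# re-reading free-text notes.
records = [
    VulnerabilityRecord("C001", date(2024, 1, 10), ["low literacy"],
                        ["job loss"], Severity.HIGH, [], ["phone support"]),
    VulnerabilityRecord("C001", date(2024, 6, 12), ["low literacy"],
                        ["re-employed"], Severity.LOW, ["family support"],
                        ["phone support"]),
]
high_risk = [r for r in records if r.severity is Severity.HIGH]
```

With records like these, monitoring whether support remains appropriate becomes a query over data, not a manual review exercise.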
We need these firm foundations in place, along with the ability to gather the robust intelligence required. Only then should we layer any AI magic on top.
Shaping the future
There’s no question that AI has the capability to enhance the industry, not to mention how consumers access information, advice and financial services. It has the potential to mobilise an entire generation into exploring their finances.
While we might know the trajectory, we don’t quite know how we will get there or how this will look. The FCA’s Sheldon Mills is absolutely right when he says that we need to shape this future. I’d argue that this starts with solid fundamentals of having objective data on consumer vulnerability, rather than inconsistent opinion or comments buried within existing systems.
AI is only as good as the data that supports it. As good as the technology is, there remain real concerns around hallucinations, exclusion of those who are data poor, and both embedded and amplified bias. None of these are conducive to good outcomes. Strong foundations give us the best chance to mitigate these risks: we need good, objective, granular data first, so we can train the AI models of the future.
We know many firms have struggled to build those foundations, which explains why the new guidance from the CII and PFS has been so well received. Finally, firms have a practical road map for implementing the principles-based guidance of Consumer Duty, along with a clear specification of what good looks like when it comes to IT systems, vulnerability classification and data infrastructure.
Andrew Gething is managing director at MorganAsh