
The Financial Conduct Authority (FCA) and the Information Commissioner’s Office (ICO) will draw up new rules on the use of artificial intelligence (AI) across financial services firms.
In a joint statement, the FCA chief executive Nikhil Rathi and UK Information Commissioner John Edwards said: “To support good practice into the future we will be developing a statutory code of practice for organisations developing or deploying AI and automated decision-making — enabling innovation while safeguarding privacy.”
The FCA and ICO said that regulation is not a brake on innovation but an enabler.
The statement says: “Done right, regulation isn’t a brake on innovation. It’s a bridge, connecting creativity with public trust. With the right approach, regulation becomes an enabler: providing the certainty firms need to invest, experiment, and grow responsibly.”
A roundtable was held last month with industry leaders to better understand the challenges firms face when deploying AI and how the FCA and ICO can support responsible innovation and the use of personal information.
While firms understand the broad rules, they want clearer examples of ‘what good looks like’ in practice and more opportunities for engagement to build confidence in trying new technologies.
To support good practice moving forward, the FCA and ICO will develop a statutory code of practice for organisations developing or deploying AI and automated decision-making.
In addition, both regulators will help firms develop, test, and evaluate AI as part of the FCA’s AI Lab.
The FCA also plans to host a roundtable with smaller firms later this year to better understand challenges around AI adoption. Meanwhile, the Digital Regulation Cooperation Forum has committed regulators to develop their collective understanding of how one another’s regulatory regimes might apply to AI and to identify and resolve any points of conflict.
The statement from Rathi and Edwards also highlights that firms are concerned about who holds responsibility when AI is developed by third parties.
To help with this, the statement notes that the ICO has published detailed analysis on allocating controllership across the generative AI supply chain, while the FCA has provided information on the responsibility of firms when seeking to adopt generative AI.
It also highlights that additional support has already been launched to help firms, including the Digital Sandbox, Supercharged Sandbox and AI Live Testing within the FCA’s Innovation Hub.
The ICO also offers the Innovation Advice Service, Regulatory Sandbox and Innovation Hub under its Innovation Services.
While these services are already available, the FCA and ICO say they will increase their visibility and signpost practical help to firms.
Concluding their statement, Rathi and Edwards call on firms and trade bodies to “keep talking”, not just when there’s a problem, real or perceived, but earlier in the innovation journey.
The statement says: “We can help firms do things differently. But we need their insight to do things better.”
“With regulatory agility and confidence to innovate and invest in new technologies, businesses will provide the UK with the fuel to power economic growth.”