
Considerations for how regulators can support the adoption of artificial intelligence (AI) have been set out in a report co-produced by the Bank of England (BoE) and the UK's Financial Conduct Authority (FCA).
AI is increasingly used in financial services, but regulation at both national and international level remains in development.
The BoE and FCA set up an AI Public-Private Forum (AIPPF) in October 2020 to ‘share information, deepen collective understanding of the technology and explore how to support the safe adoption’ of AI across financial services.
The AIPPF’s final report has now been published, the culmination of four meetings and a series of workshops that discussed how to overcome barriers to adoption, as well as challenges and risks, in three areas: data, model risk and governance.
The report comes ahead of a government white paper on AI regulation – which is expected to build on the Plan for Digital Regulation published last July – and five months after the government presented its National AI Strategy. The latter was billed as the ‘start of a step-change for AI, recognising the power of AI to increase resilience, productivity, growth and innovation across the private and public sectors.’
On alert to regulatory fragmentation
The AIPPF’s 47-page publication contains a 21-point summary structured around the three areas, as well as setting out next steps.
‘The discussion on the safe adoption of AI has only just begun,’ the report notes, stating that regulators ‘could start’ to support innovation and AI adoption by providing clarity on how existing regulation and policies apply to AI. ‘Regulatory alignment will catalyse progress,’ the report states.
The creation of new rules is going to be pivotal. The National AI Strategy said the UK’s Office for AI (part of the Department for Digital, Culture, Media & Sport and Department for Business, Energy & Industrial Strategy) would develop a national position on governing and regulating AI through a white paper that would ‘set out the government’s position on the potential risks and harms posed by AI technologies and our proposal to address them’. It is set to be published in ‘early’ 2022 but no specific date has been publicised.
Although the UK has left the European Union (EU), the AIPPF report acknowledges the importance of international developments. The European Commission proposed a draft regulation on AI 10 months ago and the AIPPF notes that ‘there may be a need to avoid regulatory fragmentation where possible, both domestically and internationally, and between different sectors.’
‘Harmonising regulation at these levels would help ensure accountability and manage risks without stifling innovation,’ the AIPPF report states, adding that ‘contradictions and vagueness make it harder to understand where the differences are between foreign and UK regulations and impede the use of AI models developed outside of the UK.’
The AIPPF report goes on to state that ‘at the same time, the topic of possible regulatory responses to AI is complex… there is a risk that regulation will be too strict and too early. Instead, any regulation should aim to be flexible and it may be that principles-based is the most effective form.’
Aiming to ‘broaden engagement’
The report acknowledges that some UK regulators have tried to address the use of AI in their respective remits. For example, the Information Commissioner’s Office (ICO) has issued guidance on ‘explainability’ that can help companies to use AI in a way that complies with data protection laws.
Other relevant forums include the Digital Regulation Co-operation Forum, which was formed in July 2020 by the Competition and Markets Authority (CMA), ICO and Office of Communications (Ofcom).
The AIPPF was co-chaired by the BoE’s deputy governor for markets and banking, Dave Ramsden, and, from the FCA, initially by Sheldon Mills, executive director for consumers and competition, then Jessica Rusu, chief data, information and intelligence officer.
Observer members of the AIPPF included the Office for AI, as well as the Centre for Data Ethics and Innovation (CDEI); Fixed Income, Currencies and Commodities Markets Standards Board (FMSB); HM Treasury; and ICO. Global Government Fintech lists all 21 private sector and academic members of the forum at the end of this article.
Ramsden and Rusu write in the report’s foreword that their authorities plan to publish a discussion paper on AI ‘later this year’ that will ‘broaden engagement to a wider set of stakeholders’.
AI Public-Private Forum (AIPPF)
Global Government Fintech lists below the 21 private sector and academic members; the 21 were selected from more than 100 applications (according to p5 of the final report).
- Michael Baldwin (ex-Google)
- Jason Barto (Amazon Web Services)
- Fiona Browne (Datactics)
- Javier Campos (Experian)
- Hugh Christensen (Amazon Web Services)
- Cosmina Dorobantu (Alan Turing Institute)
- Mike Dewar (Mastercard)
- Sarah Gadd (Credit Suisse)
- Dan Kellett (Capital One UK)
- Rachel Kirkham (MindBridge AI)
- Shameek Kundu (Truera)
- Jessica Lennard (Visa)
- Andy Moniz (Acadian Asset Management)
- Owen Morris (Aviva)
- Gwilym Morrison (Royal London)
- Harriet Rees (Starling Bank)
- Kate Rosenshine (Microsoft UK)
- Jas Sandhu (Royal Bank of Canada)
- Amy Shi-Nash (National Australia Bank)
- Phil Tetlow (IBM UK)
- Philip Treleaven (University College London)
Source: p44 of the final report
FURTHER READING
‘Singapore authority publishes methodologies for responsible AI’ – our news story (8 February 2022) on assessment methodologies, published by the Monetary Authority of Singapore (MAS), to steer and encourage the responsible use of AI by financial institutions such as commercial banks – the methodologies were compiled by MAS and more than 20 private companies working collectively as a consortium called ‘Veritas’
‘UK government presents artificial intelligence strategy’ – our news story (24 September 2021) on the UK’s ‘National AI Strategy’
‘UK regulators: machine learning deployments set to double in financial services’ – our news story (24 October 2019) on a report, ‘Machine learning in UK financial services’, produced by the BoE and FCA