
Sink or swim: financial authorities dive into data    

Global Government Fintech Lab 2023 panel session two: Gavin Curran, Mari-Liis Kukk, Peter Oakes and Siobhan Benita (moderator)


The Global Government Fintech Lab 2023’s second session analysed how financial authorities are striving to keep on top of – and maximise the value of – growing volumes of data, Daniel Tost reports

The exponential growth in data volumes has seen opportunities and challenges galore emerge for financial authorities across the globe.

The question of how they navigate their way through the increasingly choppy data ocean as efficiently and effectively as possible was the focus of the second panel discussion – ‘How can financial authorities capitalise on data to improve what they do?’ – of the Global Government Fintech Lab 2023 in Dublin.

Gavin Curran, head of the Central Bank of Ireland’s markets supervision division, opened the session. He was followed by Mari-Liis Kukk, head of the innovation department in Estonia’s Financial Supervision and Resolution Authority. Peter Oakes, the founder of independent network Fintech Ireland, completed the panel.

The session, moderated by former UK civil servant Siobhan Benita, saw the trio introduce their thoughts on the main question before the discussion took in topics including European Union (EU) data initiatives and how authorities should prioritise data-related actions.

‘Unbelievable’ quantities of data

Gavin Curran

Central banks receive “unbelievable” quantities of data, Curran said, describing some of the data-modernisation initiatives at Ireland’s central bank.

The authority is making growing use of machine learning (an artificial intelligence technique that teaches computers to learn from experience) to process a significant portion of suspicious transaction reports. There remains, he emphasised, an ongoing need for “human context” in data analysis.
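The combination Curran describes – a learned model scoring reports, with humans providing context on anything it flags – can be illustrated with a minimal sketch. This is purely hypothetical (the Central Bank of Ireland’s actual system is not public); it uses a toy naive-Bayes-style token scorer, with all function names and example data invented for illustration:

```python
from collections import Counter
from math import log

def train(reports, labels):
    """Learn per-token log-odds from labelled example reports
    (naive-Bayes style). All data here is hypothetical."""
    counts = {"suspicious": Counter(), "benign": Counter()}
    for text, label in zip(reports, labels):
        counts[label].update(text.lower().split())
    vocab = set(counts["suspicious"]) | set(counts["benign"])
    model = {}
    for token in vocab:
        # Laplace smoothing so unseen tokens do not zero out the score
        p_s = (counts["suspicious"][token] + 1) / (sum(counts["suspicious"].values()) + len(vocab))
        p_b = (counts["benign"][token] + 1) / (sum(counts["benign"].values()) + len(vocab))
        model[token] = log(p_s / p_b)
    return model

def score(model, report):
    """Sum the log-odds of known tokens; positive leans 'suspicious'."""
    return sum(model.get(tok, 0.0) for tok in report.lower().split())

def triage(model, report, threshold=0.0):
    """Route high-scoring reports to a human analyst - the 'human
    context' Curran stressed - rather than deciding automatically."""
    return "escalate to analyst" if score(model, report) > threshold else "auto-file"
```

The point of the sketch is the final routing step: the model narrows the volume, but the decision on flagged reports stays with a person.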

A data-quality framework (or ‘model’) has been developed. This should allow the central bank to detect potential inaccuracies early on, triggering prompt communication with data providers for resolution.
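A data-quality framework of this kind typically amounts to a set of validation rules run over each incoming submission, with failures routed back to the data provider. The rules below are invented for illustration (the central bank’s actual checks are not public):

```python
# Illustrative validation rules - names and thresholds are hypothetical.
RULES = {
    "missing_lei": lambda r: not r.get("lei"),
    "negative_notional": lambda r: r.get("notional", 0) < 0,
    "unknown_currency": lambda r: r.get("currency") not in {"EUR", "USD", "GBP"},
}

def check_submission(records):
    """Run every rule over every record, collecting failures keyed by
    record id so they can be reported back to the data provider."""
    issues = {}
    for rec in records:
        failed = [name for name, rule in RULES.items() if rule(rec)]
        if failed:
            issues[rec["id"]] = failed
    return issues
```

Running checks like these at the point of ingestion is what allows inaccuracies to be caught “early on”, before they feed into supervisory analysis.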

An order-book data analysis project is underway to help with market-abuse detection, for which the central bank is prototyping a new tool – the power of which he described as “just incredible”.

Other technology-based tools are already in use. “We have built a tool that allows our supervisors to go out and, over time – without any chance or influence of bias – record their observations in a tool that will give us a record, a sort of ‘single source of truth’,” he said.

Internal structures are, of course, also important. The central bank is, he said, establishing a ‘platform competence’ area, governance area, data analytics area and ‘data-as-a-service’ area. “We have pockets of this throughout the organisation, and to very high levels, but we’re now building on and enhancing that.”


Harmonising approaches across borders

Mari-Liis Kukk

Kukk kicked off by highlighting Estonian authorities’ relatively long track record of investing in digitalisation and data (broadening her comments beyond her own authority’s experiences).

The Baltic nation has, she said, had a chief data officer since 2018 (this position oversees the strategic co-ordination of data science and data governance, and includes oversight of topics such as artificial intelligence and open data); and she provided examples of the extensive range of online services available to the 1.3 million Estonians, helped by the existence of digital ID.

While digital developments recently experienced a temporary slowdown, Kukk stated that Estonia has now entered a phase of “great developments” again. She highlighted collaborative efforts with Ukrainian partners to create Estonian government app ‘mRiik’, which consolidates various digital services and provides a digital wallet for storing documents such as driving licences.

On the specific topic of financial data, Kukk mentioned the ongoing development of a so-called ‘positive credit registry’. This registry will allow creditors to view potential clients’ applications, leading to improved decision-making and a decrease in credit-related risks.

With Estonia being an EU member state, she also emphasised the importance of the bloc’s digital finance strategy in driving the harmonisation of approaches between financial authorities across the 27-member bloc.

“It basically means that the rules are the same and supervision should be the same,” Kukk said, welcoming international collaboration in terms of the use of SupTech (supervisory technology) tools. “It’s not smart to build up SupTech tools separately. Instead, we should do it all together,” she urged.

Data quality at heart of rules ‘torrent’

Peter Oakes

Oakes, a former senior central banker and regulator who sits on the board of a number of innovative companies, began by quoting figures about the surge in data volumes globally.

According to Statista, the annual volume of digital payment transactions will be about $14.78 trillion (about £11.93 trillion / €13.75 trillion) by 2027, he said. Meanwhile, according to the EU data strategy – which aims to create a ‘single market for data that will allow it to flow freely within the EU and across sectors for the benefit of businesses, researchers and public administrations’ – the volume of data produced globally is expected to grow from 33 zettabytes in 2018 to 175 zettabytes in 2025. “The numbers are huge,” Oakes said.

He then cited a Bank for International Settlements (BIS) paper (‘Big data and machine learning in central banking’, published in March 2021) that found that 80 per cent of 52 central banks surveyed (in 2020) said they discuss the use of ‘big data’ formally within their institution. “That’s up from 30 per cent back in 2015,” Oakes said.

From the perspective of companies in the payments sector, he highlighted the importance of compliance with the EU’s General Data Protection Regulation (GDPR) and ‘sensitive payment data’ requirements. When seeking licences for e-money institutions or neobanks (fully digital banks), discussions with central banks and regulators revolve around these requirements, he said, adding that this then “takes you into discussions about operational resilience and cyber-security”.

Data quality is of paramount importance. “The torrent of rules and regulations from the EU and other bodies: all this comes down to the quality of data,” Oakes said.

‘Mis-analysis’ worsens problems

The panellists were asked to explain where authorities should begin – what the fundamental considerations are – when it comes to data modernisation.

Curran answered that the Central Bank of Ireland’s starting point is often the alignment with the authority’s overall strategic objectives. “We try to do an inventory of datasets and understand where [they] fit in terms of our prioritisation overall,” he said.

When a decision is made as to where to focus, resources are invested into the development of supervisory tools. “When we make a choice around the development of a tool, we will have a value-stream map of what that tool will deliver for us,” he said.

Central banks and regulators typically share the priority objectives of financial stability, market integrity, consumer protection and fighting financial crime, Oakes said – and the last of these is where machine learning and AI are proving “really helpful these days” (he added that it was “interesting to see governments around the world now saying that financial crime in society is now an impact on financial stability”).

Asked to comment on how the global financial crisis of 2007-2008, or the more recent high-profile collapse of private banks such as Silicon Valley Bank (in March 2023), were possible considering the amount of data available to financial authorities, Oakes pointed to the “incredible” importance of data quality. “If that’s not right and it’s mis-analysed, the problems just manifest and they exponentially increase,” he said.

On the global financial crisis, Curran said: “The information was there, the regulator had it – but action was not taken, that was fundamentally the problem.”

‘Data is like money now’

Gavin Curran, Mari-Liis Kukk, Peter Oakes and Siobhan Benita (moderator)

Discussion turned to the related topics of ensuring data’s ‘integrity’ and how to prevent biased interpretation.

Oakes took the opportunity to highlight that the EU is implementing regulation on ‘high-value’ datasets that public sector bodies will have to make available for re-use, free of charge, within 16 months (the regulation falls under the EU’s open data directive). Six categories of high-value datasets are specified: geospatial, earth observation and environment, meteorological, statistics, companies and mobility.

“What is the public authorities’ approach going to be to ensure that the quality of that data is acceptable to be re-used, and hopefully kept ‘clean’ and not be ‘damaged’?”, Oakes asked. “That takes us back to [data] integrity – a lot of that comes down to ensuring that the data is locked down and you have the right quality of people looking at this information. Data is like money now – you have that obligation to treat it as safely as [you would] money.”

“If you speak about it from the view of the central bank developing a tool it will use itself, for example to work on a dataset, the dataset is in our own gift. And with that ownership, you hopefully are in a position to remove bias. Of course, you can’t remove the bias from tools that are developed outside of your own remit,” Curran said.

He mentioned a book titled ‘Humans Are Underrated: What High Achievers Know That Brilliant Machines Never Will’, authored by Geoff Colvin. “It speaks a bit about the need for us to have people in the right positions with the right skills to oversee the outputs from these pieces of technology that we’re all using,” he said.

‘It’s going to keep coming’

The session overall illustrated the diversity and growth in number of challenges for public authorities created by the digital data deluge – a similar sentiment to much of the Lab’s first session, which featured representatives of three public authorities discussing how they are striving to keep pace with the speed of innovation in fintech.

An audience member asked the panellists whether – and when – the exponential growth in data volumes would peak.

“I don’t think it’s going to slow down – I think it’s going to keep coming,” responded Curran. “So, it’s about prioritisation, picking battles and knowing what you want to focus on and what makes the biggest difference, at least in terms of the risks you observe or in terms of the priorities you have.”

“I don’t know about the influx of data,” said Oakes. “Is it ever going to stop? I’m as bad as everybody else. I’ve taken eight photos today [at the event]. They’re on my phone, they’ve been uploaded to [file-hosting service] Dropbox. I don’t know whether I’ll ever delete those photos. So, I’ve just added to that terabyte of data.”

And where does this leave authorities’ ability to regulate the quick-growing, multi-faceted world of digital finance? Are regulators able to keep up? “My answer is – of course – yes, I think we can do it,” responded Curran. “We need to be flexible, agile, open, engaged, talk to the firms, understand what their business models are and be grown-ups about it.”


Watch the session in full (it runs from about 02:12:00 to 03:05:26 in the full-day event video)