Navigating the UK’s Digital Regulation Landscape: Where are we headed?

Introduction
In 2000, the 4 biggest US corporations by market value were General Electric (founded in 1892), Exxon Mobil (drilling for oil since 1911), Pfizer (producing medicines since 1849) and Citigroup (with origins dating back to 1812). The fifth was Microsoft, founded in 1975.

Today the top 5 are Microsoft, Apple, Alphabet (parent company of Google), Amazon and Nvidia, with Meta not far behind. Microsoft and Apple were founded in the 1970s, the others in the 1990s or early 2000s. In just the 4 years since the DRCF was born, we have seen 5G, generative AI and 4 versions of ChatGPT take hold.

While in human years we would still be classed as in our infancy, our growth is more akin to dog years. The transformation we are seeing is now an overwhelming feature of the work of regulators like the FCA.

Last week I was in Washington DC for International Monetary Fund (IMF) and World Bank meetings and other international regulatory meetings. The week before, I made regional visits to Cumbria and Birmingham. In nearly all of these meetings, the role of Big Tech featured, relevant to our broad consumer protection and wholesale markets remit.

The subject of Big Tech was prevalent throughout. A debt adviser at Citizens Advice told me about the caseload caused by Buy Now Pay Later products promoted on social media. The Illegal Money Lending Team in Birmingham was concerned that prosecutions are becoming much harder with the automatic deletion of WhatsApp messages by serious criminals and loan sharks. Academics were grappling with how to deal with the lack of access to financial services that digitalisation can exacerbate. Building societies were debating how to ensure operational resilience as they move to the cloud and build a more agile and appealing tech interface.

In Washington, I learnt about US authorities’ recent antitrust case against Apple in relation to its control over how users make tap-to-pay payments using the near field communication (NFC) functionality of their iPhones. I also heard about recent developments in which some online US merchants combine payment history and other personal data to provide dynamic, individualised pricing for other products.

At the G20’s Financial Stability Board, and at IOSCO, the forum for international securities regulators, we continued discussions about how social media may have changed depositor behaviour and potentially accelerated bank runs in times of stress, calling for reflection on the financial stability tools that authorities have traditionally relied upon.

We also talked about how AI is increasingly prevalent in wholesale financial markets, which are already seeing amplified volatility, including in commodities markets where benchmark prices of core commodities are set.

My fellow DRCF CEO, Sarah Cardell of the CMA, had been in Washington the week before and made a speech about the competition impacts of how AI foundation models are being developed and governed.

There are at times diverging approaches taken to regulation across the world, yet technology does not stop at borders. While the EU is establishing a structure from which regulation will be created, in the UK, we are not creating separate regulations to apply to AI. For now.

That is why it is crucial for us to collaborate with each other as regulators but also with industry. Kate Jones, our CEO, describes the DRCF perfectly when she refers to us as the connective tissue that binds together regulation.

Today, as chair of the DRCF, I want to set out our priorities and how they align with our work at the FCA.

Leading a co-ordinated and effective effort to make the most of the opportunities of Big Tech – whilst having an open conversation about the risks and what can and cannot be mitigated – will continue to be a priority. We are outcomes focused at the FCA, pro-innovation and technology neutral.

We want to minimise disruption to our financial markets by making sure that they are operationally resilient. A small number of third parties that firms rely on could pose systemic risk to financial stability if their services were disrupted. The FCA, alongside the Prudential Regulation Authority and the Bank of England, recently consulted on proposals to have some oversight of these Critical Third Parties (CTPs).

Firms will still be responsible for ensuring the services they provide are operationally resilient, including when they are relying upon third parties. These proposals will create a new proportionate oversight regime for CTPs, aimed at strengthening the operational resilience of the financial sector as a whole, with the Treasury determining which third parties will be designated as critical. Enhanced operational resilience should contribute to maintaining the UK as an attractive place for firms to locate.

AI and digital hub launch
Towards the end of last year, many of you gave feedback on the future DRCF workplan. The top issue identified was AI governance, including how we can help co-ordinate delivery of the government’s principles set out in its white paper on a pro-innovation approach to AI regulation.

There were also suggestions that we explore the implications of Generative AI and improve our own understanding of the algorithmic assurance market. Deepening our understanding of the cross-regulatory implications of AI – and how we respond – will form a core part of our work within the DRCF AI and Digital Hub which will launch today.

The hub will be a shopfront for informal advice for tech innovators. Think of it as getting 4 regulators for the price of 1 – and that price is free to you. We want to help you get your products to market quickly, safely and sustainably.

Regulation tied to innovation
At the FCA, we have experience of hosting innovators through our programmes, including our innovation hub and our policy and tech sprints. We also run a permanent digital sandbox, helping firms test and refine ideas in a risk-free, controlled environment.

It also allows us to better scan the horizon. We hope the DRCF AI and Digital Hub will provide a sense of where technology and industry are heading.

Our permanent sandbox uses synthetic transaction and market data, combined with more than 200 data sets, including payments and transactions, investments and Companies House data.

We are enabling an open API marketplace. And we know from the responses to the DRCF’s call for input, that the use, transfer and governance of data is another key industry concern.

Open Banking and Open Finance hold so much potential for our economy. We were early adopters, but there are concerns that progress has stalled.

First the CMA, then the FCA and PSR, have facilitated a thriving ecosystem of fintechs, including by mandating the opening up of data.

We want to push on with open finance opening up data further, including across sectors, to increase competition and innovation. But we have to overcome resistance:

- from incumbent firms that own the data but feel the threat from fintech and Big Tech
- from some – not all – Big Tech firms who may want to access finance data but not reciprocate or pay for it
- from Big Tech firms in adjacent markets which can constrain competition in financial services markets
The dominance of a handful of firms and further entrenchment of their power would imperil competition and innovation. And, alongside promoting effective competition, the FCA has a primary objective to protect consumers from harm.

The Prime Minister has signalled that the future of AI relies on the development of safe AI. And the legislative foundations are being laid through the Digital Markets, Competition and Consumers Bill and the Data Protection and Digital Information Bill, both due to receive Royal Assent. They follow the introduction of the Online Safety Act.

In the meantime, there are outstanding issues of trust and transparency around the use of data – it was one of the reasons for the creation of the DRCF – and also outstanding issues around trust of AI and trust of Big Tech and implications for privacy.

As a parent of young children, I am also mindful of the grassroots-powered debate around the use of smart phones in childhood that has gained ground in recent weeks. These are issues that the ICO and Ofcom – who are integral parts of the DRCF – and the government are already grappling with.

What I would say is that we also know there were issues of trust in banking and financial services institutions in the wake of the 2008 financial crisis.

We saw that after the responsive way some banks reacted in the pandemic – with a bit of a push from the FCA it has to be said! – customers gained trust in these institutions.

It is possible to turn around sentiment by your sustained actions. In Australia, concerns over the use and sharing of data have been at least partly addressed through the Consumer Data Right, which gives citizens more control over their data. It also, interestingly, compels firms to share it if that is what the end user wants. To do that safely, we need to examine the need for digital infrastructure and the authentication of identity.

One of our priorities at the DRCF is to conduct and publish research on digital identity. We know other countries have made progress on this, and we know mistakes were made in the past in this country in how it was proposed. But technology and the opportunities have moved on; it is time for us to revisit this.

Boosting transparency and accountability
As I have described, digital markets are increasingly central to so much of our work and DRCF regulators will often be interacting with the same market players.

We are currently consulting on whether to adopt a looser public interest test, judged on a case-by-case basis, for announcing enforcement investigations.

Some City trade bodies have characterised this as ‘naming and shaming’ that would undermine competitiveness. That is not how we think about it.

We have observed how a number of our partner regulators have operated this approach in key markets for the UK economy and consumers. For example, Ofcom routinely publishes the names of firms and issues under enforcement investigation, as does the CMA.

The language used in the enforcement bulletins is measured and factual and can bolster transparency and accountability, something that the Public Accounts Committee has previously asked us to consider.

We have some experience of managing the balance between transparency, fairness to firms and competitiveness in how we operate our warning list, a part of our toolkit separate from enforcement.

For example, in September 2022, we issued a warning that the crypto firm FTX may be operating illegally in the United Kingdom – this was before any licence application had been considered or enforcement action. We judged it was necessary given the risks to consumers.

Some would argue that such an approach undermined the firm’s reputation and UK competitiveness when jurisdictions were vying to win business. In this case, in November 2022 FTX collapsed and there has recently been a criminal conviction linked to FTX in relation to one of the largest financial frauds in US history.

Such decisions are not easy and are often judged with hindsight. We hope to have a measured debate about these issues and will listen carefully to all feedback.

Response to input on Big Tech
I want to return to the subject of Big Tech as today at the FCA we have published our response to the Call for Input on the competition implications of Big Tech and data asymmetry.

While the respondents to our call for input did not identify any immediate harms from data asymmetry, we know we must remain vigilant. We will continue to take proactive steps towards developing a regulatory approach to Big Tech’s activities in financial services.

Data asymmetry may increase. Where that data is important to offering financial services, such asymmetry will reduce competition, leading to less innovation and worse outcomes for consumers.

There is also a risk that Big Tech firms become the primary access channel for retail financial services. This could squeeze out innovation from small players and also discourage the incumbent legacy institutions from continuing to invest.

With Big Tech now an essential component of the financial services supply chain, there is also the risk that the combination of cloud, data and AI will cement Big Tech’s power in partnerships with firms across financial services and other sectors.

We need more industry players to feel they have a part to play on the data pitch. Safe data sharing can benefit firms, markets and consumers. It is crucial for Open Banking and Open Finance.

But our Call for Input found that firms lacked evidence of the value of Big Tech firms’ data from their core financial services. Respondents believed that the data from Big Tech could have significant value but were unable to provide evidence given their lack of access to it. That is where we have a part to play at the FCA.

We know there are many challenges in implementing the right Future Entity structure for Open Banking. Last Friday, the FCA jointly published our Call for Input on the Future Open Banking Entity alongside Joint Regulatory Oversight Committee members (CMA, HMT and PSR).

But we also need to keep our eye on longer-term data sharing. That is why we propose:

- To work with industry to develop use cases for how Big Tech data could improve financial services.
- Where such data is useful, to explore incentives for data sharing and what is needed for a level playing field.
- To work with the PSR to understand the competition problems of digital wallets, where these could become the gateway for many financial services products.
Response to AI paper and our internal use of AI
In the FCA’s response to the government’s recent consultation on AI, we will prioritise clarity on the current deployment strategies of the firms we regulate. This will include a new edition of our machine learning survey with the PRA. Regulation will have to adapt to the speed, scale and complexity of AI.

At the DRCF, we will continue to support the adoption of responsible AI through the AI and Digital Hub. We will also conduct joint research into consumer use and understanding of generative AI.

The complexity of AI models may require a greater focus on the testing, validation and explainability of AI models as well as strong accountability principles.

We are investing in our digital skills at the FCA, including by creating a new digital hub in Leeds and recruiting more than 75 data scientists. We are exploring how we can use AI and machine learning (ML) in pursuit of our objectives. These technologies have already transformed the speed with which we can monitor and tackle scam websites, money laundering and sanctions breaches. But in a world that moves quickly, we also need to respond to the challenge of highly convincing deepfakes that can be deployed at lightning pace.

As the Prime Minister’s AI Safety Summit demonstrated, we cannot ‘solve’ AI in isolation. The DRCF created an international network last year and is engaging authorities in other countries to see how it can further collaborate.
