The Competition and Markets Authority (CMA), the UK’s competition regulator, announced this month that it plans to publish an update in March 2024 to its initial report on AI foundation models (published in September 2023). The update will draw on a “significant programme of engagement” that the CMA has launched in the UK, the United States and elsewhere to seek views on the initial report and its proposed competition and consumer protection principles.
The CMA’s Initial Review
Foundation models are large, machine-learning models trained on massive amounts of data—the term “foundation” reflects their role as building blocks for various AI applications. The CMA’s initial report examined how the competitive markets for foundation models and their use could evolve, exploring the opportunities and risks to competition and consumer protection, and produced guiding principles to support competition and protect consumers as AI foundation models develop.
The report emphasized that sustained and effective competition between developers is vital to realizing the full potential of foundation models. The CMA highlighted false information, AI-enabled fraud, and fake reviews as immediate areas of concern for consumers. It also warned that if competition proves ineffective in the longer term, both consumers and businesses may find themselves locked into ecosystems with higher prices and restrictions from which they cannot easily break free.
Proposed Competition and Consumer Protection Principles
The CMA proposed seven principles to guide the ongoing development and use of foundation models to help individuals, businesses and the economy benefit from the innovation and growth foundation models can offer:
- Accountability for outputs provided to consumers.
- Access to data, expertise and capital without unnecessary restrictions.
- Sustained diversity of business models, including both open and closed.
- Choice for businesses from a range of deployment options.
- Flexibility to switch and/or use multiple foundation models according to need.
- Fair dealing with no anti-competitive conduct.
- Transparency of information about the risks and limitations of foundation model generated content for consumers and businesses.
The proposed principles are broad and emphasize that unrestricted access to a range of foundation models, and to information about them, is needed to deliver positive outcomes for individuals and businesses. Accountability and transparency remain key considerations in both AI development and deployment, and both were also highlighted earlier this month in the Bletchley Declaration on AI Safety.
The CMA is currently collecting views on its initial report and the principles themselves until January 12, 2024, from a range of stakeholders such as consumer groups and civil society representatives, leading developers and major deployers of foundation models, academics and other experts, the UK Government, and other regulators both in the UK and internationally. The March 2024 update is expected to cover reflections on market developments, assess the reception and implementation of the proposed principles by companies, provide updates on how foundation model developers acquire key resources, and consider the role of AI semiconductor chips in the foundation model value chain.
The CMA’s initial review sets out that it is important to also take into account considerations such as safety, data protection and intellectual property rights. To do this, the CMA will continue its work with fellow regulators within the UK Digital Regulation Cooperation Forum (DRCF). This will include a joint statement with the UK data protection regulator (the Information Commissioner’s Office) in Spring 2024 considering the crossover between competition, consumer and data protection objectives, as well as joint research with DRCF members on consumers’ understanding and use of services generated by foundation models.
Pillsbury will closely monitor further developments. Please visit our Artificial Intelligence practice page for more insights.