How Google assumed the role of an editor

Date: February 19, 2026.

The European Publishers Council (EPC) has filed an antitrust complaint with the institutions of the European Union against Google, targeting the new search logic in which users more often receive a ready-made answer at the top of the page and less often visit the original text.

The focus is on AI Overviews, AI-generated summaries that Google displays above traditional search results, as well as AI Mode, a conversational search feature.

The complaint is significant not because it is yet another media-versus-platform dispute, but because it precisely addresses the point where search becomes a distribution system for someone else's content, without a clear licence or compensation agreement.

The EPC is not only claiming that Google uses journalistic content without authorisation. It also claims that publishers have virtually no real choice.

The EPC states that the content is used "without authorisation, without effective opt-out mechanisms, and without fair remuneration, while at the same time displacing traffic, audiences, and revenues that are essential to the sustainability of professional journalism".

This is the central point of dispute. If opt-out is technically possible but penalised by the market, then the choice is no longer real.
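To make the trade-off concrete, the opt-out levers Google currently documents are coarse, page-level controls rather than a dedicated switch for AI Overviews. The following is an illustrative sketch, not a description of the EPC's demands; the exact behaviour of these directives is defined by Google's own documentation and may change:

    # robots.txt – the Google-Extended token withholds a site's content from Gemini
    # model training, but it does not remove pages from Search or from AI Overviews
    User-agent: Google-Extended
    Disallow: /

    <!-- page-level robots directive – limits snippets, including AI-generated
         summaries, at the cost of how the page appears in ordinary results -->
    <meta name="robots" content="nosnippet">

The only stronger lever is blocking Googlebot outright, which removes the page from Search altogether – precisely the market penalty the EPC describes.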

AI Overviews and AI Mode are not merely cosmetic changes to search. They undermine the very economy of the open web.

For two decades, a simple and clear market arrangement held: newsrooms funded reporting, editing and fact-checking; the search engine found and ranked their articles; the user clicked through to the source.

That traffic generated revenue, through advertising, subscriptions, or both, and that money sustained the system.

When search starts delivering an answer on its own page, that chain breaks at a point crucial for publishers: the click becomes the exception, not the rule.

Media outlets still perform the most expensive part of the job – producing reliable information – but the value of that work is increasingly retained within the platform.

The publisher is reduced to a supplier of raw materials, while the "final product" of the user experience is sold as a Google service.

Market structure and the illusion of choice

This is a question of market structure. If the dominant gateway to the Internet shifts from mediating access to sources to providing the "answering" function itself, then the balance of power changes.

The publisher no longer competes solely with other media outlets for attention but depends on whether the platform will even allow users to reach the source.

In such a scenario, the notion of publisher "choice" becomes problematic. Opting out of AI Overviews is not a neutral move but potentially a loss of visibility and, therefore, a loss of market.

The dispute revolves around a question that is existential for publishers: Is the open web still a system in which quality content is rewarded by users visiting the source, or is it shifting to a regime where the platform internalises value and sources become invisible infrastructure?


If such a model is established, the consequences will affect not only the business balance sheets of the media outlets but also the public sphere.

Professional journalism can only survive if there is stable income to fund reporting, fact-checking, and editorial responsibility.

If that revenue is cut off by diverting users away from the source, search will increasingly rely on content that is cheaper to produce, less verifiable, and easier to automate. In the long term, this means less reliable information in circulation, not just a different method of distribution.

Existing investigations and regulatory context

This complaint adds to an ongoing proceeding. In December 2025, the European Commission opened a formal investigation into whether Google was using its dominant position in search and on YouTube to favour its own AI products.

The investigation does not concern the development of artificial intelligence but rather market behaviour.


Specifically, it examines whether Google imposes conditions on publishers and creators that, due to their dependence on search and the platform, they cannot refuse, and whether it grants itself privileged access to content unavailable to competing AI systems.

Within this context, the initiative of the European Publishers Council serves a clear function.

It does not introduce a new issue but lends political and legal weight to an existing investigation, directing it towards a specific mechanism through which market power may be abused.

Google’s response and unresolved responsibility

Google defends itself predictably, but not frivolously. It claims that AI tools improve the user experience, help discover content and give publishers some control.

In parallel, in recent days Google has publicly attempted to address the most sensitive issue: the visibility of links.

It has introduced changes to make links to sources more obvious, with hover windows and more prominent source icons in AI responses.

This is a tactical move that acknowledges the problem but does not resolve it. Publishers do not dispute that their content is "mentioned".

They dispute that their content is used to keep users on the platform, while the click becomes an additional option rather than the primary path.


It is not just about money. Responsibility and reputation are at stake. When AI takes text, shortens it, and displays it as a ready-made answer, any error or misinterpretation is not attributed to the platform that generated the answer but to the media outlet from which the text was taken.

Thus, responsibility for accuracy remains with the newsroom, while control over how the information is presented passes to the platform.

This is especially sensitive when dealing with topics of immediate public interest, such as medical advice.

A recent study published by the Guardian found that warnings and limitations attached to AI responses are often displayed unobtrusively, or only after further clicks, while the AI summary stands out as the main answer.

This creates an impression of reliability that does not always match the level of editorial control present in professional journalism.

Towards binding rules rather than voluntary fixes

The complaint from the European Publishers Council has a specific goal. It seeks to establish a clear rule for market behaviour in situations where a platform uses someone else's content to provide answers to users instead of directing them to the source.

In such a model, there must either be explicit permission and appropriate compensation for the use of content, or the publisher must have a genuine option to opt out of that process without consequences for their search visibility.

In practice, the EPC is asking for exactly that – the ability to opt out of AI Overviews without incurring a market penalty.

This is where the broader significance of this dispute arises. This is not a classic copyright dispute, nor is it about whether artificial intelligence "takes" the content.


The issue is the changing role of search. When the platform ceases to be a channel to the sources and instead provides the final answer itself, it assumes a function with editorial consequences.

At that point, the question of competition is no longer limited to the ranking of links but concerns access to the information market and the conditions under which different actors can reach users in general.

A similar process is already underway in the United Kingdom. At the end of January, the UK's Competition and Markets Authority (CMA) launched a public consultation on possible changes to the way Google manages search and AI functions.

Among the measures under consideration are allowing publishers to opt out of AI summaries, or of having their content used to train AI models, as well as additional requirements for ranking transparency and easier access to competing search engines. The consultation is open until 25 February.

This move has broader significance, as it shows that the issue is no longer considered a dispute between media outlets and a single technology company.

The British regulator treats it as a question of market structure and the balance of power in the digital distribution of information. This clearly shifts the focus from individual complaints to systemic rules.

The likely outcome and long-term consequences

If the European Union decides to follow the same path, the most likely framework will be the application of competition law through binding conditions of conduct.

The reason is practical. While it is very difficult to prove unauthorised use of content in the traditional copyright sense, it is much easier to determine whether a platform is using a dominant position to impose unfair terms or to exploit a publisher's dependence on search.

The European Commission has already emphasised in its documents that it examines exactly those elements, including privileged access to content.

The European Publishers Council initiative aims to apply that existing framework clearly to new forms of AI search.


In practice, the outcome of this dispute will not be a ban on AI summaries. It is much more realistic to introduce clear and binding rules.

The essence of these rules is simple. If the platform uses content in a way that replaces the visit to the source, the publisher must either have the right to refuse without losing visibility in search, or receive appropriate compensation for such use.

In addition, the sources on which the AI answer is based must be clearly and visibly identified as an integral part of the answer, not as a side note.

Voluntary changes to the interface do not solve the problem, because publishers are seeking stable rules, not ones the platform can revise at will.

Without the establishment of such a framework, the market will undergo long-term adjustments that negatively impact users.

Publishers will quickly lock content behind apps, newsletters, and paywalls to protect their revenue.

This reduces the availability of quality information to a wider audience, while platforms increasingly rely on content that is cheaper to produce and easier to automate.

The result is a weaker foundation of reliable sources, even for AI systems themselves.

That is why this dispute has broader significance. European institutions do not view it as a conflict of interest between two industries but as a question of market power and access to information.

The aim is not to halt the development of artificial intelligence but to prevent a model in which value is systematically taken from the source without clear rules or fair distribution.

In this respect, the publishers' initiative marks a shift from general warnings to a specific regulatory requirement, which will likely result in a new standard rather than a single major ruling.

Source TA, Photo: Shutterstock