Netherlands Authority for Consumers & Markets

12/11/2025 | Press release

Blog Paul de Bijl: Externalities in the attention economy

Apps and platforms sometimes influence users more than those users want or even realize. How many times per day do smartphone users look at their screens? How many hours are people online? To some, it feels like an addiction, and it often has the hallmarks of one. And how many app users know what happens with their personal data? You use a navigation service, but the app and your smartphone send all kinds of data about you to data brokers. Would you stop using that app if you knew?

Some apps and platforms earn money by manipulating our attention and decision-making, for example through digital environments that lead users astray, not unlike how stage magicians divert their audience's attention while creating an illusion.

Those stage magicians perfectly illustrate that the manipulation of attention is an age-old phenomenon. In the early 1970s, Herbert Simon, the Nobel laureate in economics, coined the phrase 'the attention economy' in the context of an ever-growing quantity of information. Attention is scarce: attention to one thing comes at the expense of attention to other things. With digitalization, the topic has gained traction. As early as the 1990s, policy advisor Michael Goldhaber foresaw that the internet would lead to a massive hunt for attention.

Distraction and deception tend to be harmful, but do they constitute traditional consumer harm, like the harm from purchasing a faulty product, from incorrect product information, or from misleading labels? Or is it more nuanced? To answer those questions, we need to look under the hood: what does an app or platform actually trade in, and with whom? What is the business model?

Distraction and deception

Digitalization has vastly increased the possibilities for influencing behavior. The term 'captology' (from Computers As Persuasive Technologies) refers to techniques, such as peer pressure, desires, and triggers, that exploit the way our brains work, often without us even noticing; design choices that do so are known as 'dark patterns'.

One way this manifests itself is deception: getting people to do things they normally wouldn't want to do, such as staying on social media or in a game for too long, where they are exposed to ads or to techniques that entice them into making in-game purchases.

Another manifestation is distraction. Apps that seem to function normally may collect all kinds of user data without users being able to comprehend, or even suspect, what could happen to that data later: personal details may be funneled to data brokers, after which the data can end up anywhere and be used for anything.

Users don't have a clue what they agree to when ticking a box, let alone understand the terms and conditions of user agreements. Their 'consent' can be meaningless for other reasons too, for example when anonymized data sets are combined in such a way that individuals can be identified later anyway. See the trade in location data from the smartphones of identifiable EU and NATO employees (Dutch newspaper NRC, 4 November 2025).

The business model as the starting point

In well-functioning markets, businesses make an effort for their customers: competition forces them to, because competitors would otherwise seize their market share. However, app providers, digital platforms, and devices direct the interactions of users who are not necessarily the customers being put first. Some platforms operate as 'attention brokers'. Even though they offer their users free (or cheap) content or services, their real customers are advertisers, data buyers, and stakeholders that wish to influence the public debate.

That is how companies such as Google (provider of the namesake search engine and YouTube), Meta (provider of the social-media platforms Facebook and Instagram), and TikTok (social-media platform) operate. This type of business model harks back to at least 1833, when the New York Sun newspaper was launched at a price of just one penny. Its revenues predominantly came from advertisers, not from readers. The newspaper thus acted as a platform, facilitating interactions between different groups of users, just as a social-media platform brokers between users and buyers of attention or data. It has been said many times: if you don't pay for the product, you are the product.

Harm to users

Apps and platforms do not just direct financial flows, they also direct the attention of users while registering their behavior and data. The interests of their real customers may not be aligned with those of their users. Platforms direct attention deliberately, because some of them see their users as a source of attention and data that can be commercially exploited. Such practices can be harmful, for example when algorithms prioritize content that incites individuals to keep scrolling endlessly, even if it wastes their time, leads to unwanted purchases, or comes at the expense of their mental health or well-being. Minors are particularly susceptible to this, and they are often heavy users of the apps that keep users hooked.

In 2023, the European Parliament called for tackling deception, distraction, and addictive architectures. And rightly so, because frequent use of social media can lead to anxiety, depression, and sleep disorders, in addition to financial harm, loss of privacy, and reduced freedom of choice. Nowadays, various clinics offer help with internet addiction (such as addiction to games or social media), which is particularly prevalent among young adults.

On top of deception and distraction, there is a risk of unforeseen consequences, even for users who deliberately consent to the collection of their data. Providers give the impression that personal data is needed for the functionality of their app, or for attracting the advertisers that keep the app free. And then, years later, it turns out that data brokers have traded in your profile, enriched with identity data, on a global scale, for applications you do not endorse. Good luck reversing that.

Negative externalities

If profit comes at the expense of outsiders to commercial transactions, economists refer to this as negative externalities. That is, interactions between market participants affect 'third parties', such as users whose attention and data are commodities for apps, platforms, and commercial businesses. That is the first externality.

A second externality is that the effects of deception and distraction go further: the impact of commercial transactions carries over to people other than the users themselves. An algorithm that places controversial content at the top of a 'feed' influences how people view society and each other, which leads to polarization. In addition, mutual trust is eroded by the constant circulation of user profiles and data, when people experience the consequences only later in life. That undermines the economy as well as democracy. Holding users' attention in an artificial manner also creates social costs, both through the erosion of activities that are more meaningful than scrolling and through mental-health problems. Finally, addiction is a problem not only for one's friends and family but also for society at large.

Fully aware

The major players (including their boards) that make use of their users' attention and data are undoubtedly aware of the potential harm to those users as well as of the wider consequences. Multiple whistleblowers have raised these issues, for example a former head of security at WhatsApp and a former product manager at Facebook. Lawsuits have been filed in the US, supported by internal documents about mental-health studies that these businesses carried out themselves. This is reminiscent of the tobacco industry, which was sued in the 1990s for creating addictive products despite knowing their dangers.

On the other hand: even users who grasp the essence of the business model, and know that they can be harmed, will not easily abandon an app or platform, because of its useful functionality, because their friends are on the same platform, or because of its addictive nature. From that point of view, how bad is it that providers manipulate attention if users are (fully or partially) aware of the risks? Prohibition is hardly feasible (flagrant abuses aside). Discouraging usage through excise duties, as is done with alcohol and tobacco, is not practical either, considering the speed with which digital technologies evolve.

Protection is needed

Wouldn't government intervention be patronizing? The argument that consumers are aware of the risks ignores the use of techniques that undermine self-control or are addictive. Because of short-term profitability, providers do not protect users against themselves. In addition, conduct that is individually rational can still inflict harm on the group. That is why, next to competition enforcement and consumer protection, additional rules and regulations are very much needed. Some may call this paternalism, but it restores users' autonomy and actual freedom of choice with respect to their own attention and data.

The first measures have already been taken. See the Digital Services Act (DSA), the European regulation that imposes obligations on platforms to ensure a safe, predictable, and reliable online environment, such as due-diligence obligations with regard to content moderation and the handling of user accounts. The DSA also addresses the social consequences of the use and design of large online platforms. For example, these platforms must take measures to mitigate risks with regard to election processes, civil rights, and the (mental) health of users. Platforms must also enhance online safety for children, for example by implementing age verification and anti-grooming measures. They must do all of this on the basis of open norms: as risks become greater or increase in the future, a platform must take more measures.

In order to protect consumers further, the European Commission is working on a Digital Fairness Act (DFA), which focuses on, among other things, online manipulation and addictive architectures, as part of its 2030 Consumer Agenda. That is necessary as a counterweight to commercial practices that take advantage of consumers, and also because of the risks to (mental) health and the harm to society.

More is needed than competition enforcement or consumer protection

Normally, we protect competition because it leads to better market outcomes: lower prices, higher quality, and increased choice. However, if business models seek to capture and hold the attention of users, while the market is not functioning properly because those users lack crucial information (dark patterns, addictive techniques) and because of negative externalities (mental health, polarization, social impact), additional rules and regulations are essential.

The 'digital economy' is different because market failures can be more persistent and affect society at large. The effects are more far-reaching than those of lagging competition or unfair commercial practices in traditional sectors, especially in multisided markets (such as platforms), where users are not the key customers, and even more so if users do not pay for the services. At the end of the day, this is about protecting public interests. Strengthening the resilience of individuals will also strengthen the resilience of society.
