UCSD - University of California - San Diego


Missing Information Can Misinform

Published Date

May 07, 2026


Key Takeaways

In the online attention economy, new UC San Diego research finds that making science more clickable or shareable can help some readers learn more - but leaves many others with an incomplete understanding.

To get people to pay attention to science, you have to make it engaging. But what makes content engaging often comes at the cost of detail - shaping what people learn and what they think they've learned. The result: People can come away with the wrong idea, even when what they read isn't factually wrong.

That tension sits at the core of research from Marta Serra-Garcia, a behavioral economist at the University of California San Diego's Rady School of Management. The study, published in the American Economic Review, examines how incentives in the online attention economy shape the way scientific information is communicated - and what readers ultimately take away from it.

A trade-off in the attention economy

You don't need bad actors for people to get the wrong idea. Incomplete information can be enough.

Crucially, the research finds that attention-grabbing summaries are not more likely to be factually inaccurate. Instead, they tend to include less information - especially key details about how studies were conducted.

"This is not a simple story that clickbait is bad," said Serra-Garcia, associate professor of economics and strategy and Phyllis and Daniel Epstein Chancellor's Endowed Faculty Fellow at UC San Diego's Rady School. "You need to get people's attention in order for them to learn something, and it's good to encourage curiosity. Yet there's a trade-off: Material designed to engage can also unintentionally contribute to the kinds of misunderstandings that can fuel misinformation."

The finding comes from a large, multi-stage experimental study in which freelance writers produced nearly 600 summaries of actual scientific research, and more than 3,700 participants were then tested on what they learned from them.

Why "in mice" matters

In one study used in the experiment, a compound in broccoli reduced cancer cell growth - in mice. Leave out those last two words, and the finding can sound far more directly relevant to human health than it actually is.

"Why can't we say 'in mice'?" Serra-Garcia said. "It's not very hard to add. It's two words. But once you say 'in mice,' maybe fewer people will click."

The results were consistent: Summaries written to attract attention were shorter, easier to read and more engaging - but included less detail, especially about sample sizes and methods.

Given the option to seek out more information, most readers did not. That mirrors real-world behavior: Studies of social media use suggest most content is shared without users ever clicking through to read more.

Among those who relied on summaries alone in Serra-Garcia's study, knowledge dropped by about 6-7 percentage points. Readers were also more likely to draw incorrect conclusions - such as assuming findings applied to humans or reflected firm medical guidance.

Inside the experiments

To isolate these effects, Serra-Garcia conducted a multi-stage experimental study. In the first stage, 149 freelance writers produced nearly 600 summaries of the same set of studies - covering topics such as cancer, sleep, vaccines and climate - under different instructions: to inform readers accurately, or to attract attention by encouraging clicks or shares.

In the second stage, more than 3,700 participants read those summaries under different conditions, including whether they could click through for more information.

Behavioral economist Marta Serra-Garcia, UC San Diego Rady School of Management.

The pattern held across experiments: Attention-driven summaries increased engagement and prompted some readers to learn more - but left many others with a less complete understanding.

AI and the attention economy

The same pattern emerged when a human wasn't doing the writing. In additional tests, when a large language model was prompted to attract attention, it also produced less detailed summaries - suggesting the effect is driven less by who creates the content than by the objective it's optimized for.

For Serra-Garcia, the findings point to an ongoing challenge for researchers, journalists and institutions alike.

"How do you make science engaging and important to readers," she said, "without missing the essentials that convey the full picture?"

The research was funded in part by National Science Foundation grant no. 2343858.

Read the full study: "The Attention-Information Trade-off."
