September 18, 2025
In 2022, after Elon Musk bought what’s now X, the company laid off 80% of its content moderation team and made Community Notes the platform’s main form of fact-checking. Previously a pilot program at Twitter, Community Notes lets users propose attaching a note to a specific post — usually to add context or correct an inaccurate claim. If users with a range of viewpoints rate the note as helpful, as determined by X’s algorithm, the note is appended to the post. Other social media companies, including Meta and YouTube, have since adopted similar systems.
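X has open-sourced its Community Notes scoring code, which at its core uses a matrix-factorization model: each rating is predicted from a global mean, a rater intercept, a note intercept, and the product of rater and note "viewpoint" factors, and a note is attached only if its intercept — the helpfulness left over after viewpoint alignment is explained away — clears a cutoff. The sketch below illustrates that idea only; the toy ratings, hyperparameters, and 0.4 threshold are invented stand-ins, not the production values.

```python
import numpy as np

# Toy ratings: rows = raters, columns = notes; 1 = "helpful", 0 = "not helpful",
# NaN = no rating. Note 0 is rated helpful by everyone; note 2 is mostly disliked.
R = np.array([
    [1.0, 1.0, np.nan],
    [1.0, 0.0, 1.0],
    [1.0, np.nan, 0.0],
    [1.0, 1.0, 0.0],
])
mask = ~np.isnan(R)
n_users, n_notes = R.shape

rng = np.random.default_rng(0)
mu = 0.0                                 # global mean rating
bu = np.zeros(n_users)                   # rater intercepts (leniency)
bn = np.zeros(n_notes)                   # note intercepts (cross-viewpoint helpfulness)
fu = rng.normal(0, 0.1, (n_users, 1))    # 1-D rater viewpoint factors
fn = rng.normal(0, 0.1, (n_notes, 1))    # 1-D note polarization factors

lr, reg = 0.05, 0.03
for _ in range(2000):
    # Predicted rating = mean + rater intercept + note intercept + viewpoint match.
    pred = mu + bu[:, None] + bn[None, :] + fu @ fn.T
    err = np.where(mask, R - pred, 0.0)  # only observed cells contribute
    mu += lr * err.sum() / mask.sum()
    bu += lr * (err.sum(axis=1) - reg * bu)
    bn += lr * (err.sum(axis=0) - reg * bn)
    fu += lr * (err @ fn - reg * fu)
    fn += lr * (err.T @ fu - reg * fn)

# A note is attached only when its intercept clears a threshold,
# i.e. raters across viewpoints find it helpful (0.4 is illustrative).
helpful = bn > 0.4
print("note intercepts:", np.round(bn, 2))
```

Because the viewpoint factors absorb agreement that is merely ideological alignment, a note rated helpful only by one like-minded cluster ends up with a low intercept even if its raw approval rate is high.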
A University of Washington-led study of X found that posts with Community Notes attached were less prone to going viral and got less engagement. After getting a Community Note, on average, reposts dropped 46% and likes dropped 44%.
“We found that Community Notes are effective when attached, especially in reducing engagement that signals support for the content, such as reposts and likes,” said senior author Martin Saveski, a UW assistant professor in the Information School. “But the spread of misinformation on social media is complex and multifaceted, and it requires multiple approaches working together to effectively curb it. Systems like Community Notes are an important addition to the platforms’ toolbox.”
The team published its findings Sept. 18 in Proceedings of the National Academy of Sciences of the United States of America.
Between March and June of 2023, researchers tracked 40,000 posts for which a note was suggested. Of those, 6,757 had notes that were deemed helpful and attached. The team tracked each post for 48 hours after its note was attached and compared noted posts with un-noted ones on two key dimensions: engagement, such as likes and reposts, and diffusion.
Diffusion accounts for how a post spreads through the social network — essentially its virality. For example, do only people who follow an account engage with a post?
“We know from other studies that false information typically spreads faster, more broadly and more virally than true information does,” said lead author Isaac Slaughter, a UW doctoral student in the Information School. “We found that Community Notes significantly change the way information spreads through a network. People who are distant in the social network from the person who posted the misinformation are much less likely to interact with the post. But people close to the source — followers, for instance — tend to be less affected by the note.”
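One common way to operationalize “distance in the social network” is hop count over the follow graph: direct followers are one hop from the poster, followers-of-followers two hops, and so on. The breadth-first-search sketch below is a generic illustration of that measure, not the paper’s actual code; the function name and toy graph are invented.

```python
from collections import deque

def hop_distance(follower_edges, poster, users):
    """BFS over the follow graph: distance of each user from the poster.

    follower_edges maps a user to the set of accounts that follow them,
    so distance 1 = direct followers, distance 2 = followers-of-followers.
    Unreachable users get infinite distance.
    """
    dist = {poster: 0}
    queue = deque([poster])
    while queue:
        u = queue.popleft()
        for v in follower_edges.get(u, ()):
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return {u: dist.get(u, float("inf")) for u in users}

# Hypothetical graph: A posts; B and C follow A; D follows B; E follows no one shown.
edges = {"A": {"B", "C"}, "B": {"D"}}
d = hop_distance(edges, "A", ["B", "C", "D", "E"])
print(d)  # B and C are 1 hop away, D is 2 hops, E is unreachable
```

The study’s finding — that a note suppresses engagement more among distant users than among direct followers — corresponds to the drop in engagement growing with this hop distance.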
On average, the team found that after notes were added, engagement dropped 46% for reposts, 44% for likes, 22% for replies and 14% for views. Over posts’ whole lifespans, including engagement before notes were attached, the drops were 12% for reposts, 13% for likes, 7% for replies and 6% for views.
“We think views were less affected because what users see is mostly decided by X’s feed algorithm,” Saveski said. “From the public release of the algorithm, we know that X does not explicitly deemphasize posts with notes attached, but that could change in the future.”
The study was also able to get granular data on what affected posts’ spread. Notes added to altered media, like fake photos and videos, affected those posts more than they did text-based posts. Notes on very popular posts led to greater reductions in engagement. And getting notes appended quickly was vital.
“Content spreads rapidly across X, and if a note comes too late, few users will get a chance to see it,” Slaughter said. “Notes that take 48 hours or so to go up have almost no effect.”
Saveski’s lab at UW is now developing potential tools to speed up how quickly notes can be attached to posts to increase their effectiveness.
The authors looked only at posts that had notes proposed in early 2023, and X has significantly updated its Community Notes system since then. X has also ended free access to its API, making further academic studies infeasible. The paper also examined only X, not other social media platforms.
“Whether this kind of moderation is sustainable as many separate systems across different platforms, as it’s now being used, is really an open question,” Saveski said. “If someone is adding notes on X, does that make them less likely to do so on TikTok or Instagram? There’s also the question of how much platforms should collaborate and share data, which could help this scale. X has made its code and data available, but none of the other platforms have committed to opening up their systems yet.”
Co-authors include Axel Peytavin of Stanford University and Johan Ugander of Yale University. This research was funded in part by a UW Information School Strategic Research Fund award and an Army Research Office Multidisciplinary University Research Initiative award.
For more information, contact Saveski at msaveski@uw.edu and Slaughter at is28@uw.edu.