Emotion-based nudges cut social media disinformation by over 40%
Digital nudges aimed at managing users' emotions can substantially decrease the spread of online disinformation fueled by intense anger, reveals a new peer-reviewed study posted to arXiv. The study, "Digital Nudges Using Emotion Regulation to Reduce Online Disinformation Sharing," provides the first empirical framework focused specifically on emotion regulation as a method of limiting emotionally charged misinformation on social media platforms.
Based on a pilot and large-scale experiment with 447 participants, the study assessed whether drawing users’ attention to the emotional content of disinformation at the moment of sharing could prompt deliberation and reduce impulsive dissemination. The researchers developed nine types of digital nudges, including distraction, perspective-taking, and empathy-based strategies, and compared them against conventional nudges modeled on existing social media platform features.
Results showed that all digital nudges significantly decreased the intent to share disinformation, with distraction nudges proving the most effective. These nudges combined emotional information with prompts encouraging users to imagine their followers smiling when they see their posts. The distraction nudge was especially successful in countering high emotional intensity, prompting users to reconsider sharing content that might cause distress.
The core mechanism behind these nudges is emotion regulation - a process by which individuals manage and modify emotional responses. The study leveraged visual tools, such as pie charts indicating emotional content like anger, sadness, or disgust, and paired them with textual messages prompting users to reflect on the social impact of their posts. Participants exposed to these nudges reported reduced emotional intensity and were more likely to cancel their intention to share disinformation.
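The nudge design described above pairs a visual breakdown of detected emotions (the pie chart) with a reflective textual prompt. A minimal sketch of that pairing, assuming hypothetical emotion scores, a made-up threshold, and invented prompt wording (none of these values or messages come from the paper):

```python
# Sketch of an emotion-regulation nudge: pair detected emotion
# proportions (shown as a pie chart in the study) with a reflective
# prompt. Emotion labels follow Plutchik's basic emotions; the
# scores, threshold, and message wording are illustrative assumptions.

def build_nudge(emotion_scores, threshold=0.5):
    """Return a nudge message if negative emotion dominates, else None."""
    negative = ("anger", "sadness", "disgust", "fear")
    neg_total = sum(emotion_scores.get(e, 0.0) for e in negative)
    if neg_total < threshold:
        return None  # content not emotionally charged enough to nudge

    # Textual stand-in for the pie chart: list emotions by share.
    breakdown = ", ".join(
        f"{label}: {share:.0%}"
        for label, share in sorted(emotion_scores.items(), key=lambda kv: -kv[1])
        if share > 0
    )
    return (f"This post expresses mostly negative emotion ({breakdown}). "
            "Imagine your followers smiling when they see your posts - "
            "do you still want to share it?")

print(build_nudge({"anger": 0.6, "disgust": 0.2, "joy": 0.2}))
print(build_nudge({"joy": 0.9, "trust": 0.1}))  # below threshold: no nudge
```

The threshold gate mirrors the study's focus on high-intensity negative content; a real system would replace the hand-set scores with model-detected ones.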
In the main experiment involving 400 participants representative of regular users of the social media platform X (formerly Twitter), distraction nudges led to a 41.5% cancellation rate among users who had initially intended to share disinformation. Perspective-taking nudges, which prompted users to consider the manipulative intent behind disinformation, achieved a 39.4% cancellation rate. Existing nudges based on friction alone (e.g., pause and prompt to comment) yielded a lower cancellation rate of 19.5%.
Disinformation stimuli in the study were modeled on real-world social media content, including emotionally charged posts around gender and generational conflict. Participants were asked to role-play or respond as themselves, and reported their sharing intentions, recognized emotions, and perceived authenticity of the content before and after being exposed to digital nudges. Emotional intensity and type were measured using Plutchik’s Wheel of Emotions.
Importantly, the study found that digital nudges had minimal impact on users' belief in the authenticity of the information - suggesting that reduced sharing was not due to increased skepticism, but rather due to emotional self-regulation. This indicates that countering disinformation may not require changing belief but rather redirecting behavior through emotion-aware interface designs.
The researchers also observed that distraction nudges could produce a positive emotional shift. Among users who initially recognized negative emotions such as anger, sadness, or disgust, a small percentage later reported positive feelings like joy or trust after exposure to the nudge. This effect was more pronounced for distraction nudges (1.5%) than perspective-taking (0.8%) or existing nudges (0.8%).
The study builds on dual-process theories of cognition, which distinguish between intuitive, emotion-driven responses and deliberate, reflective thinking. Strong anger, as the authors noted, tends to trigger intuitive reactions, lowering the likelihood of users verifying accuracy before sharing. Digital nudges that invoke emotion regulation serve to slow down this process and increase users’ deliberative engagement.
Unlike conventional interventions that rely on fact-checking or accuracy judgments, these digital nudges leverage emotion regulation to intervene preemptively. Because many disinformation posts blur truth and falsehood, or exploit righteous anger around social justice, the study argues that authenticity-based nudges may not suffice.
The researchers acknowledged limitations of the study, including its use of simulated sharing scenarios rather than real-time behavior and the need for sentiment analysis to automate emotion detection in real-world deployments. Still, the consistency of the findings points to a promising opportunity for implementing these strategies on live social media platforms.
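The deployment gap the authors flag is deciding automatically which posts warrant a nudge. As an illustration only, a tiny keyword lexicon can stand in for the trained sentiment-analysis model such a system would actually require; every word, weight, and threshold below is invented for the sketch:

```python
# Illustrative stand-in for automated emotion detection: a small
# keyword lexicon scores a post's emotional charge and decides
# whether a nudge should be shown. A real deployment would use a
# trained sentiment/emotion model; this lexicon is purely invented.

ANGER_LEXICON = {"outrageous": 2, "disgusting": 2, "furious": 2,
                 "unfair": 1, "shameful": 1}

def emotional_charge(text):
    """Sum lexicon weights for lexicon terms appearing in the post."""
    words = text.lower().split()
    return sum(weight for term, weight in ANGER_LEXICON.items() if term in words)

def should_nudge(text, threshold=2):
    """Trigger a nudge only for sufficiently charged posts."""
    return emotional_charge(text) >= threshold

print(should_nudge("This policy is outrageous and shameful"))  # charged post
print(should_nudge("New policy announced today"))              # neutral post
```

The point of the sketch is the gating step, not the detector: whatever model produces the score, the nudge fires only above an intensity threshold, matching the study's emphasis on high emotional intensity.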
Future research directions include testing digital nudges on emotionally provocative true content, integrating them with educational tools, and exploring their role in preventing belief formation based on emotional manipulation. The study also called for ethical evaluation mechanisms to ensure that nudges respect autonomy while guiding behavior.
- FIRST PUBLISHED IN: Devdiscourse

