Events

Webinar on October 15th, 2021

Where Opinions Come From: The Deficit Model vs. the Community of Knowledge

Steven Sloman (Brown University)

People have some crazy opinions. Generally, these are the opinions that we disagree with. The standard view in both academia and the wider culture is that people hold such opinions due to knowledge deficits; they are lacking information. On this view, providing information and critical reasoning skills is the best way to get opinions to converge, because they’ll converge to the truth. There is already strong reason to doubt this deficit model. I provide more, in the form of evidence that knowledge is unrelated to attitudes about issues. In contrast, a person’s ideology influences both their attitudes and their sense of understanding. A competitor to the deficit model, the cultural cognition view, explains the effect of ideology on attitudes but does not address the sense of understanding. I follow the cultural cognition view in proposing that people outsource much of their reasoning to their communities; I add that it is the resulting sense of understanding that mediates their attitudes. This community-of-knowledge view implies that people outsource most of their reasoning. I show how this fact can be deployed to bring evidence to bear on policy.

Webinar on April 1st, 2022

“I think this news is accurate”: Endorsing accuracy decreases the sharing of fake news and increases the sharing of real news (Password: CosmoProbe1April2022)

Valerio Capraro (Middlesex University London)

Fighting the spread of fake news on social media is a problem of major importance. Accuracy prompts - nudges that make the concept of accuracy salient - are receiving considerable attention, but they have one major limitation: they decrease overall sharing. This limits their practical applicability, because overall engagement is one of the main profit motives for social media companies. Here, we overcome this limitation. We report on seven survey experiments (five main, two supplementary) testing four novel accuracy prompts. Our main result is that an “endorsing accuracy” prompt (i.e., “I think this news is accurate”), placed on the sharing button, decreases intentions to share fake news, increases intentions to share real news, and keeps overall engagement constant. While Studies 1 and 2 collect only sharing intentions, Studies 3-5 place participants in a more ecologically valid context, where they can scroll through a list of news headlines, each of which can be shared, liked, or ignored. Studies 4-5 explore the mechanism through which the intervention works. Study 4 compares the accuracy endorsement prompt with an accuracy salience prompt, “Think if this news is accurate”, and finds that the previous results are specific to endorsing accuracy. Study 5 replicates the main results with headlines displaying no news source, thus ruling out the possibility that the endorsing accuracy intervention simply makes participants more likely to apply a “source heuristic”. These findings suggest that placing an accuracy endorsement on the sharing button can increase the quality of news sharing without affecting overall engagement.