Continued Influence Effect

Media & Technology

Ecker et al. (2011)

Can false information still influence our judgment after it has been corrected?

Background

The Continued Influence Effect (or Post-Retraction Reliance on Misinformation) is cause for concern in an era where the explosion in communication brought about by digital and social media is accompanied by the spread of “fake news”. As explained by Ecker et al. (2011), “information that is presumed to be true at encoding but later turns out to be false often continues to influence memory and reasoning (…). For example, if a fictional character is accused of a crime but later exonerated, people continue to use the outdated misinformation (that the person is guilty) in subsequent inferences, even if they recall the correction”. Two different theories can account for this puzzling cognitive phenomenon.

  • Mental Model Theory: According to Craik (1943), people organize information from their environment into mental models that enable them to make sense of the world, including explaining and predicting events. Like any model, such mental representations are imperfect, and thus need to be partially resistant to change, even in the face of contradictory evidence. In particular, people are “reluctant to dismiss key information in their model (e.g., what caused an event) when no plausible alternative exists to fill the void (…). When no alternative is presented, people prefer an inconsistent event model to an incomplete model. Hence, in their inferential reasoning they may rely on outdated information despite knowing that it is false, rather than acknowledging the lack of valid information available” (Ecker et al., 2011).
  • Dual Process Theory: According to Ecker et al. (2011), Dual Process Theory applies not only to judgment and decision-making (as proposed by Kahneman’s (2011) Dual System Theory), but also to memory. In its default mode, memory relies on an automatic retrieval process that is largely unconscious and both highly efficient and unreliable, as it is based only on the ease and strength with which particular pieces of information come to mind. As such, it is biased towards emotional, and especially negative, memories (because of the Negativity Bias). This tends to be the case with baseless accusations, smear campaigns, and other “fake news”, whereas their retraction usually appeals more to facts and reason. To be retrieved and to cancel out the “hot” bits, such “cold” memories require the intentional activation of a different, controlled retrieval process that carefully searches for and selects relevant information. Although much more reliable, this conscious system requires much more effort, and is thus only used when people have the opportunity (e.g., in terms of time) and the motivation to monitor their thinking.

Both theories help explain why people use misinformation in their reasoning even when they remember its retraction.

Participants & Procedures

The participants were 161 undergraduate students from the University of Western Australia (Perth, Australia).

Participants were asked to read 17 messages said to come from a variety of sources (firefighter communication, police report, public radio, etc.) and providing information about a warehouse fire. The fire was initially reported as having been caused by volatile materials negligently stored in a closet, but this false information was later retracted, as it was reported that the closet was actually empty. Depending on the condition, the messages read by the participants contained either 1 or 3 instances of the misinformation, and either 0, 1, or 3 instances of its retraction. A control group read a version of the messages that made no mention of volatile materials. After a 10-minute distraction task, subjects answered an open-ended questionnaire with 10 causal inference questions (such as “What could have caused the fire?”), 10 factual questions (such as “At what time was the fire eventually put out?”), and 2 manipulation-check questions targeting their awareness of the retraction (such as “Was any of the information in the story subsequently corrected or altered? If so, what was it?”). Their responses were scored by an independent grader who was blind to the conditions and who counted the number of references made to the misinformation.

Findings

The results supported the Continued Influence Effect (Post-Retraction Reliance on Misinformation). Unsurprisingly, an ANOVA revealed main effects of both the strength of the misinformation and the strength of its retraction: the more the misinformation was repeated, the greater its continued influence; the more the retraction was repeated, the smaller that influence. However, the continued influence of the misinformation never completely disappeared. Any instance of misinformation had a small but statistically significant and irreducible effect, as shown by the fact that its continued influence was reduced just as much by 1 retraction as by 3. Interestingly, awareness of the retraction (which ranged from 63% to 83% depending on the condition) had no mediating effect. The researchers added that the Continued Influence Effect might be even more prevalent in political and judicial settings, which involve personal opinions and attitudes, and thus a risk of Confirmation Bias. In these contexts, retractions might actually reinforce belief in the original misinformation, because of the Backfire Effect.
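To make the analysis concrete, here is a minimal sketch in Python of the kind of two-way ANOVA reported above. This is not the authors' code: the scores, group sizes, and effect sizes are all simulated placeholders for the 2 (misinformation strength) × 3 (retraction strength) design.

```python
# Minimal sketch of a two-way ANOVA on a 2 x 3 between-subjects design.
# All data below are simulated placeholders, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)

rows = []
for misinfo in (1, 3):            # instances of the misinformation
    for retraction in (0, 1, 3):  # instances of its retraction
        for _ in range(20):       # hypothetical group size
            # Invented effects: more misinformation raises the score,
            # any retraction lowers it, plus random noise.
            base = 2.0 + 0.5 * misinfo - 0.8 * min(retraction, 1)
            rows.append({
                "misinfo": misinfo,
                "retraction": retraction,
                "references": max(base + rng.normal(0, 1), 0),
            })
df = pd.DataFrame(rows)

# Main effects of each factor and their interaction.
model = ols("references ~ C(misinfo) * C(retraction)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```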

IA Tip

In line with IB guidelines, we recommend that students compare only two conditions in their experiment and obtain a single measurable result for each participant in each condition. Doing otherwise would complicate the inferential statistics without any benefit as far as the IA is concerned.
Here, students should simplify the original experiment and test only two conditions: x instances of the misinformation and x instances of its correction versus no misinformation. The operationalization of the dependent variable can also be simplified, e.g., by asking participants to rate their belief in different statements (including the one that was misstated and corrected in the experimental condition), as in the sketch below.
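As a concrete illustration of this simplified design, the following sketch compares the two conditions with an independent-samples t-test. The belief ratings are invented for illustration, and depending on their data, students might instead use a non-parametric test such as the Mann-Whitney U.

```python
# Minimal sketch: comparing belief ratings (e.g., on a 1-7 scale) for the
# corrected statement across two conditions. The data are hypothetical.
from scipy import stats

experimental = [5, 4, 6, 5, 3, 5, 4, 6, 5, 4]  # saw misinformation + correction
control = [2, 3, 2, 4, 3, 2, 3, 2, 3, 3]       # never saw the misinformation

# A higher mean belief in the experimental group, despite the correction,
# would be consistent with the Continued Influence Effect.
t, p = stats.ttest_ind(experimental, control)
print(f"t = {t:.2f}, p = {p:.4f}")
```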

Citation

Ecker, U. K. H., Lewandowsky, S., Swire, B., & Chang, D. (2011). Correcting false information in memory: Manipulating the strength of misinformation encoding and its retraction. Psychonomic Bulletin & Review, 18, 570–578.

https://doi.org/10.3758/s13423-011-0065-1