1 Introduction
Despite the clear scientific consensus on Genetically Modified (GM) food safety and on the usefulness of vaccination, lay people's skepticism remains high [Gaskell et al., 1999; MacDonald, 2015; Scott, Inbar and Rozin, 2016; Yaqub et al., 2014]. The large discrepancy between the state of agreement in the scientific community and what the general population thinks has been referred to as the "consensus gap" [Cook et al., 2018]. This consensus gap is puzzling because public trust in science is high and has remained stable since the 1970s [Funk, 2017]. But people are selective about their trust in the scientific community: Americans trust scientists less on GM food safety and vaccination than on non-controversial topics [Funk, 2017]. Americans also largely underestimate the scientific consensus, as well as scientists' understanding of Genetically Modified Organisms (GMOs) and vaccination [Funk, 2017]. In France, where we conducted the two studies reported in this article, rejection of GM food is widespread [Bonny, 2003]: up to 84% of the population thinks that GM food is highly or moderately dangerous [Institut de Radioprotection et de Sûreté Nucléaire, 2017] and 79% of the public is worried that some GM food may be present in their diet [Institut d'études opinion et marketing en France et à l'international, 2012]. In the country of Louis Pasteur, public opinion on vaccination is also surprisingly negative. Even though 75% of the population is in favor of vaccination [Gautier, Jestin and Chemlal, 2017], only 59% think that vaccines are safe [Larson et al., 2016]. Our intervention at science festivals primarily aims to address this lack of trust.
Attempting to correct misconceptions by targeting people at science festivals may seem like an odd choice, as festival-goers are known to be more interested in science, more educated, and more deferential toward the scientific community [Jensen and Buckley, 2014; Kennedy, Jensen and Verbeke, 2018]. But these traits do not exempt lay science enthusiasts from holding false beliefs on scientific topics. For example, teachers who read the most about cognitive science and are the most interested in evidence-based education are more likely to spread neuromyths (misconceptions about how the brain is involved in learning) than less interested teachers [Dekker et al., 2012].
People coming to science festivals could be good targets for interventions on heated topics for at least two reasons. First, it should be easier to convince them with scientific arguments, since they are eager to learn and trust scientists. Second, their scientific motivation makes them good intermediaries who can further transmit the arguments of our intervention within their social networks, by chatting with their friends and family or by sharing the arguments on social media.
The role of peers in relaying media messages is well known in the study of public opinion [Katz and Lazarsfeld, 1955]. For example, efforts to convince staunchly anti-vaccine individuals through communication campaigns have largely failed [Dubé, Gagnon and MacDonald, 2015; Sadaf et al., 2013]. These failures could be due to the lack of trust that anti-vaccine individuals place in the medical establishment [Salmon et al., 2005; Yaqub et al., 2014]. As a result, people coming to science festivals, who are likely more trusted by their peers than the mass media are, may be in a good position to convince vaccine-hesitant individuals [at least fence-sitters; see Leask, 2011], provided they are able to muster convincing arguments [Altay and Mercier, 2018]. Thus, by providing science lovers with facts about GM food and vaccination, we could strengthen their argumentative arsenal and indirectly use their social networks to spread scientific information.
In a nutshell, the intervention consisted of small discussion groups in which an experimenter explained the hierarchy of proofs (from rumors to meta-analyses and scientific consensus), highlighted the scientific consensus on vaccine benefits and GM food safety, and answered the public's questions on these topics. The design of the intervention was based on two core ideas: (i) that, as suggested by the Gateway Belief Model, highlighting the scientific consensus can change people's minds, and (ii) that providing information in a dialogic context where arguments can be freely exchanged offers fertile ground for belief revision.
1.1 Gateway belief model
According to the Gateway Belief Model in science communication, highlighting the scientific consensus can improve people's opinions on scientific topics and increase public support for them [Ding et al., 2011; Dunwoody and Kohl, 2017; Kohl et al., 2016; Lewandowsky, Gignac and Vaughan, 2013; van der Linden, Leiserowitz, Feinberg et al., 2015; van der Linden, Leiserowitz and Maibach, 2017]. The idea behind the model is simple: emphasizing the degree of agreement between scientists on a given topic influences the public's perception of the consensus, which in turn changes people's beliefs on the topic and finally motivates public action.
The Gateway Belief Model has been successfully applied to vaccination: being exposed to the consensus on vaccination leads to more positive beliefs about vaccination [Clarke, McKeever et al., 2015; Dixon and Clarke, 2013; van der Linden, Clarke and Maibach, 2015]. Yet applications of the model to GM food have yielded mixed results. Two studies found that exposure to the scientific consensus had no effect on beliefs about GM food safety [Dixon, 2016; Landrum, Hallman and Jamieson, 2018], while one reported a significant effect [Kerr and Wilson, 2018]. These results could reflect a lack of trust: acceptance of biotechnology correlates positively with deference to scientific authority [Brossard and Nisbet, 2007], and high trust in the government, GM organizations and GM regulations, together with positive attitudes towards science and technology, is associated with favorable opinions towards GM applications [Hanssen et al., 2018]. But laypeople do not place much trust in GM food scientists [Funk, 2017], and up to 58% of the French population thinks that public authorities cannot be trusted to make good decisions on GM food [Ifop and Libération, 2000]. This lack of trust is the biggest limitation of the Gateway Belief Model: it can only work if people are deferential to scientific authority in the first place [Brossard and Nisbet, 2007; Chinn, Lane and Hart, 2018; Clarke, Dixon et al., 2015; Dixon, McKeever et al., 2015].
Some have questioned the validity of the Gateway Belief Model [Kahan, 2017; Kahan, Peters et al., 2012] and warned that exposure to the scientific consensus may backfire among those who see the consensus as calling their core values into question, pushing them away from the consensus and increasing attitude polarization [see the "Cultural Cognition Thesis"; Kahan, Jenkins-Smith and Braman, 2011].
Despite the uncertainties surrounding the Gateway Belief Model, we chose to rely on this model because people coming to science festivals should be particularly receptive to the scientific consensus, as they typically trust the scientific community.
1.2 Argumentation
The second core feature of our intervention is its interactive format: participants were free to give their opinion at any time, to interrupt us, and to discuss with each other. We repeatedly asked for participants' opinions in order to engage them in the discussion as much as possible. This format, often used in science festivals and educational workshops, could enable participants to make the most of their reasoning abilities.
Reasoning works best when used in a dialogical context, in small groups of individuals holding conflicting opinions [Mercier and Sperber, 2011; Mercier and Sperber, 2017]. Numerous studies have shown that real-life argumentation provides fertile ground for belief revision [for reviews see Mercier, 2016; Mercier and Landemore, 2012; for an application to vaccination see Chanel et al., 2011]. But there is no consensus on the positive role that argumentation could play on heated topics. It has even been suggested that counter-argumentation on heated topics could backfire, leading to attitude polarization [Ecker and Ang, 2019; Kahan, 2013; Nyhan and Reifler, 2010]. For example, providing people with written information about the low risks and the benefits of GM technology has been found to increase opinion polarization [Frewer, Howard and Shepherd, 1998; Frewer, Scholderer and Bredahl, 2003; Scholderer and Frewer, 2003]. Still, on the whole, backfire effects remain the exception: as a rule, when people are presented with reliable information that challenges their opinion, they move in the direction of this information, not away from it [Guess and Coppock, 2018; Wood and Porter, 2019].
1.3 The present contribution
Although science festivals are popular and represent a great opportunity for the scientific community to share its knowledge with the public, evaluations of the impact of interventions conducted during science festivals are rare [Bultitude, 2014]. Evidence suggests that interacting with scientists and engineers at science festivals positively affects the audience's experience of the event [Boyette and Ramsey, 2019]. And a recent study showed that discussing gene editing in humans during a science festival increased participants' understanding of the topic, as well as the perceived moral acceptability of the technology [Rose et al., 2017]. Our study aims to extend these results to GM food and vaccination.
In the two studies reported here, we held 10- to 30-minute discussions with small groups of volunteers at two science festivals. During these discussions, a group leader (one of the authors) explained the hierarchy of proofs (from rumors to meta-analyses and scientific consensus), backed the scientific consensus on vaccine benefits and GM food safety with scientific reports and studies, and answered the public's questions. The discussions started on a non-controversial topic, the Earth's sphericity, and ended when the three topics (the other two being vaccines and GM food) had been discussed and all participants' questions had been answered. In Study 1, we measured participants' opinions on the Earth's sphericity, the benefits of vaccination, and the health effects of GM food before and after the intervention. Participants answered on Likert scales and used an anonymous voting system with a ballot box. Study 2 is a replication of Study 1 with additional measures, including participants' trust in the scientific community and participants' degree of confidence in their responses. Data, materials, questionnaires, and pictures of the intervention's setting can be found here: https://osf.io/9gbst/ .
Since our experimental design does not allow us to isolate the causal factors that contributed to changing people's minds, we will not speculate on the respective roles that exposure to the scientific consensus (Gateway Belief Model) and argumentation might have played. But from our data we will be able to infer: (i) whether participants changed their minds, and (ii) whether, on the contrary, cases of backfire were common. Based on the literature reviewed above, we predict that our intervention will change people's minds in the direction of the scientific consensus (H1) and that cases of backfire will be rare (H2).
2 Study 1
The first study was designed as a proof of concept to measure whether our intervention would change people's minds on two topics that are heated in France: GM food and vaccination. We hypothesized that our intervention would change people's minds in the direction of the scientific consensus (H1).
2.1 Participants
In October 2018, at Strasbourg University, as part of a French science festival, we held discussions with 103 participants who volunteered (without compensation) to take part in the workshop "The wall of fake news: what is a scientific proof?". When coming to our workshop, volunteers did not know that they were going to discuss vaccination and GM food. Everyone was welcome and no one was excluded from the workshop. The median age bracket was 15 to 18 years old, as many high school students came to the workshop. The youngest participant was 13 years old, while the oldest was 80. We excluded children under 13 because they attended the workshop with their parents or teacher, and thus did not respond independently.
2.2 Design and procedure
Before and after our intervention, we asked participants to answer questions (in French) about the Earth's sphericity, the benefits of vaccination, and GM food safety on seven-point Likert scales. The first question was: "What do you think of the Earth's sphericity?". The scale ranged from "I am absolutely certain that the Earth is FLAT" (1) to "I am absolutely certain that the Earth is SPHERICAL" (7). The second question was: "What do you think of vaccines?". The scale ranged from "I am absolutely certain that vaccines are DANGEROUS for human health" (1) to "I am absolutely certain that vaccines are BENEFICIAL for human health" (7). The third question was: "What do you think about GM (Genetically Modified) food?". The scale ranged from "I am absolutely certain that GM food is DANGEROUS for health" (1) to "I am absolutely certain that GM food is HARMLESS for health" (7). After answering the questions and selecting their age bracket, participants were asked to put the piece of paper in a ballot box anonymously.
Discussions took place in groups of one to six volunteers and lasted between 10 and 30 minutes. Two group leaders (the authors) led the discussions. Each group leader was in charge of one group, so the maximum number of parallel groups was two. We, as group leaders, started the discussions by asking participants what they thought about the Earth's sphericity. All participants believed the Earth to be spherical because of the abundant scientific evidence. To challenge their belief, we handed them a book entitled "200 Proofs Earth is Not a Spinning Ball". Even though participants were unable to debunk the numerous arguments presented in the book, they maintained their initial position because of the stronger scientific evidence. This allowed us to bring to their attention the origin of their belief in the Earth's sphericity: trust in science. At this early stage we also explained to them what scientific evidence is and introduced the notion of scientific consensus (with the help of the pyramid of proof document that can be found in appendix B). After this short introduction on the Earth's sphericity, accompanied by some notions of epistemology, we began the discussion on vaccination and GM food, arguing that there are few reasons to distrust scientists on these topics.
We asked for participants' opinions on each topic, made them guess the amount of evidence gathered by scientists, informed them of the scientific consensus, and answered their questions. The majority of the discussion time was devoted to GM food, as participants had little knowledge of the topic and asked many questions. A session ended when the three topics had been discussed and all of the participants' questions had been answered. We used the brief report of the Committee to Review Adverse Effects of Vaccines [2012] and the brief report of the National Academies of Sciences & Medicine [2016] to present the scientific consensus on vaccine benefits and GM food safety. We emphasized the fact that genetic engineering is first and foremost a technology [Blancke, Grunewald and De Jaeger, 2017; Landrum and Hallman, 2017]. As ecology was a recurrent topic of interest, we also argued that genetic engineering could contribute to sustainable agriculture: in the fight against global warming, it is an ally rather than an enemy [Ronald, 2011].
Participants were provided with scientific studies and with misinformation coming from blogs, journal articles, books or tweets (the list of materials used during the intervention can be found in appendix A). We also read some scientific studies with participants, debunked the misinformation articles, and highlighted the discrepancy between the scientific facts and the way GM food and vaccines are sometimes portrayed in the news media. The materials were used to support our arguments and answer participants' questions; therefore, not all participants were exposed to the same material. But all participants were presented with the two reports of the National Academies of Sciences & Medicine on GM food safety and vaccine benefits, were familiarized with the pyramid of proof, had to guess how much evidence is available today on GM food safety and vaccine benefits, and were told that there is a scientific consensus on these topics.
We presented ourselves as non-experts who place their trust in the scientific community because of its rigorous epistemic norms. Participants were asked not to take our word for it, but to check the facts online. We urged them to use Google Scholar or Wikipedia, given the latter's accessibility and reliability [J. Giles, 2005].
3 Results and discussion
All statistical analyses in this paper were conducted in R [v.3.6.0 R core team, 2017 ], using R Studio [v.1.1.419 RStudio team, 2015 ].
Since pre- and post-intervention responses were anonymous and could not be matched, we used a non-parametric test (a permutation test implemented in the "lmPerm" package [Wheeler and Torchiano, 2016]) to compare pre- and post-intervention ratings. The permutation test generated the possible pairings between the pre- and post-intervention ratings of our data set and re-computed the test statistic for the rearranged variables [for a detailed explanation see D. Giles, 2019].
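For readers who wish to reproduce this type of analysis, the snippet below is a minimal sketch of an unpaired permutation ANOVA with lmPerm, under the assumption of a long-format data frame with illustrative column names; it is not the authors' actual script (the original data and analyses are available on the OSF page).

```r
# Minimal sketch of the permutation test described above, assuming a long-format
# data frame with one row per anonymous ballot. Column names ('rating', 'time')
# are illustrative placeholders, not the original variable names.
library(lmPerm)

set.seed(42)
ratings <- data.frame(
  rating = sample(1:7, 206, replace = TRUE),           # placeholder Likert scores
  time   = factor(rep(c("pre", "post"), each = 103))   # unmatched pre/post groups
)

# aovp() permutes the ratings across the two groups and recomputes the F statistic;
# perm = "Prob" keeps sampling permutations until the p-value estimate stabilizes,
# which is why the number of iterations differs across the three topics.
fit <- aovp(rating ~ time, data = ratings, perm = "Prob")
summary(fit)
```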
Our intervention had no significant effect on the Earth's sphericity ratings, F(1, 204) = 0.70, p = 0.49 (number of iterations = 103), as before our intervention participants already believed the Earth to be spherical (before: M = 6.83, SD = 0.53; after: M = 6.94, SD = 0.27). We found a small effect of our intervention on opinions about vaccination, F(1, 204) = 12.64, p = .02 (number of iterations = 4609), with participants rating vaccines as more beneficial and less harmful after our intervention (M = 6.13, SD = 1.34) than before (M = 5.61, SD = 1.53). Our intervention had a very strong effect on opinions about GM food, F(1, 204) = 155.54, p < .001 (number of iterations = 5000), with participants rating GM food as less harmful to human health after our intervention (M = 5.29, SD = 1.74) than before (M = 3.55, SD = 1.80).
Our intervention shifted participants’ opinions in the direction of the scientific consensus, offering support for our first hypothesis.
4 Study 2
Study 2 is a replication of Study 1 with additional measures, including participants’ trust in the scientific community and participants’ degree of confidence in their responses. Participants were also assigned a participant number, allowing us to compare each participant’s pre- and post-intervention responses, and thus measure the magnitude of the backfire effect. Based on the literature reviewed in the Introduction, we hypothesized that cases of backfire would be rare (H ).
4.1 Participants
In May 2019, at the Cité des Sciences et de l'Industrie in Paris, as part of the Forum of Cognitive Science, we held discussions with 72 participants (mean age = 26.06, SD = 10.19; three participants failed to provide their age) who volunteered, without compensation, to take part in our workshop. Again, everyone was welcome, and no one was excluded from the discussion groups.
4.2 Design and procedure
First, participants wrote their age and their participant number on the questionnaire. Second, we measured participants' trust in the scientific community with the following question: "To what extent do you trust the scientific community?". The scale ranged from "0%" (1) to "100%" (6); each point of the scale was associated with a percentage (the second point corresponded to "20%", the third to "40%", etc.). Third, we asked participants to answer three questions about the Earth's sphericity, vaccine benefits and GM food safety, together with their degree of confidence, on six-point Likert scales, before and after the intervention. The scales were shifted from seven to six points to prevent participants from ticking the middle point of the scale to express uncertainty; participants could instead express their uncertainty via the confidence scales.
The first question was: "What do you think of the Earth's sphericity?". The scale ranged from "The Earth is FLAT" (1) to "The Earth is SPHERICAL" (6). The second question was: "What do you think of vaccines?". The scale ranged from "In general, vaccines are DANGEROUS for human health" (1) to "In general, vaccines are BENEFICIAL for human health" (6). Contrary to Study 1, we specified "in general" because of complaints expressed by some participants in Study 1. The third question was: "What do you think about the impact of Genetically Modified (GM) food on human health?". The scale ranged from "GM food is DANGEROUS for health" (1) to "GM food is HARMLESS for health" (6). Each question was accompanied by a second question assessing participants' confidence: "How confident are you in your answer?". Participants answered on a six-point Likert scale ranging from "I am 0% sure" (1) to "I am 100% sure" (6); each point of the scale was associated with a percentage (the second point corresponded to "20%", the third to "40%", etc.). The rest of the design and procedure were the same as in Study 1, except that during the afternoon one of the group leaders from Study 1 was replaced by another group leader whom we had trained in the morning. We used the exact same materials and followed the same procedure as in Study 1.
5 Results
5.1 Main results
Since our experimental design allowed us to match pre- and post-intervention ratings, we conducted a one-way repeated-measures analysis of variance (ANOVA) to compare the pre- and post-intervention ratings on each topic. The intervention had no significant effect on the Earth's sphericity ratings (p = 0.10), as before our intervention participants already believed the Earth to be spherical (before: M = 5.71, SD = 0.52; after: M = 5.76, SD = 0.43). The intervention had a medium effect on opinions about vaccination, F(1, 71) = 8.63, p < .01, ηp² = 0.11, with participants rating vaccines as more beneficial and less harmful after the intervention (M = 5.31, SD = 0.85) than before (M = 5.10, SD = 0.98). The intervention had a very strong effect on opinions about GM food, F(1, 71) = 58.97, p < .001, ηp² = 0.45, with participants rating GM food as less harmful to human health after the intervention (M = 4.63, SD = 1.35) than before (M = 3.26, SD = 1.54).
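As a rough illustration of this analysis (not the authors' actual script), a repeated-measures ANOVA on matched pre/post ratings can be run in base R as sketched below; variable names are illustrative placeholders.

```r
# Minimal sketch of a one-way repeated-measures ANOVA on matched ratings,
# assuming a long-format data frame. Names ('id', 'time', 'rating') are
# illustrative; the authors' data and scripts are on the OSF repository.
set.seed(42)
long <- data.frame(
  id     = factor(rep(1:72, times = 2)),
  time   = factor(rep(c("pre", "post"), each = 72), levels = c("pre", "post")),
  rating = sample(1:6, 144, replace = TRUE)   # placeholder Likert scores
)

# Error(id/time) declares participants as the within-subject blocking factor
fit <- aov(rating ~ time + Error(id/time), data = long)
summary(fit)

# With only two time points, this F test is equivalent to a paired t-test (F = t^2)
pre  <- long$rating[long$time == "pre"]
post <- long$rating[long$time == "post"]
t.test(post, pre, paired = TRUE)
```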
5.2 Did the intervention increase participants' trust in science?
A one-way repeated-measures ANOVA revealed that our intervention had no effect on participants' trust in science (p = 0.10; mean before = 4.91, mean after = 4.99, corresponding to "80%" on the scale). Initial trust in the scientific community also had no effect on participants' propensity to change their minds on the Earth's sphericity (p = 0.06), vaccine benefits (p = 0.90), or GM food safety (p = 0.91).
5.3 What is the effect of confidence on attitude change?
In the analysis below, the three participants who failed to provide their age were excluded (N = 69, M = 26.13, SD = 10.64). A linear regression was conducted to evaluate the effect of participants' initial confidence on the extent to which they changed their minds (measured as the difference between pre- and post-intervention ratings). We found that initial confidence had no effect on participants' propensity to change their minds on the Earth's sphericity (p = 0.96), vaccine benefits (p = 0.10), or GM food safety (p = 0.81).
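The regression described above could be sketched as follows, again with illustrative variable names (shown here for the GM food topic only) rather than the authors' actual ones.

```r
# Minimal sketch of the regression of attitude change on initial confidence,
# for one topic (GM food). Column names are illustrative placeholders.
set.seed(42)
d <- data.frame(
  pre_gm        = sample(1:6, 69, replace = TRUE),   # pre-intervention rating
  post_gm       = sample(1:6, 69, replace = TRUE),   # post-intervention rating
  confidence_gm = sample(1:6, 69, replace = TRUE)    # initial confidence (1 = 0%, 6 = 100%)
)
d$change_gm <- d$post_gm - d$pre_gm                  # attitude change score

summary(lm(change_gm ~ confidence_gm, data = d))     # does initial confidence predict change?
```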
5.4 How common were backfire cases?
After our intervention, out of 72 participants, six changed their minds (in the direction of the scientific consensus or not) on the Earth's sphericity, 19 on vaccination, and 49 on GM food. Cases of backfire (i.e., changes in the direction opposite to the scientific consensus) were rare: one for the Earth's sphericity, five for vaccination, and three for GM food.
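Counting such cases amounts to classifying each participant's change score by its sign, as in the short sketch below (placeholder vectors, not the actual data).

```r
# Minimal sketch: classify each matched participant's change score by its sign.
# 'pre' and 'post' are placeholder vectors; higher values correspond to the
# consensus end of the scales used here.
set.seed(42)
pre    <- sample(1:6, 72, replace = TRUE)
post   <- sample(1:6, 72, replace = TRUE)
change <- post - pre

sum(change != 0)   # participants who changed their minds in either direction
sum(change > 0)    # changes toward the scientific consensus
sum(change < 0)    # backfire cases: changes away from the consensus
```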
6 Discussion
We successfully replicated the results of our first intervention, suggesting that the effect is robust to the different phrasing of the questions, and providing further evidence in favor of the positive influence of discussing heated topics at science festivals (H1). We also found support for the hypothesis that cases of backfire are rare (H2).
6.1 Internal meta-analysis
We ran a fixed-effect meta-analysis model, implemented in the 'metafor' R package [Viechtbauer, 2010], to compare the results of Study 1 and Study 2. This statistical test allowed us to calculate the overall effect of the intervention by averaging the effect sizes of Study 1 and Study 2. The test weighted each study according to its precision, i.e. effect sizes with smaller standard errors were given more weight [for a detailed explanation see Harrer et al., 2019].
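A minimal sketch of such a pooled analysis with metafor is given below; the per-study effect sizes and standard errors are placeholders, not the Study 1 and Study 2 estimates (those can be recomputed from the data on the OSF page).

```r
# Minimal sketch of a fixed-effect meta-analysis with 'metafor'. The per-study
# effect sizes (yi) and standard errors (sei) below are placeholders, not the
# actual Study 1 and Study 2 estimates.
library(metafor)

yi  <- c(0.30, 0.36)   # standardized effect of the intervention in each study
sei <- c(0.10, 0.09)   # corresponding standard errors

res <- rma(yi = yi, sei = sei, method = "FE")   # "FE" = fixed-effect model
summary(res)   # pooled estimate, z value, p value and confidence interval
forest(res)    # forest plot of the two studies and the pooled effect
```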
Across the two studies, after the intervention participants considered vaccines to be more beneficial (0.33 ± 0.07, z = 5.44, p < .001, CI [0.21, 0.45]) and GM food to be less dangerous (0.75 ± 0.06, z = 12.33, p < .001, CI [0.63, 0.87]) than before the intervention. For a visual representation of the results, see Figure 1.
7 General discussion
The present studies show that it is possible to change people’s minds at science festivals, even on heated topics, and in relatively little time. Moreover, the risks of backfire effects seem very limited, suggesting that counter-argumentation on heated topics is probably safer than expected [Ecker and Ang, 2019 ; Kahan, 2013 ; Nyhan and Reifler, 2010 ] and that the worries of the Cultural Cognition Thesis may be overblown [Kahan, 2017 ; Kahan, Peters et al., 2012 ].
Overall, the high trust that participants had in science did not exempt them from holding false beliefs about vaccination and GM food (e.g. GMOs were often confused with pesticides). The mere fact of explaining what GMOs are and how they are used in agriculture and medicine helped reduce fears. Most participants were very surprised by the scientific consensus and by the number of studies published on the subject. But they also spontaneously produced counterarguments to challenge the consensus, pointing out, for example, the existence of conflicts of interest. These common counterarguments were easily addressed in the course of the discussion. But this spontaneous generation of counterarguments could hinder the effectiveness of the Gateway Belief Model, since the consensus is typically conveyed in a one-way message format, in which participants' counterarguments are left unanswered, potentially leading to rejection of the consensus or even to the well-known backfire effect [see Altay, Schwartz et al., 2020].
The Deficit Model of Communication [Sturgis and Allum, 2004 ], which assumes that conflicting attitudes toward science are a product of lay people’s ignorance, may be relevant for opinions on GM food since most participants lacked information on the subject — as polls and studies on GM food understanding have already shown [Fernbach et al., 2019 ; McFadden and Lusk, 2016 ; McPhetres et al., 2019 ]. Participants, deprived of strong arguments to defend their stance, nonetheless had the intuition that GMOs were harmful, suggesting that some cognitive obstacles might prevent GMOs’ acceptance [Blancke, Van Breusegem et al., 2015 ; Swiney, Bates and Coley, 2018 ]. But as these initial intuitions relied on weak arguments (such as “GMOs are not natural”), they were easy to dispel through argumentation.
Not all participants were equally sensitive to our arguments. The Cultural Cognition Thesis [Kahan, Jenkins-Smith and Braman, 2011] may help explain some of the inter-subject variability. For example, the participants who reacted most negatively to the consensus on GM food safety were environmental activists. In France, the green party is well known for its strong stance against GMOs, so the consensus may have been perceived as a significant threat to environmentalists' core values.
In the case of vaccination and the Earth's sphericity, highly positive prior attitudes can account for the small and medium effect sizes observed, as the room for improvement was extremely small (particularly for the Earth's sphericity, where the ceiling effect was obvious). No participant challenged the consensus on the Earth's sphericity, and all of them were aware of it before the intervention. Similarly, most participants knew about the scientific consensus on vaccines and agreed that, overall, vaccines are beneficial. But many participants had concerns about particular vaccines, such as those against papillomavirus, hepatitis B, and the flu, corroborating studies showing that vaccine refusal is mainly targeted at specific vaccines and not at vaccination in general [Ward, 2016].
Lastly, as we found that most participants did not know what a scientific consensus is, providing laypeople with some basic notions of epistemology before applying the Gateway Belief Model could be an easy way to increase their deference to scientific consensus.
7.1 Limitations
Since our experimental design does not allow us to isolate the causal factors that contributed to attitude change (knowledge of the scientific consensus, argumentation, or simply being provided with information), these factors should be investigated in future studies by adding control groups in which participants are not exposed to the scientific consensus, receive arguments in a non-interactive context, or are not taught basic epistemology.
It would also be relevant to vary the context of the intervention, as evidence suggests that scientists' interventions on genetic engineering in classrooms can increase students' knowledge of the topic [Weitkamp and Arnold, 2016]. Furthermore, the long-lasting effects of the intervention should be investigated by measuring attitudes weeks, or even months, after the intervention, as McPhetres and colleagues did in a learning experiment on GM food [McPhetres et al., 2019].
Finally, our participants' increased knowledge about the scientific consensus on GM food and vaccination could have motivated them to discuss it with their peers. It has been shown that the more people know about the scientific consensus on global warming, the more likely they are to discuss it with their peers, leading to a "proclimate social feedback loop" [Goldberg et al., 2019, p. 1; see also Sloane and Wiles, 2020]. Even though the present study did not measure participants' sharing behaviors after the experiment, we strongly encourage future research to do so, as it is an important, yet neglected, dimension of science communication.
8 Conclusion
The two studies reported in this article show that during science festivals people can change their minds on heated topics if scientists take the time to discuss with them. The results were particularly striking for GM food since most participants with negative opinions on GM food left the workshop thinking that it was harmless to human health. The replication of our intervention indicates that the effect is robust, and that cases of backfire are rare. People coming to science festivals are probably more inclined to accept scientific arguments, and yet we show that not all of them have been exposed to scientific evidence on heated topics such as GM food. This population is a good target for science communication policies, as it is possible to leverage their trust and interest in science to spread scientific arguments outside the scope of the festivals through interpersonal communication [Goldberg et al., 2019 ]. Our results should encourage scientists to engage more often with the public during science festivals, even on heated topics [see also Schmid and Betsch, 2019 ].
Funding. This research was supported by the grant EUR FrontCog ANR-17-EURE-0017 and ANR-10-IDEX-0001-02 PSL. The first author’s PhD thesis is funded by the Direction Générale de L’armement (DGA).
Conflict of interest. The authors declare that they have no conflict of interest.
Acknowledgments
We would like to thank Hugo Mercier and Camille Williams for their valuable feedback and numerous corrections on previous versions of the manuscript. We are also grateful to all the participants with whom we had great discussions and who took the time to fill in our boring questionnaires. We thank the organizers of the science festivals, Cognivence, Strasbourg University and Vanessa Flament, without whom nothing would have been possible. We also thank Joffrey Fuhrer, who ran the workshop with us for one afternoon. Lastly, we are grateful to the two anonymous referees for their valuable feedback.
Appendix A Materials
Appendix B Pyramid of the hierarchy of proof
References
-
Altay, S. and Mercier, H. (2018). ‘Framing messages for vaccination supporters’. PsyArXiv . https://doi.org/10.31234/osf.io/ubp5s .
-
Altay, S., Schwartz, M., Hacquin, A., Blancke, S. and Mercier, H. (2020). ‘Scaling up interactive argumentation by providing counterarguments with a chatbot’. In revision.
-
Blancke, S., Grunewald, W. and De Jaeger, G. (2017). ‘De-problematizing ‘GMOs’: suggestions for communicating about genetic engineering’. Trends in Biotechnology 35 (3), pp. 185–186. https://doi.org/10.1016/j.tibtech.2016.12.004 .
-
Blancke, S., Van Breusegem, F., De Jaeger, G., Braeckman, J. and Van Montagu, M. (2015). ‘Fatal attraction: the intuitive appeal of GMO opposition’. Trends in Plant Science 20 (7), pp. 414–418. https://doi.org/10.1016/j.tplants.2015.03.011 .
-
Bonny, S. (2003). ‘Why are most Europeans opposed to GMOs? Factors explaining rejection in France and Europe’. Electronic Journal of Biotechnology 6 (1). https://doi.org/10.2225/vol6-issue1-fulltext-4 .
-
Boyette, T. and Ramsey, J. (2019). ‘Does the messenger matter? Studying the impacts of scientists and engineers interacting with public audiences at science festival events’. JCOM 18 (02), A02. https://doi.org/10.22323/2.18020202 .
-
Brossard, D. and Nisbet, M. C. (2007). ‘Deference to Scientific Authority Among a Low Information Public: Understanding U.S. Opinion on Agricultural Biotechnology’. International Journal of Public Opinion Research 19 (1), pp. 24–52. https://doi.org/10.1093/ijpor/edl003 .
-
Bultitude, K. (2014). ‘Science festivals: do they succeed in reaching beyond the ‘already engaged’?’ JCOM 13 (04), C01. URL: http://jcom.sissa.it/archive/13/04/JCOM_1304_2014_C01 .
-
Chanel, O., Luchini, S., Massoni, S. and Vergnaud, J.-C. (2011). ‘Impact of information on intentions to vaccinate in a potential epidemic: Swine-origin Influenza A (H1N1)’. Social Science & Medicine 72 (2), pp. 142–148. https://doi.org/10.1016/j.socscimed.2010.11.018 .
-
Chinn, S., Lane, D. S. and Hart, P. S. (2018). ‘In consensus we trust? Persuasive effects of scientific consensus communication’. Public Understanding of Science 27 (7), pp. 807–823. https://doi.org/10.1177/0963662518791094 .
-
Clarke, C. E., Dixon, G. N., Holton, A. and McKeever, B. W. (2015). ‘Including “evidentiary balance” in news media coverage of vaccine risk’. Health Communication 30 (5), pp. 461–472. https://doi.org/10.1080/10410236.2013.867006 .
-
Clarke, C. E., McKeever, B. W., Holton, A. and Dixon, G. N. (2015). ‘The influence of weight-of-evidence messages on (vaccine) attitudes: a sequential mediation model’. Journal of Health Communication 20 (11), pp. 1302–1309. https://doi.org/10.1080/10810730.2015.1023959 .
-
Committee to Review Adverse Effects of Vaccines (2012). Adverse effects of vaccines: evidence and causality. Washington, DC, U.S.A.: National Academies Press. https://doi.org/10.17226/13164 .
-
Cook, J., van der Linden, S., Maibach, E. and Lewandowsky, S. (2018). The consensus handbook. Why the scientific consensus on climate change is important. https://doi.org/10.13021/G8MM6P .
-
Dekker, S., Lee, N. C., Howard-Jones, P. and Jolles, J. (2012). ‘Neuromyths in education: prevalence and predictors of misconceptions among teachers’. Frontiers in Psychology 3. https://doi.org/10.3389/fpsyg.2012.00429 .
-
Ding, D., Maibach, E. W., Zhao, X., Roser-Renouf, C. and Leiserowitz, A. (2011). ‘Support for climate policy and societal action are linked to perceptions about scientific agreement’. Nature Climate Change 1 (9), pp. 462–466. https://doi.org/10.1038/nclimate1295 .
-
Dixon, G. (2016). ‘Applying the Gateway Belief Model to Genetically Modified Food Perceptions: New Insights and Additional Questions’. Journal of Communication 66 (6), pp. 888–908. https://doi.org/10.1111/jcom.12260 .
-
Dixon, G. N. and Clarke, C. E. (2013). ‘Heightening Uncertainty Around Certain Science: Media coverage, false balance, and the autism vaccine controversy’. Science Communication 35 (3), pp. 358–382. https://doi.org/10.1177/1075547012458290 .
-
Dixon, G. N., McKeever, B. W., Holton, A. E., Clarke, C. and Eosco, G. (2015). ‘The power of a picture: overcoming scientific misinformation by communicating weight-of-evidence information with visual exemplars’. Journal of Communication 65 (4), pp. 639–659. https://doi.org/10.1111/jcom.12159 .
-
Dubé, E., Gagnon, D. and MacDonald, N. E. (2015). ‘Strategies intended to address vaccine hesitancy: review of published reviews’. Vaccine 33 (34), pp. 4191–4203. https://doi.org/10.1016/j.vaccine.2015.04.041 .
-
Dunwoody, S. and Kohl, P. A. (2017). ‘Using weight-of-experts messaging to communicate accurately about contested science’. Science Communication 39 (3), pp. 338–357. https://doi.org/10.1177/1075547017707765 .
-
Ecker, U. K. H. and Ang, L. C. (2019). ‘Political attitudes and the processing of misinformation corrections’. Political Psychology 40 (2), pp. 241–260. https://doi.org/10.1111/pops.12494 .
-
Fernbach, P. M., Light, N., Scott, S. E., Inbar, Y. and Rozin, P. (2019). ‘Extreme opponents of genetically modified foods know the least but think they know the most’. Nature Human Behaviour 3 (3), pp. 251–256. https://doi.org/10.1038/s41562-018-0520-3 .
-
Frewer, L. J., Howard, C. and Shepherd, R. (1998). ‘The influence of initial attitudes on responses to communication about genetic engineering in food production’. Agriculture and Human Values 15 (1), pp. 15–30. https://doi.org/10.1023/A:1007465730039 .
-
Frewer, L. J., Scholderer, J. and Bredahl, L. (2003). ‘Communicating about the risks and benefits of genetically modified foods: the mediating role of trust’. Risk Analysis: An Official Publication of the Society for Risk Analysis 23 (6), pp. 1117–1133. PMID: 14641888 .
-
Funk, C. (2017). ‘Mixed messages about public trust in science’. Issues in Science and Technology 34 (1), pp. 86–88. URL: https://issues.org/real-numbers-mixed-messages-about-public-trust-in-science/ .
-
Gaskell, G., Bauer, M. W., Durant, J. and Allum, N. C. (1999). ‘Worlds apart? The reception of genetically modified foods in Europe and the U.S.’ Science 285 (5426), pp. 384–387. https://doi.org/10.1126/science.285.5426.384 .
-
Gautier, A., Jestin, C. and Chemlal, K. (2017). ‘Adhésion à la vaccination en France: résultats du Baromètre santé 2016’. [Acceptance of immunization in France: results from the 2016 health barometer]. Bulletin epidemiologique hebdomadaire . Base documentaire BDSP — Banque de données en santé publique, pp. 21–27. URL: https://www.santepubliquefrance.fr/determinants-de-sante/vaccination/documents/article/adhesion-a-la-vaccination-en-france-resultats-du-barometre-sante-2016 .
-
Giles, D. (3rd April 2019). What is a permutation test? URL: https://www.r-bloggers.com/what-is-a-permutation-test/ .
-
Giles, J. (2005). ‘Internet encyclopaedias go head to head’. Nature 438 (7070), pp. 900–901. https://doi.org/10.1038/438900a .
-
Goldberg, M. H., van der Linden, S., Maibach, E. and Leiserowitz, A. (2019). ‘Discussing global warming leads to greater acceptance of climate science’. Proceedings of the National Academy of Sciences 116 (30), pp. 14804–14805. https://doi.org/10.1073/pnas.1906589116 .
-
Guess, A. and Coppock, A. (2018). ‘Does counter-attitudinal information cause backlash? Results from three large survey experiments’. British Journal of Political Science , pp. 1–19. https://doi.org/10.1017/s0007123418000327 .
-
Hanssen, L., Dijkstra, A., Sleenhoff, S., Frewer, L. and Gutteling, J. M. (2018). ‘Revisiting public debate on Genetic Modification and Genetically Modified Organisms. Explanations for contemporary Dutch public attitudes’. JCOM 17 (04), A01. https://doi.org/10.22323/2.17040201 .
-
Harrer, M., Cuijpers, P., Furukawa, T. A. and Ebert, D. D. (2019). Doing meta-analysis in R: a hands-on guide. URL: https://bookdown.org/MathiasHarrer/Doing_Meta_Analysis_in_R/ .
-
Ifop and Libération (3rd August 2000). ‘Les Français et les risques alimentaires’. Libération .
-
Institut d’études opinion et marketing en France et à l’international (2012). Les Français et les OGM . URL: https://www.ifop.com/wp-content/uploads/2018/03/1989-1-study_file.pdf .
-
Institut de Radioprotection et de Sûreté Nucléaire (2017). Baromètre sur la perception des risques et de la sécurité par les Français. France. URL: http://barometre.irsn.fr/wp-content/uploads/2017/07/IRSN_barometre_2017.pdf .
-
Jensen, E. and Buckley, N. (2014). ‘Why people attend science festivals: interests, motivations and self-reported benefits of public engagement with research’. Public Understanding of Science 23 (5), pp. 557–573.
-
Kahan, D. M. (2013). ‘Ideology, motivated reasoning and cognitive reflection’. Judgment and Decision Making 8 (4), pp. 407–424.
-
Kahan, D. M. (2017). ‘The “Gateway Belief” illusion: reanalyzing the results of a scientific-consensus messaging study’. JCOM 16 (05), A03. URL: https://jcom.sissa.it/archive/16/05/JCOM_1605_2017_A03 .
-
Kahan, D. M., Jenkins-Smith, H. and Braman, D. (2011). ‘Cultural cognition of scientific consensus’. Journal of Risk Research 14 (2), pp. 147–174. https://doi.org/10.1080/13669877.2010.511246 .
-
Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D. and Mandel, G. (2012). ‘The polarizing impact of science literacy and numeracy on perceived climate change risks’. Nature Climate Change 2, pp. 732–735. https://doi.org/10.1038/nclimate1547 .
-
Katz, E. and Lazarsfeld, P. F. (1955). Personal influence: the part played by people in the flow of mass communications. Free Press.
-
Kennedy, E. B., Jensen, E. A. and Verbeke, M. (2018). ‘Preaching to the scientifically converted: evaluating inclusivity in science festival audiences’. International Journal of Science Education, Part B 8 (1), pp. 14–21. https://doi.org/10.1080/21548455.2017.1371356 .
-
Kerr, J. R. and Wilson, M. S. (2018). ‘Changes in perceived scientific consensus shift beliefs about climate change and GM food safety’. PLOS ONE 13 (7), e0200295. https://doi.org/10.1371/journal.pone.0200295 .
-
Kohl, P. A., Kim, S. Y., Peng, Y., Akin, H., Koh, E. J., Howell, A. and Dunwoody, S. (2016). ‘The influence of weight-of-evidence strategies on audience perceptions of (un)certainty when media cover contested science’. Public Understanding of Science 25 (8), pp. 976–991. https://doi.org/10.1177/0963662515615087 .
-
Landrum, A. R. and Hallman, W. K. (2017). ‘Engaging in effective science communication: a response to Blancke et al. on deproblematizing GMOs’. Trends in Biotechnology 35 (5), pp. 378–379. https://doi.org/10.1016/j.tibtech.2017.01.006 .
-
Landrum, A. R., Hallman, W. K. and Jamieson, K. H. (2018). ‘Examining the impact of expert voices: communicating the scientific consensus on genetically-modified organisms’. Environmental Communication 13 (1), pp. 51–70. https://doi.org/10.1080/17524032.2018.1502201 .
-
Larson, H. J., de Figueiredo, A., Xiahong, Z., Schulz, W. S., Verger, P., Johnston, I. G., Cook, A. R. and Jones, N. S. (2016). ‘The state of vaccine confidence 2016: global insights through a 67-country survey’. EBioMedicine 12, pp. 295–301. https://doi.org/10.1016/j.ebiom.2016.08.042 .
-
Leask, J. (2011). ‘Target the fence-sitters’. Nature 473 (7348), pp. 443–445. https://doi.org/10.1038/473443a .
-
Lewandowsky, S., Gignac, G. E. and Vaughan, S. (2013). ‘The pivotal role of perceived scientific consensus in acceptance of science’. Nature Climate Change 3 (4), pp. 399–404. https://doi.org/10.1038/nclimate1720 .
-
MacDonald, N. E. (2015). ‘Vaccine hesitancy: definition, scope and determinants’. Vaccine 33 (34), pp. 4161–4164. https://doi.org/10.1016/j.vaccine.2015.04.036 .
-
McFadden, B. R. and Lusk, J. L. (2016). ‘What consumers don’t know about genetically modified food and how that affects beliefs’. The FASEB Journal 30 (9), pp. 3091–3096. https://doi.org/10.1096/fj.201600598 .
-
McPhetres, J., Rutjens, B. T., Weinstein, N. and Brisson, J. A. (2019). ‘Modifying attitudes about modified foods: increased knowledge leads to more positive attitudes’. Journal of Environmental Psychology 64, pp. 21–29. https://doi.org/10.1016/j.jenvp.2019.04.012 .
-
Mercier, H. and Sperber, D. (2017). The enigma of reason. U.S.A.: Harvard University Press.
-
Mercier, H. (2016). ‘The argumentative theory: predictions and empirical evidence’. Trends in Cognitive Sciences 20 (9), pp. 689–700. https://doi.org/10.1016/j.tics.2016.07.001 .
-
Mercier, H. and Landemore, H. (2012). ‘Reasoning is for arguing: understanding the successes and failures of deliberation’. Political Psychology 33 (2), pp. 243–258. https://doi.org/10.1111/j.1467-9221.2012.00873.x .
-
Mercier, H. and Sperber, D. (2011). ‘Why do humans reason? Arguments for an argumentative theory’. Behavioral and Brain Sciences 34 (2), pp. 57–74. https://doi.org/10.1017/s0140525x10000968 .
-
National Academies of Sciences & Medicine (2016). Genetically engineered crops: experiences and prospects. Washington, DC, U.S.A.: National Academies Press. https://doi.org/10.17226/23395 .
-
Nyhan, B. and Reifler, J. (2010). ‘When corrections fail: the persistence of political misperceptions’. Political Behavior 32 (2), pp. 303–330. https://doi.org/10.1007/s11109-010-9112-2 .
-
R core team (2017). R: a language and environment for statistical computing . URL: https://www.R-project.org .
-
Ronald, P. (2011). ‘Plant genetics, sustainable agriculture and global food security’. Genetics 188 (1), pp. 11–20. https://doi.org/10.1534/genetics.111.128553 .
-
Rose, K. M., Korzekwa, K., Brossard, D., Scheufele, D. A. and Heisler, L. (2017). ‘Engaging the public at a science festival: findings from a panel on human gene editing’. Science Communication 39 (2), pp. 250–277. https://doi.org/10.1177/1075547017697981 .
-
RStudio team (2015). RStudio: integrated development for R. Boston, MA, U.S.A.: RStudio Inc.
-
Sadaf, A., Richards, J. L., Glanz, J., Salmon, D. A. and Omer, S. B. (2013). ‘A systematic review of interventions for reducing parental vaccine refusal and vaccine hesitancy’. Vaccine 31 (40), pp. 4293–4304. https://doi.org/10.1016/j.vaccine.2013.07.013 .
-
Salmon, D. A., Moulton, L. H., Omer, S. B., deHart, M. P., Stokley, S. and Halsey, N. A. (2005). ‘Factors associated with refusal of childhood vaccines among parents of school-aged children’. Archives of Pediatrics & Adolescent Medicine 159 (5), pp. 470–476. https://doi.org/10.1001/archpedi.159.5.470 .
-
Schmid, P. and Betsch, C. (2019). ‘Effective strategies for rebutting science denialism in public discussions’. Nature Human Behaviour 3 (9), pp. 931–939. https://doi.org/10.1038/s41562-019-0632-4 .
-
Scholderer, J. and Frewer, L. J. (2003). ‘The biotechnology communication paradox: experimental evidence and the need for a new strategy’. Journal of Consumer Policy 26 (2), pp. 125–157. https://doi.org/10.1023/a:1023695519981 .
-
Scott, S. E., Inbar, Y. and Rozin, P. (2016). ‘Evidence for Absolute Moral Opposition to Genetically Modified Food in the United States’. Perspectives on Psychological Science 11 (3), pp. 315–324. https://doi.org/10.1177/1745691615621275 .
-
Sloane, J. D. and Wiles, J. R. (2020). ‘Communicating the consensus on climate change to college biology majors: the importance of preaching to the choir’. Ecology and Evolution 10 (2), pp. 594–601. https://doi.org/10.1002/ece3.5960 .
-
Sturgis, P. and Allum, N. (2004). ‘Science in Society: Re-Evaluating the Deficit Model of Public Attitudes’. Public Understanding of Science 13 (1), pp. 55–74. https://doi.org/10.1177/0963662504042690 .
-
Swiney, L., Bates, D. G. and Coley, J. D. (2018). ‘Cognitive constraints shape public debate on the risks of synthetic biology’. Trends in Biotechnology 36 (12), pp. 1199–1201. https://doi.org/10.1016/j.tibtech.2018.09.002 .
-
van der Linden, S. L., Clarke, C. E. and Maibach, E. W. (2015). ‘Highlighting consensus among medical scientists increases public support for vaccines: evidence from a randomized experiment’. BMC Public Health 15 (1), p. 1207. https://doi.org/10.1186/s12889-015-2541-4 .
-
van der Linden, S. L., Leiserowitz, A. A., Feinberg, G. D. and Maibach, E. W. (2015). ‘The Scientific Consensus on Climate Change as a Gateway Belief: Experimental Evidence’. PLoS ONE 10 (2), e0118489. https://doi.org/10.1371/journal.pone.0118489 .
-
van der Linden, S. L., Leiserowitz, A. and Maibach, E. (2017). ‘Gateway illusion or cultural cognition confusion?’ JCOM 16 (05), A04. URL: https://jcom.sissa.it/archive/16/05/JCOM_1605_2017_A04 .
-
Viechtbauer, W. (2010). ‘Conducting meta-analyses in R with the metafor package’. Journal of Statistical Software 36 (3), pp. 1–48. https://doi.org/10.18637/jss.v036.i03 .
-
Ward, J. K. (2016). ‘Rethinking the antivaccine movement concept: a case study of public criticism of the swine flu vaccine’s safety in France’. Social Science & Medicine 159, pp. 48–57. https://doi.org/10.1016/j.socscimed.2016.05.003 .
-
Weitkamp, E. and Arnold, D. (2016). ‘A cross disciplinary embodiment: exploring the impacts of embedding science communication principles in a collaborative learning space’. In: Science and technology education and communication. Rotterdam, The Netherlands: SensePublishers, pp. 67–84. https://doi.org/10.1007/978-94-6300-738-2_5 .
-
Wheeler, B. and Torchiano, M. (2016). lmPerm: permutation tests for linear models . R package version 2.1.0. URL: https://CRAN.R-project.org/package=lmPerm .
-
Wood, T. and Porter, E. (2019). ‘The elusive backfire effect: mass attitudes’ steadfast factual adherence’. Political Behavior 41 (1), pp. 135–163. https://doi.org/10.1007/s11109-018-9443-y .
-
Yaqub, O., Castle-Clarke, S., Sevdalis, N. and Chataway, J. (2014). ‘Attitudes to vaccination: a critical review’. Social Science & Medicine 112, pp. 1–11. https://doi.org/10.1016/j.socscimed.2014.04.018 .
Authors
Sacha Altay is a PhD candidate in the department of cognitive science at PSL university. Fascinated by apparently irrational beliefs and behaviors, he is studying the cognitive mechanisms underlying information transmission and evaluation. His thesis focuses on the spread of false information and counter-argumentation. E-mail: sacha.altay@gmail.com .
Camille Lakhlifi holds a master's degree in cognitive science. She is interested in critical thinking and currently works on creating evidence-based policies to improve laypeople's understanding of science. E-mail: camille.lakhlifi@gmail.com .