1 Why study public (dis)trust in science in digital media environments?

Nowadays, large parts of the population obtain information about science, technology, and topics such as climate change or COVID-19 online; this includes journalistic online media, but increasingly also social and other internet-based media [e.g., European Commission, 2021; Guenther et al., 2022; National Science Board, 2018]. Digital media environments, especially social media, are characterized by a combination of interpersonal and mass-mediated communication; they provide heterogeneous content regarding actors, publics, and topics [e.g., Neuberger, 2014]. Content critical of science [e.g., Gierth & Bromme, 2019], disinformation [e.g., Scheufele & Krause, 2019], conspiracy narratives [e.g., Allgaier, 2019; Mahl et al., 2022; Plohl & Musil, 2021], and algorithm-curated information environments [e.g., Ziewitz, 2015] seem to be related to a so-called post-truth era [Keyes, 2004] and potentially negative consequences for public trust in science [e.g., Schäfer, 2016; Weingart & Guenther, 2016]. So far, however, empirical evidence of a decreased public trust in science is often lacking [Krause et al., 2019], and social media may also potentially benefit public trust in science by facilitating access to and exchange of scientific information [e.g., Taddicken & Krämer, 2021]. This special issue is dedicated to exploring public (dis)trust in science against the backdrop of changing information environments and potentially contrasting trends regarding the audience’s increasing use of digital media. The aim of this special issue is to reflect on and give a wide overview of the current academic and public discourses in order to advance research regarding theory and methods.

2 Editorial processes of this special issue

The contributions to this special issue were selected in a two-stage process: the selection of abstracts suitable for full-paper submissions and the selection of full papers via a double-blind review process. Based on our open call for papers, 37 abstracts were submitted, and the authors of 13 were invited to submit full papers. The aim was to achieve the most diverse and multi-perspective issue possible regarding the following selection criteria: a perspective on trust in science as well as on digital media environments, involved actors (i.e., scientists/researchers, publics, journalists, science communicators), the target group(s) for science communication, digital media environments/platforms, scientific issues and disciplines, and applied theoretical and methodological approaches. Furthermore, we selected different article types, including research papers and practice insights. Because we aim to maintain a diverse perspective as an international journal, we selected abstracts from Africa, Asia, Europe, North America, and South America for full-paper submission.

For the double-blind review process, we ensured that at least two referees gave feedback per manuscript. In selecting referees, we aimed for and achieved diversity regarding gender and cultural backgrounds.

3 Summary of papers

The special issue comprises nine publications, which approach the topic from four different perspectives: science communicators and their strategies, media content and online discourses, users' perceptions, and media effects.

3.1 Science communicators and their strategies

Yang et al. [2024] studied three identity strategies assumed to help achieve audience trust in three types of science communicators (scientists, citizens, and institutions). The authors applied a quantitative content analysis of answers provided to questions regarding climate change or astrophysics on the Chinese knowledge-sharing online platform Zhihu. Results revealed that communicators vary in how they use different identity strategies and that responses receive different numbers of likes (interpreted as trust effects by the authors) depending on the communicator as well as the science topic.

In the only practice insight in this special issue, Trollip et al. [2024] reported on community-driven misinformation about a national HIV survey (e.g., data collectors were falsely accused of criminal activity) and how translation and multimedia formats, such as video and audio, were used to enhance science communication efforts and (re)build trust in communities. The research team's response included multilingual, multimodal digital communication and community engagement. This insight demonstrates the effectiveness of a blended approach in restoring trust and dispelling misinformation in a country with a diverse social and linguistic setting: South Africa.

3.2 Media content and online discourses

Using (automated) content analysis, Lima et al. [2024] examined the Brazilian climate change discourse on Instagram, Facebook, and Twitter (2014–2022). Their findings on the dynamics of the discourse, scientific authority, and eco-emotions showed no significant increase in challenges to scientific authority or skepticism. However, they did reveal a subtle shift toward using uncertainty as a rhetorical tool to undermine trust in the scientific discourse.

Guenther et al. [2024] ran a linkage study that combined data from a German two-wave panel survey with a quantitative content analysis of (digital) media. They examined the effects of exposure to trust cues identified in various types of media on respondents' trust in science. For the total sample, exposure to trust cues had only limited effects, but varying effects emerged when groups of people with different degrees of trust in science were analyzed. While trust cue exposure in public TV and science blogs increased trust in science only among previously moderately trusting individuals, the use of populist media decreased trust only among people who were already distrustful of science.

3.3 Users’ perceptions

Zimmermann et al. [2024] argue that in a time of crisis and in digital contexts, people’s trust in science drives their media choices. Their findings from a cross-sectional survey in Germany emphasized that respondents who perceived scientists as trustworthy expected science to provide accurate knowledge and guide reasonable decision-making. Positive trust expectations then, in turn, also positively predicted the use of journalistic media as well as scientific online sources for scientific information. In contrast, respondents with low or no trust in science tended to use the messaging app Telegram or ‘alternative’ online media outlets.

The paper by Schäfer et al. [2024] took new developments in (generative) artificial intelligence (GenAI) into account. More specifically, the authors asked whether people trust GenAI as a source for science communication. Using survey data from the German Science Barometer in 2023, the authors reported that Germans are rather skeptical about GenAI and also do not strongly trust the technology to perform science communication. General trust in science predicts trust in GenAI.

Utilizing a cross-sectional survey in Germany, Schug et al. [2024] examined differences in the perceived trustworthiness and authenticity of scientists, depending on the scientists' research field. They also included the role of science-related media use. Scientists who study controversial scientific issues such as COVID-19 or climate change were perceived as significantly less trustworthy and authentic than scientists in general and those working in non-controversial fields. While traditional science-news media use was positively related to the perceived integrity and benevolence of scientists, digital science-media consumption was not connected to how trustworthy scientists were perceived, but it was related to more negative perceptions of scientists' authenticity.

The paper by Essary [2024] is based on focus group discussions with college students in the United States of America (U.S.). The paper discusses how the students evaluated science information during the COVID-19 pandemic and whether and how they perceived potential changes in their trust in science. Results showed that the interviewed young adults were informed about and trusted scientific methods, principles, and institutions. They reported being aware and cautious of online misinformation about science, but felt they had become more skeptical of science due to the pandemic. Trust in science was revealed to be highly politicized.

3.4 Media effects

By applying an online experiment in Germany, Egelhofer et al. [2024] showed that exposure to comments harassing a scientist on social media had a negative effect on the perceived trustworthiness of that specific scientist. This effect was stronger for male respondents and those holding science-related populist attitudes. No effect was detected regarding general trust in scientists. However, if a female scientist was attacked, this had a positive effect on general trust in scientists.

4 What can we learn from the collection of papers?

Taken together, the findings of this special issue’s papers provide answers to four overarching questions regarding public (dis)trust in science in digital media environments:

4.1 How much do people (dis)trust science, and how does (dis)trust in science develop?

The special issue includes five survey studies in the German context that found rather high levels of public trust in science or scientists [Egelhofer et al., 2024; Guenther et al., 2024; Schäfer et al., 2024; Schug et al., 2024; Zimmermann et al., 2024]. These trust assessments seem relatively stable; nevertheless, there are group- [Guenther et al., 2024] and issue-specific variations [Schäfer et al., 2024; Schug et al., 2024].

Given these issue-specific variations, it may not be surprising that Essary [2024], drawing on self-reports of young adults in the U.S., found hints of increased skepticism due to the COVID-19 pandemic despite generally positive views of science and scientific processes. Furthermore, aspects of generation and political ideology seem relevant. After analyzing the sentiments and narratives of the climate change online discourse in Brazil over time, Lima et al. [2024] concluded that President Bolsonaro's administration did not seem to have a significant effect on public trust in science, as no increase in denial and skepticism was found, but rather a subtle increase in uncertainty narratives. Hence, the level of public (dis)trust in science and its development over time can vary by culture, scientific issue, and different groups within the public.

4.2 How is public (dis)trust in science connected to other variables?

In the publications in this special issue, trust in science was tested for connections with a variety of selected variables. For instance, science-related populist attitudes seem to negatively predict the perceived trustworthiness of scientists who are harassed online [Egelhofer et al., 2024; see also Mede et al., 2021], whereas attitudes regarding the benefits of science and self-perceived knowledge seem to be positive predictors of trust in science [Schäfer et al., 2024]. Furthermore, a scientist's perceived authenticity was found to be correlated with their perceived trustworthiness [Schug et al., 2024]. In sum, when studying public (dis)trust in science, connections to a great variety of other variables and related constructs can and need to be explored. This special issue can only provide a few insights.

4.3 Which role does (digital) media play regarding public (dis)trust in science?

Digital media environments can create opportunities but also risks for public trust in science; this is clearly evident in the empirical results of the papers in this special issue. The practice insight by Trollip et al. [2024] shows that, on the one hand, online misinformation can be harmful to the success of a scientific study — an effect that some people may not perceive or want to reveal in self-reports but only attribute to others [see Essary, 2024]. On the other hand, public trust in science can be restored by commitment to multimodal and multilingual science communication initiatives [Trollip et al., 2024]. While the use of traditional science-related media is positively associated with scientists' perceived trustworthiness, Schug et al. [2024] found no relationship with online media use, which may be explained by the diversity of content and opinions found online. Regarding climate change, for example, the visibility of skepticism and denial is potentially higher [Lima et al., 2024; see also Walter et al., 2018]; thus, differentiating between platforms and types of content may reveal interesting results. Egelhofer et al. [2024] elaborate on the negative effects of exposure to online comments harassing a scientist on that scientist's perceived trustworthiness, but not on respondents' general trust in scientists. While journalistic media such as public TV content can positively predict respondents' trust in science [Guenther et al., 2024], and higher trust in scientists is connected with more frequent use of established journalistic and scientific online sources [Zimmermann et al., 2024], negative links were shown for the use of alternative, populist online media outlets. This effect is more pronounced when specific groups of respondents are investigated [Guenther et al., 2024]. Hence, (digital) media does play a role in public (dis)trust in science, but this role can vary by specific platform, content, and user group.

4.4 Which communication strategies are related to (dis)trust in science?

Since the results of the special issue's papers highlight that online harassment [Egelhofer et al., 2024], misinformation [Trollip et al., 2024], and populist media content [Guenther et al., 2024; Zimmermann et al., 2024] are negatively related to public trust in science, (science communication) researchers and practitioners need to be aware of possible backlash and be ready to react with counter-communication (initiatives) [Trollip et al., 2024]. Yang et al. [2024] suggest that communicating scientists should not only focus on revealing their expertise; referencing their moral qualities and their similarities to the general public when providing expert advice might be more effective. Hence, while strategic science communication gains importance, different strategies involving diverse digital platforms should be considered when studying public (dis)trust in science.

5 What needs to be considered in future research?

Based on the four overarching questions and their answers, five aspects seem particularly important to be considered in future research:

5.1 Different cultural contexts

Although this special issue aimed for a high level of diversity in the authors' and research studies' cultural backgrounds, in the end, we could not exploit the full potential. However, each paper strongly emphasized reflecting on its particular cultural context. We strongly encourage future research on the issue of public (dis)trust in science in digital media environments, and science communication research more generally, to keep cultural aspects in mind and to interpret results accordingly. For example, the U.S. seems to be a country where trust in science is a highly politicized topic [Essary, 2024; for trust in the government and scientific experts in different cultural settings during the COVID-19 pandemic see also Buturoiu et al., 2022; Weingart et al., 2022; Yokoyama & Ikkatai, 2022]. For incorporating cultural differences into communication strategies, the practice insight by Trollip et al. [2024] for the culturally diverse country of South Africa is a prime example.

5.2 Different scientific issues and disciplines

While Guenther et al. [2024] introduced a regression model for general trust in science, other papers cover a range of different scientific issues. This special issue touched on (dis)trust in science in times of crisis, more specifically the COVID-19 pandemic [Essary, 2024] as well as potential future pandemics [Zimmermann et al., 2024] and climate change [Lima et al., 2024; Schug et al., 2024; Yang et al., 2024]. Schug et al. [2024] even considered both issues as examples of controversial science issues compared to the uncontroversial disciplines of astrophysics [see also Reif et al., 2020] and history. Similarly, Yang et al. [2024] differentiated between the issue of climate change as controversial and astronomy as hard science. Furthermore, Trollip et al. [2024] focused on the HIV health crisis, while Schäfer et al. [2024] studied trust in GenAI. In general, the summarized results reveal issue-specific differences, suggesting that models and communication strategies need to be adapted to specific science topics and disciplines [see Yang et al., 2024]. Thus, we encourage future empirical research to consider and reflect on public (dis)trust in science in the context of different scientific issues and disciplines.

5.3 Diverse involved actors and groups

The papers and their results also highlight the importance of considering a diversity of actors involved in public (dis)trust in science. Not only did the authors of this special issue study different science communicators regarding gender differences [Egelhofer et al., 2024] and their communication strategies [Yang et al., 2024], but also diverse target groups of science communication and active citizens. For example, (dis)trust in science may vary across generations [Essary, 2024]. Cultural, linguistic, and socioeconomic differences [Trollip et al., 2024], as well as trust differences [Guenther et al., 2024] across population groups, may require tailored communication strategies. Here, different online platforms should be considered, as they are potential intermediaries of trust in science [Guenther et al., 2024; see also Reif & Guenther, 2021; Schäfer, 2016]. Therefore, we encourage future research to embrace and further explore the involvement of diverse actors and target groups when studying public (dis)trust in science.

5.4 Theoretical reflection and enhancement

Most of the papers conceptualized trust or trustworthiness as positive perceptions and expectations toward science, scientists, or scientific organizations [Egelhofer et al., 2024; Essary, 2024; Guenther et al., 2024; Schäfer et al., 2024; Schug et al., 2024; Zimmermann et al., 2024], with a focus on an epistemic understanding of public (dis)trust in science [see also Hendriks et al., 2015; Sperber et al., 2010]. Zimmermann et al. [2024] also considered guidance trust expectations based on which behavioral trust may be placed [see also Besley & Tiffany, 2023; Mayer et al., 1995]. Furthermore, Guenther et al. [2024] introduced the idea of trust cues identified in media content, and Yang et al. [2024] understood identity strategies as expressions of communication strategies to foster trust in science communicators. Thus, we encourage future research to reflect on different theoretical approaches to public (dis)trust in science, embed studies accordingly, and advance theoretical enhancement [see also Fage-Butler et al., 2022].

5.5 Innovation and diversification of trust methods

Finally, this special issue highlights the popularity of survey methods within the field, albeit with varying designs [cross-sectional: Schäfer et al., 2024; Schug et al., 2024; Zimmermann et al., 2024; panel: Guenther et al., 2024; experimental: Egelhofer et al., 2024]. While one paper analyzed qualitative focus group discussions [Essary, 2024], three other studies elaborated on trust-related aspects in media content [Guenther et al., 2024] and online discourses [Lima et al., 2024; Yang et al., 2024]. We would like to encourage future research to contribute to the innovation and diversification of the methodological base of the field, for example, through triangulation or mixed-methods studies [see Guenther et al., 2024].

Acknowledgments

We would like to express our sincere gratitude to all authors, referees, and members of the editorial board for their invaluable contributions and support throughout the development of this special issue.

References

Allgaier, J. (2019). Science and environmental communication on YouTube: strategically distorted communications in online videos on climate change and climate engineering. Frontiers in Communication, 4, 36. https://doi.org/10.3389/fcomm.2019.00036

Besley, J. C., & Tiffany, L. A. (2023). What are you assessing when you measure “trust” in scientists with a direct measure? Public Understanding of Science, 32, 709–726. https://doi.org/10.1177/09636625231161302

Buturoiu, R., Corbu, N., Oprea, D.-A., & Boțan, M. (2022). Trust in information sources during the COVID-19 pandemic. A Romanian case study. Communications, 47, 375–394. https://doi.org/10.1515/commun-2020-0052

Egelhofer, J. L., Seeger, C., & Binder, A. (2024). The effects of witnessing harassment of scientists on public perceptions of science. JCOM, 23, A08. https://doi.org/10.22323/2.23090208

Essary, C. (2024). “I think it gave me a little bit of mistrust”: exploring trust in COVID-19 science among college students. JCOM, 23, A07. https://doi.org/10.22323/2.23090207

European Commission. (2021). European citizens’ knowledge and attitudes towards science and technology [Special eurobarometer nr. 516]. https://doi.org/10.2775/071577

Fage-Butler, A., Ledderer, L., & Nielsen, K. H. (2022). Public trust and mistrust of climate science: a meta-narrative review. Public Understanding of Science, 31, 832–846. https://doi.org/10.1177/09636625221110028

Gierth, L., & Bromme, R. (2019). Attacking science on social media: how user comments affect perceived trustworthiness and credibility. Public Understanding of Science, 29, 230–247. https://doi.org/10.1177/0963662519889275

Guenther, L., Reif, A., Taddicken, M., & Weingart, P. (2022). Positive but not uncritical: perceptions of science and technology amongst South African online users. South African Journal of Science, 118. https://doi.org/10.17159/sajs.2022/11102

Guenther, L., Schröder, J. T., Reif, A., Brück, J., Taddicken, M., Weingart, P., & Jonas, E. (2024). Intermediaries in the limelight: how exposure to trust cues in content about science affects public trust in science. JCOM, 23, A03. https://doi.org/10.22323/2.23090203

Hendriks, F., Kienhues, D., & Bromme, R. (2015). Measuring laypeople’s trust in experts in a digital age: the Muenster Epistemic Trustworthiness Inventory (METI) (J. M. Wicherts, Ed.). PLOS ONE, 10, e0139309. https://doi.org/10.1371/journal.pone.0139309

Keyes, R. (2004). The post-truth era: dishonesty and deception in contemporary life. St. Martin’s Press.

Krause, N. M., Brossard, D., Scheufele, D. A., Xenos, M. A., & Franke, K. (2019). The polls—trends: Americans’ trust in science and scientists. Public Opinion Quarterly. https://doi.org/10.1093/poq/nfz041

Lima, R. O., Belém, A., Lycarião, D., Oliveira, T., Evangelista, S., Massarani, L., & Alves, M. (2024). (Un)certainty in science and climate change: a longitudinal analysis (2014–2022) of narratives about climate science on social media in Brazil (Instagram, Facebook and Twitter). JCOM, 23, A02. https://doi.org/10.22323/2.23090202

Mahl, D., Schäfer, M. S., & Zeng, J. (2022). Conspiracy theories in online environments: an interdisciplinary literature review and agenda for future research. New Media & Society, 25, 1781–1801. https://doi.org/10.1177/14614448221075759

Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. The Academy of Management Review, 20, 709. https://doi.org/10.2307/258792

Mede, N. G., Schäfer, M. S., & Füchslin, T. (2021). The SciPop scale for measuring science-related populist attitudes in surveys: development, test and validation. International Journal of Public Opinion Research, 33, 273–293. https://doi.org/10.1093/ijpor/edaa026

National Science Board. (2018). Science & engineering indicators 2018. National Science Foundation. https://www.nsf.gov/statistics/2018/nsb20181/assets/nsb20181.pdf

Neuberger, C. (2014). Konflikt, Konkurrenz und Kooperation: Interaktionsmodi in einer Theorie der dynamischen Netzwerköffentlichkeit. Medien & Kommunikationswissenschaft, 62, 567–587. https://doi.org/10.5771/1615-634x-2014-4-567

Plohl, N., & Musil, B. (2021). Modeling compliance with COVID-19 prevention guidelines: the critical role of trust in science. Psychology, Health & Medicine, 26, 1–12. https://doi.org/10.1080/13548506.2020.1772988

Reif, A., & Guenther, L. (2021). How representative surveys measure public (dis)trust in science: a systematisation and analysis of survey items and open-ended questions. Journal of Trust Research, 11, 94–118. https://doi.org/10.1080/21515581.2022.2075373

Reif, A., Kneisel, T., Schäfer, M., & Taddicken, M. (2020). Why are scientific experts perceived as trustworthy? Emotional assessment within TV and YouTube videos. Media and Communication, 8, 191–205. https://doi.org/10.17645/mac.v8i1.2536

Schäfer, M. S. (2016). Mediated trust in science: concept, measurement and perspectives for the ‘science of science communication’. JCOM, 15, C02. https://doi.org/10.22323/2.15050302

Schäfer, M. S., Kremer, B., Mede, N., & Fischer, L. (2024). Trust in science, trust in ChatGPT? How Germans think about generative AI as a source in science communication. JCOM, 23, A05. https://doi.org/10.22323/2.23090205

Scheufele, D. A., & Krause, N. M. (2019). Science audiences, misinformation and fake news. Proceedings of the National Academy of Sciences, 116, 7662–7669. https://doi.org/10.1073/pnas.1805871115

Schug, M., Bilandzic, H., & Kinnebrock, S. (2024). Public perceptions of trustworthiness and authenticity towards scientists in controversial scientific fields. JCOM, 23, A06. https://doi.org/10.22323/2.23090206

Sperber, D., Clément, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G., & Wilson, D. (2010). Epistemic vigilance. Mind & Language, 25, 359–393. https://doi.org/10.1111/j.1468-0017.2010.01394.x

Taddicken, M., & Krämer, N. (2021). Public online engagement with science information: on the road to a theoretical framework and a future research agenda. JCOM, 20, A05. https://doi.org/10.22323/2.20030205

Trollip, K., Gastrow, M., Ramlagan, S., & Shean, Y. (2024). Harnessing multimedia, multimodal and multilingual science communication to combat misinformation in a diverse country setting. JCOM, 23, N01. https://doi.org/10.22323/2.23090801

Walter, S., Brüggemann, M., & Engesser, S. (2018). Echo chambers of denial: explaining user comments on climate change. Environmental Communication, 12, 204–217. https://doi.org/10.1080/17524032.2017.1394893

Weingart, P., & Guenther, L. (2016). Science communication and the issue of trust. JCOM, 15, C01. https://doi.org/10.22323/2.15050301

Weingart, P., van Schalkwyk, F., & Guenther, L. (2022). Democratic and expert legitimacy: science, politics and the public during the COVID-19 pandemic. Science and Public Policy, 49, 499–517. https://doi.org/10.1093/scipol/scac003

Yang, Z., Huang, Y., Yang, T., & Yu, T. (2024). How different science communicators use identity strategies to gain public trust: a study on astronomy and climate change issues on a Chinese knowledge sharing platform. JCOM, 23, A01. https://doi.org/10.22323/2.23090201

Yokoyama, H. M., & Ikkatai, Y. (2022). Support and trust in the government and COVID-19 experts during the pandemic. Frontiers in Communication, 7, 940585. https://doi.org/10.3389/fcomm.2022.940585

Ziewitz, M. (2015). Governing algorithms: myth, mess, and methods. Science, Technology, & Human Values, 41, 3–16. https://doi.org/10.1177/0162243915608948

Zimmermann, F., Petersen, C., & Kohring, M. (2024). Who, if not science, can you trust to guide you through a crisis? The relationship between public trust in science and exposure to established and alternative online sources in times of crisis. JCOM, 23, A04. https://doi.org/10.22323/2.23090204

About the authors

Anne Reif (PhD, 2021, at Technische Universität Braunschweig, Germany) is a Senior Researcher at the University of Hamburg, Germany. Her main research interests are in the fields of science communication and digital communication. She focuses on public trust in science as well as public perceptions of climate change in connection to the use of online media.

E-mail: anne.reif@uni-hamburg.de X: @reif_anne

Lars Guenther (PhD, 2015, at Friedrich Schiller University Jena, Germany) is Professor of Communication Science at LMU Munich’s Department of Media and Communication in Germany, and Extraordinary Associate Professor at the Centre for Research on Evaluation, Science and Technology (CREST) at Stellenbosch University in South Africa. He is interested in public perceptions of (controversial) science, science and health journalism, trust in science, as well as public communication about risks and scientific (un)certainty.

E-mail: lars.guenther@ifkw.lmu.de

Hiromi M. Yokoyama is Professor at the Kavli Institute for the Physics and Mathematics of the Universe, the University of Tokyo. Her research interest is public perceptions of science. Trust in science is a long-standing interest of hers, and recently she has been researching AI ethics, especially sustainable AI and climate awareness.

E-mail: hiromi.yokoyama@ipmu.jp