This Special Issue of the Journal of Science Communication is dedicated to exploring public (dis)trust in science against the backdrop of changing information environments and the potentially contrasting trends that accompany audiences' increasing use of digital media.
Scientists are increasingly affected by harassment, especially on social media. While initial research highlights the detrimental consequences for the scientists affected, the increased visibility of harassment on social media might also negatively affect public perceptions of scientists. Using a preregistered 2 × 2 between-subjects experiment (N = 1,246), this study shows that exposure to uncivil comments harassing female or male scientists reduces citizens' trust in the attacked scientists, but not their trust in scientists in general or in scientific information. Furthermore, some of these effects are moderated by gender and by science-related populist attitudes.
Generative AI (GenAI) tools such as ChatGPT are expected to fundamentally affect many realms of life. This includes science communication, where GenAI tools are becoming important sources of science-related content for many people. This raises the question of whether people trust GenAI as a source in this field, a question that has not yet been answered sufficiently. Adapting a model developed by Roberts et al. [2013] and using survey data from the German Science Barometer 2023, we find that Germans are rather sceptical of GenAI in science communication and do not strongly trust it as a source. Structural equation modelling shows that respondents' trust in GenAI as a source in science communication is strongly driven by their general trust in science, which in turn is largely driven by their knowledge about science and their perception that science improves quality of life.