1 Context

In recent years, the presence and relevance of science communication have changed significantly. This was particularly visible during the Covid-19 pandemic. To “educate and actively communicate with the public” was found to be among the most important non-pharmaceutical measures to decrease the spread of Covid-19 [Haug et al., 2020]. Research on prior pandemics and outbreaks demonstrates that knowledge — what exactly to do and why it is important — is a key variable in predicting whether people will adopt protective measures [Webster et al., 2020]. Therefore, communication is key when coping with these kinds of crises.

However, science communication is not only relevant during pandemics. Science communication has recently increased in volume and has generally been highlighted as necessary for realizing the potential of science for effective decisions on both individual and societal levels. It is thus important for democracy and for keeping citizens informed [Fischhoff and Scheufele, 2013]. Consequently, the field of science communication is of growing importance, not least due to increasing efforts by science policy and science associations.

The underlying overall trend has been considerably intensified by the digitalization of media and communication. The advent and rise of the internet, and especially social media, has radically transformed the communicative landscape in general as well as specifically for science communication [Taddicken and Reif, 2016]. Nowadays, scientific issues are discussed in online environments by both experts and laypeople. What was once reserved for meetings of scientists — only connected to public discussion through press releases and reports from professional journalists — has now been brought to all individuals through internet platforms and applications. With this, communication practices have changed through the new and different functionalities of online platforms; from this follows an increased diversity of actors and a changing interaction potential between experts and laypeople, and thus between science and society. First, this paper aims to systematically describe the changes that the emergence of social media has brought about for science communication processes. Here, we specifically address the changing communication practices. Second, we discuss three different theoretical strands that need to be considered to explain the impact of the changing media environments in greater depth. Here, we reflect on a) technological affordances, b) the new knowledge order and its actors, and c) trust and rationality. Based on this, we provide an outlook on demands for future research and first conclusions for a theoretical framework.

2 Changing communication practices

Communication technologies have evolved and fundamentally transformed the communication environment and communicators’ aims and expectations. As Lupia and Sin [ 2003 , p. 316] put it: “It is worth remembering that as recently as the early 1990s, such actions [posting to global audiences] were impossible for all but a few world leaders, public figures and entertainment companies — and even for them only at select moments. Now millions take such abilities for granted”.

The internet allows laypeople who were once seen as a passive audience to use several forms of engagement. Taking different levels of user activity into account, we differentiate between consuming, participating and generating [Taddicken, 2012; Taddicken and Reif, 2016]: consuming refers to individuals who watch, read or view, but do not participate. Participating includes user-to-user interaction and user-to-content interaction (such as ranking the content, liking, sharing with others, and so on). This encompasses active behaviour but is largely characterized by one-click activities and does not include the production of one’s own content. Generating, by contrast, includes the creation and publication of one’s personal content in the form of text, images, audio, and video [Shao, 2009; Taddicken, 2012; Taddicken and Reif, 2016]. Writing comments can also be subsumed here, as people need to formulate their thoughts, beliefs, and feelings in their own words when commenting. In order to derive theoretical implications, we describe in the following what the three forms entail and which empirical insights have been found in the context of science communication.

2.1 Consuming

The internet has become a major source of information on scientific issues [National Science Board, 2018; Wissenschaft im Dialog, 2018]. In addition to the traditional mass media, a broad range of content can be found online through diverse platforms, including news sites, blogs, and social networking sites [O’Neill and Boykoff, 2011]. Moreover, information that goes beyond journalistic content is offered online, such as academic scientific information, governmental information, policy and NGO statements, and large amounts of user-generated content. Therefore, users can inform themselves about any scientific issue or method through many different sources with varying levels of expertise. Even incorrect information can easily be found and consumed online, whether it is spread by accident (misinformation) or consists of intentionally false messages that are often used for specific communication strategies, such as damaging the reputation of a person, social group, organization or country (disinformation) [cf. Egelhofer and Lecheler, 2019; Garrett, 2017; Scheufele and Krause, 2019; Taddicken and Wolff, 2020].

However, individuals can be uninformed and mis-/disinformed all at once. For example, they may be uninformed about how scientific processes work while being mis-/disinformed about the facts of a specific scientific issue — and these factors may influence each other [Scheufele and Krause, 2019 ]. Climate change [Allgaier, 2016 ] and vaccination [Donzelli et al., 2018 ] are just two examples of scientific issues upon which a great deal of mis-/disinformation can be found online.

It is important to highlight that consumption is not truly passive but involves active selection (see selective exposure theories in Knobloch-Westerwick [2015] as well as research on attention and its relevance for selection), as well as active interpretation and processing of the information [LaRose and Eastin, 2004; Ruggiero, 2000]. For example, Nauroth et al. [2014] showed that gamers, based on their social identity, devalue scientific findings on the effects of playing games when those findings contradict their own opinion. On a more positive note, research has found that users are generally competent in judging the credibility of science-related information on social media [Stadtler et al., 2017; Winter and Krämer, 2012].

2.2 Participating

It has long been recognized that science communication is hugely influenced by new forms of interaction enabled by online media [Brossard, 2013]. Specifically, social media platforms allow users to consume content and to interact with it, for example, by rating or tagging online content or sharing it with others. This is exceptionally popular among young adults [Hargittai, Füchslin and Schäfer, 2018]. We term this less elaborate form of contribution “participation”. Through these actions, every user can contribute to the dissemination of specific messages (see the concept of mass interpersonal persuasion by Fogg [2008]). For example, ‘likes’ have been said to lead to a bandwagon effect insofar as more people will trust a message if it has received many likes on social media [Sundar, Oeldorf-Hirsch and Xu, 2008]. Other research, however, has demonstrated that the impact of likes has probably been overstated [Winter, Brückner and Krämer, 2015].

2.3 Generating

Generating is an even more active form of engagement with social media content, in which recipients post their own thoughts, beliefs or knowledge. For example, blogs on science-related topics (e.g., HIV and potential protection measures) written from a personal perspective can be more influential than formal information sites [Neubaum and Krämer, 2015]. Similarly, YouTube influencers, such as the scientists Mai Thi Nguyen-Kim, Marius Angeschrien, or Doktor Whatson, can significantly impact people’s beliefs about science — with the resulting research questions only just beginning to be analyzed [Reif, Kneisel et al., 2020]. Moreover, less elaborate production such as commenting can also be influential and alter the attitudes that recipients might develop based on the original piece of information on a blog or in a written piece by a journalist [Gierth and Bromme, 2020; Winter, Brückner and Krämer, 2015; Winter and Krämer, 2016]. Regarding the motives for commenting, Nauroth et al. [2015] found that a threat to social identity can motivate science-discrediting comments online. However, both the effects of and the motives for commenting online on science issues should be further analyzed.

Therefore, these new forms of potential reciprocal interaction have already been shown to influence science communication. However, it is necessary to analyze these phenomena further. In the following, we will focus on three theoretical strands that help address and explain the mechanisms of these new interaction forms and communication opportunities. We first describe affordances, that is, how social media behaviour is shaped by technological structures. Then we reflect on the new knowledge order and its various actors, which increasingly include not only scientists and journalists but also the public. Finally, we discuss trust and rationality as important concepts for future theorizing.

3 Technological structures and affordances

To advance our understanding of online engagement with science, it is important to consider the technological structures underlying these different online communication behaviours. As technology determines the structure of communication, technological aspects influence how people act, such as liking, sharing, tagging, and the use of algorithm-based recommendations; how they communicate, such as using emoticons, linking to other sources or addressing specific users; and how they perceive content, such as layout aspects, the presence or absence of follower and like counts, and the possibility of interactivity [Greussing, 2020]. The multimodality of online environments as well as particular platform vernaculars play a crucial role [Pearce et al., 2020].

A prominent and important concept of online activity that considers the technology perspective is the concept of affordances. This concept was initially introduced by Gibson [ 1979 ] and refers to the inherent values and meanings of things in the environment, which can be directly perceived and linked to the action possibilities offered. ‘Everyday things’ can be designed such that the user can easily infer what they can afford [Norman, 2013 ]. This concept is increasingly used to refer to technology when thinking about online users and individual online activities. It is argued that online networks and platforms help to shape discussion networks and influence participation in different ways [Halpern and Gibbs, 2013 ]. With this, the relevance of architecture is emphasized.

This is how the concept of affordances comes into play when thinking about people’s online engagement with science-related issues. Boyd [2010] uses this concept in her approach to ‘networked publics’, which is relevant because she describes ‘networked publics’ as “simultaneously (1) the space constructed through networked technologies and (2) the imagined collective that emerges as a result of the intersection of people, technology, and practice” [Boyd, 2010, p. 39]. The technological structures possess certain affordances that affect how people interact with and behave in communication environments. Affordances “do not dictate participants’ behaviour, but they do configure the environment in a way that shapes participants’ engagement” [Boyd, 2010, p. 39]. Networked technologies have reorganized how information flows and how people interact with each other, and have thus distinguished ‘networked publics’ from traditional concepts of the public sphere. Boyd [2010] speaks of digital architectures as “structural forces” [p. 42]. The ‘networked publics’ are not only ‘publics networked together’, but are fundamentally transformed by the networked media, their properties and possibilities. Above all, Boyd emphasizes the persistence, reproducibility, scalability and searchability of content [Boyd, 2010, pp. 46–48]. These are not new in themselves, but are fundamental in a new way.

Affordances reshape public spheres, both directly and through the practices that people develop to deal with them, and this must be considered when thinking about people’s online engagement with science-related issues. Seen as structural forces, affordances give rise to dynamics that characterize the online environment and networked publics: invisible audiences (not all audiences are visible when a person is contributing online, nor are they necessarily co-present), collapsed contexts (the lack of spatial, social, and temporal boundaries makes it difficult to maintain distinct social contexts), and the blurring of public and private (without control over the context, public and private become meaningless binaries, are scaled in new ways, and are difficult to maintain as distinct) [Boyd, 2010, p. 52]. Although research on affordances and how these affect online users has recently become popular, analyses concerning scientific issues are still lacking.

4 The new knowledge order and its actors

Moreover, it is important to acknowledge that the modern media environment is characterized by the interrelation of traditional mass media, online platforms, and social media, meaning that so-called hybrid public spheres have emerged [Chadwick, 2013]. These conditions of the new media environments have radically transformed the public sphere and have initiated a fundamental change in the knowledge system. In their recently developed theoretical framework, Neuberger et al. [2019] call this ‘The Digital Transformation of the Knowledge Order’. This framework aims to systematically describe and explain the digital transformation of the genesis, examination, distribution and acquisition of knowledge. The authors refer to the basic concepts of truth, knowledge and rationality, and distinguish between phases, contexts, hierarchical levels and roles. Online environments dissolve the previous order of knowledge by collapsing contexts, levelling epistemic hierarchies, dissolving phase successions in the knowledge process, opening access to previously exclusive roles, and enabling the emergence of hybrid roles.

The model has been developed with a strong journalistic focus but offers great potential for reviewing the state of research and developing a research agenda for the science communication context in order to reflect on online engagement with science. It helps to differentiate between different individual engagement behaviours and their social embeddedness concerning functions and contributions to the public discourse. However, although Neuberger et al. [2019] have started to adapt their framework from the journalistic system to science, this has yet to be fully developed. In the following, we address the opening of access to previously exclusive roles and the levelling of epistemic hierarchies.

4.1 Opening access to exclusive roles: a new diversity of actors

Communication barriers to online arenas are low [Lörcher and Taddicken, 2017; Schmidt, 2013]. The transformed communication environment provides significant participatory potential for science communication [O’Neill and Boykoff, 2011]. On the one hand, this opens the exclusive role of communicating on scientific processes and findings to a plurality of new actors, with the potential of engaging the non-engaged. On the other hand, this gives a stage to ‘fake experts’, such as self-appointed experts, conspiracy theorists and other actors with persuasive strategies in mind. This is met with concern, as the absence of traditional gatekeepers can lead to the spread of information that has not been fact-checked and is incorrect.

With regard to the Covid-19 pandemic, the WHO highlighted the problem of the so-called “infodemic” [World Health Organization, 2020 ], meaning that people receive information about SARS-CoV-2 through various sources: via traditional journalistic media that translate scientific information for the general public based on professional routines and standards [Briggs and Hallin, 2010 ; Szczuka, Meinert and Krämer, 2020 ] — but also via alternative news media (in particular, social media and other online channels) that might add their own ideological spin and contribute to the spread of (potentially dangerous) ‘fake news’ or conspiracy theories [Boberg et al., 2020 ].

This is related to dramatic (re)assessments of professional competences in the public arena (see Bucchi [1996] for similar claims) and the dissolution of the traditional hierarchy of professional knowledge generation [Neuberger et al., 2019]. Online environments increase the blurring of boundaries between (scientific) experts and non-experts, calling science’s sovereignty of interpretation [Kienhues, Jucks and Bromme, 2020] into question.

4.2 Levelling of epistemic hierarchies: changing interaction potential

The blurring boundaries between (scientific) experts and non-experts are acknowledged within the changing communication environments. In this regard, members of what was once considered the passive audience now have various possibilities to engage with scientific issues: they can pose questions to scientists who provide information in podcasts or blogs, challenge their views, or participate in data collection or consolidation of research questions in online formats. Indeed, they may even have higher levels of expertise than the professional gatekeeper, the journalist. As an example, Huber [ 2014 ] outlined that journalists can learn from user comments about their online articles: interviewed journalists reported that users detect errors, add relevant information and links, and so on.

However, it is not just laypeople who are involved in online communication on science. Recently, and particularly during the pandemic, scientists also seem to have involved themselves in the online discourse — often entering a new field of professional activity and accessing an arena of direct communication with the public. In doing so, scientists tend to take on roles that were previously filled by journalists. For instance, they might run their own blog, use Twitter or provide science videos on a YouTube channel. Scientists’ online channels, such as Twitter or podcasts, have already been shown to influence the willingness to adhere to protective measures against the spread of Covid-19 [Szczuka, Meinert and Krämer, 2020].

Moreover, scientists may also engage publicly online by participating, for instance, by writing user comments [Huber, Wetzstein and Aichberger, 2019; Shanahan, 2010] and thereby correcting misinformation online [Vraga and Bode, 2017]. Thus, scientists can function as professional gatekeepers.

Overall, this direct communication from science to the public represents an important development of science communication [e.g., Nisbet and Scheufele, 2009 ].

5 Trust and rationality

In order to tailor the two previous theoretical strands more closely to science communication, the constructs of trust and rationality warrant increased attention, as they are closely related to science and scientific issues. Scientific information is highly complex and intrinsically uncertain [Popper, 2002], making it difficult for non-scientists/laypeople to make appropriate assessments [Maier and Taddicken, 2013]. This is especially true for so-called socioscientific issues, which are controversial, socially relevant, real-world problems that are informed by science and often include an ethical component [Sadler, Barab and Scott, 2007]. Over time, science has become increasingly professionalized, which has strengthened its boundaries with the public. Although functional roles have begun to transform (see above), science is still culturally distant for many individuals [Guenther, Weingart and Meyer, 2018]. Along with this functional differentiation of society, the relevance of trust becomes evident [Kohring, 2016].

Bromme and Gierth [2021] provide a profound analysis of why people’s trust in science and scientists plays a more important role than their understanding of science. They argue that the public understanding of science is bounded in the sense that, for large parts of the public, understanding is limited [Simon, 1955; Simon, 1979]. This has already been demonstrated in school studies, which revealed that scientific literacy varies greatly between individuals, cultures, and societies. However, these results refer to scientific knowledge taught at school, which is in principle understandable and certain, while many scientific topics related to everyday problems and discussed in the media are not. Therefore, further constraints on understanding result from the potentially unlimited depth of scientific explanations and the complexity of the methods and procedures involved. This is even more important when it comes to conflicting scientific knowledge claims, such as health-related information (as can currently be found regarding SARS-CoV-2). In this case, the public needs to decide which of the conflicting knowledge claims to accept as true. Bromme and Gierth [2021] argue that: “most members of the general public are not able to decide by virtue of their own understanding of the relevant topic which scientific knowledge claim should be adopted as a true belief (in the sense of being justified scientifically)” [w.p.].

In addition, even if a scientific issue is, in principle, comprehensible for the general public, this does not necessarily lead to a thorough rational processing of the available information. Numerous theories describe how, because human cognitive resources are bounded, people retreat to heuristic rather than analytical thinking — especially when competing tasks are present or when involvement and motivation are lacking [see Dual-process theory of system 1 and system 2, Evans, 2008; Kahneman and Frederick, 2002; Stanovich and West, 2000; Elaboration-likelihood model, Petty and Cacioppo, 1986; Heuristic-systematic model, Chaiken, 1980, and others] or under conditions of uncertainty [Kahneman and Frederick, 2002; Kahneman, Slovic and Tversky, 2018; Stanovich and West, 2002]. All these theories contrast fast, automatic, or unconscious processes of reasoning, judgement and decision-making with slow, effortful, and conscious ones [Evans, 2008]. With regard to the online context, it is well documented that heuristic processing becomes prevalent especially when interacting with social media information [Winter, 2020]. For example, social recommendations have been found to change the calculus and heuristics users apply in information selection decisions [Messing and Westwood, 2014; Metzger, Flanagin and Medders, 2010].

Given an inability and/or unwillingness to thoroughly process information, an important way to reach the public is through trust. In line with this, the problem shifts away from the question of the plausibility of the content (“What to believe?”) to the question of the trustworthiness of its respective sources (“Whom to believe?”) [Bromme and Kienhues, 2014]. Although this form of “epistemic trust” [Sperber et al., 2010] seems to be at odds with scientific reasoning and argumentation, it is the only option for those who cannot evaluate the evidence for themselves. Still, this does not need to be seen as a denial of rationality: the object of this reasoning is no longer science as a cognitive structure but rather science as a social system. Three dimensions of trust in scientists have been distinguished: expertise, integrity, and benevolence [Hendriks, Kienhues and Bromme, 2015]. However, the acceptance of scientific knowledge as a valid belief also becomes relevant; it is influenced by processes of motivated reasoning [Kunda, 1990] and is subject to confirmation bias. Nevertheless, Bromme and Gierth [2021] argue that citizens’ capability to make judgments of trust is less bounded than their capability to judge which of a set of competing claims are true.

Given this, the often suspected general and fundamental loss of trust in elites [“crisis of faith”, Garrett, 2017 ] is even more threatening. Not only are politicians members of the societal elite, but scientists are too. Therefore, this has to be considered when researching online engagement with scientific issues [Bauer, Allum and Miller, 2007 ; Gauchat, 2012 ; Kohring, 2016 ].

The general trust of individuals in science correlates with their attitudes toward science [Guenther and Joubert, 2017 ; Marques, Critchley and Walshe, 2015 ]. The increased specialization of science led to the need for intermediaries to pass on information to the public [Weingart, 2002 ; Weingart and Guenther, 2016 ]. Hence, trust in science is primarily shaped through media information [Anderson et al., 2012 ; Brewer and Ley, 2013 ; Chryst et al., 2018 ]. However, intermediaries and their functional roles have begun to fundamentally transform, and it remains unclear whether the new aspects of the media environment foster or hinder public trust in science [Huber, Barnidge et al., 2019 ].

6 Conclusion and first steps toward a theoretical framework and a future research agenda

Although research on science communication has increased significantly during the last few decades and a ‘science of science communication’ [Fischhoff and Scheufele, 2013] has been established, there are still substantial research gaps — particularly when it comes to questions regarding online and social media. In particular, the current situation of the novel coronavirus pandemic has highlighted the enormous relevance of online and direct science communication for informed, evidence-based decisions by politicians as well as citizens.

Despite the relevance of science and science communication in modern societies, the potential benefits and threats of social media have mostly been analyzed in the context of politics and political communication [Gil de Zúñiga, Huber and Strauß, 2018; Scheufele, 2013; Scheufele and Krause, 2019]. Only a small proportion of studies have addressed the dissemination of science and online discussion of science-related topics. However, these are no less important for democracy and informed citizens [Fischhoff and Scheufele, 2013]. For example, mis-/disinformation regarding science topics can hamper participation in climate protection, foster filter bubbles around vaccine sceptics that undermine health protection measures, or fuel hate speech towards scientists, hindering their ability to present their findings to the public [Scheufele and Krause, 2019]. Even though these issues are of clear societal relevance, little is known about how laypeople engage in online discourse on scientific issues [Brossard, 2013; Davies and Hara, 2017]. There is an urgent need to understand how citizens engage with science and how the way in which such scientific information is presented affects this engagement.

As discussed above, different levels of user activity (consuming, participating, generating) have to be distinguished for deeper insight. Moreover, these activities must be analyzed against the background of the wider context of societal transformation. For this, we took up the most prominent strands: technological affordances, the new knowledge order, and aspects of trust and rationality. In the following, we attempt to highlight challenges for future research against this background.

  1. On the consumption level, it needs to be highlighted that the internet, and in particular social media, allows for more intensive and diverse exposure to information. Online users may be exposed to and engage with a greater volume and a broader range of science news, for example, through incidental news exposure based on algorithmic recommendations or social network sharing, and this heightened exposure can foster trust in science [Huber, Wetzstein and Aichberger, 2019; Nisbet, Scheufele et al., 2002]. Moreover, online information is supplemented by social recommendations, such as ratings and comments, which affect content perception and credibility judgements [Winter and Krämer, 2014; Winter and Krämer, 2016] as well as information processing [Messing and Westwood, 2014; Metzger, Flanagin and Medders, 2010]. In addition, it has been shown that users prefer scientific information to be presented by scientists themselves rather than by journalists, because scientists are perceived as more trustworthy, more precise, and more objective [Huber, Wetzstein and Aichberger, 2019].

    Future research needs to further elucidate the challenges of finding and selecting valuable and valid information on science issues and how users cope with these challenges. Taking into account users’ motivations to consume science information (e.g., defence motivations were found to amplify the confirmation bias effect, Winter, Metzger and Flanagin [2016]) and the different ways (or systems) of processing, it is important to discuss how the technological affordances of new media environments affect a user’s media consumption behaviour.

    Moreover, it is important to widen the focus to the actors who provide the information, as a new and seemingly unmanageable variety of actors appears on stage. The relevant theoretical strand here therefore concerns the role of trust and rationality. Future theorizing and empirical analyses must acknowledge that most people will not be able to decide what to believe by virtue of the specific message itself, but only on the basis of a source’s perceived expertise. However, the decision on whom to trust is complicated by varying and collapsing contexts. Whereas an actor might be able to provide valuable scientific expertise in one context and regarding one issue, he or she might not know about another. A paediatrician might be able to provide relevant knowledge on the extent to which children might suffer from physical distancing during pandemic-related lockdowns but might not be an expert on the specificities of virus dispersion in children. Therefore, it is challenging to decide whom to trust. In order to support laypeople with these kinds of decisions in the future, it is important to better understand the mechanisms and conditions of granting trust.

  2. On the participation level: participation and dialogue are generally seen as an effective means of creating a relationship of trust between science and the public [Sturgis, 2014]. The new media environments offer unique potential for low-threshold participation opportunities for many different users [Brossard, 2013; Brossard and Scheufele, 2013; Lörcher and Neverla, 2015; Lörcher and Taddicken, 2017; Stilgoe, Lock and Wilsdon, 2014]. However, it is still unclear how this affects the trust relationship between science and the public [Reif and Guenther, n.d.]. This research gap is crucial, as Huber, Barnidge et al. [2019] conclude from their 20-country multilevel analysis that social media news use is more strongly related to trust in science than is traditional news use. However, the mechanisms and processes behind this finding are yet to be thoroughly investigated. This need is particularly relevant, as other research suggests that, given the greater variety of social media actors, more emotional postings circulate on social media, which are more prone to instil emotions than to increase fact-based knowledge [Stieglitz and Dang-Xuan, 2013].

    The theoretical considerations regarding social media affordances are also relevant for participation. Here, the degree to which the technological structures of platforms influence the way the public engages with science needs to be better understood. A better comprehension of the relevant mechanisms would also enable the design of more helpful platforms for science communication, for example, by choosing features that are particularly conducive to beneficial interactions between scientists, journalists and the public.

    Besides the role of affordances, the relevance of algorithms in selecting and providing information on science issues should be considered. Although previous research shows heterogeneous results on the impact of algorithms on individual and public opinion formation and on discourse processes, the way users encounter information as well as the role of personalisation are key concerns [Diakopoulos and Koliska, 2017] and should become a focus of public science communication.

  3. On the generating level, even less is known about how this behaviour — either of lay users or of scientists — affects trust and credibility judgements. Here, the new roles of scientists need to be considered: as scientists themselves increasingly engage in public online discourse by (participating and) generating, such as reacting to mis-/disinformation or commenting on journalistic content referencing their work, scholarly attention should increasingly be directed towards understanding the reciprocal and dialogue-oriented processes of science-related discussions. Here, it is important to understand more deeply why, how and with what impact people engage, both lay users and scientists. It is also important to elucidate how potential convergences at the level of communication about science issues affect the generating of scientific findings, and thus feed into the processes of generating evidence and its interpretations. The levelling of epistemic hierarchies challenges the functions and roles of scientists as well as of non-scientists.

This paper aimed to identify and reflect on relevant theoretical strands and thus help to inform theoretical frameworks and research agendas. We believe that it is important to consider the described transformative context changes, as these will likely (further) change science communication significantly in the future.

Acknowledgments

We thankfully acknowledge excellent support by (in alphabetical order) Helena Bilandzic, Rainer Bromme, Brigitte Huber, Dorothe Kienhues, Tobias Rothmund and Stephan Winter.

References

Allgaier, J. (2016). ‘Science on YouTube: what users find when they search for climate science and climate manipulation’. arXiv: 1602.02692 . URL: http://arxiv.org/pdf/1602.02692v2 .

Anderson, A. A., Scheufele, D. A., Brossard, D. and Corley, E. A. (2012). ‘The role of media and deference to scientific authority in cultivating trust in sources of information about emerging technologies’. International Journal of Public Opinion Research 24 (2), pp. 225–237. https://doi.org/10.1093/ijpor/edr032 .

Bauer, M. W., Allum, N. and Miller, S. (2007). ‘What can we learn from 25 years of PUS survey research? Liberating and expanding the agenda’. Public Understanding of Science 16 (1), pp. 79–95. https://doi.org/10.1177/0963662506071287 .

Boberg, S., Quandt, T., Schatto-Eckrodt, T. and Frischlich, L. (2020). ‘Pandemic populism: Facebook pages of alternative news media and the corona crisis — A computational content analysis’. arXiv: 2004.02566 . URL: http://arxiv.org/pdf/2004.02566v3 .

Boyd, D. (2010). ‘Social network sites as networked publics: affordances, dynamics, and implications’. In: A networked self: identity, community, and culture on social network sites. Ed. by Z. Papacharissi. New York, NY, U.S.A.: Routledge. URL: https://content.taylorfrancis.com/books/download?dac=C2009-0-18715-0&isbn=9781135966164&doi=10.4324/9780203876527-8&format=pdf .

Brewer, P. R. and Ley, B. L. (2013). ‘Whose science do you believe? Explaining trust in sources of scientific information about the environment’. Science Communication 35 (1), pp. 115–137. https://doi.org/10.1177/1075547012441691 .

Briggs, C. L. and Hallin, D. C. (2010). ‘Health reporting as political reporting: biocommunicability and the public sphere’. Journalism 11 (2), pp. 149–165. https://doi.org/10.1177/1464884909355732 .

Bromme, R. and Gierth, L. (2021). ‘Rationality and the public understanding of science’. In: The handbook of rationality. Ed. by M. Knauff and W. Spohn. In press. Cambridge, MA, U.S.A.: MIT Press.

Bromme, R. and Kienhues, D. (2014). ‘Wissenschaftsverständnis und Wissenschaftskommunikation’. In: Pädagogische Psychologie. Ed. by T. Seidel and A. Krapp. 6th ed. Weinheim, Germany: Beltz, pp. 55–81.

Brossard, D. (2013). ‘New media landscapes and the science information consumer’. Proceedings of the National Academy of Sciences 110 (Supplement 3), pp. 14096–14101. https://doi.org/10.1073/pnas.1212744110 .

Brossard, D. and Scheufele, D. A. (2013). ‘Science, new media and the public’. Science 339 (6115), pp. 40–41. https://doi.org/10.1126/science.1232329.

Bucchi, M. (1996). ‘When scientists turn to the public: alternative routes in science communication’. Public Understanding of Science 5 (4), pp. 375–394. https://doi.org/10.1088/0963-6625/5/4/005 .

Chadwick, A. (2013). The hybrid media system: politics and power. Oxford studies in digital politics. New York, NY, U.S.A.: Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199759477.001.0001 .

Chaiken, S. (1980). ‘Heuristic versus systematic information processing and the use of source versus message cues in persuasion’. Journal of Personality and Social Psychology 39 (5), pp. 752–766. https://doi.org/10.1037/0022-3514.39.5.752 .

Chryst, B., Marlon, J., van der Linden, S., Leiserowitz, A., Maibach, E. and Roser-Renouf, C. (2018). ‘Global warming’s “Six Americas Short Survey”: audience segmentation of climate change views using a four question instrument’. Environmental Communication 12 (8), pp. 1109–1122. https://doi.org/10.1080/17524032.2018.1508047 .

Davies, S. R. and Hara, N. (2017). ‘Public science in a wired world: how online media are shaping science communication’. Science Communication 39 (5), pp. 563–568. https://doi.org/10.1177/1075547017736892 .

Diakopoulos, N. and Koliska, M. (2017). ‘Algorithmic transparency in the news media’. Digital Journalism 5 (7), pp. 809–828. https://doi.org/10.1080/21670811.2016.1208053 .

Donzelli, G., Palomba, G., Federigi, I., Aquino, F., Cioni, L., Verani, M., Carducci, A. and Lopalco, P. (2018). ‘Misinformation on vaccination: a quantitative analysis of YouTube videos’. Human Vaccines & Immunotherapeutics 14 (7), pp. 1654–1659. https://doi.org/10.1080/21645515.2018.1454572 .

Egelhofer, J. L. and Lecheler, S. (2019). ‘Fake news as a two-dimensional phenomenon: a framework and research agenda’. Annals of the International Communication Association 43 (2), pp. 97–116. https://doi.org/10.1080/23808985.2019.1602782 .

Evans, J. S. B. T. (2008). ‘Dual-processing accounts of reasoning, judgment, and social cognition’. Annual Review of Psychology 59, pp. 255–278. https://doi.org/10.1146/annurev.psych.59.103006.093629 .

Fischhoff, B. and Scheufele, D. A. (2013). ‘The science of science communication’. Proceedings of the National Academy of Sciences 110 (Supplement 3), pp. 14031–14032. https://doi.org/10.1073/pnas.1312080110 .

Fogg, B. J. (2008). ‘Mass interpersonal persuasion: an early view of a new phenomenon’. In: Persuasive technology . Third International Conference, PERSUASIVE 2008 (Oulu, Finland, 4th–6th June 2008). Ed. by H. Oinas-Kukkonen, P. Hasle, M. Harjumaa, K. Segerståhl and P. Øhrstrøm. Vol. 5033. Lecture Notes in Computer Science. Berlin, Heidelberg, Germany: Springer, pp. 23–34. https://doi.org/10.1007/978-3-540-68504-3_3 .

Garrett, R. K. (2017). ‘The “echo chamber” distraction: disinformation campaigns are the problem, not audience fragmentation’. Journal of Applied Research in Memory and Cognition 6 (4), pp. 370–376. https://doi.org/10.1016/j.jarmac.2017.09.011 .

Gauchat, G. (2012). ‘Politicization of science in the public sphere: a study of public trust in the United States, 1974 to 2010’. American Sociological Review 77 (2), pp. 167–187. https://doi.org/10.1177/0003122412438225 .

Gibson, J. J. (1979). The ecological approach to visual perception. Boston, MA, U.S.A.: Houghton Mifflin.

Gierth, L. and Bromme, R. (2020). ‘Attacking science on social media: how user comments affect perceived trustworthiness and credibility’. Public Understanding of Science 29 (2), pp. 230–247. https://doi.org/10.1177/0963662519889275 .

Gil de Zúñiga, H., Huber, B. and Strauß, N. (2018). ‘Social media and democracy’. El Profesional de la Información 27 (6), pp. 1172–1180. https://doi.org/10.3145/epi.2018.nov.01 .

Greussing, E. (2020). ‘Powered by immersion? Examining effects of 360-degree photography on knowledge acquisition and perceived message credibility of climate change news’. Environmental Communication 14 (3), pp. 316–331. https://doi.org/10.1080/17524032.2019.1664607 .

Guenther, L. and Joubert, M. (2017). ‘Science communication as a field of research: identifying trends, challenges and gaps by analysing research papers’. JCOM 16 (02), A02. https://doi.org/10.22323/2.16020202 .

Guenther, L., Weingart, P. and Meyer, C. (2018). ‘“Science is everywhere, but no one knows it”: assessing the cultural distance to science of rural South African publics’. Environmental Communication 12 (8), pp. 1046–1061. https://doi.org/10.1080/17524032.2018.1455724 .

Halpern, D. and Gibbs, J. (2013). ‘Social media as a catalyst for online deliberation? Exploring the affordances of Facebook and YouTube for political expression’. Computers in Human Behavior 29 (3), pp. 1159–1168. https://doi.org/10.1016/j.chb.2012.10.008 .

Hargittai, E., Füchslin, T. and Schäfer, M. S. (2018). ‘How do young adults engage with science and research on social media? Some preliminary findings and an agenda for future research’. Social Media + Society 4 (3), pp. 1–10. https://doi.org/10.1177/2056305118797720 .

Haug, N., Geyrhofer, L., Londei, A., Dervic, E., Desvars-Larrive, A., Loreto, V., Pinior, B., Thurner, S. and Klimek, P. (2020). ‘Ranking the effectiveness of worldwide COVID-19 government interventions’. Nature Human Behaviour 4 (12), pp. 1303–1312. https://doi.org/10.1038/s41562-020-01009-0 .

Hendriks, F., Kienhues, D. and Bromme, R. (2015). ‘Measuring laypeople’s trust in experts in a digital age: the Muenster Epistemic Trustworthiness Inventory (METI)’. PLoS ONE 10 (10), e0139309. https://doi.org/10.1371/journal.pone.0139309 .

Huber, B. (2014). Öffentliche Experten. Über die Medienpräsenz von Fachleuten. Wiesbaden, Germany: Springer.

Huber, B., Barnidge, M., Gil de Zúñiga, H. and Liu, J. (2019). ‘Fostering public trust in science: the role of social media’. Public Understanding of Science 28 (7), pp. 759–777. https://doi.org/10.1177/0963662519869097 .

Huber, B., Wetzstein, I. and Aichberger, I. (2019). ‘Societal problem solver or deficient discipline? The debate about social science in the online public sphere’. JCOM 18 (02), A04. https://doi.org/10.22323/2.18020204 .

Kahneman, D. and Frederick, S. (2002). ‘Representativeness revisited: attribute substitution in intuitive judgment’. In: Heuristics and biases: the psychology of intuitive judgment. Ed. by T. Gilovich, D. Griffin and D. Kahneman. Cambridge, MA, U.S.A.: Cambridge University Press, pp. 49–81. https://doi.org/10.1017/CBO9780511808098.004 .

Kahneman, D., Slovic, P. and Tversky, A. (2018). Judgment under uncertainty: heuristics and biases. Cambridge, MA, U.S.A.: Cambridge University Press.

Kienhues, D., Jucks, R. and Bromme, R. (2020). ‘Sealing the gateways for post-truthism: reestablishing the epistemic authority of science’. Educational Psychologist 55 (3), pp. 144–154. https://doi.org/10.1080/00461520.2020.1784012 .

Knobloch-Westerwick, S. (2015). Choice and preference in media use: advances in selective exposure theory and research. New York, NY, U.S.A.: Routledge.

Kohring, M. (2016). ‘Misunderstanding trust in science: a critique of the traditional discourse on science communication’. JCOM 15 (05), C04. https://doi.org/10.22323/2.15050304 .

Kunda, Z. (1990). ‘The case for motivated reasoning’. Psychological Bulletin 108 (3), pp. 480–498. https://doi.org/10.1037/0033-2909.108.3.480 .

LaRose, R. and Eastin, M. S. (2004). ‘A social cognitive theory of Internet uses and gratifications: toward a new model of media attendance’. Journal of Broadcasting & Electronic Media 48 (3), pp. 358–377. https://doi.org/10.1207/s15506878jobem4803_2 .

Lörcher, I. and Neverla, I. (2015). ‘The dynamics of issue attention in online communication on climate change’. Media and Communication 3 (1), pp. 17–33. https://doi.org/10.17645/mac.v3i1.253 .

Lörcher, I. and Taddicken, M. (2017). ‘Discussing climate change online. Topics and perceptions in online climate change communication in different online public arenas’. JCOM 16 (02), A03. https://doi.org/10.22323/2.16020203 .

Lupia, A. and Sin, G. (2003). ‘Which public goods are endangered?: How evolving communication technologies affect the logic of collective action’. Public Choice 117 (3–4), pp. 315–331. https://doi.org/10.1023/B:PUCH.0000003735.07840.c7 .

Maier, M. and Taddicken, M. (2013). ‘Audience perspectives on science communication’. Journal of Media Psychology 25 (1), pp. 1–2. https://doi.org/10.1027/1864-1105/a000081 .

Marques, M. D., Critchley, C. R. and Walshe, J. (2015). ‘Attitudes to genetically modified food over time: how trust in organizations and the media cycle predict support’. Public Understanding of Science 24 (5), pp. 601–618. https://doi.org/10.1177/0963662514542372 .

Messing, S. and Westwood, S. J. (2014). ‘Selective exposure in the age of social media: endorsements Trump partisan source affiliation when selecting news online’. Communication Research 41 (8), pp. 1042–1063. https://doi.org/10.1177/0093650212466406 .

Metzger, M. J., Flanagin, A. J. and Medders, R. B. (2010). ‘Social and heuristic approaches to credibility evaluation online’. Journal of Communication 60 (3), pp. 413–439. https://doi.org/10.1111/j.1460-2466.2010.01488.x .

National Science Board (2018). Science & engineering indicators 2018. Science and technology: public attitudes and understanding . URL: https://www.nsf.gov/statistics/2018/nsb20181/report/sections/science-and-technology-public-attitudes-and-understanding/interest-information-sources-and-involvement .

Nauroth, P., Gollwitzer, M., Bender, J. and Rothmund, T. (2014). ‘Gamers against science: the case of the violent video games debate’. European Journal of Social Psychology 44 (2), pp. 104–116. https://doi.org/10.1002/ejsp.1998 .

— (2015). ‘Social identity threat motivates science-discrediting online comments’. PLoS ONE 10 (2), e0117476. https://doi.org/10.1371/journal.pone.0117476 .

Neubaum, G. and Krämer, N. C. (2015). ‘Let’s blog about health! Exploring the persuasiveness of a personal HIV blog compared to an institutional HIV website’. Health Communication 30 (9), pp. 872–883. https://doi.org/10.1080/10410236.2013.856742 .

Neuberger, C., Bartsch, A., Reinemann, C., Fröhlich, R., Hanitzsch, T. and Schindler, J. (2019). ‘Der digitale Wandel der Wissensordnung. Theorierahmen für die Analyse von Wahrheit, Wissen und Rationalität in der öffentlichen Kommunikation’. M&K Medien & Kommunikationswissenschaft 67 (2), pp. 167–186. https://doi.org/10.5771/1615-634X-2019-2-167 .

Nisbet, M. C. and Scheufele, D. A. (2009). ‘What’s next for science communication? Promising directions and lingering distractions’. American Journal of Botany 96 (10), pp. 1767–1778. https://doi.org/10.3732/ajb.0900041 .

Nisbet, M. C., Scheufele, D. A., Shanahan, J., Moy, P., Brossard, D. and Lewenstein, B. V. (2002). ‘Knowledge, reservations, or promise? A media effect model for public perceptions of science and technology’. Communication Research 29 (5), pp. 584–608. https://doi.org/10.1177/009365002236196 .

Norman, D. A. (2013). The design of everyday things. Revised and expanded edition. New York, NY, U.S.A.: Basic Books.

O’Neill, S. and Boykoff, M. (2011). ‘The role of new media in engaging the public with climate change’. In: Engaging the public with climate change: behaviour change and communication. Ed. by L. Whitmarsh, S. O’Neill and I. Lorenzoni. London, U.K.: Earthscan, pp. 233–251.

Pearce, W., Özkula, S. M., Greene, A. K., Teeling, L., Bansard, J. S., Omena, J. J. and Rabello, E. T. (2020). ‘Visual cross-platform analysis: digital methods to research social media images’. Information, Communication & Society 23 (2), pp. 161–180. https://doi.org/10.1080/1369118X.2018.1486871 .

Petty, R. E. and Cacioppo, J. T. (1986). ‘The elaboration likelihood model of persuasion’. In: Advances in experimental social psychology. Ed. by L. Berkowitz. Vol. 19. New York, NY, U.S.A.: Academic Press, pp. 123–205. https://doi.org/10.1016/S0065-2601(08)60214-2 .

Popper, K. (2002). Conjectures and refutations: the growth of scientific knowledge. 2nd ed. London, U.K.: Routledge.

Reif, A. and Guenther, L. (n.d.). ‘What representative surveys tell us about public (dis)trust in science: a systematisation and analysis of survey items and open-ended questions’. Journal of Trust Research . To appear.

Reif, A., Kneisel, T., Schäfer, M. and Taddicken, M. (2020). ‘Why are scientific experts perceived as trustworthy? Emotional assessment within TV and YouTube videos’. Media and Communication 8 (1), pp. 191–205. https://doi.org/10.17645/mac.v8i1.2536 .

Ruggiero, T. E. (2000). ‘Uses and gratifications theory in the 21st century’. Mass Communication and Society 3 (1), pp. 3–37. https://doi.org/10.1207/S15327825MCS0301_02 .

Sadler, T. D., Barab, S. A. and Scott, B. (2007). ‘What do students gain by engaging in socioscientific inquiry?’ Research in Science Education 37 (4), pp. 371–391. https://doi.org/10.1007/s11165-006-9030-9 .

Scheufele, D. A. (2013). ‘Communicating science in social settings’. Proceedings of the National Academy of Sciences 110 (Supplement 3), pp. 14040–14047. https://doi.org/10.1073/pnas.1213275110 .

Scheufele, D. A. and Krause, N. M. (2019). ‘Science audiences, misinformation, and fake news’. Proceedings of the National Academy of Sciences 116 (16), pp. 7662–7669. https://doi.org/10.1073/pnas.1805871115 .

Schmidt, J.-H. (2013). ‘Onlinebasierte Öffentlichkeiten: Praktiken, Arenen und Strukturen’. In: Online-Diskurse. Theorien und Methoden transmedialer Online-Diskursforschung. Ed. by C. Fraas, S. Meier and C. Pentzold. Köln, Germany: Herbert von Halem, pp. 35–56.

Shanahan, M.-C. (2010). ‘Changing the meaning of peer-to-peer? Exploring online comment spaces as sites of negotiated expertise’. JCOM 09 (01), A01. https://doi.org/10.22323/2.09010201 .

Shao, G. (2009). ‘Understanding the appeal of user-generated media: a uses and gratification perspective’. Internet Research 19 (1), pp. 7–25. https://doi.org/10.1108/10662240910927795 .

Simon, H. A. (1955). ‘A behavioral model of rational choice’. The Quarterly Journal of Economics 69 (1), pp. 99–118. https://doi.org/10.2307/1884852 .

— (1979). ‘Rational decision making in business organizations’. The American Economic Review 69 (4), pp. 493–513. URL: https://www.jstor.org/stable/1808698 .

Sperber, D., Clément, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G. and Wilson, D. (2010). ‘Epistemic vigilance’. Mind & Language 25 (4), pp. 359–393. https://doi.org/10.1111/j.1468-0017.2010.01394.x .

Stadtler, M., Winter, S., Scharrer, L., Thomm, E., Krämer, N. and Bromme, R. (2017). ‘Selektion, Integration und Evaluation’. Psychologische Rundschau 68 (3), pp. 177–181. https://doi.org/10.1026/0033-3042/a000361 .

Stanovich, K. E. and West, R. F. (2000). ‘Individual differences in reasoning: implications for the rationality debate?’ Behavioral and Brain Sciences 23 (5), pp. 645–665. https://doi.org/10.1017/s0140525x00003435 .

— (2002). ‘Individual differences in reasoning: implications for the rationality debate?’ In: Heuristics and biases: the psychology of intuitive judgment. Ed. by T. Gilovich, D. Griffin and D. Kahneman. Cambridge, MA, U.S.A.: Cambridge University Press, pp. 421–440. https://doi.org/10.1017/CBO9780511808098.026 .

Stieglitz, S. and Dang-Xuan, L. (2013). ‘Emotions and information diffusion in social media — Sentiment of microblogs and sharing behavior’. Journal of Management Information Systems 29 (4), pp. 217–248. https://doi.org/10.2753/MIS0742-1222290408 .

Stilgoe, J., Lock, S. J. and Wilsdon, J. (2014). ‘Why should we promote public engagement with science?’ Public Understanding of Science 23 (1), pp. 4–15. https://doi.org/10.1177/0963662513518154 .

Sturgis, P. (2014). ‘On the limits of public engagement for the governance of emerging technologies’. Public Understanding of Science 23 (1), pp. 38–42. https://doi.org/10.1177/0963662512468657 .

Sundar, S. S., Oeldorf-Hirsch, A. and Xu, Q. (2008). ‘The bandwagon effect of collaborative filtering technology’. In: CHI ’08. The 26th annual CHI Conference on Human Factors in Computing Systems (Florence, Italy, 5th–10th April 2008). Ed. by M. Czerwinski, A. M. Lund and D. S. Tan. New York, NY, U.S.A.: Association for Computing Machinery, pp. 3453–3458. https://doi.org/10.1145/1358628.1358873.

Szczuka, J., Meinert, J. and Krämer, N. (2020). ‘Listen to the scientists: effects of exposure to scientists and general media consumption on cognitive, affective and behavioral mechanisms during the COVID-19 pandemic’. https://doi.org/10.31234/osf.io/6j8qd .

Taddicken, M. (2012). ‘Privacy, surveillance, and self-disclosure in the social web: exploring the user’s perspective via focus groups’. In: Internet and surveillance: the challenges of web 2.0 and social media. Ed. by C. Fuchs, K. Boersma, A. Albrechtslund and M. Sandoval. New York, NY, U.S.A.: Routledge.

Taddicken, M. and Reif, A. (2016). ‘Who participates in the climate change online discourse? A typology of Germans’ online engagement’. Communications 41 (3), pp. 315–337. https://doi.org/10.1515/commun-2016-0012 .

Taddicken, M. and Wolff, L. (2020). ‘‘Fake news’ in science communication: emotions and strategies of coping with dissonance online’. Media and Communication 8 (1), pp. 206–217. https://doi.org/10.17645/mac.v8i1.2495 .

Vraga, E. K. and Bode, L. (2017). ‘Using expert sources to correct health misinformation in social media’. Science Communication 39 (5), pp. 621–645. https://doi.org/10.1177/1075547017731776 .

Webster, R. K., Brooks, S. K., Smith, L. E., Woodland, L., Wessely, S. and Rubin, G. J. (2020). ‘How to improve adherence with quarantine: rapid review of the evidence’. Public Health 182, pp. 163–169. https://doi.org/10.1016/j.puhe.2020.03.007 .

Weingart, P. (2002). ‘The moment of truth for science: the consequences of the ‘knowledge society’ for society and science’. EMBO reports 3 (8), pp. 703–706. https://doi.org/10.1093/embo-reports/kvf165 .

Weingart, P. and Guenther, L. (2016). ‘Science communication and the issue of trust’. JCOM 15 (05), C01. https://doi.org/10.22323/2.15050301 .

Winter, S. (2020). ‘Do anticipated Facebook discussions diminish the importance of argument quality? An experimental investigation of attitude formation in social media’. Media Psychology 23 (1), pp. 79–106. https://doi.org/10.1080/15213269.2019.1572521 .

Winter, S., Brückner, C. and Krämer, N. C. (2015). ‘They came, they liked, they commented: social influence on Facebook news channels’. Cyberpsychology, Behavior and Social Networking 18 (8), pp. 431–436. https://doi.org/10.1089/cyber.2015.0005 .

Winter, S. and Krämer, N. C. (2012). ‘Selecting science information in web 2.0: how source cues, message sidedness, and need for cognition influence users’ exposure to blog posts’. Journal of Computer-Mediated Communication 18 (1), pp. 80–96. https://doi.org/10.1111/j.1083-6101.2012.01596.x .

— (2014). ‘A question of credibility — Effects of source cues and recommendations on information selection on news sites and blogs’. Communications 39 (4), pp. 435–456. https://doi.org/10.1515/commun-2014-0020 .

— (2016). ‘Who’s right: the author or the audience? Effects of user comments and ratings on the perception of online science articles’. Communications 41 (3), pp. 339–360. https://doi.org/10.1515/commun-2016-0008 .

Winter, S., Metzger, M. J. and Flanagin, A. J. (2016). ‘Selective use of news cues: a multiple-motive perspective on information selection in social media environments’. Journal of Communication 66 (4), pp. 669–693. https://doi.org/10.1111/jcom.12241 .

Wissenschaft im Dialog (2018). Wissenschaftsbarometer 2018. URL: https://www.wissenschaft-im-dialog.de/fileadmin/user_upload/Projekte/Wissenschaftsbarometer/Dokumente_18/Downloads_allgemein/Broschuere_Wissenschaftsbarometer2018_Web.pdf .

World Health Organization (2020). Novel Coronavirus(2019-nCoV): situation report — 13. URL: https://www.who.int/docs/default-source/coronaviruse/situation-reports/20200202-sitrep-13-ncov-v3.pdf .

Authors

Monika Taddicken is a professor of Communication Sciences at the Technische Universität Braunschweig, Germany. She received her Ph.D. in communication research from the University of Hohenheim, Germany. She is currently working on the audience’s perspective of science communication. She has also published several papers on social media. E-mail: m.taddicken@tu-braunschweig.de .

Nicole Krämer is a professor of Social Psychology — Media and Communication at the University Duisburg-Essen. She received her Ph.D. from the University of Cologne, Germany. She is currently working on social media engagement and emotions. She has also published several papers on computer-mediated communication. E-mail: nicole.kraemer@uni-due.de .