1 Context
One of the most popular online locations for science news, as measured by subscriber accounts, is The New Reddit Journal of Science, the whimsically named discussion board for science topics within the social media and discussion site reddit. Technically named “r/science”, the subreddit boasts over 26 million subscribers (the seventh highest on reddit), and since its founding in October of 2006, has generated nearly 1.2 million discussion comments (at the time of writing) [Subreddit stats, 2020]. While not a disciplinary journal by any measure, the discussion board hosts a wide array of specific content areas, from geology to biology, and from social science to engineering; until 2018, it also hosted the very popular science-themed AMA (ask me anything) sessions, and it continues to host the “Science Discussions” series, which are similar in structure. 1 r/science offers a forum for anonymous community members to create threads by linking to an external source and offering a brief summary of the research findings. The subreddit’s rules require posted research to be peer reviewed, although community members link to a mixture of existing journalism (from newspapers and online magazines), press releases (often from university websites or aggregation sites like EurekAlert!), and landing pages for research articles in academic journals. These posts encourage other members of the reddit community (redditors, for short) to comment on and discuss the specific topic at hand. Such conversations fit into the broader category of popular (i.e., non-specialist) science, where the public’s understanding of science is demonstrated in discourse that takes place outside of any official disciplinary, educational, or industry-specific realms. While there are subject matter experts present in these communities, they have no formal authority; there are moderators who can remove commentary inappropriate to the community’s guidelines, and the site has specific rules for discussion (see Figure 1), but in general, conversations move in any direction participants desire. These features, among others, make r/science a unique place to investigate several long-standing assumptions about how the public makes sense of written science communication and what those findings might mean for science communicators.
Chief amongst these assumptions is that complex, discipline-specific scientific texts are better understood by non-subject matter experts when received through casual-language intermediary genres of writing, such as short news journalism. The claim that informal science writing can help increase the public’s understanding of science, their scientific literacy, or their critical awareness of how science functions in society is foundational to science writing courses, their textbooks, 2 countless workshops and outreach/engagement programs, 3 and, frankly, the field of science communication as a whole, regardless of the operational paradigm. 4 Numerous large-scale studies have found evidence for the effectiveness of science journalism in fostering scientific literacy [Eurobarometer, 2013; National Science Board, 2016], and other empirical work has reported positive results from the public’s exposure to science writing and media [Nisbet and Goidel, 2007; Akin and Landrum, 2017]. The effects of science journalism have primarily been studied within science literacy or public understanding of science models, where empirical data have been employed to refine the science of science communication. A study of r/science discourse from within a science in society model, by contrast, can provide detail about the quality of conversations that result from exposure to different written forms of science communication, and shed light on the benefits and effects of popular science writing.
Unlike deficit models of scientific literacy, or some restrictive public understanding of science models, a science in society model recognizes communicative complexity, albeit at the expense of generalizable evidence. Older conceptualizations of science writing, as Sarah Perrault [2013] explains, fit within a Public Appreciation of Science and Technology (PAST) model, characterized by “a one-way flow of information from scientists to the public” where communication fits a generic, linear structure [p. 12]. Perrault criticizes this model as too rigid, ineffective, condescending, and, ultimately, unnecessary. In this model, the practice of identifying gaps between the public’s uptake of science and claims in the scientific literature assumes that the public operates at a deficit. In its worst forms, this model promotes an us/them view of scientific communication, where the public is seen to be “superficial, inattentive and […] handicapped by all sorts of cognitive biases, misguided by prior experience and easily swayed by their emotions” [Mellor, 2018, p. 750]. Likewise, deficit models often fault science journalism for its amplification [Knudsen, 2005], simplification [Hijmans, Pleijter and Wester, 2003; Brechman, Lee and Cappella, 2009], lack of appropriate hedges [Jensen, 2008], or tendency to present “both sides” of an issue, even in the face of overwhelming consensus [Dunwoody, 1999; Stocking and Holstein, 2009]. Instead of cataloguing where readers have imprecisely repeated the language of subject matter experts, an exercise that upholds a deficit model, a more nuanced study of r/science would examine the features of that discourse for insight into how engaged and subject-matter-relevant discussion can be fostered.
A framework for analysis that values how science is understood by readers and commenters can be found in the science in society model of science communication, one that Perrault identifies as aligned with the Critical Understanding of Science in Public (CUSP) model. Peter Broks first popularized the CUSP model in Understanding popular science [2006], describing the approach as sensitive to non-linear, contextual understandings of expertise and concerned with subtleties of meaning. In a CUSP model, science writing is seen to affect the quality of the discourse about a particular topic, though not in instrumental ways, where the only major concerns are either the precise representation of science by journalists or the retention of that knowledge by a broader public. Research on how audiences are engaged and participate in science [Bucchi and Trench, 2014b] or on how online forms of science communication impact understanding and behavior [Brossard, 2013; Mehlenbacher, 2019] is indicative of recent shifts in the field of science communication toward CUSP ideals. Where PAST models might assume that reddit discussion boards show conversations that are “limited and unstructured” [Zavestoski, Shulman and Schlosberg, 2006, p. 4], CUSP models instead depict levels of complexity around the public’s actual engagement with the topics that science journalism represents, in all its messiness. This framework for science communication likewise acknowledges that, while often imprecise, online discourse shows evidence for the public’s critical understanding of science, what Susanna Priest [2013] calls “critical science literacy”. Critical science literacy describes how people are able to analyze and make use of scientific knowledge, especially in an information environment that is rife with competing claims. It is evident in how discussion board participants make meaning from scientific articles and are encouraged to do so by science journalism (as opposed to uncritical press releases). That said, science journalism is facing new challenges, and the impacts of those challenges are evident in the online public discourse that responds to science writing.
While the world of online science writing is now larger than ever, that expansiveness brings appropriate concerns about the quality of writing and its impact upon the public. In the last decade, dozens of online science and environmental publications have been launched [Gutierrez, 2017], and many more online locations and access points, both formal and informal (YouTube channels, blogs, podcasts, hashtags, etc.), now provide popularized science communication. More Americans, for example, are looking online for their science news — roughly half, as reported in 2016 [National Science Board, 2016]. A key issue in the blossoming of online science writing has been the emergence, and potential dominance, of press releases, typically understood as short, uncritical, promotional materials, sometimes passing as “science news”, often written by Public Information Officers (PIOs) at universities or other institutions and composed in a journalistic style in “common and not too specialized language” [Carver, 2014, p. 2]. 5 Press releases were initially intended to gain journalists’ attention and prompt additional coverage of newsworthy science; however, as Marcinkowski et al. [2014] have argued, press releases are becoming the dominant form of science communication. This emergence could be a concern for the quality of popular science writing and its impact upon an audience’s understanding because, according to Sumner et al. [2014] and Sumner et al. [2016], press releases typically include more direct or explicit advice than the associated journal article and often contain exaggerations about causality. The proliferation of press releases could pose a problem for the advancement of critical science literacy because press releases offer simplified, less critical information that gives readers less material to use in making relevant sense of the science. This research project collects and analyzes data about conversations happening on r/science, including whether they result from press releases or science journalism, and considers several factors that affect the quality of the discussions that follow.
2 Objectives
Instead of scrutinizing how accurately information was replicated between scientific articles, popularized science writing, and redditor discourse, this study offers an approach which evaluates features of the discourse itself. We eschewed a large-data grab through reddit’s API, and instead evaluated a smaller data set, coded for 17 features that highlight how commenters engage with the scientific news. This close examination studied 97 posts and their top ten first-order comments in order to understand details about that engagement; we aimed to identify what kinds of comments occur in this popular science discussion board and some factors that impact those conversations. Our research focuses on “engaged and subject-matter-relevant” comments — comments that show active participation in the topic of a science discussion and shed light on how people make sense of and understand science. Participation in r/science is broadly representative of participation in other online venues for popular science writing, so we are able to glean some lessons about how scientific topics are discussed by the public.
We sought to answer what happens when redditors work through their “understanding” of science news, though we did not want to pre-define “understanding” as the mirroring of scientific knowledge. Instead, we identified what the process of thinking with and thinking through that information looked like. To that end, we applied a conceptual analysis model and coded for discursive features across seven different sub-areas of scientific specialization, seeking more information about how an initial piece of writing impacted the conversation that followed. The following four questions guided this study:
- Which genre (original research article, press release, or scientific news) generates the most engaged and subject-matter-relevant online discussions from readers?
- Does the existence of an intermediary article (press release and scientific news) situated between the reader and the original research article improve the discussions that follow?
- Do differences between genres of intermediary articles (press release or scientific news) impact the discussions that follow?
- How does the availability of the original research article (paywalled or open access) 6 impact the discussions that follow?
3 Methods
The collected data were analyzed through quantitative conceptual analysis, also known as content analysis or thematic analysis [Palmquist, Carley and Dale, 1997]. Conceptual analysis begins with identifying research questions, before choosing rules for a dataset, compiling that dataset, and then coding selections from that material into manageable content categories through selective reduction [Busch et al., 2012]. Researchers then focus on words and phrases and look for patterns that relate to the research questions. This method is similar to other recent studies of reddit’s Ask Me Anything (AMA) series, which also employ content analysis. However, rather than focusing on whether and how questions had been answered [Lai et al., 2020], our study attended to the content of the comments themselves, much like Hara et al.’s [2019] recent study. Based on the above research questions, we chose seven r/science subreddits for our dataset (r/biology, r/medicine, r/environment, r/economics, r/socialscience, r/psychology and r/animalscience). These subreddits were chosen because they represented each of the four specialization categories in r/science’s main dropdown menu and because they involved a substantial number of posts and comments. The subreddit specialization r/paleontology, for example, does not appear to have many active users, and therefore very few posts fit our parameters for analysis (see below). We recognize that these specialization areas might impact the results of this study, as some commenters might interact with intermediary genres and scientific source material in distinct ways. For example, while r/animalscience may not encourage discussions that relate to commenters’ personal lives, r/psychology topics are likely to encourage these comments. We address this concern below and in the discussion.
In order to collect a sufficiently large sample of r/science comments, we gathered the following information from the seven areas of specialization: the original posts, their linked sources, and the top ten first-order comments that followed each post, as ranked by “TOP” (see Table 2). “TOP” sorting in reddit orders first-order comments by aggregate votes, and first-order comments are defined as comments posted in reply to the original post rather than to another comment. We wanted to focus on the comments that were most often read, and attending to “TOP”-ranked comments accomplishes this goal. Admittedly, this focus on top-level comments loses some granularity of conversational detail, but it also means we were able to identify a comprehensible set of reactions to the original post. All posts were retrieved from September and October of 2019 and received between 50 and 500 comments. Requiring at least 50 comments ensured that discussions had sufficient relevant conversation; likewise, limiting the comments to 500 meant we avoided very controversial or popular posts. 7 Limiting the date range to September and October of 2019 also avoided discussions of COVID-19. 8 These parameters meant that some specialization areas, like r/physics, which has a fairly low level of user interaction, were not considered for this study.
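As noted above, we eschewed a large-scale API grab; still, for readers who want a concrete picture of these parameters, the following is a minimal sketch, in Python with the PRAW library, of how posts and their “TOP” first-order comments could be gathered programmatically. The credentials and subreddit name are placeholders, and reddit’s listing endpoints return only roughly the most recent thousand posts, so reaching a historical window such as September and October of 2019 would in practice require an archival service.

```python
# Minimal sketch (not the collection procedure used in this study): gather posts
# with 50-500 comments and their top ten first-order comments, sorted by "TOP".
from datetime import datetime, timezone
import praw

reddit = praw.Reddit(client_id="YOUR_ID",          # placeholder credentials
                     client_secret="YOUR_SECRET",
                     user_agent="rscience-discourse-sketch")

start = datetime(2019, 9, 1, tzinfo=timezone.utc).timestamp()
end = datetime(2019, 11, 1, tzinfo=timezone.utc).timestamp()

records = []
for submission in reddit.subreddit("science").new(limit=1000):
    # Keep posts from the study window that received 50-500 comments.
    if not (start <= submission.created_utc < end):
        continue
    if not (50 <= submission.num_comments <= 500):
        continue
    submission.comment_sort = "top"            # mirror reddit's "TOP" ordering
    submission.comments.replace_more(limit=0)  # drop "load more comments" stubs
    top_level = [c for c in submission.comments if not c.stickied][:10]
    records.append({
        "post_id": submission.id,
        "linked_source": submission.url,       # intermediary article or journal page
        "score": submission.score,
        "num_comments": submission.num_comments,
        "top_comments": [c.body for c in top_level],
    })
```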
After identifying research questions and choosing specialization areas, we coded 25 posts with a third researcher. This step was taken for three reasons: 1) to establish which concept categories were useful to analyze, 2) to refine definitions for each concept category, and 3) to ensure that our independent analyses were in agreement. The process of creating concept categories came from reading comment threads, creating an intentionally over-broad list of possible concept categories, and then removing categories that were less relevant to the research questions of this study. Some initial concept categories, such as the presence of “user flair” (symbols associated with specific redditor identities), were jettisoned as insignificant to the research, as we concentrated on comment topics (i.e., whether comments focused on humor, summaries, etc.). Concept categories like definition-, implication-, or methods-related comments were chosen because of their prevalence and their ability to show evidence for engaged and subject-matter-relevant comments. As a result of discussions between raters, concept categories were refined and the definitions indicated in Table 1 were determined. Finally, kappa (a measure of inter-rater agreement) was calculated for three concept categories (number of inquiries/questions, summaries, and definition-related comments) and determined to be 0.79 with no excessive outliers — sufficient agreement for conceptual analysis.
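As a point of reference, an agreement figure of this kind can be computed in a few lines of Python. The sketch below assumes Cohen’s kappa for two raters and uses invented binary labels rather than our data; it is illustrative only.

```python
# Minimal sketch, assuming Cohen's kappa between two raters coding the same
# comments for one concept category. The 0/1 labels below are placeholders.
from sklearn.metrics import cohen_kappa_score

# 1 = rater marked the comment as containing an inquiry/question, 0 = did not
rater_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
rater_b = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"kappa = {kappa:.2f}")  # values of roughly 0.61-0.80 are conventionally read as substantial agreement
```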
To identify concepts in each of the top 10 comments, we categorized each post and comment set in the following ways (see Table 1). Each of the 17 content categories is followed by a brief concept rule definition (see appendix A for additional examples). These categories represent the bulk of the rhetorical, textual moves that commenters make in response to an original post. This method is unobtrusive, as the existing texts are public, and it ensures anonymity, as no identifying information was gathered.
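For readers who prefer a concrete picture of the coding unit, the sketch below shows one hypothetical way a coded comment could be represented as a record. The category names echo concepts discussed in this article, but the authoritative definitions for all 17 categories are those in Table 1 and appendix A, not this illustration.

```python
# Illustrative sketch of a coded comment record (Python 3.9+); field names and
# example values are hypothetical, not drawn from the study's coding sheets.
from dataclasses import dataclass, field

@dataclass
class CodedComment:
    post_id: str
    comment_rank: int                 # 1-10 within the "TOP"-sorted first-order comments
    source_genre: str                 # "press release", "scientific news", or "research article"
    open_access: bool
    categories: set[str] = field(default_factory=set)

comment = CodedComment(
    post_id="abc123",                 # hypothetical identifier
    comment_rank=3,
    source_genre="scientific news",
    open_access=True,
)
comment.categories.update({"inquiry/question", "methods-related"})
```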
Analysis of this dataset with the above concepts allowed us to identify general features of how the public responds to science news, and how the source, genre, and availability of that textual material impacts the discussion that follows. If most science communication models are correct, then open access research articles and the presence of any intermediary, popular science writing (especially scientific news journalism, with its additional critical lens) will beneficially impact the discussions that follow. It is also likely that different groups within r/science react to these elements of the original post in different ways, yet our goal in performing this analysis was not to determine which features are most precisely aligned with the science described in those original materials, but to identify the qualities of the discussion that result from identifiable changes in the source material. Based within a CUSP model, our concern is with how people make sense of scientific information in response to different texts.
4 Results
Each conversation thread from the seven r/science subreddits falls into one of three major categories, depending upon which text readers and commenters encountered first: scientific news article, press release, or original research article (Table 2). The primary distinction is whether the original poster (OP) linked to an intermediary article (scientific news article or press release) or to an original research article (a permalink to a journal-controlled webpage). Comparing the comments that followed posts with links to intermediary articles against those without reveals how these science writing genres helped to encourage certain conversations. Figure 2 shows the basic breakdown for comment threads, and “TOP” 10 first-order comments, across nine different concept categories.
4.1 Intermediary articles vs original research articles
The presence of intermediary articles resulted in increased visibility for posts when compared to posts linking to scientific articles alone. Posts with intermediary articles received, on average, a 185% larger vote score (9,026 vs 3,164) and 34% more comments (218 vs 163) than those without intermediary articles. When considering Q2, “Does the existence of an intermediary article (press release and scientific news) situated between the reader and the original research article improve the discussions that follow?”, we identified several key results. As Figure 2 shows, intermediary articles encouraged redditors to respond with 31% more content-appropriate humorous comments and 75% more references to personal situations or contexts. Both of the intermediary genres are meant to communicate science to non-specialist audiences, situate the results of research in context, and speculate on their impact, so it would follow that audiences more readily connected what they read to their own experiences. When the post linked directly to an original research article, the conversations that followed included about 16% more comments concerning the article’s methods and 36% more questions than when the post linked to an intermediary article. Differences in many other concept categories were negligible.
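For transparency about how relative differences are computed throughout these results, the short sketch below reproduces the two figures just cited from the averages given in the text; the function name is ours and purely illustrative.

```python
# Reproduce the reported relative differences from the averages in the text
# (vote score 9,026 vs 3,164; comments 218 vs 163).
def pct_larger(with_intermediary: float, without: float) -> float:
    """Percentage by which the first value exceeds the second."""
    return (with_intermediary - without) / without * 100

print(round(pct_larger(9026, 3164)))  # ~185 (% larger vote score)
print(round(pct_larger(218, 163)))    # ~34 (% more comments)
```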
4.2 Differences by genre
We can also identify distinctions between the two major genres of intermediary articles — as queried through Q3: “Do differences between genres of intermediary articles (press release or scientific news) impact the discussions that follow?” The distinctions between these genres for nine concept categories are represented in Figure 3 , along with data from posts that link directly to original research articles and information on posts to intermediary articles with missing or broken links to an original research article.
Posts that linked to press releases prompted fewer definition-, implication-, and methods-related comments than either journalistic scientific news articles or posts that linked directly to scientific articles, recording 61%, 85%, and 45%, respectively, of the instances of these concept categories observed for the next-lowest genre. Tellingly, scientific news articles do not perform much better in these categories than links to an original research article alone, but press releases perform demonstrably worse than either.
Where press releases fell short in some measures of engagement, they excelled in quantity of engagement; the only major advantage of a press release appears to be in its ability to attract more attention, as measured by total aggregate votes and comments. Press release-linked posts recorded the highest values in both categories, averaging 139% of the comments (227 vs 163) and 348% of the votes (10,998 vs 3,164) when compared to links to scientific articles alone.
Scientific news articles encourage more subject-matter-relevant humor than press releases (about 36% more), which themselves do not encourage much more humor than links to research articles alone. The guidelines for the r/science subreddit explicitly state that “no jokes or memes” are allowed and that “[c]omments should constructively contribute to the discussion or be an attempt to learn more” (see Figure 1). While memes and jokes are disallowed, the greater number of humorous comments for posts that link to scientific news articles suggests that some subject-matter-relevant humor may have additional merit in facilitating discussion.
Finally, scientific news articles also appear to discourage commenters from posting references to additional research. This result may be because the genre of scientific news typically includes references to other research or comments from researchers unaffiliated with the authors of the paper being covered.
4.3 Effects of broken or missing links
The following two sets of results offer evidence for Q4: “How does the availability of the original research article (paywalled or open access) impact the discussions that follow?”
Of the 97 posts considered, nine linked to intermediary articles in which the links to an original research article were either broken or omitted. While this represents an admittedly small sample size, posts without working links encouraged more comments in general as well as more deleted comments, summaries, and implication-related comments (see Figure 3 ). The absence of a working link also prompted fewer inquiries and subject-matter-relevant humorous comments. Genre could have influenced the higher rates of votes and comments, as press releases prompted higher averages in both these categories and a higher percentage of press releases featured missing or broken links (19.36%) than scientific news articles (7.32%), despite the industry-standard practice of linking to the research article.
4.4 Open access impacts
As noted above, posts linking to intermediary articles received a 185% larger vote score and 34% more comments than posts to original research articles. This disparity was exacerbated by a research article’s paywalled status (Table 3 ). Posts to scientific news articles linking to paywalled research received a 266% larger vote score than posts linking directly to paywalled articles, and posts to press releases linking to paywalled articles received a staggering 511% larger vote score than posts linking directly to paywalled articles. The number of comments was similarly affected by a research article’s paywalled status, with scientific news articles receiving 86% more comments and press releases receiving 82% more comments than posts linking directly to paywalled articles. Presumably, few readers had the access necessary to engage with this research in any form. 9
Along with votes and comments, the open access status of an article appears to increase the number of summaries and methods-related comments across genres (see Figure 4 ). The presence of a dense, data-rich format would provide both an exigence and the means to summarize information, and detailed access to methodological information could provide the basis for specific commenting and inquiry.
5 Discussion
Which genre (original research article, press release, or scientific news) generates the most engaged and subject-matter-relevant online discussions from readers? The above results suggest that the answers to this question are complicated, insofar as there is no straightforward path towards greater numbers of subject-matter-relevant comments. In general, the presence of an intermediary article is associated with more readers and commenters participating in an online discussion and a more positive response to the research being discussed, but the effects of intermediary articles are not significantly distinguishable, in most of our chosen concept categories, from direct links to original research articles. One category where intermediary articles stand out is the average number of comments that make reference to personal experiences; despite variation across individual topic categories, intermediary articles encouraged 75% more of these personal connections across all posts. It is tempting to read that difference as evidence that intermediary articles do an effective job of connecting complex research findings to everyday concerns — in part because this is one of their major goals. As described by numerous commentators on journalistic science writing and oft-repeated in science writing textbooks [see especially Stocking, 2010, chapter 3], a major goal is to connect scientific concepts and language to the public’s knowledge and personal experience [Fahnestock, 1986] by “relating it to phenomena, events, issues, knowledge, and concerns outside science” [Peters, 2013, p. 14107]. In this regard, science journalism appears to relate complex science to a non-specialist audience effectively.
Additional results direct our attention to the effectiveness of different genres of intermediary article: scientific news articles, written by science journalists, and press releases, typically written by university-connected public information officers. These differences can generally be aligned with existing descriptions of those genres. For example, scientific news articles encourage more on-topic humor in comments but discourage references to additional research on the same topic. This result aligns with Marsh’s [2016] commentary on science communication — how humor can “create an informal and more welcoming space for discussion” and can play “a powerful role in the reception of a message” [pp. 6–7]. If humorous content occurs more frequently following scientific news articles, we can assume both that the humor is on-topic (otherwise it would have been deleted, as per the subreddit’s rules) and that it derives from an understanding of context, which presumes greater immersion in the article’s content. That scientific news discourages references to additional research could likewise be an effect of genre. Most scientific news includes secondary opinions from researchers not affiliated with the original research institute. This external review helps the journalist understand the significance of the research, gives the story more credibility, and helps the audience gain some critical understanding of the science. This critical function is often noted as the main distinction between science journalism and press releases [Autzen, 2014], as journalists are able to be evaluative in ways that PIOs are not. Science journalists include the work of other scientists in order to consider the significance of the research, a detail that can bolster readers’ critical science literacy and obviate the need for commenters to find and report on the same references.
Press releases appear to discourage comments on research methodology, an effect that can also be linked to a function of the genre, as press releases typically include more discussion of methods than scientific news articles do. Brechman, Lee and Cappella [2009] located differences between journalistic accounts and press releases, finding that press releases included more content focused on methodology than did scientific news articles. The authors noted that “[c]laims within the press release often emphasized methodology, history, or the sociological environment of the research. In contrast, claims presented to lay public in news accounts provided little direct contextual information, instead emphasizing how study results apply to the ‘real world’” [Brechman, Lee and Cappella, 2009, p. 467]. 10 Press releases’ focus on methodology could explain the much lower number of comments in this area. Relatedly, the presence of a press release made a positive impact upon the aggregate vote score (37% more than scientific news, 111% more than original research articles) and the comments to that post (33% more than original research articles). Press releases are designed to increase “the likelihood of the media reporting on the research, which in turn can increase visibility and attract public interest in both the research and the institute” [Carver, 2014, p. 2]. While we cannot speak to whether press releases increase the likelihood of media attention, we can state that press releases increase the visibility of and attention paid to research in online forums like reddit.
The findings presented here also reinforce previous assumptions about the benefits of open access. Open science initiatives aim to increase availability, accessibility, and transparency, as well as to build trust between scientists and non-specialists [United Nations Educational, Scientific and Cultural Organization, 2017]. Our results suggest that open access articles accomplish at least the first of these goals; when original research articles were accessible without a fee, comments were more engaged and subject-matter-relevant — as demonstrated by increased occurrence in the majority of concept categories. This trend held true for posts that linked to either genre of intermediary article and for those that linked directly to open access research articles. This result generally matches other findings which suggest that open access articles are more often cited and read [Li et al., 2018; Holmberg et al., 2020]. One of the few categories that exhibited the opposite effect was the number of comments referencing personal experiences; for both genres of intermediary articles, paywalled research prompted a somewhat greater number of these connections. With less access to the information that would prompt specific methodological, definitional, and summative questions, readers may have turned instead to personal experience in the top 10 first-order comments, possibly explaining this discrepancy.
Previous research investigating the relationship between open science and public engagement has noted concerns about the amount of effective contextualization, mapping, and interpretation of information required to make science accessible rather than simply available [Grand et al., 2016]. As open access articles are meant for a discipline-specific audience, the communicative imbalance between specialist author and non-specialist reader is little reduced. The discussions generated by open access articles observed here may thus be seen as surprising under a public understanding of science (PUS) model — or indeed under general PAST frameworks. While it is possible that science communicators author more engaging intermediary articles when working with open access articles, both journalists and PIOs almost certainly have access to original research articles, whether open or paywalled. Therefore, it seems more likely that these conversations are the result of some commenters displaying the scientific literacy skills necessary to access the original research article and raise the level of discussion. Regardless, more research on the interaction between open access and public engagement and understanding would be worthwhile.
6 Conclusion
Overall, science communicators whose activities align with public engagement models may derive useful implications from the impacts of genre and open access on audience understanding. Certainly, while press releases increase the visibility of science news and help readers connect science to their own lives, that same genre falls short of the advantages that science journalism provides. More engaged and subject-matter-relevant discussions are likely to emerge from responses to scientific news articles. Ultimately, it appears that genre matters. We see that the features of genres affect the conversations that follow; for example, science journalism often includes multiple researchers’ perspectives, which lessens the need for readers to find those sources. Likewise, open access matters. The open access status of an article appears to increase the number of summaries and methods-related comments across genres. While these two categories are not immediately indicative of better conversations, the open access status of an article is beneficial across most concept categories, even taking the possible negative impact of press releases into account.
Several factors should qualify these implications. While the relatively small sample (n = 97 posts) analyzed here has allowed for in-depth observation of the types of engagement displayed in online conversations, it also limits the generalizability of our conclusions. Reddit users may also constitute a unique audience, and their engagement with science news could deviate from that of a more general population. One must additionally approach total vote score carefully as a metric of engagement. A 2017 computational analysis of user behavior within the complete reddit.com domain, for instance, found that “most users do not read the article that they vote on, and that, in total, 73% of posts were rated (i.e., upvoted or downvoted) without first viewing the content” [Glenski, Pennycuff and Weninger, 2017, p. 200]. Further, members of the r/science community may behave differently from reddit’s general users; certainly, the changes in engagement observed in our study suggest that r/science users interact with linked articles at higher rates. This can be partially explained by the fact that r/science’s usership is, after all, self-selected on the basis of interest in science news. Many of these users would likely align with highly interested segments of the public identified by PAST models and varyingly termed “boosters” or “sciencephiles” [Perrault, 2013; Schäfer et al., 2018]. Observations about the nature of their engagement may or may not apply to populations who seek out science news from different media.
A second major caveat should also be made: the decision to investigate specific r/science specializations could have impacted the results of our analysis. Because we studied posts from r/biology, r/medicine, r/environment, r/economics, r/socialscience, r/psychology and r/animalscience, we could have unintentionally skewed our representation of r/science towards the unique interactions that readers have with that respective content. These specializations were chosen because they represent each of the four categories in reddit’s dropdown menu and because they involve substantial posts and comments — enough to fit our dataset parameters. Studying results within each individual specialization, however, also brings a challenge, insofar as posts in each r/science area may not represent a large enough sample size to support significant claims. Ultimately, we feel that the seven specializations chosen for this study represent the most trafficked r/science areas and therefore, when studied in aggregate, offer a fairly accurate representation of activity on r/science.
The metrics of engagement and discussion identified in this study represent a means of analyzing reader understanding that aligns more with a CUSP model than other frameworks. We approached reader understanding not in terms of fact-recollection but in terms of the reactions it produced: connections to personal experience, further inquiry, and generally deeper discussion. Undoubtedly, the patterns identified here would benefit from larger sets of data gathered from different audiences. Researchers of science communication may additionally explore concepts of reader understanding in alternative ways as we seek to learn more about how non-specialists work to understand scientific research and, accordingly, how we may better facilitate that understanding. Ultimately, this study investigated how readers experience and understand science, and further investigations of living discourse guided by CUSP principles could prove similarly fruitful.
A Concept categories
References
Akin, H. and Landrum, A. R. (2017). ‘A recap: heuristics, biases, values, and other challenges to communicating science’. In: The Oxford handbook of the science of science communication. Ed. by K. H. Jamieson, D. M. Kahan and D. A. Scheufele. Oxford, U.K.: Oxford University Press, pp. 455–460. https://doi.org/10.1093/oxfordhb/9780190497620.013.48 .
Angler, M. W. (2017). Science journalism: an introduction. London, U.K.: Routledge. https://doi.org/10.4324/9781315671338 .
Autzen, C. (2014). ‘Public communication from research institutes: is it science communication or public relations? Press releases — the new trend in science communication’. JCOM 13 (03), C02. https://doi.org/10.22323/2.13030302 .
Blum, D., Knudson, M. and Marantz Henig, R., eds. (2005). A field guide for science writers: the official guide of the National Association of Science Writers. 2nd ed. New York, NY, U.S.A.: Oxford University Press. https://doi.org/10.1093/oso/9780195174991.001.0001 .
Brechman, J., Lee, C.-j. and Cappella, J. N. (2009). ‘Lost in translation?: a comparison of cancer-genetics reporting in the press release and its subsequent coverage in the press’. Science Communication 30 (4), pp. 453–474. https://doi.org/10.1177/1075547009332649 .
Broks, P. (2006). Understanding popular science. Maidenhead, U.K.: Open University Press.
Brossard, D. (2013). ‘New media landscapes and the science information consumer’. Proceedings of the National Academy of Sciences 110 (Supplement 3), pp. 14096–14101. https://doi.org/10.1073/pnas.1212744110 .
Bucchi, M. and Trench, B., eds. (2014a). Routledge handbook of public communication of science and technology. 2nd ed. London, U.K.: Routledge. https://doi.org/10.4324/9780203483794 .
— (2014b). ‘Science communication research: themes and challenges’. In: Routledge handbook of public communication of science and technology. Ed. by M. Bucchi and B. Trench. 2nd ed. London, U.K.: Routledge, pp. 1–14. https://doi.org/10.4324/9780203483794 .
Busch, C., De Maret, P. S., Flynn, T., Kellum, R., Le, S., Meyers, B., Saunders, M., White, R. and Palmquist, M. (2012). Content analysis. Writing@CSU Guide. Colorado State University. URL: https://writing.colostate.edu/guides/guide.cfm?guideid=61 (visited on 2nd November 2020).
Carpenter, S., ed. (2020). The craft of science writing: selections from The Open Notebook . Madison, WI, U.S.A.: The Open Notebook.
Carver, R. B. (2014). ‘Public communication from research institutes: is it science communication or public relations?’ JCOM 13 (03), C01. https://doi.org/10.22323/2.13030301 .
Dimopoulos, K. and Koulaidis, V. (2002). ‘The socio-epistemic constitution of science and technology in the Greek press: an analysis of its presentation’. Public Understanding of Science 11 (3), pp. 225–241. https://doi.org/10.1088/0963-6625/11/3/302 .
Dunwoody, S. (1999). ‘Scientists, journalists, and the meaning of uncertainty’. In: Communication uncertainty: media coverage of new and controversial science. Ed. by S. M. Friedman, S. Dunwoody and C. L. Rogers. Mahwah, NJ, U.S.A.: Lawrence Erlbaum Associates, pp. 59–79.
Einsiedel, E. F. (1992). ‘Framing science and technology in the Canadian press’. Public Understanding of Science 1 (1), pp. 89–101. https://doi.org/10.1088/0963-6625/1/1/011 .
Eurobarometer (2013). Special Eurobarometer 401. Responsible Research and Innovation (RRI), Science and Technology. Brussels, Belgium: TNS Opinion & Social on request of European Commission. URL: https://www.genderportal.eu/resources/special-eurobarometer-401-responsible-research-and-innovation-rri-science-and-technology .
Fahnestock, J. (1986). ‘Accommodating science: the rhetorical life of scientific facts’. Written Communication 3 (3), pp. 275–296. https://doi.org/10.1177/0741088386003003001 .
Fahy, D. and Nisbet, M. C. (2011). ‘The science journalist online: shifting roles and emerging practices’. Journalism 12 (7), pp. 778–793. https://doi.org/10.1177/1464884911412697 .
Glenski, M., Pennycuff, C. and Weninger, T. (2017). ‘Consumers and curators: browsing and voting patterns on Reddit’. IEEE Transactions on Computational Social Systems 4 (4), pp. 196–206. https://doi.org/10.1109/TCSS.2017.2742242 .
Grand, A., Wilkinson, C., Bultitude, K. and Winfield, A. F. T. (2016). ‘Mapping the hinterland: data issues in open science’. Public Understanding of Science 25 (1), pp. 88–103. https://doi.org/10.1177/0963662514530374 .
Gutierrez, I. (2017). ‘The future needs funding: the persistent struggles of digital science magazines’. WCSJ 2017 . URL: http://wcsj2017.org/future-needs-funding-persistent-struggles-digital-science-magazines/ (visited on 3rd November 2017).
Hara, N., Abbazio, J. and Perkins, K. (2019). ‘An emerging form of public engagement with science: Ask Me Anything (AMA) sessions on Reddit r/science’. PLoS ONE 14 (5), e0216789. https://doi.org/10.1371/journal.pone.0216789 .
Hijmans, E., Pleijter, A. and Wester, F. (2003). ‘Covering scientific research in Dutch newspapers’. Science Communication 25 (2), pp. 153–176. https://doi.org/10.1177/1075547003259559 .
Holmberg, K., Hedman, J., Bowman, T. D., Didegah, F. and Laakso, M. (2020). ‘Do articles in open access journals have more frequent altmetric activity than articles in subscription-based journals? An investigation of the research output of Finnish universities’. Scientometrics 122, pp. 645–659. https://doi.org/10.1007/s11192-019-03301-x .
Jensen, J. D. (2008). ‘Scientific uncertainty in news coverage of cancer research: effects of hedging on scientists’ and journalists’ credibility’. Human Communication Research 34 (3), pp. 347–369. https://doi.org/10.1111/j.1468-2958.2008.00324.x .
Knudsen, S. (2005). ‘Communicating novel and conventional scientific metaphors: a study of the development of the metaphor of genetic code’. Public Understanding of Science 14 (4), pp. 373–392. https://doi.org/10.1177/0963662505056613 .
Lai, D., Wang, D., Calvano, J., Raja, A. S. and He, S. (2020). ‘Addressing immediate public coronavirus (COVID-19) concerns through social media: utilizing Reddit’s AMA as a framework for public engagement with science’. PLoS ONE 15 (10), e0240326. https://doi.org/10.1371/journal.pone.0240326 .
Li, Y., Wu, C., Yan, E. and Li, K. (2018). ‘Will open access increase journal CiteScores? An empirical investigation over multiple disciplines’. PLoS ONE 13 (8), e0201885. https://doi.org/10.1371/journal.pone.0201885 .
Marcinkowski, F., Kohring, M., Fürst, S. and Friedrichsmeier, A. (2014). ‘Organizational influence on scientists’ efforts to go public: an empirical investigation’. Science Communication 36 (1), pp. 56–80. https://doi.org/10.1177/1075547013494022 .
Marsh, O. (2016). ‘“People seem to really enjoy the mix of humour and intelligence”: science humour in online settings’. JCOM 15 (02), C03. https://doi.org/10.22323/2.15020303.
Mehlenbacher, A. R. (2019). Science communication online: engaging experts and publics on the Internet. Columbus, OH, U.S.A.: The Ohio State University Press. https://doi.org/10.26818/9780814213988 .
Mellor, F. (2018). ‘Book review: The Oxford handbook of the science of science communication’. Public Understanding of Science 27 (6), pp. 750–752. https://doi.org/10.1177/0963662518779838 .
Moriarty, D. and Mehlenbacher, A. R. (2019). ‘The coaxing architecture of Reddit’s r/science: adopting ethos -assessment heuristics to evaluate science experts on the Internet’. Social Epistemology 33 (6), pp. 514–524. https://doi.org/10.1080/02691728.2019.1637964 .
National Science Board (2016). Science and engineering indicators 2016 . Arlington, VA, U.S.A.: National Science Foundation. URL: https://www.nsf.gov/statistics/2016/nsb20161/#/ .
Nisbet, M. C. and Goidel, R. K. (2007). ‘Understanding citizen perceptions of science controversy: bridging the ethnographic–survey research divide’. Public Understanding of Science 16 (4), pp. 421–440. https://doi.org/10.1177/0963662506065558 .
Palmquist, M. E., Carley, K. M. and Dale, T. A. (1997). ‘Applications of computer-aided text analysis: analyzing literary and nonliterary texts’. In: Text analysis for the social sciences: methods for drawing statistical inferences from texts and transcripts. Ed. by C. W. Roberts. Mahwah, NJ, U.S.A.: Lawrence Erlbaum Associates.
Perrault, S. (2013). Communicating popular science: from deficit to democracy. New York, NY, U.S.A.: Palgrave Macmillan.
Peters, H. P. (2013). ‘Gap between science and media revisited: scientists as public communicators’. Proceedings of the National Academy of Sciences 110 (Supplement 3), pp. 14102–14109. https://doi.org/10.1073/pnas.1212745110 .
Priest, S. (2013). ‘Critical science literacy: what citizens and journalists need to know to make sense of science’. Bulletin of Science, Technology & Society 33 (5–6), pp. 138–145. https://doi.org/10.1177/0270467614529707 .
Schäfer, M. S. (2017). ‘Wissenschaftskommunikation Online’. In: Forschungsfeld Wissenschaftskommunikation. Ed. by H. Bonfadelli, B. Fähnrich, C. Lüthje, J. Milde, M. Rhomberg and M. S. Schäfer. Wiesbaden, Germany: Springer VS, pp. 275–293. https://doi.org/10.1007/978-3-658-12898-2_15 .
Schäfer, M. S., Füchslin, T., Metag, J., Kristiansen, S. and Rauchfleisch, A. (2018). ‘The different audiences of science communication: a segmentation analysis of the Swiss population’s perceptions of science and their information and media use patterns’. Public Understanding of Science 27 (7), pp. 836–856. https://doi.org/10.1177/0963662517752886 .
Stocking, S. H. (2010). The New York Times reader: science & technology. Washington, D.C., U.S.A.: CQ Press.
Stocking, S. H. and Holstein, L. W. (2009). ‘Manufacturing doubt: journalists’ roles and the construction of ignorance in a scientific controversy’. Public Understanding of Science 18 (1), pp. 23–42. https://doi.org/10.1177/0963662507079373 .
Subreddit stats (2020). URL: https://subredditstats.com (visited on 14th October 2020).
Sumner, P., Vivian-Griffiths, S., Boivin, J., Williams, A., Venetis, C. A., Davies, A., Ogden, J., Whelan, L., Hughes, B., Dalton, B., Boy, F. and Chambers, C. D. (2014). ‘The association between exaggeration in health related science news and academic press releases: retrospective observational study’. BMJ 349, g7015. https://doi.org/10.1136/bmj.g7015 .
Sumner, P., Vivian-Griffiths, S., Boivin, J., Williams, A., Bott, L., Adams, R., Venetis, C. A., Whelan, L., Hughes, B. and Chambers, C. D. (2016). ‘Exaggerations and caveats in press releases and health-related science news’. PLoS ONE 11 (12), e0168217. https://doi.org/10.1371/journal.pone.0168217 .
The Writers of SciLance (2013). The science writers’ handbook: everything you need to know to pitch, publish, and prosper in the digital age. Ed. by T. C. Hayden and M. Nijhuis. Boston, MA, U.S.A.: Da Capo Lifelong Books.
United Nations Educational, Scientific and Cultural Organization (2017). Global Open Access Portal . URL: http://www.unesco.org/new/en/communication-and-information/portals-and-platforms/goap/open-science-movement/ (visited on 3rd January 2021).
Zavestoski, S., Shulman, S. and Schlosberg, D. (2006). ‘Democracy and the environment on the Internet: electronic citizen participation in regulatory rulemaking’. Science, Technology, & Human Values 31 (4), pp. 383–408. https://doi.org/10.1177/0162243906287543 .
Authors
Ehren Helmut Pflugfelder is an Associate Professor of Technical, Scientific, and Professional Writing at Oregon State University in the U.S., where he teaches courses in rhetoric, technical and science writing, and writing pedagogy. His research has appeared in journals such as The Journal of Business and Technical Communication , Technical Communication Quarterly , Communication Design Quarterly , and Rhetoric Society Quarterly , and he is the author of ‘Communicating mobility and technology: a material rhetoric for transportation’ (Routledge). E-mail: ehren.pflugfelder@oregonstate.edu .
Alexander Mahmou-Werndli holds a Master of Arts in Rhetoric and Writing from Oregon State University. He teaches composition and science writing as a lecturer at Al Akhawayn University in Ifrane, Morocco. His research focuses on writing instruction in the disciplines, scientific communication, and translingual publishing and pedagogies. E-mail: werndlia@oregonstate.edu .
Endnotes
1 Moriarty and Mehlenbacher [ 2019 ] consider r/science AMAs in greater detail, as do Hara, Abbazio and Perkins [ 2019 ] and Lai et al. [ 2020 ], both of whom focus on public engagement with science (PES).
2 This assumption has been incorporated into existing science writing textbooks, including: The craft of science writing, Carpenter; A field guide for science writers, Blum et al.; The science writers’ handbook, Hayden and Nijhuis; The New York Times reader: science and technology, Stocking; and Science journalism: an introduction, Angler.
3 These organizations include the American Association for the Advancement of Science, National Association of Science Writers, the Society of Environmental Journalists, the Association of Health Care Journalists, and the World Federation of Science Journalists, to name only a few.
4 See Bucchi and Trench, Routledge handbook of public communication of science and technology: second edition [ 2014a ].
5 As science journalists experience worsening working conditions [Schäfer, 2017 ], material can be generated by uncritically copying a press release into popular science publications [Fahy and Nisbet, 2011 ].
6 Throughout this article, we use the term “open access” to describe freely available original research articles — ones that can be found, downloaded, or viewed without additional login or payment. We use the term in its more casual reference, without distinguishing between gold, green, or hybrid open access status.
7 Heavily upvoted posts, for example, could be popular because of the comments, or because the topic is controversial for reasons which may have little to do with the genre of communication. By excluding the most popular posts, we hoped to avoid such outliers.
8 A study of COVID-19 discussions would be interesting, but we wanted to capture a broad representation of r/science discussions and not have the dataset overwhelmed by a single scientific topic.
9 With so few examples of paywalled articles in this study available for analysis, the statistical significance of these distinctions is better left for studies of larger datasets.
10 Previous research has shown that science journalism tends to avoid lengthy descriptions of methodology. See, for example, Einsiedel [ 1992 ], Dimopoulos and Koulaidis [ 2002 ], and Hijmans, Pleijter and Wester [ 2003 ].