1 Introduction
Research from Pew Research Center has found that U.S. adults’ trust in science and scientists has declined since the COVID-19 pandemic [Kennedy & Tyson, 2023]. Specifically, the proportion of people who distrust scientists to act in the public’s best interests rose from 13% in January 2019 to 27% in October 2023, and this rise in distrust is more prominent among Americans who identify with the Republican party than with the Democratic party [Kennedy & Tyson, 2023]. This gap between Republicans and Democrats can be explained by the politicization of the COVID-19 pandemic in media [Hart et al., 2020; Schmidt, 2023]. In television and print news, for example, scholars found that pandemic coverage frequently featured politicians and political actions rather than scientists, which might have contributed to its politicization [Hart et al., 2020; Schmidt, 2023]. On social media, politicians’ discourse regarding the pandemic, face masks, and vaccines was often political rather than medical and may have contributed to the political polarization of the pandemic [Zhou et al., 2024].
While studies have found that trust in science has declined among U.S. adults as a whole [Cox et al., 2023; Kennedy & Tyson, 2023], this decline may have affected young adults in particular. According to the “impressionable years” hypothesis, the social and historical environment in which a young adult becomes a member of society — the late teenage and early adulthood years — shapes their attitudes, beliefs, and worldviews [see Krosnick & Alwin, 1989]. Given the politicization of the COVID-19 pandemic, moving out of adolescence and into adulthood during this period of uncertainty regarding science may have influenced how young adults view science, scientists, and the government’s role in science. These beliefs, established during the formative years, may hold steady into adulthood [Krosnick & Alwin, 1989]. While some work has begun to address how exposure to the COVID-19 pandemic and other events has affected young adults’ trust in science [see Aksoy et al., 2020, 2022; Eichengreen et al., 2021; González & Simes, 2023], these studies have not considered the personal perspectives and lived experiences of young people during the pandemic.
To address this gap, this study sought to gain an understanding of young adults’ trust in science in light of the COVID-19 pandemic, exploring their trust in science principles and methods, science institutions, and scientists. This study also explored how young adults came to trust COVID-19-related (mis)information on social media. To fulfill these purposes, this study employed five focus group interviews with college students at Texas Tech University.
2 Trust in science
Trust is the willingness of one party (the trustor) to be vulnerable to another party (the trustee), despite the inherent risks involved [Mayer et al., 1995]. For trust to be established, the trustor must believe that the trustee possesses the necessary expertise, acts with benevolence, and demonstrates integrity [Mayer et al., 1995]. In other words, the trustor must perceive the trustee as credible due to their expertise, as showing goodwill, and as adhering to principles that the trustor finds acceptable [Mayer et al., 1995].
Due to the complexity of science, the public must rely on the scientific community for guidance on intricate topics, such as a global pandemic [Hendriks et al., 2016]. Scholars have explored trust in science at various levels: macro (science as a system), meso (science organizations and institutions), and micro (individual scientists) [Reif & Guenther, 2021]. Trust can vary across these levels; Achterberg and colleagues [2017] identified a ‘science confidence gap,’ where some individuals may trust scientific principles and methods but are less trusting of individual scientists and scientific institutions.
Recent research has found a decline in trust in scientific principles, scientists, and scientific institutions among U.S. adults over the past five years [Lupia et al., 2024]. People are increasingly skeptical about scientists’ ability to control their personal biases and those of their funding sources [Lupia et al., 2024]. Survey data revealed that 30% of respondents doubted scientists could manage their personal and political biases in research [Lupia et al., 2024]. Data also showed that 28% of participants believed scientists could not mitigate these biases when communicating COVID-19 information to the public [Lupia et al., 2024].
Though trust in science principles and methods, scientists, and science institutions amidst the COVID-19 pandemic has been examined in previous research [e.g., Kennedy & Tyson, 2023; Lupia et al., 2024], there is a gap in understanding how young adults’ lived experiences during the COVID-19 pandemic may have impacted their trust in science. To address this gap, this study sought to answer the following research questions:
RQ1: How do focus group participants view scientists, science institutions, and scientific principles and methods amidst the COVID-19 pandemic?

RQ2: How do focus group participants believe the COVID-19 pandemic impacted their trust in science?
3 Trust in science (mis)information
Epistemic trust refers to the confidence that the knowledge that scientists produce is true and accurate [Hendriks et al., 2016; Wilholt, 2013]. Epistemic trust consists of a default trust and a vigilant trust [Origgi, 2012]. Default trust requires the dependent party to have a general willingness to trust others and the information they produce [Origgi, 2012]. That is, they are not blatantly opposed to placing trust in others [Origgi, 2012]. On a deeper level, vigilant trust requires a closer evaluation of the message arguments and the messenger before accepting those arguments, so as to avoid being misinformed [Sperber et al., 2010]. To extend epistemic trust to science information, the message receiver must evaluate the source’s expertise, integrity, and benevolence in order to determine whether or not it is a trustworthy source whose communicated information they should adopt as their own [Cummings, 2014; Hendriks et al., 2015; Peters et al., 1997]. Closely related to epistemic trust are epistemic beliefs — individual beliefs about how one assesses the accuracy of a message [Hofer, 2001]. Garrett and Weeks [2017] identified three main types of epistemic beliefs which people may use to come to trust information: faith in intuition, truth is political, and need for evidence.
Faith in intuition involves relying on subjective reasoning or a “gut feeling” to determine the truth of a claim, often at the expense of available evidence [Garrett & Weeks, 2017]. This reliance can lead to biases influenced by personal attitudes and may result in misperceptions and unwarranted distrust of objectively factual information. For example, an individual who encounters a post about the importance of wearing masks may dismiss it based on their own differing experiences, despite the post being supported by scientific evidence. Empirical research has found that those who have high levels of faith in intuition tend to be more susceptible to misinformation regarding health and rely on alternative health media outlets and social media for information [Wu et al., 2022]. They are also more likely to rely on politically-biased media sources [Butterfuss et al., 2020].

Truth is political describes the tendency of individuals to dismiss established evidence in favor of their partisan affiliations [Garrett & Weeks, 2017]. Those who hold this belief view facts, including scientific data, as socially and politically constructed and perceive the validity of these data as dependent on political context. For instance, a person high in the belief that truth is political might reject a post about mask-wearing simply because it contradicts their political party’s stance. Garrett and Weeks [2017] argue that individuals who perceive truth as a political construct are more likely to adopt conspiracy beliefs. Indeed, empirical research has shown that belief that truth is political is positively associated with COVID-19 conspiracy theory belief and negatively associated with COVID-19 preventative behaviors such as mask-wearing and receiving a vaccine [Rudloff et al., 2022].

In contrast to the other epistemic beliefs, those who prioritize need for evidence rely on logic and empirical data to form their judgment about a message and ensure that their beliefs about an issue are consistent with the evidence available [Garrett & Weeks, 2017]. Using the mask example again, someone with a higher need for evidence would be more inclined to accept a post that provides empirical data or statistics supporting the claim. Such individuals generally hold fewer misperceptions regarding science [Garrett & Weeks, 2017]. Research regarding COVID-19 found that those who have high need for evidence beliefs are less likely to believe COVID-19 conspiracy theories and more likely to practice COVID-19 preventative behaviors [Rudloff et al., 2022].
However, being high in need for evidence does not necessarily make one immune to misinformation. For example, some of those who believe in conspiracy theories have been known to use scientific evidence — albeit flawed — to support their claims [e.g., Flat Earthers; Olshansky et al., 2020]. Indeed, research regarding COVID-19 misinformation has found that those who had more positive attitudes toward ‘doing your own research’ also believed more misinformation about COVID-19 [Chinn & Hasell, 2023]. Calls to ‘do your own research’ encourage “individuals to seek additional or alternative sources of information, verify facts, and examine evidence to make informed decisions that best suit one’s individual circumstances” [Chinn & Hasell, 2023, p. 2]. However, the evidence people find while doing their own research may not be accurate or may not come from the soundest sources [Chinn & Hasell, 2023]. Furthermore, in ‘doing their own research,’ individuals may gravitate toward data that align with their pre-existing beliefs [Nickerson, 1998], and although they may believe they are accurately interpreting scientific evidence, laypersons often overestimate their abilities to do so [Atir et al., 2015] and may not be qualified to make such judgments [Ballantyne & Dunning, 2022].
Those who have a high need for evidence but lack the scientific knowledge and training to assess the accuracy of science information may have come across flawed scientific studies during the pandemic, such as one titled “The Föegen Effect: A Mechanism by Which Facemasks Contribute to the COVID-19 Case Fatality Rate” [Fögen, 2022]. This peer-reviewed research article faced criticism for its methodological errors and bold assertions based on spurious correlations [Science Feedback, 2022]. The study claimed that “mask mandates actually caused about 1.5 times the number of deaths or 50% more deaths compared to no mask mandates.” Conservative news outlets and health commentators cited this study to argue against mask-wearing during the pandemic [e.g., High Intensity Health, 2022; Morefield, 2022; The Washington Times, 2022]. Readers who were high in need for evidence but unable to spot the study’s flaws may have been particularly accepting of it as valid evidence, especially if they were already skeptical of the effectiveness of facemasks and distrustful of government science institutions.
As COVID-19 misinformation was widely available on social media [Bridgman et al., 2020] and young adults are heavy users of social media [Perrin & Anderson, 2019], it is important to know how young adults in particular react to misinformation in these environments and what epistemic beliefs they use to assess the accuracy of science information online. Thus, this study sought to answer the following research question:
RQ3: How do focus group participants come to trust the accuracy of science information?
4 Method
To address the research questions, five focus group interviews were conducted at Texas Tech University. Focus groups are effective for exploring participant attitudes, motivations, and beliefs, while also providing insights into group dynamics [Brennen, 2017]. In this study, the focus group setting facilitated a deeper understanding of the generational cohort of college students at Texas Tech University and their collective experiences. Unlike individual interviews or surveys, focus groups allowed participants to share and reflect on shared experiences during the pandemic, such as their initial reactions, experiences with COVID-19, and encounters with (mis)information.
Two pilot focus group interviews were conducted to test the moderator’s guide and the management of group dynamics. These sessions took place in mid to late October 2022, lasted about an hour and a half each, and included four to five participants aged 18–24. Following the pilot focus group interviews, the moderator’s guide was revised. The main study then comprised five focus groups with four to five participants each, aged 18–27, conducted from early November 2022 to mid-February 2023.
4.1 Participants
As this study sought to understand how the COVID-19 pandemic has shaped the way young people view science, the target population for this study consisted of college students at Texas Tech University who were between the ages of 16 and 25 during the COVID-19 pandemic. According to Krosnick and Alwin [1989], these ages are the prime impressionable years. Participants were recruited through the SONA study registration system and received extra credit for their involvement. A total of 22 students were recruited across five focus groups.
The sample included a diverse range of majors in STEM and non-STEM fields, with participants representing areas such as biology, human sciences, fashion design, interior design, and advertising. Most participants identified as women (n = 14) and were primarily from Texas. Pseudonyms were assigned to participants in the transcripts to protect their identities. While college students were the most accessible population for this qualitative and exploratory study, this choice is a limitation. Future research should aim to include participants from broader demographics and educational backgrounds. Table 1 provides a summary of participants’ age, gender, and academic major.
4.2 Procedure
The research questions informed the focus group protocol, which was divided into three main sections: Perceptions of Science, Science Communication, and COVID-19’s Impact. Each session included opening, closing, and transitional questions to ensure smooth topic transitions. Focus groups lasted between 58 minutes and 1 hour and 20 minutes and were held in a room designed specifically for focus groups at the Center for Communication Research at Texas Tech University. Audio and video recordings of the focus groups were made. This study was conducted with the approval of the Texas Tech University Institutional Review Board (IRB2022-817). All questions asked can be found in the supplementary material.
In the Perceptions of Science section, participants described the scientific method, their views on how scientific knowledge and innovations — such as vaccines — are developed, and the government’s role in science. They were also asked to draw a picture of the scientific process.
The Science Communication section prompted participants to discuss how they assess the accuracy of scientific information. In pilot groups, participants struggled to recall instances of encountering COVID-19 science on social media. To address this, the main focus groups were presented with a specific tweet from June 2022 by TownHall.com, which claimed that mask mandates contributed to COVID-19 deaths (see Figure 1). This post was selected due to its verified status and the political bias associated with the source. The study cited in the article — “The Föegen Effect,” which was published in a peer-reviewed journal — faced criticism for its methodological shortcomings [see Fögen, 2022; Science Feedback, 2022]. Participants discussed their thoughts on the post’s accuracy and were encouraged to recall other COVID-19 posts they had encountered online, as well as how social media influenced their views of the pandemic.
In the COVID-19’s Impact section, participants reflected on how the pandemic has shaped their understanding of science and the scientific process. They were also asked whether they believe science is politicized and how growing up during a time of polarized scientific discourse has affected their own trust in science and that of their generation as a whole.
4.3 Data analysis
Transcripts of the focus group interviews were obtained using the transcription feature in Adobe Premiere Pro and then manually cleaned by the researcher. All identifiers were removed from the transcripts before transferring them to MaxQDA for coding. While preparing the data, the lead researcher familiarized themself with the material, took notes, and developed ideas for codes and themes. Data were analyzed using a predominantly inductive method, identifying themes most relevant to each research question [Thomas, 2006]. The coding procedure outlined by Thomas [2006] was used; this procedure involved coding many themes and then collapsing them to reduce overlap and redundancy. This inductive coding approach enhances the trustworthiness of the findings, as “the findings arise directly from the analysis of the raw data” rather than “a set of expectations about specific findings” [Thomas, 2006, p. 239].
4.4 Reflexivity statement
Because the researcher is the instrument in qualitative research, personal biases can shape judgements and lead to mistakes [Brennen, 2017]. Readers should therefore keep the following in mind when interpreting the findings of this study: the researcher who moderated the focus groups and analyzed the data was within the study’s target age group at the time of data collection and analysis and was a young adult during the COVID-19 pandemic. Though these influences may have shaped the interpretation of the data, the researcher’s role was simply to guide the focus groups and gain a better understanding of the participants’ thoughts on science and scientific information online in light of the COVID-19 pandemic.
5 Findings
5.1 Views on science
Views on science methods and principles. Participants largely expressed trust in science methods and principles. Many participants described the scientific method as an educated guessing game in which hypotheses are formed based on existing knowledge and then rigorously tested.
So sometimes it’s a lucky guess… It’s a lot of testing. Like the flu vaccine every year is just guessing which strain is going to be the most prevalent that year from different countries. (Ginger, Focus Group 2)
Others in her focus group agreed, including Toby, Bonnie, and Amy, who said science consists of “trial and error.” Participants viewed this process positively, seeing it as an assurance that science self-corrects and constantly refines its understanding of phenomena. Josh (Focus Group 1) used the discovery of germs as an example and said that science uncovers “new things that you didn’t take into consideration [before] that change what is already there.”
Participants also cited the peer review process as a vital principle of science and a critical mechanism for maintaining quality and accuracy. They felt that peer review provided a layer of accountability, as it ensures that scientific findings are scrutinized by experts before being accepted or disseminated to the public. Annabeth (Focus Group 4) said, “It has to be, like, repeatable and challenged by other people.” Along the same lines, another member of her focus group, Will (Focus Group 4), said, “It has to be challenged. Like other professionals have to look it over and be like, ‘Get your math right or your processing right and then you get published.”’ Will went on to say, “[Questioning and peer review] is a critical part of it — challenging it, asking questions, being able to answer those questions.”
Views on science institutions. When discussing trust in science institutions, focus group participants were largely trusting of organizations such as the Centers for Disease Control and Prevention (CDC) and the Food and Drug Administration (FDA) as sources of science information. However, they were wary of the government’s interference with science. When asked how they knew whether or not science information online was accurate, Carol (Focus Group 5) said, “I think definitely anything CDC. […] I wasn’t really listening to politicians because like, obviously it was like an election year and COVID, so that [COVID] was just a pawn.” Participants expressed that the government should focus on providing funding and resources for developing technologies, such as vaccines, and on establishing regulatory guidelines to ensure public safety. However, they were uncomfortable with government intervention beyond these roles. Carol (Focus Group 5) said, “I feel like they should not have a role in probably, like, the scientific stuff because they’re most likely not scientists.” When asked what the government’s role should be in science, Ginger (Focus Group 2) said:
I feel like if you don’t know anything about science, you shouldn’t be able to speak on it. And I know that’s a hot take right there. Like, you have a lot of power to do a lot of bad if you don’t know what you’re talking about and there are people that do. So why do they [politicians] get to be the final judge? (Ginger, Focus Group 2)
The role of the government to communicate factual and accurate information about new technologies like vaccines to the public was mentioned in four of the five focus groups. Margaret (Focus Group 1) said, “[The government’s job is] to spread that information without bias to the public. Like, ‘Hey, this vaccine is safe to use. Go at it.”’ Additionally, participants saw the government as responsible for combating misinformation by disseminating science-based information to the public and counteracting the spread of misinformation:
I think the government has a big role in, like, vetting the information that, like, is being put out into the public, to make sure that it’s — I don’t know — like, actually scientifically-based and not just, like, random opinions being thrown out and being taken as truth because that causes chaos. (Donna, Focus Group 1)
Views on scientists. Participants expressed a strong reliance on scientists for accurate medical advice during the pandemic. Many mentioned that they trusted parents, extended family members, and friends who worked in healthcare as credible sources of information about COVID-19. Carol (Focus Group 5), for example, mentioned that three of her friends on Instagram who were nurses during the pandemic urged their followers to take the pandemic seriously, using anecdotes from their real lives. However, participants also mentioned that many unqualified professionals or non-professionals spread misinformation on social media during the pandemic, though some of these people had good intentions. For instance, Will (Focus Group 4) said:
I think there are a lot of people trying to spread information in, like — they were meaning to be helpful. They were meaning to communicate information about public health, but they’re not qualified to do that. […] Just because you read a scientific paper doesn’t mean you actually understand what’s in it. And so I think there were a lot of people trying to and making a good effort of trying to spread good information but not being qualified to do so. (Will, Focus Group 4)
Participants also differentiated between types of scientists, expressing that there are both ‘good’ and ‘bad’ scientists. Three participants from Focus Group 4, Sheila, Will, and Annabeth, discussed seeing medical professionals online who had less-than-good intentions when discussing research regarding the pandemic:
They would have these, like, livestreams on Facebook where they would talk about the research and how it’s [factual information] not true… And then you’d look into them, and it’s like, ‘Okay, you’re a chiropractor,’ or ‘You’re a dentist,’ or whatever other kind of doctor. (Will, Focus Group 4)
Participants also noted concerns about the politicization of science, suggesting that some scientists may mislead the public under the guise of providing scientific insight that supports political parties’ beliefs. Josh (Focus Group 1) noted that he would “look at any past studies he [the scientist] may have done and just like his professional background and see if he has any pre-existing biases or anything” when checking the accuracy of a scientific study posted online. Donna (Focus Group 1) said:
It [science] can be, like, twisted. Like, people can just make it say whatever they want, slap a study onto it and make people believe it and cause mass panic. But at the other end, like, it can be used for good. (Donna, Focus Group 1)
5.2 COVID-19’s impact on trust in science
Changing trust in science. While participants expressed gratitude for science and held trust in scientific principles and methods, many reported a newfound skepticism that emerged during the COVID-19 pandemic. Throughout the focus groups, participants described themselves as “skeptical” and “cautious” and noted that they approached scientific information “with a grain of salt.” Will (Focus Group 4) said, “I think I have gotten more critical, maybe to the point that I’m too critical of things.” Similarly, Amy (Focus Group 1) said, “I’m way more hesitant now, which is probably not a good thing, but I don’t know.” Josh (Focus Group 1) said:
I also think it also made me realize that just that science itself doesn’t hold as much weight now. Like a scientific study will pretty much always be out beaten online by like some dude who just says the opposite, you know, without much information. (Josh, Focus Group 1)
Toby (Focus Group 2) said that while he considered himself more of a cynic to begin with, he now feels even more cynical regarding science, “[It made me] more skeptical about what people are saying out there… that [the pandemic] made me even more cynical about not trusting stuff I see.”
However, they noted that this may be a healthy skepticism. Carol (Focus Group 5) said the pandemic “gave [her] a healthy dose of wariness about it [science] and the importance of research.” A participant from Focus Group 4, Will, talked about the importance of healthy skepticism — or a vigilant trust — regarding science and noted:
I think Democrats might even be to the point where you can’t question science, which when we were writing down our scientific processes, that [questioning and peer review] is a critical part of it — challenging it, asking questions, being able to answer those questions. (Will, Focus Group 4)
Politicization of science. Throughout the focus groups, participants discussed the intertwining of COVID-19 science and politics. Some participants even observed that individuals’ attitudes toward science tend to coincide with their political party. For example, Margaret (Focus Group 1) said, “If you believe one way on a scientific thing, then you’re conservative or you’re liberal.” Similarly, Will (Focus Group 4) said, “I feel like the current stereotype is that Republicans are anti-science and Democrats are pro-science.” Participants recognized that the pandemic’s occurrence during a presidential election year in the U.S. heightened the politicization of COVID-19 science. For example, Amy (Focus Group 1) and Deborah (Focus Group 5) said:
That was probably the worst thing is that it [2020] was an election year. […] I mean if Trump said something [about COVID-19], whether it was true or not, you’re still going to have your people that agree with him or disagree. (Amy, Focus Group 1)
I think science is really, like, neutral and just, like, seeking answers. But especially the timing of it all. Like the election year pandemic […]. It was so easy to just grab something that you supported or if you saw a politician supporting it or like endorsing something, you’re like, ‘Well, I’m on this team.’ Like, the camps were so clear. And so I think it was easy to politicize something that normally isn’t. (Deborah, Focus Group 5)
Many focus group participants attributed their skepticism of science to the perceived intervening role of the government and the politicization of scientific discourse during the COVID-19 pandemic.
[COVID-19 science] was politicized so much and that’s just very hard to go forward with because you’re like, ‘Okay, if it comes in another election year and another thing happens, like what can I believe?’ You know? So, I think it gave me a little bit of mistrust. (Carol, Focus Group 5)
5.3 Assessing the accuracy of science information
Need for evidence. When viewing the Townhall post about the study regarding The Föegen Effect and thinking about other science-based COVID-19 information online, participants said that they predominantly relied on evidence to determine the accuracy of information regarding COVID-19. They emphasized the importance of seeking proof, such as “statistics” and “data,” and expressed a desire to engage in fact-checking when the topic was of interest. Toby (Focus Group 2) said, “It really just depends on what’s in the study. Like, if there’s actual statistics, you know, but if it’s just an opinion piece then…” On a similar note, Josh (Focus Group 1) said:
As long as I can see, like, the evidence that you put into it, then I’m more trusting. But if you’re just, like, saying something just for the sake of saying something or not even, like, providing the source, then I’m not prone to take you seriously. (Josh, Focus Group 1)
Participants said they actively sought to verify the claims made in a study by searching for corroborating evidence before deciding whether or not to trust the information. Margaret (Focus Group 1), for example, said that she would:
Probably see where they got their source from. But honestly, I would probably just look that [claim] up and then see if it’s like just one person, you know what I mean? If it’s like there’s other studies like backing up or like. So yeah, because if there’s 100 different studies proving it wrong, then it’s like, okay, obviously I’m not going to believe that. (Margaret, Focus Group 1)
Many participants also argued that others should use evidence and fact-check information they see online before coming to a conclusion. For example, Carol (Focus Group 5) said that people need to “do [their] due diligence and read this study along with the studies on both sides to, like, see what actual information is being put out there that was researched.”
Faith in intuition and truth is political. Participants expressed a sense of confidence in their own ability to use evidence to assess the accuracy of scientific studies. However, they were concerned that others, particularly older individuals, relied more heavily on the other epistemic beliefs — faith in intuition and truth is political — rather than need for evidence when coming across COVID-19 science information online. Amy (Focus Group 1), for example, said:
I think people just thought, like, ‘I’m going to go with my gut and do what I think is best and that’s it.’ […] which, I mean, a lot of people have always thought that way, but like, I feel like it became very increased. Like, ‘I’m going to believe what I want to believe about science.’ (Amy, Focus Group 1)
Participants said they believed the older generation may have been more prone to believing in misinformation and conspiracy theories that arose regarding the COVID-19 pandemic than their own generation. Bonnie (Focus Group 2), for example, said, “I feel like the younger generation was maybe more skeptical about what you see online. And then at least for me, the older generation was maybe a little more believing into events that seemed a little silly.”
Others in Bonnie’s focus group agreed. Abbey (Focus Group 2) said that her mother, who was usually very practical, got caught up in political conspiracy theories regarding COVID-19 online. Other focus groups also noted that the older generation was more apt to believe misinformation online. For example, Josh (Focus Group 1) said, “I think that has to do with our already existing biases — for older people predominantly anyway.” Similarly, Sam (Focus Group 1) said, “A lot of the older generation, they already have their own, you know, biases and beliefs and so they’ll obviously follow those.”
Some participants noted that, while they recognized that their parents may have been more likely to believe misinformation and to use political heuristics to assess the accuracy of science information, they still went along with their parents’ beliefs regarding COVID-19 prevention. For example, Sheila (Focus Group 4) mentioned that she had friends who chose not to get vaccinated or wear masks to protect themselves and others from COVID-19. She said, “They’re like, ‘Well, you’re my parents, so I’m going to follow what you say because I think you are very certain this is the right way.”’
6 Discussion
This study sought to gain an understanding of how young adults view science principles and methods, science institutions, and scientists, and how they come to trust science information online in light of the COVID-19 pandemic. To fulfill these purposes, five focus groups were conducted with college students at Texas Tech University from November 2022 to February 2023. The findings indicate that while the focus group participants were largely trusting of science and science institutions, they were wary of government and politician interference in science, as well as of scientists who lack the integrity and benevolence to act in the best interests of the public. This study also found that the focus group participants reported using evidence and fact-checking to assess the accuracy of science information they found online, whereas they perceived members of older generations (i.e., their parents) as using political heuristics to form their judgements. These results and their place in the trust in science literature are discussed below.
6.1 Trust in science principles and methods, science institutions, and scientists
Research Questions One and Two asked how focus group participants viewed science principles and methods, science institutions, and scientists, and how they believed the pandemic influenced their trust. This study found that while participants were largely trusting of science principles, methods, and institutions, they were wary of the intermingling of science and politics during the pandemic. Participants noted that the 2020 U.S. Presidential Election, which coincided with the pandemic, amplified the politicization of COVID-19. Participants indicated that this politicization contributed to their growing skepticism of science and of the role of politics and the government in science. Participants also expressed concerns regarding scientists’ ability to remain politically impartial in conducting and disseminating scientific research, noting that they considered a scientist’s potential political bias when evaluating the credibility and trustworthiness of their work. Despite this skepticism of politics in science, participants were largely trusting of government science institutions such as the CDC and FDA, regarding them as sources unbiased by politics. They also expressed that the government should continue to fund science but encouraged limited government involvement beyond that.
These findings align with prior research showing that the politicization of science in media can foster science skepticism [Bolsen et al., 2014; Bolsen & Druckman, 2015; Schmid-Petri et al., 2022; van der Linden, 2015]. For example, Bolsen and colleagues [2014] found that the politicization of science can promote uncertainty about what can and cannot be trusted in science, which echoes the experiences shared by the participants in the current study. The findings also align with those of Lupia et al. [2024], who found that people have grown concerned about scientists’ ability to mute their political biases when presenting science information, reinforcing the themes identified in this study.
Future studies should continue to monitor trust in science among this generational cohort specifically to better understand how the COVID-19 pandemic shapes their trust in science over time and how this may affect their views on future science issues. According to the “impressionable years” hypothesis, the political events that take place during the emerging adulthood years may influence political beliefs in later adulthood [Krosnick & Alwin, 1989]. Additionally, future studies should examine how disclosure of a scientist’s funding sources or affiliations influences an individual’s trust in the scientist and their findings.
6.2 Assessing the accuracy of science information
Research Question Three asked how participants came to trust the accuracy of science information online, particularly in the context of the COVID-19 pandemic. Participants said they were more skeptical of science information and needed more evidence to come to a conclusion about its accuracy in light of the COVID-19 pandemic. Participants in this study claimed to use the epistemic belief need for evidence to assess the accuracy of science information online rather than other epistemic beliefs — faith in intuition and truth is political. Their self-reported reliance on need for evidence suggests that participants seemed to show signs of vigilant trust, which helps protect against the threat of being misinformed [Sperber et al., 2010]. This may help protect them moving forward, as those who have a high need for evidence tend to hold fewer conspiracy theory beliefs [Rudloff et al., 2022] and hold fewer misperceptions regarding science [Garrett & Weeks, 2017].
Participants attributed the other epistemic beliefs — faith in intuition and truth is political — to older generations, namely their parents. However, research has also found that older people believe that those younger than them are more susceptible to misinformation [see Martínez-Costa et al., 2022]. Though social media users over the age of 50 do tend to share more misinformation on social media than young adults [Grinberg et al., 2019; Guess et al., 2019], there is a possible third-person perception at work which should be studied in future research. Future studies should examine age as a factor in research regarding epistemic beliefs and science information to assess whether different generational cohorts use different epistemic beliefs to assess the accuracy of science information online and how this affects their perceptions of science issues. Future studies could also examine this possible third-person perception to better understand how this age bias influences evaluations of science information online.
Participants viewed organizations such as the CDC and FDA as trustworthy sources of science information. While these are government-run institutions, participants viewed them as reputable sources that mute political biases and simply state facts, as opposed to scientists and politicians, whom they saw as less reputable sources. This finding shows that participants placed trust in these science institutions instead of relying on their own, largely untrained, understanding of science [Atir et al., 2015]. Future studies should examine the nuances of trust in science institutions to better understand what characteristics make some organizations more trustworthy than others and to compare trust in science institutions with trust in the scientists who might work for these institutions.
6.3 Limitations
Like all studies, this study has limitations which should be addressed. The first concerns the study participants. Participants in this study were college students, and while student samples are often used by social science researchers in academia due to their accessibility [Meltzer et al., 2012], college-educated individuals tend to have higher levels of trust in science generally [Kennedy & Tyson, 2023]. The location of this study and the residency of the participants are also a limitation. Texas is considered a red state in the U.S., meaning the majority of its government leaders and citizens identify with the Republican Party. As previously noted, trust in science during the COVID-19 pandemic declined especially among those who identified as Republican. Data on political party and ideology were not collected, which is itself a limitation given the importance of political ideology to trust in COVID-19 science; it is therefore important to note the geographical and sociopolitical environment of the participants when interpreting the results of this study.
There are also limitations regarding the design of the study and the questions asked. Participants were not directly asked about their trust in science; rather, they were asked what their views of science were. While many participants discussed trust even though they were asked about their views, this remains a limitation. This study also utilized focus group interviews. While the intention behind this method was to allow participants to reflect on their shared experiences during the pandemic, participants’ answers may have been influenced by their desire to be accepted by the other participants in their group and the moderator. For example, they may have indicated holding more positive views of science because they believed those were the views of the other participants or the moderator. Participants may also have said they used evidence to assess the accuracy of science information because of this social desirability; that is, it is possible that they believed it was more socially desirable to use evidence rather than political heuristics to evaluate the accuracy of science information online. Likewise, while they all seemed to discount the information in the example study shown in the focus groups as misinformation, this may not have been the case in practice. Another limitation is the lack of generalizability this study affords, as it was qualitative in nature. However, the goal of this study was not to provide generalizable evidence but to provide a nuanced understanding of the lived experiences of these individuals and how their experiences as emerging adults during the COVID-19 pandemic may have influenced their trust in science.
Acknowledgments
Thank you to Dr. Amy Koerber for her guidance on this project.
References
Achterberg, P., de Koster, W., & van der Waal, J. (2017). A science confidence gap: education, trust in scientific methods and trust in scientific institutions in the United States, 2014. Public Understanding of Science, 26, 704–720. https://doi.org/10.1177/0963662515617367
Aksoy, C., Eichengreen, B., & Saka, O. (2020). Young people trust governments less after exposure to an epidemic. LSE COVID 19 Blog. https://blogs.lse.ac.uk/covid19/2020/06/22/young-people-trust-governments-less-after-exposure-to-an-epidemic/
Aksoy, C., Eichengreen, B., & Saka, O. (2022). COVID-19 and trust among the young. Finance & Development, 0059, 1. https://doi.org/10.5089/9781513597300.022
Atir, S., Rosenzweig, E., & Dunning, D. (2015). When knowledge knows no bounds: self-perceived expertise predicts claims of impossible knowledge. Psychological Science, 26, 1295–1303. https://doi.org/10.1177/0956797615588195
Ballantyne, N., & Dunning, D. (2022). Skeptics say, ‘Do your own research.’ It’s not that simple. The New York Times. https://www.nytimes.com/2022/01/03/opinion/dyor-do-your-own-research.html
Bolsen, T., & Druckman, J. N. (2015). Counteracting the politicization of science: counteracting the politicization of science. Journal of Communication, 65, 745–769. https://doi.org/10.1111/jcom.12171
Bolsen, T., Druckman, J. N., & Cook, F. L. (2014). How frames can undermine support for scientific adaptations: politicization and the status-quo bias. Public Opinion Quarterly, 78, 1–26. https://doi.org/10.1093/poq/nft044
Brennen, B. S. (2017). Qualitative research methods for media studies. Routledge.
Bridgman, A., Merkley, E., Loewen, P. J., Owen, T., Ruths, D., Teichmann, L., & Zhilin, O. (2020). The causes and consequences of COVID-19 misperceptions: understanding the role of news and social media. Harvard Kennedy School Misinformation Review, 1, 1–18. https://doi.org/10.37016/mr-2020-028
Butterfuss, R., Aubele, J., & Kendeou, P. (2020). Hedged language and partisan media influence belief in science claims. Science Communication, 42, 147–171. https://doi.org/10.1177/1075547020908598
Chinn, S., & Hasell, A. (2023). Support for “doing your own research” is associated with COVID-19 misperceptions and scientific mistrust. Harvard Kennedy School Misinformation Review, 4, 1–15. https://doi.org/10.37016/mr-2020-117
Cox, D. A., Mills, M. A., Banks, I. R., Hammond, K. E., & Gray, K. (2023). America’s crisis of confidence: rising mistrust, conspiracies, and vaccine hesitancy after COVID-19. Survey Center on American Life. https://www.americansurveycenter.org/research/americas-crisis-of-confidence-rising-mistrust-conspiracies-and-vaccine-hesitancy-after-covid-19/
Cummings, L. (2014). The “trust” heuristic: arguments from authority in public health. Health Communication, 29, 1043–1056. https://doi.org/10.1080/10410236.2013.831685
Eichengreen, B., Aksoy, C. G., & Saka, O. (2021). Revenge of the experts: will COVID-19 renew or diminish public trust in science? Journal of Public Economics, 193, 104343. https://doi.org/10.1016/j.jpubeco.2020.104343
Fögen, Z. (2022). The Fögen effect: a mechanism by which facemasks contribute to the COVID-19 case fatality rate. Medicine, 101, e28924. https://doi.org/10.1097/md.0000000000028924
Garrett, R. K., & Weeks, B. E. (2017). Epistemic beliefs’ role in promoting misperceptions and conspiracist ideation (S. Lozano, Ed.). PLOS ONE, 12, e0184733. https://doi.org/10.1371/journal.pone.0184733
González, F. A. I., & Simes, H. (2023). Impressionable years and trust in institutions: microeconomic evidence from Argentina. Studies in Microeconomics. https://doi.org/10.1177/23210222231189065
Grinberg, N., Joseph, K., Friedland, L., Swire-Thompson, B., & Lazer, D. (2019). Fake news on Twitter during the 2016 U.S. presidential election. Science, 363, 374–378. https://doi.org/10.1126/science.aau2706
Guess, A., Nagler, J., & Tucker, J. (2019). Less than you think: prevalence and predictors of fake news dissemination on Facebook. Science Advances, 5. https://doi.org/10.1126/sciadv.aau4586
Hart, P. S., Chinn, S., & Soroka, S. (2020). Politicization and polarization in COVID-19 news coverage. Science Communication, 42, 679–697. https://doi.org/10.1177/1075547020950735
Hendriks, F., Kienhues, D., & Bromme, R. (2015). Measuring laypeople’s trust in experts in a digital age: the Muenster Epistemic Trustworthiness Inventory (METI) (J. M. Wicherts, Ed.). PLOS ONE, 10, e0139309. https://doi.org/10.1371/journal.pone.0139309
Hendriks, F., Kienhues, D., & Bromme, R. (2016). Trust in science and the science of trust. In Trust and communication in a digitized world (pp. 143–159). Springer International Publishing. https://doi.org/10.1007/978-3-319-28059-2_8
High Intensity Health. (2022, June 25). Do masks make infections worse? The Foegen effect explained [Youtube video]. https://youtu.be/5tuKmLjiN6w
Hofer, B. K. (2001). Personal epistemology research: implications for learning and teaching. Educational Psychology Review, 13, 353–383. https://doi.org/10.1023/a:1011965830686
Kennedy, B., & Tyson, A. (2023). Americans’ trust in scientists, positive views of science continue to decline. Pew Research Center. https://www.pewresearch.org/science/2023/11/14/americans-trust-in-scientists-positive-views-of-science-continue-to-decline/
Krosnick, J. A., & Alwin, D. F. (1989). Aging and susceptibility to attitude change. Journal of Personality and Social Psychology, 57, 416–425. https://doi.org/10.1037/0022-3514.57.3.416
Lupia, A., Allison, D. B., Jamieson, K. H., Heimberg, J., Skipper, M., & Wolf, S. M. (2024). Trends in U.S. public confidence in science and opportunities for progress. Proceedings of the National Academy of Sciences, 121. https://doi.org/10.1073/pnas.2319488121
Martínez-Costa, M.-P., López-Pan, F., Buslón, N., & Salaverría, R. (2022). Nobody-fools-me perception: influence of age and education on overconfidence about spotting disinformation. Journalism Practice, 17, 2084–2102. https://doi.org/10.1080/17512786.2022.2135128
Mayer, R. C., Davis, J. H., & Schoorman, F. D. (1995). An integrative model of organizational trust. Academy of Management Review, 20, 709–734. https://doi.org/10.5465/amr.1995.9508080335
Meltzer, C. E., Naab, T., & Daschmann, G. (2012). All student samples differ: on participant selection in communication science. Communication Methods and Measures, 6, 251–262. https://doi.org/10.1080/19312458.2012.732625
Morefield, S. (2022). New study alleges mask mandates associated with increased COVID death rate. Townhall. https://townhall.com/tipsheet/scottmorefield/2022/06/05/new-study-mask-mandates-associated-with-increased-covid-death-rate-n2608241
Nickerson, R. S. (1998). Confirmation bias: a ubiquitous phenomenon in many guises. Review of General Psychology, 2, 175–220. https://doi.org/10.1037/1089-2680.2.2.175
Olshansky, A., Peaslee, R. M., & Landrum, A. R. (2020). Flat-smacked! Converting to flat eartherism. Journal of Media and Religion, 19, 46–59. https://doi.org/10.1080/15348423.2020.1774257
Origgi, G. (2012). Epistemic injustice and epistemic trust. Social Epistemology, 26, 221–235. https://doi.org/10.1080/02691728.2011.652213
Perrin, A., & Anderson, M. (2019). Share of U.S. adults using social media, including Facebook, is mostly unchanged since 2018. Pew Research Center. https://www.pewresearch.org/short-reads/2019/04/10/share-of-u-s-adults-using-social-media-including-facebook-is-mostly-unchanged-since-2018/
Peters, R. G., Covello, V. T., & McCallum, D. B. (1997). The determinants of trust and credibility in environmental risk communication: an empirical study. Risk Analysis, 17, 43–54. https://doi.org/10.1111/j.1539-6924.1997.tb00842.x
Reif, A., & Guenther, L. (2021). How representative surveys measure public (dis)trust in science: a systematisation and analysis of survey items and open-ended questions. Journal of Trust Research, 11, 94–118. https://doi.org/10.1080/21515581.2022.2075373
Rudloff, J. P., Hutmacher, F., & Appel, M. (2022). Beliefs about the nature of knowledge shape responses to the pandemic: epistemic beliefs, the dark factor of personality and COVID-19-related conspiracy ideation and behavior. Journal of Personality, 90, 937–955. https://doi.org/10.1111/jopy.12706
Schmid-Petri, H., Bienzeisler, N., & Beseler, A. (2022). Effects of politicization on the practice of science. In T. Bolsen & R. Palm (Eds.), Molecular biology and clinical medicine in the age of politicization (pp. 45–63). Elsevier. https://doi.org/10.1016/bs.pmbts.2021.11.005
Schmidt, H. (2023). Pandemics and politics: analyzing the politicization and polarization of pandemic-related reporting. Newspaper Research Journal, 44, 26–52. https://doi.org/10.1177/07395329221095850
Science Feedback. (2022, June 11). Scientific evidence shows that mask-wearing is effective at limiting community transmission; claims that face masks increase mortality are based on flawed correlation studies. https://science.feedback.org/review/scientific-evidence-shows-mask-wearing-effective-at-limiting-community-transmission-claims-face-masks-increase-mortality-based-on-flawed-correlation-studies/
Sperber, D., Clément, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G., & Wilson, D. (2010). Epistemic vigilance. Mind & Language, 25, 359–393. https://doi.org/10.1111/j.1468-0017.2010.01394.x
The Washington Times. (2022, June 13). Face masks reconsidered: following the science uncovers failure. https://www.washingtontimes.com/news/2022/jun/13/editorial-face-masks-reconsidered-following-the-sc/
Thomas, D. R. (2006). A general inductive approach for analyzing qualitative evaluation data. American Journal of Evaluation, 27, 237–246. https://doi.org/10.1177/1098214005283748
van der Linden, S. (2015). The conspiracy-effect: exposure to conspiracy theories (about global warming) decreases pro-social behavior and science acceptance. Personality and Individual Differences, 87, 171–173. https://doi.org/10.1016/j.paid.2015.07.045
Wilholt, T. (2013). Epistemic trust in science. The British Journal for the Philosophy of Science, 64, 233–253. http://www.jstor.org/stable/24563046
Wu, Y., Kuru, O., Campbell, S. W., & Baruh, L. (2022). Explaining health misinformation belief through news, social and alternative health media use: the moderating roles of need for cognition and faith in intuition. Health Communication, 38, 1416–1429. https://doi.org/10.1080/10410236.2021.2010891
Zhou, A., Liu, W., & Yang, A. (2024). Politicization of science in COVID-19 vaccine communication: comparing U.S. politicians, medical experts, and government agencies. Political Communication, 41, 649–671. https://doi.org/10.1080/10584609.2023.2201184
About the author
Ch’Ree Essary, Ph.D. is an assistant professor of science communication in the Department of Advertising and Public Relations at The University of Alabama. A strategic science communication scholar, she is interested in understanding the individual, message, and source factors that influence how people select media and how to strategically tailor messages to increase interest and engagement in media related to science, health, and the environment.
E-mail: cessary@ua.edu X: @Chree_Essary
Supplementary material
Available at https://doi.org/10.22323/2.23090202