‘Insanity is doing the same thing over and over again and expecting different results.’
Rita Mae Brown
We all know what the problem is. The science deniers have grown in number, spurred by political and ideological forces, and fuelled by a global wave of COVID-19 skepticism [Rutjens et al., 2022]. In some countries, notably the United States, climate and vaccine skeptics occupy positions at the highest levels of government and have wielded political power to make major cuts to funding for science and academic institutions [Heidt & Ledford, 2025].
In response, scientists have increasingly gathered in numbers to “stand up for science” [Berntsen et al., 2025]. During the opening plenary at the 2025 American Association for the Advancement of Science meeting in Boston, an enthusiastic audience of scientists and engineers participated in call-and-response chants about the public benefits of science: “Who eradicated smallpox?” “Science!” “Memory foam?” “Science!” Similar cries were heard at recent rallies for science, which gathered tens of thousands of scientists and their allies across Europe and the United States. The mandate for those in the field of science communication, who aim to respond to this apparent conflict between scientists and science deniers, is clear: “to help people update existing understandings of the world and to change those understandings when necessary” [Jones & Anderson Crow, 2017, p. 1]. In other words, if the rest of the world could be persuaded to see the world the way scientists do, we would all be much better off.
Of course, this approach is closely related to the so-called deficit model, frequently critiqued by much of the science communication community because of its overemphasis on the importance of information transfer as driving attitudes towards science [Simis et al., 2016]. Readers of this journal will be familiar with the decades of scholarship showing that facts alone don’t change minds, that persuasive approaches often backfire, and that meaningful progress requires two-way dialogue and long-term engagement with skeptical publics [Stilgoe et al., 2014; Toomey, 2023]. And yet, a sticky, perhaps subconscious, assumption that lurks in the shadows alongside this critique is as follows: if we just do a good enough job with public engagement, this humble listening and creating of spaces for dialogue and participation, we will all come to similar conclusions about the truths of science and its benefits for society.
Nowhere is this assumption so persistent as in concerns about, and proposed solutions to, the widespread problem of scientific misinformation, particularly around vaccines and climate change. Most scholarship on scientific misinformation begins with the premise that it is an objective harm to be eliminated, whether through pre-emptive strategies, such as information inoculation, or through retroactive tactics, such as technique rebuttal [Farrell et al., 2019; Coleman et al., 2024]. These approaches are regarded as especially critical in an era marked by misinformation campaigns funded by powerful interest groups, such as those in the tobacco and oil industries, aimed at challenging and delaying science-based action [Oreskes & Conway, 2011; O’Connor & Weatherall, 2019].
But there are several problems with these persuasion approaches to scientific misinformation. First, there is evidence that some of the ways the scientific community attempts to tackle misinformation are backfiring, contributing to increased societal distrust in science [Carey et al., 2020]. Research has found that in contexts where science has become politicized, such as the United States, messaging targeted at skeptical communities is not effective at increasing trust in science, and can sometimes even backfire [Kusmanoff et al., 2020; Gligorić et al., 2025]. For example, one study found that banner warnings posted on online misinformation can lead to reduced trust in scientifically accurate, as well as inaccurate, claims [Williams-Ceci et al., 2024]. Second, there is always the chance that the scientific consensus on a given issue may be corrected — the history of science is full of updates and even reversals in what the scientific community thought it knew, and dissent from established views has often later been found to be valuable [Kuhn, 1962; Oreskes, 2019; de Melo-Martín & Intemann, 2018]. For example, hypotheses about plate tectonics and the bacterial origin of ulcers were both adamantly rejected by the scientific community before later becoming dominant scientific views [Oreskes, 1999; Radomski et al., 2021; Solomon, 2007]. If the scientific community dismisses ideas that are later found to have at least some evidentiary support, this could create even further skepticism and distrust among publics. And third, as we argue below, the deeply value-driven nature of science gives us strong reasons to reject the idea that persuasion should be the main goal of science communication. For example, there is evidence that skepticism towards science is often not really about the science at all, but rather about values and worldviews that are related to specific science topics [Oreskes & Conway, 2022; Fuglsang & Losi, 2024; Santoro & Sydnor, 2024]. In such cases, it is not helpful to try to persuade skeptical individuals about the evidence supporting a particular scientific issue.
Thus, perhaps to truly break free from these deficit-like tendencies, we need to clearly articulate a different goal for science communication that abandons persuasion in favour of something else entirely. In this commentary, we propose a shift toward rethinking the goal of science communication to be one of empowerment, where questions and concerns about science are seen as a healthy part of the science-society ecosystem, rather than an objective “bad” to be systematically dismissed and debunked. First, we argue that science is not neutral or universally positive but is shaped by diverse values, and thus some degree of skepticism and debate about science is always warranted in a democratic society. On this basis, we argue that the goal of science communication should not be one of persuasion, where people are mere recipients of scientific advice and outputs, but rather, that of empowerment, where all people have access to tools, resources, and opportunities for engaging with science in ways that align with their values. We propose an account of scientific empowerment that includes material, social-psychological, and civic aspects, and we show how this account could be operationalised in a future pandemic response.
Science as value-laden. As scholars of science studies have long argued, science does not happen in a black box, free of the values of the humans who practice it [Ward, 2021]. In other words, it is value-laden, with the influences of values being not only unavoidable but also legitimate [Brown, 2020; Douglas, 2009; Elliott, 2022]. For example, social and ethical values play legitimate roles in the prioritisation of scientific research topics, such as deciding how much money to spend on fundamental research in high-energy physics as opposed to research on renewable energy or on cardiovascular disease. Values are relevant to deciding whether particular studies, such as potentially dangerous gain-of-function research on deadly pathogens, should be limited or regulated [Resnik, 2021]. Similarly, values often guide the application of research to policy making, such as when deciding how much evidence of harm is necessary before taking action in response to threats like plastic pollution, toxic chemicals, biodiversity loss, or climate change [Steel, 2015]. Furthermore, social and ethical values have legitimate roles to play even in more “internal” or “core” aspects of scientific reasoning [Douglas, 2009; Elliott, 2017]. For example, values can influence what questions to ask or what methods to use when studying socially relevant topics like agriculture or medicine [Lacey, 2017]. Values are also implicated in the design of models, such as deciding what features of the world are most important for climate models to predict accurately [Intemann, 2015], as well as the selection of scientific categories and terminology, such as deciding whether to analyse genomic data in terms of race, ancestry, or some other kind of classification system [Yudell et al., 2016]. 
Similarly, as uncertainty is inherent to science, values are involved when researchers weigh the potential costs of false-positive or false-negative errors as they make methodological decisions and decide how to communicate their findings [Douglas, 2009; Elliott & Richards, 2017; John, 2015].
Few researchers would dispute the above examples of how values are embedded in the practice of and engagement with science. But what is often less acknowledged are the ramifications of this insight for public engagement with science. Since science rightly incorporates values, and since scientists are not uniquely positioned to make decisions about values in a democratic society, the value-ladenness of science legitimises broad and diverse public engagement with science [Lusk, 2021; Schroeder, 2021]. As such, public disagreement with science does not automatically equate to ignorance of scientific information [Elliott, 2017]. For example, skepticism about the use of genetically modified organisms (GMOs) in agriculture may reflect differing values about the kinds of agricultural systems that are best able to meet social, economic, and environmental priorities, rather than a lack of knowledge about GMO safety [Hicks, 2017; Lacey, 2017]. Similarly, those who express hesitancy about vaccines may be asking different questions than the dominant ones asked by vaccine researchers; in particular, the vaccine hesitant may be focused on the risks of vaccines for their loved ones with unique underlying medical conditions, whereas vaccine researchers tend to be focused on assessing the overall risk/benefit ratio of vaccines for society as a whole [Goldenberg, 2021]. Along the same lines, those who express skepticism about the outcomes of chemical risk assessments may have completely legitimate concerns about the methodological assumptions underlying those assessments [Douglas, 2000] or about the policy decision to structure chemical regulations around a slow and costly scientific process [Boyd, 2024].
This scholarship paints science as a pragmatic endeavour that invariably involves numerous value-laden choices that serve specific aims [Brown, 2020]. Science does not just yield a neutral set of facts but rather provides answers to specific questions that serve specific interests or values. It relies on models that emphasise specific types of information [Harvard & Winsberg, 2022] and draws conclusions shaped by disciplinary perspectives, evidentiary standards, and framing choices — all of which prioritise some values over others. Thus, even when scientists do not intentionally incorporate values into their work, it remains value-laden in that it inevitably promotes certain social or ethical values over others [Ward, 2021].
From science literacy to scientific empowerment. When we recognise science as a pragmatic endeavour that reflects particular values, it opens up a new vision for science communication. Rather than treating science communication as an effort to persuade publics to accept neutral scientific facts, a pragmatic perspective frames it as a way to equip people to understand and use science accurately and effectively in pursuit of individual and collective goals. To express this vision of science communication, we suggest the concept of scientific empowerment, which we define broadly as the ability and agency to inform and influence one’s life through skills, knowledge, opportunities, experiences, and resources related to science. Scientific empowerment focuses on the individual and collective science-related choices, agency, and capital made available to people, particularly those in communities that have been historically (or are contemporarily) underserved by science. By doing so, concepts such as power and powerlessness are foregrounded and made explicit, thereby highlighting equitable access to the benefits of (and protection from the risks of) science as a basic human right [Massimi, 2025]. While values related to science are often held abstractly by people (e.g., concerns about ethics related to AI), scientific empowerment creates opportunities and circumstances through which individuals can express those values through practical action (e.g., voicing their concerns to policymakers).
Shifting towards scientific empowerment as the primary goal of science communication enables us to see the work of science communication through multiple lenses. Building on the broader scholarship of empowerment theory [Kabeer, 1999; Hennink et al., 2012; Bayissa et al., 2018], we identify three central dimensions of scientific empowerment (Figure 1): (1) material (access to science education, information about science, and science-related careers); (2) social-psychological (one’s sense of science identity, proximity of science to one’s life and community); and (3) civic (agency and responsibility in making use of science for the betterment of one’s own life and broader society).
The material dimension of scientific empowerment focuses on resources, which include access to formal and informal science education, science programming and materials, and the benefits of science (e.g., evidence-based medical treatments), as well as the ability to gain the skills necessary to take part in science-related careers. This is related to the concept of “science capital,” which encompasses the cumulative experiences that an individual has with science, incorporating familiar concepts such as science literacy as well as less recognised forms of experience [see Halpern & Elliott, 2022], such as having family members involved in science, engagement with science-related media, and awareness of science jobs in one’s community [Archer et al., 2015].
One crucial aspect of science capital is one’s “science identity,” which relates to the social-psychological dimension of scientific empowerment. Science identity captures the extent to which people connect science to other aspects of their lives and how they are perceived (both by others and through self-perception) as capable of engaging with science [DeWitt et al., 2016; Carlone & Johnson, 2007]. Research suggests that individuals with a stronger sense of science identity will be more likely to participate in science-related careers, and this is particularly relevant for individuals from groups that have traditionally been underrepresented in science domains [Stets et al., 2017; Vincent-Ruz & Schunn, 2018]. Relatedly, new research explores how one’s psychological proximity to or distance from science is related to levels of skepticism across multiple domains of science, with greater “psychological distance” leading to greater skepticism [Većkalov et al., 2024]. This cumulative scholarship supports our argument that scientific empowerment goes beyond the material and interacts with the social and psychological aspects of what it means to feel a sense of “belongingness” with science. The scientific community can foster this sense of belonging by welcoming engagement with non-specialists, even (or perhaps especially) those who have concerns about science, thus potentially reducing feelings of alienation and mistrust [McIntyre, 2021]. For example, ‘Sidewalk Science’ approaches bring conversations about all types of science into the street, enabling interactions between researchers, science communicators, and members of the public through low-budget interactive displays and unstructured conversations [Stoudt et al., 2019; Pawar, 2024].
These kinds of efforts point to the importance of seeing opportunities for people to have conversations about science as a fundamental human right and allowing participating researchers to serve as “access points” between the public and scientific institutions [see Giddens, 1991].
The third and final dimension of scientific empowerment centres on the idea of scientific citizenship or agency, emphasising the importance of active participation and action in relation to science. An agent is a “being with the capacity to act, and ‘agency’ denotes the exercise or manifestation of this capacity” [Schlosser, 2015, p. 1]. People who exercise scientific agency are able to engage with science in ways that improve their lives — for example, patients advocating for access to participate in clinical trials, environmental justice communities organising to monitor local air and water quality, or students participating in public deliberations to shape climate adaptation plans for the future of their communities. Scientific citizenship/agency is closely related to the concept of critical science literacy, which involves an understanding of the workings of science and the social forces that influence it [Priest, 2013]. It also means equipping people to recognise the limits of science, for example, by acknowledging the extent of scientific uncertainty and by appreciating that policy decisions incorporate a wide array of ethical, economic, and social considerations in addition to scientific information. Importantly, people can have this kind of critical science literacy regardless of their level of technical knowledge (the focus of “traditional” scientific literacy). This means that people can be included in, rather than excluded from, conversations about science, particularly conversations rife with value debates, such as AI, nuclear energy, and genetic modifications [Davies, 2022]. Thus, promoting scientific empowerment may in some cases have very different effects from traditional approaches to scientific communication that are explicitly or implicitly premised on persuading publics to accept recommendations made by scientific experts.
Incorporating scientific empowerment into a pandemic response. To illustrate what it might look like to set scientific empowerment as the goal of science communication, consider the example of a future pandemic response. During the COVID-19 pandemic, science communication was largely focused on persuading the public to follow the most up-to-date public health guidelines, including following expert-led advice regarding vaccination, social distancing, and mask wearing. However, in retrospect, it appears that these guidelines were based on value-laden choices about which scientific disciplines to engage (e.g., epidemiology), thereby prioritizing some questions (e.g., how many lives could be saved) and not others (e.g., what educational and/or physical and mental health harms might result from the lockdowns, particularly among less-privileged communities) [Macedo & Lee, 2025]. This approach likely reduced mortality, but it also led to a degree of backlash against the wider expertise of scientific and medical institutions, where some people felt that the unintended consequences of the pandemic policies had not been taken into sufficient consideration [Bardosh et al., 2022; Reed, 2025]. Subsequently, due to a mix of declining confidence in public health experts, interrupted routine health checkups during the pandemic, and increased disparities, childhood immunizations (e.g., DTP) actually declined in many parts of the world between 2020 and 2023 [UNICEF, 2023; Haeuser et al., 2025]. In contrast, a pandemic approach grounded in the goal of scientific empowerment would focus on equipping members of the public, policy makers, and scientists to participate in the making of science-informed decisions that accord with societal values.
Crucially, this would not mean disregarding public health recommendations and epidemiological models but rather recognising and critically evaluating the assumptions on which they are based and the potential short- and long-term consequences of their implementation. As discussed above, scientific empowerment incorporates material, social-psychological, and civic aspects, although there are not always sharp distinctions between these three elements. From both a material and civic perspective, empowerment would call for engaging a critical mass of experts able to bring the insights of different fields (e.g., not just virology and public health, but also bioethics, economics, science communication, psychology, and philosophy of science) to bear on complex problems like pandemics. It would also seek to provide educational and medical resources to the public to support informed personal and collective health decisions. Importantly, this education would not be limited to a narrow form of “science literacy” focused on understanding scientific facts about viruses, vaccines, statistics, and probability. It would also involve promoting more positive experiences with biomedical science [Halpern & Elliott, 2022], in keeping with the social-psychological aspects of empowerment. For example, given the history of sexism and racism in biomedical research (e.g., the Tuskegee syphilis study, or the labelling of women as hysterical) and the ways the pharmaceutical industry has manipulated science to promote its profits, many people have warranted distrust in this area of science [de Melo-Martín & Intemann, 2018; Grasswick, 2017].
Scientific empowerment would call for both long- and short-term steps to alleviate this distrust, such as by promoting more demographic diversity among biomedical researchers, increasing transparency in the research process, fostering more positive relationships between patients and physicians, and creating structural changes that limit pharmaceutical industry abuses [de Melo-Martín & Intemann, 2018].
From a civic perspective, scientific empowerment would also involve equipping communities, scientists, government officials and policy makers with the critical science literacy to incorporate science (including social science) more thoughtfully in decision making [Priest, 2013]. An important principle of critical science literacy is that scientific information is always tentative and subject to refinement, so policy decisions require important judgments about how much evidence to demand before acting, given the relative costs of different kinds of errors [Douglas, 2009]. Moreover, when dealing with complex social and environmental systems, it is important to consider insights from multiple disciplinary perspectives as well as the local knowledge of affected communities [Massimi, 2025; Toomey, 2024], while being cognizant of the fact that both experts and non-experts sometimes act in bad faith [e.g. Oreskes & Conway, 2011]. As Jean-Marc Lévy-Leblond famously noted: “if scientists are definitely not universal experts, non-scientists are not universal non-experts” [1992, p. 17]. Thus, scientific empowerment can help all members of society adopt a reasonable balance of valuing the knowledge of scientific experts and avoiding conspiracy theories while at the same time being aware that experts are also subject to being wrong. This ability to discern what is true outside of one’s own area of expertise is a challenge that everyone faces, whether they are researchers or not; we argue that focusing on scientific empowerment as the main aim of science communication will strengthen this skill for those outside of the scientific community.
Concluding notes. Some readers may be attracted by our proposal to shift the central goal of science communication to scientific empowerment, and yet they might worry that the proposal will play into the hands of scientific “naysayers” and “denialists.” Might we simply empower those who refuse to accept solid scientific evidence? One response is that we do not see a viable alternative. Efforts at persuasion in recent decades have not stemmed the tide of skepticism about crucial social issues like climate change and vaccine safety. Indeed, at the time of writing, the United States has its highest number of measles cases in 33 years, largely due to vaccine skepticism, and similar trends are being seen elsewhere [Lee, 2025; Reed, 2025]. Clearly, creative new approaches are needed. But a deeper response is that we believe that genuine scientific empowerment would help equip people to avoid falling prey to disinformation campaigns or conspiracy theories, and it would enable people to recognise cases where there is strong scientific evidence that is compelling regardless of one’s ethical or political values. In our view, scientific empowerment should also give people the opportunity to critically reflect on their own values, including in response to new scientific information, rather than blindly taking them as given [Ratti & Russo, 2024]. Therefore, although some might think that scientific empowerment raises concerns about amplifying misinformation, we argue that misinformation is addressed more respectfully and effectively through holistic efforts to promote scientific empowerment. By explicitly framing scientific empowerment as a practical means of critically reflecting and acting upon one’s values about science, we foreground the questions and concerns that are truly at stake rather than resorting to a battle of whose facts can be believed.
To make our argument, we presented three dimensions of scientific empowerment (material, social-psychological, and civic), but we recognise that there may be others worthy of exploration. The literature acknowledges the multidimensionality of empowerment, particularly across different disciplines, and the importance of recognising that empowerment in some dimensions does not necessarily translate into others [Bayissa et al., 2018]. This is especially important when considering interventions aimed at enhancing particular dimensions of empowerment (e.g., recognising that additional access to formal science education will not necessarily lead to effective political advocacy). Attending to the different dimensions of science empowerment also highlights the importance of engagement methods that go beyond traditional dialogue; for example, science-based games, participatory approaches, art, and storytelling are likely to be particularly effective at fostering the social-psychological dimension of empowerment, and they may also promote civic empowerment by generating richer, more dynamic discussions about science-related values and worldviews [Davies et al., 2019; Rogers et al., 2021]. Thus, we present this piece as a starting point for continuing to explore the role of empowerment in science communication with more depth and breadth than this short commentary could provide. In our view, empowering people to convey and enact their values about science expresses the essential role of science communication in a democratic society.
Acknowledgments
We would like to thank our colleagues and friends for insightful conversations that have contributed to our thinking behind this commentary, including John Besley, Maya Goldenberg, Megan Halpern, Scott Markle, and Scott Peacor. KCE received support through a Research Award from the Alexander von Humboldt Foundation and through the SOCRATES Centre at Leibniz University, Hannover, which is funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) — Project 470816212/KFG43. We acknowledge the use of ChatGPT for formatting the references section of this article [OpenAI, 2025].
References
Archer, L., Dawson, E., DeWitt, J., Seakins, A., & Wong, B. (2015). “Science capital”: a conceptual, methodological and empirical argument for extending Bourdieusian notions of capital beyond the arts. Journal of Research in Science Teaching, 52(7), 922–948. https://doi.org/10.1002/tea.21227
Bardosh, K., de Figueiredo, A., Gur-Arie, R., Jamrozik, E., Doidge, J., Lemmens, T., Keshavjee, S., Graham, J. E., & Baral, S. (2022). The unintended consequences of COVID-19 vaccine policy: why mandates, passports and restrictions may cause more harm than good. BMJ Global Health, 7(5), e008684. https://doi.org/10.1136/bmjgh-2022-008684
Bayissa, F. W., Smits, J., & Ruben, R. (2018). The multidimensional nature of women’s empowerment: beyond the economic approach. Journal of International Development, 30(4), 661–690. https://doi.org/10.1002/jid.3268
Berntsen, L., Courtney, E., Delawalla, C., Flores, J. P., Goldstein, S., & Payne, C. (2025). Why we organized ‘Stand Up For Science’. Nature Human Behaviour, 9(4), 627–628. https://doi.org/10.1038/s41562-025-02146-0
Boyd, W. (2024). De-risking environmental law. Harvard Environmental Law Review, 48, 153. https://ssrn.com/abstract=4753197
Brown, M. (2020). Science and moral imagination. University of Pittsburgh Press.
Carey, J. M., Chi, V., Flynn, D. J., Nyhan, B., & Zeitzoff, T. (2020). The effects of corrective information about disease epidemics and outbreaks: evidence from Zika and yellow fever in Brazil. Science Advances, 6(5), eaaw7449. https://doi.org/10.1126/sciadv.aaw7449
Carlone, H. B., & Johnson, A. (2007). Understanding the science experiences of successful women of color: science identity as an analytic lens. Journal of Research in Science Teaching, 44(8), 1187–1218. https://doi.org/10.1002/tea.20237
Coleman, R., Thorson, E., Jimenez, C., & Vinton, K. (2024). Reaching science skeptics: how adaptive framing of climate change leads to positive responses via persuasion knowledge and perceived behavioral control. Communication Research, 51(4), 392–414. https://doi.org/10.1177/00936502221084925
Davies, S. R. (2022). Science communication at a time of crisis: emergency, democracy and persuasion. Sustainability, 14(9), 5103. https://doi.org/10.3390/su14095103
Davies, S. R., Halpern, M., Horst, M., Kirby, D., & Lewenstein, B. (2019). Science stories as culture: experience, identity, narrative and emotion in public communication of science. JCOM, 18(05), A01. https://doi.org/10.22323/2.18050201
de Melo-Martín, I., & Intemann, K. (2018). The fight against doubt: how to bridge the gap between scientists and the public. Oxford University Press.
DeWitt, J., Archer, L., & Mau, A. (2016). Dimensions of science capital: exploring its potential for understanding students’ science participation. International Journal of Science Education, 38(16), 2431–2449. https://doi.org/10.1080/09500693.2016.1248520
Douglas, H. (2000). Inductive risk and values in science. Philosophy of Science, 67(4), 559–579. https://www.jstor.org/stable/188707
Douglas, H. (2009). Science, policy and the value-free ideal. University of Pittsburgh Press.
Elliott, K. (2017). A tapestry of values: an introduction to values in science. Oxford University Press.
Elliott, K. (2022). Values in science. Cambridge University Press.
Elliott, K., & Richards, T. (2017). Exploring inductive risk: case studies of values in science. Oxford University Press.
Farrell, J., McConnell, K., & Brulle, R. (2019). Evidence-based strategies to combat scientific misinformation. Nature Climate Change, 9(3), 191–195. https://doi.org/10.1038/s41558-018-0368-6
Fuglsang, S., & Losi, L. (2024). Is science skepticism really about science? Science and Public Policy, 51(6), 1133–1142. https://doi.org/10.1093/scipol/scae057
Giddens, A. (1991). The consequences of modernity. Polity Press.
Gligorić, V., van Kleef, G. A., & Rutjens, B. T. (2025). Political ideology and trust in scientists in the U.S.A. Nature Human Behaviour, 9(7), 1501–1512. https://doi.org/10.1038/s41562-025-02147-z
Goldenberg, M. J. (2021). Vaccine hesitancy: public trust, expertise and the war on science. University of Pittsburgh Press.
Grasswick, H. (2017). Epistemic injustice in science. In The Routledge handbook of epistemic injustice (pp. 313–323). Routledge.
Haeuser, E., Byrne, S., Nguyen, J., Raggi, C., McLaughlin, S. A., Bisignano, C., Harris, A. A., Smith, A. E., Lindstedt, P. A., Smith, G., Herold, S. J., Nesbit, O. D., Noyes, T., Shalev, N., Olana, L. T., Aalipour, M. A., Aalruz, H., Abbasifard, M., Abbaspour, F., … Mosser, J. F. (2025). Global, regional and national trends in routine childhood vaccination coverage from 1980 to 2023 with forecasts to 2030: a systematic analysis for the Global Burden of Disease Study 2023. The Lancet, 406(10500), 235–260. https://doi.org/10.1016/s0140-6736(25)01037-2
Halpern, M. K., & Elliott, K. C. (2022). Science as experience: a Deweyan model of science communication. Perspectives on Science, 30(4), 621–656. https://doi.org/10.1162/posc_a_00398
Harvard, S., & Winsberg, E. (2022). The epistemic risk in representation. Kennedy Institute of Ethics Journal, 32(1), 1–31. https://doi.org/10.1353/ken.2022.0001
Heidt, A., & Ledford, H. (2025). Vaccine sceptic RFK Jr is now a powerful force in U.S. science: what will he do? Nature. https://doi.org/10.1038/d41586-025-00439-y
Hennink, M., Kiiti, N., Pillinger, M., & Jayakaran, R. (2012). Defining empowerment: perspectives from international development organisations. Development in Practice, 22(2), 202–215. https://doi.org/10.1080/09614524.2012.640987
Hicks, D. J. (2017). Scientific controversies as proxy politics. Issues in Science & Technology, 33(2), 62–77. https://issues.org/scientific-controversies-as-proxy-politics/
Intemann, K. (2015). Distinguishing between legitimate and illegitimate values in climate modeling. European Journal for Philosophy of Science, 5(2), 217–232. https://doi.org/10.1007/s13194-014-0105-6
John, S. (2015). Inductive risk and the contexts of communication. Synthese, 192(1), 79–96. https://doi.org/10.1007/s11229-014-0554-7
Jones, M. D., & Anderson Crow, D. (2017). How can we use the ‘science of stories’ to produce persuasive scientific stories? Palgrave Communications, 3(1), 53. https://doi.org/10.1057/s41599-017-0047-7
Kabeer, N. (1999). Resources, agency, achievements: reflections on the measurement of women’s empowerment. Development and Change, 30(3), 435–464. https://doi.org/10.1111/1467-7660.00125
Kuhn, T. S. (1962). The structure of scientific revolutions. University of Chicago Press.
Kusmanoff, A. M., Fidler, F., Gordon, A., Garrard, G. E., & Bekessy, S. A. (2020). Five lessons to guide more effective biodiversity conservation message framing. Conservation Biology, 34(5), 1131–1141. https://doi.org/10.1111/cobi.13482
Lacey, H. (2017). Distinguishing between cognitive and social values. In K. Elliott & D. Steel (Eds.), Current controversies in values and science (pp. 15–30). Routledge. https://doi.org/10.4324/9781315639420-2
Lee, C. (2025). Measles cases are at a 33-year high. Experts warn other diseases could follow. Time Magazine. Retrieved July 12, 2025, from https://time.com/7301457/measles-cases-record-vaccine/
Lévy-Leblond, J.-M. (1992). About misunderstandings about misunderstandings. Public Understanding of Science, 1(1), 17–21. https://doi.org/10.1088/0963-6625/1/1/004
Lusk, G. (2021). Does democracy require value-neutral science? Analyzing the legitimacy of scientific information in the political sphere. Studies in History and Philosophy of Science Part A, 90, 102–110. https://doi.org/10.1016/j.shpsa.2021.08.009
Macedo, S., & Lee, F. (2025). In COVID’s wake: how our politics failed us. Princeton University Press.
Massimi, M. (2025). Local knowledges and the right to participate in science. Philosophy of Science, 1–25. https://doi.org/10.1017/psa.2025.4
McIntyre, L. (2021). How to talk to a science denier: conversations with flat earthers, climate deniers and others who defy reason. MIT Press.
O’Connor, C., & Weatherall, J. (2019). The misinformation age: how false beliefs spread. Yale University Press.
OpenAI. (2025, July). ChatGPT (version GPT-4o) [Large language model]. https://chat.openai.com/
Oreskes, N. (1999). The rejection of continental drift: theory and method in American earth science. Oxford University Press.
Oreskes, N. (2019). Why trust science? Princeton University Press.
Oreskes, N., & Conway, E. M. (2011). Merchants of doubt: how a handful of scientists obscured the truth on issues from tobacco smoke to global warming. Bloomsbury Publishing.
Oreskes, N., & Conway, E. M. (2022). From anti-government to anti-science: why conservatives have turned against science. Daedalus, 151(4), 98–123. https://doi.org/10.1162/daed_a_01946
Pawar, P. (2024). Bringing science experiments to the streets. The Brilliant. https://thebrilliant.com/sidewalk-science-center/
Priest, S. (2013). Critical science literacy: what citizens and journalists need to know to make sense of science. Bulletin of Science, Technology & Society, 33(5–6), 138–145. https://doi.org/10.1177/0270467614529707
Radomski, B. M., Šešelja, D., & Naumann, K. (2021). Rethinking the history of peptic ulcer disease and its relevance for network epistemology. History and Philosophy of the Life Sciences, 43(4). https://doi.org/10.1007/s40656-021-00466-8
Ratti, E., & Russo, F. (2024). Science and values: a two-way direction. European Journal for Philosophy of Science, 14(1), 6. https://doi.org/10.1007/s13194-024-00567-8
Reed, J. (2025). Rise of vaccine distrust — why more of us are questioning jabs. BBC. https://www.bbc.com/news/articles/c1jgrlxx37do
Resnik, D. B. (2021). Dual use research in the biomedical sciences. In Precautionary reasoning in environmental and public health policy (pp. 241–269). Springer. https://doi.org/10.1007/978-3-030-70791-0_8
Rogers, H., Halpern, M., Hannah, D., & de Ridder-Vignone, K. (Eds.). (2021). Routledge handbook of art, science and technology studies. Routledge. https://doi.org/10.4324/9780429437069
Rutjens, B. T., Sengupta, N., van der Lee, R., van Koningsbruggen, G. M., Martens, J. P., Rabelo, A., & Sutton, R. M. (2022). Science skepticism across 24 countries. Social Psychological and Personality Science, 13(1), 102–117. https://doi.org/10.1177/19485506211001329
Santoro, L. R., & Sydnor, E. (2024). Blind trust, blind skepticism: liberals’ & conservatives’ response to academic research. American Politics Research, 52(1), 52–66. https://doi.org/10.1177/1532673x231206136
Schlosser, M. (2015). Agency. Stanford Encyclopedia of Philosophy. https://plato.stanford.edu/entries/agency/
Schroeder, S. A. (2021). Democratic values: a better foundation for public trust in science. The British Journal for the Philosophy of Science, 72(2), 545–562. https://doi.org/10.1093/bjps/axz023
Simis, M. J., Madden, H., Cacciatore, M. A., & Yeo, S. K. (2016). The lure of rationality: why does the deficit model persist in science communication? Public Understanding of Science, 25(4), 400–414. https://doi.org/10.1177/0963662516629749
Solomon, M. (2007). Social empiricism. MIT Press.
Steel, D. (2015). Philosophy and the precautionary principle: science, evidence and environmental policy. Cambridge University Press. https://doi.org/10.1017/cbo9781139939652
Stets, J. E., Brenner, P. S., Burke, P. J., & Serpe, R. T. (2017). The science identity and entering a science occupation. Social Science Research, 64, 1–14. https://doi.org/10.1016/j.ssresearch.2016.10.016
Stilgoe, J., Lock, S. J., & Wilsdon, J. (2014). Why should we promote public engagement with science? Public Understanding of Science, 23(1), 4–15. https://doi.org/10.1177/0963662513518154
Stoudt, B. G., Torre, M. E., Bartley, P., Bissell, E., Bracy, F., Caldwell, H., Dewey, L., Downs, A., Greene, C., Haldipur, J., Lizama, S., Hassan, P., Manoff, E., Sheppard, N., & Yates, J. (2019). Researching at the community-university borderlands: using public science to study policing in the South Bronx. Education Policy Analysis Archives, 27, 56. https://doi.org/10.14507/epaa.27.2623
Toomey, A. H. (2023). Why facts don’t change minds: insights from cognitive science for the improved communication of conservation research. Biological Conservation, 278, 109886. https://doi.org/10.1016/j.biocon.2022.109886
Toomey, A. H. (2024). Science with impact: how to engage people, change practice and influence policy. Island Press.
UNICEF. (2023). The state of the world’s children 2023: for every child, vaccination. United Nations Children’s Fund.
Većkalov, B., Zarzeczna, N., McPhetres, J., van Harreveld, F., & Rutjens, B. T. (2024). Psychological distance to science as a predictor of science skepticism across domains. Personality and Social Psychology Bulletin, 50(1), 18–37. https://doi.org/10.1177/01461672221118184
Vincent-Ruz, P., & Schunn, C. D. (2018). The nature of science identity and its role as the driver of student choices. International Journal of STEM Education, 5(1). https://doi.org/10.1186/s40594-018-0140-5
Ward, Z. B. (2021). On value-laden science. Studies in History and Philosophy of Science Part A, 85, 54–62. https://doi.org/10.1016/j.shpsa.2020.09.006
Williams-Ceci, S., Macy, M. W., & Naaman, M. (2024). Misinformation does not reduce trust in accurate search results, but warning banners may backfire. Scientific Reports, 14(1), 10977. https://doi.org/10.1038/s41598-024-61645-8
Yudell, M., Roberts, D., DeSalle, R., & Tishkoff, S. (2016). Taking race out of human genetics. Science, 351(6273), 564–565. https://doi.org/10.1126/science.aac4951
About the authors
Anne H. Toomey is an Associate Professor of Environmental Studies and Science at Pace University, based in New York. Her research focuses on leveraging science to address real-world environmental and policy challenges. In 2024, she published the award-winning book “Science with Impact: How to Engage People, Change Practice, and Influence Policy”. Anne is also executive director of Participatory Science Solutions, a social impact research consulting company that supports collaborative decision-making through participatory science, robust research methods, engagement, and communication.
E-mail: toomey.ah@gmail.com
Kevin C. Elliott is Red Cedar Distinguished Professor at Michigan State University, with joint appointments in Lyman Briggs College, the Department of Fisheries and Wildlife, and the Department of Philosophy. He works in the philosophy of science and practical ethics, focusing especially on the roles that ethical and social values play in scientific research. His books include “A Tapestry of Values: An Introduction to Values in Science” and (edited with Ted Richards) “The Routledge Handbook of Values and Science”.
E-mail: kce@msu.edu