1 Introduction

The COVID-19 pandemic has drastically changed the world within a few short months. In addition to morbidity and mortality, the pandemic has put a strain on healthcare systems, exposed weaknesses in government preparedness to deal with a pervasive health threat, changed the way millions of people work, and set economies on uncertain trajectories. The mixture of uncertainty, fear, and danger creates a perfect storm for disinformation, particularly in an age of social networks and instantaneous communication.

While all countries have been vulnerable to disinformation during the escalating COVID-19 pandemic, Ukraine’s experience is a unique case that deserves special attention. The ongoing conflict in Ukraine, alongside Russia’s annexation of Crimea, has manifested as a hybrid war in which assaults are waged in many spheres beyond the traditional battlefield [Patel, Grace et al., 2020]. One aspect of this hybrid warfare is cyber and information warfare; in other words, disinformation campaigns.

In the grey literature, the terms “disinformation” and “misinformation” are often used interchangeably, without consideration of the nuanced difference between them. Disinformation, as conceptualized in past research, has several theoretical foundations and lessons learned [Starbird and Wilson, 2020]. The Merriam-Webster dictionary [2020a] defines disinformation as false information deliberately and often covertly spread (as by the planting of rumors) in order to influence public opinion or obscure the truth. From a more academic perspective, disinformation has also been defined as deliberately false or misleading information whose purpose is not always to convince but to instill doubt [Jack, 2017; Pomerantsev and Weiss, 2014]. Although the two definitions are very similar, they differ in the purpose they ascribe to the spread of disinformation. In accordance with the U.S. Joint Dictionary, personnel within the U.S. Department of Defense (DoD) use an accepted general definition from a credible dictionary [Office of the Chairman of the Joint Chiefs of Staff, 2020, p. 2]. Therefore, this paper refers to disinformation using the definition provided by the Merriam-Webster dictionary. Misinformation, in turn, is incorrect or misleading information [Merriam-Webster, 2020b]. In other words, “disinformation” implies an intention to spread false information to achieve a goal, whereas “misinformation” refers to incorrect information that may or may not be purposefully spread to mislead. Additionally, a search of relevant North Atlantic Treaty Organization (NATO) doctrine, terminology, and publications indicates that the term “misinformation” appears in references but is not well defined; it is listed as a purpose of NATO communication to alleviate misinformation [North Atlantic Treaty Organization, 2017]. Employed to mislead for a strategic, political purpose, actions arising from disinformation campaigns can create a hybrid type of warfare between countries.

1.1 Hybrid warfare and disinformation in Ukraine

The hybrid warfare stemming from the conflict in East Ukraine has established an ongoing presence of disinformation within the country. Hybrid threats incorporate a full range of warfare tactics, including conventional means, acts of terrorism (including indiscriminate violence and coercion), criminal disorder, and disruptive technologies or cybersecurity attacks to destabilize an existing order [Munoz Mosquera and Bachmann, 2016; North Atlantic Treaty Organization, 2019]. These attacks can be waged by state or non-state actors. Disinformation is one such hybrid warfare technique, encompassing a full spectrum of activities aimed at undermining legitimacy in communication, infrastructure, political systems, and economies. In essence, these campaigns seek to fracture or destabilize societies by any means necessary.

The mechanics of Russia’s hybrid tactics have been known for some time [North Atlantic Treaty Organization, 2019]. Russian propaganda is based on a few core narratives, according to a report by Popovych et al. [2018]. Overarching, basic, and intended to evoke emotional reactions, these narratives form the key structural elements of the broader disinformation campaigns. They interpret or skew real events to shape public opinion according to a political agenda. These aggressive cyber campaigns are waged across social media and infrastructure, and point to a full-spectrum conflict reaching far beyond Ukraine: Russia has contributed significantly to instability in many regions, including military intervention in Georgia and surrounding Baltic countries, the illegal annexation of Crimea, infiltration of United States elections, cyberattacks against Ukraine and bordering countries, alleged usage of chemical weapons in the United Kingdom, and strategic manipulation of public opinion in media and social networks across the globe [Popovych et al., 2018].

1.2 Healthcare disinformation in Ukraine

Russian disinformation activities in Ukraine go beyond conventional pro-Russia propaganda; hybrid tactics in health communication have been growing for several years. For example, the Russian government has promoted anti-vaccination messaging abroad (e.g., against measles immunization programs), particularly in conflict-affected areas of Ukraine [Francis, 2018]. This tactic aims to weaken the target societies and undermine trust in governmental institutions and health systems [Broniatowski et al., 2018]. The Kremlin often uses such campaigns to fuel the polarization of society, destabilizing it from within and thus constituting a threat to national security [Harding, 2018].

Recent disinformation campaigns have focused on the COVID-19 pandemic [Gabrielle, 2020], aiming to cripple health crisis communication and thereby weaken Ukraine’s response to the pandemic. Disinformation about the novel coronavirus and available resources is spread through coordinated and sophisticated cyber campaigns [HHS Cybersecurity Program, 2020]. Despite best efforts to curtail these campaigns, disinformation is increasing while government and public health officials and institutions focus on containing community spread of the virus and providing a quality response [Talant, 2020]. Nonetheless, only a limited amount of evidence is currently available in the scientific literature to illustrate the hybrid attacks pertaining to the COVID-19 pandemic. Assessing the state of disinformation on COVID-19 is the first step in considering how to counter it and bolster Ukraine’s ability to respond to the pandemic.

2 Objective

In this paper, we carry out a review to systematically examine the messages surrounding health crisis communication in Ukraine during the COVID-19 outbreak. Our analysis seeks to canvass the ways disinformation about the novel coronavirus is being spread in Ukraine, so as to form a foundation for assessing how to mitigate the problem. How are “fake news” items being used in communication around the COVID-19 response in Ukraine? Given the global reach of disinformation campaigns, these insights may also be useful for other countries and regions where the COVID-19 pandemic is currently surging.

3 Methods

We conducted a systematic literature review to examine the reported messages, studies, and campaigns surrounding health crisis communication in Ukraine during the COVID-19 pandemic, beginning with the first cases reported in Wuhan, China, on December 31, 2019. Our search methodology included examining news articles, technical reports, policy briefs, and peer-reviewed publications that include data on COVID-19 in Ukraine and the messaging about it. Our goal was to capture the existing knowledge about disinformation concerning the coronavirus and the medical responses to it, as discussed and covered in media news articles, policy briefs, technical reports, and peer-reviewed publications.

We used keyword searches in the databases Web of Knowledge (all databases), PubMed, ProQuest, Google News, Google, and Google Scholar (Figure 1). Ukraine’s media environment is modern, but many media organizations are owned or controlled by oligarchs or follow political agendas. There is also a hostile attitude towards many journalists in Eastern Europe; thus, many consider Ukraine’s media landscape a laboratory and an ideal setting for hybrid warfare, disinformation, and information operations [Jankowicz, 2019]. Because of this, we used two news-focused databases (ProQuest and Google News) to capture a wide reach of the information being shared, in addition to interdisciplinary academic databases where scholarly peer-reviewed publications on the topic could be found. The following keyword strategy was used:

  1. COVID
  2. Ukra*
  3. Health*
  4. Healthcare
  5. Disinform*
  6. Misinform*
  7. Fake
  8. Lie*
  9. Media AND tru*
  10. 3 OR 4
  11. 5 OR 6 OR 7 OR 8 OR 9
  12. 1 AND 2 AND 10 AND 11

Searches were carried out from June 3 to 5, 2020, with a publication cutoff date of May 31, 2020. The combined search query was: COVID AND Ukra* AND (Health* OR Healthcare) AND (Disinform* OR Misinform* OR Fake OR Lie* OR (Media AND tru*)). For keyword searches with a high volume of results, such as on Google (i.e., over 200 pages of results), we reviewed the first 100 unique links after filtering to meet the inclusion criteria (see below). If two or more links came from the same main website or publication, they were counted as one unique link. From the first 100 unique links, we extracted any publications (e.g., annual reports or educational handouts) that appeared relevant to the discussion of COVID-19 and health crisis communication for further examination. This search methodology for grey literature in Google databases has been used in previous publications [Samet and Patel, 2011; Samet and Patel, 2013; Patel, Rogers et al., 2017].
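To make the query-assembly and link-screening steps concrete, the short Python sketch below illustrates how the combined Boolean string (step 12 of the keyword strategy) and the one-link-per-website screening rule could be expressed. It is our illustrative sketch only; the actual searches were run manually in the databases listed above, and the function names are our own.

    from urllib.parse import urlparse

    # Keyword groups mirroring steps 1-12 of the keyword strategy above.
    topic = "COVID"                                                # step 1
    country = "Ukra*"                                              # step 2
    health_terms = ["Health*", "Healthcare"]                       # step 10: 3 OR 4
    disinfo_terms = ["Disinform*", "Misinform*", "Fake", "Lie*",
                     "(Media AND tru*)"]                           # step 11: 5 OR 6 OR 7 OR 8 OR 9

    def build_query() -> str:
        """Assemble the combined Boolean query (step 12: 1 AND 2 AND 10 AND 11)."""
        health = " OR ".join(health_terms)
        disinfo = " OR ".join(disinfo_terms)
        return f"{topic} AND {country} AND ({health}) AND ({disinfo})"

    def unique_by_site(links):
        """Keep one link per main website, as in the screening rule for high-volume searches."""
        seen, unique = set(), []
        for link in links:
            site = urlparse(link).netloc.lower()
            if site not in seen:
                seen.add(site)
                unique.append(link)
        return unique

    if __name__ == "__main__":
        print(build_query())
        sample = [
            "https://www.kyivpost.com/article-one",
            "https://www.kyivpost.com/article-two",   # same main website -> counted once
            "https://euvsdisinfo.eu/disinformation-can-kill/",
        ]
        print(unique_by_site(sample)[:100])  # only the first 100 unique links were screened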



Figure 1: Flow diagram of search strategy.


To be included in the review, publications or documents had to: i) be found on Web of Knowledge, PubMed, ProQuest, Google News, Google, and/or Google Scholar; ii) be open access or accessible through the Harvard University Library; iii) have a title and abstract/summary written in English; and iv) mention misinformation, fake news, disinformation, or false information in connection with the COVID-19 pandemic or health crisis in Ukraine.
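The screening logic implied by criteria i)–iv) can be summarized as a simple predicate, sketched below. The dictionary keys and the function name are our illustrative assumptions; screening was performed manually by the reviewers, not by software.

    # Databases searched (criterion i).
    DATABASES = {"Web of Knowledge", "PubMed", "ProQuest", "Google News", "Google", "Google Scholar"}
    # Terms that satisfy criterion iv when mentioned in the Ukrainian COVID-19/health-crisis context.
    DISINFO_TERMS = ("misinformation", "fake news", "disinformation", "false information")

    def meets_inclusion_criteria(doc: dict) -> bool:
        """Return True if a candidate document satisfies criteria i)-iv)."""
        return (
            doc["source_database"] in DATABASES                          # i) found via one of the databases
            and (doc["open_access"] or doc["harvard_accessible"])        # ii) open access or library-accessible
            and doc["title_abstract_language"] == "English"              # iii) English title and abstract/summary
            and any(t in doc["summary"].lower() for t in DISINFO_TERMS)  # iv) mentions a relevant term...
            and doc["about_covid_or_health_crisis_in_ukraine"]           # ...in the Ukrainian COVID-19 context
        )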

Both “misinformation” and “disinformation” were used as search terms. As noted above, the nuances of the terms are often missed or ignored in the common literature. For the purposes of this systematic review, whether the authors of the literature understood the distinction between these terms and used them appropriately was not judged or considered cause for exclusion. In this article, we use the term “disinformation” throughout for simplicity and consistency; based on the definition given earlier, most authors were describing disinformation, regardless of the terminology they chose. Health crisis communication has been difficult to define in the literature in terms of form or content [Quinn, 2018], so we used the definition of health or healthcare communication given by the Centers for Disease Control and Prevention (CDC) to evaluate each potential publication for inclusion. The CDC defines it as the “study and use of communication strategies to inform and influence decisions and actions to improve health” [Centers for Disease Control and Prevention, 2020].

We designed an analytical spreadsheet to systematically extract data from the accepted literature. The following data were extracted: month and year of publication, type of reference, summary, and key takeaways. Data extraction was carried out by two authors, who discussed their entries with each other to ensure consistency. Thematic analysis was used to synthesize the data by coding it and organizing it into themes [Braun and Clarke, 2006].
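For readers who want to reproduce the extraction-and-coding workflow, the minimal sketch below shows one way the spreadsheet rows and theme grouping could be represented; the Record class, field names beyond those listed above, and the example record are hypothetical illustrations, not the actual study data.

    from collections import defaultdict
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Record:
        """One row of the analytical spreadsheet used for data extraction."""
        month_year: str        # month and year of publication
        ref_type: str          # type of reference (news article, policy brief, technical report, ...)
        summary: str
        key_takeaways: str
        themes: List[int] = field(default_factory=list)  # theme codes assigned during thematic analysis

    def group_by_theme(records):
        """Organize coded records by theme, mirroring the thematic synthesis step."""
        grouped = defaultdict(list)
        for record in records:
            for theme in record.themes:
                grouped[theme].append(record)
        return grouped

    # Hypothetical record, for illustration only.
    example = Record("2020-02", "news article",
                     "Fake Ministry of Health email claiming imported COVID-19 cases",
                     "Sparked protests against evacuees", themes=[1, 2])
    print(len(group_by_theme([example])[1]))  # -> 1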

4 Results

Through our search strategy (Figure 1), we identified 715 texts published from January to May 2020 that contained information about COVID-19-related disinformation messaging in Ukraine. Of these, 134 were selected for full-text review based on screening of titles and summaries. After the full-text review, we retained the documents that matched the inclusion criteria; our review thus included 35 documents (Table 1) comprising policy briefs, news articles, technical reports, and peer-reviewed publications. Thirty-four percent of the included publications were published in March 2020, a 500% increase over the number published in January 2020. From March to May 2020, there was a 50% decrease in publications included in this review, as seen in Figure 2.
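Read against the 35 included documents, these percentages are internally consistent (our back-of-the-envelope check, not counts reported by the sources):

    \[
    0.34 \times 35 \approx 12 \ \text{(March)}; \qquad
    x(1 + 5) = 12 \ \Rightarrow\ x = 2 \ \text{(January, from the 500\% increase)}; \qquad
    12 \times (1 - 0.5) = 6 \ \text{(May)}.
    \]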



Figure 2: Timeline diagram of notable misinformation and disinformation incidents as they relate to COVID-19 cases.


Across the selected materials, the information related to COVID-19 and healthcare fell into two broad categories of disinformation: 1) trust in and accuracy of COVID-19 messaging, and 2) access to COVID-19-related medical goods and services. Within both themes, disinformation presents unique challenges and concerns. In both, the disinformation content discussed in the material revealed malicious intent, primarily by Russia and Russian-backed separatists (as seen in East Ukraine prior to the COVID-19 pandemic) [Jozwiak, 2020b; Tucker, 2020; Starbird and Wilson, 2020]. Broadly, the disinformation campaigns in Ukraine were intended to amplify and promote discord to create a political impact, particularly in the context of the ongoing war [Peel and Fleming, 2020; Sukhankin, 2020].


Table 1: Evidence table of records included in this review. In the “Theme(s)” column, 1 stands for “Trust in and accuracy of messages” and 2 for “Messages related to COVID-19 resources and support”.

4.1 Disinformation addressing trust and accuracy of official messages

The first theme that emerged in our set of publications was how disinformation affected trust in authorities and the dissemination and reception of accurate health crisis messaging. The publications discussed a spectrum of disinformation within this theme, from narratives that questioned the accuracy of official messaging to those that reinforced fake news in order to break trust in the government’s response. Overall, the majority of sources in this review emphasized the importance of establishing trust in official health advice, specifically regarding the coronavirus and the ways the Ukrainian healthcare system is combating it [Freelon and Lokot, 2020; Peters, 2020].

The European Union’s East StratCom offices had found 80 coronavirus-related disinformation cases on popular media channels since January 2020 [Polityuk and Zinets, 2020; Wanat, Cerulus and Scott, 2020]. Fake narratives and information about the pandemic varied: from the coronavirus being a human-made biological weapon tailored to Chinese DNA, to a pandemic spread by the United States (U.S.) President and U.S. soldiers, to the COVID-19 pandemic being entirely fake, as depicted in Russian media [Barnes, Rosenberg and Wong, 2020; Emmott, 2020].

This “significant disinformation campaign” was reported in a European Union document, which stated that its goal was to worsen the impact of the coronavirus, generate widespread panic, and cultivate distrust [Emmott, 2020]. For example, disinformation campaigns have targeted vulnerable groups to obstruct measures to contain a COVID-19 outbreak, such as by downplaying case numbers and fatality statistics to convey a low risk of contagion (e.g., no need for physical distancing or face coverings) or by falsely informing the public that contagious individuals were arriving in a particular municipality, instigating panic and disruption [Arciga, 2020; Korybko, 2020; Melkozerova and Parafeniuk, 2020; Miller, 2020a; Miller, 2020b; Peters, 2020].

These seemingly contradictory disinformation themes are examples of Russia’s multifaceted disinformation campaign against Ukraine. For instance, disinformation messaging in the occupied territories portrays Ukraine’s efforts to mitigate the pandemic as both inefficient and an overreaction, by simultaneously 1) overstating the number of cases in the Ukrainian military (to make the Ukrainian forces look weak) and 2) criticizing Ukraine’s quarantine measures as excessive and damaging to business [Ukraine Crisis Media Center, 2020a]. The portrayal of the Ukrainian government as weak and incompetent contributes to the public’s distrust of the state’s rules and regulations regarding preventive public health measures. Meanwhile, Russia continues to portray its own officials as experts, well prepared and unified in the country’s fight against COVID-19 [Kevere, 2020; Shea, 2020; Ukraine Crisis Media Center, 2020b].

Disinformation prevents a truly unified response to combat the spread of disease. In the occupied territories, for example, reports describe COVID-19 health measures as both “softer and stricter [than those of the Ukrainian government],” with softer limitations such as keeping religious centers open, but stricter measures such as criminalizing “fake news” and reducing benefits for senior citizens [Ukraine Crisis Media Center, 2020a].

4.2 Disinformation related to COVID-19 resources and support

In addition to disinformation aimed at undermining trust in official COVID-19 crisis communication, the literature included in our review highlighted narratives about pandemic resources and support, including comparisons to other countries. We found disinformation that interfered with access to legitimate COVID-19 resources and support, as well as commentary about resources and support both abroad and at home in Ukraine. These commentaries suggest an intention to influence political opinion, particularly in the conflict-affected areas.

The literature discussed examples of disinformation that could prevent the public from seeking effective medical goods or following sound medical advice. This disinformation included reports of “fraudsters” distributing medical products of “dubious quality” and disinformation campaigns promoting fake treatments that promised rapid cures for COVID-19 [British Broadcasting Corporation, 2020c; EUvsDisinfo, 2020].

In Ukraine and elsewhere, telecommunication service providers have responded to the pandemic by increasing free access to information resources and repurposing apps to serve the COVID-19 response [Lima, 2020]. However, telecom services themselves have been the target of disinformation attacks, such as a conspiracy theory linking COVID-19 to 5G, which led to the physical destruction of telephone masts in the U.K. [Lima, 2020]. Likewise, our search found references to a disinformation campaign in Ukraine that led people to question the safety of their tap and drinking water; concerned residents “overran” the water agency’s phone lines, and the agency had to issue a clarifying statement [EURACTIV and Agence France-Presse, 2020b].

Disinformation also instigated the physical interruption of access to healthcare. Multiple documents in our review described an incident in Ukraine involving a fake leaked email supposedly from the Ministry of Health [Arciga, 2020; EURACTIV and Agence France-Presse, 2020b; Korybko, 2020; Melkozerova and Parafeniuk, 2020; Miller, 2020a; Miller, 2020b]. The fake email, which falsely indicated there were five cases of COVID-19 in Ukraine, went viral on the same day that a plane of healthy evacuees from China’s Hubei province landed, sowing panic that the incoming people might be bringing the first cases of the virus to Ukraine. This fear led residents in a small town to attack a bus carrying the healthy evacuees to a medical center for quarantine. Protesters also fought with police and blocked the road to the medical facility. Residents in other towns similarly disrupted access to the local hospital out of fear that contagious people would intentionally be brought into their community [Miller, 2020b].

The literature also described ways disinformation was used as political leverage against U.S. sanctions [Jozwiak, 2020a; Jozwiak, 2020c]. Russia, Iran, and China carried out coordinated campaigns claiming that the sanctions against Russia, Iran, Syria, and Venezuela were preventing these countries from mounting an effective humanitarian and medical response [Jozwiak, 2020a; Jozwiak, 2020c]. These narratives were found to serve the three countries’ desire to undermine trust in the West and pressure the U.S. to lift sanctions. At the same time, placing the blame on the U.S. could deflect criticism away from these countries for their inability to provide adequate resources and supplies.

Similarly, a policy brief described the competition between the E.U. and Russia/China to be viewed as the “good guys” through health diplomacy [Kosmehl, 2020]. Other documents refer to Ukraine’s need for outside assistance to fight the pandemic [Paul and Filipchuk, 2020] and the European Commission’s financial assistance to Ukraine [Ukraine Business Daily, 2020]. On a more local level, the use of healthcare to “win hearts and minds” has a strong history in Eastern Ukraine throughout the ongoing conflict. To garner political support, pro-Russian separatists in the occupied territories and pro-Ukrainian authorities in government-controlled areas use health services to generate goodwill among people on both sides of the border and to create an image of life being better and safer on their side [Hyde, 2018].

5 Discussion

As seen in the included publications, disinformation about COVID-19 is pervasive in Ukraine, with multiple effects and suspected goals. The power and efficacy of disinformation campaigns can be inferred from the responsive actions taken by the Ukrainian government and smaller advocacy groups. In early 2020, the Ukrainian parliament considered a bill under which the purposeful spreading of disinformation could be punished with fines and up to seven years in prison [EURACTIV and Agence France-Presse, 2020a]. As disinformation about the outbreak spread worldwide, various small groups such as StopFake and COVID19MisInfo created websites and databases to transparently monitor and counter misrepresented information, with a special focus on the COVID-19 pandemic [Haigh, Haigh and Kozak, 2018; Ryerson University Social Media Lab, 2020]. In May 2020, the Ukrainian government identified over 300 individuals and over 2,100 internet communities that were actively spreading fake and malicious information about the COVID-19 outbreak [British Broadcasting Corporation, 2020b].

Our review suggests that disinformation related to crisis communication about COVID-19 focused on eroding trust in the government’s response and the accuracy of official health messaging, or on misleading the public with regard to accessing and receiving resources or support. Decreased trust in governments and public health systems leads to disregard for official health advice and affects the population’s medical decision-making, often with serious detrimental effects [Clark-Ginsberg and Petrun Sayers, 2020]. These factors are compounded in disadvantaged or vulnerable populations, such as those living in poverty, in regions of conflict, or in areas with poor infrastructure. This can be attributed to a legacy of government mistreatment and a general lack of access to reliable information, which strengthens the impact of disinformation campaigns [Clark-Ginsberg and Petrun Sayers, 2020].

Media consumption and trust in media are slowly evolving in Ukraine. As of 2018, 82 percent of Ukrainians are online, an increase of 12 percent since 2015, and Ukrainians are increasingly turning to social media for news [Internews in Ukraine, 2018]. According to a 2018 Internews study, Ukrainians strongly prefer national (as opposed to regional, global or Russian) media sources, with the exception of print media. Television was the most commonly used source for news, with internet usage varying by region and over time [Internews in Ukraine and USAID, 2018].

Importantly, the 2018 Internews study showed a steadily increasing use of Facebook for news — more commonly than other forms of social media. This trend implies that Ukrainians are turning to Facebook as a reliable source of news, with 42 percent of respondents in 2018 indicating a preference for Facebook for news over other forms of social media [Internews in Ukraine and USAID, 2018 ]. Given the prominence of social media as a platform for disinformation campaigns, this trend could reveal an increasing exposure to disinformation among Ukrainians.

Based on the evidence, our recommendations are to 1) increase the transparency of health crisis communication, such as including eyewitness videos in television news communication; and, 2) address the leadership gap in disseminating reliable regional information about COVID-19 resources and support in Ukraine. As seen in many countries around the world, it is challenging to effectively reduce the community spread of the coronavirus when information is being maliciously skewed by a foreign country. Our study findings and recommendations seek to mitigate this problem.

By increasing transparency and including eyewitness videos, information providers earn more trust among the viewing audience. In a previous study examining the conflict in Ukraine in the TV news, analysts found that eyewitness videos affect perceptions of trustworthiness through “feelings of presence and empathy as well as perceptions of authenticity and bias” [Halfmann et al., 2019 ].

Establishing a legitimate and trusted entity at the regional level that possesses the capacity, capabilities and authority required to address the messaging related to COVID-19 resources and support would also inspire trust and confidence among local citizens. This entity would need to be directly supported by a transparently governed international partner, such as the United Nations, the World Health Organization (WHO), or NATO. An expert leadership committee could monitor and address hybrid tactics that are disinforming regional localities about COVID-19 resources and support. The WHO recently documented misinformation about COVID-19 research in its COVID-19 situation report 121, but only convened experts into a working group to help conduct scientific research on COVID-19, not to monitor disinformation challenges [World Health Organization, 2020]. Legitimate international and regional expert committees that can act as credible and powerful watchdog organizations and disseminate verified information are essential to countering fake news. More recently, a new initiative by the United Nations has started to verify COVID-19 messaging [United Nations, 2020a; United Nations, 2020b]. Only time will tell if this initiative will be adopted locally in Ukraine and in other countries around the world.

These efforts must be tailored to specific populations according to their vulnerability to disinformation, their media preferences, and their access to quality resources and medical care. Coordination with non-governmental entities, such as faith-based organizations, community organizations and NGOs, may prove effective in areas where these entities have the trust of the local population [Clark-Ginsberg and Petrun Sayers, 2020 ].

Our review has several possible limitations. When identifying publications for review, confirmation bias may have occurred in the selection process. To reduce this possible bias, we had two researchers independently choose the accepted papers and established an explicit set of inclusion criteria. The reliability of the thematic analysis could be another limitation, despite having two researchers perform it; a third, impartial researcher checked the reliability of the themes assigned to the included papers and resolved any disagreements between the two authors. Additionally, it is unlikely that we identified every relevant published article, because we selected only English-language articles, did not use any specialized Ukrainian search engine, and faced a high volume of rapid COVID-19 publications, which may include low-quality work owing to research fatigue [Patel, Webster et al., 2020]. It would be beneficial to work with Ukrainian researchers to conduct a similar study in the Ukrainian and Russian languages, taking into consideration the reputability of the media sources. The misinformation or disinformation terminology of the papers included in this review may not align precisely with the DoD-mandated definitions. Thus, there may be cases where original authors wrote “disinformation” but were describing misinformation, and vice versa. Similarly, our use of “disinformation” throughout this paper may imply an intentional campaign in cases where the erroneous information was simply misinformation, such as an honest misinterpretation of medical advice that then went viral.

Nevertheless, this review sheds light on opportunities for further research and scientific inquiry. Further research methodologies and evidence should be established to improve COVID-19 knowledge, resources, and support for vulnerable populations in Ukraine. This recommendation is particularly important if there are subsequent surges of this novel coronavirus or other emerging infections regionally and globally. An analysis of disinformation by platform would lend insight into which channels are the most infiltrated and could be compared with data on Ukrainians’ use of media and trust in specific types of media. Similarly, cases wherein the disinformation was spread by mainstream media deserve more attention to fully grasp how frequently, for what reason, and how this occurred.

Analysis of communication and trust at the institutional level — including non-governmental entities — could also improve strategies to effectively respond to the pandemic: Which organizations do Ukrainians trust most for health-related information and care? How are these organizations communicating, and are they being usurped by disinformation? Could these selected organizations help to counter the negative impact of disinformation that destroys trust? As noted above, these analyses will be most useful when they assess data by community (e.g., rural versus city, Russian-speaking versus Ukrainian-speaking, and conflict versus conflict-free zones).

Further research is needed to assess the efficacy of disinformation campaigns and their overall impact. Additional contextual information, such as data on Ukraine’s COVID-19 cases and deaths, an analysis of the strength of the state’s response, and measures of civilian adherence to official guidelines, could deepen our understanding of how these campaigns work and how severe a threat they pose. While it may be difficult to obtain, evidence of the involvement of Russia’s (and other states’) governments in disinformation campaigns provides insights for international relations, and uncovering the mode of operation for developing and disseminating the disinformation is also an important piece of the complex overall puzzle.

Additional insight into Ukrainians’ media literacy and awareness of resources to fact check potential disinformation could help formulate more specific educational approaches. For example, the 2018 Internews study found that the number of Ukrainians who critically evaluate news sources for accuracy has steadily (but marginally) increased since 2015. Nonetheless, under 30 percent said they pay attention to the source of the news and whether more than one point of view is presented, and only 14 percent said they pay attention to the owner of the media. At the same time, 26 percent said they trust media intuitively [Internews in Ukraine and USAID, 2018 ]. An updated survey of media literacy in the context of COVID-19 and a study of the use and efficacy of verification sources like StopFake would both complement our review of disinformation literature.

6 Conclusion

The news and media events that occurred in Ukraine are an early indicator of disinformation during the COVID-19 outbreak. This paper provides an overview of how health crisis disinformation from a calculating outside force has impacted communities in Ukraine. This impact makes Ukraine a valuable case study of health crisis communication during a global pandemic. What makes this case particularly intriguing are the numerous examples of disinformation and their negative implications, both regionally and nationally, in this vulnerable and conflict-ridden region.

Acknowledgments

Timothy B. Erickson was country lead for NATO Science for Peace and Security Programme fund numbers: G5432 & G5663, which supported activities of Advanced Research Workshop and Advanced Training Course in Ukraine. Sonny S. Patel was supported by the Fogarty International Center and National Institute of Mental Health, of the National Institutes of Health under Award Number D43 TW010543. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health, NATO Science for Peace and Security Programme, or any institution.

References

Arciga, J. (20th February 2020). ‘Coronavirus disinformation sparked violent protests in Ukraine’. Daily Beast . URL: https://www.thedailybeast.com/coronavirus-disinformation-sparked-violent-protests-in-ukraine .

Barnes, J. E., Rosenberg, M. and Wong, E. (28th March 2020). ‘As virus spreads, China and Russia see openings for disinformation’. The New York Times . URL: https://www.nytimes.com/2020/03/28/us/politics/china-russia-coronavirus-disinformation.html .

Borah, P. (2nd April 2020). ‘Trump’s poor relationship with the media has made the U.S. Covid-19 outbreak worse’. USApp-American Politics and Policy Blog . URL: https://blogs.lse.ac.uk/usappblog/2020/04/02/trumps-poor-relationship-with-the-media-has-made-the-us-covid-19-outbreak-worse/ .

Braun, V. and Clarke, V. (2006). ‘Using thematic analysis in psychology’. Qualitative Research in Psychology 3 (2), pp. 77–101. https://doi.org/10.1191/1478088706qp063oa .

British Broadcasting Corporation (23rd April 2020a). ‘Covid-19 disinformation: Ukraine blocks 5,000 web links’. BBC News . URL: https://search.proquest.com/docview/2393494700 .

— (4th May 2020b). ‘Covid-19 disinformation: Ukraine busts 301 online’. BBC News . URL: http://search.proquest.com/docview/2397471781 .

— (2nd April 2020c). ‘Ukraine opens 30 criminal cases over covid-19 disinformation’. BBC News . URL: http://search.proquest.com/docview/2385109273 .

Broniatowski, D. A., Jamison, A. M., Qi, S., AlKulaib, L., Chen, T., Benton, A., Quinn, S. C. and Dredze, M. (2018). ‘Weaponized health communication: Twitter bots and Russian trolls amplify the vaccine debate’. American Journal of Public Health 108 (10), pp. 1378–1384. https://doi.org/10.2105/ajph.2018.304567 .

Centers for Disease Control and Prevention (2020). URL: https://www.cdc.gov/healthcommunication/healthbasics/index.html .

Clark-Ginsberg, A. and Petrun Sayers, E. L. (2020). ‘Communication missteps during Covid-19 hurt those already most at risk’. Journal of Contingencies and Crisis Management . https://doi.org/10.1111/1468-5973.12304 .

Emmott, R. (18th March 2020). ‘Russia deploying coronavirus disinformation to sow panic in West, E.U. document says’. Reuters . URL: https://www.reuters.com/article/us-health-coronavirus-disinformation/russia-deploying-coronavirus-disinformation-to-sow-panic-in-west-eu-document-says-idUSKBN21518F .

EURACTIV and Agence France-Presse (6th February 2020a). ‘OSCE warns Ukraine over disinformation bill’. EURACTIV . URL: https://www.euractiv.com/section/europe-s-east/news/osce-warns-ukraine-over-disinformation-bill/ .

— (27th March 2020b). ‘Ukraine struggles to debunk fake virus news’. EURACTIV . URL: https://www.euractiv.com/section/europe-s-east/news/ukraine-struggles-to-debunk-fake-virus-news/ .

EUvsDisinfo (26th March 2020). Disinformation can kill . URL: https://euvsdisinfo.eu/disinformation-can-kill/ .

Francis, D. (9th May 2018). ‘Unreality TV: why the Kremlin’s lies stick’. Atlantic Council . URL: https://www.atlanticcouncil.org/blogs/ukrainealert/unreality-tv-why-the-kremlin-s-lies-stick-at-home .

Freelon, D. and Lokot, T. (14th January 2020). ‘Russian Twitter disinformation campaigns reach across the American political spectrum’. Harvard Kennedy School Misinformation Review 1 (1). https://doi.org/10.37016/mr-2020-003 .

Gabrielle, L. (8th May 2020). Global engagement center update on PRC efforts to push disinformation and propaganda around Covid . URL: https://www.state.gov/briefing-with-special-envoy-lea-gabrielle-global-engagement-center-update-on-prc-efforts-to-push-disinformation-and-propaganda-around-covid/ .

Haigh, M., Haigh, T. and Kozak, N. I. (2018). ‘Stopping fake news: the work practices of peer-to-peer counter propaganda’. Journalism Studies 19 (14), pp. 2062–2087. https://doi.org/10.1080/1461670x.2017.1316681 .

Halfmann, A., Dech, H., Riemann, J., Schlenker, L. and Wessler, H. (2019). ‘Moving closer to the action: how viewers’ experiences of eyewitness videos in TV news influence the trustworthiness of the reports’. Journalism & Mass Communication Quarterly 96 (2), pp. 367–384. https://doi.org/10.1177/1077699018785890 .

Harding, L. (3rd May 2018). ‘‘Deny, distract and blame’: how Russia fights propaganda war’. The Guardian . URL: https://www.theguardian.com/uk-news/2018/may/03/russia-propaganda-war-skripal-poisoning-embassy-london .

HHS Cybersecurity Program (14th May 2020). Covid-19 related nation-state and cyber criminal targeting of the healthcare sector . (Report # 202005141030). URL: https://www.aha.org/system/files/media/file/2020/05/hc3-tlp-white-covid19-related-nation-state-cyber-criminal-targeting-healthcare-sector-5-14-2020.pdf .

Hyde, L. (21st August 2018). ‘Now healthcare is a weapon in Ukraine’s war’. Coda . URL: https://www.codastory.com/disinformation/armed-conflict/healthcare-weapon-ukraine/ .

Internews in Ukraine (5th September 2018). Trust in media on the rise in Ukraine — a new USAID-Internews media consumption survey says . URL: https://internews.in.ua/news/trust-in-media-on-the-rise-in-ukraine-a-new-usaid-internews-media-consumption-survey-says/ .

Internews in Ukraine and USAID (2018). Media consumption survey in Ukraine 2018 . URL: https://internews.in.ua/wp-content/uploads/2018/09/2018-MediaConsumSurvey_eng_FIN.pdf .

Jack, C. (2017). Lexicon of lies: terms for problematic information. Data & Society. URL: https://datasociety.net/pubs/oh/DataAndSociety_LexiconofLies.pdf .

Jankowicz, N. (17th April 2019). ‘Ukraine’s election is an all-out disinformation battle’. The Atlantic . URL: https://www.theatlantic.com/international/archive/2019/04/russia-disinformation-ukraine-election/587179/ .

Jozwiak, R. (21st May 2020a). ‘E.U. monitor sees drop in Covid-19 disinformation, urges social media to take more action’. RadioFreeEurope RadioLiberty . URL: https://www.rferl.org/a/covid-19-disinformation-eu-decrease-kremlin-china-bill-5g-gates/30625483.html .

— (18th March 2020b). ‘E.U. monitors say pro-Kremlin media spread coronavirus disinformation’. RadioFreeEurope RadioLiberty . URL: https://www.rferl.org/a/eu-monitors-say-pro-kremlin-media-spread-coronavirus-disinformation/30495695.html .

— (22nd April 2020c). ‘E.U. monitors see coordinated Covid-19 disinformation effort by Iran, Russia, China’. RadioFreeEurope RadioLiberty . URL: https://www.rferl.org/a/eu-monitors-sees-coordinated-covid-19-disinformation-effort-by-iran-russia-china/30570938.html .

Kevere, O. (13th May 2020). ‘The illusion of control: Russia’s media ecosystem and Covid-19 propaganda narratives’. Visegrad Insight . URL: https://visegradinsight.eu/the-illusion-of-control-russian-propaganda-covid19/ .

Korybko, A. (22nd February 2020). ‘Ukraine’s anti-Covid-19 riot is due to fake news and media-driven fear’. CGTN . URL: https://news.cgtn.com/news/2020-02-22/Ukraine-s-anti-COVID-19-riot-is-due-to-fake-news-and-media-driven-fear-Oi9eszIfzW/index.html .

Kosmehl, M. (8th April 2020). Geopolitical symptoms of Covid-19: narrative battles within the eastern partnership . Bertelsmann/Stiftung Policy Brief. URL: https://www.bertelsmann-stiftung.de/en/publications/publication/did/geopolitical-symptoms-of-covid-19-narrative-battles-within-the-eastern-partnership-all .

Krafft, P. M. and Donovan, J. (2020). ‘Disinformation by design: the use of evidence collages and platform filtering in a media manipulation campaign’. Political Communication 37 (2), pp. 194–214. https://doi.org/10.1080/10584609.2019.1686094 .

Kyriakidou, M., Morani, M., Soo, N. and Cushion, S. (7th May 2020). ‘Government and media misinformation about Covid-19 is confusing the public’. LSE COVID-19 Blog . URL: https://blogs.lse.ac.uk/covid19/2020/05/07/government-and-media-misinformation-about-covid-19-is-confusing-the-public/ .

Lima, J. M. (20th April 2020). ‘Live/Covid-19 news: how the telecoms world is dealing with the pandemic’. Capacity Magazine . URL: https://www.capacitymedia.com/articles/3825122/live-covid-19-news-how-the-telecoms-world-is-dealing-with-the-pandemic .

Melkozerova, V. and Parafeniuk, O. (3rd March 2020). ‘How coronavirus disinformation caused chaos in a small Ukrainian town’. NBC News . URL: https://www.nbcnews.com/news/world/how-coronavirus-disinformation-caused-chaos-small-ukrainian-town-n1146936 .

Merriam-Webster (2020a). Disinformation . URL: https://www.merriam-webster.com/dictionary/disinformation (visited on 1st June 2020).

— (2020b). Misinformation . URL: https://www.merriam-webster.com/dictionary/misinformation (visited on 1st June 2020).

Miller, C. (9th March 2020a). ‘A small town was torn apart by coronavirus rumors’. BuzzFeed News . URL: https://www.buzzfeednews.com/article/christopherm51/coronavirus-riots-social-media-ukraine .

— (20th February 2020b). ‘A viral email about coronavirus had people smashing buses and blocking hospitals’. BuzzFeed News . URL: https://www.buzzfeednews.com/article/christopherm51/coronavirus-ukraine-china .

Munoz Mosquera, A. B. and Bachmann, S. D. (2016). ‘Lawfare in hybrid wars: the 21st century warfare’. Journal of International Humanitarian Legal Studies 7 (1), pp. 63–87. https://doi.org/10.1163/18781527-00701008 .

North Atlantic Treaty Organization (September 2017). NATO Strategic Communication (StratCom) handbook. URL: http://stratcom.nuou.org.ua/wp-content/uploads/2020/01/NATO-STRATEGIC-COMMUNICATION-HANDBOOK.pdf .

— (2019). NATO’s response to hybrid threats . URL: https://www.nato.int/cps/en/natohq/topics_156338.htm .

Office of the Chairman of the Joint Chiefs of Staff (January 2020). DOD dictionary of military and associated terms. U.S.A.: Doctrine for the Armed Forces of the United States. URL: https://www.jcs.mil/Portals/36/Documents/Doctrine/pubs/dictionary.pdf .

Patel, S. S., Rogers, M. B., Amlôt, R. and Rubin, G. J. (2017). ‘What do we mean by ‘community resilience’? A systematic literature review of how it is defined in the literature’. PLoS currents 9.

Patel, S. S., Grace, R. M., Chellew, P., Prodanchuk, M., Romaniuk, O., Skrebets, Y., Ryzhenko, S. A. and Erickson, T. B. (2020). ‘Emerging technologies and medical countermeasures to Chemical, Biological, Radiological and Nuclear (CBRN) agents in east Ukraine’. Conflict and Health 14 (1), 24. https://doi.org/10.1186/s13031-020-00279-9 .

Patel, S. S., Webster, R. K., Greenberg, N., Weston, D. and Brooks, S. K. (2020). ‘Research fatigue in Covid-19 pandemic and post-disaster research: causes, consequences and recommendations’. Disaster Prevention and Management: An International Journal . https://doi.org/10.1108/dpm-05-2020-0164 .

Paul, A. and Filipchuk, V. (31st March 2020). ‘The impact of Covid-19 on the E.U.’s neighbourhood: Ukraine’. European Policy Centre . URL: http://www.epc.eu/en/Publications/The-impact-of-COVID-19-on-the-EUs-neighbourhood-Ukraine~31202c .

Peel, M. and Fleming, S. (17th March 2020). ‘E.U. warns of pro-Kremlin disinformation campaign on coronavirus’. Financial Times . URL: https://www.ft.com/content/d65736da-684e-11ea-800d-da70cff6e4d3 .

Peters, J. (21st February 2020). ‘Coronavirus email hoax led to violent protests in Ukraine’. The Verge . URL: https://www.theverge.com/2020/2/21/21147969/coronavirus-misinformation-protests-ukraine-evacuees .

Polityuk, P. and Zinets, N. (21st February 2020). ‘After clashes, Ukraine blames disinformation campaign for spreading coronavirus panic’. Reuters . URL: https://www.reuters.com/article/china-health-ukraine/after-clashes-ukraine-blames-disinformation-campaign-for-spreading-coronavirus-panic-idINL8N2AL335 .

Pomerantsev, P. and Weiss, M. (2014). The menace of unreality: how the Kremlin weaponizes information, culture and money. New York, NY, U.S.A.: Institute of Modern Russia. URL: https://imrussia.org/media/pdf/Research/Michael_Weiss_and_Peter_Pomerantsev__The_Menace_of_Unreality.pdf .

Popovych, N., Makukhin, O., Tsybulska, L. and Kavatsiuk, R. (2018). Image of European countries on Russian TV. URL: https://uacrisis.org/wp-content/uploads/2018/02/TV-3.pdf .

Quinn, P. (2018). ‘Crisis communication in public health emergencies: the limits of ‘legal control’ and the risks for harmful outcomes in a digital age’. Life Sciences, Society and Policy 14 (1), 4. https://doi.org/10.1186/s40504-018-0067-0 .

RFE/RL (1st April 2020). ‘E.U. monitor: Russia, China sow distrust in west through coronavirus disinformation’. RadioFreeEurope RadioLiberty . URL: https://www.rferl.org/a/eu-monitor-russia-china-sow-distrust-in-west-through-coronavirus-disinformation/30523558.html .

Ryerson University Social Media Lab (2020). About the COVID19MisInfo.org portal . URL: https://covid19misinfo.org/about-page/ .

Samet, J. M. and Patel, S. S. (2011). The psychological and welfare consequences of the Chernobyl disaster: a systematic literature review, focus group findings and future directions. USC Institute for Global Health. URL: https://uscglobalhealth.files.wordpress.com/2016/01/chernobyl_report_april2011.pdf .

— (2013). Selected health consequences of the Chernobyl disaster: a further systematic literature review, focus group findings and future directions. USC Institute for Global Health. URL: https://uscglobalhealth.files.wordpress.com/2013/05/samet-patel-chernobyl-health-report-2013-light-1.pdf .

Shea, J. (21st April 2020). ‘Expert on eastern Europe healthcare voices concern about Covid-19 in Russia and Ukraine’. Medical Express . URL: https://medicalxpress.com/news/2020-04-expert-eastern-europe-healthcare-voices.html .

Starbird, K. and Wilson, T. (14th January 2020). ‘Cross-platform disinformation campaigns: lessons learned and next steps’. Harvard Kennedy School Misinformation Review 1 (1). https://doi.org/10.37016/mr-2020-002 .

Sukhankin, S. (2020). ‘Covid-19 as a tool of information confrontation: Russia’s approach’. The School of Public Policy Publications 13. https://doi.org/10.11575/sppp.v13i0.70113 .

Talant, B. (18th March 2020). ‘Coronavirus misinformation goes viral in Ukraine’. Kyiv Post . URL: https://www.kyivpost.com/ukraine-politics/coronavirus-misinformation-goes-viral-in-ukraine.html .

Tucker, P. (26th March 2020). ‘Russia pushing coronavirus lies as part of anti-NATO influence ops in Europe’. Defense One . URL: https://www.defenseone.com/technology/2020/03/russia-pushing-coronavirus-lies-part-anti-nato-influence-ops-europe/164140/ .

Ukraine Business Daily (8th April 2020). ‘E.U. to issue Eur190 mln to Ukraine to fight coronavirus’. Ukraine Business Daily . URL: http://search.proquest.com/docview/2387424188 .

Ukraine Crisis Media Center (22nd May 2020a). ‘Covid-404. Implications of Covid-19 in the occupied territories of Ukraine’. Ukraine Crisis Media Center . URL: https://uacrisis.org/en/implications-of-covid-19-in-the-occupied-territories-of-ukraine .

— (28th May 2020b). ‘How Kremlin (mis)handles Covid-19 at home and abroad’. Ukraine Crisis Media Center . URL: https://uacrisis.org/en/how-kremlin-mis-handles-covid-19-at-home-and-abroad .

United Nations (21st May 2020a). ‘UN launches new initiative to fight Covid-19 misinformation through ‘digital first responders’’. UN News . URL: https://news.un.org/en/story/2020/05/1064622 .

— (2020b). Verified . URL: https://www.shareverified.com/en .

Wanat, Z., Cerulus, L. and Scott, M. (19th March 2020). ‘E.U. warns of ‘pro-Kremlin’ disinfo on coronavirus pandemic’. Politico . URL: https://www.politico.eu/article/eu-warns-on-pro-kremlin-disinfo-on-coronavirus-pandemic/ .

World Health Organization (2020). Coronavirus disease 2019 (Covid-19) . Situation report 121. URL: https://www.who.int/docs/default-source/coronaviruse/situation-reports/20200520-covid-19-sitrep-121.pdf?sfvrsn=c4be2ec6_4 .

Authors

Sonny S. Patel, MPH, MPhil, NIH Fogarty Global Health Scholar and Fellow, Harvard Humanitarian Initiative, Department of Global Health and Population, Harvard T.H. Chan School of Public Health, Boston, MA, U.S.A. E-mail: spatel@hsph.harvard.edu .

Omar E. Moncayo, MCM, Department of Preventive Medicine, University of Southern California, Los Angeles, CA, U.S.A. E-mail: omoncayo@usc.edu .

Kristina M. Conroy, MA, Communications Manager, Ukrainian Research Institute, Harvard University, Cambridge, MA, U.S.A. E-mail: kconroy@fas.harvard.edu .

Doug Jordan, MA, MS, Doctoral Candidate, Communication, University of South Florida; Course Director, Joint Special Operations University, United States Special Operations Command, MacDill AFB, FL; Former Ministry of Defense Advisor Ukraine, Strategic Communication, U.S. Embassy, Kyiv, Ukraine. E-mail: douglas.jordan@socom.mil .

Timothy B. Erickson, MD, Division of Medical Toxicology, Department of Emergency Medicine, Brigham Health, Harvard Humanitarian Initiative, Harvard Medical School, Boston, MA, U.S.A. E-mail: terickson@bwh.harvard.edu .