Access to high-quality evaluation results is essential for science communicators to identify negative patterns of audience response and improve outcomes. However, there are many good reasons why robust evaluation is not routinely conducted and linked to science communication practice. This essay begins by identifying some of the common challenges that explain this gap between evaluation evidence and practice. Automating evaluation processes through new technologies is then presented as one solution to these challenges, capable of yielding accurate real-time results that can directly feed into practice. Automating evaluation through smartphone and web apps tied to open source analysis tools can deliver ongoing evaluation insights without the expense of regularly employing external consultants or hiring evaluation experts in-house. While such automation does not address all evaluation needs, it can save resources and equip science communicators with the information they need to continually enhance practice for the benefit of their audiences.
The Red de Popularización de la Ciencia y la Tecnología en América Latina y el Caribe (RedPOP) (Latin American and Caribbean Network for the Popularization of Science and Technology) was created 25 years ago as an expression of a movement that started in the 1960s in favour of scientific education. The purpose of this movement was to incorporate science into the general knowledge of the population by communicating science through different media, products and spaces such as museums and science centres. Since then, the movement has acquired considerable strength in Latin America, and RedPOP has been a key factor in the development of this activity in the region, although several challenges still have to be addressed.
This commentary shares the personal ‘learning curve’ of a science communication researcher regarding the impact of (playful) tools and processes for inclusive deliberation on emerging techno-scientific topics in the contemporary era of two-way science and technology communication practices, as needed and desired in responsible research and innovation (RRI) contexts. Starting from the macro-level impacts that these processes are supposed to have on research and innovation practices and society, as encouraged by the RRI community, the author discovers more about ‘micro-level’ impacts through conversations with peers at her department, Athena (VU University, Amsterdam), as well as through experiencing the SiP 2015 conference in Bristol. On that basis, she defines several ‘impact-spheres’: a modular set of flexibly defined micro-level impacts that events in RRI contexts can have on both academic and non-academic participants, with respect and relationship development as focal assets to aim for; individual (micro-)changes that potentially build up towards an ‘RRI world’.
King et al. [2015] argue that ‘emphasis on impact is obfuscating the valuable role of evaluation’ in informal science learning and public engagement (p. 1). The article touches on a number of important issues pertaining to the role of evaluation, informal learning, science communication and public engagement practice. In this critical response essay, I highlight the article’s tendency to construct a straw man version of ‘impact evaluation’ that is impossible to achieve, while exaggerating the value of simple forms of feedback-based evaluation exemplified in the article. I also identify a problematic tendency, evident in the article, to view the role of ‘impact evaluation’ in advocacy terms rather than as a means of improving practice. I go through the evaluation example presented in the article to highlight alternative, impact-oriented evaluation strategies, which would have addressed the targeted outcomes more appropriately than the methods used by King et al. [2015]. I conclude that impact evaluation can be much more widely deployed to deliver essential practical insights for informal learning and public engagement practitioners.
The drive for impact from research projects presents a dilemma for science communication researchers and practitioners — should public engagement be regarded only as a mechanism for providing evidence of the impact of research, or as itself a form of impact? This editorial describes the curation of five commentaries resulting from the recent international conference ‘Science in Public: Research, Practice, Impact’. The commentaries reveal the issues science communicators may face in implementing public engagement with science that has an impact: from planning and co-producing projects with impact in mind, to organising and operating activities which meet the needs of our publics, and finally measuring and evaluating the effects on scientists and publics in order to ‘capture impact’.
RedPOP celebrated its 25th anniversary, and the congress was a great occasion to commemorate it. More than 400 attendees from 23 countries around the world had the opportunity to talk about the relationships between art, science, education, public policy on science appropriation, science journalism, and new ways to reach public audiences. At the same time, a Science Theater Festival was held. The congress in numbers: 5 keynote lectures, 245 simultaneous presentations, 8 Working Groups, 9 simultaneous Workshops, 22 posters and 6 theater plays, with 10 countries from Latin America represented. Conversation was essential in this congress, and everything was prepared to motivate it. Participants had the opportunity to hear voices from Latin America and beyond through the international keynotes. The challenging issues raised in the plenary sessions, as well as the opportunity to make their voices heard during the Working Groups and to work in the Workshops with the keynote speakers, made this a motivational meeting.
The largest meeting of science journalists took place this summer in Seoul, Korea. It bore the imprint of a few of the previous ones, as a gathering to build community and encourage beginners, but also showed some marked changes from when it all started back in 1992, as told by some of the leading actors.
Between 2010 and July 2015, a group of researchers at the Department of History and Philosophy of Science, University of Cambridge and the National Maritime Museum were engaged in an Arts & Humanities Research Council-funded project “The Board of Longitude 1714–1828: Science, innovation and empire in the Georgian world”. The project team included a dedicated Public Engagement Officer whose role was to engage audiences with the outputs of the research project.
The National Maritime Museum celebrated the 300th anniversary of the 1714 Longitude Act with a major exhibition, Ships, Clocks & Stars: The Quest for Longitude, which told the story of the 18th-century quest for longitude, alongside a series of longitude-themed events. To commemorate the same anniversary, NESTA launched the 2014 Longitude Prize, a challenge to find a solution to today’s equivalent of the longitude problem, with the problem chosen by a public vote. Using these two examples as a case study, I explore how history of science helps science communication organisations engage people with science, and vice versa.
The narrative method of presenting popular science promises to extend the audience of science, but carries risks related to two broad aspects of story: the power of narrative to impose a compelling and easily interpretable structure on discrete events, and the unpredictability and mystique associated with story.
Modern technology and innovation research needs to collect and analyse users’ requirements from the outset of a project’s design, according to the Responsible Research and Innovation (RRI) approach. Bringing in new services without involving end-users throughout the research process does not make for optimal results in terms of scientific, technological and economic impact. This commentary reports on research experience of stakeholder involvement and co-production in Italy, implemented in Earth Observation downstream services at the regional level. It describes the participative approach and methods adopted, and the impacts and benefits derived.