Filter by keyword: Public engagement with science and technology

Publications including this keyword are listed below.

293 publications found

Sep 29, 2015 Commentary
Moving beyond the seductive siren of reach: planning for the social and economic impacts emerging from school-university engagement with research

by Richard Holliman and Gareth Davies

In the past 25 years school-university partnerships have undergone a transition from ad hoc arrangements to strategic partnerships. Over the past two and a half years we have worked in partnership with teachers and pupils from the Denbigh Teaching School Alliance in Milton Keynes, UK.
Our aims have been to encourage the Open University and local schools in Milton Keynes to value, recognise and support school-university engagement with research, and to create a culture of reflective practice.
Through our work we have noted a lack of suitable planning tools that work for researchers, teachers and pupils. Here we propose a flexible and adaptable metric to support stakeholders as they plan for, enact and evaluate direct and meaningful engagement between researchers, teachers and pupils. The objective of the metric is to make transparent the level of activity required of the stakeholders involved — teachers, pupils and researchers — whilst also providing a measure for institutions and funders to assess the relative depth of engagement; in effect, to move beyond the seductive siren of reach.

Volume 14 • Issue 03 • 2015

Sep 29, 2015 Letter
A response to “Highlighting the value of impact evaluation: enhancing informal science learning and public engagement theory and practice”

by Heather King and Kate Steiner

Whilst welcoming Jensen’s response to our original paper, we suggest that our main argument may have been missed. We agree that there are many methods for conducting impact assessments in informal settings. However, using such tools is beyond the capacity of many practitioners, who lack the budgets, time, and expertise needed to interpret findings.
More particularly, we reiterate the importance of challenging the prevailing policy discourse in which longitudinal impact studies are regarded as the ‘gold standard’, and instead call for a new discourse that acknowledges what is feasible and useful in informal sector evaluation practice.

Volume 14 • Issue 03 • 2015

Jul 24, 2015 Essay
“Queue up, you stupid!”: communicating about technology problems. An exploratory study of warning messages posted on machines in public places

by Beatrice Arbulla and Massimiano Bucchi

Communication about technology has long been neglected within the field of science and technology communication. This visual exploratory study focuses on how users can communicate with and about technology in public places through warning signs posted on technological devices.
Three broad categories of messages have been identified: bad design, malfunctioning and disciplining users. By analyzing examples within each category, we suggest that studying these communicative situations can be a key to understanding how users are engaged in continuous, elaborate and sometimes even conflicting framing of technological devices (e.g. with regard to their purpose, appropriate uses, shifting boundaries between functioning/malfunctioning); how such framing, in turn, can be used to readjust/realign social behavior and organizational routines.

Volume 14 • Issue 03 • 2015

May 26, 2015 Article
Does attending a large science event enthuse young people about science careers?

by Sam Illingworth, Emma Lewis and Carl Percival

A survey was conducted during the University of Manchester’s 2014 ‘Science Extravaganza’, which saw the participation of over 900 Key Stage 3 (ages 11–14) students in a range of interactive demonstrations, all run by active University researchers. The findings of this study suggest that a new approach is necessary in order to use these large science events to actively engage with school students about the career opportunities afforded by science subjects. Recommendations for such an approach are suggested, including the better briefing of researchers, and the invitation of scientists from outside academia to attend and interact with the school students.

Volume 14 • Issue 02 • 2015

Apr 28, 2015 Article
Highlighting the value of evidence-based evaluation: pushing back on demands for ‘impact’

by Heather King, Kate Steiner, Marie Hobson, Amelia Robinson and Hannah Clipson

This paper discusses the value and place of evaluation amidst increasing demands for impact. We note that most informal learning institutions do not have the funds, staff or expertise to conduct impact assessments requiring, as they do, the implementation of rigorous research methodologies. However, many museums and science centres do have the experience and capacity to design and conduct site-specific evaluation protocols that result in valuable and useful insights to inform ongoing and future practice. To illustrate our argument, we discuss the evaluation findings from a museum-led teacher professional development programme, Talk Science.

Volume 14 • Issue 02 • 2015

Sep 22, 2014 Article
Science blogging: an exploratory study of motives, styles, and audience reactions

by Merja Mahrt and Cornelius Puschmann

This paper presents results from three studies on science blogging, the use of blogs for science communication. A survey addresses the views and motives of science bloggers, a first content analysis examines material published on science blogging platforms, while a second content analysis looks at reader responses to controversial issues covered in science blogs. Bloggers determine to a considerable degree which communicative function their blog can realize and how accessible it will be to non-experts. Frequently readers are interested in adding their views to a post, a form of involvement which is in turn welcomed by the majority of bloggers.

Volume 13 • Issue 03 • 2014

Mar 13, 2014 Editorial
Do we know the value of what we are doing?

by Brian Trench

The demand for evaluation of science communication practices and the number and variety of such evaluations are all growing. But it is not clear what evaluation tells us, or even what it can tell us, about the overall impacts of the now-global spread of science communication initiatives. On the other hand, well-designed evaluation of particular activities can support innovative and improved practices.

Volume 13 • Issue 01 • 2014

Mar 13, 2014 Commentary
Impacts of science communication on publics, cities and actors

by Gema Revuelta

An evaluation toolkit developed as part of the EU-funded PLACES project was applied in 26 case studies across Europe. Results show, among other things, the contribution of science communication initiatives to public curiosity, professional networking and perception of cities where these initiatives are stronger.

Volume 13 • Issue 01 • 2014

Mar 13, 2014 Commentary
The right weight: good practice in evaluating science communication

by Giuseppe Pellegrini

Evaluations of science communication activities before, during and after their implementation can provide findings that are useful in planning further activities. As some selected examples show, designing such evaluations is complex: they may involve assessment at various points and a mix of quantitative and qualitative methods, and they show that impacts differ when seen from different perspectives.

Volume 13 • Issue 01 • 2014

Mar 13, 2014 Commentary
The problems with science communication evaluation

by Eric A. Jensen

Even in the best-resourced science communication institutions, poor quality evaluation methods are routinely employed. This leads to questionable data, specious conclusions and stunted growth in the quality and effectiveness of science communication practice. Good impact evaluation requires upstream planning, clear objectives from practitioners, relevant research skills and a commitment to improving practice based on evaluation evidence.

Volume 13 • Issue 01 • 2014