In the past 25 years, school-university partnerships have undergone a transition from ad hoc to strategic partnerships. Over the previous two and a half years we have worked in partnership with teachers and pupils from the Denbigh Teaching School Alliance in Milton Keynes, UK.
Our aims have been to encourage the Open University and local schools in Milton Keynes to value, recognise and support school-university engagement with research, and to create a culture of reflective practice.
Through our work we have noted a lack of suitable planning tools that work for researchers, teachers and pupils. Here we propose a flexible and adaptable metric to support stakeholders as they plan for, enact and evaluate direct and meaningful engagement between researchers, teachers and pupils. The objective of the metric is to make transparent the level of activity required of the stakeholders involved — teachers, pupils and researchers — whilst also providing a measure for institutions and funders to assess the relative depth of engagement; in effect, to move beyond the seductive siren of reach.
Whilst welcoming Jensen’s response to our original paper, we suggest that our main argument may have been missed. We agree that there are many methods for conducting impact assessments in informal settings. However, using such tools is beyond the reach of many practitioners, who lack the budgets, time and expertise needed to interpret findings.
More particularly, we reiterate the importance of challenging the prevailing policy discourse in which longitudinal impact studies are regarded as the ‘gold standard’, and instead call for a new discourse that acknowledges what is feasible and useful in informal sector evaluation practice.
Communication about technology has long been neglected within the field of science and technology communication. This visual exploratory study focuses on how users can communicate with and about technology in public places through warning signs posted on technological devices.
Three broad categories of messages have been identified: bad design, malfunctioning and disciplining users. By analyzing examples within each category, we suggest that studying these communicative situations can be a key to understanding how users are engaged in continuous, elaborate and sometimes even conflicting framing of technological devices (e.g. with regard to their purpose, appropriate uses, and shifting boundaries between functioning/malfunctioning), and how such framing, in turn, can be used to readjust/realign social behavior and organizational routines.
A survey was conducted during the University of Manchester’s 2014 ‘Science Extravaganza’, which saw the participation of over 900 Key Stage 3 (ages 11–14) students in a range of interactive demonstrations, all run by active University researchers. The findings of this study suggest that a new approach is necessary in order to use these large science events to actively engage with school students about the career opportunities afforded by science subjects. Recommendations for such an approach are suggested, including the better briefing of researchers, and the invitation of scientists from outside academia to attend and interact with the school students.
This paper discusses the value and place of evaluation amidst increasing demands for impact. We note that most informal learning institutions do not have the funds, staff or expertise to conduct impact assessments requiring, as they do, the implementation of rigorous research methodologies. However, many museums and science centres do have the experience and capacity to design and conduct site-specific evaluation protocols that result in valuable and useful insights to inform ongoing and future practice. To illustrate our argument, we discuss the evaluation findings from a museum-led teacher professional development programme, Talk Science.
This paper presents results from three studies on science blogging, the use of blogs for science communication. A survey addresses the views and motives of science bloggers, a first content analysis examines material published on science blogging platforms, while a second content analysis looks at reader responses to controversial issues covered in science blogs. Bloggers determine to a considerable degree which communicative function their blog can realize and how accessible it will be to non-experts. Frequently, readers are interested in adding their views to a post, a form of involvement which is in turn welcomed by the majority of bloggers.
The demand for evaluation of science communication practices and the number and variety of such evaluations are all growing. But it is not clear what evaluation tells us, or even what it can tell us, about the overall impacts of the now-global spread of science communication initiatives. On the other hand, well-designed evaluation of particular activities can support innovative and improved practices.
An evaluation toolkit developed as part of the EU-funded PLACES project was applied in 26 case studies across Europe. Results show, among other things, the contribution of science communication initiatives to public curiosity, professional networking and perception of cities where these initiatives are stronger.
Evaluations of science communication activities before, during and after their implementation can provide findings that are useful in planning further activities. As some selected examples show, designing such evaluation is complex: they may involve assessment at various points, a mix of quantitative and qualitative methods, and show that impacts differ when seen from different perspectives.
Even in the best-resourced science communication institutions, poor quality evaluation methods are routinely employed. This leads to questionable data, specious conclusions and stunted growth in the quality and effectiveness of science communication practice. Good impact evaluation requires upstream planning, clear objectives from practitioners, relevant research skills and a commitment to improving practice based on evaluation evidence.