Lastly, we bring the ‘Prove It’ feature together by looking at the research carried out at Preston FM. The station takes a multidisciplinary approach, building in different methods and measures depending on the work it is carrying out and wants to track.
Richard Lace tells us more:
“Preston FM is a community radio station serving Preston and South Ribble. The station currently works with over 300 active volunteers, and regularly needs to be able to demonstrate its impact on direct beneficiaries as well as the community it serves. Evaluation activity informs the way that the station chooses to operate and develop its work, using information related to both direct beneficiaries and the wider community.
The impact on beneficiaries – the station’s volunteer base – is measured using a mixture of methods. The station uses the Soft Outcomes Measurement System (http://bit.ly/bFjdJL) to track volunteers’ progress in 10 soft-skill areas, including communication, teamwork and self-confidence. The system uses computer-based questionnaires, completed by volunteers at the beginning of their involvement with the station and again at regular intervals, to provide a raw quantitative measure of volunteers’ progress.
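As an illustration only – the internals of the Soft Outcomes Measurement System are not described here, and the skill names and 1–5 scoring scale below are assumptions – the kind of progress measure it produces could be sketched as the change in each soft-skill score between a baseline questionnaire and a follow-up:

```python
# Hypothetical sketch: summarise a volunteer's change in soft-skill scores
# between a baseline questionnaire and a later follow-up. The skill areas
# and the 1-5 scale are illustrative assumptions, not the real system.

SKILLS = ["communication", "teamwork", "self-confidence"]  # 3 of the 10 areas


def progress(baseline: dict, follow_up: dict) -> dict:
    """Return the change in score for each tracked skill area."""
    return {skill: follow_up[skill] - baseline[skill] for skill in SKILLS}


baseline = {"communication": 2, "teamwork": 3, "self-confidence": 1}
follow_up = {"communication": 4, "teamwork": 3, "self-confidence": 3}

print(progress(baseline, follow_up))
# {'communication': 2, 'teamwork': 0, 'self-confidence': 2}
```

Repeating this calculation at each interval gives the raw quantitative trend that the one-to-one sessions described below can then interpret.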
This quantitative measurement is supplemented by twice-yearly “catch-ups” with all of the project’s volunteers. These one-to-one sessions last around an hour and involve talking about volunteers’ on- and off-air involvement in the station, as well as evaluating volunteers’ achievements; identifying any areas of interest for future development; looking at skills gaps and how they might be filled; helping to identify any new support needs; offering signposting to progression opportunities outside the project; and providing space for volunteers to feed into plans for the development of the project overall. Group evaluation sessions are also held two or three times each year, offering volunteers the chance to reflect on the station’s overall progress and direction, as opposed to their own development.
Sometimes, project-specific evaluation is appropriate. A current project based on using radio to raise awareness of International Development issues, for example, is partnering with a local university to evaluate the impact of the project on participants and on listeners.
The impact on listeners is also important to the station. An annual street survey, undertaken in partnership with a local market research organisation, provides an estimate of reach and, importantly, provides feedback from listeners on the quality of output. Online feedback is encouraged – both generally and through specific calls related to particular projects (for example, a recent drama project focussing on religious extremism used an online survey to gauge how the series had affected listeners’ understanding of the issues explored). A sample of interviewees and contributors is regularly contacted (by phone or e-mail) a few weeks after their appearance on air, to determine whether the station has helped them achieve increased visitor numbers at an event, an improved response to a campaign, a spike in calls or website hits, or the recruitment of new volunteers or beneficiaries.
All evaluation data collected – along with “pure numbers” demographic monitoring information, such as counts of people through the door and attendance at events and workshops – are fed into the project steering group, which uses the information to identify areas where the project needs to improve its practice or develop new approaches. An annual impact assessment, based loosely on the Achieving Better Community Development framework, summarises the evaluation information collected over the course of the year.”
To find out more about the types of qualitative and quantitative research, and how to analyse your findings, refer to Analyse This, a guide for new academics and practitioners written by CERLIM (the Centre for Research in Library and Information Management) at Manchester Metropolitan University.