March 25th, 2010
Monitoring is the process of recording what you do for the benefit of others – usually funders. Evaluation is what you do for your own benefit and sometimes for funders, to improve your performance as a station, and it is another task that is easily forgotten about. Perhaps you are obtaining results which look great on paper and tick all the relevant boxes for your funders and Ofcom alike. But what do your volunteers think? Are they bored by your training, unhappy with your schedule and alienated from each other and the staff? What do your listeners think? Do you even have any? Evaluating your performance is key to identifying your shortcomings and failures and building on your achievements and strengths.
Everything you do must be constantly measured against your stated intents as a station. If your mission statement says you are going to change the lives of local residents, ask yourself how much change you see around you, and whether your day to day activities are responsible.
One popular way of thinking about evaluation is as a spiral or, more simply, a loop (see Figure 6.01). Every time you consider beginning a new project or changing your activities at the station, you should compare that activity against your stated aims, prepare for the activity and set targets, perform the activity, measure the outcome and compare it to your targets, review and evaluate the performance, then compare the results to your stated aims.
Figure 6.01: Evaluation loop for new activities
Although in theory that looks complex, in practice it is mostly common sense. Here’s a purely hypothetical example. You spot some grant money available to increase the profile of over-65s in the media. What happens next?
1. You ask yourselves whether this fits in with the core objectives of your station and whether or not to begin the loop;
2. You plan outreach work and training aimed specifically at that age group, and set targets for recruitment, training and broadcast hours;
3. You apply for and receive funding, recruit and train volunteers, and they begin broadcasting their own show every Tuesday;
4. You examine your monitoring data and evaluate the project against the targets you originally set yourself (and which you agreed with the funders);
5. (returning to step 1) You ask yourselves whether the project, in practice, fits with the core objectives of your station and whether or not to continue.
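Step 4 of the loop can be sketched in code. This is only an illustration: the target names and figures below are invented, not taken from any real funding agreement.

```python
# Hypothetical targets agreed with funders, and the monitored outcomes.
targets = {"volunteers recruited": 12, "training sessions run": 8, "broadcast hours": 52}
outcomes = {"volunteers recruited": 10, "training sessions run": 9, "broadcast hours": 48}

def evaluate(targets, outcomes):
    """Compare each monitored outcome against its target."""
    report = {}
    for name, target in targets.items():
        achieved = outcomes.get(name, 0)
        report[name] = {
            "target": target,
            "achieved": achieved,
            "met": achieved >= target,
            "percent": round(100 * achieved / target, 1),
        }
    return report

for name, result in evaluate(targets, outcomes).items():
    status = "met" if result["met"] else "missed"
    print(f"{name}: {result['achieved']}/{result['target']} ({result['percent']}%) - {status}")
```

The point of keeping the comparison this mechanical is that the same targets you set in step 2 are the ones you report against in step 4, with no room for quiet goalpost-moving in between.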
If the organisational culture is to routinely think about activities in this kind of way, evaluation quickly becomes an integral part of your station, rather than an additional chore.
Some funders will require you to undertake a formal project evaluation and perhaps also to publish and disseminate the resulting evaluation report. If possible, you should employ an external consultant to carry out this work – your project managers will probably not have the time or necessary expertise to carry out such a research exercise, plus the resulting report should be more objective and carry more weight if it hasn’t been produced ‘in-house’. Most funders will be happy for you to include the cost of an evaluator in your project budget. Some easily avoidable pitfalls:
- Choose your evaluator with care. There will be dozens of consultants and academics keen to carry out the evaluation, so make them work for it. If the project is sufficiently large, put it out for tender. Once you've short-listed, do some detective work – talk to previous clients (not just the one at the top of the bidders' reference list) and read some of the bidders' previous reports. Beware: anyone can call themselves a consultant, and we've seen some incredibly shoddy evaluation reports in our time.
- Define crystal clear terms of reference, e.g. exactly what is being evaluated, how it's being evaluated, what input is required from you, etc. You also need to make it clear to the evaluator that while they are producing an independent report, they will need to refer certain things back to you. If they are producing a report based on interviewing volunteers, for example, you will want to check the volunteer statements that they use in the report – the evaluator may not have realised that a particularly vociferous and negatively opinionated volunteer is actually someone with a personal grievance. You shouldn't cramp the independence of the evaluator, but you need to insist that the information they present is valid.
- Build the evaluation into the project from day 1. The report will be as good as worthless if the evaluator ends up scurrying around for the data once the project is over.
Evaluating your audience
Anyone involved in community radio will confirm that when you meet someone and tell them you are from a community station, the first question they ask is ‘so do you get many listeners?’ The usual answer is, ‘erm, we don’t really know.’
Ironically, it is one of the few questions that Ofcom don't ask. Community radio (thankfully) is not about chasing every listener at the expense of other social gain targets. However, there will come a time when you need some idea of who is listening. Some funders might insist on knowing. If you intend to sell advertising or sponsorship to any significant extent, then your clients will certainly want to know. BBC and commercial radio stations obtain their listening figures from a company called RAJAR (actually co-owned by the BBC and the commercial radio network), which uses sample surveys to calculate how many listeners each station has. Community radio stations are usually too small (geographically) to obtain remotely accurate results using RAJAR, even if you could afford their fees.
You will have some idea of the popularity of your station from the feedback you get anyway – the phone calls, e-mails, website hits and so on. You should invite such feedback at every opportunity. If you are brave you could hold public meetings on a regular or sporadic basis where you invite your listeners to come and tell you what they think. Be warned, it may not always be an entirely inspirational occasion – it's human nature to want to complain about what we don't like before we applaud what we do. Nevertheless it can be a highly enlightening and useful process – assuming of course that you take the feedback on board and use it to improve what you do.
Ultimately there is no substitute for well-conducted research among a random sample of your community. There are many market research companies who would gladly conduct a survey in your specific area. The costs are generally extravagant, but if you are bringing in large sums of money from advertisers it may be worthwhile. That way there will be a certain credibility to the figures. In practice, you will probably end up doing it yourself, or if you are very lucky, persuading a school or college to take it on as a class project in social sciences, media studies or statistics. An audience survey would normally either be done using a phone and a directory, or a clipboard and a smile. Another approach when asked about your listening figures is to reply that you know what you do works – that the partners you work with want to continue their partnerships and that volunteers keep coming back to do their shows. They wouldn't do it if something wasn't working. Some 'user testimonials' are always useful to back up this line of argument.
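If you do run your own random-sample survey, the standard statistical rule of thumb tells you how many completed interviews you need for a given margin of error. The sketch below uses the usual conservative assumptions (95% confidence, worst-case 50% proportion); it is a planning aid, not a substitute for professional research design.

```python
import math

def sample_size(margin_of_error, confidence_z=1.96, proportion=0.5):
    """Sample size for estimating a proportion: n = z^2 * p * (1 - p) / e^2.
    Defaults assume 95% confidence (z = 1.96) and the worst-case p = 0.5."""
    return math.ceil(confidence_z**2 * proportion * (1 - proportion) / margin_of_error**2)

# Roughly 385 completed interviews give a +/-5% margin of error,
# almost regardless of how large the community is.
print(sample_size(0.05))
```

The useful surprise in this formula is that the required sample barely depends on the size of your coverage area: a few hundred well-chosen respondents serve a town of 20,000 about as well as a city of 200,000.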
Designing your audience research
First of all you must ask yourself what it is that you want to know. Do you just want to know the raw number of listeners you have, or the proportion of radio listeners at any one time? Do you want to know what the listeners think of you? Which sections of the community like you most – by age, sex, social class or ethnicity? Do you want to know which parts of your schedule are more or less popular? Do you want to know why those who don't listen to you choose not to?
All those questions are valuable, but remember that the more questions you ask, the more data you will have to analyse, the longer it will take to conduct the survey and the harder it will be to persuade members of the public to participate. Keep your focus on what you really need to know.
Ideally an evaluation questionnaire should be written and delivered by people with no vested interest in the station. It is easy to skew the results by accident or design – to take a silly example, a researcher can ask the question: 'would you say your local community radio station was A/ Interesting, B/ Fascinating or C/ Amazing', then report back how wonderful everyone thinks they are.
More seriously, the psychology of market research is very subtle. You can actually change the results of surveys by changing the wording of questions or even the order in which you ask them. So if a survey were to start off by listing all the station's social gain achievements and asking whether the interviewee thinks each one is a good thing or not, the effect is to make the interviewee feel more supportive and positive about the station, making it more likely they will say nice things on the later questions. Such flaws often creep into market research unintentionally, and they can work against you just as often as for you.
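One common mitigation for ordering effects (not discussed above, but standard survey practice) is to randomise the order of opinion questions per respondent, so earlier questions cannot systematically prime later ones. The questions below are invented for illustration:

```python
import random

# Hypothetical opinion questions; factual questions (age, listening hours)
# would normally stay in a fixed position.
opinion_questions = [
    "How would you rate our music programming?",
    "How would you rate our local news coverage?",
    "How would you rate our presenters?",
]

def questionnaire_for(respondent_id):
    """Return the opinion questions in a per-respondent random order."""
    rng = random.Random(respondent_id)  # seeded, so each respondent's order is reproducible
    questions = opinion_questions.copy()
    rng.shuffle(questions)
    return questions
```

Randomisation doesn't remove ordering bias for any single respondent, but averaged across the whole sample it stops any one ordering from dominating the results.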
Make sure that any data you capture is usable – by avoiding the recording of general comment and by using multiple choice options that enable you to extract those vital percentages.
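As a sketch of why multiple-choice options matter, tallying fixed answers into percentages is trivial, whereas free-text comment is not. The question and responses here are hypothetical:

```python
from collections import Counter

# Hypothetical responses to: "How often do you listen to the station?"
responses = ["daily", "weekly", "never", "weekly", "daily", "weekly",
             "monthly", "never", "daily", "weekly"]

def percentages(responses):
    """Turn a list of multiple-choice answers into an {answer: percent} table."""
    counts = Counter(responses)
    total = len(responses)
    return {answer: round(100 * n / total, 1) for answer, n in counts.items()}

for answer, pct in sorted(percentages(responses).items()):
    print(f"{answer}: {pct}%")
```

Those are the 'vital percentages' a funder or advertiser can actually use; a pile of general comments would first need coding into categories before any such table could be produced.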
While it may be tempting to try to manipulate results by framing the questionnaire in a particular way,audience feedback is incredibly useful to community radio, and you will find the value of getting accurate results is much greater than the value of getting positive results. Hopefully you will get both.
Further reading and links
- A Practical Guide to Financial Management for Charities (2nd edition). Kate Sayer (Directory of Social Change, 2002)
- A Practical Guide to Charity Accounting (2nd edition). Kate Sayer (Directory of Social Change, 2003)
Stress management
- Achieve! Personal effectiveness in the not-for-profit sector. Mark Butcher (Directory of Social Change, 2003)
- Managing Workplace Stress: A best practice blueprint. Stephen Williams and Lesley Cooper (John Wiley & Sons, 2002)
Monitoring and evaluation
- The Complete Guide to Creating and Managing New Projects (2nd edition). Alan Lawrie (Directory of Social Change, 2002)
- Get It Right First Time: A self-help and training guide to project management. Peter James (Russell House Publishing, 2005)
Market and audience research
- A Newcomer’s Guide to Market and Social Research (available as a PDF download)