Quantity vs Quality – The Error in Target-Driven Outreach

Working in environmental outreach, I am frequently asked by colleagues for figures and statistics: the number of events run, the number of volunteers and hours volunteered, the percentage of minority visitors, the number of disabled users, and so on.

Numbers are a useful tool for monitoring, and in a work setting our minds are certainly geared towards valuing numbers and statistics far more highly than abstract themes such as enjoyment, happiness or health. Objective data makes it far easier to rank, categorise and measure, and the received wisdom is that it is essential when measuring progress towards a desired target. But in outreach it is the subjective that matters: the impact on those reached, not their number, ethnicity or age.

We've tried to get round this by quantifying happiness and engagement. I'm sure you have at one time or another filled out, or asked a parent or child to fill out, an evaluation form rating happiness on a scale of 1–10, or ticking one of a set of incrementally happier faces. What does this really reflect? Do these types of evaluation ever supply accurate responses? I would argue that by their very definition they can't. They are too generalised, too rigid and prescriptive to capture the range of emotions a child may experience when, for example, pond dipping for the first time. Additionally, forms handed out and collected in person are more likely to garner positive reviews, as people do not wish to offend face to face, as it were. We might like this aspect of paper feedback, but it does us a disservice.

To some extent we have been backed into this process by funders, who see the quantitative rather than the qualitative as an expedient way to gauge the 'value for money' they are getting for their investment. Many of us in the sector now work in externally funded positions, and quantitative figures are seen as an essential component of justifying our continued employment in a competitive industry. So how can we improve the way we determine 'success'? It may create a further paperwork headache where children are involved, but recording on film or taking photographs at events offers an opportunity to assess impacts more accurately on an emotional level. Taking written testimonies, using case studies and collecting quotes from attendees would provide a better, fuller picture of how an event was received, particularly as, with the ubiquitous smartphone, these can now be gathered anonymously through a host website such as SurveyMonkey.

Graphs - lovely, lovely graphs

Gathering information and data is invaluable in understanding public use of green space and using this to proactively manage a site. However, using quantitative data to evaluate the success, or otherwise, of an outreach project is largely an irrelevance. This extends not just to outreach but to the world of ecology too, where 'good', 'poor', 'recovering' and the like are frequently judged in terms of numbers and percentage cover. This is again a function of the need to report back to others, the need to stand up and say 'yes, we are delivering, you are getting your money's worth.' Often the people we are ultimately reporting to, particularly in the world of ecology, have little knowledge of the area and thrive on this kind of empirically measured statistic. For many of our sites, habitats, reserves and natural areas, someone with knowledge of the sector and some background knowledge of the site would be able to judge 'this is good' or 'this is better' after a walk round.

But can we change this? Are we now stuck in a loop of using statistics to back up and compare against newer statistics, judging a project or site by a long stream of data? Without the greater context that those on the ground are aware of, these statistics can become overly abstract. We need to work to reduce our reliance on them and take the time to flesh out the bigger picture when promoting our work to the public and to funders. Only by increasing the knowledge and understanding of our audience can we break the hegemony of quantitative evaluation and begin to appraise our outreach and ecological projects in real terms.