For years I’ve been trying to find an easily understandable way to demonstrate the value of my Fire Authority’s spending on corporate communication. As I said in my last blog, in South Yorkshire we’ve always evaluated the impact of our community fire safety (CFS) campaigns, and have refined that evaluation continually to a good standard. We believe we can make a case for the impact our work with our colleagues in CFS has on reducing emergency incidents, some of which we showcased in our “Community Safety In Numbers” publication.
What has proven elusive is a way to measure the impact of the rest of our work, from media relations to our website, from social media to internal communication. We’re not there yet but, heavily influenced by Westminster Council’s excellent publication “Evaluating Your Communication Tools”, we’ve made a start.
For the last two years in South Yorkshire, our main media relations performance target has been “the percentage of media stories which include a community safety message.” In that time, our research tells us that the percentage of our population who can recall seeing a media story about us has significantly increased. I hope and believe the two are linked. The performance measure has certainly sharpened the focus on what our media relations work is there to achieve.
Our surveys over the past two years also tell us that whilst satisfaction with our emergency response service has slightly declined, the overall reputation of our organisation has improved. It’s far too early to tell whether there is any significance in these trends, or whether communicating effectively has helped our reputation score to hold up, but it’s something we’ll keep an eye on.
Westminster’s document has also helped us to start to evaluate our website, e-newsletter, social media and internal newsletter. These are all vital as we move towards more direct communication with the public, and put more emphasis on internal communication through the cuts. We’ve pulled our 2012/13 evaluation together in one short report. It’s interesting to us, gives us a clear picture of what is working and what isn’t, and goes some way towards showing the outcomes of our day-to-day work. It’s a more tangible, scientific form of performance measurement for communication, something we haven’t been good at as a profession.
It’s something that we need to do better throughout the Fire Service, to provide clear evidence of what we’re achieving, and what may be lost in austerity cuts. I heard that one FRS boasts of cutting communications spending to just £12,000 by outsourcing it to an agency. But without a clear understanding of the role of communication in the FRS, and a way to evaluate its achievements, who knows whether £12,000 is too much to pay, or £500,000 not enough? We believe we can show that the savings to society from our CFS campaigns in South Yorkshire alone run into the millions.
I’m convinced that driving ourselves to improve the proportion of media stories with a community safety message is improving public safety. Now we’re turning our attention to researching the specific issue of smoke alarm testing because, more than smoke alarm ownership, this is a pure communication measure that will save lives. I’d love to be able to benchmark measures like this against other Fire Services; I’d be prepared to bet that the better-measured and more outcome-focused an FRS’s communications effort, the more effectively it achieves the desired behaviour change. I’d also be interested to know of any other ways Fire Services measure their communications work for its impact on community safety.
Finally on this topic, it’s worth considering the impact of the national Government Fire Kills campaign which, like all Government campaigns now, is under constant review. The Fire Kills team regularly evaluate the impact of their key campaign periods, and the results are usually favourable, but the evaluation is only carried out on a national basis. What I’d really like to see for 2013/14 is some evaluation which compares the impact in different FRS areas. Are smoke alarm testing rates pretty consistent nationwide? Or does local FRS support for the national message improve these figures during the campaign period? Is smoke alarm testing lower outside the national campaign period, or does the message stick? And does a testing message put out only by a local FRS at a different time of year work as well, and as cost-effectively, as the national campaign? I’d love to find out.