It’s a well-known truism that it is easier to criticise something than it is to solve it. Certainly anyone who has heard me at any number of workshops and conferences over the past year or so asking speakers about the evidence base for the ‘facts’ and figures they have quoted citing the alleged benefits to be realised through investing in records management will be aware that I have not shied away from the criticism side of things. Though I should, perhaps, add that these questions have always been asked not to try to trip up or embarrass the speaker concerned, but as part of a genuine attempt to understand whether the numbers in question - whether regarding how much time senior managers spend looking for information or how many copies of the same document exist in the same organisation - are (as I always hoped) based on sound, empirical evidence or (as I always feared) as mythical as the ‘Coopers & Lybrand’ study that so many seem to reference. Regrettably, if not unpredictably, it seems the latter scenario is more often than not the case - as demonstrated in more rigorous fashion by the literature review we published last month.
But as I said at the outset of this piece, lamenting the lack of any reliable, objective, empirical data demonstrating the quantifiable benefits of investing in records management is one thing. The real question facing us was: what to do about it?
After spending a little time wandering up and down blind alleys investigating (and quickly discounting) a ‘Time and Motion’ based approach to measurement, we soon settled on the process itself as the basis for measurement. After all, records management is surely only ever a means to an end? We spend resources on it to improve how we run our organisations, to improve the service we offer to our stakeholders, to improve our standards of governance and accountability and to ensure we are legally compliant. Surely, if we could find ways of measuring how effective a process is before we improve it and then again after we’ve improved it, we should have some means of quantifying the impact we have made. Take away the costs involved in making the change and an even more illuminating set of results emerges.
But what to measure? After all, if you were to automate a previously paper-based process you might expect to see a reduction in both the time taken to process information and the space required to store records. We can’t know what it is that you want to measure, so we leave it up to you to define what and how many metrics you want to include: be they square metres of storage space, pounds and pence, staff time or CO2 emissions - the choice is yours.
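By way of illustration, the underlying arithmetic is nothing more exotic than ‘measure before, measure after, deduct the cost of making the change’. The sketch below is purely illustrative: the metric names, figures and the net_impact function are invented for the example and are not taken from the Impact Calculator itself.

```python
# Purely illustrative sketch: hypothetical metrics and figures,
# not drawn from the Impact Calculator itself.

def net_impact(before, after, cost_of_change):
    """Gross saving per user-defined metric (before minus after),
    plus the overall total with the cost of the change deducted."""
    gross = {name: before[name] - after[name] for name in before}
    return gross, sum(gross.values()) - cost_of_change

# The metrics are yours to define; here everything is expressed in GBP/year
# so that the totals can be meaningfully added together.
before = {"storage space (GBP/yr)": 12000, "staff time (GBP/yr)": 30000}
after  = {"storage space (GBP/yr)": 4000,  "staff time (GBP/yr)": 18000}

per_metric, net = net_impact(before, after, cost_of_change=15000)
print(per_metric)  # {'storage space (GBP/yr)': 8000, 'staff time (GBP/yr)': 12000}
print(net)         # 20000 - 15000 = 5000 net benefit in year one
```

Metrics measured in different units (square metres, hours, tonnes of CO2) would of course need converting to a common unit, or reporting separately, before any such totalling made sense.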
A real turning point in the project came when we started to think about the role of RM in process improvement. After all, there must be few occasions (if ever) when it can be asserted with confidence that records management alone is responsible for achieving an improvement. Indeed, how would we even define what ‘records management’ is in this context? To take our previous example, the introduction of an electronic workflow system to replace a previously manual process clearly has a strong RM influence, but it is also a technology change. So should it be defined as an improvement caused by a new system, or by RM, or by both?
The answer (eventually) was obvious. There would be no arbitrary distinction between which aspects of the process improvement RM was responsible for and which were due to other factors, nor any attempt to classify what counts as RM in this context and what does not. Again, we let the user decide. This wasn’t a question of ducking the issue; it was an acknowledgement that process improvements are complex and multifaceted and that individual organisational drivers may differ markedly. The consequence of this decision has been to develop a tool which not only better reflects the complexity of real life, but also broadens its potential scope enormously. Yes, you can measure the improvements realised as a result of RM, however you choose to define ‘records management’, but equally you can apply the same focus to whatever other element of process improvement your organisation happens to be interested in measuring the impact of, be that people, IT, equipment or the combination of them all.
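To give a flavour of what ‘letting the user decide’ might mean in practice, here is another purely hypothetical sketch: the factors, weights and the apportion function below are my own invention for illustration, not a description of how the Impact Calculator itself handles attribution.

```python
# Purely illustrative sketch: the factors, weights and figures are hypothetical.

def apportion(total_saving, weights):
    """Split a total saving between contributing factors according to
    user-chosen weights (which should sum to 1.0)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return {factor: round(total_saving * share, 2) for factor, share in weights.items()}

# The user decides what counts as 'records management' here and how much of the
# improvement to attribute to it, as opposed to the new system or anything else.
weights = {"records management": 0.4, "new workflow system": 0.5, "staff training": 0.1}
print(apportion(5000, weights))
# {'records management': 2000.0, 'new workflow system': 2500.0, 'staff training': 500.0}
```

The same weighting could just as easily be turned around to ask about the contribution of the IT, the people or the equipment rather than the RM element, which is exactly why not hard-coding the definition broadens the tool’s scope.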
All of a sudden we have not just a tool which might help fill the current dearth of facts and figures regarding the impact of RM, but also a way of deconstructing and measuring process improvement across the board.
But in some ways the hard work still remains to be done. We are well aware that using the Impact Calculator is not a trivial task. In the spirit of ‘garbage in, garbage out’, you can only get reliable, detailed data out if you are prepared to gather raw data of a similar kind in the first place. That, I’m afraid, is down to you.
We are also happy to acknowledge that the Impact Calculator is a ‘work in progress’. We are hopeful of funding some pilot studies within the UK HE sector soon and would be very interested to hear the experiences of all those who make use of the tool, wherever they may be, so that we can incorporate any improvements into a Version 2 in the near future.
Finally, I should like to pay tribute to my colleague, Joanne, whose statistical skills, sound judgement and commitment to the project have all helped turn my rather sketchy and notional idea of just how such a tool might work into this finished and infinitely superior end product. Nice one.
So do please take the time to download the tool, make use of it and let us know how you get on (if you do post anything online about your experiences we would be grateful if you use the tag ‘impact-calc’ to enable us to track it).
It’s available now at www.jiscinfonet.ac.uk/impact-calculator
4 comments:
Hi Steve, will take a deeper look when time permits - meanwhile, on my Mac, the business process page is blank - not sure why.
I think it is great to see this development, but I do have a view that practitioners need to engage in similar work so as to build and deepen competencies. There's a tension between the benefit of sound tools and the loss of engagement with the questions that inevitably shape the design of those tools (if you get my meaning).
'twas ever thus...
Cheers,
John
Hi John,
Sorry to hear you've had an issue accessing part of the Impact Calculator. It would be useful to know more about the exact problem you are having and the software versions you are using to help us identify the cause (having checked it on our Mac it seems to be working fine). Feel free to email me directly about this at steve.bailey@unn.ac.uk
Thanks
Thanks for this, Steve. Am taking a look now, in the hopes that I can derive some meaningful questions from it, to ask of people taking a survey for my PhD.
Great. Do let us know what you think - there is a feedback form at http://www.jiscinfonet.ac.uk/records-management/measuring-impact/impact-calculator/download