Sunday 20 May 2007

What is counted, counts?

One day there is every chance that I won't be able to remember my arguments well, even on pretty boring subjects, so I need to get some of these ideas and arguments down. This may only make sense if you put yourself in the position of two people arguing in an old people's place.

I have been working on the review of a program over the last couple of months. Nothing all that special except that it is a social program that operates primarily in remote Indigenous communities and utilises community development strategies to achieve outcomes. At one level it should be possible to measure performance against some of the outcomes. You can find some things to count. Unfortunately, the things you can find to count are not the things that are most important.

In an earlier life I made decisions about this, and many other, programs. The pressure to find and use quantifiable performance indicators was substantial and, of course, we played the game. My staff and I found things we could count and reported against them. Had to be accountable.

Increasingly, quantifiable performance indicators and their associated targets are becoming the way many funders justify their decisions. I am becoming concerned that funders actually believe they are doing the right thing. My difficulty with this is that, in my experience, this is not the way good decisions are made. (A couple of subjective statements there, BTW.) Rather, indicators and targets are the way that decision makers cover their backsides and justify themselves when under scrutiny.

It is so much easier to answer a question by saying

'the target was x, we achieved y and it was less/more than expected because of a, b and c'

than to say

'we were aiming to increase pride and self esteem across the community so that people would have the chance to begin to deal more successfully with the issues that are destroying their lives. We don't know yet whether the program will be successful but it is our judgement, based partly on the experience of people expert in this field, that we are on the right track.'

The latter gets you chewed up - or at least that is what the parliamentary committee tried to do to me. We did have an excellent argument, though, in which the issues received some coverage they normally wouldn't - and I wasn't sacked.

Reducing it all to numbers has a tendency to ignore the complexity in people. You may be able to develop 'happiness indicators' and have a nice little quiz with 20 questions to help you but, if you are paid to make judgements, should you not be expected to have the capacity to do so, and to be accountable for those judgements? And isn't it better for society to be informed about the reasons people make decisions rather than to argue about numbers? Isn't this the way that we develop as a society?

But if we can't count it we will never know - or will we?

3 comments:

Anonymous said...

And this applies to the proposed quiz for prospective Australian citizens as well. You can imagine some politician defending a decision to allow Mr XYZ to migrate to Australia: 'Mr XYZ got 100% in the test - that showed he was going to be a good citizen. We can't understand why he killed all those people ten years after he came here.'

You don't have to think so hard about variables if you have a target that can be quantified and labelled with a number.

Nabla said...

I think I agree. I have come across the same obsession with quantifiable outcomes, although it is a bit easier to measure whether the salinity has decreased in a particular river system, for example. The problem arises when you begin a project which will not show results for 20 years. Then you come up with something they call a "resource condition target" - the overall condition of the resource over a long-term (50-year) timeframe. Then every new project needs to propose how it will contribute to the targets (which are set federally).
So people get the shits, and just measure how many trees they planted, and make up a fluffy link to overall land quality.
I still think there is a danger in going too far the other way. We now have this move towards "performance stories" - you get the community you are operating in to provide a qualitative example of where the particular program has worked. Is it a good idea? Maybe - Joe Hockey and John Howard use the same technique to show how WorkChoices has worked - and we all know how great THAT idea was.

Saturday Night Fiver said...

All true. I wrote a post in a similar vein here, although in retrospect I come off sounding like an excessively didactic prig.