143 – Developing a dashboard for performance measurement. A case study – Part 3.

Posted by Colin Weatherby. 750 words


In the first two posts (see here and here) I discussed the requirements for a performance dashboard for my unit, the sources of performance measurement ideas, and the thinking behind the creation of the dashboard. This post presents the final dashboard.

At the outset I will remind you that the purpose of the dashboard is to provide the performance information I need, in real time, to be able to drive my unit. It is not intended to measure everything that might be relevant to understanding the performance of the unit; other measures will still be needed. The objective is to have no more than ten ‘dials’ on the dashboard.

The performance questions I have selected as relevant to real-time understanding and management of performance are:

| Performance question | Measure | Indicator | Dashboard measure |
| --- | --- | --- | --- |
| Are we providing value to our customers and the community? | convenience | complaints about registering service requests | 1. number of complaints/week; 2. complaints as a % of all service requests received/week |
| | timeliness | customer service requests completed on time | 3. % of service requests completed on time/week |
| | purpose fulfilled | complaints about expected service not delivered | 4. number of complaints received/week; 5. complaints as a % of all service requests received/week |
| Are we providing value for money? | resources used efficiently | $/unit of work, staff/unit of output | 6. number of staff at work each day as a % of total staff; 7. number of items of plant available each day as a % of total plant |
| Are we compliant with the law? | Road Management Plan compliance | monthly measurement against intervention levels | 8. % of actionable defects completed within intervention time/week |
| Are we compliant with organisational policies? | procurement compliance | number of compliant purchases | 9. number of purchases made/week; 10. % of purchases where the invoice was received after the PO was raised/week |
| | occupational health and safety compliance | reported hazards or incidents | 11. number of hazards/incidents reported/week |
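Most of these dials reduce to a simple count or ratio over one week of records. As a minimal sketch of what the weekly calculation might look like, the Python below computes measures 1–3 from a CSV export of the customer request system; the file layout and the column names (request_type, completed_on_time) are assumptions, since they depend on the organisation’s actual systems.

```python
import csv

def weekly_dials(export_path: str) -> dict:
    """Compute dashboard measures 1-3 from one week of service requests.

    Assumes a CSV export with hypothetical columns:
    request_type ('complaint' or 'request') and completed_on_time ('Y'/'N').
    """
    with open(export_path, newline="") as f:
        requests = list(csv.DictReader(f))
    total = len(requests)
    complaints = sum(1 for r in requests if r["request_type"] == "complaint")
    on_time = sum(1 for r in requests if r["completed_on_time"] == "Y")
    return {
        "complaints_per_week": complaints,                             # measure 1
        "complaints_pct": 100 * complaints / total if total else 0.0,  # measure 2
        "on_time_pct": 100 * on_time / total if total else 0.0,        # measure 3
    }
```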

One of the constraints I have encountered is the inability of organisational systems to provide information in real time. Most indicators will require manual manipulation of data, sometimes from multiple sources – i.e. the numerator comes from one source and the denominator from another. The dashboard will therefore provide retrospective data from the week before. This is hardly the way to drive a car, but it may be an acceptable way to drive my unit. It means I will have 52 opportunities a year to take corrective action using week-old data.
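To illustrate the numerator-from-one-source, denominator-from-another problem, here is a sketch of measure 6 (staff at work as a percentage of total staff). Both the file names and column names are hypothetical; the point is only that the two figures come from separate systems.

```python
import csv

def staff_availability_pct(timesheet_csv: str, establishment_csv: str) -> float:
    """Measure 6: number of staff at work each day as a % of total staff.

    The numerator comes from a hypothetical timesheet export and the
    denominator from a separate hypothetical HR establishment list.
    """
    with open(timesheet_csv, newline="") as f:
        at_work = sum(1 for row in csv.DictReader(f) if row["status"] == "at work")
    with open(establishment_csv, newline="") as f:
        total_staff = sum(1 for _ in csv.DictReader(f))
    return 100 * at_work / total_staff if total_staff else 0.0
```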

Some of the indicators are for inputs and others are for outputs; there are no ‘in process’ indicators. As it stands, producing the weekly report from this dashboard is likely to take four hours’ work. That is 10% of someone’s time in the administration team (0.1 EFT) just to create the dashboard. Then there is the time required for the ‘drivers’ (supervisors) to understand the information and use it to make corrections.
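(To check the arithmetic: assuming a standard 38-hour week, 4 ÷ 38 ≈ 0.105, which rounds to the 0.1 EFT quoted above.)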

The value-for-money indicators are simplistic and are intended to alert supervisors to the basic availability of all the resources at their disposal. Additional indicators of plant utilisation and labour productivity are likely to be required once resource levels are being maintained. Understanding the impact of staff absences and of plant that is unavailable while being repaired is a starting point.

At this stage I haven’t followed Rummler and Brache’s simple and practical process for developing measures and identifying lead indicators, which is based on their ‘nine performance variables’ matrix linking performance needs to levels of performance, as shown below.

[Image: the nine performance variables matrix]

Their process to develop measures has four steps:

  1. Identify the most significant outputs of the organisation at the organisation level, process level, and job performance level.
  2. Identify the ‘critical dimensions’ of performance for each of these outputs.
  3. Develop measures for each critical dimension.
  4. Develop standards for each measure that are specific about the performance expectation.
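To make the four steps concrete, here is a hypothetical application to one output of my unit (the standard in step 4 is illustrative only, not an adopted target):

  1. Output: completed customer service requests.
  2. Critical dimension: timeliness.
  3. Measure: percentage of service requests completed within the published response time each week.
  4. Standard: at least 95% of service requests completed on time in any week.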

To keep the discussion with my team simple and connected to their day-to-day responsibilities, I haven’t worried about whether a measure relates to a goal, design or management need, or whether it applies to the organisation, a process or a person. I have simply looked at the critical dimensions of performance.

No doubt there will be room for improvement in the next iteration of my dashboard, and my team will be better placed to participate in the discussion. It will also give me time to work with them on the differences between the nine performance variables and how to get the best return from the effort of using them to manage performance.

Rummler, Geary A., and Brache, Alan P. 1995. Improving Performance: How to Manage the White Space on the Organization Chart. San Francisco: Jossey-Bass.

2 thoughts on “143 – Developing a dashboard for performance measurement. A case study – Part 3.”

  1. It could be argued that the focus on customer complaints as a ‘singular item’ feeds into the reactive approach we are trying to avoid. Wouldn’t it be better to have measures there such as:

    – does Council have a project that can incorporate the theme of this complaint in its design?
    For example, if someone rings up complaining about footpath tree root damage, can Council not only ‘service the customer’ but also use this information in an existing ‘Urban Tree Health and Risk’ program that is consolidating data, actions and communications on this issue for the community?

    – if there is no relevant Council project, is this complaint part of a pattern of community concerns that Council can use as a basis for changing its procedures?

    – can Council introduce this active and concerned resident to one of its local community initiatives? This critical point is an opportunity to harness community energy. For example, the resident concerned about tree root damage might like to join Council’s ‘sustainable gardening workshops’, or participate in the ‘Disability Reference Group’.


    • Sheridan – yes, I agree that focussing only on customer complaints could make works more reactive. The intention is to try to measure customer convenience, on the assumption that if customers complain about a service it wasn’t convenient! In practice, my team have advised that they think we get very few calls (either we do a great job or customers don’t see any point in complaining), and it will be difficult to sort them out from the other calls being logged.

      I think I understand where you are heading with the broader and more long-term approach. I suppose at this point the dashboard is trying to measure things that I can use to take corrective action. I would also be interested to think about any patterns that might inform better service design. If we have regular complainants who are engaging with the council to try and improve systems, I would encourage them to join other forums where they can influence what is happening.

      Maybe the answer is to have a periodic measure that looks at people we have regular interactions with and why they feel the need to keep contacting the council. We all have those people. Often they are seen as a problem rather than a source of solutions.

      I like the way you are thinking about the matter.

      CW

