Walking the Tightrope: Promoting demand for evaluation evidence in South Africa

Ian Goldman

Dr Ian Goldman, Head of Evaluation and Research, Department of Planning, Monitoring and Evaluation (DPME) in The Presidency, Republic of South Africa

In South Africa we have now established a government-wide evaluation system, with evaluation happening at national and provincial levels (but not yet in local government). A total of 47 evaluations are completed or underway, covering $6 billion of government expenditure, with eight more being planned. In establishing the system we have had to build many of the underlying systems needed to support evaluation (standards, competences, guidelines, courses, a quality assessment system), i.e. the supply system.

As well as delivering evaluations, it is very important for us to work on demand, so that evidence is asked for and used once the results are available. Far too much discussion around evaluation focuses on supply, a comfort zone for technical people. In this blog post I want to focus more on the demand side: how senior managers and politicians can come to want honest feedback on what is and is not working, and how government's work can be strengthened.

Political support is essential

We are fortunate that in South Africa we have a 'burning platform' which is leading to a strong political commitment to use evaluations. The burning platform is a term used in change management for whatever is making people prepared to change. In our case there is political recognition, at the highest level, of dissatisfaction and protests among citizens, and that government must improve its service delivery. M&E was seen as a key tool for this, and there was a political decision to establish a Minister and Department of Performance Monitoring and Evaluation (DPME).

The President and Cabinet see evaluations as imperative for understanding what is working and not working and why. Cabinet is spending at least one hour discussing each evaluation. This reflects the seriousness with which they take the findings. Apart from technical issues around each programme, what is emerging is that more work is needed to ensure that policies and programmes are planned properly in the first place, and that the design of interventions is guided by appropriate diagnostics, analysis of root causes and consideration of different options, before moving to a specific theory of change and design.

Ministers must also feel that the system is there to help, not to punish. The second evaluation we presented to Cabinet was on the reception year of schooling, and the impact evaluation showed that this was not having the desired impact on children in poorer schools in less well managed provinces. The Deputy Minister presented the findings in a very constructive way, commending the department for the scale of the rollout, but then pointing out that quality needed to be improved if it was to have the impact needed. This set the tone for a very constructive discussion. These are the sort of milestones that enable a system to work, and build confidence.

Senior managers need to buy into the need for evidence

If senior managers do not want evidence, they will not support evaluations. We have now run four courses for Directors-General (permanent secretaries) and Deputy Directors-General on evidence, training about 110 of the top managers in the public service. This has been very helpful in creating more interest in the use of evidence. Ten out of 46 national Directors-General have attended, and some are becoming evidence champions. As a direct result we have approved for 2016/2017 evaluations proposed by the Treasury and the National Departments of Justice and Home Affairs.

Inasmuch as our National Evaluation Policy Framework is seeking to engender a common language around evaluation, these courses are seeking to develop a common purpose around evidence and the key role that M&E (and research) plays in this.

Some Directors-General are not attending, and we see higher levels of participation from the stronger departments. But success breeds success, so it is good to start with what will work, not where it will be a big struggle.

It is important that Parliaments also use evaluations

Parliamentary portfolio committees can be a key source of demand. They often struggle to get a deep understanding of what is happening in departments. All too often reports obscure, rather than provide a spotlight on what is not working and why. Evaluations can provide these committees with in-depth analyses that can be very helpful in their deliberations.

DPME is working with the Parliamentary Budget Office and also seeking to help ensure that chairs of committees and the research staff that support them are aware of the findings of evaluations, and are following up on implementation of the results.

A challenge is that Members of Parliament (MPs) change. For example, the committee that DPME reports to changed after the last election, as did all the members. This makes it more difficult to support MPs to use M&E evidence effectively. Much more work is needed in this domain.

The challenge: linking supply and demand

In all cases the challenge is linking supply and demand. The role of units such as DPME is to be an evidence broker: to help create the supply of rigorous evidence, but also to make sure it is available in the right form, in the right place, at the right time, and that appropriate preparation is done and relationships built to help ensure that the evidence is used.

An example is the use of a steering committee. This involves key stakeholders and makes them part of the learning process and collective decision-making, so that they are more likely to be committed to implementing the findings.

The process of developing the Cabinet memorandum, drawing out and highlighting the key messages from the evaluation, is another very important step following the evaluation. This is done by compiling a short, seven-page story of the evaluation. The way one presents it is equally important: building ownership and keeping comments constructive, as in the example of the reception year of schooling above.

A key message is that process skills are as important as technical skills, and that it is essential to have some understanding of where senior managers and politicians are coming from.

So this is the tightrope we are walking: making both supply and demand effective, having technical and process skills so that we get high quality evidence, which actually does improve programmes and, as a result, people’s lives.


2 thoughts on "Walking the Tightrope: Promoting demand for evaluation evidence in South Africa"

  1. I am not very familiar with the conditions under which your system operates, but after going through your article I feel obliged to say something:

    I understand the frustration of first making senior people understand the importance of scientific evidence. Worse still is the frustration of having to orientate a new committee all the time so that they accept that such information is necessary to reveal the state of affairs and to provide recommendations for further improvement.

    I had hoped that each department would have a qualified evaluation researcher who would assist in building evaluation alongside each programme, from needs analysis and intrinsic evaluation through to the more popular implementation and impact evaluations. It is scary to imagine a programme running towards its end without continuous advice, with implementation not being continuously and properly monitored and necessary interventions secured, at this point in South Africa. Proper processes from the start not only make evaluation automatic but also improve buy-in and ownership. Let us hope that departments take up this responsibility and do not leave it to your department. The use of empirical evidence has to be ingrained in such a way that no programme runs without it, not only for Parliament but as a way of life.

    The other factor that may assist would be a participatory kind of arrangement. People, even at high levels of operation, may not want to hear about research terminology, but they enjoy taking part in the collection of data and wish to hear about the outcome. This may also improve levels of ownership. Stakeholders and the rest of them need to be involved.

    As I said, I am reading the article without knowledge of internal operations and may be suggesting things that are already implemented. I also wish to raise a concern about the analysis of government policies, but would not wish to deliberate, lest this resides in another department.
