By Russell Wildeman
(Workshop participants reflecting on the evidence challenges in their respective departments)
“Once you enter the space of policy-making in government, you are on your own. This is a very lonely road, especially in terms of the capacity required to make credible policies.”
These are the words of a senior official in the South African government who shared with me the challenges of making policy in his department. The overriding feeling is that our officials are isolated: they do not know whom to turn to, and they simply do not have the staff numbers or capacity to help them choose a suitable policy course.
As I listened to him, I could not help feeling a sense of anxiety, knowing that the challenges facing him and his department are likely to prolong the policy-making process and result in an end-product that is compromised from the start. I also realised that this situation is common across departments and across the various spheres of government. Having studied education policy implementation for the last thirteen years, it struck me that much of what qualifies as policy-making in the post-1994 period has been, and continues to be, a precarious business.
At the heart of this official's problems is the difficulty of accessing the evidence that would allow him and his department to choose among a bewildering array of policy options. And even if he does access this information, he has to consider whether evidence collected under different contexts is applicable to our own challenges and circumstances.
It is for this and similar reasons that we hosted a two-day capacity-building workshop on evidence-informed decision-making in Pretoria, South Africa. Government officials comprised the bulk of the attendees, but we also succeeded in attracting researchers from government-funded organisations such as the Human Sciences Research Council (HSRC) and the Council for Scientific and Industrial Research (CSIR). Our main facilitator was Dr Phil Davies, who heads up the London office of 3ie, the International Initiative for Impact Evaluation.
Our approach to the programme was to offer strong methodological content and to get participants to think through some of the enablers of, and barriers to, using research evidence in their policy-making work. We placed a premium on group work, as one of the key goals of the workshop was to get participants to think about policy-making in a realistic, resource-constrained manner. However, the limited duration of the workshop made it harder to fully appreciate the contextual challenges facing South African government officials, and this is one area of our work that will receive more attention in follow-up workshops.
In spite of the time limitations, we believe the workshop signalled important gains for participants and for our programme as a whole. These gains include:
- An enhanced understanding of how to assess the quality of evidence, especially the use of counterfactuals in scientific research;
- An improved understanding of the various evaluation designs, when to use them, their benefits and limitations;
- Exposure to key international websites devoted to displaying and summarising the results of systematic reviews; and
- A larger community of individuals who now form part of the growing network of interested evidence-users.
Another positive spin-off of the latest workshop is that we have had multiple requests from various individuals in government departments to extend our mentorship-based programmes. Presently, the UJ-BCURE team are considering applications for new mentorship-based support, and we are convinced that the additional investment would further strengthen the positive impact our programme has on officials’ ability to produce evidence-informed policies.
So does the government official feel less lonely and isolated after our workshop? Interestingly enough, it was the attendees themselves who suggested that those who attended the workshop should learn to network more, both with colleagues present at the workshop and with those who were not able to attend. Dr Davies shared an example from the United Kingdom (UK), where officials set aside an hour every week to inform themselves of recent evidence developments in their respective fields. Local officials would do well to adopt something similar. Furthermore, attendees will now become part of our public dissemination list and will be informed of future workshops and other knowledge events. Finally, our mentorship-based programmes will continue to address relevant policy issues and provide participants with the intellectual support they need.
In a small but meaningful way, the workshop and our broader programmes are beginning to give policy-makers hope, and to undo their sense of isolation by connecting them to an active community of evidence-users and experts.