Day 2: Understanding evidence and context at #evidence2016

By Ameer Hohlfeld


Cross governmental panel representatives from South African departments: DPSA, DPME, DST, PSPPD and National Treasury

Day two of #evidence2016 focused on understanding the evidence, with particular attention to government officials tasked with implementing policy. The day opened with echoes of day one's theme of engagement. Ms. Beryl Leach made some profound statements that resonated with me as a researcher. She highlighted that presenting policy-makers with evidence alone is never going to be enough to bring about policy change. She also pointed out that while we as researchers are well read in our own fields, we need to put ourselves in the shoes of policy-makers and understand what their needs and wants are, a lesson she learnt early in her research career. Ms. Leach further emphasized that building a relationship with the relevant policy-makers is an important step in driving policy change, while noting that relationships are complex and dynamic, and that there is no one-size-fits-all approach to them. Finally, she suggested that language plays an important role when presenting evidence.

A common theme throughout day two was the use of language and the way researchers communicate their findings. When presenting evidence, one needs to consider who the target audience is, how best to capture their attention, and what language will allow them to fully appreciate the message being conveyed. Ultimately, this may determine whether the evidence results in policy change. We also need to recognize that there are key players other than government policy-makers who need to understand the evidence: civil society, the media, and the affected public would all benefit if evidence were presented to them in a comprehensible manner. During the day, someone asked how policy-makers without a research background should choose between potential policies when presented with multiple contradicting findings. The answer was that policy-makers should request a synthesis of the evidence through systematic reviews and consult independent researchers.

The second common theme was the need to appreciate context before policy implementation takes place. Societies may differ culturally or ethnically from the populations in which the research evidence was generated, so implementers and researchers must first understand the intended target population before carrying out the policy. As Dr. Muller noted, we should always be asking whether the research findings will work for us. For example, randomized controlled trials are generally carried out in controlled environments where confounding variables are eliminated, so generalizing from their results can be tricky; the factors that have the greatest impact on causal relationships are not always known. He thus emphasized that copying and pasting conclusions from research is unacceptable, and that more attention is needed on how evidence from one context may or may not apply to our own.

The cross-governmental panel took place in the afternoon. We learnt that "good policy, poor implementation" is a misnomer: good policy takes the implementation context into account. Encouragingly, the panel pointed to the Early Childhood Development (ECD) policy, considered South Africa's greatest success story in poverty alleviation, as a policy built on evidence. They also noted that improving policy takes many years, and that expecting overnight success is unfair given how young our democracy is and the structural changes our government had to make to address past injustices. Government officials should also not be ashamed to admit when policies do not work. If there is sufficient evidence, they should reset and begin again, but do so with caution, as restarting can cause stagnation.

Recurring questions throughout the day were:

- Who exactly needs to understand the evidence?
- How does one structure evidence so that others can understand it?
- Does one need to change the way one presents evidence, or should those to whom the evidence is directed (policy-makers) simply know how to interpret it?
- Alternatively, should the receivers of evidence be taught how to interpret it? If so, who should teach them, and how?