Science of using science: an introduction to two new research reports on the use of evidence in decision-making

2016-04-27

The use of research evidence to inform decision-making can increase the efficacy and relevance of policies and practices. From US federal regulation on blood alcohol limits, to the design and funding of microfinance programmes in low- and middle-income countries, to the establishment of behavioural science units in public administrations, research evidence has informed and continues to inform decision-making. However, evidence of how and why research results were considered by decision-makers, and to what effect, often remains anecdotal; and for every example of an evidence-informed policy, one can likely point to a policy decision that did not consider the relevant evidence base.

This issue of research use, or rather non-use, is longstanding, and studies have greatly enhanced our understanding of the different types of research use (Weiss 1979); the many other factors that influence decision-making and the political nature of decisions themselves (e.g. Cairney 2015; ODI 2014); and the barriers to and facilitators of research use (Oliver et al 2014). Research use is clearly not a linear process, and its practice and study need to acknowledge the messiness of human behaviour and decision-making processes. Nevertheless, the moral and intellectual imperative to study and enhance the science of using science is clear too: investing (largely public) resources in the production of primary research studies without harvesting their results, synthesising them to identify and understand larger patterns and effects, and ensuring that decision-makers can access and consider these results if desired, seems unethical and intellectually unreasonable.

If, then, evidence-informed decision-making is a desirable practice (an admittedly messy and non-linear one), it seems justified to design and implement active interventions that aim to bring about the use of research evidence by decision-makers. Many such interventions have in fact been proposed: some aim to improve decision-makers’ access to evidence (e.g. Lavis et al 2003); others focus on decision-makers’ skills to use evidence (e.g. CASP) or on the relationships between decision-makers and researchers (e.g. Africa Evidence Network); systemic change and changes in decision-making processes and norms have also been proposed as ways to foster evidence use (e.g. NICE). However, the efficacy of these different interventions, that is, what works to increase research use, is less clear. Similarly, the range of these interventions is often limited, informed by a narrow (but in-depth) existing literature on research use, and we might therefore be missing innovative strategies and techniques proposed in other research fields (e.g. behavioural sciences, communication and media studies, andragogy) that could apply to increasing evidence use too. For this reason, a collaboration [1] between the Alliance for Useful Evidence, the UCL EPPI-Centre, the Wellcome Trust, and What Works Wellbeing set out to investigate what we know (and don’t know) about the efficacy of research use interventions, as well as what other interventions could be experimented with based on suggestions from the wider social science literature.

Our first aim was to harvest the existing research evidence on interventions supporting decision-makers’ use of evidence. For this, we employed systematic review methodology (more details on methods here) and developed a conceptual framework to unpack and categorise research use interventions (see below). Drawing, for example, on work by Nutley and colleagues, we identified six mechanisms of change through which interventions could influence evidence use. Since the use of evidence in practice amounts to a behaviour change on the part of the decision-maker, we further integrated Michie and colleagues’ COM-B model of behaviour change into our conceptual framework. The COM-B model suggests that behaviour change (B) is a function of capability to change (C), opportunity to change (O), and motivation to change (M). Interventions targeting research use could therefore work by improving decision-makers’ capability, opportunity, or motivation to use evidence. Taken together, these elements form the framework below, which we propose as a tool to structure research use interventions and to synthesise their effects.

[Figure: conceptual framework combining the six mechanisms of change with the COM-B model]

Our second aim was to explore whether there might be additional interventions of relevance to support decision-makers’ use of evidence. Our hunch was that social science research holds a vast body of knowledge on potentially relevant interventions that could be more closely integrated and experimented with in the research use domain. We therefore applied the same mechanism and COM-B framework to scope social science research (e.g. psychology, organisational and management studies, adult education) for relevant interventions. Where we found relevant interventions, we assessed their reported effectiveness in the social sciences before configuring how they might support the mechanisms of change and strategies to improve decision-makers’ evidence use.

Following a fascinating and intense nine months of research, the findings of our Science of Using Science project were published at a report launch at the Wellcome Trust in London two weeks ago. We produced two project reports, catering to different levels of detail (accessible here and here). The Alliance for Useful Evidence further produced a Discussion Paper contextualising the review findings as well as extending and translating them for a policy audience. Over the next couple of weeks, we will be blogging about the review findings in more detail. So if you are interested in answers to the questions below (and much more), you can either access the project publications (here and here) or keep an eye out for our upcoming blogs on the Science of Using Science!

  • Do improved skills to access and make sense of evidence increase its use during decision-making? And if so, do we know what pedagogies and programmes are most relevant for adult learners?
  • Is access to evidence sufficient to increase its use? If not, what else can be done to improve effective access to evidence?
  • How do organisational structures and processes reinforce or inhibit evidence use? Can evidence use become part of organisational cultures?
  • Have we considered behavioural norms in our theory of change for evidence use? Can social marketing, social norms, and public branding support evidence use?
  • What is the role of interactions and relationships between decision-makers and researchers in supporting evidence use? Do we have evidence of whether and how they work?
  • Can we nudge evidence use? And why should we talk more about information design and evidence literacies?

[1] This project was led by the Alliance for Useful Evidence, with generous funding and support from the Wellcome Trust and the What Works Centre for Wellbeing. The research was undertaken by Laurenz Langer, Janice Tripney, and David Gough of the EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London.