The Research-Policy Interface: Models, Frames, and Affecting Factors

To ensure an optimal evidence-based decision-making process, it is essential to understand the research-policy interface and to be able to measure the use and influence of research on policy. Different models for conceptualizing the research-policy interface have emerged over the past several decades and have increasingly recognized the inadvisability of over-simplifying this relationship (Nutley, Walter, & Davies, 2007).

In chapter four of Using Evidence, Nutley, Walter, and Davies (2007) explain various models of the research-policy interface, from simple, hierarchical models to more complex, network models. Traditional models of the research-policy relationship tend to be simple, linear, and rational. The relationship these models describe is one-sided: researchers disseminate their findings to passively receptive policy-makers. Although these models may be useful in some contexts, their omissions and assumptions make them generally unhelpful for gaining a realistic, holistic understanding of the research-policy relationship.

A more complex understanding of the relationship is found in network approaches and context-focused models of research use (Nutley et al., 2007). Both emphasize the importance of relationships between actors, the apparent irrationality and non-linearity of the process, and the fact that research is only one of many factors influencing policy. Postmodern accounts of research use deny the distinction between research and policy, focusing instead on situated knowledges – subjective, partial knowledge that is inescapably tied to a particular time, place, and context (Nutley et al., 2007). According to these accounts, research and policy are both socially constructed through power relations. Postmodernists critique earlier models for privileging research and failing to recognize the power dynamics in the assumed superiority of science over local knowledge. What is excluded or suppressed in the research-policy relationship therefore reveals power struggles.

Coffey and O’Toole (2012) identify four forms of knowledge involved in most research-policy relationships: scientific, managerial, lay, and Indigenous. The authors state that “in considering barriers and filters, it is important to consider the potentially pervasive ways in which some forms of knowledge are privileged and others marginalized” (Coffey & O’Toole, 2012, p. 324). In other words, societal power relations confer greater legitimacy, credibility, and relevance on some types of knowledge than on others. Consequently, information derived from certain forms of knowledge will have less impact on policy-making than information derived from others, based on power inequality rather than inherent value.

Coffey and O’Toole (2012) discuss a knowledge systems framework – itself an example of a complex model of the research-policy relationship – for understanding how different forms and sources of knowledge interact and differentially influence policy. The authors identify four approaches for studying the research-policy relationship: stakeholder analysis, network analysis, institutional analysis, and discourse analysis. Together, these methods account for the potentially large number and variety of stakeholders, social systems, and relationships, as well as the importance of institutions and discursive practices.

Leith et al. (2014) discuss the operating environments of the research-policy interface, which they describe as the links between various stakeholders, actors, values, stakes, institutions, and processes. They also specify that problem structures within these operating environments determine the extent to which a research-policy interaction will follow traditional, linear, rational models or complex, contextual, irrational ones. The authors detail levels of problem structure ranging from well-structured through moderately structured and poorly structured to unstructured. On the one hand, well-structured problems are characterized by low uncertainty and few differing values and opinions, and will generally follow more traditional models. On the other hand, unstructured problems involve high uncertainty and many differing values and opinions; in this case, more complex, irrational models will prevail.

The factors that affect the efficacy of the research-policy interface include uncertainty, credibility, relevance, and legitimacy – all of which can act as barriers or enablers depending on the circumstances. Heink et al. (2015) discuss the challenge – and the importance – of increasing the credibility, relevance, and legitimacy of research for use in policy. Drawing from Sarkki et al. (2014), the authors explain trade-offs whereby increasing one of these attributes will likely come at the expense of another. For example, under the clarity-complexity trade-off, increasing clarity gives research greater relevance but sacrifices complexity, which often decreases legitimacy. The speed-quality trade-off works similarly: increasing speed increases relevance, but the resulting drop in quality decreases legitimacy. Striking a balance between speed and quality, and between clarity and complexity, is therefore necessary to optimize the research-policy interface. Credibility is also a necessary component for encouraging the use of research in policy-making. The credibility of research, however, is based not solely on the quality of the data themselves, but also on the medium and mechanisms through which information is presented to those outside the research community, as well as on the existing knowledge, beliefs, and biases of those receiving it. Determining the credibility of research information is, therefore, far from a value-free process.

Morton et al. (2011) performed two experiments showing how presenting the same information through different frames can alter the way people understand it and shape their willingness to take action. In particular, the researchers examined how uncertainty, depending on how it is framed, can act as either a barrier or an enabler of action. The results of both experiments confirmed the researchers’ hypothesis: information presented as certain with negative framing led to the lowest willingness to act, while information presented as uncertain with positive framing led to the highest. The authors attribute these results to the link between empowerment and action: when people believe there is a greater chance that their actions could have a positive impact, they are more willing to act. These experiments show how important the presentation and communication of research is if the aim is to encourage policy-makers, the general public, or others to act on the information.

In summary, many different models have been proposed to describe the research-policy interface, ranging from simple, linear models to complex ones, and the most suitable model depends on the particular issue under consideration (Leith et al., 2014). How uncertainty is framed, and the balance struck in the trade-offs between relevance and legitimacy, shape how effectively research influences policy (Morton et al., 2011; Heink et al., 2015). It is important to keep in mind that the research-policy relationship is neither value-free nor equitable: power relations and subjective assessments of credibility play important roles in determining how research is or is not able to influence policy (Coffey & O’Toole, 2012; Nutley et al., 2007; Heink et al., 2015).


References

Coffey, B., & O’Toole, K. (2012). Towards an improved understanding of knowledge dynamics in integrated coastal zone management: A knowledge systems framework. Conservation and Society, 10(4), 318-329. doi:10.4103/0972-4923.105513

Heink, U., Marquard, E., Heubach, K., Jax, K., Kugel, C., Neßhöver, C., … Vandewalle, M. (2015). Conceptualizing credibility, relevance and legitimacy for evaluating the effectiveness of science-policy interfaces: Challenges and opportunities. Science and Public Policy, 42, 676-689. doi:10.1093/scipol/scu082

Leith, P., O’Toole, K., Haward, M., Coffey, B., Rees, C., & Ogier, E. (2014). Analysis of operating environments: A diagnostic model for linking science, society and policy for sustainability. Environmental Science and Policy, 39, 162-171. doi:10.1016/j.envsci.2014.01.001

Morton, T. A., Rabinovich, A., Marshall, D., & Bretschneider, P. (2011). The future that may (or may not) come: How framing changes responses to uncertainty in climate change communications. Global Environmental Change, 21, 103-109. doi:10.1016/j.gloenvcha.2010.09.013

Nutley, S. M., Walter, I., & Davies, H. T. O. (2007). Using evidence: How research can inform public services. Bristol: Policy Press.

Sarkki, S., Niemelä, J., Tinch, R., van den Hove, S., Watt, A., & Young, J. C. (2014). Balancing credibility, relevance and legitimacy: A critical assessment of trade-offs in science-policy interfaces. Science and Public Policy, 41, 194-206. doi:10.1093/scipol/sct046


Author: Laura Cutmore


This blog post is part of a series of posts authored by students in the graduate course “The Role of Information in Public Policy and Decision Making,” offered at Dalhousie University.
