In this post, we investigate the various forms evidence can take and the difficulties associated with interpreting it, highlighting its multifaceted nature. We consider the problem of competing interpretations of evidence and emphasize the vital role of quality assurance processes such as peer review, which aim to ensure the reliability and credibility of research. To effectively support policymakers’ decision-making, we also underline the importance of bridging organizational gaps and assembling carefully chosen, high-quality evidence.
What is Evidence?
“Evidence” is defined as information that substantiates a claim or proposition. It can take many forms, including quotations, examples, or facts, and can appear in many media, including digital evidence, testimonial or oral evidence, and physical evidence. There is currently no single, standardized definition of evidence (Yu et al., 2023); its nature depends on the context in which it is used. One well-known typology was developed by Carol Weiss, who identified four ways evidence is used in decision-making (Blewden et al., 2010). Evidence use is instrumental when a study’s conclusion directly influences a decision, while conceptual use affects a decision indirectly by changing one’s understanding of a problem. Imposed use occurs when evidence analysis is required as part of a decision-making process, and symbolic or political use arises when evidence is used to support or legitimize a decision that has already been made (Melancon, 2023).
However, even these definitions are contested by many people who use evidence in their work. MacKillop and Downe (2023) used Q methodology to investigate policymakers’ attitudes and perceptions about what constitutes evidence. They found that participants held four distinct views of evidence. Evidence-Based Policymaking (EBPM) Idealists contend that policymakers have a duty to use evidence in a way that is rigorous, understandable, and well-presented. Political respondents acknowledge that politics shapes the definition of evidence, but they do not reject EBPM entirely. Pragmatists hold that what counts as evidence changes depending on several variables; they also highlight the challenges associated with using evidence, including judging its quality and the fact that not all evidence is quantifiable. Inclusive respondents believe that what counts as evidence should be as broad and open as possible.
Creating Quality Evidence for Researchers
Because the definition of “evidence” varies widely, it can be challenging to judge the quality of research, especially qualitative research. Evidence can be viewed as untrustworthy if an audience perceives that information has been altered by researchers or creators, whether knowingly or unknowingly. Conflicts of interest and biases are two factors that can sway opinions or undermine the validity of analyses. Nutley et al. (2019) write that quality assurance procedures like peer review are important for creating hierarchy-based evidence, which places a higher value on evidence that has undergone multiple rounds of verification.
It is also important to develop techniques for measuring research reliability and validity. For example, a recent editorial in Nature explored some of the challenges associated with “digital agriculture,” a movement, undertaken in part by the Juno Project, to provide important information to farmers in developing countries (Editors, 2022). The editors point out that while there is an abundance of literature, it has not been synthesized well enough for researchers to use it effectively. They note that key terms like “small-scale farmer” have no universally accepted definition, which makes it difficult to hold conversations across scales and countries. Flemming and Noyes (2021) provide useful examples of how qualitative studies, which are usually context dependent, can be brought together through “qualitative evidence synthesis,” which surfaces commonalities across qualitative research conducted in different contexts. This approach makes qualitative findings easier to verify and build upon. Both papers show the importance of evidence syntheses, together with screening procedures like quality assurance, systematic review, and peer review, for increasing the usefulness and trustworthiness of qualitative research.
Creating Quality Evidence for Policymakers
High-quality evidence is important in the formulation of public policy, but it can be challenging to place the right information in the hands of policymakers. Several important enablers can support the creation and delivery of informative, high-quality evidence.
Bozeman (2022) studied how policies were made during the COVID-19 pandemic, tracing the quality of the evidence used in policy creation and evaluating which types of evidence were applied most successfully. He found that producing scientific papers alone does little to inform policymakers about the issues their policies address. Just as scientists rely on evidence syntheses, policymakers look for curated, secondary accounts of scientific and technical information to shape policy. He suggests that such syntheses can be very important for improving the science literacy of policymakers and politicians.
In addition to curated evidence, bridging actors play a crucial role in facilitating communication between scientists and policymakers. Decentralization and poor communication within government agencies can impede efficient policy execution and decision-making, and bridging organizations can support more collaborative and diverse problem solving (Soomai, 2017). For example, Geddes (2023) highlighted the role of select committees in bridging the gap between researchers and decision makers in the United Kingdom Parliament. His report found that these committees play an important role in synthesizing diverse evidence into language that policymakers can understand, and in bringing more diversity and public participation into committee activities. In the government agency responsible for managing fisheries, Soomai (2017) found that managers are typically operational decision makers who relay information generated by scientists to policy analysts, making them a bridge between science and policy. The way managers frame the economic, social, and biological aspects of fisheries management shapes policy priorities, giving them considerable power in decision-making.
Conclusion
There are many kinds of evidence, and solid evidence can be acquired from a variety of sources, leading to a more comprehensive understanding of a situation. Nonetheless, given the variety of forms evidence can take, it is crucial to critically evaluate both the evidence and its source (Yu et al., 2023). Understanding the sources and motivations behind information can shape its interpretation and reveal whether the research was unduly influenced. Tools like evidence syntheses and bridging organizations are essential for verifying the legitimacy, relevance, and trustworthiness of the information used in decision-making. Both scientists and policymakers should commit to open communication and cooperation and should work to improve the efficiency of information channels within and between their organizations.
References
Blewden, M., Carroll, P., & Witten, K. (2010). The use of social science research to inform policy development: Case studies from recent immigration policy. Kōtuitui: New Zealand Journal of Social Sciences Online, 5(1), 13-25. https://doi.org/10.1080/1175083X.2010.498087
Bozeman, B. (2022). Use of science in public policy: Lessons from the COVID-19 pandemic efforts to “Follow the Science.” Science and Public Policy, 49(5), 806-817. https://doi.org/10.1093/scipol/scac026
Editors. (2022, November 17). Agriculture sorely needs a system for evidence synthesis. Nature, 611(7936), 425-426. https://doi.org/10.1038/d41586-022-03694-5
Flemming, K., & Noyes, J. (2021). Qualitative evidence synthesis: Where are we at? International Journal of Qualitative Methods, 20, 1-13. https://doi.org/10.1177/1609406921993276
Geddes, M. (2023). Good evidence. How do select committees use evidence to support their work? Findings from a Parliamentary Academic Fellowship Scheme project (p. 43). University of Edinburgh. https://www.sps.ed.ac.uk/sites/default/files/assets/pdf/GoodEvidence-MarcGeddes-Jan2023.pdf
MacKillop, E., & Downe, J. (2023, January 10). Researchers engaging with policy should take into account policymakers’ varied perceptions of evidence. LSE Impact Blog. https://blogs.lse.ac.uk/impactofsocialsciences/2023/01/10/researchers-engaging-with-policy-should-take-into-account-policymakers-varied-perceptions-of-evidence/
Melancon, L. (2023, June 2). The demand for evidence in the Canadian public service. Research Impact Canada. https://researchimpact.ca/perspectives/the-demand-for-evidence-in-the-canadian-public-service/
Nutley, S. M., Davies, H. T. O., & Hughes, J. (2019). Assessing and labelling evidence. In A. Boaz, H. T. O. Davies, A. Fraser, & S. M. Nutley (Eds.), What works now? Evidence-informed policy and practice (Chapter 11, pp. 225-249). Bristol: The Policy Press.
Soomai, S. S. (2017). The science-policy interface in fisheries management: Insights about the influence of organizational structure and culture on information pathways. Marine Policy, 81, 53-63. https://doi.org/10.1016/j.marpol.2017.03.016
Yu, X., Wu, S., Sun, Y., Wang, P., Wang, L., Su, R., Zhao, J., Fadlallah, R., Boeira, L., Oliver, S., Abraha, Y. G., Sewankambo, N. K., El-Jardali, F., Norris, S. L., & Chen, Y. (2023). Exploring the diverse definitions of “evidence”: A scoping review. BMJ Evidence-Based Medicine, 29(1), 37-43. https://doi.org/10.1136/bmjebm-2023-112355
Authors: Twinkle Dev and Lori Mombourquette
This blog post is part of a series of posts authored by students in the graduate course “Information in Public Policy and Decision Making” offered at Dalhousie University.