Improving Students’ Evidence-Based Reasoning: Two Studies Demonstrating the Promise of An Intervention

Open Access
- Author:
- Du, Hongcui
- Graduate Program:
- Educational Psychology
- Degree:
- Doctor of Philosophy
- Document Type:
- Dissertation
- Date of Defense:
- July 20, 2023
- Committee Members:
- Alexandra List, Chair & Dissertation Advisor
- Amy Crosson, Outside Unit & Field Member
- Matthew McCrudden, Major Field Member
- Peggy Van Meter, Major Field Member
- Matthew McCrudden, Professor in Charge/Director of Graduate Studies
- Keywords:
- evidence-based reasoning
- evidence identification
- evidence evaluation
- critical thinking
- Abstract:
- The importance of evidence-based reasoning has been emphasized in educational standards across domains (e.g., the Common Core Standards, 2010; the Next Generation Science Standards, 2013) as well as in the learning goals for various introductory undergraduate courses (e.g., psychology, American Psychological Association, 2013; sociology, Pike et al., 2017; statistics, Carver et al., 2016). Nevertheless, prior research has found that students experience a variety of challenges with evidence-based reasoning. These include difficulties in reasoning about evidence (e.g., differentiating among various types of quantitative evidence, List et al., 2022; drawing appropriate evidence-based conclusions, Bleske-Rechek et al., 2015) and reasoning with evidence (e.g., providing appropriate and sufficient evidence in support of claims, McNeill & Krajcik, 2008). While a number of interventions have been developed to improve students' evidence-based reasoning, these have mostly focused on fostering students' provision of evidence while engaging in argumentation (i.e., reasoning with evidence, Iordanou & Constantinou, 2014; Reznitskaya et al., 2007). Less has been done to improve students' reasoning about evidence, an important aspect of evidence-based reasoning and a precursor to students' effective evidence provision. The purpose of the present studies was to examine the effectiveness of an Evidence-Based Reasoning (EBR) intervention in improving students' reasoning about (i.e., identification, interpretation, and evaluation of) four types of quantitative evidence (i.e., descriptive, comparative, causal, and correlational evidence). Participants were undergraduates enrolled in an introductory educational psychology course. In Study 1, students first completed a set of pre-test measures.
These included an Objective Evidence-Based Reasoning (OEBR) task, assessing students' abilities to identify different evidence types (i.e., identification) and to draw appropriate evidence-based conclusions (i.e., interpretation), and a Constructed Evidence-Evaluation (CEE) task, assessing students' abilities to evaluate conclusions or claims based on the type of evidence presented within constructed newspaper stories. Then, students were randomly assigned to one of two conditions, either completing the EBR training (i.e., intervention condition) or completing a research methods training adapted from an Introduction to Psychology textbook (i.e., control condition). Finally, all students completed a set of post-test measures, including the OEBR and CEE tasks as well as an Authentic-Constructed Evidence Evaluation (A-CEE) task assessing students' evaluation of evidence-based conclusions drawn from real newspaper articles. Results from Study 1 showed that the EBR intervention was more effective than the control training in improving students' OEBR task performance (i.e., evidence identification and interpretation), but not their CEE task performance (i.e., evaluation of evidence-based conclusions). Analyzing students' justifications in the evidence evaluation task, I found that one possible reason for this lack of effect on the CEE task was that the key reasoning strategy of identifying evidence type was not made salient enough to students. Therefore, I conducted a follow-up study to examine the effectiveness of a modified version of the EBR training (Study 2). In Study 2, the last module of the EBR intervention was removed. Instead, during their completion of the CEE and A-CEE tasks at post-test, students were asked to identify the type of evidence presented and were explicitly directed to consider evidence type when evaluating evidence-based conclusions. Students otherwise followed the same procedure as in Study 1.
Results showed that this modified version of the EBR training had effects on students' OEBR task performance similar to those of the control condition (i.e., the enhanced book chapter). At the same time, the EBR intervention was more effective than the enhanced book chapter in improving students' CEE and A-CEE task performance. This indicates that the EBR intervention may have helped students develop more robust schemas for evidence types, which students could then draw upon when evaluating evidence-based conclusions. Thus, across two studies, I demonstrate the promise of the EBR intervention in improving students' evidence-based reasoning.