What is the Critical Thinking Test?


Updated November 16, 2023

Edward Melett

The Critical Thinking Test is a comprehensive evaluation designed to assess a candidate's cognitive capacity and analytical skill.

This formal examination, often referred to as the critical thinking assessment, serves as a benchmark for candidates who want to demonstrate their judgment and problem-solving ability.

The test gauges a range of skills, including logical reasoning, analytical thinking, and the ability to evaluate and synthesize information.

This article explores the Critical Thinking Test: what it measures, why it matters, and the essential skills it assesses.

We will also work through examples of critical thinking questions, illustrating the kinds of challenging scenarios candidates encounter.

Before taking the critical thinking test, it is worth spending time on preparation. A practice test lets you hone the skills assessed in the actual examination before facing the real challenge. The critical thinking assessment draws on the following skills:

  • Logical Reasoning: the ability to deduce conclusions from given information, assess the validity of arguments, and recognize patterns in logic.
  • Analytical Thinking: the ability to dissect complex scenarios, identify key components, and synthesize information to draw insightful conclusions.
  • Problem-Solving Proficiency: the ability to approach intricate, real-world problems systematically and derive effective solutions.

What to Expect

The Critical Thinking Practice Test is crafted to mirror the format and complexity of the actual examination. Expect a series of scenarios, each accompanied by a set of questions that demand thoughtful analysis and logical deduction. The scenarios span diverse fields, from business and science to everyday situations, ensuring a comprehensive evaluation of your critical thinking skills.

Examples of Critical Thinking Questions

Scenario: In a business context, analyze the potential impacts of a proposed strategy on both short-term profitability and long-term sustainability.

Question: What factors would you consider in determining the viability of the proposed strategy, and how might it affect the company's overall success?

Scenario: Evaluate conflicting scientific studies on a pressing environmental issue.

Question: Identify the key methodologies and data points in each study. How would you reconcile the disparities to form an informed, unbiased conclusion?

Why Practice Matters

Taking the Critical Thinking Practice Test familiarizes you with the test format and builds the habit of quick, careful reasoning. This preparatory phase lets you refine your approach so that you sit the real assessment with confidence.

The examples that follow offer strategies for tackling critical thinking questions; each practice question is a chance to sharpen your analysis before the real thing.

This is a practice critical thinking test.

The test consists of three questions.

After you have answered all the questions, you will be shown the correct answers and given full explanations.

Make sure you read and fully understand each question before answering. Work quickly, but don't rush. You cannot afford to make mistakes on a real test.

If you get a question wrong, make sure you find out why and learn how to answer this type of question in the future. 

Six friends are seated at a rectangular table in a restaurant. There are three chairs on each side. Adam and Dorky do not have anyone sitting to their right, and Clyde and Benjamin do not have anyone sitting to their left. Adam and Benjamin are not sitting on the same side of the table.

If Ethan is not sitting next to Dorky, who is seated immediately to the left of Felix?
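Puzzles like this can be solved systematically by enumerating every seating that satisfies the constraints. The sketch below is one such brute-force check in Python. The seat geometry it encodes (three seats per side, "left" and "right" taken from each sitter's own perspective facing the table, and "next to" meaning adjacent on the same side) is our assumed reading of the question, not something the test itself specifies.

```python
from itertools import permutations

people = ["Adam", "Benjamin", "Clyde", "Dorky", "Ethan", "Felix"]

# Seats 0-2 on one side (west to east), seats 3-5 on the facing side.
# Facing the table, each seat's "right" and "left" neighbours are:
right_of = {0: None, 1: 0, 2: 1, 3: 4, 4: 5, 5: None}
left_of  = {0: 1, 1: 2, 2: None, 3: None, 4: 3, 5: 4}
side     = {0: "A", 1: "A", 2: "A", 3: "B", 4: "B", 5: "B"}

answers = set()
for seating in permutations(people):           # seating[seat] = person
    pos = {p: s for s, p in enumerate(seating)}
    # Adam and Dorky have no one to their right.
    if right_of[pos["Adam"]] is not None or right_of[pos["Dorky"]] is not None:
        continue
    # Clyde and Benjamin have no one to their left.
    if left_of[pos["Clyde"]] is not None or left_of[pos["Benjamin"]] is not None:
        continue
    # Adam and Benjamin sit on different sides.
    if side[pos["Adam"]] == side[pos["Benjamin"]]:
        continue
    # Ethan is not seated next to Dorky (adjacent seat on the same side).
    d, e = pos["Dorky"], pos["Ethan"]
    if side[d] == side[e] and abs(d - e) == 1:
        continue
    # Record who sits immediately to Felix's left.
    ls = left_of[pos["Felix"]]
    answers.add(seating[ls] if ls is not None else None)

print(answers)
```

Under those assumptions the enumeration returns exactly one name, which is the kind of uniqueness check worth making before committing to an answer on a real test.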



CBE Life Sciences Education, 17(1), Spring 2018

Understanding the Complex Relationship between Critical Thinking and Science Reasoning among Undergraduate Thesis Writers

Jason E. Dowd

† Department of Biology, Duke University, Durham, NC 27708

Robert J. Thompson, Jr.

‡ Department of Psychology and Neuroscience, Duke University, Durham, NC 27708

Leslie A. Schiff

§ Department of Microbiology and Immunology, University of Minnesota, Minneapolis, MN 55455

Julie A. Reynolds


This study empirically examines the relationship between students’ critical-thinking skills and scientific reasoning as reflected in undergraduate thesis writing in biology. Writing offers a unique window into studying this relationship, and the findings raise potential implications for instruction.

Developing critical-thinking and scientific reasoning skills are core learning objectives of science education, but little empirical evidence exists regarding the interrelationships between these constructs. Writing effectively fosters students’ development of these constructs, and it offers a unique window into studying how they relate. In this study of undergraduate thesis writing in biology at two universities, we examine how scientific reasoning exhibited in writing (assessed using the Biology Thesis Assessment Protocol) relates to general and specific critical-thinking skills (assessed using the California Critical Thinking Skills Test), and we consider implications for instruction. We find that scientific reasoning in writing is strongly related to inference , while other aspects of science reasoning that emerge in writing (epistemological considerations, writing conventions, etc.) are not significantly related to critical-thinking skills. Science reasoning in writing is not merely a proxy for critical thinking. In linking features of students’ writing to their critical-thinking skills, this study 1) provides a bridge to prior work suggesting that engagement in science writing enhances critical thinking and 2) serves as a foundational step for subsequently determining whether instruction focused explicitly on developing critical-thinking skills (particularly inference ) can actually improve students’ scientific reasoning in their writing.

INTRODUCTION

Critical-thinking and scientific reasoning skills are core learning objectives of science education for all students, regardless of whether or not they intend to pursue a career in science or engineering. Consistent with the view of learning as construction of understanding and meaning ( National Research Council, 2000 ), the pedagogical practice of writing has been found to be effective not only in fostering the development of students’ conceptual and procedural knowledge ( Gerdeman et al. , 2007 ) and communication skills ( Clase et al. , 2010 ), but also scientific reasoning ( Reynolds et al. , 2012 ) and critical-thinking skills ( Quitadamo and Kurtz, 2007 ).

Critical thinking and scientific reasoning are similar but different constructs that include various types of higher-order cognitive processes, metacognitive strategies, and dispositions involved in making meaning of information. Critical thinking is generally understood as the broader construct ( Holyoak and Morrison, 2005 ), comprising an array of cognitive processes and dispositions that are drawn upon differentially in everyday life and across domains of inquiry such as the natural sciences, social sciences, and humanities. Scientific reasoning, then, may be interpreted as the subset of critical-thinking skills (cognitive and metacognitive processes and dispositions) that 1) are involved in making meaning of information in scientific domains and 2) support the epistemological commitment to scientific methodology and paradigm(s).

Although there has been an enduring focus in higher education on promoting critical thinking and reasoning as general or “transferable” skills, research evidence provides increasing support for the view that reasoning and critical thinking are also situational or domain specific ( Beyer et al. , 2013 ). Some researchers, such as Lawson (2010) , present frameworks in which science reasoning is characterized explicitly in terms of critical-thinking skills. There are, however, limited coherent frameworks and empirical evidence regarding either the general or domain-specific interrelationships of scientific reasoning, as it is most broadly defined, and critical-thinking skills.

The Vision and Change in Undergraduate Biology Education Initiative provides a framework for thinking about these constructs and their interrelationship in the context of the core competencies and disciplinary practice they describe ( American Association for the Advancement of Science, 2011 ). These learning objectives aim for undergraduates to “understand the process of science, the interdisciplinary nature of the new biology and how science is closely integrated within society; be competent in communication and collaboration; have quantitative competency and a basic ability to interpret data; and have some experience with modeling, simulation and computational and systems level approaches as well as with using large databases” ( Woodin et al. , 2010 , pp. 71–72). This framework makes clear that science reasoning and critical-thinking skills play key roles in major learning outcomes; for example, “understanding the process of science” requires students to engage in (and be metacognitive about) scientific reasoning, and having the “ability to interpret data” requires critical-thinking skills. To help students better achieve these core competencies, we must better understand the interrelationships of their composite parts. Thus, the next step is to determine which specific critical-thinking skills are drawn upon when students engage in science reasoning in general and with regard to the particular scientific domain being studied. Such a determination could be applied to improve science education for both majors and nonmajors through pedagogical approaches that foster critical-thinking skills that are most relevant to science reasoning.

Writing affords one of the most effective means for making thinking visible ( Reynolds et al. , 2012 ) and learning how to “think like” and “write like” disciplinary experts ( Meizlish et al. , 2013 ). As a result, student writing affords the opportunities to both foster and examine the interrelationship of scientific reasoning and critical-thinking skills within and across disciplinary contexts. The purpose of this study was to better understand the relationship between students’ critical-thinking skills and scientific reasoning skills as reflected in the genre of undergraduate thesis writing in biology departments at two research universities, the University of Minnesota and Duke University.

In the following subsections, we discuss in greater detail the constructs of scientific reasoning and critical thinking, as well as the assessment of scientific reasoning in students’ thesis writing. In subsequent sections, we discuss our study design, findings, and the implications for enhancing educational practices.

Critical Thinking

The advances in cognitive science in the 21st century have increased our understanding of the mental processes involved in thinking and reasoning, as well as memory, learning, and problem solving. Critical thinking is understood to include both a cognitive dimension and a disposition dimension (e.g., reflective thinking) and is defined as “purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based” ( Facione, 1990 , p. 3). Although various other definitions of critical thinking have been proposed, researchers have generally coalesced on this consensus expert view ( Blattner and Frazier, 2002 ; Condon and Kelly-Riley, 2004 ; Bissell and Lemons, 2006 ; Quitadamo and Kurtz, 2007 ) and the corresponding measures of critical-thinking skills ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ).

Both the cognitive skills and dispositional components of critical thinking have been recognized as important to science education ( Quitadamo and Kurtz, 2007 ). Empirical research demonstrates that specific pedagogical practices in science courses are effective in fostering students’ critical-thinking skills. Quitadamo and Kurtz (2007) found that students who engaged in a laboratory writing component in the context of a general education biology course significantly improved their overall critical-thinking skills (and their analytical and inference skills, in particular), whereas students engaged in a traditional quiz-based laboratory did not improve their critical-thinking skills. In related work, Quitadamo et al. (2008) found that a community-based inquiry experience, involving inquiry, writing, research, and analysis, was associated with improved critical thinking in a biology course for nonmajors, compared with traditionally taught sections. In both studies, students who exhibited stronger presemester critical-thinking skills exhibited stronger gains, suggesting that “students who have not been explicitly taught how to think critically may not reach the same potential as peers who have been taught these skills” ( Quitadamo and Kurtz, 2007 , p. 151).

Recently, Stephenson and Sadler-McKnight (2016) found that first-year general chemistry students who engaged in a science writing heuristic laboratory, which is an inquiry-based, writing-to-learn approach to instruction ( Hand and Keys, 1999 ), had significantly greater gains in total critical-thinking scores than students who received traditional laboratory instruction. Each of the four components—inquiry, writing, collaboration, and reflection—have been linked to critical thinking ( Stephenson and Sadler-McKnight, 2016 ). Like the other studies, this work highlights the value of targeting critical-thinking skills and the effectiveness of an inquiry-based, writing-to-learn approach to enhance critical thinking. Across studies, authors advocate adopting critical thinking as the course framework ( Pukkila, 2004 ) and developing explicit examples of how critical thinking relates to the scientific method ( Miri et al. , 2007 ).

In these examples, the important connection between writing and critical thinking is highlighted by the fact that each intervention involves the incorporation of writing into science, technology, engineering, and mathematics education (either alone or in combination with other pedagogical practices). However, critical-thinking skills are not always the primary learning outcome; in some contexts, scientific reasoning is the primary outcome that is assessed.

Scientific Reasoning

Scientific reasoning is a complex process that is broadly defined as “the skills involved in inquiry, experimentation, evidence evaluation, and inference that are done in the service of conceptual change or scientific understanding” ( Zimmerman, 2007 , p. 172). Scientific reasoning is understood to include both conceptual knowledge and the cognitive processes involved with generation of hypotheses (i.e., inductive processes involved in the generation of hypotheses and the deductive processes used in the testing of hypotheses), experimentation strategies, and evidence evaluation strategies. These dimensions are interrelated, in that “experimentation and inference strategies are selected based on prior conceptual knowledge of the domain” ( Zimmerman, 2000 , p. 139). Furthermore, conceptual and procedural knowledge and cognitive process dimensions can be general and domain specific (or discipline specific).

With regard to conceptual knowledge, attention has been focused on the acquisition of core methodological concepts fundamental to scientists’ causal reasoning and metacognitive distancing (or decontextualized thinking), which is the ability to reason independently of prior knowledge or beliefs ( Greenhoot et al. , 2004 ). The latter involves what Kuhn and Dean (2004) refer to as the coordination of theory and evidence, which requires that one question existing theories (i.e., prior knowledge and beliefs), seek contradictory evidence, eliminate alternative explanations, and revise one’s prior beliefs in the face of contradictory evidence. Kuhn and colleagues (2008) further elaborate that scientific thinking requires “a mature understanding of the epistemological foundations of science, recognizing scientific knowledge as constructed by humans rather than simply discovered in the world,” and “the ability to engage in skilled argumentation in the scientific domain, with an appreciation of argumentation as entailing the coordination of theory and evidence” ( Kuhn et al. , 2008 , p. 435). “This approach to scientific reasoning not only highlights the skills of generating and evaluating evidence-based inferences, but also encompasses epistemological appreciation of the functions of evidence and theory” ( Ding et al. , 2016 , p. 616). Evaluating evidence-based inferences involves epistemic cognition, which Moshman (2015) defines as the subset of metacognition that is concerned with justification, truth, and associated forms of reasoning. Epistemic cognition is both general and domain specific (or discipline specific; Moshman, 2015 ).

There is empirical support for the contributions of both prior knowledge and an understanding of the epistemological foundations of science to scientific reasoning. In a study of undergraduate science students, advanced scientific reasoning was most often accompanied by accurate prior knowledge as well as sophisticated epistemological commitments; additionally, for students who had comparable levels of prior knowledge, skillful reasoning was associated with a strong epistemological commitment to the consistency of theory with evidence ( Zeineddin and Abd-El-Khalick, 2010 ). These findings highlight the importance of the need for instructional activities that intentionally help learners develop sophisticated epistemological commitments focused on the nature of knowledge and the role of evidence in supporting knowledge claims ( Zeineddin and Abd-El-Khalick, 2010 ).

Scientific Reasoning in Students’ Thesis Writing

Pedagogical approaches that incorporate writing have also focused on enhancing scientific reasoning. Many rubrics have been developed to assess aspects of scientific reasoning in written artifacts. For example, Timmerman and colleagues (2011) , in the course of describing their own rubric for assessing scientific reasoning, highlight several examples of scientific reasoning assessment criteria ( Haaga, 1993 ; Tariq et al. , 1998 ; Topping et al. , 2000 ; Kelly and Takao, 2002 ; Halonen et al. , 2003 ; Willison and O’Regan, 2007 ).

At both the University of Minnesota and Duke University, we have focused on the genre of the undergraduate honors thesis as the rhetorical context in which to study and improve students’ scientific reasoning and writing. We view the process of writing an undergraduate honors thesis as a form of professional development in the sciences (i.e., a way of engaging students in the practices of a community of discourse). We have found that structured courses designed to scaffold the thesis-writing process and promote metacognition can improve writing and reasoning skills in biology, chemistry, and economics ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In the context of this prior work, we have defined scientific reasoning in writing as the emergent, underlying construct measured across distinct aspects of students’ written discussion of independent research in their undergraduate theses.

The Biology Thesis Assessment Protocol (BioTAP) was developed at Duke University as a tool for systematically guiding students and faculty through a “draft–feedback–revision” writing process, modeled after professional scientific peer-review processes ( Reynolds et al. , 2009 ). BioTAP includes activities and worksheets that allow students to engage in critical peer review and provides detailed descriptions, presented as rubrics, of the questions (i.e., dimensions, shown in Table 1 ) upon which such review should focus. Nine rubric dimensions focus on communication to the broader scientific community, and four rubric dimensions focus on the accuracy and appropriateness of the research. These rubric dimensions provide criteria by which the thesis is assessed, and therefore allow BioTAP to be used as an assessment tool as well as a teaching resource ( Reynolds et al. , 2009 ). Full details are available at www.science-writing.org/biotap.html .

Table 1. Thesis assessment protocol dimensions

In previous work, we have used BioTAP to quantitatively assess students’ undergraduate honors theses and explore the relationship between thesis-writing courses (or specific interventions within the courses) and the strength of students’ science reasoning in writing across different science disciplines: biology ( Reynolds and Thompson, 2011 ); chemistry ( Dowd et al. , 2015b ); and economics ( Dowd et al. , 2015a ). We have focused exclusively on the nine dimensions related to reasoning and writing (questions 1–9), as the other four dimensions (questions 10–13) require topic-specific expertise and are intended to be used by the student’s thesis supervisor.

Beyond considering individual dimensions, we have investigated whether meaningful constructs underlie students’ thesis scores. We conducted exploratory factor analysis of students’ theses in biology, economics, and chemistry and found one dominant underlying factor in each discipline; we termed the factor “scientific reasoning in writing” ( Dowd et al. , 2015a , b , 2016 ). That is, each of the nine dimensions could be understood as reflecting, in different ways and to different degrees, the construct of scientific reasoning in writing. The findings indicated evidence of both general and discipline-specific components to scientific reasoning in writing that relate to epistemic beliefs and paradigms, in keeping with broader ideas about science reasoning discussed earlier. Specifically, scientific reasoning in writing is more strongly associated with formulating a compelling argument for the significance of the research in the context of current literature in biology, making meaning regarding the implications of the findings in chemistry, and providing an organizational framework for interpreting the thesis in economics. We suggested that instruction, whether occurring in writing studios or in writing courses to facilitate thesis preparation, should attend to both components.
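For readers who want to experiment with this kind of analysis, the following is a minimal sketch of a single-factor model in Python, using scikit-learn's FactorAnalysis on simulated rubric scores. It illustrates the idea of extracting one dominant factor and per-dimension loadings; the authors' actual estimation procedure and data are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Toy data: one row per thesis, one column per BioTAP question 1-9,
# each entry a 1-5 rubric rating (randomly generated for illustration).
rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(65, 9)).astype(float)

# Standardize, then fit a one-factor model, mirroring the idea of a single
# dominant "scientific reasoning in writing" factor.
z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
fa = FactorAnalysis(n_components=1, random_state=0)
factor_scores = fa.fit_transform(z)        # one factor score per thesis
loadings = fa.components_.ravel()          # loading of each dimension

for q, loading in enumerate(loadings, start=1):
    print(f"Question {q}: loading = {loading:+.2f}")
```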

Research Question and Study Design

The genre of thesis writing combines the pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). However, there is no empirical evidence regarding the general or domain-specific interrelationships of scientific reasoning and critical-thinking skills, particularly in the rhetorical context of the undergraduate thesis. The BioTAP studies discussed earlier indicate that the rubric-based assessment produces evidence of scientific reasoning in the undergraduate thesis, but it was not designed to foster or measure critical thinking. The current study was undertaken to address the research question: How are students’ critical-thinking skills related to scientific reasoning as reflected in the genre of undergraduate thesis writing in biology? Determining these interrelationships could guide efforts to enhance students’ scientific reasoning and writing skills through focusing instruction on specific critical-thinking skills as well as disciplinary conventions.

To address this research question, we focused on undergraduate thesis writers in biology courses at two institutions, Duke University and the University of Minnesota, and examined the extent to which students’ scientific reasoning in writing, assessed in the undergraduate thesis using BioTAP, corresponds to students’ critical-thinking skills, assessed using the California Critical Thinking Skills Test (CCTST; August, 2016 ).

Study Sample

The study sample was composed of students enrolled in courses designed to scaffold the thesis-writing process in the Department of Biology at Duke University and the College of Biological Sciences at the University of Minnesota. Both courses complement students’ individual work with research advisors. The course is required for thesis writers at the University of Minnesota and optional for writers at Duke University. Not all students are required to complete a thesis, though it is required for students to graduate with honors; at the University of Minnesota, such students are enrolled in an honors program within the college. In total, 28 students were enrolled in the course at Duke University and 44 students were enrolled in the course at the University of Minnesota. Of those students, two students did not consent to participate in the study; additionally, five students did not validly complete the CCTST (i.e., attempted fewer than 60% of items or completed the test in less than 15 minutes). Thus, our overall rate of valid participation is 90%, with 27 students from Duke University and 38 students from the University of Minnesota. We found no statistically significant differences in thesis assessment between students with valid CCTST scores and invalid CCTST scores. Therefore, we focus on the 65 students who consented to participate and for whom we have complete and valid data in most of this study. Additionally, in asking students for their consent to participate, we allowed them to choose whether to provide or decline access to academic and demographic background data. Of the 65 students who consented to participate, 52 students granted access to such data. Therefore, for additional analyses involving academic and background data, we focus on the 52 students who consented. We note that the 13 students who participated but declined to share additional data performed slightly lower on the CCTST than the 52 others (perhaps suggesting that they differ by other measures, but we cannot determine this with certainty). Among the 52 students, 60% identified as female and 10% identified as being from underrepresented ethnicities.
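As an illustration only, the validity screen described above (at least 60% of items attempted and at least 15 minutes of testing time, plus consent) could be applied to a hypothetical response log like the one below; the column names are invented for the example.

```python
import pandas as pd

# Hypothetical response log: one row per student, with the fraction of
# CCTST items attempted and total testing time in minutes.
df = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "items_attempted_frac": [0.95, 0.55, 1.00, 0.80],
    "minutes": [32, 25, 12, 40],
    "consented": [True, True, True, False],
})

# Keep consenting students who attempted at least 60% of items
# and spent at least 15 minutes on the test.
valid = df[(df.items_attempted_frac >= 0.60)
           & (df.minutes >= 15)
           & df.consented]
print(valid.student_id.tolist())   # only student 1 passes all three checks
```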

In both courses, students completed the CCTST online, either in class or on their own, late in the Spring 2016 semester. This is the same assessment that was used in prior studies of critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). It is “an objective measure of the core reasoning skills needed for reflective decision making concerning what to believe or what to do” ( Insight Assessment, 2016a ). In the test, students are asked to read and consider information as they answer multiple-choice questions. The questions are intended to be appropriate for all users, so there is no expectation of prior disciplinary knowledge in biology (or any other subject). Although actual test items are protected, sample items are available on the Insight Assessment website ( Insight Assessment, 2016b ). We have included one sample item in the Supplemental Material.

The CCTST is based on a consensus definition of critical thinking, measures cognitive and metacognitive skills associated with critical thinking, and has been evaluated for validity and reliability at the college level ( August, 2016 ; Stephenson and Sadler-McKnight, 2016 ). In addition to providing an overall critical-thinking score, the CCTST assesses seven dimensions of critical thinking: analysis, interpretation, inference, evaluation, explanation, induction, and deduction. Scores on each dimension are calculated based on students’ performance on items related to that dimension. Analysis focuses on identifying assumptions, reasons, and claims and examining how they interact to form arguments. Interpretation, related to analysis, focuses on determining the precise meaning and significance of information. Inference focuses on drawing conclusions from reasons and evidence. Evaluation focuses on assessing the credibility of sources of information and claims they make. Explanation, related to evaluation, focuses on describing the evidence, assumptions, or rationale for beliefs and conclusions. Induction focuses on drawing inferences about what is probably true based on evidence. Deduction focuses on drawing conclusions about what must be true when the context completely determines the outcome. These are not independent dimensions; the fact that they are related supports their collective interpretation as critical thinking. Together, the CCTST dimensions provide a basis for evaluating students’ overall strength in using reasoning to form reflective judgments about what to believe or what to do ( August, 2016 ). Each of the seven dimensions and the overall CCTST score are measured on a scale of 0–100, where higher scores indicate superior performance. Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and below) skills.
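The band labels map onto the 0–100 scale exactly as stated above; a small helper makes the mapping explicit.

```python
def cctst_band(score: float) -> str:
    """Map a 0-100 CCTST scale score to its descriptive band."""
    if score >= 86:
        return "superior"
    if score >= 79:
        return "strong"
    if score >= 70:
        return "moderate"
    if score >= 63:
        return "weak"
    return "not manifested"

for s in (92, 80, 75, 65, 50):
    print(s, cctst_band(s))
```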

Scientific Reasoning in Writing

At the end of the semester, students’ final, submitted undergraduate theses were assessed using BioTAP, which consists of nine rubric dimensions that focus on communication to the broader scientific community and four additional dimensions that focus on the exhibition of topic-specific expertise ( Reynolds et al. , 2009 ). These dimensions, framed as questions, are displayed in Table 1 .

Student theses were assessed on questions 1–9 of BioTAP using the same procedures described in previous studies ( Reynolds and Thompson, 2011 ; Dowd et al. , 2015a , b ). In this study, six raters were trained in the valid, reliable use of BioTAP rubrics. Each dimension was rated on a five-point scale: 1 indicates the dimension is missing, incomplete, or below acceptable standards; 3 indicates that the dimension is adequate but not exhibiting mastery; and 5 indicates that the dimension is excellent and exhibits mastery (intermediate ratings of 2 and 4 are appropriate when different parts of the thesis make a single category challenging). After training, two raters independently assessed each thesis and then discussed their independent ratings with one another to form a consensus rating. The consensus score is not an average score, but rather an agreed-upon, discussion-based score. On a five-point scale, raters independently assessed dimensions to be within 1 point of each other 82.4% of the time before discussion and formed consensus ratings 100% of the time after discussion.
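The within-1-point agreement statistic reported above is straightforward to compute from paired ratings. A minimal sketch with toy data:

```python
import numpy as np

# Independent ratings from two trained raters on the same set of
# thesis-dimension pairs (toy values on the 1-5 rubric scale).
rater1 = np.array([5, 4, 3, 5, 2, 4, 4, 3, 5, 1])
rater2 = np.array([5, 3, 3, 4, 2, 5, 4, 4, 3, 1])

# Share of ratings within 1 point before discussion
# (82.4% in the study reported above).
within_one = np.mean(np.abs(rater1 - rater2) <= 1)
print(f"{within_one:.1%} of ratings within 1 point")
```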

In this study, we consider both categorical (mastery/nonmastery, where a score of 5 corresponds to mastery) and numerical treatments of individual BioTAP scores to better relate the manifestation of critical thinking in BioTAP assessment to all of the prior studies. For comprehensive/cumulative measures of BioTAP, we focus on the partial sum of questions 1–5, as these questions relate to higher-order scientific reasoning (whereas questions 6–9 relate to mid- and lower-order writing mechanics [ Reynolds et al. , 2009 ]), and the factor scores (i.e., numerical representations of the extent to which each student exhibits the underlying factor), which are calculated from the factor loadings published by Dowd et al. (2016) . We do not focus on questions 6–9 individually in statistical analyses, because we do not expect critical-thinking skills to relate to mid- and lower-order writing skills.
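Below is a sketch of how the two cumulative measures could be computed for a single thesis. The loadings and cohort statistics are placeholders, not the values published by Dowd et al. (2016), and the factor-score formula (loadings applied to standardized ratings) is one common convention rather than necessarily the authors' exact procedure.

```python
import numpy as np

# One thesis's ratings on BioTAP questions 1-9 (toy values, 1-5 scale).
ratings = np.array([4, 5, 3, 4, 5, 3, 4, 4, 5], dtype=float)

# Higher-order composite: partial sum of questions 1-5.
partial_sum = ratings[:5].sum()

# Factor-score sketch: weight standardized ratings by per-question loadings.
# These loadings and cohort statistics are placeholders for illustration.
loadings = np.array([0.6, 0.7, 0.5, 0.6, 0.7, 0.4, 0.3, 0.3, 0.2])
cohort_mean, cohort_sd = 3.5, 0.9
factor_score = loadings @ ((ratings - cohort_mean) / cohort_sd)

print(partial_sum, round(factor_score, 2))
```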

The final, submitted thesis reflects the student’s writing, the student’s scientific reasoning, the quality of feedback provided to the student by peers and mentors, and the student’s ability to incorporate that feedback into his or her work. Therefore, our assessment is not the same as an assessment of unpolished, unrevised samples of students’ written work. While one might imagine that such an unpolished sample may be more strongly correlated with critical-thinking skills measured by the CCTST, we argue that the complete, submitted thesis, assessed using BioTAP, is ultimately a more appropriate reflection of how students exhibit science reasoning in the scientific community.

Statistical Analyses

We took several steps to analyze the collected data. First, to provide context for subsequent interpretations, we generated descriptive statistics for the CCTST scores of the participants based on the norms for undergraduate CCTST test takers. To determine the strength of relationships among CCTST dimensions (including overall score) and the BioTAP dimensions, partial-sum score (questions 1–5), and factor score, we calculated Pearson’s correlations for each pair of measures. To examine whether falling on one side of the nonmastery/mastery threshold (as opposed to a linear scale of performance) was related to critical thinking, we grouped BioTAP dimensions into categories (mastery/nonmastery) and conducted Student’s t tests to compare the mean scores of the two groups on each of the seven dimensions and overall score of the CCTST. Finally, for the strongest relationship that emerged, we included additional academic and background variables as covariates in multiple linear-regression analysis to explore questions about how much observed relationships between critical-thinking skills and science reasoning in writing might be explained by variation in these other factors.
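In Python, the correlation and group-comparison steps of this pipeline correspond directly to standard SciPy routines. A sketch with simulated data (the variable names and values are illustrative only):

```python
import numpy as np
from scipy import stats

# Simulated stand-ins for the study variables (illustrative only).
rng = np.random.default_rng(1)
n = 65
cctst_overall = rng.normal(86, 6, n)     # CCTST overall scores
biotap_partial = rng.normal(20, 3, n)    # BioTAP partial sums (questions 1-5)
# Proxy for a mastery/nonmastery grouping on a BioTAP dimension.
mastery = biotap_partial > np.median(biotap_partial)

# Pearson's correlation between a CCTST measure and a BioTAP measure.
r, p_corr = stats.pearsonr(cctst_overall, biotap_partial)

# Student's t test comparing CCTST scores of mastery vs. nonmastery groups.
t, p_t = stats.ttest_ind(cctst_overall[mastery], cctst_overall[~mastery])

print(f"r = {r:.2f} (p = {p_corr:.3f}); t = {t:.2f} (p = {p_t:.3f})")
```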

Although BioTAP scores represent discrete, ordinal bins, the five-point scale is intended to capture an underlying continuous construct (from inadequate to exhibiting mastery). It has been argued that five categories is an appropriate cutoff for treating ordinal variables as pseudo-continuous ( Rhemtulla et al. , 2012 )—and therefore using continuous-variable statistical methods (e.g., Pearson’s correlations)—as long as the underlying assumption that ordinal scores are linearly distributed is valid. Although we have no way to statistically test this assumption, we interpret adequate scores to be approximately halfway between inadequate and mastery scores, resulting in a linear scale. In part because this assumption is subject to disagreement, we also consider and interpret a categorical (mastery/nonmastery) treatment of BioTAP variables.

We corrected for multiple comparisons using the Holm-Bonferroni method ( Holm, 1979 ). At the most general level, where we consider the single, comprehensive measures for BioTAP (partial-sum and factor score) and the CCTST (overall score), there is no need to correct for multiple comparisons, because the multiple, individual dimensions are collapsed into single dimensions. When we considered individual CCTST dimensions in relation to comprehensive measures for BioTAP, we accounted for seven comparisons; similarly, when we considered individual dimensions of BioTAP in relation to overall CCTST score, we accounted for five comparisons. When all seven CCTST and five BioTAP dimensions were examined individually and without prior knowledge, we accounted for 35 comparisons; such a rigorous threshold is likely to reject weak and moderate relationships, but it is appropriate if there are no specific pre-existing hypotheses. All p values are presented in tables for complete transparency, and we carefully consider the implications of our interpretation of these data in the Discussion section.
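The Holm-Bonferroni procedure sorts the p values and tests the k-th smallest against alpha/(m − k + 1), stopping at the first failure; with alpha = 0.05 and m = 35, the first threshold is 0.05/35 ≈ 0.00143, matching the cutoff quoted in the results below. A compact implementation:

```python
def holm_bonferroni(p_values, alpha=0.05):
    """Return a significance decision for each p value under Holm's method."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    significant = [False] * m
    for rank, i in enumerate(order):
        # Compare the (rank+1)-th smallest p value against alpha / (m - rank).
        if p_values[i] <= alpha / (m - rank):
            significant[i] = True
        else:
            break   # once one test fails, all larger p values fail too
    return significant

# Example: seven comparisons of CCTST dimensions against a BioTAP composite.
print(holm_bonferroni([0.0005, 0.004, 0.02, 0.04, 0.11, 0.30, 0.62]))
```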

RESULTS

CCTST scores for students in this sample ranged from the 39th to 99th percentile of the general population of undergraduate CCTST test takers (mean percentile = 84.3, median = 85th percentile; Table 2 ); these percentiles reflect overall scores that range from moderate to superior. Scores on individual dimensions and overall scores were sufficiently normal and far enough from the ceiling of the scale to justify subsequent statistical analyses.

Table 2. Descriptive statistics of CCTST dimensions a

Dimension         Minimum    Mean    Median    Maximum
Analysis             70      88.6      90        100
Interpretation       74      89.7      87        100
Inference            78      87.9      89        100
Evaluation           63      83.6      84        100
Explanation          61      84.4      87        100
Induction            74      87.4      87         97
Deduction            71      86.4      87         97
Overall              73      86        85         97

a Scores correspond to superior (86–100), strong (79–85), moderate (70–78), weak (63–69), or not manifested (62 and lower) skills.

The Pearson’s correlations between students’ cumulative scores on BioTAP (the factor score based on loadings published by Dowd et al. , 2016 , and the partial sum of scores on questions 1–5) and students’ overall scores on the CCTST are presented in Table 3 . We found that the partial-sum measure of BioTAP was significantly related to the overall measure of critical thinking ( r = 0.27, p = 0.03), while the BioTAP factor score was marginally related to overall CCTST ( r = 0.24, p = 0.05). When we looked at relationships between comprehensive BioTAP measures and scores for individual dimensions of the CCTST ( Table 3 ), we found significant positive correlations between both the BioTAP partial-sum and factor scores and CCTST inference ( r = 0.45, p < 0.001, and r = 0.41, p < 0.001, respectively). Although some other relationships have p values below 0.05 (e.g., the correlations between BioTAP partial-sum scores and CCTST induction and interpretation scores), they are not significant when we correct for multiple comparisons.

Table 3. Correlations between dimensions of CCTST and dimensions of BioTAP a

a In each cell, the top number is the correlation, and the bottom, italicized number is the associated p value. Correlations that are statistically significant after correcting for multiple comparisons are shown in bold.

b This is the partial sum of BioTAP scores on questions 1–5.

c This is the factor score calculated from factor loadings published by Dowd et al. (2016) .

When we expanded comparisons to include all 35 potential correlations among individual BioTAP and CCTST dimensions—and, accordingly, corrected for 35 comparisons—we did not find any additional statistically significant relationships. The Pearson’s correlations between students’ scores on each dimension of BioTAP and students’ scores on each dimension of the CCTST range from −0.11 to 0.35 ( Table 3 ); although the relationship between discussion of implications (BioTAP question 5) and inference appears to be relatively large ( r = 0.35), it is not significant ( p = 0.005; the Holm-Bonferroni cutoff is 0.00143). We found no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions (unpublished data), regardless of whether we correct for multiple comparisons.

The results of Student’s t tests comparing scores on each dimension of the CCTST of students who exhibit mastery with those of students who do not exhibit mastery on each dimension of BioTAP are presented in Table 4 . Focusing first on the overall CCTST scores, we found that the difference between those who exhibit mastery and those who do not in discussing implications of results (BioTAP question 5) is statistically significant ( t = 2.73, p = 0.008, d = 0.71). When we expanded t tests to include all 35 comparisons—and, like above, corrected for 35 comparisons—we found a significant difference in inference scores between students who exhibit mastery on question 5 and students who do not ( t = 3.41, p = 0.0012, d = 0.88), as well as a marginally significant difference in these students’ induction scores ( t = 3.26, p = 0.0018, d = 0.84; the Holm-Bonferroni cutoff is p = 0.00147). Cohen’s d effect sizes, which reveal the strength of the differences for statistically significant relationships, range from 0.71 to 0.88.

Table 4. The t statistics and effect sizes of differences in dimensions of CCTST across dimensions of BioTAP a

a In each cell, the top number is the t statistic for each comparison, and the middle, italicized number is the associated p value. The bottom number is the effect size. Correlations that are statistically significant after correcting for multiple comparisons are shown in bold.
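Cohen's d values like those reported above are computed from the group means and a pooled standard deviation. A minimal helper, with toy inference scores standing in for the study data:

```python
import numpy as np

def cohens_d(group1, group2):
    """Pooled-SD Cohen's d for two independent groups."""
    g1, g2 = np.asarray(group1, float), np.asarray(group2, float)
    n1, n2 = len(g1), len(g2)
    pooled_var = ((n1 - 1) * g1.var(ddof=1)
                  + (n2 - 1) * g2.var(ddof=1)) / (n1 + n2 - 2)
    return (g1.mean() - g2.mean()) / np.sqrt(pooled_var)

# Toy example: inference scores for mastery vs. nonmastery on question 5.
mastery = [92, 89, 95, 90, 88, 91]
nonmastery = [84, 86, 83, 88, 85, 82]
print(round(cohens_d(mastery, nonmastery), 2))
```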

Finally, we more closely examined the strongest relationship that we observed, which was between the CCTST dimension of inference and the BioTAP partial-sum composite score (shown in Table 3 ), using multiple regression analysis ( Table 5 ). Focusing on the 52 students for whom we have background information, we looked at the simple relationship between BioTAP and inference (model 1), a robust background model including multiple covariates that one might expect to explain some part of the variation in BioTAP (model 2), and a combined model including all variables (model 3). As model 3 shows, the covariates explain very little variation in BioTAP scores, and the relationship between inference and BioTAP persists even in the presence of all of the covariates.

Table 5. Partial sum (questions 1–5) of BioTAP scores ( n = 52)

Variable                     Model 1     Model 2     Model 3
CCTST inference              0.536***                0.491**
Grade point average                      0.176       0.092
Independent study courses                −0.087      0.001
Writing-intensive courses                0.131       0.021
Institution                              0.329       0.115
Male                                     0.085       0.041
Underrepresented group                   −0.114      −0.060
Adjusted R²                  0.273       −0.022      0.195

** p < 0.01.

*** p < 0.001.
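Nested regression models of this kind can be fit with statsmodels. In the sketch below the data are simulated and only a subset of the covariates is included, so the coefficients will not reproduce Table 5; the point is the model-comparison structure.

```python
import numpy as np
import statsmodels.api as sm

# Simulated stand-ins for the study variables (illustrative only).
rng = np.random.default_rng(2)
n = 52
inference = rng.normal(88, 5, n)          # CCTST inference scores
gpa = rng.normal(3.6, 0.2, n)             # grade point average
institution = rng.integers(0, 2, n)       # 0/1 indicator for institution
biotap = (0.5 * (inference - inference.mean()) / inference.std()
          + rng.normal(0, 0.8, n))        # BioTAP partial sum (toy)

# Model 1: BioTAP partial sum regressed on CCTST inference alone.
m1 = sm.OLS(biotap, sm.add_constant(inference)).fit()

# Model 3: inference plus background covariates (two here, for brevity).
X3 = sm.add_constant(np.column_stack([inference, gpa, institution]))
m3 = sm.OLS(biotap, X3).fit()

print(m1.rsquared_adj, m3.rsquared_adj)
```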

DISCUSSION

The aim of this study was to examine the extent to which the various components of scientific reasoning—manifested in writing in the genre of the undergraduate thesis and assessed using BioTAP—draw on general and specific critical-thinking skills (assessed using CCTST) and to consider the implications for educational practices. Although science reasoning involves critical-thinking skills, it also relates to conceptual knowledge and the epistemological foundations of science disciplines ( Kuhn et al. , 2008 ). Moreover, science reasoning in writing , captured in students’ undergraduate theses, reflects habits, conventions, and the incorporation of feedback that may alter evidence of individuals’ critical-thinking skills. Our findings, however, provide empirical evidence that cumulative measures of science reasoning in writing are nonetheless related to students’ overall critical-thinking skills ( Table 3 ). The particularly significant roles of inference skills ( Table 3 ) and the discussion of implications of results (BioTAP question 5; Table 4 ) provide a basis for more specific ideas about how these constructs relate to one another and what educational interventions may have the most success in fostering these skills.

Our results build on previous findings. The genre of thesis writing combines pedagogies of writing and inquiry found to foster scientific reasoning ( Reynolds et al. , 2012 ) and critical thinking ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ; Stephenson and Sadler-McKnight, 2016 ). Quitadamo and Kurtz (2007) reported that students who engaged in a laboratory writing component in a general education biology course significantly improved their inference and analysis skills, and Quitadamo and colleagues (2008) found that participation in a community-based inquiry biology course (that included a writing component) was associated with significant gains in students’ inference and evaluation skills. The shared focus on inference is noteworthy, because these prior studies actually differ from the current study; the former considered critical-thinking skills as the primary learning outcome of writing-focused interventions, whereas the latter focused on emergent links between two learning outcomes (science reasoning in writing and critical thinking). In other words, inference skills are impacted by writing as well as manifested in writing.

Inference focuses on drawing conclusions from argument and evidence. According to the consensus definition of critical thinking, the specific skill of inference includes several processes: querying evidence, conjecturing alternatives, and drawing conclusions. All of these activities are central to the independent research at the core of writing an undergraduate thesis. Indeed, a critical part of what we call “science reasoning in writing” might be characterized as a measure of students’ ability to infer and make meaning of information and findings. Because the cumulative BioTAP measures distill underlying similarities and, to an extent, suppress unique aspects of individual dimensions, we argue that it is appropriate to relate inference to scientific reasoning in writing . Even when we control for other potentially relevant background characteristics, the relationship is strong ( Table 5 ).

In taking the complementary view and focusing on BioTAP, when we compared students who exhibit mastery with those who do not, we found that the specific dimension of “discussing the implications of results” (question 5) differentiates students’ performance on several critical-thinking skills. To achieve mastery on this dimension, students must make connections between their results and other published studies and discuss the future directions of the research; in short, they must demonstrate an understanding of the bigger picture. The specific relationship between question 5 and inference is the strongest observed among all individual comparisons. Altogether, perhaps more than any other BioTAP dimension, this aspect of students’ writing provides a clear view of the role of students’ critical-thinking skills (particularly inference and, marginally, induction) in science reasoning.

While inference and discussion of implications emerge as particularly strongly related dimensions in this work, we note that the strongest contribution to “science reasoning in writing in biology,” as determined through exploratory factor analysis, is “argument for the significance of research” (BioTAP question 2, not question 5; Dowd et al. , 2016 ). Question 2 is not clearly related to critical-thinking skills. These findings are not contradictory, but rather suggest that the epistemological and disciplinary-specific aspects of science reasoning that emerge in writing through BioTAP are not completely aligned with aspects related to critical thinking. In other words, science reasoning in writing is not simply a proxy for those critical-thinking skills that play a role in science reasoning.

In a similar vein, the content-related, epistemological aspects of science reasoning, as well as the conventions associated with writing the undergraduate thesis (including feedback from peers and revision), may explain the lack of significant relationships between some science reasoning dimensions and some critical-thinking skills that might otherwise seem counterintuitive (e.g., BioTAP question 2, which relates to making an argument, and the critical-thinking skill of argument). It is possible that an individual’s critical-thinking skills may explain some variation in a particular BioTAP dimension, but other aspects of science reasoning and practice exert much stronger influence. Although these relationships do not emerge in our analyses, the lack of significant correlation does not mean that there is definitively no correlation. Correcting for multiple comparisons suppresses type 1 error at the expense of exacerbating type 2 error, which, combined with the limited sample size, constrains statistical power and makes weak relationships more difficult to detect. Ultimately, though, the relationships that do emerge highlight places where individuals’ distinct critical-thinking skills emerge most coherently in thesis assessment, which is why we are particularly interested in unpacking those relationships.

We recognize that, because only honors students submit theses at these institutions, this study sample is composed of a selective subset of the larger population of biology majors. Although this is an inherent limitation of focusing on thesis writing, links between our findings and results of other studies (with different populations) suggest that observed relationships may occur more broadly. The goal of improved science reasoning and critical thinking is shared among all biology majors, particularly those engaged in capstone research experiences. So while the implications of this work most directly apply to honors thesis writers, we provisionally suggest that they may extend to biology majors more broadly.

There are several important implications of this study for science education practices. Students’ inference skills relate to the understanding and effective application of scientific content. The fact that we find no statistically significant relationships between BioTAP questions 6–9 and CCTST dimensions suggests that such mid- to lower-order elements of BioTAP ( Reynolds et al. , 2009 ), which tend to be more structural in nature, do not focus on aspects of the finished thesis that draw strongly on critical thinking. In keeping with prior analyses ( Reynolds and Thompson, 2011 ; Dowd et al. , 2016 ), these findings further reinforce the notion that disciplinary instructors, who are most capable of teaching and assessing scientific reasoning and perhaps least interested in the more mechanical aspects of writing, may nonetheless be best suited to effectively model and assess students’ writing.

The goal of the thesis writing course at both Duke University and the University of Minnesota is not merely to improve thesis scores but to move students’ writing into the category of mastery across BioTAP dimensions. Recognizing that students with differing critical-thinking skills (particularly inference) are more or less likely to achieve mastery in the undergraduate thesis (particularly in discussing implications [question 5]) is important for developing and testing targeted pedagogical interventions to improve learning outcomes for all students.

The competencies characterized by the Vision and Change in Undergraduate Biology Education Initiative provide a general framework for recognizing that science reasoning and critical-thinking skills play key roles in major learning outcomes of science education. Our findings highlight places where science reasoning–related competencies (like “understanding the process of science”) connect to critical-thinking skills and places where critical thinking–related competencies might be manifested in scientific products (such as the ability to discuss implications in scientific writing). We encourage broader efforts to build empirical connections between competencies and pedagogical practices to further improve science education.

One specific implication of this work for science education is to focus on providing opportunities for students to develop their critical-thinking skills (particularly inference). Of course, as this correlational study is not designed to test causality, we do not claim that enhancing students’ inference skills will improve science reasoning in writing. However, as prior work shows that science writing activities influence students’ inference skills ( Quitadamo and Kurtz, 2007 ; Quitadamo et al. , 2008 ), there is reason to test such a hypothesis. Nevertheless, the focus must extend beyond inference as an isolated skill; rather, it is important to relate inference to the foundations of the scientific method ( Miri et al. , 2007 ) in terms of the epistemological appreciation of the functions and coordination of evidence ( Kuhn and Dean, 2004 ; Zeineddin and Abd-El-Khalick, 2010 ; Ding et al. , 2016 ) and disciplinary paradigms of truth and justification ( Moshman, 2015 ).

Although this study is limited to the domain of biology at two institutions with a relatively small number of students, the findings represent a foundational step in the direction of achieving success with more integrated learning outcomes. Hopefully, it will spur greater interest in empirically grounding discussions of the constructs of scientific reasoning and critical-thinking skills.

This study contributes to the efforts to improve science education, for both majors and nonmajors, through an empirically driven analysis of the relationships between scientific reasoning reflected in the genre of thesis writing and critical-thinking skills. This work is rooted in the usefulness of BioTAP as a method 1) to facilitate communication and learning and 2) to assess disciplinary-specific and general dimensions of science reasoning. The findings support the important role of the critical-thinking skill of inference in scientific reasoning in writing, while also highlighting ways in which other aspects of science reasoning (epistemological considerations, writing conventions, etc.) are not significantly related to critical thinking. Future research into the impact of interventions focused on specific critical-thinking skills (i.e., inference) for improved science reasoning in writing will build on this work and its implications for science education.

Supplementary Material

ACKNOWLEDGMENTS

We acknowledge the contributions of Kelaine Haas and Alexander Motten to the implementation and collection of data. We also thank Mine Çetinkaya-­Rundel for her insights regarding our statistical analyses. This research was funded by National Science Foundation award DUE-1525602.

  • American Association for the Advancement of Science. (2011). Vision and change in undergraduate biology education: A call to action. Washington, DC. Retrieved September 26, 2017, from https://visionandchange.org/files/2013/11/aaas-VISchange-web1113.pdf
  • August D. (2016). California Critical Thinking Skills Test user manual and resource guide. San Jose: Insight Assessment/California Academic Press.
  • Beyer C. H., Taylor E., Gillmore G. M. (2013). Inside the undergraduate teaching experience: The University of Washington’s growth in faculty teaching study. Albany, NY: SUNY Press.
  • Bissell A. N., Lemons P. P. (2006). A new method for assessing critical thinking in the classroom. BioScience, 56(1), 66–72. https://doi.org/10.1641/0006-3568(2006)056[0066:ANMFAC]2.0.CO;2
  • Blattner N. H., Frazier C. L. (2002). Developing a performance-based assessment of students’ critical thinking skills. Assessing Writing, (1), 47–64.
  • Clase K. L., Gundlach E., Pelaez N. J. (2010). Calibrated peer review for computer-assisted learning of biological research competencies. Biochemistry and Molecular Biology Education, (5), 290–295.
  • Condon W., Kelly-Riley D. (2004). Assessing and teaching what we value: The relationship between college-level writing and critical thinking abilities. Assessing Writing, (1), 56–75. https://doi.org/10.1016/j.asw.2004.01.003
  • Ding L., Wei X., Liu X. (2016). Variations in university students’ scientific reasoning skills across majors, years, and types of institutions. Research in Science Education, (5), 613–632. https://doi.org/10.1007/s11165-015-9473-y
  • Dowd J. E., Connolly M. P., Thompson R. J., Jr., Reynolds J. A. (2015a). Improved reasoning in undergraduate writing through structured workshops. Journal of Economic Education, (1), 14–27. https://doi.org/10.1080/00220485.2014.978924
  • Dowd J. E., Roy C. P., Thompson R. J., Jr., Reynolds J. A. (2015b). “On course” for supporting expanded participation and improving scientific reasoning in undergraduate thesis writing. Journal of Chemical Education, (1), 39–45. https://doi.org/10.1021/ed500298r
  • Dowd J. E., Thompson R. J., Jr., Reynolds J. A. (2016). Quantitative genre analysis of undergraduate theses: Uncovering different ways of writing and thinking in science disciplines. WAC Journal, 36–51.
  • Facione P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Research findings and recommendations. Newark, DE: American Philosophical Association. Retrieved September 26, 2017, from https://philpapers.org/archive/FACCTA.pdf
  • Gerdeman R. D., Russell A. A., Worden K. J. (2007). Web-based student writing and reviewing in a large biology lecture course. Journal of College Science Teaching, (5), 46–52.
  • Greenhoot A. F., Semb G., Colombo J., Schreiber T. (2004). Prior beliefs and methodological concepts in scientific reasoning. Applied Cognitive Psychology, (2), 203–221. https://doi.org/10.1002/acp.959
  • Haaga D. A. F. (1993). Peer review of term papers in graduate psychology courses. Teaching of Psychology, (1), 28–32. https://doi.org/10.1207/s15328023top2001_5
  • Halonen J. S., Bosack T., Clay S., McCarthy M., Dunn D. S., Hill G. W., Whitlock K. (2003). A rubric for learning, teaching, and assessing scientific inquiry in psychology. Teaching of Psychology, (3), 196–208. https://doi.org/10.1207/S15328023TOP3003_01
  • Hand B., Keys C. W. (1999). Inquiry investigation. Science Teacher, (4), 27–29.
  • Holm S. (1979). A simple sequentially rejective multiple test procedure. Scandinavian Journal of Statistics, (2), 65–70.
  • Holyoak K. J., Morrison R. G. (2005). The Cambridge handbook of thinking and reasoning. New York: Cambridge University Press.
  • Insight Assessment. (2016a). California Critical Thinking Skills Test (CCTST). Retrieved September 26, 2017, from www.insightassessment.com/Products/Products-Summary/Critical-Thinking-Skills-Tests/California-Critical-Thinking-Skills-Test-CCTST
  • Insight Assessment. (2016b). Sample thinking skills questions. Retrieved September 26, 2017, from www.insightassessment.com/Resources/Teaching-Training-and-Learning-Tools/node_1487
  • Kelly G. J., Takao A. (2002). Epistemic levels in argument: An analysis of university oceanography students’ use of evidence in writing. Science Education, (3), 314–342. https://doi.org/10.1002/sce.10024
  • Kuhn D., Dean D., Jr. (2004). Connecting scientific reasoning and causal inference. Journal of Cognition and Development, (2), 261–288. https://doi.org/10.1207/s15327647jcd0502_5
  • Kuhn D., Iordanou K., Pease M., Wirkala C. (2008). Beyond control of variables: What needs to develop to achieve skilled scientific thinking? Cognitive Development, (4), 435–451. https://doi.org/10.1016/j.cogdev.2008.09.006
  • Lawson A. E. (2010). Basic inferences of scientific reasoning, argumentation, and discovery. Science Education, (2), 336–364. https://doi.org/10.1002/sce.20357
  • Meizlish D., LaVaque-Manty D., Silver N., Kaplan M. (2013). Think like/write like: Metacognitive strategies to foster students’ development as disciplinary thinkers and writers. In Thompson R. J. (Ed.), Changing the conversation about higher education (pp. 53–73). Lanham, MD: Rowman & Littlefield.
  • Miri B., David B.-C., Uri Z. (2007). Purposely teaching for the promotion of higher-order thinking skills: A case of critical thinking. Research in Science Education, (4), 353–369. https://doi.org/10.1007/s11165-006-9029-2
  • Moshman D. (2015). Epistemic cognition and development: The psychology of justification and truth. New York: Psychology Press.
  • National Research Council. (2000). How people learn: Brain, mind, experience, and school (Expanded ed.). Washington, DC: National Academies Press.
  • Pukkila P. J. (2004). Introducing student inquiry in large introductory genetics classes. Genetics, 166(1), 11–18. https://doi.org/10.1534/genetics.166.1.11
  • Quitadamo I. J., Faiola C. L., Johnson J. E., Kurtz M. J. (2008). Community-based inquiry improves critical thinking in general education biology. CBE—Life Sciences Education, (3), 327–337. https://doi.org/10.1187/cbe.07-11-0097
  • Quitadamo I. J., Kurtz M. J. (2007). Learning to improve: Using writing to increase critical thinking performance in general education biology. CBE—Life Sciences Education, (2), 140–154. https://doi.org/10.1187/cbe.06-11-0203
  • Reynolds J. A., Smith R., Moskovitz C., Sayle A. (2009). BioTAP: A systematic approach to teaching scientific writing and evaluating undergraduate theses. BioScience, 59(10), 896–903. https://doi.org/10.1525/bio.2009.59.10.11
  • Reynolds J. A., Thaiss C., Katkin W., Thompson R. J. (2012). Writing-to-learn in undergraduate science education: A community-based, conceptually driven approach. CBE—Life Sciences Education, (1), 17–25. https://doi.org/10.1187/cbe.11-08-0064
  • Reynolds J. A., Thompson R. J. (2011). Want to improve undergraduate thesis writing? Engage students and their faculty readers in scientific peer review. CBE—Life Sciences Education, (2), 209–215. https://doi.org/10.1187/cbe.10-10-0127
  • Rhemtulla M., Brosseau-Liard P. E., Savalei V. (2012). When can categorical variables be treated as continuous? A comparison of robust continuous and categorical SEM estimation methods under suboptimal conditions. Psychological Methods, (3), 354–373. https://doi.org/10.1037/a0029315
  • Stephenson N. S., Sadler-McKnight N. P. (2016). Developing critical thinking skills using the science writing heuristic in the chemistry laboratory. Chemistry Education Research and Practice, (1), 72–79. https://doi.org/10.1039/C5RP00102A
  • Tariq V. N., Stefani L. A. J., Butcher A. C., Heylings D. J. A. (1998). Developing a new approach to the assessment of project work. Assessment and Evaluation in Higher Education, (3), 221–240. https://doi.org/10.1080/0260293980230301
  • Timmerman B. E. C., Strickland D. C., Johnson R. L., Payne J. R. (2011). Development of a “universal” rubric for assessing undergraduates’ scientific reasoning skills using scientific writing. Assessment and Evaluation in Higher Education, (5), 509–547. https://doi.org/10.1080/02602930903540991
  • Topping K. J., Smith E. F., Swanson I., Elliot A. (2000). Formative peer assessment of academic writing between postgraduate students. Assessment and Evaluation in Higher Education, (2), 149–169. https://doi.org/10.1080/713611428
  • Willison J., O’Regan K. (2007). Commonly known, commonly not known, totally unknown: A framework for students becoming researchers. Higher Education Research and Development, (4), 393–409. https://doi.org/10.1080/07294360701658609
  • Woodin T., Carter V. C., Fletcher L. (2010). Vision and Change in Biology Undergraduate Education: A Call for Action—Initial responses. CBE—Life Sciences Education, (2), 71–73. https://doi.org/10.1187/cbe.10-03-0044
  • Zeineddin A., Abd-El-Khalick F. (2010). Scientific reasoning and epistemological commitments: Coordination of theory and evidence among college science students. Journal of Research in Science Teaching, (9), 1064–1093. https://doi.org/10.1002/tea.20368
  • Zimmerman C. (2000). The development of scientific reasoning skills. Developmental Review, (1), 99–149. https://doi.org/10.1006/drev.1999.0497
  • Zimmerman C. (2007). The development of scientific thinking skills in elementary and middle school. Developmental Review, (2), 172–223. https://doi.org/10.1016/j.dr.2006.12.001


Critical Thinking Testing and Assessment

The purpose of assessment in instruction is improvement. The purpose of assessing instruction for critical thinking is to improve the teaching of discipline-based thinking (historical, biological, sociological, mathematical, etc.) and, with it, students’ abilities to think their way through content using disciplined skill in reasoning. The more particular we can be about what we want students to learn about critical thinking, the better we can devise instruction with that particular end in view.


The Foundation for Critical Thinking offers assessment instruments which share in the same general goal: to enable educators to gather evidence relevant to determining the extent to which instruction is teaching students to think critically (in the process of learning content). To this end, the Fellows of the Foundation recommend:

that academic institutions and units establish an oversight committee for critical thinking, and

that this oversight committee utilize a combination of assessment instruments (the more the better) to generate incentives for faculty, by providing them with as much evidence as feasible of the actual state of instruction for critical thinking.

The following instruments are available to generate evidence relevant to critical thinking teaching and learning:

Course Evaluation Form : Provides evidence of whether, and to what extent, students perceive faculty as fostering critical thinking in instruction (course by course). Machine-scoreable.

Online Critical Thinking Basic Concepts Test : Provides evidence of whether, and to what extent, students understand the fundamental concepts embedded in critical thinking (and hence tests student readiness to think critically). Machine-scoreable.

Critical Thinking Reading and Writing Test : Provides evidence of whether, and to what extent, students can read closely and write substantively (and hence tests students' abilities to read and write critically). Short-answer.

International Critical Thinking Essay Test : Provides evidence of whether, and to what extent, students are able to analyze and assess excerpts from textbooks or professional writing. Short-answer.

Commission Study Protocol for Interviewing Faculty Regarding Critical Thinking : Provides evidence of whether, and to what extent, critical thinking is being taught at a college or university. Can be adapted for high school. Based on the California Commission Study . Short-answer.

Protocol for Interviewing Faculty Regarding Critical Thinking : Provides evidence of whether, and to what extent, critical thinking is being taught at a college or university. Can be adapted for high school. Short-answer.

Protocol for Interviewing Students Regarding Critical Thinking : Provides evidence of whether, and to what extent, students are learning to think critically at a college or university. Can be adapted for high school. Short-answer.

Criteria for Critical Thinking Assignments : Can be used by faculty in designing classroom assignments, or by administrators in assessing the extent to which faculty are fostering critical thinking.

Rubrics for Assessing Student Reasoning Abilities : A useful tool in assessing the extent to which students are reasoning well through course content.  

All of the above assessment instruments can be used as part of pre- and post-assessment strategies to gauge development over various time periods.

Consequential Validity

All of the above assessment instruments, when used appropriately and graded accurately, should lead to a high degree of consequential validity. In other words, the use of the instruments should cause teachers to teach in such a way as to foster critical thinking in their various subjects. In this light, if students are to perform well on the various instruments, teachers will need to design instruction with that end in view. Students cannot become skilled in critical thinking without learning (first) the concepts and principles that underlie critical thinking and (second) applying them in a variety of forms of thinking: historical thinking, sociological thinking, biological thinking, etc. Students cannot become skilled in analyzing and assessing reasoning without practicing it. However, when they have routine practice in paraphrasing, summarizing, analyzing, and assessing, they will develop skills of mind requisite to the art of thinking well within any subject or discipline, not to mention thinking well within the various domains of human life.

For full copies of this and many other critical thinking articles, books, videos, and more, join us at the Center for Critical Thinking Community Online, the world's leading online community dedicated to critical thinking. Also featuring interactive learning activities, study groups, and even a social media component, this learning platform will change your conception of intellectual development.


Critical Thinking Test: Online Preparation & Free Practice Questions – 2024


What Is Critical Thinking?

Critical thinking is a form of decision making and reasoning using data and observations. Someone who is a strong critical thinker can find quality solutions efficiently and can evaluate issues objectively.

What Is a Critical Thinking Test?

Critical thinking tests provide companies with valuable insight into the leadership, reasoning, and overall capabilities of candidates. Because strong critical thinking skills are highly sought after, the critical thinking test is applicable to any field and discipline and to multiple levels of expertise, from recent graduate to executive. However, it is most commonly administered to those applying for criminal justice and business-related occupations.

Job seekers with upcoming critical thinking tests will be evaluated on more than their ability to reason; critical thinking tests also measure the following subsets:

  • Organizing & Planning
  • Strategizing
  • Decision Making
  • Problem Solving

The critical thinking test uses hypothetical scenarios to assess candidates. The scenarios are typically relevant to the field you are interested in, to assess your knowledge of the role. There will also be general questions concerning more basic issues or problems that commonly occur in a workplace environment.

The critical thinking test is multiple-choice, with thirty minutes to complete the assessment. Within a week of completion, candidates will receive a notification stating whether or not they passed.

How Is the Critical Thinking Test Scored?

The critical reasoning test is scored based on your raw score and your percentile in comparison with your norm group. It’s important to note that these will not be the same number.

A norm group is a collection of scores from individuals in your field at your level of experience. The percentile score is used to alert employers if you exceed, meet or miss the benchmark for the average expectations of candidates. You will be rated on a scale of one to one hundred, with fifty representing both the mean and the median score.

A raw score is simply the number of correct answers. On the critical thinking test, your raw score reflects your performance in the following areas (the sketch after this list shows how a raw score relates to a percentile):

  • Recognizing Assumptions: The candidate must be able to understand when a statement is made with no supporting evidence and how this can affect a decision. Further, candidates are asked to identify these discrepancies, whether they are stated explicitly or implicitly, and assess their relevance to the given scenario.
  • Evaluating Arguments: Candidates must evaluate arguments without considering inferences or being subjective. Beyond that, candidates must assess the supporting evidence, the structure of the argument and the degree of its influence. It is very important to set emotions aside for this portion of the critical thinking test.
  • Drawing Conclusions: Drawing conclusions puts a large emphasis on reasoning. In this section, it's important to assess all of the available evidence and data to form a plausible conclusion that accurately applies to all the given information. Employers also want to see candidates who will consider all possible solutions rather than making the evidence fit a desired narrative.
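To make the raw-score/percentile distinction concrete, here is a minimal sketch in Python. The answer key, responses, and norm-group scores are invented for illustration; actual publishers use their own proprietary norm tables and scaling.

```python
from bisect import bisect_left

def raw_score(answers, key):
    """Raw score: simply the number of correct answers."""
    return sum(a == k for a, k in zip(answers, key))

def percentile(score, norm_group):
    """Percent of the norm group scoring strictly below `score`."""
    ranked = sorted(norm_group)
    return 100 * bisect_left(ranked, score) / len(ranked)

# Invented data: a 10-question key, one candidate's answers, and the raw
# scores of a hypothetical norm group (peers in the same field and level).
key        = ["A", "C", "B", "D", "A", "B", "C", "D", "A", "B"]
answers    = ["A", "C", "B", "D", "B", "B", "C", "A", "A", "B"]
norm_group = [4, 5, 5, 6, 6, 6, 7, 7, 8, 9]

score = raw_score(answers, key)
print(score)                          # 8 correct answers (raw score)
print(percentile(score, norm_group))  # 80.0 -> above most of the norm group
```

Note how the same raw score of 8 could land at a very different percentile against a stronger or weaker norm group, which is why the two numbers are reported separately.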

Employers will receive all of this information in a performance report compiled by the assessment company. Employers will also be given insight into your overall potential, job knowledge, creativity and job performance per the report.

Where Will I Take a Critical Thinking Test?

Critical thinking tests are non-proctored online assessments that are typically sent via email after an initial screening. For some occupations, the company may ask that the candidate take the critical thinking test again on-site, either before their final interview or during an assessment day. The most common test candidates are asked to take is the Watson Glaser Critical Thinking Appraisal (WGCTA), published by the well-known assessment company Pearson. The appraisal is now in its third edition, with new scoring and the subsets described above. The WGCTA gained popularity because of its ability to assess a candidate's potential alongside their aptitude. Another established assessment is the SHL Critical Reasoning Battery, which contains sixty questions with a thirty-minute time limit. Both of the aforementioned critical thinking tests are multiple choice.

How to Prepare for the Critical Thinking Test?

The critical thinking test is difficult to study for because it is designed to assess your bare knowledge and raw skills. To prepare successfully, it is important to focus on the areas of the test that you can equip yourself for. One aspect that demands preparation is the time limit. Many candidates' scores suffer because they skip or guess too many questions in an attempt to beat the clock. If you want to optimize your chances of achieving a good score, use online practice tests to acquaint yourself with the time constraint and the general theme of the questions. By using online practice tests, you can find the pace that works best for you. Another helpful way to prepare is to run through sample questions. This way, you can warm up your brain and gain an understanding of the expectations that both the test and the company have of you.

Free Sample Questions to Practice

Sample deduction-style arguments to evaluate:

  • Lori has thirty cans of soda in a refrigerator in her garage and another fourteen sitting on the counter. Lori does not have any more cans of soda. Therefore, Lori has 44 cans of soda.
  • The accounting department loves math. My friend works in the accounting department. My friend loves math.
  • Everyone southbound on the freeway yesterday was late to work. Jackie was southbound on the freeway. Jackie was late to work.
  • Adrian lives in either Springfield, California, or Springfield, Illinois. If he lives in Illinois, then he is an American.


Critical Thinking Test: Sample Questions with Explanations (2024)

Employers value and seek candidates who demonstrate advanced critical thinking skills, and they often administer critical thinking tests as part of their hiring process. Critical thinking tests can be very difficult for those who don't prepare. A great way to start practicing is by taking our free critical thinking practice test.

What Does The Critical Thinking Test Include?

The Critical Thinking Test assesses your capacity to think critically and form logical conclusions when given written information. Critical thinking tests are generally used in job recruitment processes, particularly in the legal sector. These tests measure a candidate's analytical critical thinking abilities.

Why Is Critical Thinking Useful?

Critical thinking is put into action in various stages of decision-making and problem-solving tasks:

  • Identify the problem
  • Choose suitable information to find the solution
  • Identify the assumptions that are implied or explicitly stated in the text
  • Form hypotheses and choose the most suitable and credible answers
  • Form well-founded conclusions and determine the soundness of inferences

What Is the Watson Glaser Test and What Critical Thinking Skills Does It Measure?

The most common type of critical thinking test is the Watson-Glaser Critical Thinking Appraisal (W-GCTA). Typically used by legal and financial organizations, as well as management businesses, a Watson Glaser test is created to assess candidates’ critical thinking skills.

The test consists of 10 questions to be answered in approximately 10 minutes (although there is no timer on the test itself). Our test is slightly harder than the real thing, in order to make it sufficiently challenging practice.

You need to get 70% correct to pass the test; on a 25-question test, that means at least 18 correct answers (0.70 × 25 = 17.5, rounded up). Don't forget to first check out the test techniques section further down this page.

Questions: 25

Pass percentage: 70%

The test is broken down into five central areas:

  • Assumptions
  • Arguments
  • Deductions
  • Interpretation
  • Inferences


The Five Critical Thinking Skills Explained

1. Recognition of Assumptions

You'll be presented with a statement. The statement is then followed by several proposed assumptions. When answering, you must work out whether an assumption was or was not made in the statement. An assumption is a proclamation that an individual takes for granted. This section of the test measures your ability to refrain from forming assumptions about things that are not necessarily correct.

  • 1: Assumption Made
  • 2: Assumption Not Made

Although the passage does state that Charlie’s fundraising team is doing its best so that the charity event can meet its goal, nowhere did it state that their team is leading the event.

2. Evaluation of Arguments

You will be presented with an argument. You will then be asked to decide whether the argument is strong or weak. An argument is considered strong if it directly connects to the statement provided, and is believed to be significant.

Should participation awards be given in every competition?

No, participation awards should not be given in every competition, because studies have shown that this would cause participants to put in less effort, as they would get a prize no matter what the outcome is.

  • 1: Strong Argument
  • 2: Weak Argument

This is a strong argument, as it provides evidence for why participation awards should not be given in every competition.

3. Deductions

In deduction questions, you will need to form conclusions based solely on the information provided in the question, not on your own knowledge. You will be given a small passage of information and will need to evaluate a list of deductions made from that passage. If a conclusion cannot be formed from the information provided, then the conclusion does not follow. The answer must be founded entirely on the statements made, not on conclusions drawn from your own knowledge.

At a surprise party for Donna, Edna arrived after Felix and Gary did. Kelly arrived before Felix and Gary did.

Proposed conclusion: Kelly arrived before Edna did.

  • 1: Conclusion Follows
  • 2: Conclusion Does not Follow

For questions like this, jot down the clues to help you out. Use initials as a quick reference.

K | F&G | E

Looking at the simple diagram, “K”, which stands for “Kelly,” arrived before Edna (“E”) did. The answer is 1: Conclusion Follows.
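The same jotted-clues approach can be expressed in a few lines of code. This is purely an illustration of the reasoning, with the arrival order from the two statements above encoded as numeric tiers:

```python
# Earlier tier = arrived earlier: Kelly | Felix & Gary | Edna
tier = {"Kelly": 0, "Felix": 1, "Gary": 1, "Edna": 2}

def arrived_before(a, b):
    """True only when the stated facts force a to have arrived before b."""
    return tier[a] < tier[b]

print(arrived_before("Kelly", "Edna"))  # True  -> the conclusion follows
print(arrived_before("Felix", "Gary"))  # False -> their relative order is unknown
```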

4. Interpretation

In these questions, you are given a passage of information followed by a list of possible conclusions. You will need to interpret the information in the paragraph and determine whether or not each conclusion follows, based solely on the information given.

A number of students were given the following advice:

“The use of powerful words is a technique that makes you a better writer. Your choice of words is very important in molding the way people interact with the article. You should use powerful words to spice up your article. Power words should be used liberally to enhance the flavor of what you write!”

In the fourth sentence, it is stated, “Power words should be used liberally to enhance the flavor of what you write!”

Thus, if you were to write an essay, using powerful words can give more flavor to it.

5. Inferences

An inference is a conclusion made from observed or supposed facts and details. It is information that is not apparent in the information provided but rather is extracted from it. In this section, you will be provided with a passage of information about a specific scene or event. A list of possible inferences will then be given, and you will need to decide if they are ‘true’, ‘false’, ‘possibly true’, ‘possibly false’, or whether it is not possible to say based on the information provided.

With the advancement of technology, the need for more infrastructure has never been higher. According to the plan of the current U.S. Administration, it aims to invest $1 trillion in improving infrastructure, a portion of which will include priority projects and technologies that can strengthen its economic competitiveness, such as transportation, 5G wireless communication technology, rural broadband technologies, advanced manufacturing technologies, and even artificial intelligence.

It stated that it expects to work with Congress to develop a comprehensive infrastructure package, which is expected to have a budget of $200 billion for certain priorities.

  • 1: True
  • 2: Probably True
  • 3: Not Enough Information
  • 4: Probably False
  • 5: False

Although it was mentioned in the passage that the U.S. government is to allocate $200 billion to certain priorities, it did not specify whether these priorities are 'transportation, 5G wireless communication technology, rural broadband technologies, advanced manufacturing technologies, and artificial intelligence' or whether the aforementioned priorities will have a different allocation.

What we can be sure of, however, is that at least a portion of the $1 trillion infrastructure budget will be used on the mentioned priorities regardless, meaning that there is a chance that $200 billion will be used on those aforementioned areas.

Improve Your Score with Prepterminal’s Critical Thinking Course

The Critical Thinking test is difficult, but not impossible to overcome with practice. At PrepTerminal our psychometric test experts have developed a critical thinking preparatory test to provide you with the material you need to practice for your critical thinking test. Prepare with us to increase your chance of successfully overcoming this hurdle in the recruitment process.

Prepterminal’s preparatory critical thinking course features a structured study course along with critical thinking practice tests to help you improve your exam score. Our course includes video and text-based information presented in a clear and easy-to-understand manner so you can follow along at your own pace with ease.

Created by: Matt, psychometric tutor and PrepTerminal test expert

Critical Thinking Tests (2024 Guide)

Updated November 18, 2023

Nikki Dale

Critical thinking is the ability to scrutinize evidence using intellectual skills. Reflective skills are employed to reach clear, coherent and logical conclusions – rather than just accepting information as it is provided.

Critical thinking tests measure the candidate’s understanding of logical connections between ideas, the strength of an argument, alternate interpretations and the significance of a particular claim.

A major facet of critical thinking is the ability to separate facts from opinions and work against any subconscious bias.

In critical thinking tests, employers are looking for people who can think critically about information, showing they are open-minded, good problem-solvers and excellent decision-makers.

Critical thinking tests assess how well a candidate can analyze and reason when presented with specific information.

They are used as part of the application process in several industries, most commonly for professions where employees would need to use advanced judgment and analysis skills in decision-making.

For example:

Academic applications – In some instances, critical thinking tests are used to assess whether prospective students have the skills required to be successful in higher education.

Law – Critical thinking assessments are often used in the legal sector as part of the application process. In many law positions, facts are more important than opinion, subconscious bias or pre-existing ideas so an applicant needs to be skilled in critical thinking.

Finance – In financial institutions, decisions often need to be made based on facts rather than emotion or opinion. Judgments made in banking need to be skilled decisions based on logic and the strength of data and information – so to be successful, candidates need to demonstrate that they will not accept arguments and conclusions at face value.

Graduate roles – In some sectors, critical thinking tests are used in graduate recruitment because they are considered to be predictors of ability.

With several different tests available, suited to different industries, many top-level jobs are likely to include critical thinking assessments as part of the application process.

Critical Thinking Tests Explained

Critical thinking tests are usually presented in a similar format no matter who the publisher is. A paragraph of information and data is given, with a statement that is under scrutiny.

Multiple-choice answers are presented for each statement, and there may be more than one question about the same paragraph.

While each question is presented in the same way, different aspects of critical thinking are assessed throughout the test.

Assessing Assumptions

For this type of question, there may be something ‘taken for granted’ in the information provided – and it might not be explicitly stated.

The candidate needs to evaluate the scenario and conclude whether any assumptions are present. The statement below the scenario may or may not be supported by the information given; the answer selection is about whether the stated assumption is made or not made in the scenario.

Example Question for Assessing Assumptions

Practice Critical Thinking Test with JobTestPrep

The mainstream media presents information that is supported by the political party in power.

Assumption: The information that the mainstream media presents is always correct.

a) Assumption made b) Assumption not made

Determining Inferences

Following a paragraph of information containing evidence, you will be presented with an inference and need to assess whether the inference is absolutely true, possibly true, possibly false, absolutely false, or it is not possible to reach a decision.

An inference is a conclusion that can be reached based on logical reasoning from the information. Although all the evidence to support (or not support) the inference is included in the passage, it will not be obvious or explicitly stated, which makes the inference harder to conclude.

Example Question for Determining Inferences

It has been snowing all night and there is thick snow on the ground. Today’s weather is sunny and bright.

Inference: The snow will melt today.

a) Possibly true b) Absolutely true c) Possibly false d) Absolutely false e) Not possible to reach a decision

Making Deductions

For this type of question, the information presented will be a set of factual statements and the candidate will need to decide if the deduction applies or does not apply.

This logical thinking is a top-down exercise where all the information is provided and needs to be read in the order it is presented.

If statement A = B, does B = C? There should be no grey areas – it either does or does not follow.

Example Question for Making Deductions

All plants have leaves. All leaves are green.

Proposed deduction: All plants are green.

a) Deduction follows b) Deduction does not follow
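Reading the two statements as the chain the passage describes (plants → leaves → green, i.e., A = B and B = C), the deduction follows mechanically. Below is a minimal sketch of that chaining; the rule names are made up for the example:

```python
# Each "All A are B" statement becomes a link A -> B; the deduction
# follows if a chain of links connects the subject to the predicate.
rules = {"plant": "has_leaves", "has_leaves": "green"}

def follows(start, goal, rules):
    """Chase 'All A are B' links from `start`; True if `goal` is reached."""
    seen = {start}
    while start in rules:
        start = rules[start]
        if start in seen:      # guard against circular rule sets
            return False
        seen.add(start)
        if start == goal:
            return True
    return False

print(follows("plant", "green", rules))  # True -> deduction follows
```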

If you need to prepare for a number of different employment tests and want to outsmart the competition, choose a Premium Membership from JobTestPrep. You will get access to three PrepPacks of your choice, from a database that covers all the major test providers and employers, as well as tailored profession packs.

Get a Premium Package Now

Interpretation of Conclusions

Presented with information, the candidate needs to assess whether a given conclusion is correct based on the evidence provided.

For the purposes of the test, we need to believe that all the information provided in the paragraph is true, even if we have opinions about the correctness of the statement.

Example Question for Interpretation of Conclusions

When cooking a meal, one of the most important things to get right is the balance between major food groups. Satisfaction from a good meal comes from getting the most nutrition and can therefore be attributed to a wide variety of flavors, including vegetables, a good source of protein and carbohydrates. A balanced diet is about more than just everything in moderation and should be considered a scientific process with measuring of ingredients and efficient cooking methods.

Proposed conclusion: The best meals are those that are scientifically prepared.

a) Conclusion follows b) Conclusion does not follow

Evaluation of Arguments (Analysis of Arguments)

In this analysis section, the candidate is presented with a scenario and an argument that might be in favor of the scenario or against it.

The candidate needs to evaluate whether the argument itself is weak or strong. This needs to be based on the relevance to the scenario and whether it accurately addresses the question.

Example Question for Evaluation of Arguments

Should all drugs be made legal?

Proposed argument: No, all drugs are dangerous to everyone.

a) Argument is strong b) Argument is weak

Most Common Critical Thinking Tests in 2024

Watson Glaser Test

The Watson Glaser is the most commonly used critical thinking test and is used across many industries.

When sitting a Watson Glaser test, your results will be compared against a sample group of over 1,500 test-takers who are considered representative of graduate-level candidates.

The test is usually 40 questions long, with 30 minutes to answer, but there is a longer version that asks 80 questions with a time limit of an hour.

Who Uses This Test?

The Watson Glaser Test is used in a wide variety of industries for different roles, especially in the legal and banking sectors. Some employers that use the Watson Glaser Test are:

  • Bank of England
  • Irwin Mitchell
  • Simmons & Simmons

What Is the RED model?

The Watson Glaser Test is based on something called the ‘RED model’. The questions in the test are based on:

  • Recognizing assumptions
  • Evaluating arguments
  • Drawing conclusions

The science behind the Watson Glaser Test shows that candidates who demonstrate strong critical thinking skills in these areas are more likely to perform well in roles where logical decisions and judgments have to be made.

Where to Take a Free Practice Test

Watson Glaser Tests have a specific layout and format. If you are going to be completing one of the assessments as part of your application, it’s best to practice questions that match the test format.

You can find Watson Glaser practice tests at JobTestPrep as well as a prep pack to give you all the tips, tricks and information you need to make the most of your practice time.

Take a Practice Watson Glaser Test

SHL Critical Reasoning Battery Test

The SHL Critical Reasoning Battery Test includes questions based on numerical, verbal and inductive reasoning. This test is usually used for managerial and supervisory roles, and can include mechanical comprehension if needed for the job role (usually in engineering or mechanical roles).

You can find out more on JobTestPrep's SHL Critical Reasoning Battery pages.

Take a Practice SHL Test

GMAT

The Graduate Management Admission Test (GMAT) is an online adaptive test, using sophisticated algorithms to adjust the difficulty of the questions according to the answers already provided.

Questions include integrated, quantitative and verbal reasoning as well as an analytical writing assessment. The GMAT is widely used to predict performance in business or management programs in more than 1,700 universities and organizations.
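The actual algorithms are proprietary (modern adaptive tests are typically built on item-response-theory models), but the basic feedback loop (a correct answer raises the next item's difficulty, a miss lowers it) can be sketched as follows. Everything here, from the step size to the logistic response model, is illustrative rather than the GMAT's real scoring:

```python
import math
import random

def run_adaptive_test(n_items, ability, step=0.5):
    """Toy adaptive loop: difficulty steps up after a hit, down after a miss."""
    difficulty, history = 0.0, []
    for _ in range(n_items):
        # Logistic model: the further ability exceeds difficulty,
        # the more likely a correct answer.
        p_correct = 1.0 / (1.0 + math.exp(difficulty - ability))
        correct = random.random() < p_correct
        history.append((round(difficulty, 2), correct))
        difficulty += step if correct else -step
    return difficulty, history

random.seed(1)
final, history = run_adaptive_test(n_items=10, ability=1.0)
print(final)    # difficulty level the loop homed in on (tracks ability)
print(history)  # (difficulty presented, answered correctly?) per item
```

In a real adaptive test the item pool, exposure controls, and ability estimation are far more sophisticated, but the adjust-on-each-answer loop is the same basic idea.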

Take a Practice GMAT

How to Prepare for a Critical Thinking Test in 2024

Preparation is key to success in any pre-employment assessment. While some people think critical reasoning is not a skill you can practice, there are some steps you can take to perform at your best.

Critical thinking tests are straightforward but not necessarily easy.

Step 1. Consider Buying a Preparation Pack

If you can determine who the publisher is for the test you will take, it may be worthwhile investing in a prep pack from that particular publisher.

JobTestPrep offers prep packs for many major test publishers. These packs include realistic practice tests as well as study guides, tips and tricks to help you build your own question-solving strategies.

Step 2. Use Practice Tests

Even if you decide not to purchase a prep pack, taking practice tests will help you focus on the areas where you need to improve to be successful.

It is important to find out the publisher of the test you will take because not all critical thinking tests are at the same level and they may not follow the same structure. Timings, answering methodologies and the number of questions will vary between publishers.

You can usually find out the test publisher before you take the assessment by asking the recruiter or searching online.

Step 3. Practice Under Test Conditions

Critical thinking tests are timed. To give yourself the best chance of achieving a high score, you need to answer the questions quickly and efficiently.

Practicing under test conditions – including the time limit – will help you to understand how much time you need to spend on each question and will help you to develop efficient time management skills for the assessment.

Practicing under test conditions will also help you focus so you can make the most of the session.

Step 4. Practice Abstract Reasoning

Abstract reasoning is a form of critical thinking that uses logic to form a conclusion. Some abstract reasoning tests are presented as word problems.

Practicing these is a good way to flex critical thinking muscles. You can find practice questions on the Psychometric Success website.

Step 5. Practice Critical Thinking in Everyday Life

Reading widely, especially non-fiction, is a good way to practice your critical thinking skills in everyday life.

Newspaper articles, scientific or technical journals, and other sources of information present an opportunity to think about:

  • The strength of arguments
  • The perspective of the author
  • Whether there are enough facts presented to draw the conclusion given
  • Whether other conclusions could be drawn from the same information

Step 6. Revise Logical Fallacies

Knowledge of logical fallacies will help you to judge the effectiveness of an argument. A fallacy is a piece of 'faulty reasoning' in an argument, and fallacies are often seen in hyperbole or opinion pieces in newspapers and magazines.

There are many types of fallacy that you might come across, such as:

  • Strawman – An argument that doesn’t address the statement.
  • False cause – An argument based on a connection that doesn’t exist.
  • Ambiguity – An argument using a phrase that is unclear or that may have different meanings.
  • Appeal to popularity – An argument that states it must be true because many people believe it.

There are many others, including red herrings, appeal to authority and false dichotomy. Learning these will help you to identify a weak argument.

Step 7. Focus on Long-Term Practice

Cramming and panicking about a critical thinking assessment is rarely conducive to great performance.

If you are looking for a career in a sector where critical thinking skills are necessary, then long-term practice will have better results when you come to be assessed. Make critical thinking a part of life – so that every day can be a chance to practice recognizing assumptions.

Key Tips for Critical Thinking Test Success

Understand the format of the test and each question type.

Familiarity is important for any assessment, and in critical thinking tests, it is essential that you can recognize what the question is looking for. As mentioned above, this is usually one of the following:

  • Assessing assumptions
  • Determining inferences
  • Making deductions
  • Interpreting conclusions
  • Evaluating arguments

Practice tests will help you become comfortable with the structure and format of the test, including ways to answer, and will also demonstrate what the question types look like.

Read Test Content Carefully

Taking time to read and understand the content provided in the question is important to ensure that you can answer correctly.

The information you need to determine the correct answer will be provided although it might not be explicitly stated. Careful reading is an important part of critical thinking.

Only Use the Information Provided

While some of the information provided in the critical thinking test might be related to the role you are applying for, or about something that you have existing knowledge of, you mustn't use this knowledge during the test.

A facet of critical thinking is avoiding subconscious bias and opinion, so only use the information that is provided to answer the question.

Look Out for Facts and Fallacies

Throughout the critical thinking test, look out for facts and fallacies in the information and arguments provided.

Identifying fallacies will help you decide if an argument is strong and will help you answer questions correctly.

Final Thoughts

Critical thinking tests are used as pre-employment assessments for jobs that require effective communication, good problem-solving and great decision-making, such as those in the legal sector and banking.

These tests assess the ability of candidates to question and scrutinize evidence, make logical connections between ideas, find alternative interpretations and decide on the strength of an argument.

Not all critical thinking tests are the same, but they do have similar question types. Learning what these are and how to answer them will help you perform better. Practicing tests based on the specific publisher of your test will give you the best results.

You might also be interested in these other Psychometric Success articles:

The Watson Glaser Critical Thinking Appraisal

Or explore the Aptitude Tests / Test Types sections.



  18. Teaching critical thinking in science

    1. Identifying a problem and asking questions about that problem. 2. Selecting information to respond to the problem and evaluating it. 3. Drawing conclusions from the evidence. Critical thinking can be developed through focussed learning activities. Students not only need to receive information but also benefit from being encouraged to think ...

  19. PDF LEARNING STRAND 2 SCIENTIFIC AND CRITICAL THINKING SKILLS

    session guide 2 WHY DO I NEED TO BELIEVE IN SCIENCE? Session Guide No. 2 I. Objectives 1. Explain how to plan and organize thoughts for an oral presentation (LS1CS/EN-S-PSB- AE-12); 2. Determine the properties of a good visual aid (LS1CS/EN-S-PSB- AE-12, LS6DC-DA/PS PSC- AE/JHS-66); and 3. Deliver an effective oral presentation (LS1CS/EN-S-PSB ...

  20. science and critical thinking exam 2 Flashcards

    the total amount of energy will always remain the same. The energy may not be the same energy, but the amount of energy will be the same. In living systems, the beginning energy is chemical bond energy and the ending energy is heat. The TYPE of energy can change, but the AMOUNT of energy will remain the same.

  21. Master Clinical Decision Making and Critical Thinking: Exam #2

    Exam #2 Study Guide: You will have 50 questions which will be multiple choice, select all that apply and drop and drag. You will have 90 minutes to complete the test. All of the following information should be reviewed prior to the test: Clinical decision making Critical thinking Clinical judgment Diagnostic reasoning Phases of the nursing process (assessing, planning, implementing, evaluating ...

  22. TEST 4

    More from: Science And Critical Thinking (NSCI 1050) More from: Science And Critical Thinking NSCI 1050. University of Nebraska at Omaha. 27 Documents. Go to course. 13. ... Psych 1020 Exam 2 Studyguide. Introduction To Psychology II 100% (7) 7. Study Guide 1. Science And Critical Thinking 100% (6) More from: JS. JS.