SEP home page

  • Table of Contents
  • Random Entry
  • Chronological
  • Editorial Information
  • About the SEP
  • Editorial Board
  • How to Cite the SEP
  • Special Characters
  • Advanced Tools
  • Support the SEP
  • PDFs for SEP Friends
  • Make a Donation
  • SEPIA for Libraries
  • Entry Contents

Bibliography

Academic tools.

  • Friends PDF Preview
  • Author and Citation Info
  • Back to Top

Critical Thinking

Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms for thinking carefully, and the thinking components on which they focus. Its adoption as an educational goal has been recommended on the basis of respect for students’ autonomy and preparing students for success in life and for democratic citizenship. “Critical thinkers” have the dispositions and abilities that lead them to think critically when appropriate. The abilities can be identified directly; the dispositions indirectly, by considering what factors contribute to or impede exercise of the abilities. Standardized tests have been developed to assess the degree to which a person possesses such dispositions and abilities. Educational intervention has been shown experimentally to improve them, particularly when it includes dialogue, anchored instruction, and mentoring. Controversies have arisen over the generalizability of critical thinking across domains, over alleged bias in critical thinking theories and instruction, and over the relationship of critical thinking to other types of thinking.

2.1 Dewey’s Three Main Examples

2.2 dewey’s other examples, 2.3 further examples, 2.4 non-examples, 3. the definition of critical thinking, 4. its value, 5. the process of thinking critically, 6. components of the process, 7. contributory dispositions and abilities, 8.1 initiating dispositions, 8.2 internal dispositions, 9. critical thinking abilities, 10. required knowledge, 11. educational methods, 12.1 the generalizability of critical thinking, 12.2 bias in critical thinking theory and pedagogy, 12.3 relationship of critical thinking to other types of thinking, other internet resources, related entries.

Use of the term ‘critical thinking’ to describe an educational goal goes back to the American philosopher John Dewey (1910), who more commonly called it ‘reflective thinking’. He defined it as

active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it, and the further conclusions to which it tends. (Dewey 1910: 6; 1933: 9)

and identified a habit of such consideration with a scientific attitude of mind. His lengthy quotations of Francis Bacon, John Locke, and John Stuart Mill indicate that he was not the first person to propose development of a scientific attitude of mind as an educational goal.

In the 1930s, many of the schools that participated in the Eight-Year Study of the Progressive Education Association (Aikin 1942) adopted critical thinking as an educational goal, for whose achievement the study’s Evaluation Staff developed tests (Smith, Tyler, & Evaluation Staff 1942). Glaser (1941) showed experimentally that it was possible to improve the critical thinking of high school students. Bloom’s influential taxonomy of cognitive educational objectives (Bloom et al. 1956) incorporated critical thinking abilities. Ennis (1962) proposed 12 aspects of critical thinking as a basis for research on the teaching and evaluation of critical thinking ability.

Since 1980, an annual international conference in California on critical thinking and educational reform has attracted tens of thousands of educators from all levels of education and from many parts of the world. Also since 1980, the state university system in California has required all undergraduate students to take a critical thinking course. Since 1983, the Association for Informal Logic and Critical Thinking has sponsored sessions in conjunction with the divisional meetings of the American Philosophical Association (APA). In 1987, the APA’s Committee on Pre-College Philosophy commissioned a consensus statement on critical thinking for purposes of educational assessment and instruction (Facione 1990a). Researchers have developed standardized tests of critical thinking abilities and dispositions; for details, see the Supplement on Assessment . Educational jurisdictions around the world now include critical thinking in guidelines for curriculum and assessment.

For details on this history, see the Supplement on History .

2. Examples and Non-Examples

Before considering the definition of critical thinking, it will be helpful to have in mind some examples of critical thinking, as well as some examples of kinds of thinking that would apparently not count as critical thinking.

Dewey (1910: 68–71; 1933: 91–94) takes as paradigms of reflective thinking three class papers of students in which they describe their thinking. The examples range from the everyday to the scientific.

Transit : “The other day, when I was down town on 16th Street, a clock caught my eye. I saw that the hands pointed to 12:20. This suggested that I had an engagement at 124th Street, at one o’clock. I reasoned that as it had taken me an hour to come down on a surface car, I should probably be twenty minutes late if I returned the same way. I might save twenty minutes by a subway express. But was there a station near? If not, I might lose more than twenty minutes in looking for one. Then I thought of the elevated, and I saw there was such a line within two blocks. But where was the station? If it were several blocks above or below the street I was on, I should lose time instead of gaining it. My mind went back to the subway express as quicker than the elevated; furthermore, I remembered that it went nearer than the elevated to the part of 124th Street I wished to reach, so that time would be saved at the end of the journey. I concluded in favor of the subway, and reached my destination by one o’clock.” (Dewey 1910: 68–69; 1933: 91–92)

Ferryboat : “Projecting nearly horizontally from the upper deck of the ferryboat on which I daily cross the river is a long white pole, having a gilded ball at its tip. It suggested a flagpole when I first saw it; its color, shape, and gilded ball agreed with this idea, and these reasons seemed to justify me in this belief. But soon difficulties presented themselves. The pole was nearly horizontal, an unusual position for a flagpole; in the next place, there was no pulley, ring, or cord by which to attach a flag; finally, there were elsewhere on the boat two vertical staffs from which flags were occasionally flown. It seemed probable that the pole was not there for flag-flying.

“I then tried to imagine all possible purposes of the pole, and to consider for which of these it was best suited: (a) Possibly it was an ornament. But as all the ferryboats and even the tugboats carried poles, this hypothesis was rejected. (b) Possibly it was the terminal of a wireless telegraph. But the same considerations made this improbable. Besides, the more natural place for such a terminal would be the highest part of the boat, on top of the pilot house. (c) Its purpose might be to point out the direction in which the boat is moving.

“In support of this conclusion, I discovered that the pole was lower than the pilot house, so that the steersman could easily see it. Moreover, the tip was enough higher than the base, so that, from the pilot’s position, it must appear to project far out in front of the boat. Moreover, the pilot being near the front of the boat, he would need some such guide as to its direction. Tugboats would also need poles for such a purpose. This hypothesis was so much more probable than the others that I accepted it. I formed the conclusion that the pole was set up for the purpose of showing the pilot the direction in which the boat pointed, to enable him to steer correctly.” (Dewey 1910: 69–70; 1933: 92–93)

Bubbles : “In washing tumblers in hot soapsuds and placing them mouth downward on a plate, bubbles appeared on the outside of the mouth of the tumblers and then went inside. Why? The presence of bubbles suggests air, which I note must come from inside the tumbler. I see that the soapy water on the plate prevents escape of the air save as it may be caught in bubbles. But why should air leave the tumbler? There was no substance entering to force it out. It must have expanded. It expands by increase of heat, or by decrease of pressure, or both. Could the air have become heated after the tumbler was taken from the hot suds? Clearly not the air that was already entangled in the water. If heated air was the cause, cold air must have entered in transferring the tumblers from the suds to the plate. I test to see if this supposition is true by taking several more tumblers out. Some I shake so as to make sure of entrapping cold air in them. Some I take out holding mouth downward in order to prevent cold air from entering. Bubbles appear on the outside of every one of the former and on none of the latter. I must be right in my inference. Air from the outside must have been expanded by the heat of the tumbler, which explains the appearance of the bubbles on the outside. But why do they then go inside? Cold contracts. The tumbler cooled and also the air inside it. Tension was removed, and hence bubbles appeared inside. To be sure of this, I test by placing a cup of ice on the tumbler while the bubbles are still forming outside. They soon reverse” (Dewey 1910: 70–71; 1933: 93–94).

Dewey (1910, 1933) sprinkles his book with other examples of critical thinking. We will refer to the following.

Weather : A man on a walk notices that it has suddenly become cool, thinks that it is probably going to rain, looks up and sees a dark cloud obscuring the sun, and quickens his steps (1910: 6–10; 1933: 9–13).

Disorder : A man finds his rooms on his return to them in disorder with his belongings thrown about, thinks at first of burglary as an explanation, then thinks of mischievous children as being an alternative explanation, then looks to see whether valuables are missing, and discovers that they are (1910: 82–83; 1933: 166–168).

Typhoid : A physician diagnosing a patient whose conspicuous symptoms suggest typhoid avoids drawing a conclusion until more data are gathered by questioning the patient and by making tests (1910: 85–86; 1933: 170).

Blur : A moving blur catches our eye in the distance, we ask ourselves whether it is a cloud of whirling dust or a tree moving its branches or a man signaling to us, we think of other traits that should be found on each of those possibilities, and we look and see if those traits are found (1910: 102, 108; 1933: 121, 133).

Suction pump : In thinking about the suction pump, the scientist first notes that it will draw water only to a maximum height of 33 feet at sea level and to a lesser maximum height at higher elevations, selects for attention the differing atmospheric pressure at these elevations, sets up experiments in which the air is removed from a vessel containing water (when suction no longer works) and in which the weight of air at various levels is calculated, compares the results of reasoning about the height to which a given weight of air will allow a suction pump to raise water with the observed maximum height at different elevations, and finally assimilates the suction pump to such apparently different phenomena as the siphon and the rising of a balloon (1910: 150–153; 1933: 195–198).

Diamond : A passenger in a car driving in a diamond lane reserved for vehicles with at least one passenger notices that the diamond marks on the pavement are far apart in some places and close together in others. Why? The driver suggests that the reason may be that the diamond marks are not needed where there is a solid double line separating the diamond lane from the adjoining lane, but are needed when there is a dotted single line permitting crossing into the diamond lane. Further observation confirms that the diamonds are close together when a dotted line separates the diamond lane from its neighbour, but otherwise far apart.

Rash : A woman suddenly develops a very itchy red rash on her throat and upper chest. She recently noticed a mark on the back of her right hand, but was not sure whether the mark was a rash or a scrape. She lies down in bed and thinks about what might be causing the rash and what to do about it. About two weeks before, she began taking blood pressure medication that contained a sulfa drug, and the pharmacist had warned her, in view of a previous allergic reaction to a medication containing a sulfa drug, to be on the alert for an allergic reaction; however, she had been taking the medication for two weeks with no such effect. The day before, she began using a new cream on her neck and upper chest; against the new cream as the cause was mark on the back of her hand, which had not been exposed to the cream. She began taking probiotics about a month before. She also recently started new eye drops, but she supposed that manufacturers of eye drops would be careful not to include allergy-causing components in the medication. The rash might be a heat rash, since she recently was sweating profusely from her upper body. Since she is about to go away on a short vacation, where she would not have access to her usual physician, she decides to keep taking the probiotics and using the new eye drops but to discontinue the blood pressure medication and to switch back to the old cream for her neck and upper chest. She forms a plan to consult her regular physician on her return about the blood pressure medication.

Candidate : Although Dewey included no examples of thinking directed at appraising the arguments of others, such thinking has come to be considered a kind of critical thinking. We find an example of such thinking in the performance task on the Collegiate Learning Assessment (CLA+), which its sponsoring organization describes as

a performance-based assessment that provides a measure of an institution’s contribution to the development of critical-thinking and written communication skills of its students. (Council for Aid to Education 2017)

A sample task posted on its website requires the test-taker to write a report for public distribution evaluating a fictional candidate’s policy proposals and their supporting arguments, using supplied background documents, with a recommendation on whether to endorse the candidate.

Immediate acceptance of an idea that suggests itself as a solution to a problem (e.g., a possible explanation of an event or phenomenon, an action that seems likely to produce a desired result) is “uncritical thinking, the minimum of reflection” (Dewey 1910: 13). On-going suspension of judgment in the light of doubt about a possible solution is not critical thinking (Dewey 1910: 108). Critique driven by a dogmatically held political or religious ideology is not critical thinking; thus Paulo Freire (1968 [1970]) is using the term (e.g., at 1970: 71, 81, 100, 146) in a more politically freighted sense that includes not only reflection but also revolutionary action against oppression. Derivation of a conclusion from given data using an algorithm is not critical thinking.

What is critical thinking? There are many definitions. Ennis (2016) lists 14 philosophically oriented scholarly definitions and three dictionary definitions. Following Rawls (1971), who distinguished his conception of justice from a utilitarian conception but regarded them as rival conceptions of the same concept, Ennis maintains that the 17 definitions are different conceptions of the same concept. Rawls articulated the shared concept of justice as

a characteristic set of principles for assigning basic rights and duties and for determining… the proper distribution of the benefits and burdens of social cooperation. (Rawls 1971: 5)

Bailin et al. (1999b) claim that, if one considers what sorts of thinking an educator would take not to be critical thinking and what sorts to be critical thinking, one can conclude that educators typically understand critical thinking to have at least three features.

  • It is done for the purpose of making up one’s mind about what to believe or do.
  • The person engaging in the thinking is trying to fulfill standards of adequacy and accuracy appropriate to the thinking.
  • The thinking fulfills the relevant standards to some threshold level.

One could sum up the core concept that involves these three features by saying that critical thinking is careful goal-directed thinking. This core concept seems to apply to all the examples of critical thinking described in the previous section. As for the non-examples, their exclusion depends on construing careful thinking as excluding jumping immediately to conclusions, suspending judgment no matter how strong the evidence, reasoning from an unquestioned ideological or religious perspective, and routinely using an algorithm to answer a question.

If the core of critical thinking is careful goal-directed thinking, conceptions of it can vary according to its presumed scope, its presumed goal, one’s criteria and threshold for being careful, and the thinking component on which one focuses. As to its scope, some conceptions (e.g., Dewey 1910, 1933) restrict it to constructive thinking on the basis of one’s own observations and experiments, others (e.g., Ennis 1962; Fisher & Scriven 1997; Johnson 1992) to appraisal of the products of such thinking. Ennis (1991) and Bailin et al. (1999b) take it to cover both construction and appraisal. As to its goal, some conceptions restrict it to forming a judgment (Dewey 1910, 1933; Lipman 1987; Facione 1990a). Others allow for actions as well as beliefs as the end point of a process of critical thinking (Ennis 1991; Bailin et al. 1999b). As to the criteria and threshold for being careful, definitions vary in the term used to indicate that critical thinking satisfies certain norms: “intellectually disciplined” (Scriven & Paul 1987), “reasonable” (Ennis 1991), “skillful” (Lipman 1987), “skilled” (Fisher & Scriven 1997), “careful” (Bailin & Battersby 2009). Some definitions specify these norms, referring variously to “consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (Dewey 1910, 1933); “the methods of logical inquiry and reasoning” (Glaser 1941); “conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication” (Scriven & Paul 1987); the requirement that “it is sensitive to context, relies on criteria, and is self-correcting” (Lipman 1987); “evidential, conceptual, methodological, criteriological, or contextual considerations” (Facione 1990a); and “plus-minus considerations of the product in terms of appropriate standards (or criteria)” (Johnson 1992). Stanovich and Stanovich (2010) propose to ground the concept of critical thinking in the concept of rationality, which they understand as combining epistemic rationality (fitting one’s beliefs to the world) and instrumental rationality (optimizing goal fulfillment); a critical thinker, in their view, is someone with “a propensity to override suboptimal responses from the autonomous mind” (2010: 227). These variant specifications of norms for critical thinking are not necessarily incompatible with one another, and in any case presuppose the core notion of thinking carefully. As to the thinking component singled out, some definitions focus on suspension of judgment during the thinking (Dewey 1910; McPeck 1981), others on inquiry while judgment is suspended (Bailin & Battersby 2009, 2021), others on the resulting judgment (Facione 1990a), and still others on responsiveness to reasons (Siegel 1988). Kuhn (2019) takes critical thinking to be more a dialogic practice of advancing and responding to arguments than an individual ability.

In educational contexts, a definition of critical thinking is a “programmatic definition” (Scheffler 1960: 19). It expresses a practical program for achieving an educational goal. For this purpose, a one-sentence formulaic definition is much less useful than articulation of a critical thinking process, with criteria and standards for the kinds of thinking that the process may involve. The real educational goal is recognition, adoption and implementation by students of those criteria and standards. That adoption and implementation in turn consists in acquiring the knowledge, abilities and dispositions of a critical thinker.

Conceptions of critical thinking generally do not include moral integrity as part of the concept. Dewey, for example, took critical thinking to be the ultimate intellectual goal of education, but distinguished it from the development of social cooperation among school children, which he took to be the central moral goal. Ennis (1996, 2011) added to his previous list of critical thinking dispositions a group of dispositions to care about the dignity and worth of every person, which he described as a “correlative” (1996) disposition without which critical thinking would be less valuable and perhaps harmful. An educational program that aimed at developing critical thinking but not the correlative disposition to care about the dignity and worth of every person, he asserted, “would be deficient and perhaps dangerous” (Ennis 1996: 172).

Dewey thought that education for reflective thinking would be of value to both the individual and society; recognition in educational practice of the kinship to the scientific attitude of children’s native curiosity, fertile imagination and love of experimental inquiry “would make for individual happiness and the reduction of social waste” (Dewey 1910: iii). Schools participating in the Eight-Year Study took development of the habit of reflective thinking and skill in solving problems as a means to leading young people to understand, appreciate and live the democratic way of life characteristic of the United States (Aikin 1942: 17–18, 81). Harvey Siegel (1988: 55–61) has offered four considerations in support of adopting critical thinking as an educational ideal. (1) Respect for persons requires that schools and teachers honour students’ demands for reasons and explanations, deal with students honestly, and recognize the need to confront students’ independent judgment; these requirements concern the manner in which teachers treat students. (2) Education has the task of preparing children to be successful adults, a task that requires development of their self-sufficiency. (3) Education should initiate children into the rational traditions in such fields as history, science and mathematics. (4) Education should prepare children to become democratic citizens, which requires reasoned procedures and critical talents and attitudes. To supplement these considerations, Siegel (1988: 62–90) responds to two objections: the ideology objection that adoption of any educational ideal requires a prior ideological commitment and the indoctrination objection that cultivation of critical thinking cannot escape being a form of indoctrination.

Despite the diversity of our 11 examples, one can recognize a common pattern. Dewey analyzed it as consisting of five phases:

  • suggestions , in which the mind leaps forward to a possible solution;
  • an intellectualization of the difficulty or perplexity into a problem to be solved, a question for which the answer must be sought;
  • the use of one suggestion after another as a leading idea, or hypothesis , to initiate and guide observation and other operations in collection of factual material;
  • the mental elaboration of the idea or supposition as an idea or supposition ( reasoning , in the sense on which reasoning is a part, not the whole, of inference); and
  • testing the hypothesis by overt or imaginative action. (Dewey 1933: 106–107; italics in original)

The process of reflective thinking consisting of these phases would be preceded by a perplexed, troubled or confused situation and followed by a cleared-up, unified, resolved situation (Dewey 1933: 106). The term ‘phases’ replaced the term ‘steps’ (Dewey 1910: 72), thus removing the earlier suggestion of an invariant sequence. Variants of the above analysis appeared in (Dewey 1916: 177) and (Dewey 1938: 101–119).

The variant formulations indicate the difficulty of giving a single logical analysis of such a varied process. The process of critical thinking may have a spiral pattern, with the problem being redefined in the light of obstacles to solving it as originally formulated. For example, the person in Transit might have concluded that getting to the appointment at the scheduled time was impossible and have reformulated the problem as that of rescheduling the appointment for a mutually convenient time. Further, defining a problem does not always follow after or lead immediately to an idea of a suggested solution. Nor should it do so, as Dewey himself recognized in describing the physician in Typhoid as avoiding any strong preference for this or that conclusion before getting further information (Dewey 1910: 85; 1933: 170). People with a hypothesis in mind, even one to which they have a very weak commitment, have a so-called “confirmation bias” (Nickerson 1998): they are likely to pay attention to evidence that confirms the hypothesis and to ignore evidence that counts against it or for some competing hypothesis. Detectives, intelligence agencies, and investigators of airplane accidents are well advised to gather relevant evidence systematically and to postpone even tentative adoption of an explanatory hypothesis until the collected evidence rules out with the appropriate degree of certainty all but one explanation. Dewey’s analysis of the critical thinking process can be faulted as well for requiring acceptance or rejection of a possible solution to a defined problem, with no allowance for deciding in the light of the available evidence to suspend judgment. Further, given the great variety of kinds of problems for which reflection is appropriate, there is likely to be variation in its component events. Perhaps the best way to conceptualize the critical thinking process is as a checklist whose component events can occur in a variety of orders, selectively, and more than once. These component events might include (1) noticing a difficulty, (2) defining the problem, (3) dividing the problem into manageable sub-problems, (4) formulating a variety of possible solutions to the problem or sub-problem, (5) determining what evidence is relevant to deciding among possible solutions to the problem or sub-problem, (6) devising a plan of systematic observation or experiment that will uncover the relevant evidence, (7) carrying out the plan of systematic observation or experimentation, (8) noting the results of the systematic observation or experiment, (9) gathering relevant testimony and information from others, (10) judging the credibility of testimony and information gathered from others, (11) drawing conclusions from gathered evidence and accepted testimony, and (12) accepting a solution that the evidence adequately supports (cf. Hitchcock 2017: 485).

Checklist conceptions of the process of critical thinking are open to the objection that they are too mechanical and procedural to fit the multi-dimensional and emotionally charged issues for which critical thinking is urgently needed (Paul 1984). For such issues, a more dialectical process is advocated, in which competing relevant world views are identified, their implications explored, and some sort of creative synthesis attempted.

If one considers the critical thinking process illustrated by the 11 examples, one can identify distinct kinds of mental acts and mental states that form part of it. To distinguish, label and briefly characterize these components is a useful preliminary to identifying abilities, skills, dispositions, attitudes, habits and the like that contribute causally to thinking critically. Identifying such abilities and habits is in turn a useful preliminary to setting educational goals. Setting the goals is in its turn a useful preliminary to designing strategies for helping learners to achieve the goals and to designing ways of measuring the extent to which learners have done so. Such measures provide both feedback to learners on their achievement and a basis for experimental research on the effectiveness of various strategies for educating people to think critically. Let us begin, then, by distinguishing the kinds of mental acts and mental events that can occur in a critical thinking process.

  • Observing : One notices something in one’s immediate environment (sudden cooling of temperature in Weather , bubbles forming outside a glass and then going inside in Bubbles , a moving blur in the distance in Blur , a rash in Rash ). Or one notes the results of an experiment or systematic observation (valuables missing in Disorder , no suction without air pressure in Suction pump )
  • Feeling : One feels puzzled or uncertain about something (how to get to an appointment on time in Transit , why the diamonds vary in spacing in Diamond ). One wants to resolve this perplexity. One feels satisfaction once one has worked out an answer (to take the subway express in Transit , diamonds closer when needed as a warning in Diamond ).
  • Wondering : One formulates a question to be addressed (why bubbles form outside a tumbler taken from hot water in Bubbles , how suction pumps work in Suction pump , what caused the rash in Rash ).
  • Imagining : One thinks of possible answers (bus or subway or elevated in Transit , flagpole or ornament or wireless communication aid or direction indicator in Ferryboat , allergic reaction or heat rash in Rash ).
  • Inferring : One works out what would be the case if a possible answer were assumed (valuables missing if there has been a burglary in Disorder , earlier start to the rash if it is an allergic reaction to a sulfa drug in Rash ). Or one draws a conclusion once sufficient relevant evidence is gathered (take the subway in Transit , burglary in Disorder , discontinue blood pressure medication and new cream in Rash ).
  • Knowledge : One uses stored knowledge of the subject-matter to generate possible answers or to infer what would be expected on the assumption of a particular answer (knowledge of a city’s public transit system in Transit , of the requirements for a flagpole in Ferryboat , of Boyle’s law in Bubbles , of allergic reactions in Rash ).
  • Experimenting : One designs and carries out an experiment or a systematic observation to find out whether the results deduced from a possible answer will occur (looking at the location of the flagpole in relation to the pilot’s position in Ferryboat , putting an ice cube on top of a tumbler taken from hot water in Bubbles , measuring the height to which a suction pump will draw water at different elevations in Suction pump , noticing the spacing of diamonds when movement to or from a diamond lane is allowed in Diamond ).
  • Consulting : One finds a source of information, gets the information from the source, and makes a judgment on whether to accept it. None of our 11 examples include searching for sources of information. In this respect they are unrepresentative, since most people nowadays have almost instant access to information relevant to answering any question, including many of those illustrated by the examples. However, Candidate includes the activities of extracting information from sources and evaluating its credibility.
  • Identifying and analyzing arguments : One notices an argument and works out its structure and content as a preliminary to evaluating its strength. This activity is central to Candidate . It is an important part of a critical thinking process in which one surveys arguments for various positions on an issue.
  • Judging : One makes a judgment on the basis of accumulated evidence and reasoning, such as the judgment in Ferryboat that the purpose of the pole is to provide direction to the pilot.
  • Deciding : One makes a decision on what to do or on what policy to adopt, as in the decision in Transit to take the subway.

By definition, a person who does something voluntarily is both willing and able to do that thing at that time. Both the willingness and the ability contribute causally to the person’s action, in the sense that the voluntary action would not occur if either (or both) of these were lacking. For example, suppose that one is standing with one’s arms at one’s sides and one voluntarily lifts one’s right arm to an extended horizontal position. One would not do so if one were unable to lift one’s arm, if for example one’s right side was paralyzed as the result of a stroke. Nor would one do so if one were unwilling to lift one’s arm, if for example one were participating in a street demonstration at which a white supremacist was urging the crowd to lift their right arm in a Nazi salute and one were unwilling to express support in this way for the racist Nazi ideology. The same analysis applies to a voluntary mental process of thinking critically. It requires both willingness and ability to think critically, including willingness and ability to perform each of the mental acts that compose the process and to coordinate those acts in a sequence that is directed at resolving the initiating perplexity.

Consider willingness first. We can identify causal contributors to willingness to think critically by considering factors that would cause a person who was able to think critically about an issue nevertheless not to do so (Hamby 2014). For each factor, the opposite condition thus contributes causally to willingness to think critically on a particular occasion. For example, people who habitually jump to conclusions without considering alternatives will not think critically about issues that arise, even if they have the required abilities. The contrary condition of willingness to suspend judgment is thus a causal contributor to thinking critically.

Now consider ability. In contrast to the ability to move one’s arm, which can be completely absent because a stroke has left the arm paralyzed, the ability to think critically is a developed ability, whose absence is not a complete absence of ability to think but absence of ability to think well. We can identify the ability to think well directly, in terms of the norms and standards for good thinking. In general, to be able do well the thinking activities that can be components of a critical thinking process, one needs to know the concepts and principles that characterize their good performance, to recognize in particular cases that the concepts and principles apply, and to apply them. The knowledge, recognition and application may be procedural rather than declarative. It may be domain-specific rather than widely applicable, and in either case may need subject-matter knowledge, sometimes of a deep kind.

Reflections of the sort illustrated by the previous two paragraphs have led scholars to identify the knowledge, abilities and dispositions of a “critical thinker”, i.e., someone who thinks critically whenever it is appropriate to do so. We turn now to these three types of causal contributors to thinking critically. We start with dispositions, since arguably these are the most powerful contributors to being a critical thinker, can be fostered at an early stage of a child’s development, and are susceptible to general improvement (Glaser 1941: 175)

8. Critical Thinking Dispositions

Educational researchers use the term ‘dispositions’ broadly for the habits of mind and attitudes that contribute causally to being a critical thinker. Some writers (e.g., Paul & Elder 2006; Hamby 2014; Bailin & Battersby 2016a) propose to use the term ‘virtues’ for this dimension of a critical thinker. The virtues in question, although they are virtues of character, concern the person’s ways of thinking rather than the person’s ways of behaving towards others. They are not moral virtues but intellectual virtues, of the sort articulated by Zagzebski (1996) and discussed by Turri, Alfano, and Greco (2017).

On a realistic conception, thinking dispositions or intellectual virtues are real properties of thinkers. They are general tendencies, propensities, or inclinations to think in particular ways in particular circumstances, and can be genuinely explanatory (Siegel 1999). Sceptics argue that there is no evidence for a specific mental basis for the habits of mind that contribute to thinking critically, and that it is pedagogically misleading to posit such a basis (Bailin et al. 1999a). Whatever their status, critical thinking dispositions need motivation for their initial formation in a child—motivation that may be external or internal. As children develop, the force of habit will gradually become important in sustaining the disposition (Nieto & Valenzuela 2012). Mere force of habit, however, is unlikely to sustain critical thinking dispositions. Critical thinkers must value and enjoy using their knowledge and abilities to think things through for themselves. They must be committed to, and lovers of, inquiry.

A person may have a critical thinking disposition with respect to only some kinds of issues. For example, one could be open-minded about scientific issues but not about religious issues. Similarly, one could be confident in one’s ability to reason about the theological implications of the existence of evil in the world but not in one’s ability to reason about the best design for a guided ballistic missile.

Facione (1990a: 25) divides “affective dispositions” of critical thinking into approaches to life and living in general and approaches to specific issues, questions or problems. Adapting this distinction, one can usefully divide critical thinking dispositions into initiating dispositions (those that contribute causally to starting to think critically about an issue) and internal dispositions (those that contribute causally to doing a good job of thinking critically once one has started). The two categories are not mutually exclusive. For example, open-mindedness, in the sense of willingness to consider alternative points of view to one’s own, is both an initiating and an internal disposition.

Using the strategy of considering factors that would block people with the ability to think critically from doing so, we can identify as initiating dispositions for thinking critically attentiveness, a habit of inquiry, self-confidence, courage, open-mindedness, willingness to suspend judgment, trust in reason, wanting evidence for one’s beliefs, and seeking the truth. We consider briefly what each of these dispositions amounts to, in each case citing sources that acknowledge them.

  • Attentiveness : One will not think critically if one fails to recognize an issue that needs to be thought through. For example, the pedestrian in Weather would not have looked up if he had not noticed that the air was suddenly cooler. To be a critical thinker, then, one needs to be habitually attentive to one’s surroundings, noticing not only what one senses but also sources of perplexity in messages received and in one’s own beliefs and attitudes (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Habit of inquiry : Inquiry is effortful, and one needs an internal push to engage in it. For example, the student in Bubbles could easily have stopped at idle wondering about the cause of the bubbles rather than reasoning to a hypothesis, then designing and executing an experiment to test it. Thus willingness to think critically needs mental energy and initiative. What can supply that energy? Love of inquiry, or perhaps just a habit of inquiry. Hamby (2015) has argued that willingness to inquire is the central critical thinking virtue, one that encompasses all the others. It is recognized as a critical thinking disposition by Dewey (1910: 29; 1933: 35), Glaser (1941: 5), Ennis (1987: 12; 1991: 8), Facione (1990a: 25), Bailin et al. (1999b: 294), Halpern (1998: 452), and Facione, Facione, & Giancarlo (2001).
  • Self-confidence : Lack of confidence in one’s abilities can block critical thinking. For example, if the woman in Rash lacked confidence in her ability to figure things out for herself, she might just have assumed that the rash on her chest was the allergic reaction to her medication against which the pharmacist had warned her. Thus willingness to think critically requires confidence in one’s ability to inquire (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Courage : Fear of thinking for oneself can stop one from doing it. Thus willingness to think critically requires intellectual courage (Paul & Elder 2006: 16).
  • Open-mindedness : A dogmatic attitude will impede thinking critically. For example, a person who adheres rigidly to a “pro-choice” position on the issue of the legal status of induced abortion is likely to be unwilling to consider seriously the issue of when in its development an unborn child acquires a moral right to life. Thus willingness to think critically requires open-mindedness, in the sense of a willingness to examine questions to which one already accepts an answer but which further evidence or reasoning might cause one to answer differently (Dewey 1933; Facione 1990a; Ennis 1991; Bailin et al. 1999b; Halpern 1998, Facione, Facione, & Giancarlo 2001). Paul (1981) emphasizes open-mindedness about alternative world-views, and recommends a dialectical approach to integrating such views as central to what he calls “strong sense” critical thinking. In three studies, Haran, Ritov, & Mellers (2013) found that actively open-minded thinking, including “the tendency to weigh new evidence against a favored belief, to spend sufficient time on a problem before giving up, and to consider carefully the opinions of others in forming one’s own”, led study participants to acquire information and thus to make accurate estimations.
  • Willingness to suspend judgment : Premature closure on an initial solution will block critical thinking. Thus willingness to think critically requires a willingness to suspend judgment while alternatives are explored (Facione 1990a; Ennis 1991; Halpern 1998).
  • Trust in reason : Since distrust in the processes of reasoned inquiry will dissuade one from engaging in it, trust in them is an initiating critical thinking disposition (Facione 1990a, 25; Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001; Paul & Elder 2006). In reaction to an allegedly exclusive emphasis on reason in critical thinking theory and pedagogy, Thayer-Bacon (2000) argues that intuition, imagination, and emotion have important roles to play in an adequate conception of critical thinking that she calls “constructive thinking”. From her point of view, critical thinking requires trust not only in reason but also in intuition, imagination, and emotion.
  • Seeking the truth : If one does not care about the truth but is content to stick with one’s initial bias on an issue, then one will not think critically about it. Seeking the truth is thus an initiating critical thinking disposition (Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001). A disposition to seek the truth is implicit in more specific critical thinking dispositions, such as trying to be well-informed, considering seriously points of view other than one’s own, looking for alternatives, suspending judgment when the evidence is insufficient, and adopting a position when the evidence supporting it is sufficient.

Some of the initiating dispositions, such as open-mindedness and willingness to suspend judgment, are also internal critical thinking dispositions, in the sense of mental habits or attitudes that contribute causally to doing a good job of critical thinking once one starts the process. But there are many other internal critical thinking dispositions. Some of them are parasitic on one’s conception of good thinking. For example, it is constitutive of good thinking about an issue to formulate the issue clearly and to maintain focus on it. For this purpose, one needs not only the corresponding ability but also the corresponding disposition. Ennis (1991: 8) describes it as the disposition “to determine and maintain focus on the conclusion or question”, Facione (1990a: 25) as “clarity in stating the question or concern”. Other internal dispositions are motivators to continue or adjust the critical thinking process, such as willingness to persist in a complex task and willingness to abandon nonproductive strategies in an attempt to self-correct (Halpern 1998: 452). For a list of identified internal critical thinking dispositions, see the Supplement on Internal Critical Thinking Dispositions .

Some theorists postulate skills, i.e., acquired abilities, as operative in critical thinking. It is not obvious, however, that a good mental act is the exercise of a generic acquired skill. Inferring an expected time of arrival, as in Transit , has some generic components but also uses non-generic subject-matter knowledge. Bailin et al. (1999a) argue against viewing critical thinking skills as generic and discrete, on the ground that skilled performance at a critical thinking task cannot be separated from knowledge of concepts and from domain-specific principles of good thinking. Talk of skills, they concede, is unproblematic if it means merely that a person with critical thinking skills is capable of intelligent performance.

Despite such scepticism, theorists of critical thinking have listed as general contributors to critical thinking what they variously call abilities (Glaser 1941; Ennis 1962, 1991), skills (Facione 1990a; Halpern 1998) or competencies (Fisher & Scriven 1997). Amalgamating these lists would produce a confusing and chaotic cornucopia of more than 50 possible educational objectives, with only partial overlap among them. It makes sense instead to try to understand the reasons for the multiplicity and diversity, and to make a selection according to one’s own reasons for singling out abilities to be developed in a critical thinking curriculum. Two reasons for diversity among lists of critical thinking abilities are the underlying conception of critical thinking and the envisaged educational level. Appraisal-only conceptions, for example, involve a different suite of abilities than constructive-only conceptions. Some lists, such as those in (Glaser 1941), are put forward as educational objectives for secondary school students, whereas others are proposed as objectives for college students (e.g., Facione 1990a).

The abilities described in the remaining paragraphs of this section emerge from reflection on the general abilities needed to do well the thinking activities identified in section 6 as components of the critical thinking process described in section 5 . The derivation of each collection of abilities is accompanied by citation of sources that list such abilities and of standardized tests that claim to test them.

Observational abilities : Careful and accurate observation sometimes requires specialist expertise and practice, as in the case of observing birds and observing accident scenes. However, there are general abilities of noticing what one’s senses are picking up from one’s environment and of being able to articulate clearly and accurately to oneself and others what one has observed. It helps in exercising them to be able to recognize and take into account factors that make one’s observation less trustworthy, such as prior framing of the situation, inadequate time, deficient senses, poor observation conditions, and the like. It helps as well to be skilled at taking steps to make one’s observation more trustworthy, such as moving closer to get a better look, measuring something three times and taking the average, and checking what one thinks one is observing with someone else who is in a good position to observe it. It also helps to be skilled at recognizing respects in which one’s report of one’s observation involves inference rather than direct observation, so that one can then consider whether the inference is justified. These abilities come into play as well when one thinks about whether and with what degree of confidence to accept an observation report, for example in the study of history or in a criminal investigation or in assessing news reports. Observational abilities show up in some lists of critical thinking abilities (Ennis 1962: 90; Facione 1990a: 16; Ennis 1991: 9). There are items testing a person’s ability to judge the credibility of observation reports in the Cornell Critical Thinking Tests, Levels X and Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). Norris and King (1983, 1985, 1990a, 1990b) is a test of ability to appraise observation reports.

Emotional abilities : The emotions that drive a critical thinking process are perplexity or puzzlement, a wish to resolve it, and satisfaction at achieving the desired resolution. Children experience these emotions at an early age, without being trained to do so. Education that takes critical thinking as a goal needs only to channel these emotions and to make sure not to stifle them. Collaborative critical thinking benefits from ability to recognize one’s own and others’ emotional commitments and reactions.

Questioning abilities : A critical thinking process needs transformation of an inchoate sense of perplexity into a clear question. Formulating a question well requires not building in questionable assumptions, not prejudging the issue, and using language that in context is unambiguous and precise enough (Ennis 1962: 97; 1991: 9).

Imaginative abilities : Thinking directed at finding the correct causal explanation of a general phenomenon or particular event requires an ability to imagine possible explanations. Thinking about what policy or plan of action to adopt requires generation of options and consideration of possible consequences of each option. Domain knowledge is required for such creative activity, but a general ability to imagine alternatives is helpful and can be nurtured so as to become easier, quicker, more extensive, and deeper (Dewey 1910: 34–39; 1933: 40–47). Facione (1990a) and Halpern (1998) include the ability to imagine alternatives as a critical thinking ability.

Inferential abilities : The ability to draw conclusions from given information, and to recognize with what degree of certainty one’s own or others’ conclusions follow, is universally recognized as a general critical thinking ability. All 11 examples in section 2 of this article include inferences, some from hypotheses or options (as in Transit , Ferryboat and Disorder ), others from something observed (as in Weather and Rash ). None of these inferences is formally valid. Rather, they are licensed by general, sometimes qualified substantive rules of inference (Toulmin 1958) that rest on domain knowledge—that a bus trip takes about the same time in each direction, that the terminal of a wireless telegraph would be located on the highest possible place, that sudden cooling is often followed by rain, that an allergic reaction to a sulfa drug generally shows up soon after one starts taking it. It is a matter of controversy to what extent the specialized ability to deduce conclusions from premisses using formal rules of inference is needed for critical thinking. Dewey (1933) locates logical forms in setting out the products of reflection rather than in the process of reflection. Ennis (1981a), on the other hand, maintains that a liberally-educated person should have the following abilities: to translate natural-language statements into statements using the standard logical operators, to use appropriately the language of necessary and sufficient conditions, to deal with argument forms and arguments containing symbols, to determine whether in virtue of an argument’s form its conclusion follows necessarily from its premisses, to reason with logically complex propositions, and to apply the rules and procedures of deductive logic. Inferential abilities are recognized as critical thinking abilities by Glaser (1941: 6), Facione (1990a: 9), Ennis (1991: 9), Fisher & Scriven (1997: 99, 111), and Halpern (1998: 452). Items testing inferential abilities constitute two of the five subtests of the Watson Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), two of the four sections in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), three of the seven sections in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), 11 of the 34 items on Forms A and B of the California Critical Thinking Skills Test (Facione 1990b, 1992), and a high but variable proportion of the 25 selected-response questions in the Collegiate Learning Assessment (Council for Aid to Education 2017).

Experimenting abilities : Knowing how to design and execute an experiment is important not just in scientific research but also in everyday life, as in Rash . Dewey devoted a whole chapter of his How We Think (1910: 145–156; 1933: 190–202) to the superiority of experimentation over observation in advancing knowledge. Experimenting abilities come into play at one remove in appraising reports of scientific studies. Skill in designing and executing experiments includes the acknowledged abilities to appraise evidence (Glaser 1941: 6), to carry out experiments and to apply appropriate statistical inference techniques (Facione 1990a: 9), to judge inductions to an explanatory hypothesis (Ennis 1991: 9), and to recognize the need for an adequately large sample size (Halpern 1998). The Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) includes four items (out of 52) on experimental design. The Collegiate Learning Assessment (Council for Aid to Education 2017) makes room for appraisal of study design in both its performance task and its selected-response questions.

Consulting abilities : Skill at consulting sources of information comes into play when one seeks information to help resolve a problem, as in Candidate . Ability to find and appraise information includes ability to gather and marshal pertinent information (Glaser 1941: 6), to judge whether a statement made by an alleged authority is acceptable (Ennis 1962: 84), to plan a search for desired information (Facione 1990a: 9), and to judge the credibility of a source (Ennis 1991: 9). Ability to judge the credibility of statements is tested by 24 items (out of 76) in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) and by four items (out of 52) in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). The College Learning Assessment’s performance task requires evaluation of whether information in documents is credible or unreliable (Council for Aid to Education 2017).

Argument analysis abilities : The ability to identify and analyze arguments contributes to the process of surveying arguments on an issue in order to form one’s own reasoned judgment, as in Candidate . The ability to detect and analyze arguments is recognized as a critical thinking skill by Facione (1990a: 7–8), Ennis (1991: 9) and Halpern (1998). Five items (out of 34) on the California Critical Thinking Skills Test (Facione 1990b, 1992) test skill at argument analysis. The College Learning Assessment (Council for Aid to Education 2017) incorporates argument analysis in its selected-response tests of critical reading and evaluation and of critiquing an argument.

Judging skills and deciding skills : Skill at judging and deciding is skill at recognizing what judgment or decision the available evidence and argument supports, and with what degree of confidence. It is thus a component of the inferential skills already discussed.

Lists and tests of critical thinking abilities often include two more abilities: identifying assumptions and constructing and evaluating definitions.

In addition to dispositions and abilities, critical thinking needs knowledge: of critical thinking concepts, of critical thinking principles, and of the subject-matter of the thinking.

We can derive a short list of concepts whose understanding contributes to critical thinking from the critical thinking abilities described in the preceding section. Observational abilities require an understanding of the difference between observation and inference. Questioning abilities require an understanding of the concepts of ambiguity and vagueness. Inferential abilities require an understanding of the difference between conclusive and defeasible inference (traditionally, between deduction and induction), as well as of the difference between necessary and sufficient conditions. Experimenting abilities require an understanding of the concepts of hypothesis, null hypothesis, assumption and prediction, as well as of the concept of statistical significance and of its difference from importance. They also require an understanding of the difference between an experiment and an observational study, and in particular of the difference between a randomized controlled trial, a prospective correlational study and a retrospective (case-control) study. Argument analysis abilities require an understanding of the concepts of argument, premiss, assumption, conclusion and counter-consideration. Additional critical thinking concepts are proposed by Bailin et al. (1999b: 293), Fisher & Scriven (1997: 105–106), Black (2012), and Blair (2021).

According to Glaser (1941: 25), ability to think critically requires knowledge of the methods of logical inquiry and reasoning. If we review the list of abilities in the preceding section, however, we can see that some of them can be acquired and exercised merely through practice, possibly guided in an educational setting, followed by feedback. Searching intelligently for a causal explanation of some phenomenon or event requires that one consider a full range of possible causal contributors, but it seems more important that one implements this principle in one’s practice than that one is able to articulate it. What is important is “operational knowledge” of the standards and principles of good thinking (Bailin et al. 1999b: 291–293). But the development of such critical thinking abilities as designing an experiment or constructing an operational definition can benefit from learning their underlying theory. Further, explicit knowledge of quirks of human thinking seems useful as a cautionary guide. Human memory is not just fallible about details, as people learn from their own experiences of misremembering, but is so malleable that a detailed, clear and vivid recollection of an event can be a total fabrication (Loftus 2017). People seek or interpret evidence in ways that are partial to their existing beliefs and expectations, often unconscious of their “confirmation bias” (Nickerson 1998). Not only are people subject to this and other cognitive biases (Kahneman 2011), of which they are typically unaware, but it may be counter-productive for one to make oneself aware of them and try consciously to counteract them or to counteract social biases such as racial or sexual stereotypes (Kenyon & Beaulac 2014). It is helpful to be aware of these facts and of the superior effectiveness of blocking the operation of biases—for example, by making an immediate record of one’s observations, refraining from forming a preliminary explanatory hypothesis, blind refereeing, double-blind randomized trials, and blind grading of students’ work. It is also helpful to be aware of the prevalence of “noise” (unwanted unsystematic variability of judgments), of how to detect noise (through a noise audit), and of how to reduce noise: make accuracy the goal, think statistically, break a process of arriving at a judgment into independent tasks, resist premature intuitions, in a group get independent judgments first, favour comparative judgments and scales (Kahneman, Sibony, & Sunstein 2021). It is helpful as well to be aware of the concept of “bounded rationality” in decision-making and of the related distinction between “satisficing” and optimizing (Simon 1956; Gigerenzer 2001).

Critical thinking about an issue requires substantive knowledge of the domain to which the issue belongs. Critical thinking abilities are not a magic elixir that can be applied to any issue whatever by somebody who has no knowledge of the facts relevant to exploring that issue. For example, the student in Bubbles needed to know that gases do not penetrate solid objects like a glass, that air expands when heated, that the volume of an enclosed gas varies directly with its temperature and inversely with its pressure, and that hot objects will spontaneously cool down to the ambient temperature of their surroundings unless kept hot by insulation or a source of heat. Critical thinkers thus need a rich fund of subject-matter knowledge relevant to the variety of situations they encounter. This fact is recognized in the inclusion among critical thinking dispositions of a concern to become and remain generally well informed.

Experimental educational interventions, with control groups, have shown that education can improve critical thinking skills and dispositions, as measured by standardized tests. For information about these tests, see the Supplement on Assessment .

What educational methods are most effective at developing the dispositions, abilities and knowledge of a critical thinker? In a comprehensive meta-analysis of experimental and quasi-experimental studies of strategies for teaching students to think critically, Abrami et al. (2015) found that dialogue, anchored instruction, and mentoring each increased the effectiveness of the educational intervention, and that they were most effective when combined. They also found that in these studies a combination of separate instruction in critical thinking with subject-matter instruction in which students are encouraged to think critically was more effective than either by itself. However, the difference was not statistically significant; that is, it might have arisen by chance.

Most of these studies lack the longitudinal follow-up required to determine whether the observed differential improvements in critical thinking abilities or dispositions continue over time, for example until high school or college graduation. For details on studies of methods of developing critical thinking skills and dispositions, see the Supplement on Educational Methods .

12. Controversies

Scholars have denied the generalizability of critical thinking abilities across subject domains, have alleged bias in critical thinking theory and pedagogy, and have investigated the relationship of critical thinking to other kinds of thinking.

McPeck (1981) attacked the thinking skills movement of the 1970s, including the critical thinking movement. He argued that there are no general thinking skills, since thinking is always thinking about some subject-matter. It is futile, he claimed, for schools and colleges to teach thinking as if it were a separate subject. Rather, teachers should lead their pupils to become autonomous thinkers by teaching school subjects in a way that brings out their cognitive structure and that encourages and rewards discussion and argument. As some of his critics (e.g., Paul 1985; Siegel 1985) pointed out, McPeck’s central argument needs elaboration, since it has obvious counter-examples in writing and speaking, for which (up to a certain level of complexity) there are teachable general abilities even though they are always about some subject-matter. To make his argument convincing, McPeck needs to explain how thinking differs from writing and speaking in a way that does not permit useful abstraction of its components from the subject-matters with which it deals. He has not done so. Nevertheless, his position that the dispositions and abilities of a critical thinker are best developed in the context of subject-matter instruction is shared by many theorists of critical thinking, including Dewey (1910, 1933), Glaser (1941), Passmore (1980), Weinstein (1990), Bailin et al. (1999b), and Willingham (2019).

McPeck’s challenge prompted reflection on the extent to which critical thinking is subject-specific. McPeck argued for a strong subject-specificity thesis, according to which it is a conceptual truth that all critical thinking abilities are specific to a subject. (He did not however extend his subject-specificity thesis to critical thinking dispositions. In particular, he took the disposition to suspend judgment in situations of cognitive dissonance to be a general disposition.) Conceptual subject-specificity is subject to obvious counter-examples, such as the general ability to recognize confusion of necessary and sufficient conditions. A more modest thesis, also endorsed by McPeck, is epistemological subject-specificity, according to which the norms of good thinking vary from one field to another. Epistemological subject-specificity clearly holds to a certain extent; for example, the principles in accordance with which one solves a differential equation are quite different from the principles in accordance with which one determines whether a painting is a genuine Picasso. But the thesis suffers, as Ennis (1989) points out, from vagueness of the concept of a field or subject and from the obvious existence of inter-field principles, however broadly the concept of a field is construed. For example, the principles of hypothetico-deductive reasoning hold for all the varied fields in which such reasoning occurs. A third kind of subject-specificity is empirical subject-specificity, according to which as a matter of empirically observable fact a person with the abilities and dispositions of a critical thinker in one area of investigation will not necessarily have them in another area of investigation.

The thesis of empirical subject-specificity raises the general problem of transfer. If critical thinking abilities and dispositions have to be developed independently in each school subject, how are they of any use in dealing with the problems of everyday life and the political and social issues of contemporary society, most of which do not fit into the framework of a traditional school subject? Proponents of empirical subject-specificity tend to argue that transfer is more likely to occur if there is critical thinking instruction in a variety of domains, with explicit attention to dispositions and abilities that cut across domains. But evidence for this claim is scanty. There is a need for well-designed empirical studies that investigate the conditions that make transfer more likely.

It is common ground in debates about the generality or subject-specificity of critical thinking dispositions and abilities that critical thinking about any topic requires background knowledge about the topic. For example, the most sophisticated understanding of the principles of hypothetico-deductive reasoning is of no help unless accompanied by some knowledge of what might be plausible explanations of some phenomenon under investigation.

Critics have objected to bias in the theory, pedagogy and practice of critical thinking. Commentators (e.g., Alston 1995; Ennis 1998) have noted that anyone who takes a position has a bias in the neutral sense of being inclined in one direction rather than others. The critics, however, are objecting to bias in the pejorative sense of an unjustified favoring of certain ways of knowing over others, frequently alleging that the unjustly favoured ways are those of a dominant sex or culture (Bailin 1995). These ways favour:

  • reinforcement of egocentric and sociocentric biases over dialectical engagement with opposing world-views (Paul 1981, 1984; Warren 1998)
  • distancing from the object of inquiry over closeness to it (Martin 1992; Thayer-Bacon 1992)
  • indifference to the situation of others over care for them (Martin 1992)
  • orientation to thought over orientation to action (Martin 1992)
  • being reasonable over caring to understand people’s ideas (Thayer-Bacon 1993)
  • being neutral and objective over being embodied and situated (Thayer-Bacon 1995a)
  • doubting over believing (Thayer-Bacon 1995b)
  • reason over emotion, imagination and intuition (Thayer-Bacon 2000)
  • solitary thinking over collaborative thinking (Thayer-Bacon 2000)
  • written and spoken assignments over other forms of expression (Alston 2001)
  • attention to written and spoken communications over attention to human problems (Alston 2001)
  • winning debates in the public sphere over making and understanding meaning (Alston 2001)

A common thread in this smorgasbord of accusations is dissatisfaction with focusing on the logical analysis and evaluation of reasoning and arguments. While these authors acknowledge that such analysis and evaluation is part of critical thinking and should be part of its conceptualization and pedagogy, they insist that it is only a part. Paul (1981), for example, bemoans the tendency of atomistic teaching of methods of analyzing and evaluating arguments to turn students into more able sophists, adept at finding fault with positions and arguments with which they disagree but even more entrenched in the egocentric and sociocentric biases with which they began. Martin (1992) and Thayer-Bacon (1992) cite with approval the self-reported intimacy with their subject-matter of leading researchers in biology and medicine, an intimacy that conflicts with the distancing allegedly recommended in standard conceptions and pedagogy of critical thinking. Thayer-Bacon (2000) contrasts the embodied and socially embedded learning of her elementary school students in a Montessori school, who used their imagination, intuition and emotions as well as their reason, with conceptions of critical thinking as

thinking that is used to critique arguments, offer justifications, and make judgments about what are the good reasons, or the right answers. (Thayer-Bacon 2000: 127–128)

Alston (2001) reports that her students in a women’s studies class were able to see the flaws in the Cinderella myth that pervades much romantic fiction but in their own romantic relationships still acted as if all failures were the woman’s fault and still accepted the notions of love at first sight and living happily ever after. Students, she writes, should

be able to connect their intellectual critique to a more affective, somatic, and ethical account of making risky choices that have sexist, racist, classist, familial, sexual, or other consequences for themselves and those both near and far… critical thinking that reads arguments, texts, or practices merely on the surface without connections to feeling/desiring/doing or action lacks an ethical depth that should infuse the difference between mere cognitive activity and something we want to call critical thinking. (Alston 2001: 34)

Some critics portray such biases as unfair to women. Thayer-Bacon (1992), for example, has charged modern critical thinking theory with being sexist, on the ground that it separates the self from the object and causes one to lose touch with one’s inner voice, and thus stigmatizes women, who (she asserts) link self to object and listen to their inner voice. Her charge does not imply that women as a group are on average less able than men to analyze and evaluate arguments. Facione (1990c) found no difference by sex in performance on his California Critical Thinking Skills Test. Kuhn (1991: 280–281) found no difference by sex in either the disposition or the competence to engage in argumentative thinking.

The critics propose a variety of remedies for the biases that they allege. In general, they do not propose to eliminate or downplay critical thinking as an educational goal. Rather, they propose to conceptualize critical thinking differently and to change its pedagogy accordingly. Their pedagogical proposals arise logically from their objections. They can be summarized as follows:

  • Focus on argument networks with dialectical exchanges reflecting contesting points of view rather than on atomic arguments, so as to develop “strong sense” critical thinking that transcends egocentric and sociocentric biases (Paul 1981, 1984).
  • Foster closeness to the subject-matter and feeling connected to others in order to inform a humane democracy (Martin 1992).
  • Develop “constructive thinking” as a social activity in a community of physically embodied and socially embedded inquirers with personal voices who value not only reason but also imagination, intuition and emotion (Thayer-Bacon 2000).
  • In developing critical thinking in school subjects, treat as important neither skills nor dispositions but opening worlds of meaning (Alston 2001).
  • Attend to the development of critical thinking dispositions as well as skills, and adopt the “critical pedagogy” practised and advocated by Freire (1968 [1970]) and hooks (1994) (Dalgleish, Girard, & Davies 2017).

A common thread in these proposals is treatment of critical thinking as a social, interactive, personally engaged activity like that of a quilting bee or a barn-raising (Thayer-Bacon 2000) rather than as an individual, solitary, distanced activity symbolized by Rodin’s The Thinker . One can get a vivid description of education with the former type of goal from the writings of bell hooks (1994, 2010). Critical thinking for her is open-minded dialectical exchange across opposing standpoints and from multiple perspectives, a conception similar to Paul’s “strong sense” critical thinking (Paul 1981). She abandons the structure of domination in the traditional classroom. In an introductory course on black women writers, for example, she assigns students to write an autobiographical paragraph about an early racial memory, then to read it aloud as the others listen, thus affirming the uniqueness and value of each voice and creating a communal awareness of the diversity of the group’s experiences (hooks 1994: 84). Her “engaged pedagogy” is thus similar to the “freedom under guidance” implemented in John Dewey’s Laboratory School of Chicago in the late 1890s and early 1900s. It incorporates the dialogue, anchored instruction, and mentoring that Abrami (2015) found to be most effective in improving critical thinking skills and dispositions.

What is the relationship of critical thinking to problem solving, decision-making, higher-order thinking, creative thinking, and other recognized types of thinking? One’s answer to this question obviously depends on how one defines the terms used in the question. If critical thinking is conceived broadly to cover any careful thinking about any topic for any purpose, then problem solving and decision making will be kinds of critical thinking, if they are done carefully. Historically, ‘critical thinking’ and ‘problem solving’ were two names for the same thing. If critical thinking is conceived more narrowly as consisting solely of appraisal of intellectual products, then it will be disjoint with problem solving and decision making, which are constructive.

Bloom’s taxonomy of educational objectives used the phrase “intellectual abilities and skills” for what had been labeled “critical thinking” by some, “reflective thinking” by Dewey and others, and “problem solving” by still others (Bloom et al. 1956: 38). Thus, the so-called “higher-order thinking skills” at the taxonomy’s top levels of analysis, synthesis and evaluation are just critical thinking skills, although they do not come with general criteria for their assessment (Ennis 1981b). The revised version of Bloom’s taxonomy (Anderson et al. 2001) likewise treats critical thinking as cutting across those types of cognitive process that involve more than remembering (Anderson et al. 2001: 269–270). For details, see the Supplement on History .

As to creative thinking, it overlaps with critical thinking (Bailin 1987, 1988). Thinking about the explanation of some phenomenon or event, as in Ferryboat , requires creative imagination in constructing plausible explanatory hypotheses. Likewise, thinking about a policy question, as in Candidate , requires creativity in coming up with options. Conversely, creativity in any field needs to be balanced by critical appraisal of the draft painting or novel or mathematical theory.

  • Abrami, Philip C., Robert M. Bernard, Eugene Borokhovski, David I. Waddington, C. Anne Wade, and Tonje Person, 2015, “Strategies for Teaching Students to Think Critically: A Meta-analysis”, Review of Educational Research , 85(2): 275–314. doi:10.3102/0034654314551063
  • Aikin, Wilford M., 1942, The Story of the Eight-year Study, with Conclusions and Recommendations , Volume I of Adventure in American Education , New York and London: Harper & Brothers. [ Aikin 1942 available online ]
  • Alston, Kal, 1995, “Begging the Question: Is Critical Thinking Biased?”, Educational Theory , 45(2): 225–233. doi:10.1111/j.1741-5446.1995.00225.x
  • –––, 2001, “Re/Thinking Critical Thinking: The Seductions of Everyday Life”, Studies in Philosophy and Education , 20(1): 27–40. doi:10.1023/A:1005247128053
  • American Educational Research Association, 2014, Standards for Educational and Psychological Testing / American Educational Research Association, American Psychological Association, National Council on Measurement in Education , Washington, DC: American Educational Research Association.
  • Anderson, Lorin W., David R. Krathwohl, Peter W. Airiasian, Kathleen A. Cruikshank, Richard E. Mayer, Paul R. Pintrich, James Raths, and Merlin C. Wittrock, 2001, A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives , New York: Longman, complete edition.
  • Bailin, Sharon, 1987, “Critical and Creative Thinking”, Informal Logic , 9(1): 23–30. [ Bailin 1987 available online ]
  • –––, 1988, Achieving Extraordinary Ends: An Essay on Creativity , Dordrecht: Kluwer. doi:10.1007/978-94-009-2780-3
  • –––, 1995, “Is Critical Thinking Biased? Clarifications and Implications”, Educational Theory , 45(2): 191–197. doi:10.1111/j.1741-5446.1995.00191.x
  • Bailin, Sharon and Mark Battersby, 2009, “Inquiry: A Dialectical Approach to Teaching Critical Thinking”, in Juho Ritola (ed.), Argument Cultures: Proceedings of OSSA 09 , CD-ROM (pp. 1–10), Windsor, ON: OSSA. [ Bailin & Battersby 2009 available online ]
  • –––, 2016a, “Fostering the Virtues of Inquiry”, Topoi , 35(2): 367–374. doi:10.1007/s11245-015-9307-6
  • –––, 2016b, Reason in the Balance: An Inquiry Approach to Critical Thinking , Indianapolis: Hackett, 2nd edition.
  • –––, 2021, “Inquiry: Teaching for Reasoned Judgment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 31–46. doi: 10.1163/9789004444591_003
  • Bailin, Sharon, Roland Case, Jerrold R. Coombs, and Leroi B. Daniels, 1999a, “Common Misconceptions of Critical Thinking”, Journal of Curriculum Studies , 31(3): 269–283. doi:10.1080/002202799183124
  • –––, 1999b, “Conceptualizing Critical Thinking”, Journal of Curriculum Studies , 31(3): 285–302. doi:10.1080/002202799183133
  • Blair, J. Anthony, 2021, Studies in Critical Thinking , Windsor, ON: Windsor Studies in Argumentation, 2nd edition. [Available online at https://windsor.scholarsportal.info/omp/index.php/wsia/catalog/book/106]
  • Berman, Alan M., Seth J. Schwartz, William M. Kurtines, and Steven L. Berman, 2001, “The Process of Exploration in Identity Formation: The Role of Style and Competence”, Journal of Adolescence , 24(4): 513–528. doi:10.1006/jado.2001.0386
  • Black, Beth (ed.), 2012, An A to Z of Critical Thinking , London: Continuum International Publishing Group.
  • Bloom, Benjamin Samuel, Max D. Engelhart, Edward J. Furst, Walter H. Hill, and David R. Krathwohl, 1956, Taxonomy of Educational Objectives. Handbook I: Cognitive Domain , New York: David McKay.
  • Boardman, Frank, Nancy M. Cavender, and Howard Kahane, 2018, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Boston: Cengage, 13th edition.
  • Browne, M. Neil and Stuart M. Keeley, 2018, Asking the Right Questions: A Guide to Critical Thinking , Hoboken, NJ: Pearson, 12th edition.
  • Center for Assessment & Improvement of Learning, 2017, Critical Thinking Assessment Test , Cookeville, TN: Tennessee Technological University.
  • Cleghorn, Paul. 2021. “Critical Thinking in the Elementary School: Practical Guidance for Building a Culture of Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessmen t, Leiden: Brill, pp. 150–167. doi: 10.1163/9789004444591_010
  • Cohen, Jacob, 1988, Statistical Power Analysis for the Behavioral Sciences , Hillsdale, NJ: Lawrence Erlbaum Associates, 2nd edition.
  • College Board, 1983, Academic Preparation for College. What Students Need to Know and Be Able to Do , New York: College Entrance Examination Board, ERIC document ED232517.
  • Commission on the Relation of School and College of the Progressive Education Association, 1943, Thirty Schools Tell Their Story , Volume V of Adventure in American Education , New York and London: Harper & Brothers.
  • Council for Aid to Education, 2017, CLA+ Student Guide . Available at http://cae.org/images/uploads/pdf/CLA_Student_Guide_Institution.pdf ; last accessed 2022 07 16.
  • Dalgleish, Adam, Patrick Girard, and Maree Davies, 2017, “Critical Thinking, Bias and Feminist Philosophy: Building a Better Framework through Collaboration”, Informal Logic , 37(4): 351–369. [ Dalgleish et al. available online ]
  • Dewey, John, 1910, How We Think , Boston: D.C. Heath. [ Dewey 1910 available online ]
  • –––, 1916, Democracy and Education: An Introduction to the Philosophy of Education , New York: Macmillan.
  • –––, 1933, How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process , Lexington, MA: D.C. Heath.
  • –––, 1936, “The Theory of the Chicago Experiment”, Appendix II of Mayhew & Edwards 1936: 463–477.
  • –––, 1938, Logic: The Theory of Inquiry , New York: Henry Holt and Company.
  • Dominguez, Caroline (coord.), 2018a, A European Collection of the Critical Thinking Skills and Dispositions Needed in Different Professional Fields for the 21st Century , Vila Real, Portugal: UTAD. Available at http://bit.ly/CRITHINKEDUO1 ; last accessed 2022 07 16.
  • ––– (coord.), 2018b, A European Review on Critical Thinking Educational Practices in Higher Education Institutions , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDUO2 ; last accessed 2022 07 16.
  • ––– (coord.), 2018c, The CRITHINKEDU European Course on Critical Thinking Education for University Teachers: From Conception to Delivery , Vila Real: UTAD. Available at http:/bit.ly/CRITHINKEDU03; last accessed 2022 07 16.
  • Dominguez Caroline and Rita Payan-Carreira (eds.), 2019, Promoting Critical Thinking in European Higher Education Institutions: Towards an Educational Protocol , Vila Real: UTAD. Available at http:/bit.ly/CRITHINKEDU04; last accessed 2022 07 16.
  • Ennis, Robert H., 1958, “An Appraisal of the Watson-Glaser Critical Thinking Appraisal”, The Journal of Educational Research , 52(4): 155–158. doi:10.1080/00220671.1958.10882558
  • –––, 1962, “A Concept of Critical Thinking: A Proposed Basis for Research on the Teaching and Evaluation of Critical Thinking Ability”, Harvard Educational Review , 32(1): 81–111.
  • –––, 1981a, “A Conception of Deductive Logical Competence”, Teaching Philosophy , 4(3/4): 337–385. doi:10.5840/teachphil198143/429
  • –––, 1981b, “Eight Fallacies in Bloom’s Taxonomy”, in C. J. B. Macmillan (ed.), Philosophy of Education 1980: Proceedings of the Thirty-seventh Annual Meeting of the Philosophy of Education Society , Bloomington, IL: Philosophy of Education Society, pp. 269–273.
  • –––, 1984, “Problems in Testing Informal Logic, Critical Thinking, Reasoning Ability”, Informal Logic , 6(1): 3–9. [ Ennis 1984 available online ]
  • –––, 1987, “A Taxonomy of Critical Thinking Dispositions and Abilities”, in Joan Boykoff Baron and Robert J. Sternberg (eds.), Teaching Thinking Skills: Theory and Practice , New York: W. H. Freeman, pp. 9–26.
  • –––, 1989, “Critical Thinking and Subject Specificity: Clarification and Needed Research”, Educational Researcher , 18(3): 4–10. doi:10.3102/0013189X018003004
  • –––, 1991, “Critical Thinking: A Streamlined Conception”, Teaching Philosophy , 14(1): 5–24. doi:10.5840/teachphil19911412
  • –––, 1996, “Critical Thinking Dispositions: Their Nature and Assessability”, Informal Logic , 18(2–3): 165–182. [ Ennis 1996 available online ]
  • –––, 1998, “Is Critical Thinking Culturally Biased?”, Teaching Philosophy , 21(1): 15–33. doi:10.5840/teachphil19982113
  • –––, 2011, “Critical Thinking: Reflection and Perspective Part I”, Inquiry: Critical Thinking across the Disciplines , 26(1): 4–18. doi:10.5840/inquiryctnews20112613
  • –––, 2013, “Critical Thinking across the Curriculum: The Wisdom CTAC Program”, Inquiry: Critical Thinking across the Disciplines , 28(2): 25–45. doi:10.5840/inquiryct20132828
  • –––, 2016, “Definition: A Three-Dimensional Analysis with Bearing on Key Concepts”, in Patrick Bondy and Laura Benacquista (eds.), Argumentation, Objectivity, and Bias: Proceedings of the 11th International Conference of the Ontario Society for the Study of Argumentation (OSSA), 18–21 May 2016 , Windsor, ON: OSSA, pp. 1–19. Available at http://scholar.uwindsor.ca/ossaarchive/OSSA11/papersandcommentaries/105 ; last accessed 2022 07 16.
  • –––, 2018, “Critical Thinking Across the Curriculum: A Vision”, Topoi , 37(1): 165–184. doi:10.1007/s11245-016-9401-4
  • Ennis, Robert H., and Jason Millman, 1971, Manual for Cornell Critical Thinking Test, Level X, and Cornell Critical Thinking Test, Level Z , Urbana, IL: Critical Thinking Project, University of Illinois.
  • Ennis, Robert H., Jason Millman, and Thomas Norbert Tomko, 1985, Cornell Critical Thinking Tests Level X & Level Z: Manual , Pacific Grove, CA: Midwest Publication, 3rd edition.
  • –––, 2005, Cornell Critical Thinking Tests Level X & Level Z: Manual , Seaside, CA: Critical Thinking Company, 5th edition.
  • Ennis, Robert H. and Eric Weir, 1985, The Ennis-Weir Critical Thinking Essay Test: Test, Manual, Criteria, Scoring Sheet: An Instrument for Teaching and Testing , Pacific Grove, CA: Midwest Publications.
  • Facione, Peter A., 1990a, Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction , Research Findings and Recommendations Prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC Document ED315423.
  • –––, 1990b, California Critical Thinking Skills Test, CCTST – Form A , Millbrae, CA: The California Academic Press.
  • –––, 1990c, The California Critical Thinking Skills Test--College Level. Technical Report #3. Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST , ERIC Document ED326584.
  • –––, 1992, California Critical Thinking Skills Test: CCTST – Form B, Millbrae, CA: The California Academic Press.
  • –––, 2000, “The Disposition Toward Critical Thinking: Its Character, Measurement, and Relationship to Critical Thinking Skill”, Informal Logic , 20(1): 61–84. [ Facione 2000 available online ]
  • Facione, Peter A. and Noreen C. Facione, 1992, CCTDI: A Disposition Inventory , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Noreen C. Facione, and Carol Ann F. Giancarlo, 2001, California Critical Thinking Disposition Inventory: CCTDI: Inventory Manual , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Carol A. Sánchez, and Noreen C. Facione, 1994, Are College Students Disposed to Think? , Millbrae, CA: The California Academic Press. ERIC Document ED368311.
  • Fisher, Alec, and Michael Scriven, 1997, Critical Thinking: Its Definition and Assessment , Norwich: Centre for Research in Critical Thinking, University of East Anglia.
  • Freire, Paulo, 1968 [1970], Pedagogia do Oprimido . Translated as Pedagogy of the Oppressed , Myra Bergman Ramos (trans.), New York: Continuum, 1970.
  • Gigerenzer, Gerd, 2001, “The Adaptive Toolbox”, in Gerd Gigerenzer and Reinhard Selten (eds.), Bounded Rationality: The Adaptive Toolbox , Cambridge, MA: MIT Press, pp. 37–50.
  • Glaser, Edward Maynard, 1941, An Experiment in the Development of Critical Thinking , New York: Bureau of Publications, Teachers College, Columbia University.
  • Groarke, Leo A. and Christopher W. Tindale, 2012, Good Reasoning Matters! A Constructive Approach to Critical Thinking , Don Mills, ON: Oxford University Press, 5th edition.
  • Halpern, Diane F., 1998, “Teaching Critical Thinking for Transfer Across Domains: Disposition, Skills, Structure Training, and Metacognitive Monitoring”, American Psychologist , 53(4): 449–455. doi:10.1037/0003-066X.53.4.449
  • –––, 2016, Manual: Halpern Critical Thinking Assessment , Mödling, Austria: Schuhfried. Available at https://pdfcoffee.com/hcta-test-manual-pdf-free.html; last accessed 2022 07 16.
  • Hamby, Benjamin, 2014, The Virtues of Critical Thinkers , Doctoral dissertation, Philosophy, McMaster University. [ Hamby 2014 available online ]
  • –––, 2015, “Willingness to Inquire: The Cardinal Critical Thinking Virtue”, in Martin Davies and Ronald Barnett (eds.), The Palgrave Handbook of Critical Thinking in Higher Education , New York: Palgrave Macmillan, pp. 77–87.
  • Haran, Uriel, Ilana Ritov, and Barbara A. Mellers, 2013, “The Role of Actively Open-minded Thinking in Information Acquisition, Accuracy, and Calibration”, Judgment and Decision Making , 8(3): 188–201.
  • Hatcher, Donald and Kevin Possin, 2021, “Commentary: Thinking Critically about Critical Thinking Assessment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 298–322. doi: 10.1163/9789004444591_017
  • Haynes, Ada, Elizabeth Lisic, Kevin Harris, Katie Leming, Kyle Shanks, and Barry Stein, 2015, “Using the Critical Thinking Assessment Test (CAT) as a Model for Designing Within-Course Assessments: Changing How Faculty Assess Student Learning”, Inquiry: Critical Thinking Across the Disciplines , 30(3): 38–48. doi:10.5840/inquiryct201530316
  • Haynes, Ada and Barry Stein, 2021, “Observations from a Long-Term Effort to Assess and Improve Critical Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 231–254. doi: 10.1163/9789004444591_014
  • Hiner, Amanda L. 2021. “Equipping Students for Success in College and Beyond: Placing Critical Thinking Instruction at the Heart of a General Education Program”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 188–208. doi: 10.1163/9789004444591_012
  • Hitchcock, David, 2017, “Critical Thinking as an Educational Ideal”, in his On Reasoning and Argument: Essays in Informal Logic and on Critical Thinking , Dordrecht: Springer, pp. 477–497. doi:10.1007/978-3-319-53562-3_30
  • –––, 2021, “Seven Philosophical Implications of Critical Thinking: Themes, Variations, Implications”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 9–30. doi: 10.1163/9789004444591_002
  • hooks, bell, 1994, Teaching to Transgress: Education as the Practice of Freedom , New York and London: Routledge.
  • –––, 2010, Teaching Critical Thinking: Practical Wisdom , New York and London: Routledge.
  • Johnson, Ralph H., 1992, “The Problem of Defining Critical Thinking”, in Stephen P, Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 38–53.
  • Kahane, Howard, 1971, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Belmont, CA: Wadsworth.
  • Kahneman, Daniel, 2011, Thinking, Fast and Slow , New York: Farrar, Straus and Giroux.
  • Kahneman, Daniel, Olivier Sibony, & Cass R. Sunstein, 2021, Noise: A Flaw in Human Judgment , New York: Little, Brown Spark.
  • Kenyon, Tim, and Guillaume Beaulac, 2014, “Critical Thinking Education and Debasing”, Informal Logic , 34(4): 341–363. [ Kenyon & Beaulac 2014 available online ]
  • Krathwohl, David R., Benjamin S. Bloom, and Bertram B. Masia, 1964, Taxonomy of Educational Objectives, Handbook II: Affective Domain , New York: David McKay.
  • Kuhn, Deanna, 1991, The Skills of Argument , New York: Cambridge University Press. doi:10.1017/CBO9780511571350
  • –––, 2019, “Critical Thinking as Discourse”, Human Development, 62 (3): 146–164. doi:10.1159/000500171
  • Lipman, Matthew, 1987, “Critical Thinking–What Can It Be?”, Analytic Teaching , 8(1): 5–12. [ Lipman 1987 available online ]
  • –––, 2003, Thinking in Education , Cambridge: Cambridge University Press, 2nd edition.
  • Loftus, Elizabeth F., 2017, “Eavesdropping on Memory”, Annual Review of Psychology , 68: 1–18. doi:10.1146/annurev-psych-010416-044138
  • Makaiau, Amber Strong, 2021, “The Good Thinker’s Tool Kit: How to Engage Critical Thinking and Reasoning in Secondary Education”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 168–187. doi: 10.1163/9789004444591_011
  • Martin, Jane Roland, 1992, “Critical Thinking for a Humane World”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 163–180.
  • Mayhew, Katherine Camp, and Anna Camp Edwards, 1936, The Dewey School: The Laboratory School of the University of Chicago, 1896–1903 , New York: Appleton-Century. [ Mayhew & Edwards 1936 available online ]
  • McPeck, John E., 1981, Critical Thinking and Education , New York: St. Martin’s Press.
  • Moore, Brooke Noel and Richard Parker, 2020, Critical Thinking , New York: McGraw-Hill, 13th edition.
  • Nickerson, Raymond S., 1998, “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises”, Review of General Psychology , 2(2): 175–220. doi:10.1037/1089-2680.2.2.175
  • Nieto, Ana Maria, and Jorge Valenzuela, 2012, “A Study of the Internal Structure of Critical Thinking Dispositions”, Inquiry: Critical Thinking across the Disciplines , 27(1): 31–38. doi:10.5840/inquiryct20122713
  • Norris, Stephen P., 1985, “Controlling for Background Beliefs When Developing Multiple-choice Critical Thinking Tests”, Educational Measurement: Issues and Practice , 7(3): 5–11. doi:10.1111/j.1745-3992.1988.tb00437.x
  • Norris, Stephen P. and Robert H. Ennis, 1989, Evaluating Critical Thinking (The Practitioners’ Guide to Teaching Thinking Series), Pacific Grove, CA: Midwest Publications.
  • Norris, Stephen P. and Ruth Elizabeth King, 1983, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1984, The Design of a Critical Thinking Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland. ERIC Document ED260083.
  • –––, 1985, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1990a, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • –––, 1990b, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • OCR [Oxford, Cambridge and RSA Examinations], 2011, AS/A Level GCE: Critical Thinking – H052, H452 , Cambridge: OCR. Past papers available at https://pastpapers.co/ocr/?dir=A-Level/Critical-Thinking-H052-H452; last accessed 2022 07 16.
  • Ontario Ministry of Education, 2013, The Ontario Curriculum Grades 9 to 12: Social Sciences and Humanities . Available at http://www.edu.gov.on.ca/eng/curriculum/secondary/ssciences9to122013.pdf ; last accessed 2022 07 16.
  • Passmore, John Arthur, 1980, The Philosophy of Teaching , London: Duckworth.
  • Paul, Richard W., 1981, “Teaching Critical Thinking in the ‘Strong’ Sense: A Focus on Self-Deception, World Views, and a Dialectical Mode of Analysis”, Informal Logic , 4(2): 2–7. [ Paul 1981 available online ]
  • –––, 1984, “Critical Thinking: Fundamental to Education for a Free Society”, Educational Leadership , 42(1): 4–14.
  • –––, 1985, “McPeck’s Mistakes”, Informal Logic , 7(1): 35–43. [ Paul 1985 available online ]
  • Paul, Richard W. and Linda Elder, 2006, The Miniature Guide to Critical Thinking: Concepts and Tools , Dillon Beach, CA: Foundation for Critical Thinking, 4th edition.
  • Payette, Patricia, and Edna Ross, 2016, “Making a Campus-Wide Commitment to Critical Thinking: Insights and Promising Practices Utilizing the Paul-Elder Approach at the University of Louisville”, Inquiry: Critical Thinking Across the Disciplines , 31(1): 98–110. doi:10.5840/inquiryct20163118
  • Possin, Kevin, 2008, “A Field Guide to Critical-Thinking Assessment”, Teaching Philosophy , 31(3): 201–228. doi:10.5840/teachphil200831324
  • –––, 2013a, “Some Problems with the Halpern Critical Thinking Assessment (HCTA) Test”, Inquiry: Critical Thinking across the Disciplines , 28(3): 4–12. doi:10.5840/inquiryct201328313
  • –––, 2013b, “A Serious Flaw in the Collegiate Learning Assessment (CLA) Test”, Informal Logic , 33(3): 390–405. [ Possin 2013b available online ]
  • –––, 2013c, “A Fatal Flaw in the Collegiate Learning Assessment Test”, Assessment Update , 25 (1): 8–12.
  • –––, 2014, “Critique of the Watson-Glaser Critical Thinking Appraisal Test: The More You Know, the Lower Your Score”, Informal Logic , 34(4): 393–416. [ Possin 2014 available online ]
  • –––, 2020, “CAT Scan: A Critical Review of the Critical-Thinking Assessment Test”, Informal Logic , 40 (3): 489–508. [Available online at https://informallogic.ca/index.php/informal_logic/article/view/6243]
  • Rawls, John, 1971, A Theory of Justice , Cambridge, MA: Harvard University Press.
  • Rear, David, 2019, “One Size Fits All? The Limitations of Standardised Assessment in Critical Thinking”, Assessment & Evaluation in Higher Education , 44(5): 664–675. doi: 10.1080/02602938.2018.1526255
  • Rousseau, Jean-Jacques, 1762, Émile , Amsterdam: Jean Néaulme.
  • Scheffler, Israel, 1960, The Language of Education , Springfield, IL: Charles C. Thomas.
  • Scriven, Michael, and Richard W. Paul, 1987, Defining Critical Thinking , Draft statement written for the National Council for Excellence in Critical Thinking Instruction. Available at http://www.criticalthinking.org/pages/defining-critical-thinking/766 ; last accessed 2022 07 16.
  • Sheffield, Clarence Burton Jr., 2018, “Promoting Critical Thinking in Higher Education: My Experiences as the Inaugural Eugene H. Fram Chair in Applied Critical Thinking at Rochester Institute of Technology”, Topoi , 37(1): 155–163. doi:10.1007/s11245-016-9392-1
  • Siegel, Harvey, 1985, “McPeck, Informal Logic and the Nature of Critical Thinking”, in David Nyberg (ed.), Philosophy of Education 1985: Proceedings of the Forty-First Annual Meeting of the Philosophy of Education Society , Normal, IL: Philosophy of Education Society, pp. 61–72.
  • –––, 1988, Educating Reason: Rationality, Critical Thinking, and Education , New York: Routledge.
  • –––, 1999, “What (Good) Are Thinking Dispositions?”, Educational Theory , 49(2): 207–221. doi:10.1111/j.1741-5446.1999.00207.x
  • Simon, Herbert A., 1956, “Rational Choice and the Structure of the Environment”, Psychological Review , 63(2): 129–138. doi: 10.1037/h0042769
  • Simpson, Elizabeth, 1966–67, “The Classification of Educational Objectives: Psychomotor Domain”, Illinois Teacher of Home Economics , 10(4): 110–144, ERIC document ED0103613. [ Simpson 1966–67 available online ]
  • Skolverket, 2018, Curriculum for the Compulsory School, Preschool Class and School-age Educare , Stockholm: Skolverket, revised 2018. Available at https://www.skolverket.se/download/18.31c292d516e7445866a218f/1576654682907/pdf3984.pdf; last accessed 2022 07 15.
  • Smith, B. Othanel, 1953, “The Improvement of Critical Thinking”, Progressive Education , 30(5): 129–134.
  • Smith, Eugene Randolph, Ralph Winfred Tyler, and the Evaluation Staff, 1942, Appraising and Recording Student Progress , Volume III of Adventure in American Education , New York and London: Harper & Brothers.
  • Splitter, Laurance J., 1987, “Educational Reform through Philosophy for Children”, Thinking: The Journal of Philosophy for Children , 7(2): 32–39. doi:10.5840/thinking1987729
  • Stanovich Keith E., and Paula J. Stanovich, 2010, “A Framework for Critical Thinking, Rational Thinking, and Intelligence”, in David D. Preiss and Robert J. Sternberg (eds), Innovations in Educational Psychology: Perspectives on Learning, Teaching and Human Development , New York: Springer Publishing, pp 195–237.
  • Stanovich Keith E., Richard F. West, and Maggie E. Toplak, 2011, “Intelligence and Rationality”, in Robert J. Sternberg and Scott Barry Kaufman (eds.), Cambridge Handbook of Intelligence , Cambridge: Cambridge University Press, 3rd edition, pp. 784–826. doi:10.1017/CBO9780511977244.040
  • Tankersley, Karen, 2005, Literacy Strategies for Grades 4–12: Reinforcing the Threads of Reading , Alexandria, VA: Association for Supervision and Curriculum Development.
  • Thayer-Bacon, Barbara J., 1992, “Is Modern Critical Thinking Theory Sexist?”, Inquiry: Critical Thinking Across the Disciplines , 10(1): 3–7. doi:10.5840/inquiryctnews199210123
  • –––, 1993, “Caring and Its Relationship to Critical Thinking”, Educational Theory , 43(3): 323–340. doi:10.1111/j.1741-5446.1993.00323.x
  • –––, 1995a, “Constructive Thinking: Personal Voice”, Journal of Thought , 30(1): 55–70.
  • –––, 1995b, “Doubting and Believing: Both are Important for Critical Thinking”, Inquiry: Critical Thinking across the Disciplines , 15(2): 59–66. doi:10.5840/inquiryctnews199515226
  • –––, 2000, Transforming Critical Thinking: Thinking Constructively , New York: Teachers College Press.
  • Toulmin, Stephen Edelston, 1958, The Uses of Argument , Cambridge: Cambridge University Press.
  • Turri, John, Mark Alfano, and John Greco, 2017, “Virtue Epistemology”, in Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Winter 2017 Edition). URL = < https://plato.stanford.edu/archives/win2017/entries/epistemology-virtue/ >
  • Vincent-Lancrin, Stéphan, Carlos González-Sancho, Mathias Bouckaert, Federico de Luca, Meritxell Fernández-Barrerra, Gwénaël Jacotin, Joaquin Urgel, and Quentin Vidal, 2019, Fostering Students’ Creativity and Critical Thinking: What It Means in School. Educational Research and Innovation , Paris: OECD Publishing.
  • Warren, Karen J. 1988. “Critical Thinking and Feminism”, Informal Logic , 10(1): 31–44. [ Warren 1988 available online ]
  • Watson, Goodwin, and Edward M. Glaser, 1980a, Watson-Glaser Critical Thinking Appraisal, Form A , San Antonio, TX: Psychological Corporation.
  • –––, 1980b, Watson-Glaser Critical Thinking Appraisal: Forms A and B; Manual , San Antonio, TX: Psychological Corporation,
  • –––, 1994, Watson-Glaser Critical Thinking Appraisal, Form B , San Antonio, TX: Psychological Corporation.
  • Weinstein, Mark, 1990, “Towards a Research Agenda for Informal Logic and Critical Thinking”, Informal Logic , 12(3): 121–143. [ Weinstein 1990 available online ]
  • –––, 2013, Logic, Truth and Inquiry , London: College Publications.
  • Willingham, Daniel T., 2019, “How to Teach Critical Thinking”, Education: Future Frontiers , 1: 1–17. [Available online at https://prod65.education.nsw.gov.au/content/dam/main-education/teaching-and-learning/education-for-a-changing-world/media/documents/How-to-teach-critical-thinking-Willingham.pdf.]
  • Zagzebski, Linda Trinkaus, 1996, Virtues of the Mind: An Inquiry into the Nature of Virtue and the Ethical Foundations of Knowledge , Cambridge: Cambridge University Press. doi:10.1017/CBO9781139174763
How to cite this entry . Preview the PDF version of this entry at the Friends of the SEP Society . Look up topics and thinkers related to this entry at the Internet Philosophy Ontology Project (InPhO). Enhanced bibliography for this entry at PhilPapers , with links to its database.
  • Association for Informal Logic and Critical Thinking (AILACT)
  • Critical Thinking Across the European Higher Education Curricula (CRITHINKEDU)
  • Critical Thinking Definition, Instruction, and Assessment: A Rigorous Approach
  • Critical Thinking Research (RAIL)
  • Foundation for Critical Thinking
  • Insight Assessment
  • Partnership for 21st Century Learning (P21)
  • The Critical Thinking Consortium
  • The Nature of Critical Thinking: An Outline of Critical Thinking Dispositions and Abilities , by Robert H. Ennis

abilities | bias, implicit | children, philosophy for | civic education | decision-making capacity | Dewey, John | dispositions | education, philosophy of | epistemology: virtue | logic: informal

Copyright © 2022 by David Hitchcock < hitchckd @ mcmaster . ca >

  • Accessibility

Support SEP

Mirror sites.

View this site from another server:

  • Info about mirror sites

The Stanford Encyclopedia of Philosophy is copyright © 2024 by The Metaphysics Research Lab , Department of Philosophy, Stanford University

Library of Congress Catalog Data: ISSN 1095-5054

Have a language expert improve your writing

Run a free plagiarism check in 10 minutes, generate accurate citations for free.

  • Knowledge Base
  • Working with sources
  • What Is Critical Thinking? | Definition & Examples

What Is Critical Thinking? | Definition & Examples

Published on May 30, 2022 by Eoghan Ryan . Revised on May 31, 2023.

Critical thinking is the ability to effectively analyze information and form a judgment .

To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources .

Critical thinking skills help you to:

  • Identify credible sources
  • Evaluate and respond to arguments
  • Assess alternative viewpoints
  • Test hypotheses against relevant criteria

Table of contents

Why is critical thinking important, critical thinking examples, how to think critically, other interesting articles, frequently asked questions about critical thinking.

Critical thinking is important for making judgments about sources of information and forming your own arguments. It emphasizes a rational, objective, and self-aware approach that can help you to identify credible sources and strengthen your conclusions.

Critical thinking is important in all disciplines and throughout all stages of the research process . The types of evidence used in the sciences and in the humanities may differ, but critical thinking skills are relevant to both.

In academic writing , critical thinking can help you to determine whether a source:

  • Is free from research bias
  • Provides evidence to support its research findings
  • Considers alternative viewpoints

Outside of academia, critical thinking goes hand in hand with information literacy to help you form opinions rationally and engage independently and critically with popular media.

Scribbr Citation Checker New

The AI-powered Citation Checker helps you avoid common mistakes such as:

  • Missing commas and periods
  • Incorrect usage of “et al.”
  • Ampersands (&) in narrative citations
  • Missing reference entries

scientific definition for critical thinking

Critical thinking can help you to identify reliable sources of information that you can cite in your research paper . It can also guide your own research methods and inform your own arguments.

Outside of academia, critical thinking can help you to be aware of both your own and others’ biases and assumptions.

Academic examples

However, when you compare the findings of the study with other current research, you determine that the results seem improbable. You analyze the paper again, consulting the sources it cites.

You notice that the research was funded by the pharmaceutical company that created the treatment. Because of this, you view its results skeptically and determine that more independent research is necessary to confirm or refute them. Example: Poor critical thinking in an academic context You’re researching a paper on the impact wireless technology has had on developing countries that previously did not have large-scale communications infrastructure. You read an article that seems to confirm your hypothesis: the impact is mainly positive. Rather than evaluating the research methodology, you accept the findings uncritically.

Nonacademic examples

However, you decide to compare this review article with consumer reviews on a different site. You find that these reviews are not as positive. Some customers have had problems installing the alarm, and some have noted that it activates for no apparent reason.

You revisit the original review article. You notice that the words “sponsored content” appear in small print under the article title. Based on this, you conclude that the review is advertising and is therefore not an unbiased source. Example: Poor critical thinking in a nonacademic context You support a candidate in an upcoming election. You visit an online news site affiliated with their political party and read an article that criticizes their opponent. The article claims that the opponent is inexperienced in politics. You accept this without evidence, because it fits your preconceptions about the opponent.

There is no single way to think critically. How you engage with information will depend on the type of source you’re using and the information you need.

However, you can engage with sources in a systematic and critical way by asking certain questions when you encounter information. Like the CRAAP test , these questions focus on the currency , relevance , authority , accuracy , and purpose of a source of information.

When encountering information, ask:

  • Who is the author? Are they an expert in their field?
  • What do they say? Is their argument clear? Can you summarize it?
  • When did they say this? Is the source current?
  • Where is the information published? Is it an academic article? Is it peer-reviewed ?
  • Why did the author publish it? What is their motivation?
  • How do they make their argument? Is it backed up by evidence? Does it rely on opinion, speculation, or appeals to emotion ? Do they address alternative arguments?

Critical thinking also involves being aware of your own biases, not only those of others. When you make an argument or draw your own conclusions, you can ask similar questions about your own writing:

  • Am I only considering evidence that supports my preconceptions?
  • Is my argument expressed clearly and backed up with credible sources?
  • Would I be convinced by this argument coming from someone else?

If you want to know more about ChatGPT, AI tools , citation , and plagiarism , make sure to check out some of our other articles with explanations and examples.

  • ChatGPT vs human editor
  • ChatGPT citations
  • Is ChatGPT trustworthy?
  • Using ChatGPT for your studies
  • What is ChatGPT?
  • Chicago style
  • Paraphrasing

 Plagiarism

  • Types of plagiarism
  • Self-plagiarism
  • Avoiding plagiarism
  • Academic integrity
  • Consequences of plagiarism
  • Common knowledge

The only proofreading tool specialized in correcting academic writing - try for free!

The academic proofreading tool has been trained on 1000s of academic texts and by native English editors. Making it the most accurate and reliable proofreading tool for students.

scientific definition for critical thinking

Try for free

Critical thinking refers to the ability to evaluate information and to be aware of biases or assumptions, including your own.

Like information literacy , it involves evaluating arguments, identifying and solving problems in an objective and systematic way, and clearly communicating your ideas.

Critical thinking skills include the ability to:

You can assess information and arguments critically by asking certain questions about the source. You can use the CRAAP test , focusing on the currency , relevance , authority , accuracy , and purpose of a source of information.

Ask questions such as:

  • Who is the author? Are they an expert?
  • How do they make their argument? Is it backed up by evidence?

A credible source should pass the CRAAP test  and follow these guidelines:

  • The information should be up to date and current.
  • The author and publication should be a trusted authority on the subject you are researching.
  • The sources the author cited should be easy to find, clear, and unbiased.
  • For a web source, the URL and layout should signify that it is trustworthy.

Information literacy refers to a broad range of skills, including the ability to find, evaluate, and use sources of information effectively.

Being information literate means that you:

  • Know how to find credible sources
  • Use relevant sources to inform your research
  • Understand what constitutes plagiarism
  • Know how to cite your sources correctly

Confirmation bias is the tendency to search, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. It refers to the ability to recollect information best when it amplifies what we already believe. Relatedly, we tend to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.

Cite this Scribbr article

If you want to cite this source, you can copy and paste the citation or click the “Cite this Scribbr article” button to automatically add the citation to our free Citation Generator.

Ryan, E. (2023, May 31). What Is Critical Thinking? | Definition & Examples. Scribbr. Retrieved April 15, 2024, from https://www.scribbr.com/working-with-sources/critical-thinking/

Is this article helpful?

Eoghan Ryan

Eoghan Ryan

Other students also liked, student guide: information literacy | meaning & examples, what are credible sources & how to spot them | examples, applying the craap test & evaluating sources, unlimited academic ai-proofreading.

✔ Document error-free in 5minutes ✔ Unlimited document corrections ✔ Specialized in correcting academic texts

  • Search Menu
  • Browse content in Arts and Humanities
  • Browse content in Archaeology
  • Anglo-Saxon and Medieval Archaeology
  • Archaeological Methodology and Techniques
  • Archaeology by Region
  • Archaeology of Religion
  • Archaeology of Trade and Exchange
  • Biblical Archaeology
  • Contemporary and Public Archaeology
  • Environmental Archaeology
  • Historical Archaeology
  • History and Theory of Archaeology
  • Industrial Archaeology
  • Landscape Archaeology
  • Mortuary Archaeology
  • Prehistoric Archaeology
  • Underwater Archaeology
  • Urban Archaeology
  • Zooarchaeology
  • Browse content in Architecture
  • Architectural Structure and Design
  • History of Architecture
  • Residential and Domestic Buildings
  • Theory of Architecture
  • Browse content in Art
  • Art Subjects and Themes
  • History of Art
  • Industrial and Commercial Art
  • Theory of Art
  • Biographical Studies
  • Byzantine Studies
  • Browse content in Classical Studies
  • Classical Literature
  • Classical Reception
  • Classical History
  • Classical Philosophy
  • Classical Mythology
  • Classical Art and Architecture
  • Classical Oratory and Rhetoric
  • Greek and Roman Papyrology
  • Greek and Roman Archaeology
  • Greek and Roman Epigraphy
  • Greek and Roman Law
  • Late Antiquity
  • Religion in the Ancient World
  • Digital Humanities
  • Browse content in History
  • Colonialism and Imperialism
  • Diplomatic History
  • Environmental History
  • Genealogy, Heraldry, Names, and Honours
  • Genocide and Ethnic Cleansing
  • Historical Geography
  • History by Period
  • History of Emotions
  • History of Agriculture
  • History of Education
  • History of Gender and Sexuality
  • Industrial History
  • Intellectual History
  • International History
  • Labour History
  • Legal and Constitutional History
  • Local and Family History
  • Maritime History
  • Military History
  • National Liberation and Post-Colonialism
  • Oral History
  • Political History
  • Public History
  • Regional and National History
  • Revolutions and Rebellions
  • Slavery and Abolition of Slavery
  • Social and Cultural History
  • Theory, Methods, and Historiography
  • Urban History
  • World History
  • Browse content in Language Teaching and Learning
  • Language Learning (Specific Skills)
  • Language Teaching Theory and Methods
  • Browse content in Linguistics
  • Applied Linguistics
  • Cognitive Linguistics
  • Computational Linguistics
  • Forensic Linguistics
  • Grammar, Syntax and Morphology
  • Historical and Diachronic Linguistics
  • History of English
  • Language Evolution
  • Language Reference
  • Language Variation
  • Language Families
  • Language Acquisition
  • Lexicography
  • Linguistic Anthropology
  • Linguistic Theories
  • Linguistic Typology
  • Phonetics and Phonology
  • Psycholinguistics
  • Sociolinguistics
  • Translation and Interpretation
  • Writing Systems
  • Browse content in Literature
  • Bibliography
  • Children's Literature Studies
  • Literary Studies (Romanticism)
  • Literary Studies (American)
  • Literary Studies (Modernism)
  • Literary Studies (Asian)
  • Literary Studies (European)
  • Literary Studies (Eco-criticism)
  • Literary Studies - World
  • Literary Studies (1500 to 1800)
  • Literary Studies (19th Century)
  • Literary Studies (20th Century onwards)
  • Literary Studies (African American Literature)
  • Literary Studies (British and Irish)
  • Literary Studies (Early and Medieval)
  • Literary Studies (Fiction, Novelists, and Prose Writers)
  • Literary Studies (Gender Studies)
  • Literary Studies (Graphic Novels)
  • Literary Studies (History of the Book)
  • Literary Studies (Plays and Playwrights)
  • Literary Studies (Poetry and Poets)
  • Literary Studies (Postcolonial Literature)
  • Literary Studies (Queer Studies)
  • Literary Studies (Science Fiction)
  • Literary Studies (Travel Literature)
  • Literary Studies (War Literature)
  • Literary Studies (Women's Writing)
  • Literary Theory and Cultural Studies
  • Mythology and Folklore
  • Shakespeare Studies and Criticism
  • Browse content in Media Studies
  • Browse content in Music
  • Applied Music
  • Dance and Music
  • Ethics in Music
  • Ethnomusicology
  • Gender and Sexuality in Music
  • Medicine and Music
  • Music Cultures
  • Music and Media
  • Music and Culture
  • Music and Religion
  • Music Education and Pedagogy
  • Music Theory and Analysis
  • Musical Scores, Lyrics, and Libretti
  • Musical Structures, Styles, and Techniques
  • Musicology and Music History
  • Performance Practice and Studies
  • Race and Ethnicity in Music
  • Sound Studies
  • Browse content in Performing Arts
  • Browse content in Philosophy
  • Aesthetics and Philosophy of Art
  • Epistemology
  • Feminist Philosophy
  • History of Western Philosophy
  • Metaphysics
  • Moral Philosophy
  • Non-Western Philosophy
  • Philosophy of Language
  • Philosophy of Mind
  • Philosophy of Perception
  • Philosophy of Action
  • Philosophy of Law
  • Philosophy of Religion
  • Philosophy of Science
  • Philosophy of Mathematics and Logic
  • Practical Ethics
  • Social and Political Philosophy
  • Browse content in Religion
  • Biblical Studies
  • Christianity
  • East Asian Religions
  • History of Religion
  • Judaism and Jewish Studies
  • Qumran Studies
  • Religion and Education
  • Religion and Health
  • Religion and Politics
  • Religion and Science
  • Religion and Law
  • Religion and Art, Literature, and Music
  • Religious Studies
  • Browse content in Society and Culture
  • Cookery, Food, and Drink
  • Cultural Studies
  • Customs and Traditions
  • Ethical Issues and Debates
  • Hobbies, Games, Arts and Crafts
  • Lifestyle, Home, and Garden
  • Natural world, Country Life, and Pets
  • Popular Beliefs and Controversial Knowledge
  • Sports and Outdoor Recreation
  • Technology and Society
  • Travel and Holiday
  • Visual Culture
  • Browse content in Law
  • Arbitration
  • Browse content in Company and Commercial Law
  • Commercial Law
  • Company Law
  • Browse content in Comparative Law
  • Systems of Law
  • Competition Law
  • Browse content in Constitutional and Administrative Law
  • Government Powers
  • Judicial Review
  • Local Government Law
  • Military and Defence Law
  • Parliamentary and Legislative Practice
  • Construction Law
  • Contract Law
  • Browse content in Criminal Law
  • Criminal Procedure
  • Criminal Evidence Law
  • Sentencing and Punishment
  • Employment and Labour Law
  • Environment and Energy Law
  • Browse content in Financial Law
  • Banking Law
  • Insolvency Law
  • History of Law
  • Human Rights and Immigration
  • Intellectual Property Law
  • Browse content in International Law
  • Private International Law and Conflict of Laws
  • Public International Law
  • IT and Communications Law
  • Jurisprudence and Philosophy of Law
  • Law and Society
  • Law and Politics
  • Browse content in Legal System and Practice
  • Courts and Procedure
  • Legal Skills and Practice
  • Primary Sources of Law
  • Regulation of Legal Profession
  • Medical and Healthcare Law
  • Browse content in Policing
  • Criminal Investigation and Detection
  • Police and Security Services
  • Police Procedure and Law
  • Police Regional Planning
  • Browse content in Property Law
  • Personal Property Law
  • Study and Revision
  • Terrorism and National Security Law
  • Browse content in Trusts Law
  • Wills and Probate or Succession
  • Browse content in Medicine and Health
  • Browse content in Allied Health Professions
  • Arts Therapies
  • Clinical Science
  • Dietetics and Nutrition
  • Occupational Therapy
  • Operating Department Practice
  • Physiotherapy
  • Radiography
  • Speech and Language Therapy
  • Browse content in Anaesthetics
  • General Anaesthesia
  • Neuroanaesthesia
  • Clinical Neuroscience
  • Browse content in Clinical Medicine
  • Acute Medicine
  • Cardiovascular Medicine
  • Clinical Genetics
  • Clinical Pharmacology and Therapeutics
  • Dermatology
  • Endocrinology and Diabetes
  • Gastroenterology
  • Genito-urinary Medicine
  • Geriatric Medicine
  • Infectious Diseases
  • Medical Toxicology
  • Medical Oncology
  • Pain Medicine
  • Palliative Medicine
  • Rehabilitation Medicine
  • Respiratory Medicine and Pulmonology
  • Rheumatology
  • Sleep Medicine
  • Sports and Exercise Medicine
  • Community Medical Services
  • Critical Care
  • Emergency Medicine
  • Forensic Medicine
  • Haematology
  • History of Medicine
  • Browse content in Medical Skills
  • Clinical Skills
  • Communication Skills
  • Nursing Skills
  • Surgical Skills
  • Medical Ethics
  • Browse content in Medical Dentistry
  • Oral and Maxillofacial Surgery
  • Paediatric Dentistry
  • Restorative Dentistry and Orthodontics
  • Surgical Dentistry
  • Medical Statistics and Methodology
  • Browse content in Neurology
  • Clinical Neurophysiology
  • Neuropathology
  • Nursing Studies
  • Browse content in Obstetrics and Gynaecology
  • Gynaecology
  • Occupational Medicine
  • Ophthalmology
  • Otolaryngology (ENT)
  • Browse content in Paediatrics
  • Neonatology
  • Browse content in Pathology
  • Chemical Pathology
  • Clinical Cytogenetics and Molecular Genetics
  • Histopathology
  • Medical Microbiology and Virology
  • Patient Education and Information
  • Browse content in Pharmacology
  • Psychopharmacology
  • Browse content in Popular Health
  • Caring for Others
  • Complementary and Alternative Medicine
  • Self-help and Personal Development
  • Browse content in Preclinical Medicine
  • Cell Biology
  • Molecular Biology and Genetics
  • Reproduction, Growth and Development
  • Primary Care
  • Professional Development in Medicine
  • Browse content in Psychiatry
  • Addiction Medicine
  • Child and Adolescent Psychiatry
  • Forensic Psychiatry
  • Learning Disabilities
  • Old Age Psychiatry
  • Psychotherapy
  • Browse content in Public Health and Epidemiology
  • Epidemiology
  • Public Health
  • Browse content in Radiology
  • Clinical Radiology
  • Interventional Radiology
  • Nuclear Medicine
  • Radiation Oncology
  • Reproductive Medicine
  • Browse content in Surgery
  • Cardiothoracic Surgery
  • Gastro-intestinal and Colorectal Surgery
  • General Surgery
  • Neurosurgery
  • Paediatric Surgery
  • Peri-operative Care
  • Plastic and Reconstructive Surgery
  • Surgical Oncology
  • Transplant Surgery
  • Trauma and Orthopaedic Surgery
  • Vascular Surgery
  • Browse content in Science and Mathematics
  • Browse content in Biological Sciences
  • Aquatic Biology
  • Biochemistry
  • Bioinformatics and Computational Biology
  • Developmental Biology
  • Ecology and Conservation
  • Evolutionary Biology
  • Genetics and Genomics
  • Microbiology
  • Molecular and Cell Biology
  • Natural History
  • Plant Sciences and Forestry
  • Research Methods in Life Sciences
  • Structural Biology
  • Systems Biology
  • Zoology and Animal Sciences
  • Browse content in Chemistry
  • Analytical Chemistry
  • Computational Chemistry
  • Crystallography
  • Environmental Chemistry
  • Industrial Chemistry
  • Inorganic Chemistry
  • Materials Chemistry
  • Medicinal Chemistry
  • Mineralogy and Gems
  • Organic Chemistry
  • Physical Chemistry
  • Polymer Chemistry
  • Study and Communication Skills in Chemistry
  • Theoretical Chemistry
  • Browse content in Computer Science
  • Artificial Intelligence
  • Computer Architecture and Logic Design
  • Game Studies
  • Human-Computer Interaction
  • Mathematical Theory of Computation
  • Programming Languages
  • Software Engineering
  • Systems Analysis and Design
  • Virtual Reality
  • Browse content in Computing
  • Business Applications
  • Computer Games
  • Computer Security
  • Computer Networking and Communications
  • Digital Lifestyle
  • Graphical and Digital Media Applications
  • Operating Systems
  • Browse content in Earth Sciences and Geography
  • Atmospheric Sciences
  • Environmental Geography
  • Geology and the Lithosphere
  • Maps and Map-making
  • Meteorology and Climatology
  • Oceanography and Hydrology
  • Palaeontology
  • Physical Geography and Topography
  • Regional Geography
  • Soil Science
  • Urban Geography
  • Browse content in Engineering and Technology
  • Agriculture and Farming
  • Biological Engineering
  • Civil Engineering, Surveying, and Building
  • Electronics and Communications Engineering
  • Energy Technology
  • Engineering (General)
  • Environmental Science, Engineering, and Technology
  • History of Engineering and Technology
  • Mechanical Engineering and Materials
  • Technology of Industrial Chemistry
  • Transport Technology and Trades
  • Browse content in Environmental Science
  • Applied Ecology (Environmental Science)
  • Conservation of the Environment (Environmental Science)
  • Environmental Sustainability
  • Environmentalist Thought and Ideology (Environmental Science)
  • Management of Land and Natural Resources (Environmental Science)
  • Natural Disasters (Environmental Science)
  • Nuclear Issues (Environmental Science)
  • Pollution and Threats to the Environment (Environmental Science)
  • Social Impact of Environmental Issues (Environmental Science)
  • History of Science and Technology
  • Browse content in Materials Science
  • Ceramics and Glasses
  • Composite Materials
  • Metals, Alloying, and Corrosion
  • Nanotechnology
  • Browse content in Mathematics
  • Applied Mathematics
  • Biomathematics and Statistics
  • History of Mathematics
  • Mathematical Education
  • Mathematical Finance
  • Mathematical Analysis
  • Numerical and Computational Mathematics
  • Probability and Statistics
  • Pure Mathematics
  • Browse content in Neuroscience
  • Cognition and Behavioural Neuroscience
  • Development of the Nervous System
  • Disorders of the Nervous System
  • History of Neuroscience
  • Invertebrate Neurobiology
  • Molecular and Cellular Systems
  • Neuroendocrinology and Autonomic Nervous System
  • Neuroscientific Techniques
  • Sensory and Motor Systems
  • Browse content in Physics
  • Astronomy and Astrophysics
  • Atomic, Molecular, and Optical Physics
  • Biological and Medical Physics
  • Classical Mechanics
  • Computational Physics
  • Condensed Matter Physics
  • Electromagnetism, Optics, and Acoustics
  • History of Physics
  • Mathematical and Statistical Physics
  • Measurement Science
  • Nuclear Physics
  • Particles and Fields
  • Plasma Physics
  • Quantum Physics
  • Relativity and Gravitation
  • Semiconductor and Mesoscopic Physics
  • Browse content in Psychology
  • Affective Sciences
  • Clinical Psychology
  • Cognitive Psychology
  • Cognitive Neuroscience
  • Criminal and Forensic Psychology
  • Developmental Psychology
  • Educational Psychology
  • Evolutionary Psychology
  • Health Psychology
  • History and Systems in Psychology
  • Music Psychology
  • Neuropsychology
  • Organizational Psychology
  • Psychological Assessment and Testing
  • Psychology of Human-Technology Interaction
  • Psychology Professional Development and Training
  • Research Methods in Psychology
  • Social Psychology
  • Browse content in Social Sciences
  • Browse content in Anthropology
  • Anthropology of Religion
  • Human Evolution
  • Medical Anthropology
  • Physical Anthropology
  • Regional Anthropology
  • Social and Cultural Anthropology
  • Theory and Practice of Anthropology
  • Browse content in Business and Management
  • Business Ethics
  • Business History
  • Business Strategy
  • Business and Technology
  • Business and Government
  • Business and the Environment
  • Comparative Management
  • Corporate Governance
  • Corporate Social Responsibility
  • Entrepreneurship
  • Health Management
  • Human Resource Management
  • Industrial and Employment Relations
  • Industry Studies
  • Information and Communication Technologies
  • International Business
  • Knowledge Management
  • Management and Management Techniques
  • Operations Management
  • Organizational Theory and Behaviour
  • Pensions and Pension Management
  • Public and Nonprofit Management
  • Strategic Management
  • Supply Chain Management
  • Browse content in Criminology and Criminal Justice
  • Criminal Justice
  • Criminology
  • Forms of Crime
  • International and Comparative Criminology
  • Youth Violence and Juvenile Justice
  • Development Studies
  • Browse content in Economics
  • Agricultural, Environmental, and Natural Resource Economics
  • Asian Economics
  • Behavioural Finance
  • Behavioural Economics and Neuroeconomics
  • Econometrics and Mathematical Economics
  • Economic History
  • Economic Methodology
  • Economic Systems
  • Economic Development and Growth
  • Financial Markets
  • Financial Institutions and Services
  • General Economics and Teaching
  • Health, Education, and Welfare
  • History of Economic Thought
  • International Economics
  • Labour and Demographic Economics
  • Law and Economics
  • Macroeconomics and Monetary Economics
  • Microeconomics
  • Public Economics
  • Urban, Rural, and Regional Economics
  • Welfare Economics
  • Browse content in Education
  • Adult Education and Continuous Learning
  • Care and Counselling of Students
  • Early Childhood and Elementary Education
  • Educational Equipment and Technology
  • Educational Strategies and Policy
  • Higher and Further Education
  • Organization and Management of Education
  • Philosophy and Theory of Education
  • Schools Studies
  • Secondary Education
  • Teaching of a Specific Subject
  • Teaching of Specific Groups and Special Educational Needs
  • Teaching Skills and Techniques
  • Browse content in Environment
  • Applied Ecology (Social Science)
  • Climate Change
  • Conservation of the Environment (Social Science)
  • Environmentalist Thought and Ideology (Social Science)
  • Natural Disasters (Environment)
  • Social Impact of Environmental Issues (Social Science)
  • Browse content in Human Geography
  • Cultural Geography
  • Economic Geography
  • Political Geography
  • Browse content in Interdisciplinary Studies
  • Communication Studies
  • Museums, Libraries, and Information Sciences
  • Browse content in Politics
  • African Politics
  • Asian Politics
  • Chinese Politics
  • Comparative Politics
  • Conflict Politics
  • Elections and Electoral Studies
  • Environmental Politics
  • European Union
  • Foreign Policy
  • Gender and Politics
  • Human Rights and Politics
  • Indian Politics
  • International Relations
  • International Organization (Politics)
  • International Political Economy
  • Irish Politics
  • Latin American Politics
  • Middle Eastern Politics
  • Political Behaviour
  • Political Economy
  • Political Institutions
  • Political Theory
  • Political Methodology
  • Political Communication
  • Political Philosophy
  • Political Sociology
  • Politics and Law
  • Public Policy
  • Public Administration
  • Quantitative Political Methodology
  • Regional Political Studies
  • Russian Politics
  • Security Studies
  • State and Local Government
  • UK Politics
  • US Politics
  • Browse content in Regional and Area Studies
  • African Studies
  • Asian Studies
  • East Asian Studies
  • Japanese Studies
  • Latin American Studies
  • Middle Eastern Studies
  • Native American Studies
  • Scottish Studies
  • Browse content in Research and Information
  • Research Methods
  • Browse content in Social Work
  • Addictions and Substance Misuse
  • Adoption and Fostering
  • Care of the Elderly
  • Child and Adolescent Social Work
  • Couple and Family Social Work
  • Developmental and Physical Disabilities Social Work
  • Direct Practice and Clinical Social Work
  • Emergency Services
  • Human Behaviour and the Social Environment
  • International and Global Issues in Social Work
  • Mental and Behavioural Health
  • Social Justice and Human Rights
  • Social Policy and Advocacy
  • Social Work and Crime and Justice
  • Social Work Macro Practice
  • Social Work Practice Settings
  • Social Work Research and Evidence-based Practice
  • Welfare and Benefit Systems
  • Browse content in Sociology
  • Childhood Studies
  • Community Development
  • Comparative and Historical Sociology
  • Economic Sociology
  • Gender and Sexuality
  • Gerontology and Ageing
  • Health, Illness, and Medicine
  • Marriage and the Family
  • Migration Studies
  • Occupations, Professions, and Work
  • Organizations
  • Population and Demography
  • Race and Ethnicity
  • Social Theory
  • Social Movements and Social Change
  • Social Research and Statistics
  • Social Stratification, Inequality, and Mobility
  • Sociology of Religion
  • Sociology of Education
  • Sport and Leisure
  • Urban and Rural Studies
  • Browse content in Warfare and Defence
  • Defence Strategy, Planning, and Research
  • Land Forces and Warfare
  • Military Administration
  • Military Life and Institutions
  • Naval Forces and Warfare
  • Other Warfare and Defence Issues
  • Peace Studies and Conflict Resolution
  • Weapons and Equipment

The Oxford Handbook of Thinking and Reasoning

  • < Previous chapter
  • Next chapter >

The Oxford Handbook of Thinking and Reasoning

35 Scientific Thinking and Reasoning

Kevin N. Dunbar, Department of Human Development and Quantitative Methodology, University of Maryland, College Park, MD

David Klahr, Department of Psychology, Carnegie Mellon University, Pittsburgh, PA

  • Published: 21 November 2012
  • Cite Icon Cite
  • Permissions Icon Permissions

Scientific thinking refers to both thinking about the content of science and the set of reasoning processes that permeate the field of science: induction, deduction, experimental design, causal reasoning, concept formation, hypothesis testing, and so on. Here we cover both the history of research on scientific thinking and the different approaches that have been used, highlighting common themes that have emerged over the past 50 years of research. Future research will focus on the collaborative aspects of scientific thinking, on effective methods for teaching science, and on the neural underpinnings of the scientific mind.

There is no unitary activity called “scientific discovery”; there are activities of designing experiments, gathering data, inventing and developing observational instruments, formulating and modifying theories, deducing consequences from theories, making predictions from theories, testing theories, inducing regularities and invariants from data, discovering theoretical constructs, and others. — Simon, Langley, & Bradshaw, 1981 , p. 2

What Is Scientific Thinking and Reasoning?

There are two kinds of thinking we call “scientific.” The first, and most obvious, is thinking about the content of science. People are engaged in scientific thinking when they are reasoning about such entities and processes as force, mass, energy, equilibrium, magnetism, atoms, photosynthesis, radiation, geology, or astrophysics (and, of course, cognitive psychology!). The second kind of scientific thinking includes the set of reasoning processes that permeate the field of science: induction, deduction, experimental design, causal reasoning, concept formation, hypothesis testing, and so on. However, these reasoning processes are not unique to scientific thinking: They are the very same processes involved in everyday thinking. As Einstein put it:

The scientific way of forming concepts differs from that which we use in our daily life, not basically, but merely in the more precise definition of concepts and conclusions; more painstaking and systematic choice of experimental material, and greater logical economy. (The Common Language of Science, 1941, reprinted in Einstein, 1950 , p. 98)

Nearly 40 years after Einstein's remarkably insightful statement, Francis Crick offered a similar perspective: that great discoveries in science result not from extraordinary mental processes, but rather from rather common ones. The greatness of the discovery lies in the thing discovered.

I think what needs to be emphasized about the discovery of the double helix is that the path to it was, scientifically speaking, fairly commonplace. What was important was not the way it was discovered , but the object discovered—the structure of DNA itself. (Crick, 1988 , p. 67; emphasis added)

Under this view, scientific thinking involves the same general-purpose cognitive processes—such as induction, deduction, analogy, problem solving, and causal reasoning—that humans apply in nonscientific domains. These processes are covered in several different chapters of this handbook: Rips, Smith, & Medin, Chapter 11 on induction; Evans, Chapter 8 on deduction; Holyoak, Chapter 13 on analogy; Bassok & Novick, Chapter 21 on problem solving; and Cheng & Buehner, Chapter 12 on causality. One might question the claim that the highly specialized procedures associated with doing science in the “real world” can be understood by investigating the thinking processes used in laboratory studies of the sort described in this volume. However, when the focus is on major scientific breakthroughs, rather than on the more routine, incremental progress in a field, the psychology of problem solving provides a rich source of ideas about how such discoveries might occur. As Simon and his colleagues put it:

It is understandable, if ironic, that ‘normal’ science fits … the description of expert problem solving, while ‘revolutionary’ science fits the description of problem solving by novices. It is understandable because scientific activity, particularly at the revolutionary end of the continuum, is concerned with the discovery of new truths, not with the application of truths that are already well-known … it is basically a journey into unmapped terrain. Consequently, it is mainly characterized, as is novice problem solving, by trial-and-error search. The search may be highly selective—but it reaches its goal only after many halts, turnings, and back-trackings. (Simon, Langley, & Bradshaw, 1981 , p. 5)

The research literature on scientific thinking can be roughly categorized according to the two types of scientific thinking listed in the opening paragraph of this chapter: (1) One category focuses on thinking that directly involves scientific content . Such research ranges from studies of young children reasoning about the sun-moon-earth system (Vosniadou & Brewer, 1992 ) to college students reasoning about chemical equilibrium (Davenport, Yaron, Klahr, & Koedinger, 2008 ), to research that investigates collaborative problem solving by world-class researchers in real-world molecular biology labs (Dunbar, 1995 ). (2) The other category focuses on “general” cognitive processes, but it tends to do so by analyzing people's problem-solving behavior when they are presented with relatively complex situations that involve the integration and coordination of several different types of processes, and that are designed to capture some essential features of “real-world” science in the psychology laboratory (Bruner, Goodnow, & Austin, 1956 ; Klahr & Dunbar, 1988 ; Mynatt, Doherty, & Tweney, 1977 ).

There are a number of overlapping research traditions that have been used to investigate scientific thinking. We will cover both the history of research on scientific thinking and the different approaches that have been used, highlighting common themes that have emerged over the past 50 years of research.

A Brief History of Research on Scientific Thinking

Science is often considered one of the hallmarks of the human species, along with art and literature. Illuminating the thought processes used in science thus reveal key aspects of the human mind. The thought processes underlying scientific thinking have fascinated both scientists and nonscientists because the products of science have transformed our world and because the process of discovery is shrouded in mystery. Scientists talk of the chance discovery, the flash of insight, the years of perspiration, and the voyage of discovery. These images of science have helped make the mental processes underlying the discovery process intriguing to cognitive scientists as they attempt to uncover what really goes on inside the scientific mind and how scientists really think. Furthermore, the possibilities that scientists can be taught to think better by avoiding mistakes that have been clearly identified in research on scientific thinking, and that their scientific process could be partially automated, makes scientific thinking a topic of enduring interest.

The cognitive processes underlying scientific discovery and day-to-day scientific thinking have been a topic of intense scrutiny and speculation for almost 400 years (e.g., Bacon, 1620 ; Galilei 1638 ; Klahr 2000 ; Tweney, Doherty, & Mynatt, 1981 ). Understanding the nature of scientific thinking has been a central issue not only for our understanding of science but also for our understating of what it is to be human. Bacon's Novumm Organum in 1620 sketched out some of the key features of the ways that experiments are designed and data interpreted. Over the ensuing 400 years philosophers and scientists vigorously debated about the appropriate methods that scientists should use (see Giere, 1993 ). These debates over the appropriate methods for science typically resulted in the espousal of a particular type of reasoning method, such as induction or deduction. It was not until the Gestalt psychologists began working on the nature of human problem solving, during the 1940s, that experimental psychologists began to investigate the cognitive processes underlying scientific thinking and reasoning.

The Gestalt psychologist Max Wertheimer pioneered the investigation of scientific thinking (of the first type described earlier: thinking about scientific content ) in his landmark book Productive Thinking (Wertheimer, 1945 ). Wertheimer spent a considerable amount of time corresponding with Albert Einstein, attempting to discover how Einstein generated the concept of relativity. Wertheimer argued that Einstein had to overcome the structure of Newtonian physics at each step in his theorizing, and the ways that Einstein actually achieved this restructuring were articulated in terms of Gestalt theories. (For a recent and different account of how Einstein made his discovery, see Galison, 2003 .) We will see later how this process of overcoming alternative theories is an obstacle that both scientists and nonscientists need to deal with when evaluating and theorizing about the world.

One of the first investigations of scientific thinking of the second type (i.e., collections of general-purpose processes operating on complex, abstract, components of scientific thought) was carried out by Jerome Bruner and his colleagues at Harvard (Bruner et al., 1956 ). They argued that a key activity engaged in by scientists is to determine whether a particular instance is a member of a category. For example, a scientist might want to discover which substances undergo fission when bombarded by neutrons and which substances do not. Here, scientists have to discover the attributes that make a substance undergo fission. Bruner et al. saw scientific thinking as the testing of hypotheses and the collecting of data with the end goal of determining whether something is a member of a category. They invented a paradigm where people were required to formulate hypotheses and collect data that test their hypotheses. In one type of experiment, the participants were shown a card such as one with two borders and three green triangles. The participants were asked to determine the concept that this card represented by choosing other cards and getting feedback from the experimenter as to whether the chosen card was an example of the concept. In this case the participant may have thought that the concept was green and chosen a card with two green squares and one border. If the underlying concept was green, then the experimenter would say that the card was an example of the concept. In terms of scientific thinking, choosing a new card is akin to conducting an experiment, and the feedback from the experimenter is similar to knowing whether a hypothesis is confirmed or disconfirmed. Using this approach, Bruner et al. identified a number of strategies that people use to formulate and test hypotheses. They found that a key factor determining which hypothesis-testing strategy that people use is the amount of memory capacity that the strategy takes up (see also Morrison & Knowlton, Chapter 6 ; Medin et al., Chapter 11 ). Another key factor that they discovered was that it was much more difficult for people to discover negative concepts (e.g., not blue) than positive concepts (e.g., blue). Although Bruner et al.'s research is most commonly viewed as work on concepts, they saw their work as uncovering a key component of scientific thinking.

A second early line of research on scientific thinking was developed by Peter Wason and his colleagues (Wason, 1968 ). Like Bruner et al., Wason saw a key component of scientific thinking as being the testing of hypotheses. Whereas Bruner et al. focused on the different types of strategies that people use to formulate hypotheses, Wason focused on whether people adopt a strategy of trying to confirm or disconfirm their hypotheses. Using Popper's ( 1959 ) theory that scientists should try and falsify rather than confirm their hypotheses, Wason devised a deceptively simple task in which participants were given three numbers, such as 2-4-6, and were asked to discover the rule underlying the three numbers. Participants were asked to generate other triads of numbers and the experimenter would tell the participant whether the triad was consistent or inconsistent with the rule. They were told that when they were sure they knew what the rule was they should state it. Most participants began the experiment by thinking that the rule was even numbers increasing by 2. They then attempted to confirm their hypothesis by generating a triad like 8-10-12, then 14-16-18. These triads are consistent with the rule and the participants were told yes, that the triads were indeed consistent with the rule. However, when they proposed the rule—even numbers increasing by 2—they were told that the rule was incorrect. The correct rule was numbers of increasing magnitude! From this research, Wason concluded that people try to confirm their hypotheses, whereas normatively speaking, they should try to disconfirm their hypotheses. One implication of this research is that confirmation bias is not just restricted to scientists but is a general human tendency.

It was not until the 1970s that a general account of scientific reasoning was proposed. Herbert Simon, often in collaboration with Allan Newell, proposed that scientific thinking is a form of problem solving. He proposed that problem solving is a search in a problem space. Newell and Simon's theory of problem solving is discussed in many places in this handbook, usually in the context of specific problems (see especially Bassok & Novick, Chapter 21 ). Herbert Simon, however, devoted considerable time to understanding many different scientific discoveries and scientific reasoning processes. The common thread in his research was that scientific thinking and discovery is not a mysterious magical process but a process of problem solving in which clear heuristics are used. Simon's goal was to articulate the heuristics that scientists use in their research at a fine-grained level. By constructing computer programs that simulated the process of several major scientific discoveries, Simon and colleagues were able to articulate the specific computations that scientists could have used in making those discoveries (Langley, Simon, Bradshaw, & Zytkow, 1987 ; see section on “Computational Approaches to Scientific Thinking”). Particularly influential was Simon and Lea's ( 1974 ) work demonstrating that concept formation and induction consist of a search in two problem spaces: a space of instances and a space of rules. This idea has influenced problem-solving accounts of scientific thinking that will be discussed in the next section.

Overall, the work of Bruner, Wason, and Simon laid the foundations for contemporary research on scientific thinking. Early research on scientific thinking is summarized in Tweney, Doherty and Mynatt's 1981 book On Scientific Thinking , where they sketched out many of the themes that have dominated research on scientific thinking over the past few decades. Other more recent books such as Cognitive Models of Science (Giere, 1993 ), Exploring Science (Klahr, 2000 ), Cognitive Basis of Science (Carruthers, Stich, & Siegal, 2002 ), and New Directions in Scientific and Technical Thinking (Gorman, Kincannon, Gooding, & Tweney, 2004 ) provide detailed analyses of different aspects of scientific discovery. Another important collection is Vosnadiau's handbook on conceptual change research (Vosniadou, 2008 ). In this chapter, we discuss the main approaches that have been used to investigate scientific thinking.

How does one go about investigating the many different aspects of scientific thinking? One common approach to the study of the scientific mind has been to investigate several key aspects of scientific thinking using abstract tasks designed to mimic some essential characteristics of “real-world” science. There have been numerous methodologies that have been used to analyze the genesis of scientific concepts, theories, hypotheses, and experiments. Researchers have used experiments, verbal protocols, computer programs, and analyzed particular scientific discoveries. A more recent development has been to increase the ecological validity of such research by investigating scientists as they reason “live” (in vivo studies of scientific thinking) in their own laboratories (Dunbar, 1995 , 2002 ). From a “Thinking and Reasoning” standpoint the major aspects of scientific thinking that have been most actively investigated are problem solving, analogical reasoning, hypothesis testing, conceptual change, collaborative reasoning, inductive reasoning, and deductive reasoning.

Scientific Thinking as Problem Solving

One of the primary goals of accounts of scientific thinking has been to provide an overarching framework to understand the scientific mind. One framework that has had a great influence in cognitive science is that scientific thinking and scientific discovery can be conceived as a form of problem solving. As noted in the opening section of this chapter, Simon ( 1977 ; Simon, Langley, & Bradshaw, 1981 ) argued that both scientific thinking in general and problem solving in particular could be thought of as a search in a problem space. A problem space consists of all the possible states of a problem and all the operations that a problem solver can use to get from one state to the next. According to this view, by characterizing the types of representations and procedures that people use to get from one state to another it is possible to understand scientific thinking. Thus, scientific thinking can be characterized as a search in various problem spaces (Simon, 1977 ). Simon investigated a number of scientific discoveries by bringing participants into the laboratory, providing the participants with the data that a scientist had access to, and getting the participants to reason about the data and rediscover a scientific concept. He then analyzed the verbal protocols that participants generated and mapped out the types of problem spaces that the participants search in (e.g., Qin & Simon, 1990 ). Kulkarni and Simon ( 1988 ) used a more historical approach to uncover the problem-solving heuristics that Krebs used in his discovery of the urea cycle. Kulkarni and Simon analyzed Krebs's diaries and proposed a set of problem-solving heuristics that he used in his research. They then built a computer program incorporating the heuristics and biological knowledge that Krebs had before he made his discoveries. Of particular importance are the search heuristics that the program uses, which include experimental proposal heuristics and data interpretation heuristics. A key heuristic was an unusualness heuristic that focused on unusual findings, which guided search through a space of theories and a space of experiments.

Klahr and Dunbar ( 1988 ) extended the search in a problem space approach and proposed that scientific thinking can be thought of as a search through two related spaces: an hypothesis space and an experiment space. Each problem space that a scientist uses will have its own types of representations and operators used to change the representations. Search in the hypothesis space constrains search in the experiment space. Klahr and Dunbar found that some participants move from the hypothesis space to the experiment space, whereas others move from the experiment space to the hypothesis space. These different types of searches lead to the proposal of different types of hypotheses and experiments. More recent work has extended the dual-space approach to include alternative problem-solving spaces, including those for data, instrumentation, and domain-specific knowledge (Klahr & Simon, 1999 ; Schunn & Klahr, 1995 , 1996 ).

Scientific Thinking as Hypothesis Testing

Many researchers have regarded testing specific hypotheses predicted by theories as one of the key attributes of scientific thinking. Hypothesis testing is the process of evaluating a proposition by collecting evidence regarding its truth. Experimental cognitive research on scientific thinking that specifically examines this issue has tended to fall into two broad classes of investigations. The first class is concerned with the types of reasoning that lead scientists astray, thus blocking scientific ingenuity. A large amount of research has been conducted on the potentially faulty reasoning strategies that both participants in experiments and scientists use, such as considering only one favored hypothesis at a time and how this prevents the scientists from making discoveries. The second class is concerned with uncovering the mental processes underlying the generation of new scientific hypotheses and concepts. This research has tended to focus on the use of analogy and imagery in science, as well as the use of specific types of problem-solving heuristics.

Turning first to investigations of what diminishes scientific creativity, philosophers, historians, and experimental psychologists have devoted a considerable amount of research to “confirmation bias.” This occurs when scientists only consider one hypothesis (typically the favored hypothesis) and ignore other alternative hypotheses or potentially relevant hypotheses. This important phenomenon can distort the design of experiments, formulation of theories, and interpretation of data. Beginning with the work of Wason ( 1968 ) and as discussed earlier, researchers have repeatedly shown that when participants are asked to design an experiment to test a hypothesis they will predominantly design experiments that they think will yield results consistent with the hypothesis. Using the 2-4-6 task mentioned earlier, Klayman and Ha ( 1987 ) showed that in situations where one's hypothesis is likely to be confirmed, seeking confirmation is a normatively incorrect strategy, whereas when the probability of confirming one's hypothesis is low, then attempting to confirm one's hypothesis can be an appropriate strategy. Historical analyses by Tweney ( 1989 ), concerning the way that Faraday made his discoveries, and experiments investigating people testing hypotheses, have revealed that people use a confirm early, disconfirm late strategy: When people initially generate or are given hypotheses, they try and gather evidence that is consistent with the hypothesis. Once enough evidence has been gathered, then people attempt to find the boundaries of their hypothesis and often try to disconfirm their hypotheses.

In an interesting variant on the confirmation bias paradigm, Gorman ( 1989 ) showed that when participants are told that there is the possibility of error in the data that they receive, participants assume that any data that are inconsistent with their favored hypothesis are due to error. Thus, the possibility of error “insulates” hypotheses against disconfirmation. This intriguing hypothesis has not been confirmed by other researchers (Penner & Klahr, 1996 ), but it is an intriguing hypothesis that warrants further investigation.

Confirmation bias is very difficult to overcome. Even when participants are asked to consider alternate hypotheses, they will often fail to conduct experiments that could potentially disconfirm their hypothesis. Tweney and his colleagues provide an excellent overview of this phenomenon in their classic monograph On Scientific Thinking (1981). The precise reasons for this type of block are still widely debated. Researchers such as Michael Doherty have argued that working memory limitations make it difficult for people to consider more than one hypothesis. Consistent with this view, Dunbar and Sussman ( 1995 ) have shown that when participants are asked to hold irrelevant items in working memory while testing hypotheses, the participants will be unable to switch hypotheses in the face of inconsistent evidence. While working memory limitations are involved in the phenomenon of confirmation bias, even groups of scientists can also display confirmation bias. For example, the controversy over cold fusion is an example of confirmation bias. Here, large groups of scientists had other hypotheses available to explain their data yet maintained their hypotheses in the face of other more standard alternative hypotheses. Mitroff ( 1974 ) provides some interesting examples of NASA scientists demonstrating confirmation bias, which highlight the roles of commitment and motivation in this process. See also MacPherson and Stanovich ( 2007 ) for specific strategies that can be used to overcome confirmation bias.

Causal Thinking in Science

Much of scientific thinking and scientific theory building pertains to the development of causal models between variables of interest. For example, do vaccines cause illnesses? Do carbon dioxide emissions cause global warming? Does water on a planet indicate that there is life on the planet? Scientists and nonscientists alike are constantly bombarded with statements regarding the causal relationship between such variables. How does one evaluate the status of such claims? What kinds of data are informative? How do scientists and nonscientists deal with data that are inconsistent with their theory?

A central issue in the causal reasoning literature, one that is directly relevant to scientific thinking, is the extent to which scientists and nonscientists alike are governed by the search for causal mechanisms (i.e., how a variable works) versus the search for statistical data (i.e., how often variables co-occur). This dichotomy can be boiled down to the search for qualitative versus quantitative information about the paradigm the scientist is investigating. Researchers from a number of cognitive psychology laboratories have found that people prefer to gather more information about an underlying mechanism than covariation between a cause and an effect (e.g., Ahn, Kalish, Medin, & Gelman, 1995 ). That is, the predominant strategy that students in simulations of scientific thinking use is to gather as much information as possible about how the objects under investigation work, rather than collecting large amounts of quantitative data to determine whether the observations hold across multiple samples. These findings suggest that a central component of scientific thinking may be to formulate explicit mechanistic causal models of scientific events.

One type of situation in which causal reasoning has been observed extensively is when scientists obtain unexpected findings. Both historical and naturalistic research has revealed that reasoning causally about unexpected findings plays a central role in science. Indeed, scientists themselves frequently state that a finding was due to chance or was unexpected. Given that claims of unexpected findings are such a frequent component of scientists' autobiographies and interviews in the media, Dunbar ( 1995 , 1997 , 1999 ; Dunbar & Fugelsang, 2005 ; Fugelsang, Stein, Green, & Dunbar, 2004 ) decided to investigate the ways that scientists deal with unexpected findings. In 1991–1992 Dunbar spent 1 year in three molecular biology laboratories and one immunology laboratory at a prestigious U.S. university. He used the weekly laboratory meeting as a source of data on scientific discovery and scientific reasoning. (He termed this type of study “in vivo” cognition.) When he looked at the types of findings that the scientists made, he found that over 50% of the findings were unexpected and that these scientists had evolved a number of effective strategies for dealing with such findings. One clear strategy was to reason causally about the findings: Scientists attempted to build causal models of their unexpected findings. This causal model building results in the extensive use of collaborative reasoning, analogical reasoning, and problem-solving heuristics (Dunbar, 1997 , 2001 ).

Many of the key unexpected findings that scientists reasoned about in the in vivo studies of scientific thinking were inconsistent with the scientists' preexisting causal models. A laboratory equivalent of the biology labs involved creating a situation in which students obtained unexpected findings that were inconsistent with their preexisting theories. Dunbar and Fugelsang ( 2005 ) examined this issue by creating a scientific causal thinking simulation where experimental outcomes were either expected or unexpected. Dunbar ( 1995 ) has called the study of people reasoning in a cognitive laboratory “in vitro” cognition. These investigators found that students spent considerably more time reasoning about unexpected findings than expected findings. In addition, when assessing the overall degree to which their hypothesis was supported or refuted, participants spent the majority of their time considering unexpected findings. An analysis of participants' verbal protocols indicates that much of this extra time was spent formulating causal models for the unexpected findings. Similarly, scientists spend more time considering unexpected than expected findings, and this time is devoted to building causal models (Dunbar & Fugelsang, 2004 ).

Scientists know that unexpected findings occur often, and they have developed many strategies to take advantage of their unexpected findings. One of the most important places that they anticipate the unexpected is in designing experiments (Baker & Dunbar, 2000 ). They build different causal models of their experiments incorporating many conditions and controls. These multiple conditions and controls allow unknown mechanisms to manifest themselves. Thus, rather than being the victims of the unexpected, they create opportunities for unexpected events to occur, and once these events do occur, they have causal models that allow them to determine exactly where in the causal chain their unexpected finding arose. The results of these in vivo and in vitro studies all point to a more complex and nuanced account of how scientists and nonscientists alike test and evaluate hypotheses about theories.

The Roles of Inductive, Abductive, and Deductive Thinking in Science

One of the most basic characteristics of science is that scientists assume that the universe that we live in follows predictable rules. Scientists reason using a variety of different strategies to make new scientific discoveries. Three frequently used types of reasoning strategies that scientists use are inductive, abductive, and deductive reasoning. In the case of inductive reasoning, a scientist may observe a series of events and try to discover a rule that governs the event. Once a rule is discovered, scientists can extrapolate from the rule to formulate theories of observed and yet-to-be-observed phenomena. One example is the discovery using inductive reasoning that a certain type of bacterium is a cause of many ulcers (Thagard, 1999 ). In a fascinating series of articles, Thagard documented the reasoning processes that Marshall and Warren went through in proposing this novel hypothesis. One key reasoning process was the use of induction by generalization. Marshall and Warren noted that almost all patients with gastric entritis had a spiral bacterium in their stomachs, and he formed the generalization that this bacterium is the cause of stomach ulcers. There are numerous other examples of induction by generalization in science, such as Tycho De Brea's induction about the motion of planets from his observations, Dalton's use of induction in chemistry, and the discovery of prions as the source of mad cow disease. Many theories of induction have used scientific discovery and reasoning as examples of this important reasoning process.

Another common type of inductive reasoning is to map a feature of one member of a category to another member of a category. This is called categorical induction. This type of induction is a way of projecting a known property of one item onto another item that is from the same category. Thus, knowing that the Rous Sarcoma virus is a retrovirus that uses RNA rather than DNA, a biologist might assume that another virus that is thought to be a retrovirus also uses RNA rather than DNA. While research on this type of induction typically has not been discussed in accounts of scientific thinking, this type of induction is common in science. For an influential contribution to this literature, see Smith, Shafir, and Osherson ( 1993 ), and for reviews of this literature see Heit ( 2000 ) and Medin et al. (Chapter 11 ).

While less commonly mentioned than inductive reasoning, abductive reasoning is an important form of reasoning that scientists use when they are seeking to propose explanations for events such as unexpected findings (see Lombrozo, Chapter 14 ; Magnani, et al., 2010 ). In Figure 35.1 , taken from King ( 2011 ), the differences between inductive, abductive, and deductive thinking are highlighted. In the case of abduction, the reasoner attempts to generate explanations of the form “if situation X had occurred, could it have produced the current evidence I am attempting to interpret?” (For an interesting of analysis of abductive reasoning see the brief paper by Klahr & Masnick, 2001 ). Of course, as in classical induction, such reasoning may produce a plausible account that is still not the correct one. However, abduction does involve the generation of new knowledge, and is thus also related to research on creativity.

The different processes underlying inductive, abductive, and deductive reasoning in science. (Figure reproduced from King 2011 ).)

Turning now to deductive thinking, many thinking processes that scientists adhere to follow traditional rules of deductive logic. These processes correspond to those conditions in which a hypothesis may lead to, or is deducible to, a conclusion. Though they are not always phrased in syllogistic form, deductive arguments can be phrased as “syllogisms,” or as brief, mathematical statements in which the premises lead to the conclusion. Deductive reasoning is an extremely important aspect of scientific thinking because it underlies a large component of how scientists conduct their research. By looking at many scientific discoveries, we can often see that deductive reasoning is at work. Deductive reasoning statements all contain information or rules that state an assumption about how the world works, as well as a conclusion that would necessarily follow from the rule. Numerous discoveries in physics such as the discovery of dark matter by Vera Rubin are based on deductions. In the dark matter case, Rubin measured galactic rotation curves and based on the differences between the predicted and observed angular motions of galaxies she deduced that the structure of the universe was uneven. This led her to propose that dark matter existed. In contemporary physics the CERN Large Hadron Collider is being used to search for the Higgs Boson. The Higgs Boson is a deductive prediction from contemporary physics. If the Higgs Boson is not found, it may lead to a radical revision of the nature of physics and a new understanding of mass (Hecht, 2011 ).

The Roles of Analogy in Scientific Thinking

One of the most widely mentioned reasoning processes used in science is analogy. Scientists use analogies to form a bridge between what they already know and what they are trying to explain, understand, or discover. In fact, many scientists have claimed that the making of certain analogies was instrumental in their making a scientific discovery, and almost all scientific autobiographies and biographies feature one particular analogy that is discussed in depth. Coupled with the fact that there has been an enormous research program on analogical thinking and reasoning (see Holyoak, Chapter 13 ), we now have a number of models and theories of analogical reasoning that suggest how analogy can play a role in scientific discovery (see Gentner, Holyoak, & Kokinov, 2001 ). By analyzing several major discoveries in the history of science, Thagard and Croft ( 1999 ), Nersessian ( 1999 , 2008 ), and Gentner and Jeziorski ( 1993 ) have all shown that analogical reasoning is a key aspect of scientific discovery.

Traditional accounts of analogy distinguish between two components of analogical reasoning: the target and the source (Holyoak, Chapter 13 ; Gentner 2010 ). The target is the concept or problem that a scientist is attempting to explain or solve. The source is another piece of knowledge that the scientist uses to understand the target or to explain the target to others. What the scientist does when he or she makes an analogy is to map features of the source onto features of the target. By mapping the features of the source onto the target, new features of the target may be discovered, or the features of the target may be rearranged so that a new concept is invented and a scientific discovery is made. For example, a common analogy that is used with computers is to describe a harmful piece of software as a computer virus. Once a piece of software is called a virus, people can map features of biological viruses, such as that it is small, spreads easily, self-replicates using a host, and causes damage. People not only map individual features of the source onto the target but also the systems of relations. For example, if a computer virus is similar to a biological virus, then an immune system can be created on computers that can protect computers from future variants of a virus. One of the reasons that scientific analogy is so powerful is that it can generate new knowledge, such as the creation of a computational immune system having many of the features of a real biological immune system. This analogy also leads to predictions that there will be newer computer viruses that are the computational equivalent of retroviruses, lacking DNA, or standard instructions, that will elude the computational immune system.

The process of making an analogy involves a number of key steps: retrieval of a source from memory, aligning the features of the source with those of the target, mapping features of the source onto those of the target, and possibly making new inferences about the target. Scientific discoveries are made when the source highlights a hitherto unknown feature of the target or restructures the target into a new set of relations. Interestingly, research on analogy has shown that participants do not easily use remote analogies (see Gentner et al., 1997 ; Holyoak & Thagard 1995 ). Participants in experiments tend to focus on the sharing of a superficial feature between the source and the target, rather than the relations among features. In his in vivo studies of science, Dunbar ( 1995 , 2001 , 2002 ) investigated the ways that scientists use analogies while they are conducting their research and found that scientists use both relational and superficial features when they make analogies. Whether they use superficial or relational features depends on their goals. If their goal is to fix a problem in an experiment, their analogies are based upon superficial features. However, if their goal is to formulate hypotheses, they focus on analogies based upon sets of relations. One important difference between scientists and participants in experiments is that the scientists have deep relational knowledge of the processes that they are investigating and can hence use this relational knowledge to make analogies (see Holyoak, Chapter 13 for a thorough review of analogical reasoning).

Are scientific analogies always useful? Sometimes analogies can lead scientists and students astray. For example, Evelyn Fox-Keller ( 1985 ) shows how an analogy between the pulsing of a lighthouse and the activity of the slime mold dictyostelium led researchers astray for a number of years. Likewise, the analogy between the solar system (the source) and the structure of the atom (the target) has been shown to be potentially misleading to students taking more advanced courses in physics or chemistry. The solar system analogy has a number of misalignments to the structure of the atom, such as electrons being repelled from each other rather than attracted; moreover, electrons do not have individual orbits like planets but have orbit clouds of electron density. Furthermore, students have serious misconceptions about the nature of the solar system, which can compound their misunderstanding of the nature of the atom (Fischler & Lichtfeld, 1992 ). While analogy is a powerful tool in science, like all forms of induction, incorrect conclusions can be reached.

Conceptual Change in Science

Scientific knowledge continually accumulates as scientists gather evidence about the natural world. Over extended time, this knowledge accumulation leads to major revisions, extensions, and new organizational forms for expressing what is known about nature. Indeed, these changes are so substantial that philosophers of science speak of “revolutions” in a variety of scientific domains (Kuhn, 1962 ). The psychological literature that explores the idea of revolutionary conceptual change can be roughly divided into (a) investigations of how scientists actually make discoveries and integrate those discoveries into existing scientific contexts, and (b) investigations of nonscientists ranging from infants, to children, to students in science classes. In this section we summarize the adult studies of conceptual change, and in the next section we look at its developmental aspects.

Scientific concepts, like all concepts, can be characterized as containing a variety of “knowledge elements”: representations of words, thoughts, actions, objects, and processes. At certain points in the history of science, the accumulated evidence has demanded major shifts in the way these collections of knowledge elements are organized. This “radical conceptual change” process (see Keil, 1999 ; Nersessian 1998 , 2002 ; Thagard, 1992 ; Vosniadou 1998, for reviews) requires the formation of a new conceptual system that organizes knowledge in new ways, adds new knowledge, and results in a very different conceptual structure. For more recent research on conceptual change, The International Handbook of Research on Conceptual Change (Vosniadou, 2008 ) provides a detailed compendium of theories and controversies within the field.

While conceptual change in science is usually characterized by large-scale changes in concepts that occur over extensive periods of time, it has been possible to observe conceptual change using in vivo methodologies. Dunbar ( 1995 ) reported a major conceptual shift that occurred in immunologists, where they obtained a series of unexpected findings that forced the scientists to propose a new concept in immunology that in turn forced the change in other concepts. The drive behind this conceptual change was the discovery of a series of different unexpected findings or anomalies that required the scientists to both revise and reorganize their conceptual knowledge. Interestingly, this conceptual change was achieved by a group of scientists reasoning collaboratively, rather than by a scientist working alone. Different scientists tend to work on different aspects of concepts, and also different concepts, that when put together lead to a rapid change in entire conceptual structures.

Overall, accounts of conceptual change in individuals indicate that it is indeed similar to that of conceptual change in entire scientific fields. Individuals need to be confronted with anomalies that their preexisting theories cannot explain before entire conceptual structures are overthrown. However, replacement conceptual structures have to be generated before the old conceptual structure can be discarded. Sometimes, people do not overthrow their original conceptual theories and through their lives maintain their original views of many fundamental scientific concepts. Whether people actively possess naive theories, or whether they appear to have a naive theory because of the demand characteristics of the testing context, is a lively source of debate within the science education community (see Gupta, Hammer, & Redish, 2010 ).

Scientific Thinking in Children

Well before their first birthday, children appear to know several fundamental facts about the physical world. For example, studies with infants show that they behave as if they understand that solid objects endure over time (e.g., they don't just disappear and reappear, they cannot move through each other, and they move as a result of collisions with other solid objects or the force of gravity (Baillargeon, 2004 ; Carey 1985 ; Cohen & Cashon, 2006 ; Duschl, Schweingruber, & Shouse, 2007 ; Gelman & Baillargeon, 1983 ; Gelman & Kalish, 2006 ; Mandler, 2004 ; Metz 1995 ; Munakata, Casey, & Diamond, 2004 ). And even 6-month-olds are able to predict the future location of a moving object that they are attempting to grasp (Von Hofsten, 1980 ; Von Hofsten, Feng, & Spelke, 2000 ). In addition, they appear to be able to make nontrivial inferences about causes and their effects (Gopnik et al., 2004 ).

The similarities between children's thinking and scientists' thinking have an inherent allure and an internal contradiction. The allure resides in the enthusiastic wonder and openness with which both children and scientists approach the world around them. The paradox comes from the fact that different investigators of children's thinking have reached diametrically opposing conclusions about just how “scientific” children's thinking really is. Some claim support for the “child as a scientist” position (Brewer & Samarapungavan, 1991 ; Gelman & Wellman, 1991 ; Gopnik, Meltzoff, & Kuhl, 1999 ; Karmiloff-Smith 1988 ; Sodian, Zaitchik, & Carey, 1991 ; Samarapungavan 1992 ), while others offer serious challenges to the view (Fay & Klahr, 1996 ; Kern, Mirels, & Hinshaw, 1983 ; Kuhn, Amsel, & O'Laughlin, 1988 ; Schauble & Glaser, 1990 ; Siegler & Liebert, 1975 .) Such fundamentally incommensurate conclusions suggest that this very field—children's scientific thinking—is ripe for a conceptual revolution!

A recent comprehensive review (Duschl, Schweingruber, & Shouse, 2007 ) of what children bring to their science classes offers the following concise summary of the extensive developmental and educational research literature on children's scientific thinking:

Children entering school already have substantial knowledge of the natural world, much of which is implicit.

What children are capable of at a particular age is the result of a complex interplay among maturation, experience, and instruction. What is developmentally appropriate is not a simple function of age or grade, but rather is largely contingent on children's prior opportunities to learn.

Students' knowledge and experience play a critical role in their science learning, influencing four aspects of science understanding, including (a) knowing, using, and interpreting scientific explanations of the natural world; (b) generating and evaluating scientific evidence and explanations, (c) understanding how scientific knowledge is developed in the scientific community, and (d) participating in scientific practices and discourse.

Students learn science by actively engaging in the practices of science.

In the previous section of this article we discussed conceptual change with respect to scientific fields and undergraduate science students. However, the idea that children undergo radical conceptual change in which old “theories” need to be overthrown and reorganized has been a central topic in understanding changes in scientific thinking in both children and across the life span. This radical conceptual change is thought to be necessary for acquiring many new concepts in physics and is regarded as the major source of difficulty for students. The factors that are at the root of this conceptual shift view have been difficult to determine, although there have been a number of studies in cognitive development (Carey, 1985 ; Chi 1992 ; Chi & Roscoe, 2002 ), in the history of science (Thagard, 1992 ), and in physics education (Clement, 1982 ; Mestre 1991 ) that give detailed accounts of the changes in knowledge representation that occur while people switch from one way of representing scientific knowledge to another.

One area where students show great difficulty in understanding scientific concepts is physics. Analyses of students' changing conceptions, using interviews, verbal protocols, and behavioral outcome measures, indicate that large-scale changes in students' concepts occur in physics education (see McDermott & Redish, 1999 , for a review of this literature). Following Kuhn ( 1962 ), many researchers, but not all, have noted that students' changing conceptions resemble the sequences of conceptual changes in physics that have occurred in the history of science. These notions of radical paradigm shifts and ensuing incompatibility with past knowledge-states have called attention to interesting parallels between the development of particular scientific concepts in children and in the history of physics. Investigations of nonphysicists' understanding of motion indicate that students have extensive misunderstandings of motion. Some researchers have interpreted these findings as an indication that many people hold erroneous beliefs about motion similar to a medieval “impetus” theory (McCloskey, Caramazza, & Green, 1980 ). Furthermore, students appear to maintain “impetus” notions even after one or two courses in physics. In fact, some authors have noted that students who have taken one or two courses in physics can perform worse on physics problems than naive students (Mestre, 1991 ). Thus, it is only after extensive learning that we see a conceptual shift from impetus theories of motion to Newtonian scientific theories.

How one's conceptual representation shifts from “naive” to Newtonian is a matter of contention, as some have argued that the shift involves a radical conceptual change, whereas others have argued that the conceptual change is not really complete. For example, Kozhevnikov and Hegarty ( 2001 ) argue that much of the naive impetus notions of motion are maintained at the expense of Newtonian principles even with extensive training in physics. However, they argue that such impetus principles are maintained at an implicit level. Thus, although students can give the correct Newtonian answer to problems, their reaction times to respond indicate that they are also using impetus theories when they respond. An alternative view of conceptual change focuses on whether there are real conceptual changes at all. Gupta, Hammer and Redish ( 2010 ) and Disessa ( 2004 ) have conducted detailed investigations of changes in physics students' accounts of phenomena covered in elementary physics courses. They have found that rather than students possessing a naive theory that is replaced by the standard theory, many introductory physics students have no stable physical theory but rather construct their explanations from elementary pieces of knowledge of the physical world.

Computational Approaches to Scientific Thinking

Computational approaches have provided a more complete account of the scientific mind. Computational models provide specific detailed accounts of the cognitive processes underlying scientific thinking. Early computational work consisted of taking a scientific discovery and building computational models of the reasoning processes involved in the discovery. Langley, Simon, Bradshaw, and Zytkow ( 1987 ) built a series of programs that simulated discoveries such as those of Copernicus, Bacon, and Stahl. These programs had various inductive reasoning algorithms built into them, and when given the data that the scientists used, they were able to propose the same rules. Computational models make it possible to propose detailed models of the cognitive subcomponents of scientific thinking that specify exactly how scientific theories are generated, tested, and amended (see Darden, 1997 , and Shrager & Langley, 1990 , for accounts of this branch of research). More recently, the incorporation of scientific knowledge into computer programs has resulted in a shift in emphasis from using programs to simulate discoveries to building programs that are used to help scientists make discoveries. A number of these computer programs have made novel discoveries. For example, Valdes-Perez ( 1994 ) has built systems for discoveries in chemistry, and Fajtlowicz has done this in mathematics (Erdos, Fajtlowicz, & Staton, 1991 ).

These advances in the fields of computer discovery have led to new fields, conferences, journals, and even departments that specialize in the development of programs devised to search large databases in the hope of making new scientific discoveries (Langley, 2000 , 2002 ). This process is commonly known as “data mining.” This approach has only proved viable relatively recently, due to advances in computer technology. Biswal et al. ( 2010 ), Mitchell ( 2009 ), and Yang ( 2009 ) provide recent reviews of data mining in different scientific fields. Data mining is at the core of drug discovery, our understanding of the human genome, and our understanding of the universe for a number of reasons. First, vast databases concerning drug actions, biological processes, the genome, the proteome, and the universe itself now exist. Second, the development of high throughput data-mining algorithms makes it possible to search for new drug targets, novel biological mechanisms, and new astronomical phenomena in relatively short periods of time. Research programs that took decades, such as the development of penicillin, can now be done in days (Yang, 2009 ).

Another recent shift in the use of computers in scientific discovery has been to have both computers and people make discoveries together, rather than expecting that computers make an entire scientific discovery. Now instead of using computers to mimic the entire scientific discovery process as used by humans, computers can use powerful algorithms that search for patterns on large databases and provide the patterns to humans who can then use the output of these computers to make discoveries, ranging from the human genome to the structure of the universe. However, there are some robots such as ADAM, developed by King ( 2011 ), that can actually perform the entire scientific process, from the generation of hypotheses, to the conduct of experiments and the interpretation of results, with little human intervention. The ongoing development of scientific robots by some scientists (King et al., 2009 ) thus continues the tradition started by Herbert Simon in the 1960s. However, many of the controversies as to whether the robot is a “real scientist” or not continue to the present (Evans & Rzhetsky, 2010 , Gianfelici, 2010 ; Haufe, Elliott, Burian, & O' Malley, 2010 ; O'Malley 2011 ).

Scientific Thinking and Science Education

Accounts of the nature of science and research on scientific thinking have had profound effects on science education along many levels, particularly in recent years. Science education from the 1900s until the 1970s was primarily concerned with teaching students both the content of science (such as Newton's laws of motion) or the methods that scientists need to use in their research (such as using experimental and control groups). Beginning in the 1980s, a number of reports (e.g., American Association for the Advancement of Science, 1993; National Commission on Excellence in Education, 1983; Rutherford & Ahlgren, 1991 ) stressed the need for teaching scientific thinking skills rather than just methods and content. The addition of scientific thinking skills to the science curriculum from kindergarten through adulthood was a major shift in focus. Many of the particular scientific thinking skills that have been emphasized are skills covered in previous sections of this chapter, such as teaching deductive and inductive thinking strategies. However, rather than focusing on one particular skill, such as induction, researchers in education have focused on how the different components of scientific thinking are put together in science. Furthermore, science educators have focused upon situations where science is conducted collaboratively, rather than being the product of one person thinking alone. These changes in science education parallel changes in methodologies used to investigate science, such as analyzing the ways that scientists think and reason in their laboratories.

By looking at science as a complex multilayered and group activity, many researchers in science education have adopted a constructivist approach. This approach sees learning as an active rather than a passive process, and it suggests that students learn through constructing their scientific knowledge. We will first describe a few examples of the constructivist approach to science education. Following that, we will address several lines of work that challenge some of the assumptions of the constructivist approach to science education.

Often the goal of constructivist science education is to produce conceptual change through guided instruction where the teacher or professor acts as a guide to discovery, rather than the keeper of all the facts. One recent and influential approach to science education is the inquiry-based learning approach. Inquiry-based learning focuses on posing a problem or a puzzling event to students and asking them to propose a hypothesis that could explain the event. Next, the student is asked to collect data that test the hypothesis, make conclusions, and then reflect upon both the original problem and the thought processes that they used to solve the problem. Often students use computers that aid in their construction of new knowledge. The computers allow students to learn many of the different components of scientific thinking. For example, Reiser and his colleagues have developed a learning environment for biology, where students are encouraged to develop hypotheses in groups, codify the hypotheses, and search databases to test these hypotheses (Reiser et al., 2001 ).

One of the myths of science is the lone scientist suddenly shouting “Eureka, I have made a discovery!” Instead, in vivo studies of scientists (e.g., Dunbar, 1995 , 2002 ), historical analyses of scientific discoveries (Nersessian, 1999 ), and studies of children learning science at museums have all pointed to collaborative scientific discovery mechanisms as being one of the driving forces of science (Atkins et al., 2009 ; Azmitia & Crowley, 2001 ). What happens during collaborative scientific thinking is that there is usually a triggering event, such as an unexpected result or situation that a student does not understand. This results in other members of the group adding new information to the person's representation of knowledge, often adding new inductions and deductions that both challenge and transform the reasoner's old representations of knowledge (Chi & Roscoe, 2002 ; Dunbar 1998 ). Social mechanisms play a key component in fostering changes in concepts that have been ignored in traditional cognitive research but are crucial for both science and science education. In science education there has been a shift to collaborative learning, particularly at the elementary level; however, in university education, the emphasis is still on the individual scientist. As many domains of science now involve collaborations across scientific disciplines, we expect the explicit teaching of heuristics for collaborative science to increase.

What is the best way to teach and learn science? Surprisingly, the answer to this question has been difficult to uncover. For example, toward the end of the last century, influenced by several thinkers who advocated a constructivist approach to learning, ranging from Piaget (Beilin, 1994 ) to Papert ( 1980 ), many schools answered this question by adopting a philosophy dubbed “discovery learning.” Although a clear operational definition of this approach has yet to be articulated, the general idea is that children are expected to learn science by reconstructing the processes of scientific discovery—in a range of areas from computer programming to chemistry to mathematics. The premise is that letting students discover principles on their own, set their own goals, and collaboratively explore the natural world produces deeper knowledge that transfers widely.

The research literature on science education is far from consistent in its use of terminology. However, our reading suggests that “discovery learning” differs from “inquiry-based learning” in that few, if any, guidelines are given to students in discovery learning contexts, whereas in inquiry learning, students are given hypotheses and specific goals to achieve (see the second paragraph of this section for a definition of inquiry-based learning). Even though thousands of schools have adopted discovery learning as an alternative to more didactic approaches to teaching and learning, the evidence showing that it is more effective than traditional, direct, teacher-controlled instructional approaches is mixed, at best (Lorch et al., 2010 ; Minner, Levy, & Century, 2010 ). In several cases where the distinctions between direct instruction and more open-ended constructivist instruction have been clearly articulated, implemented, and assessed, direct instruction has proven to be superior to the alternatives (Chen & Klahr, 1999 ; Toth, Klahr, & Chen, 2000 ). For example, in a study of third- and fourth-grade children learning about experimental design, Klahr and Nigam ( 2004 ) found that many more children learned from direct instruction than from discovery learning. Furthermore, they found that among the few children who did manage to learn from a discovery method, there was no better performance on a far transfer test of scientific reasoning than that observed for the many children who learned from direct instruction.

The idea of children learning most of their science through a process of self-directed discovery has some romantic appeal, and it may accurately describe the personal experience of a handful of world-class scientists. However, the claim has generated some contentious disagreements (Kirschner, Sweller, & Clark, 2006 ; Klahr, 2010 ; Taber 2009 ; Tobias & Duffy, 2009 ), and the jury remains out on the extent to which most children can learn science that way.

Conclusions and Future Directions

The field of scientific thinking is now a thriving area of research with strong underpinnings in cognitive psychology and cognitive science. In recent years, a new professional society has been formed that aims to facilitate this integrative and interdisciplinary approach to the psychology of science, with its own journal and regular professional meetings. 1 Clearly the relations between these different aspects of scientific thinking need to be combined in order to produce a truly comprehensive picture of the scientific mind.

While much is known about certain aspects of scientific thinking, much more remains to be discovered. In particular, there has been little contact between cognitive, neuroscience, social, personality, and motivational accounts of scientific thinking. Research in thinking and reasoning has been expanded to use the methods and theories of cognitive neuroscience (see Morrison & Knowlton, Chapter 6 ). A similar approach can be taken in exploring scientific thinking (see Dunbar et al., 2007 ). There are two main reasons for taking a neuroscience approach to scientific thinking. First, functional neuroimaging allows the researcher to look at the entire human brain, making it possible to see the many different sites that are involved in scientific thinking and gain a more complete understanding of the entire range of mechanisms involved in this type of thought. Second, these brain-imaging approaches allow researchers to address fundamental questions in research on scientific thinking, such as the extent to which ordinary thinking in nonscientific contexts and scientific thinking recruit similar versus disparate neural structures of the brain.

Dunbar ( 2009 ) has used some novel methods to explore Simon's assertion, cited at the beginning of this chapter, that scientific thinking uses the same cognitive mechanisms that all human beings possess (rather than being an entirely different type of thinking) but combines them in ways that are specific to a particular aspect of science or a specific discipline of science. For example, Fugelsang and Dunbar ( 2009 ) compared causal reasoning when two colliding circular objects were labeled balls or labeled subatomic particles. They obtained different brain activation patterns depending on whether the stimuli were labeled balls or subatomic particles. In another series of experiments, Dunbar and colleagues used functional magnetic resonance imaging (fMRI) to study patterns of activation in the brains of students who have and who have not undergone conceptual change in physics. For example, Fugelsang and Dunbar ( 2005 ) and Dunbar et al. ( 2007 ) have found differences in the activation of specific brain sites (such as the anterior cingulate) for students when they encounter evidence that is inconsistent with their current conceptual understandings. These initial cognitive neuroscience investigations have the potential to reveal the ways that knowledge is organized in the scientific brain and provide detailed accounts of the nature of the representation of scientific knowledge. Petitto and Dunbar ( 2004 ) proposed the term “educational neuroscience” for the integration of research on education, including science education, with research on neuroscience. However, see Fitzpatrick (in press) for a very different perspective on whether neuroscience approaches are relevant to education. Clearly, research on the scientific brain is just beginning. We as scientists are beginning to get a reasonable grasp of the inner workings of the subcomponents of the scientific mind (i.e., problem solving, analogy, induction). However, great advances remain to be made concerning how these processes interact so that scientific discoveries can be made. Future research will focus on both the collaborative aspects of scientific thinking and the neural underpinnings of the scientific mind.

The International Society for the Psychology of Science and Technology (ISPST). Available at http://www.ispstonline.org/

Ahn, W., Kalish, C. W., Medin, D. L., & Gelman, S. A. ( 1995 ). The role of covariation versus mechanism information in causal attribution.   Cognition , 54 , 299–352.

American Association for the Advancement of Science. ( 1993 ). Benchmarks for scientific literacy . New York: Oxford University Press.

Google Scholar

Google Preview

Atkins, L. J., Velez, L., Goudy, D., & Dunbar, K. N. ( 2009 ). The unintended effects of interactive objects and labels in the science museum.   Science Education , 54 , 161–184.

Azmitia, M. A., & Crowley, K. ( 2001 ). The rhythms of scientific thinking: A study of collaboration in an earthquake microworld. In K. Crowley, C. Schunn, & T. Okada (Eds.), Designing for science: Implications from everyday, classroom, and professional settings (pp. 45–72). Mahwah, NJ: Erlbaum.

Bacon, F. ( 1620 /1854). Novum organum (B. Monatgue, Trans.). Philadelphia, P A: Parry & McMillan.

Baillargeon, R. ( 2004 ). Infants' reasoning about hidden objects: Evidence for event-general and event-specific expectations (article with peer commentaries and response, listed below).   Developmental Science , 54 , 391–424.

Baker, L. M., & Dunbar, K. ( 2000 ). Experimental design heuristics for scientific discovery: The use of baseline and known controls.   International Journal of Human Computer Studies , 54 , 335–349.

Beilin, H. ( 1994 ). Jean Piaget's enduring contribution to developmental psychology. In R. D. Parke, P. A. Ornstein, J. J. Rieser, & C. Zahn-Waxler (Eds.), A century of developmental psychology (pp. 257–290). Washington, DC US: American Psychological Association.

Biswal, B. B., Mennes, M., Zuo, X.-N., Gohel, S., Kelly, C., Smith, S.M., et al. ( 2010 ). Toward discovery science of human brain function.   Proceedings of the National Academy of Sciences of the United States of America , 107, 4734–4739.

Brewer, W. F., & Samarapungavan, A. ( 1991 ). Children's theories vs. scientific theories: Differences in reasoning or differences in knowledge? In R. R. Hoffman & D. S. Palermo (Eds.), Cognition and the symbolic processes: Applied and ecological perspectives (pp. 209–232). Hillsdale, NJ: Erlbaum.

Bruner, J. S., Goodnow, J. J., & Austin, G. A. ( 1956 ). A study of thinking . New York: NY Science Editions.

Carey, S. ( 1985 ). Conceptual change in childhood . Cambridge, MA: MIT Press.

Carruthers, P., Stich, S., & Siegal, M. ( 2002 ). The cognitive basis of science . New York: Cambridge University Press.

Chi, M. ( 1992 ). Conceptual change within and across ontological categories: Examples from learning and discovery in science. In R. Giere (Ed.), Cognitive models of science (pp. 129–186). Minneapolis: University of Minnesota Press.

Chi, M. T. H., & Roscoe, R. D. ( 2002 ). The processes and challenges of conceptual change. In M. Limon & L. Mason (Eds.), Reconsidering conceptual change: Issues in theory and practice (pp 3–27). Amsterdam, Netherlands: Kluwer Academic Publishers.

Chen, Z., & Klahr, D. ( 1999 ). All other things being equal: Children's acquisition of the control of variables strategy.   Child Development , 54 (5), 1098–1120.

Clement, J. ( 1982 ). Students' preconceptions in introductory mechanics.   American Journal of Physics , 54 , 66–71.

Cohen, L. B., & Cashon, C. H. ( 2006 ). Infant cognition. In W. Damon & R. M. Lerner (Series Eds.) & D. Kuhn & R. S. Siegler (Vol. Eds.), Handbook of child psychology. Vol. 2: Cognition, perception, and language (6th ed., pp. 214–251). New York: Wiley.

National Commission on Excellence in Education. ( 1983 ). A nation at risk: The imperative for educational reform . Washington, DC: US Department of Education.

Crick, F. H. C. ( 1988 ). What mad pursuit: A personal view of science . New York: Basic Books.

Darden, L. ( 2002 ). Strategies for discovering mechanisms: Schema instantiation, modular subassembly, forward chaining/backtracking.   Philosophy of Science , 69, S354–S365.

Davenport, J. L., Yaron, D., Klahr, D., & Koedinger, K. ( 2008 ). Development of conceptual understanding and problem solving expertise in chemistry. In B. C. Love, K. McRae, & V. M. Sloutsky (Eds.), Proceedings of the 30th Annual Conference of the Cognitive Science Society (pp. 751–756). Austin, TX: Cognitive Science Society.

diSessa, A. A. ( 2004 ). Contextuality and coordination in conceptual change. In E. Redish & M. Vicentini (Eds.), Proceedings of the International School of Physics “Enrico Fermi:” Research on physics education (pp. 137–156). Amsterdam, Netherlands: ISO Press/Italian Physics Society

Dunbar, K. ( 1995 ). How scientists really reason: Scientific reasoning in real-world laboratories. In R. J. Sternberg, & J. Davidson (Eds.), Mechanisms of insight (pp. 365–395). Cambridge, MA: MIT press.

Dunbar, K. ( 1997 ). How scientists think: Online creativity and conceptual change in science. In T. B. Ward, S. M. Smith, & S. Vaid (Eds.), Conceptual structures and processes: Emergence, discovery and change (pp. 461–494). Washington, DC: American Psychological Association.

Dunbar, K. ( 1998 ). Problem solving. In W. Bechtel & G. Graham (Eds.), A companion to cognitive science (pp. 289–298). London: Blackwell

Dunbar, K. ( 1999 ). The scientist InVivo : How scientists think and reason in the laboratory. In L. Magnani, N. Nersessian, & P. Thagard (Eds.), Model-based reasoning in scientific discovery (pp. 85–100). New York: Plenum.

Dunbar, K. ( 2001 ). The analogical paradox: Why analogy is so easy in naturalistic settings, yet so difficult in the psychology laboratory. In D. Gentner, K. J. Holyoak, & B. Kokinov Analogy: Perspectives from cognitive science (pp. 313–334). Cambridge, MA: MIT press.

Dunbar, K. ( 2002 ). Science as category: Implications of InVivo science for theories of cognitive development, scientific discovery, and the nature of science. In P. Caruthers, S. Stich, & M. Siegel (Eds.) Cognitive models of science (pp. 154–170). New York: Cambridge University Press.

Dunbar, K. ( 2009 ). The biology of physics: What the brain reveals about our physical understanding of the world. In M. Sabella, C. Henderson, & C. Singh. (Eds.), Proceedings of the Physics Education Research Conference (pp. 15–18). Melville, NY: American Institute of Physics.

Dunbar, K., & Fugelsang, J. ( 2004 ). Causal thinking in science: How scientists and students interpret the unexpected. In M. E. Gorman, A. Kincannon, D. Gooding, & R. D. Tweney (Eds.), New directions in scientific and technical thinking (pp. 57–59). Mahway, NJ: Erlbaum.

Dunbar, K., Fugelsang, J., & Stein, C. ( 2007 ). Do naïve theories ever go away? In M. Lovett & P. Shah (Eds.), Thinking with Data: 33 rd Carnegie Symposium on Cognition (pp. 193–206). Mahwah, NJ: Erlbaum.

Dunbar, K., & Sussman, D. ( 1995 ). Toward a cognitive account of frontal lobe function: Simulating frontal lobe deficits in normal subjects.   Annals of the New York Academy of Sciences , 54 , 289–304.

Duschl, R. A., Schweingruber, H. A., & Shouse, A. W. (Eds.). ( 2007 ). Taking science to school: Learning and teaching science in grades K-8. Washington, DC: National Academies Press.

Einstein, A. ( 1950 ). Out of my later years . New York: Philosophical Library

Erdos, P., Fajtlowicz, S., & Staton, W. ( 1991 ). Degree sequences in the triangle-free graphs,   Discrete Mathematics , 54 (91), 85–88.

Evans, J., & Rzhetsky, A. ( 2010 ). Machine science.   Science , 54 , 399–400.

Fay, A., & Klahr, D. ( 1996 ). Knowing about guessing and guessing about knowing: Preschoolers' understanding of indeterminacy.   Child Development , 54 , 689–716.

Fischler, H., & Lichtfeldt, M. ( 1992 ). Modern physics and students conceptions.   International Journal of Science Education , 54 , 181–190.

Fitzpatrick, S. M. (in press). Functional brain imaging: Neuro-turn or wrong turn? In M. M., Littlefield & J.M., Johnson (Eds.), The neuroscientific turn: Transdisciplinarity in the age of the brain. Ann Arbor: University of Michigan Press.

Fox-Keller, E. ( 1985 ). Reflections on gender and science . New Haven, CT: Yale University Press.

Fugelsang, J., & Dunbar, K. ( 2005 ). Brain-based mechanisms underlying complex causal thinking.   Neuropsychologia , 54 , 1204–1213.

Fugelsang, J., & Dunbar, K. ( 2009 ). Brain-based mechanisms underlying causal reasoning. In E. Kraft (Ed.), Neural correlates of thinking (pp. 269–279). Berlin, Germany: Springer

Fugelsang, J., Stein, C., Green, A., & Dunbar, K. ( 2004 ). Theory and data interactions of the scientific mind: Evidence from the molecular and the cognitive laboratory.   Canadian Journal of Experimental Psychology , 54 , 132–141

Galilei, G. ( 1638 /1991). Dialogues concerning two new sciences (A. de Salvio & H. Crew, Trans.). Amherst, NY: Prometheus Books.

Galison, P. ( 2003 ). Einstein's clocks, Poincaré's maps: Empires of time . New York: W. W. Norton.

Gelman, R., & Baillargeon, R. ( 1983 ). A review of Piagetian concepts. In P. H. Mussen (Series Ed.) & J. H. Flavell & E. M. Markman (Vol. Eds.), Handbook of child psychology (4th ed., Vol. 3, pp. 167–230). New York: Wiley.

Gelman, S. A., & Kalish, C. W. ( 2006 ). Conceptual development. In D. Kuhn & R. Siegler (Eds.), Handbook of child psychology. Vol. 2: Cognition, perception and language (pp. 687–733). New York: Wiley.

Gelman, S., & Wellman, H. ( 1991 ). Insides and essences.   Cognition , 54 , 214–244.

Gentner, D. ( 2010 ). Bootstrapping the mind: Analogical processes and symbol systems.   Cognitive Science , 54 , 752–775.

Gentner, D., Brem, S., Ferguson, R. W., Markman, A. B., Levidow, B. B., Wolff, P., & Forbus, K. D. ( 1997 ). Analogical reasoning and conceptual change: A case study of Johannes Kepler.   The Journal of the Learning Sciences , 54 (1), 3–40.

Gentner, D., Holyoak, K. J., & Kokinov, B. ( 2001 ). The analogical mind: Perspectives from cognitive science . Cambridge, MA: MIT Press.

Gentner, D., & Jeziorski, M. ( 1993 ). The shift from metaphor to analogy in western science. In A. Ortony (Ed.), Metaphor and thought (2nd ed., pp. 447–480). Cambridge, England: Cambridge University Press.

Gianfelici, F. ( 2010 ). Machine science: Truly machine-aided science.   Science , 54 , 317–319.

Giere, R. ( 1993 ). Cognitive models of science . Minneapolis: University of Minnesota Press.

Gopnik, A. N., Meltzoff, A. N., & Kuhl, P. K. ( 1999 ). The scientist in the crib: Minds, brains and how children learn . New York: Harper Collins

Gorman, M. E. ( 1989 ). Error, falsification and scientific inference: An experimental investigation.   Quarterly Journal of Experimental Psychology: Human Experimental Psychology , 41A , 385–412

Gorman, M. E., Kincannon, A., Gooding, D., & Tweney, R. D. ( 2004 ). New directions in scientific and technical thinking . Mahwah, NJ: Erlbaum.

Gupta, A., Hammer, D., & Redish, E. F. ( 2010 ). The case for dynamic models of learners' ontologies in physics.   Journal of the Learning Sciences , 54 (3), 285–321.

Haufe, C., Elliott, K. C., Burian, R., & O'Malley, M. A. ( 2010 ). Machine science: What's missing.   Science , 54 , 318–320.

Hecht, E. ( 2011 ). On defining mass.   The Physics Teacher , 54 , 40–43.

Heit, E. ( 2000 ). Properties of inductive reasoning.   Psychonomic Bulletin and Review , 54 , 569–592.

Holyoak, K. J., & Thagard, P. ( 1995 ). Mental leaps . Cambridge, MA: MIT Press.

Karmiloff-Smith, A. ( 1988 ) The child is a theoretician, not an inductivist.   Mind and Language , 54 , 183–195.

Keil, F. C. ( 1999 ). Conceptual change. In R. Wilson & F. Keil (Eds.), The MIT encyclopedia of cognitive science . (pp. 179–182) Cambridge, MA: MIT press.

Kern, L. H., Mirels, H. L., & Hinshaw, V. G. ( 1983 ). Scientists' understanding of propositional logic: An experimental investigation.   Social Studies of Science , 54 , 131–146.

King, R. D. ( 2011 ). Rise of the robo scientists.   Scientific American , 54 (1), 73–77.

King, R. D., Rowland, J., Oliver, S. G., Young, M., Aubrey, W., Byrne, E., et al. ( 2009 ). The automation of science.   Science , 54 , 85–89.

Kirschner, P. A., Sweller, J., & Clark, R. ( 2006 ) Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching.   Educational Psychologist , 54 , 75–86

Klahr, D. ( 2000 ). Exploring science: The cognition and development of discovery processes . Cambridge, MA: MIT Press.

Klahr, D. ( 2010 ). Coming up for air: But is it oxygen or phlogiston? A response to Taber's review of constructivist instruction: Success or failure?   Education Review , 54 (13), 1–6.

Klahr, D., & Dunbar, K. ( 1988 ). Dual space search during scientific reasoning.   Cognitive Science , 54 , 1–48.

Klahr, D., & Nigam, M. ( 2004 ). The equivalence of learning paths in early science instruction: effects of direct instruction and discovery learning.   Psychological Science , 54 (10), 661–667.

Klahr, D. & Masnick, A. M. ( 2002 ). Explaining, but not discovering, abduction. Review of L. Magnani (2001) abduction, reason, and science: Processes of discovery and explanation.   Contemporary Psychology , 47, 740–741.

Klahr, D., & Simon, H. ( 1999 ). Studies of scientific discovery: Complementary approaches and convergent findings.   Psychological Bulletin , 54 , 524–543.

Klayman, J., & Ha, Y. ( 1987 ). Confirmation, disconfirmation, and information in hypothesis testing.   Psychological Review , 54 , 211–228.

Kozhevnikov, M., & Hegarty, M. ( 2001 ). Impetus beliefs as default heuristic: Dissociation between explicit and implicit knowledge about motion.   Psychonomic Bulletin and Review , 54 , 439–453.

Kuhn, T. ( 1962 ). The structure of scientific revolutions . Chicago, IL: University of Chicago Press.

Kuhn, D., Amsel, E., & O'Laughlin, M. ( 1988 ). The development of scientific thinking skills . Orlando, FL: Academic Press.

Kulkarni, D., & Simon, H. A. ( 1988 ). The processes of scientific discovery: The strategy of experimentation.   Cognitive Science , 54 , 139–176.

Langley, P. ( 2000 ). Computational support of scientific discovery.   International Journal of Human-Computer Studies , 54 , 393–410.

Langley, P. ( 2002 ). Lessons for the computational discovery of scientific knowledge. In Proceedings of the First International Workshop on Data Mining Lessons Learned (pp. 9–12).

Langley, P., Simon, H. A., Bradshaw, G. L., & Zytkow, J. M. ( 1987 ). Scientific discovery: Computational explorations of the creative processes . Cambridge, MA: MIT Press.

Lorch, R. F., Jr., Lorch, E. P., Calderhead, W. J., Dunlap, E. E., Hodell, E. C., & Freer, B. D. ( 2010 ). Learning the control of variables strategy in higher and lower achieving classrooms: Contributions of explicit instruction and experimentation.   Journal of Educational Psychology , 54 (1), 90–101.

Magnani, L., Carnielli, W., & Pizzi, C., (Eds.) ( 2010 ). Model-based reasoning in science and technology: Abduction, logic,and computational discovery. Series Studies in Computational Intelligence (Vol. 314). Heidelberg/Berlin: Springer.

Mandler, J.M. ( 2004 ). The foundations of mind: Origins of conceptual thought . Oxford, England: Oxford University Press.

Macpherson, R., & Stanovich, K. E. ( 2007 ). Cognitive ability, thinking dispositions, and instructional set as predictors of critical thinking.   Learning and Individual Differences , 54 , 115–127.

McCloskey, M., Caramazza, A., & Green, B. ( 1980 ). Curvilinear motion in the absence of external forces: Naive beliefs about the motion of objects.   Science , 54 , 1139–1141.

McDermott, L. C., & Redish, L. ( 1999 ). Research letter on physics education research.   American Journal of Psychics , 54 , 755.

Mestre, J. P. ( 1991 ). Learning and instruction in pre-college physical science.   Physics Today , 54 , 56–62.

Metz, K. E. ( 1995 ). Reassessment of developmental constraints on children's science instruction.   Review of Educational Research , 54 (2), 93–127.

Minner, D. D., Levy, A. J., & Century, J. ( 2010 ). Inquiry-based science instruction—what is it and does it matter? Results from a research synthesis years 1984 to 2002.   Journal of Research in Science Teaching , 54 (4), 474–496.

Mitchell, T. M. ( 2009 ). Mining our reality.   Science , 54 , 1644–1645.

Mitroff, I. ( 1974 ). The subjective side of science . Amsterdam, Netherlands: Elsevier.

Munakata, Y., Casey, B. J., & Diamond, A. ( 2004 ). Developmental cognitive neuroscience: Progress and potential.   Trends in Cognitive Sciences , 54 , 122–128.

Mynatt, C. R., Doherty, M. E., & Tweney, R. D. ( 1977 ) Confirmation bias in a simulated research environment: An experimental study of scientific inference.   Quarterly Journal of Experimental Psychology , 54 , 89–95.

Nersessian, N. ( 1998 ). Conceptual change. In W. Bechtel, & G. Graham (Eds.), A companion to cognitive science (pp. 157–166). London, England: Blackwell.

Nersessian, N. ( 1999 ). Models, mental models, and representations: Model-based reasoning in conceptual change. In L. Magnani, N. Nersessian, & P. Thagard (Eds.), Model-based reasoning in scientific discovery (pp. 5–22). New York: Plenum.

Nersessian, N. J. ( 2002 ). The cognitive basis of model-based reasoning in science In. P. Carruthers, S. Stich, & M. Siegal (Eds.), The cognitive basis of science (pp. 133–152). New York: Cambridge University Press.

Nersessian, N. J. ( 2008 ) Creating scientific concepts . Cambridge, MA: MIT Press.

O' Malley, M. A. ( 2011 ). Exploration, iterativity and kludging in synthetic biology.   Comptes Rendus Chimie , 54 (4), 406–412 .

Papert, S. ( 1980 ) Mindstorms: Children computers and powerful ideas. New York: Basic Books.

Penner, D. E., & Klahr, D. ( 1996 ). When to trust the data: Further investigations of system error in a scientific reasoning task.   Memory and Cognition , 54 (5), 655–668.

Petitto, L. A., & Dunbar, K. ( 2004 ). New findings from educational neuroscience on bilingual brains, scientific brains, and the educated mind. In K. Fischer & T. Katzir (Eds.), Building usable knowledge in mind, brain, and education Cambridge, England: Cambridge University Press.

Popper, K. R. ( 1959 ). The logic of scientific discovery . London, England: Hutchinson.

Qin, Y., & Simon, H.A. ( 1990 ). Laboratory replication of scientific discovery processes.   Cognitive Science , 54 , 281–312.

Reiser, B. J., Tabak, I., Sandoval, W. A., Smith, B., Steinmuller, F., & Leone, T. J., ( 2001 ). BGuILE: Stategic and conceptual scaffolds for scientific inquiry in biology classrooms. In S. M. Carver & D. Klahr (Eds.), Cognition and instruction: Twenty-five years of progress (pp. 263–306). Mahwah, NJ: Erlbaum

Riordan, M., Rowson, P. C., & Wu, S. L. ( 2001 ). The search for the higgs boson.   Science , 54 , 259–260.

Rutherford, F. J., & Ahlgren, A. ( 1991 ). Science for all Americans. New York: Oxford University Press.

Samarapungavan, A. ( 1992 ). Children's judgments in theory choice tasks: Scientifc rationality in childhood.   Cognition , 54 , 1–32.

Schauble, L., & Glaser, R. ( 1990 ). Scientific thinking in children and adults. In D. Kuhn (Ed.), Developmental perspectives on teaching and learning thinking skills. Contributions to Human Development , (Vol. 21, pp. 9–26). Basel, Switzerland: Karger.

Schunn, C. D., & Klahr, D. ( 1995 ). A 4-space model of scientific discovery. In Proceedings of the 17th Annual Conference of the Cognitive Science Society (pp. 106–111). Mahwah, NJ: Erlbaum.

Schunn, C. D., & Klahr, D. ( 1996 ). The problem of problem spaces: When and how to go beyond a 2-space model of scientific discovery. Part of symposium on Building a theory of problem solving and scientific discovery: How big is N in N-space search? In Proceedings of the 18th Annual Conference of the Cognitive Science Society (pp. 25–26). Mahwah, NJ: Erlbaum.

Shrager, J., & Langley, P. ( 1990 ). Computational models of scientific discovery and theory formation . San Mateo, CA: Morgan Kaufmann.

Siegler, R. S., & Liebert, R. M. ( 1975 ). Acquisition of formal scientific reasoning by 10- and 13-year-olds: Designing a factorial experiment.   Developmental Psychology , 54 , 401–412.

Simon, H. A. ( 1977 ). Models of discovery . Dordrecht, Netherlands: D. Reidel Publishing.

Simon, H. A., Langley, P., & Bradshaw, G. L. ( 1981 ). Scientific discovery as problem solving.   Synthese , 54 , 1–27.

Simon, H. A., & Lea, G. ( 1974 ). Problem solving and rule induction. In H. Simon (Ed.), Models of thought (pp. 329–346). New Haven, CT: Yale University Press.

Smith, E. E., Shafir, E., & Osherson, D. ( 1993 ). Similarity, plausibility, and judgments of probability.   Cognition. Special Issue: Reasoning and decision making , 54 , 67–96.

Sodian, B., Zaitchik, D., & Carey, S. ( 1991 ). Young children's differentiation of hypothetical beliefs from evidence.   Child Development , 54 , 753–766.

Taber, K. S. ( 2009 ). Constructivism and the crisis in U.S. science education: An essay review.   Education Review , 54 (12), 1–26.

Thagard, P. ( 1992 ). Conceptual revolutions . Cambridge, MA: MIT Press.

Thagard, P. ( 1999 ). How scientists explain disease . Princeton, NJ: Princeton University Press.

Thagard, P., & Croft, D. ( 1999 ). Scientific discovery and technological innovation: Ulcers, dinosaur extinction, and the programming language Java. In L. Magnani, N. Nersessian, & P. Thagard (Eds.), Model-based reasoning in scientific discovery (pp. 125–138). New York: Plenum.

Tobias, S., & Duffy, T. M. (Eds.). ( 2009 ). Constructivist instruction: Success or failure? New York: Routledge.

Toth, E. E., Klahr, D., & Chen, Z. ( 2000 ) Bridging research and practice: A cognitively-based classroom intervention for teaching experimentation skills to elementary school children.   Cognition and Instruction , 54 (4), 423–459.

Tweney, R. D. ( 1989 ). A framework for the cognitive psychology of science. In B. Gholson, A. Houts, R. A. Neimeyer, & W. Shadish (Eds.), Psychology of science: Contributions to metascience (pp. 342–366). Cambridge, England: Cambridge University Press.

Tweney, R. D., Doherty, M. E., & Mynatt, C. R. ( 1981 ). On scientific thinking . New York: Columbia University Press.

Valdes-Perez, R. E. ( 1994 ). Conjecturing hidden entities via simplicity and conservation laws: Machine discovery in chemistry.   Artificial Intelligence , 54 (2), 247–280.

Von Hofsten, C. ( 1980 ). Predictive reaching for moving objects by human infants.   Journal of Experimental Child Psychology , 54 , 369–382.

Von Hofsten, C., Feng, Q., & Spelke, E. S. ( 2000 ). Object representation and predictive action in infancy.   Developmental Science , 54 , 193–205.

Vosnaidou, S. (Ed.). ( 2008 ). International handbook of research on conceptual change . New York: Taylor & Francis.

Vosniadou, S., & Brewer, W. F. ( 1992 ). Mental models of the earth: A study of conceptual change in childhood.   Cognitive Psychology , 54 , 535–585.

Wason, P. C. ( 1968 ). Reasoning about a rule.   Quarterly Journal of Experimental Psychology , 54 , 273–281.

Wertheimer, M. ( 1945 ). Productive thinking . New York: Harper.

Yang, Y. ( 2009 ). Target discovery from data mining approaches.   Drug Discovery Today , 54 (3–4), 147–154.

  • About Oxford Academic
  • Publish journals with us
  • University press partners
  • What we publish
  • New features  
  • Open access
  • Institutional account management
  • Rights and permissions
  • Get help with access
  • Accessibility
  • Advertising
  • Media enquiries
  • Oxford University Press
  • Oxford Languages
  • University of Oxford

Oxford University Press is a department of the University of Oxford. It furthers the University's objective of excellence in research, scholarship, and education by publishing worldwide

  • Copyright © 2024 Oxford University Press
  • Cookie settings
  • Cookie policy
  • Privacy policy
  • Legal notice

This Feature Is Available To Subscribers Only

Sign In or Create an Account

This PDF is available to Subscribers Only

For full access to this pdf, sign in to an existing account, or purchase an annual subscription.

Critical Thinking Definition, Skills, and Examples

  • Homework Help
  • Private School
  • College Admissions
  • College Life
  • Graduate School
  • Business School
  • Distance Learning

scientific definition for critical thinking

  • Indiana University, Bloomington
  • State University of New York at Oneonta

Critical thinking refers to the ability to analyze information objectively and make a reasoned judgment. It involves the evaluation of sources, such as data, facts, observable phenomena, and research findings.

Good critical thinkers can draw reasonable conclusions from a set of information, and discriminate between useful and less useful details to solve problems or make decisions. Employers prioritize the ability to think critically—find out why, plus see how you can demonstrate that you have this ability throughout the job application process. 

Why Do Employers Value Critical Thinking Skills?

Employers want job candidates who can evaluate a situation using logical thought and offer the best solution.

 Someone with critical thinking skills can be trusted to make decisions independently, and will not need constant handholding.

Hiring a critical thinker means that micromanaging won't be required. Critical thinking abilities are among the most sought-after skills in almost every industry and workplace. You can demonstrate critical thinking by using related keywords in your resume and cover letter, and during your interview.

Examples of Critical Thinking

The circumstances that demand critical thinking vary from industry to industry. Some examples include:

  • A triage nurse analyzes the cases at hand and decides the order by which the patients should be treated.
  • A plumber evaluates the materials that would best suit a particular job.
  • An attorney reviews evidence and devises a strategy to win a case or to decide whether to settle out of court.
  • A manager analyzes customer feedback forms and uses this information to develop a customer service training session for employees.

Promote Your Skills in Your Job Search

If critical thinking is a key phrase in the job listings you are applying for, be sure to emphasize your critical thinking skills throughout your job search.

Add Keywords to Your Resume

You can use critical thinking keywords (analytical, problem solving, creativity, etc.) in your resume. When describing your  work history , include top critical thinking skills that accurately describe you. You can also include them in your  resume summary , if you have one.

For example, your summary might read, “Marketing Associate with five years of experience in project management. Skilled in conducting thorough market research and competitor analysis to assess market trends and client needs, and to develop appropriate acquisition tactics.”

Mention Skills in Your Cover Letter

Include these critical thinking skills in your cover letter. In the body of your letter, mention one or two of these skills, and give specific examples of times when you have demonstrated them at work. Think about times when you had to analyze or evaluate materials to solve a problem.

Show the Interviewer Your Skills

You can use these skill words in an interview. Discuss a time when you were faced with a particular problem or challenge at work and explain how you applied critical thinking to solve it.

Some interviewers will give you a hypothetical scenario or problem, and ask you to use critical thinking skills to solve it. In this case, explain your thought process thoroughly to the interviewer. He or she is typically more focused on how you arrive at your solution rather than the solution itself. The interviewer wants to see you analyze and evaluate (key parts of critical thinking) the given scenario or problem.

Of course, each job will require different skills and experiences, so make sure you read the job description carefully and focus on the skills listed by the employer.

Top Critical Thinking Skills

Keep these in-demand critical thinking skills in mind as you update your resume and write your cover letter. As you've seen, you can also emphasize them at other points throughout the application process, such as your interview. 

Part of critical thinking is the ability to carefully examine something, whether it is a problem, a set of data, or a text. People with  analytical skills  can examine information, understand what it means, and properly explain to others the implications of that information.

  • Asking Thoughtful Questions
  • Data Analysis
  • Interpretation
  • Questioning Evidence
  • Recognizing Patterns

Communication

Often, you will need to share your conclusions with your employers or with a group of colleagues. You need to be able to  communicate with others  to share your ideas effectively. You might also need to engage in critical thinking in a group. In this case, you will need to work with others and communicate effectively to figure out solutions to complex problems.

  • Active Listening
  • Collaboration
  • Explanation
  • Interpersonal
  • Presentation
  • Verbal Communication
  • Written Communication

Critical thinking often involves creativity and innovation. You might need to spot patterns in the information you are looking at or come up with a solution that no one else has thought of before. All of this involves a creative eye that can take a different approach from all other approaches.

  • Flexibility
  • Conceptualization
  • Imagination
  • Drawing Connections
  • Synthesizing

Open-Mindedness

To think critically, you need to be able to put aside any assumptions or judgments and merely analyze the information you receive. You need to be objective, evaluating ideas without bias.

  • Objectivity
  • Observation

Problem Solving

Problem-solving is another critical thinking skill that involves analyzing a problem, generating and implementing a solution, and assessing the success of the plan. Employers don’t simply want employees who can think about information critically. They also need to be able to come up with practical solutions.

  • Attention to Detail
  • Clarification
  • Decision Making
  • Groundedness
  • Identifying Patterns

More Critical Thinking Skills

  • Inductive Reasoning
  • Deductive Reasoning
  • Noticing Outliers
  • Adaptability
  • Emotional Intelligence
  • Brainstorming
  • Optimization
  • Restructuring
  • Integration
  • Strategic Planning
  • Project Management
  • Ongoing Improvement
  • Causal Relationships
  • Case Analysis
  • Diagnostics
  • SWOT Analysis
  • Business Intelligence
  • Quantitative Data Management
  • Qualitative Data Management
  • Risk Management
  • Scientific Method
  • Consumer Behavior

Key Takeaways

  • Demonstrate that you have critical thinking skills by adding relevant keywords to your resume.
  • Mention pertinent critical thinking skills in your cover letter, too, and include an example of a time when you demonstrated them at work.
  • Finally, highlight critical thinking skills during your interview. For instance, you might discuss a time when you were faced with a challenge at work and explain how you applied critical thinking skills to solve it.

University of Louisville. " What is Critical Thinking ."

American Management Association. " AMA Critical Skills Survey: Workers Need Higher Level Skills to Succeed in the 21st Century ."

  • How To Become an Effective Problem Solver
  • 2020-21 Common Application Essay Option 4—Solving a Problem
  • College Interview Tips: "Tell Me About a Challenge You Overcame"
  • Types of Medical School Interviews and What to Expect
  • The Horse Problem: A Math Challenge
  • What to Do When the Technology Fails in Class
  • A Guide to Business Letters Types
  • Landing Your First Teaching Job
  • How to Facilitate Learning and Critical Thinking
  • Best Majors for Pre-med Students
  • Problem Solving in Mathematics
  • Discover Ideas Through Brainstorming
  • What You Need to Know About the Executive Assessment
  • Finding a Job for ESL Learners: Interview Basics
  • Finding a Job for ESL Learners
  • Job Interview Questions and Answers
  • USC Libraries
  • Research Guides

Organizing Your Social Sciences Research Paper

  • Applying Critical Thinking
  • Purpose of Guide
  • Design Flaws to Avoid
  • Independent and Dependent Variables
  • Glossary of Research Terms
  • Reading Research Effectively
  • Narrowing a Topic Idea
  • Broadening a Topic Idea
  • Extending the Timeliness of a Topic Idea
  • Academic Writing Style
  • Choosing a Title
  • Making an Outline
  • Paragraph Development
  • Research Process Video Series
  • Executive Summary
  • The C.A.R.S. Model
  • Background Information
  • The Research Problem/Question
  • Theoretical Framework
  • Citation Tracking
  • Content Alert Services
  • Evaluating Sources
  • Primary Sources
  • Secondary Sources
  • Tiertiary Sources
  • Scholarly vs. Popular Publications
  • Qualitative Methods
  • Quantitative Methods
  • Insiderness
  • Using Non-Textual Elements
  • Limitations of the Study
  • Common Grammar Mistakes
  • Writing Concisely
  • Avoiding Plagiarism
  • Footnotes or Endnotes?
  • Further Readings
  • Generative AI and Writing
  • USC Libraries Tutorials and Other Guides
  • Bibliography

Critical thinking refers to deliberately scrutinizing and evaluating theories, concepts, or ideas using reasoned reflection and analysis. The act of thinking critically implies moving beyond simply understanding information, but rather, questioning its source, its production, and its presentation in order to expose potential bias or researcher subjectivity [i.e., being influenced by personal opinions and feelings rather than by external determinants ] . Applying critical thinking to investigating a research problem involves actively challenging assumptions and questioning the choices and potential motives underpinning how the author designed the study, conducted the research, and arrived at particular conclusions or recommended courses of action.

Mintz, Steven. "How the Word "Critical" Came to Signify the Leading Edge of Cultural Analysis." Higher Ed Gamma Blog , Inside Higher Ed, February 13, 2024; Van Merriënboer, Jeroen JG and Paul A. Kirschner. Ten Steps to Complex Learning: A Systematic Approach to Four-component Instructional Design . New York: Routledge, 2017.

Thinking Critically

Applying Critical Thinking to Research and Writing

Professors like to use the term critical thinking; in fact, the idea of being a critical thinker permeates much of higher education writ large. In the classroom, the idea of thinking critically is often mentioned by professors when students ask how they should approach a research and writing assignment [other approaches your professor might mention include interdisciplinarity, comparative, gendered, global, etc.]. However, critical thinking is more than just an approach to research and writing. It is an acquired skill associated with becoming a complex learner capable of discerning important relationships among the elements of, as well as integrating multiple ways of understanding applied to, the research problem. Critical thinking is a lens through which you holistically interrogate a topic.

Given this, thinking critically encompasses a variety of inter-related connotations applied to writing a college-level research paper:

  • Integrated and Multi-Dimensional . Critical thinking is not focused on any one element of research, but instead, is applied holistically throughout the process of identifying the research problem, reviewing the literature, applying methods of analysis, describing the results, discussing their implications, and, if appropriate, offering recommendations for further research. It permeates the entire research endeavor from contemplating what to write to proofreading the final product.
  • Humanizes the Research . Thinking critically can help humanize what is being studied by extending the scope of the analysis beyond the traditional boundaries of prior research. This could have involved, for example, sampling homogeneous populations, considering only certain factors related to the investigation of a phenomenon, or limiting the way authors framed or contextualized their study. Critical thinking creates opportunities to incorporate the experiences of others into the research process, leading to a more inclusive and representative examination of the topic.
  • Non-Linear . This refers to analyzing a research problem in ways that do not rely on sequential decision-making or rational forms of reasoning. Creative thinking relies on intuitive judgement, flexibility, and unconventional approaches to investigating complex phenomena in order to discover new insights, connections, and potential solutions . This involves going back and modifying your thinking as new evidence emerges , perhaps multiple times throughout the research process, and drawing conclusions from multiple perspectives.
  • Normative . This is the idea that critical thinking can be used to challenge prior assumptions in ways that advocate for social justice, equity, and inclusion and which can lead to research having a more transformative and expansive impact. In this respect, critical thinking can be a method for breaking away from dominant culture norms so as to produce research outcomes that illuminate previously hidden aspects of exploitation and injustice.
  • Power Dynamics . Research in the social and behavioral sciences often includes examining aspects of power and influence that shape social relations, organizations, institutions, and the production and maintenance of knowledge. This approach encompasses studying how power operates, how it can be acquired, and how power and influence can be maintained. Critical thinking can reveal how societal structures perpetuate power and influence in ways that marginalizes and oppresses certain groups or communities within the contexts of history , politics, economics, culture, and other factors.
  • Reflection . A key aspect of critical thinking is practicing reflexivity; the act of turning ideas and concepts back onto yourself in order to reveal and clarify your own beliefs, assumptions, and perspectives. Being critically reflexive is important because it can reveal hidden biases you may have that could unintentionally influence how you interpret and validate information. The more reflexive you are, the better able and more comfortable you are about opening yourself up to new modes of understanding.
  • Rigorous Questioning . Thinking critically is guided by asking questions that lead to addressing complex concepts, principles, theories, or problems more effectively and to help distinguish what is known from from what is not known [or that may be hidden]. In this way, critical thinking involves deliberately framing inquiries not just as research questions, but as a way to focus on systematic, disciplined,  in-depth questioning concerning the research problem and your positionality as a researcher.
  • Social Change . An overarching goal of critical thinking applied to research and writing is to seek to identify and challenge sources of inequality, exploitation, oppression, and marinalization that contributes to maintaining the status quo within institutions of society. This can include entities, such as, schools, courts, businesses, government agencies, or religious centers, that have been created and maintained through certain ways of thinking within the dominant culture.

Although critical thinking permeates the entire research and writing process, it applies most directly to the literature review and discussion sections of your paper . In reviewing the literature, it is important to reflect upon specific aspects of a study, such as, determining if the research design effectively establishes cause and effect relationships or provides insight into explaining why certain phenomena do or do not occur, assessing whether the method of gathering data or information supports the objectives of the study, and evaluating if the assumptions used t o arrive at a specific conclusion are evidence-based and relevant to addressing the research problem. An assessment of whether a source is helpful to investigating the research problem also involves critically analyzing how the research challenges conventional approaches to investigations that perpetuate inequalities or hides the voices of others.

Critical thinking also applies to the discussion section of your paper because this is where you internalize the findings of your study and explain its significance. This involves more than summarizing findings and describing outcomes. It includes reflecting on their importance and providing reasoned explanations why your paper is important in filling a gap in the literature or expanding knowledge and understanding in ways that inform practice. Critical reflection helps you think introspectively about your own beliefs concerning the significance of the findings, but in ways that avoid biased judgment and decision making.

Behar-Horenstein, Linda S., and Lian Niu. “Teaching Critical Thinking Skills in Higher Education: A Review of the Literature.” Journal of College Teaching and Learning 8 (February 2011): 25-41; Bayou, Yemeserach and Tamene Kitila. "Exploring Instructors’ Beliefs about and Practices in Promoting Students’ Critical Thinking Skills in Writing Classes." GIST–Education and Learning Research Journal 26 (2023): 123-154; Butcher, Charity. "Using In-class Writing to Promote Critical Thinking and Application of Course Concepts." Journal of Political Science Education 18 (2022): 3-21; Loseke, Donileen R. Methodological Thinking: Basic Principles of Social Research Design. Thousand Oaks, CA: Sage, 2012; Mintz, Steven. "How the Word "Critical" Came to Signify the Leading Edge of Cultural Analysis." Higher Ed Gamma Blog , Inside Higher Ed, February 13, 2024; Hart, Claire et al. “Exploring Higher Education Students’ Critical Thinking Skills through Content Analysis.” Thinking Skills and Creativity 41 (September 2021): 100877; Lewis, Arthur and David Smith. "Defining Higher Order Thinking." Theory into Practice 32 (Summer 1993): 131-137; Sabrina, R., Emilda Sulasmi, and Mandra Saragih. "Student Critical Thinking Skills and Student Writing Ability: The Role of Teachers' Intellectual Skills and Student Learning." Cypriot Journal of Educational Sciences 17 (2022): 2493-2510. Suter, W. Newton. Introduction to Educational Research: A Critical Thinking Approach. 2nd edition. Thousand Oaks, CA: SAGE Publications, 2012; Van Merriënboer, Jeroen JG and Paul A. Kirschner. Ten Steps to Complex Learning: A Systematic Approach to Four-component Instructional Design. New York: Routledge, 2017; Vance, Charles M., et al. "Understanding and Measuring Linear–Nonlinear Thinking Style for Enhanced Management Education and Professional Practice." Academy of Management Learning and Education 6 (2007): 167-185; Yeh, Hui-Chin, Shih-hsien Yang, Jo Shan Fu, and Yen-Chen Shih. "Developing College Students’ Critical Thinking through Reflective Writing." Higher Education Research & Development 42 (2023): 244-259.

  • << Previous: Academic Writing Style
  • Next: Choosing a Title >>
  • Last Updated: Apr 15, 2024 12:53 PM
  • URL: https://libguides.usc.edu/writingguide

University of Louisville

  • Programs & Services
  • Delphi Center

Ideas to Action (i2a)

  • What is Critical Thinking?

The ability to think critically calls for a higher-order thinking than simply the ability to recall information.

Definitions of critical thinking, its elements, and its associated activities fill the educational literature of the past forty years. Critical thinking has been described as an ability to question; to acknowledge and test previously held assumptions; to recognize ambiguity; to examine, interpret, evaluate, reason, and reflect; to make informed judgments and decisions; and to clarify, articulate, and justify positions (Hullfish & Smith, 1961; Ennis, 1962; Ruggiero, 1975; Scriven, 1976; Hallet, 1984; Kitchener, 1986; Pascarella & Terenzini, 1991; Mines et al., 1990; Halpern, 1996; Paul & Elder, 2001; Petress, 2004; Holyoak & Morrison, 2005; among others).

After a careful review of the mountainous body of literature defining critical thinking and its elements, UofL has chosen to adopt the language of Michael Scriven and Richard Paul (2003) as a comprehensive, concise operating definition:

Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action.

Paul and Scriven go on to suggest that critical thinking is based on: "universal intellectual values that transcend subject matter divisions: clarity, accuracy, precision, consistency, relevance, sound evidence, good reasons, depth, breadth, and fairness. It entails the examination of those structures or elements of thought implicit in all reasoning: purpose, problem, or question-at-issue, assumptions, concepts, empirical grounding; reasoning leading to conclusions, implication and consequences, objections from alternative viewpoints, and frame of reference. Critical thinking - in being responsive to variable subject matter, issues, and purposes - is incorporated in a family of interwoven modes of thinking, among them: scientific thinking, mathematical thinking, historical thinking, anthropological thinking, economic thinking, moral thinking, and philosophical thinking."

This conceptualization of critical thinking has been refined and developed further by Richard Paul and Linder Elder into the Paul-Elder framework of critical thinking. Currently, this approach is one of the most widely published and cited frameworks in the critical thinking literature. According to the Paul-Elder framework, critical thinking is the:

  • Analysis of thinking by focusing on the parts or structures of thinking ("the Elements of Thought")
  • Evaluation of thinking by focusing on the quality ("the Universal Intellectual Standards")
  • Improvement of thinking by using what you have learned ("the Intellectual Traits")

Selection of a Critical Thinking Framework

The University of Louisville chose the Paul-Elder model of Critical Thinking as the approach to guide our efforts in developing and enhancing our critical thinking curriculum. The Paul-Elder framework was selected based on criteria adapted from the characteristics of a good model of critical thinking developed at Surry Community College. The Paul-Elder critical thinking framework is comprehensive, uses discipline-neutral terminology, is applicable to all disciplines, defines specific cognitive skills including metacognition, and offers high quality resources.

Why the selection of a single critical thinking framework?

The use of a single critical thinking framework is an important aspect of institution-wide critical thinking initiatives (Paul and Nosich, 1993; Paul, 2004). According to this view, critical thinking instruction should not be relegated to one or two disciplines or departments with discipline specific language and conceptualizations. Rather, critical thinking instruction should be explicitly infused in all courses so that critical thinking skills can be developed and reinforced in student learning across the curriculum. The use of a common approach with a common language allows for a central organizer and for the development of critical thinking skill sets in all courses.

  • SACS & QEP
  • Planning and Implementation
  • Why Focus on Critical Thinking?
  • Paul-Elder Critical Thinking Framework
  • Culminating Undergraduate Experience
  • Community Engagement
  • Frequently Asked Questions
  • What is i2a?

Copyright © 2012 - University of Louisville , Delphi Center

  • Subject List
  • Take a Tour
  • For Authors
  • Subscriber Services
  • Publications
  • African American Studies
  • African Studies
  • American Literature
  • Anthropology
  • Architecture Planning and Preservation
  • Art History
  • Atlantic History
  • Biblical Studies
  • British and Irish Literature
  • Childhood Studies
  • Chinese Studies
  • Cinema and Media Studies
  • Communication
  • Criminology
  • Environmental Science
  • Evolutionary Biology
  • International Law
  • International Relations
  • Islamic Studies
  • Jewish Studies
  • Latin American Studies
  • Latino Studies
  • Linguistics
  • Literary and Critical Theory
  • Medieval Studies
  • Military History
  • Political Science
  • Public Health
  • Renaissance and Reformation
  • Social Work
  • Urban Studies
  • Victorian Literature
  • Browse All Subjects

How to Subscribe

  • Free Trials

In This Article Expand or collapse the "in this article" section Critical Thinking

Introduction, general overviews.

  • Importance of Thinking Critically
  • Defining Critical Thinking
  • General Skills
  • Specific Skills
  • Metacognitive Monitoring Skills
  • Critical Thinking Dispositions
  • Teaching Specific Skills
  • Encouraging a Disposition toward Thinking Critically
  • Transfer to Other Domains
  • Metacognitive Monitoring
  • General or Comprehensive Assessments
  • Metacognition Assessments
  • Critical Thinking Disposition Assessments
  • Thinking Critically about Critical Thinking

Related Articles Expand or collapse the "related articles" section about

About related articles close popup.

Lorem Ipsum Sit Dolor Amet

Vestibulum ante ipsum primis in faucibus orci luctus et ultrices posuere cubilia Curae; Aliquam ligula odio, euismod ut aliquam et, vestibulum nec risus. Nulla viverra, arcu et iaculis consequat, justo diam ornare tellus, semper ultrices tellus nunc eu tellus.

  • Artificial Intelligence, Machine Learning, and Psychology
  • Assessment of Thinking in Educational Settings
  • Human Memory
  • Learning Theory
  • Mindfulness
  • Problem Solving and Decision Making
  • Procrastination
  • Student Success in College
  • Teaching of Psychology
  • Thinking Skills in Educational Settings
  • Women and Science, Technology, Engineering, and Math (STEM)

Other Subject Areas

Forthcoming articles expand or collapse the "forthcoming articles" section.

  • Data Visualization
  • Remote Work
  • Workforce Training Evaluation
  • Find more forthcoming articles...
  • Export Citations
  • Share This Facebook LinkedIn Twitter

Critical Thinking by Heather Butler , Diane Halpern LAST REVIEWED: 26 August 2022 LAST MODIFIED: 29 November 2011 DOI: 10.1093/obo/9780199828340-0019

Critical thinking has been described in many ways, but researchers generally agree that critical thinking involves rational, purposeful, and goal-directed thinking (see Defining Critical Thinking ). Diane F. Halpern defined critical thinking as an attempt to increase the probability of a desired outcome (e.g., making a sound decision, successfully solving a problem) by using certain cognitive skills and strategies. Critical thinking is more than just a collection of skills and strategies: it is a disposition toward engaging with problems. Critical thinkers are flexible, open-minded, persistent, and willing to exert mental energy working on tough problems. Unlike poor thinkers, critical thinkers are willing to admit they have made an error in judgment if confronted with contradictory evidence, and they operate on autopilot much less than poor thinkers (see Critical Thinking Dispositions ). There is good evidence that critical thinking skills and dispositions can be taught (see Teaching Critical Thinking ). This guide includes (a) sources that extol the importance of critical thinking, (b) research that identifies specific critical thinking skills and conceptualizations of critical thinking dispositions, (c) a list of the best practices for teaching critical thinking skills and dispositions, and (d) a review of research into ways of assessing critical thinking skills and dispositions (see Assessments ).

The sources highlighted here include textbooks, literature reviews, and meta-analyses related to critical thinking. These contributions come from both psychological ( Halpern 2003 ; Nisbett 1993 ; Sternberg, et al. 2007 ) and philosophical ( Ennis 1962 , Facione 1990 ) perspectives. Many of these general overviews are textbooks ( Facione 2011b ; Halpern 2003 ; Nisbett 1993 ; Sternberg, et al. 2007 ), while the other sources are review articles or commentaries. Most resources were intended for a general audience, but Sternberg, et al. 2007 was written specifically to address critical thinking in psychology. Those interested in a historical reference are referred to Ennis 1962 , which is credited by some as renewing contemporary interest in critical thinking. Those interested in a more recent conceptualization of critical thinking are referred to Facione 2011a , which is a short introduction to the field of critical thinking that would be appropriate for those new to the field, or Facione 1990 , which summarizes a collaborative definition of critical thinking among philosophers using the Delphi method. Facione 2011b would be a valuable resource for philosophers teaching critical thinking or logic courses to general audiences. For psychologists teaching critical thinking courses to a general audience, Halpern 2003 , an empirically based textbook, covers a wide range of topics; a new edition is expected soon. Fisher 2001 is also intended for general audiences and teaches a wide variety of critical thinking skills. Nisbett 1993 tackles the question of whether critical thinking skills can be taught and provides ample empirical evidence to that end. Sternberg, et al. 2007 is a good resource for psychology students interested in learning how to improve their scientific reasoning skills, a specific set of thinking skills needed by psychology and other science students.

Ennis, Robert H. 1962. A concept of critical thinking: A proposed basis of research in the teaching and evaluation of critical thinking. Harvard Educational Review 32:81–111.

A discussion of how critical thinking is conceptualized from a philosopher’s perspective. Critical of psychology’s definition of critical thinking at the time. Emphasizes twelve aspects of critical thinking.

Facione, Peter A. 1990. Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction; Executive Summary of The Delphi Report . Millbrae, CA: California Academic Press.

Describes the critical thinking movement, definitions of critical thinking agreed upon by philosophers using the Delphi method, the assessment of critical thinking, and how critical thinking can be taught.

Facione, Peter A. 2011a. Critical thinking: What it is and why it counts . Millbrae, CA: Insight Assessment.

This accessible paper defines critical thinking, elaborates on specific critical thinking skills, and discusses what it means to have (or not have) a critical thinking disposition. A distinction is made between system 1 (shallow processing) and system 2 (deeper processing) thinking. Good resource for students new to the field.

Facione, Peter A. 2011b. THINK critically . Upper Saddle River, NJ: Prentice Hall.

Written from a philosophical perspective this critical thinking textbook emphasizes the application of critical thinking to the real world and offers positive examples of critical thinking. Chapters cover inductive, deductive, comparative, ideological, and empirical reasoning

Fisher, Alec. 2001. Critical thinking: An introduction . Cambridge, UK: Cambridge Univ. Press.

Textbook intended for college students discusses various types of reasoning, causality, argument analysis, and decision making. Includes exercises for students and teachers.

Halpern, Diane F. 2003. Thought & knowledge: An introduction to critical thinking . 4th ed. Mahwah, NJ: Lawrence Erlbaum.

This textbook, written by a cognitive psychologist, is grounded in theory and research from the learning sciences and offers practical examples. Chapters include an introduction to the topic and the correlates of critical thinking, memory, thought and language, reasoning, analyzing arguments, thinking as hypothesis testing, likelihood and uncertainty, decision making, development of problem-solving skills, and creative thinking.

Nisbett, Richard E. 1993. Rules for reasoning . Hillsdale, NJ: Lawrence Erlbaum.

This text is rich with empirical evidence that critical thinking skills can be taught to undergraduate and graduate students. Each chapter discusses research on an aspect of reasoning (e.g., statistical reasoning, heuristics, inductive reasoning) with special emphasis on teaching the application of these skills to everyday problems.

Sternberg, Robert J., Henry L. Roediger III, and Diane F. Halpern, eds. 2007. Critical thinking in psychology . New York: Cambridge Univ. Press.

This edited book explores several aspects of critical thinking that are needed to fully understand key topics in psychology such as experiment research, statistical inference, case studies, logical fallacies, and ethical judgments. Experts discuss the critical thinking strategies they engage in. Interesting discussion of historical breakthroughs due to critical thinking.

back to top

Users without a subscription are not able to see the full content on this page. Please subscribe or login .

Oxford Bibliographies Online is available by subscription and perpetual access to institutions. For more information or to contact an Oxford Sales Representative click here .

  • About Psychology »
  • Meet the Editorial Board »
  • Abnormal Psychology
  • Academic Assessment
  • Acculturation and Health
  • Action Regulation Theory
  • Action Research
  • Addictive Behavior
  • Adolescence
  • Adoption, Social, Psychological, and Evolutionary Perspect...
  • Advanced Theory of Mind
  • Affective Forecasting
  • Affirmative Action
  • Ageism at Work
  • Allport, Gordon
  • Alzheimer’s Disease
  • Ambulatory Assessment in Behavioral Science
  • Analysis of Covariance (ANCOVA)
  • Animal Behavior
  • Animal Learning
  • Anxiety Disorders
  • Art and Aesthetics, Psychology of
  • Assessment and Clinical Applications of Individual Differe...
  • Attachment in Social and Emotional Development across the ...
  • Attention-Deficit/Hyperactivity Disorder (ADHD) in Adults
  • Attention-Deficit/Hyperactivity Disorder (ADHD) in Childre...
  • Attitudinal Ambivalence
  • Attraction in Close Relationships
  • Attribution Theory
  • Authoritarian Personality
  • Bayesian Statistical Methods in Psychology
  • Behavior Therapy, Rational Emotive
  • Behavioral Economics
  • Behavioral Genetics
  • Belief Perseverance
  • Bereavement and Grief
  • Biological Psychology
  • Birth Order
  • Body Image in Men and Women
  • Bystander Effect
  • Categorical Data Analysis in Psychology
  • Childhood and Adolescence, Peer Victimization and Bullying...
  • Clark, Mamie Phipps
  • Clinical Neuropsychology
  • Clinical Psychology
  • Cognitive Consistency Theories
  • Cognitive Dissonance Theory
  • Cognitive Neuroscience
  • Communication, Nonverbal Cues and
  • Comparative Psychology
  • Competence to Stand Trial: Restoration Services
  • Competency to Stand Trial
  • Computational Psychology
  • Conflict Management in the Workplace
  • Conformity, Compliance, and Obedience
  • Consciousness
  • Coping Processes
  • Correspondence Analysis in Psychology
  • Counseling Psychology
  • Creativity at Work
  • Critical Thinking
  • Cross-Cultural Psychology
  • Cultural Psychology
  • Daily Life, Research Methods for Studying
  • Data Science Methods for Psychology
  • Data Sharing in Psychology
  • Death and Dying
  • Deceiving and Detecting Deceit
  • Defensive Processes
  • Depressive Disorders
  • Development, Prenatal
  • Developmental Psychology (Cognitive)
  • Developmental Psychology (Social)
  • Diagnostic and Statistical Manual of Mental Disorders (DSM...
  • Discrimination
  • Dissociative Disorders
  • Drugs and Behavior
  • Eating Disorders
  • Ecological Psychology
  • Educational Settings, Assessment of Thinking in
  • Effect Size
  • Embodiment and Embodied Cognition
  • Emerging Adulthood
  • Emotional Intelligence
  • Empathy and Altruism
  • Employee Stress and Well-Being
  • Environmental Neuroscience and Environmental Psychology
  • Ethics in Psychological Practice
  • Event Perception
  • Evolutionary Psychology
  • Expansive Posture
  • Experimental Existential Psychology
  • Exploratory Data Analysis
  • Eyewitness Testimony
  • Eysenck, Hans
  • Factor Analysis
  • Festinger, Leon
  • Five-Factor Model of Personality
  • Flynn Effect, The
  • Forensic Psychology
  • Forgiveness
  • Friendships, Children's
  • Fundamental Attribution Error/Correspondence Bias
  • Gambler's Fallacy
  • Game Theory and Psychology
  • Geropsychology, Clinical
  • Global Mental Health
  • Habit Formation and Behavior Change
  • Health Psychology
  • Health Psychology Research and Practice, Measurement in
  • Heider, Fritz
  • Heuristics and Biases
  • History of Psychology
  • Human Factors
  • Humanistic Psychology
  • Implicit Association Test (IAT)
  • Industrial and Organizational Psychology
  • Inferential Statistics in Psychology
  • Insanity Defense, The
  • Intelligence
  • Intelligence, Crystallized and Fluid
  • Intercultural Psychology
  • Intergroup Conflict
  • International Classification of Diseases and Related Healt...
  • International Psychology
  • Interviewing in Forensic Settings
  • Intimate Partner Violence, Psychological Perspectives on
  • Introversion–Extraversion
  • Item Response Theory
  • Law, Psychology and
  • Lazarus, Richard
  • Learned Helplessness
  • Learning versus Performance
  • LGBTQ+ Romantic Relationships
  • Lie Detection in a Forensic Context
  • Life-Span Development
  • Locus of Control
  • Loneliness and Health
  • Mathematical Psychology
  • Meaning in Life
  • Mechanisms and Processes of Peer Contagion
  • Media Violence, Psychological Perspectives on
  • Mediation Analysis
  • Memories, Autobiographical
  • Memories, Flashbulb
  • Memories, Repressed and Recovered
  • Memory, False
  • Memory, Human
  • Memory, Implicit versus Explicit
  • Memory in Educational Settings
  • Memory, Semantic
  • Meta-Analysis
  • Metacognition
  • Metaphor, Psychological Perspectives on
  • Microaggressions
  • Military Psychology
  • Mindfulness and Education
  • Minnesota Multiphasic Personality Inventory (MMPI)
  • Money, Psychology of
  • Moral Conviction
  • Moral Development
  • Moral Psychology
  • Moral Reasoning
  • Nature versus Nurture Debate in Psychology
  • Neuroscience of Associative Learning
  • Nonergodicity in Psychology and Neuroscience
  • Nonparametric Statistical Analysis in Psychology
  • Observational (Non-Randomized) Studies
  • Obsessive-Complusive Disorder (OCD)
  • Occupational Health Psychology
  • Olfaction, Human
  • Operant Conditioning
  • Optimism and Pessimism
  • Organizational Justice
  • Parenting Stress
  • Parenting Styles
  • Parents' Beliefs about Children
  • Path Models
  • Peace Psychology
  • Perception, Person
  • Performance Appraisal
  • Personality and Health
  • Personality Disorders
  • Personality Psychology
  • Phenomenological Psychology
  • Placebo Effects in Psychology
  • Play Behavior
  • Positive Psychological Capital (PsyCap)
  • Positive Psychology
  • Posttraumatic Stress Disorder (PTSD)
  • Prejudice and Stereotyping
  • Pretrial Publicity
  • Prisoner's Dilemma
  • Prosocial Behavior
  • Prosocial Spending and Well-Being
  • Protocol Analysis
  • Psycholinguistics
  • Psychological Literacy
  • Psychological Perspectives on Food and Eating
  • Psychology, Political
  • Psychoneuroimmunology
  • Psychophysics, Visual
  • Psychotherapy
  • Psychotic Disorders
  • Publication Bias in Psychology
  • Reasoning, Counterfactual
  • Rehabilitation Psychology
  • Relationships
  • Reliability–Contemporary Psychometric Conceptions
  • Religion, Psychology and
  • Replication Initiatives in Psychology
  • Research Methods
  • Risk Taking
  • Role of the Expert Witness in Forensic Psychology, The
  • Sample Size Planning for Statistical Power and Accurate Es...
  • Schizophrenic Disorders
  • School Psychology
  • School Psychology, Counseling Services in
  • Self, Gender and
  • Self, Psychology of the
  • Self-Construal
  • Self-Control
  • Self-Deception
  • Self-Determination Theory
  • Self-Efficacy
  • Self-Esteem
  • Self-Monitoring
  • Self-Regulation in Educational Settings
  • Self-Report Tests, Measures, and Inventories in Clinical P...
  • Sensation Seeking
  • Sex and Gender
  • Sexual Minority Parenting
  • Sexual Orientation
  • Signal Detection Theory and its Applications
  • Simpson's Paradox in Psychology
  • Single People
  • Single-Case Experimental Designs
  • Skinner, B.F.
  • Sleep and Dreaming
  • Small Groups
  • Social Class and Social Status
  • Social Cognition
  • Social Neuroscience
  • Social Support
  • Social Touch and Massage Therapy Research
  • Somatoform Disorders
  • Spatial Attention
  • Sports Psychology
  • Stanford Prison Experiment (SPE): Icon and Controversy
  • Stereotype Threat
  • Stereotypes
  • Stress and Coping, Psychology of
  • Subjective Wellbeing Homeostasis
  • Taste, Psychological Perspectives on
  • Terror Management Theory
  • Testing and Assessment
  • The Concept of Validity in Psychological Assessment
  • The Neuroscience of Emotion Regulation
  • The Reasoned Action Approach and the Theories of Reasoned ...
  • The Weapon Focus Effect in Eyewitness Memory
  • Theory of Mind
  • Therapies, Person-Centered
  • Therapy, Cognitive-Behavioral
  • Time Perception
  • Trait Perspective
  • Trauma Psychology
  • Twin Studies
  • Type A Behavior Pattern (Coronary Prone Personality)
  • Unconscious Processes
  • Video Games and Violent Content
  • Virtues and Character Strengths
  • Women and Science, Technology, Engineering, and Math (STEM...
  • Women, Psychology of
  • Work Well-Being
  • Wundt, Wilhelm
  • Privacy Policy
  • Cookie Policy
  • Legal Notice
  • Accessibility

Powered by:

  • [66.249.64.20|185.80.151.9]
  • 185.80.151.9

Warren Berger

A Crash Course in Critical Thinking

What you need to know—and read—about one of the essential skills needed today..

Posted April 8, 2024 | Reviewed by Michelle Quirk

  • In research for "A More Beautiful Question," I did a deep dive into the current crisis in critical thinking.
  • Many people may think of themselves as critical thinkers, but they actually are not.
  • Here is a series of questions you can ask yourself to try to ensure that you are thinking critically.

Conspiracy theories. Inability to distinguish facts from falsehoods. Widespread confusion about who and what to believe.

These are some of the hallmarks of the current crisis in critical thinking—which just might be the issue of our times. Because if people aren’t willing or able to think critically as they choose potential leaders, they’re apt to choose bad ones. And if they can’t judge whether the information they’re receiving is sound, they may follow faulty advice while ignoring recommendations that are science-based and solid (and perhaps life-saving).

Moreover, as a society, if we can’t think critically about the many serious challenges we face, it becomes more difficult to agree on what those challenges are—much less solve them.

On a personal level, critical thinking can enable you to make better everyday decisions. It can help you make sense of an increasingly complex and confusing world.

In the new expanded edition of my book A More Beautiful Question ( AMBQ ), I took a deep dive into critical thinking. Here are a few key things I learned.

First off, before you can get better at critical thinking, you should understand what it is. It’s not just about being a skeptic. When thinking critically, we are thoughtfully reasoning, evaluating, and making decisions based on evidence and logic. And—perhaps most important—while doing this, a critical thinker always strives to be open-minded and fair-minded . That’s not easy: It demands that you constantly question your assumptions and biases and that you always remain open to considering opposing views.

In today’s polarized environment, many people think of themselves as critical thinkers simply because they ask skeptical questions—often directed at, say, certain government policies or ideas espoused by those on the “other side” of the political divide. The problem is, they may not be asking these questions with an open mind or a willingness to fairly consider opposing views.

When people do this, they’re engaging in “weak-sense critical thinking”—a term popularized by the late Richard Paul, a co-founder of The Foundation for Critical Thinking . “Weak-sense critical thinking” means applying the tools and practices of critical thinking—questioning, investigating, evaluating—but with the sole purpose of confirming one’s own bias or serving an agenda.

In AMBQ , I lay out a series of questions you can ask yourself to try to ensure that you’re thinking critically. Here are some of the questions to consider:

  • Why do I believe what I believe?
  • Are my views based on evidence?
  • Have I fairly and thoughtfully considered differing viewpoints?
  • Am I truly open to changing my mind?

Of course, becoming a better critical thinker is not as simple as just asking yourself a few questions. Critical thinking is a habit of mind that must be developed and strengthened over time. In effect, you must train yourself to think in a manner that is more effortful, aware, grounded, and balanced.

For those interested in giving themselves a crash course in critical thinking—something I did myself, as I was working on my book—I thought it might be helpful to share a list of some of the books that have shaped my own thinking on this subject. As a self-interested author, I naturally would suggest that you start with the new 10th-anniversary edition of A More Beautiful Question , but beyond that, here are the top eight critical-thinking books I’d recommend.

The Demon-Haunted World: Science as a Candle in the Dark , by Carl Sagan

This book simply must top the list, because the late scientist and author Carl Sagan continues to be such a bright shining light in the critical thinking universe. Chapter 12 includes the details on Sagan’s famous “baloney detection kit,” a collection of lessons and tips on how to deal with bogus arguments and logical fallacies.

scientific definition for critical thinking

Clear Thinking: Turning Ordinary Moments Into Extraordinary Results , by Shane Parrish

The creator of the Farnham Street website and host of the “Knowledge Project” podcast explains how to contend with biases and unconscious reactions so you can make better everyday decisions. It contains insights from many of the brilliant thinkers Shane has studied.

Good Thinking: Why Flawed Logic Puts Us All at Risk and How Critical Thinking Can Save the World , by David Robert Grimes

A brilliant, comprehensive 2021 book on critical thinking that, to my mind, hasn’t received nearly enough attention . The scientist Grimes dissects bad thinking, shows why it persists, and offers the tools to defeat it.

Think Again: The Power of Knowing What You Don't Know , by Adam Grant

Intellectual humility—being willing to admit that you might be wrong—is what this book is primarily about. But Adam, the renowned Wharton psychology professor and bestselling author, takes the reader on a mind-opening journey with colorful stories and characters.

Think Like a Detective: A Kid's Guide to Critical Thinking , by David Pakman

The popular YouTuber and podcast host Pakman—normally known for talking politics —has written a terrific primer on critical thinking for children. The illustrated book presents critical thinking as a “superpower” that enables kids to unlock mysteries and dig for truth. (I also recommend Pakman’s second kids’ book called Think Like a Scientist .)

Rationality: What It Is, Why It Seems Scarce, Why It Matters , by Steven Pinker

The Harvard psychology professor Pinker tackles conspiracy theories head-on but also explores concepts involving risk/reward, probability and randomness, and correlation/causation. And if that strikes you as daunting, be assured that Pinker makes it lively and accessible.

How Minds Change: The Surprising Science of Belief, Opinion and Persuasion , by David McRaney

David is a science writer who hosts the popular podcast “You Are Not So Smart” (and his ideas are featured in A More Beautiful Question ). His well-written book looks at ways you can actually get through to people who see the world very differently than you (hint: bludgeoning them with facts definitely won’t work).

A Healthy Democracy's Best Hope: Building the Critical Thinking Habit , by M Neil Browne and Chelsea Kulhanek

Neil Browne, author of the seminal Asking the Right Questions: A Guide to Critical Thinking, has been a pioneer in presenting critical thinking as a question-based approach to making sense of the world around us. His newest book, co-authored with Chelsea Kulhanek, breaks down critical thinking into “11 explosive questions”—including the “priors question” (which challenges us to question assumptions), the “evidence question” (focusing on how to evaluate and weigh evidence), and the “humility question” (which reminds us that a critical thinker must be humble enough to consider the possibility of being wrong).

Warren Berger

Warren Berger is a longtime journalist and author of A More Beautiful Question .

  • Find a Therapist
  • Find a Treatment Center
  • Find a Support Group
  • International
  • New Zealand
  • South Africa
  • Switzerland
  • Asperger's
  • Bipolar Disorder
  • Chronic Pain
  • Eating Disorders
  • Passive Aggression
  • Personality
  • Goal Setting
  • Positive Psychology
  • Stopping Smoking
  • Low Sexual Desire
  • Relationships
  • Child Development
  • Therapy Center NEW
  • Diagnosis Dictionary
  • Types of Therapy

March 2024 magazine cover

Understanding what emotional intelligence looks like and the steps needed to improve it could light a path to a more emotionally adept world.

  • Coronavirus Disease 2019
  • Affective Forecasting
  • Neuroscience

U.S. flag

An official website of the United States government

The .gov means it’s official. Federal government websites often end in .gov or .mil. Before sharing sensitive information, make sure you’re on a federal government site.

The site is secure. The https:// ensures that you are connecting to the official website and that any information you provide is encrypted and transmitted securely.

  • Publications
  • Account settings

Preview improvements coming to the PMC website in October 2024. Learn More or Try it out now .

  • Advanced Search
  • Journal List

Logo of plosone

What influences students’ abilities to critically evaluate scientific investigations?

Ashley b. heim.

1 Department of Ecology and Evolutionary Biology, Cornell University, Ithaca, NY, United States of America

2 Laboratory of Atomic and Solid State Physics, Cornell University, Ithaca, NY, United States of America

David Esparza

Michelle k. smith, n. g. holmes, associated data.

All raw data files are available from the Cornell Institute for Social and Economic Research (CISER) data and reproduction archive ( https://archive.ciser.cornell.edu/studies/2881 ).

Critical thinking is the process by which people make decisions about what to trust and what to do. Many undergraduate courses, such as those in biology and physics, include critical thinking as an important learning goal. Assessing critical thinking, however, is non-trivial, with mixed recommendations for how to assess critical thinking as part of instruction. Here we evaluate the efficacy of assessment questions to probe students’ critical thinking skills in the context of biology and physics. We use two research-based standardized critical thinking instruments known as the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and Physics Lab Inventory of Critical Thinking (PLIC). These instruments provide experimental scenarios and pose questions asking students to evaluate what to trust and what to do regarding the quality of experimental designs and data. Using more than 3000 student responses from over 20 institutions, we sought to understand what features of the assessment questions elicit student critical thinking. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting. We found that students are more critical when making comparisons between two studies than when evaluating each study individually. Also, compare-and-contrast questions are sufficient for eliciting critical thinking, with students providing similar answers regardless of if the individual evaluation questions are included. This research offers new insight on the types of assessment questions that elicit critical thinking at the introductory undergraduate level; specifically, we recommend instructors incorporate more compare-and-contrast questions related to experimental design in their courses and assessments.

Introduction

Critical thinking and its importance.

Critical thinking, defined here as “the ways in which one uses data and evidence to make decisions about what to trust and what to do” [ 1 ], is a foundational learning goal for almost any undergraduate course and can be integrated in many points in the undergraduate curriculum. Beyond the classroom, critical thinking skills are important so that students are able to effectively evaluate data presented to them in a society where information is so readily accessible [ 2 , 3 ]. Furthermore, critical thinking is consistently ranked as one of the most necessary outcomes of post-secondary education for career advancement by employers [ 4 ]. In the workplace, those with critical thinking skills are more competitive because employers assume they can make evidence-based decisions based on multiple perspectives, keep an open mind, and acknowledge personal limitations [ 5 , 6 ]. Despite the importance of critical thinking skills, there are mixed recommendations on how to elicit and assess critical thinking during and as a result of instruction. In response, here we evaluate the degree to which different critical thinking questions elicit students’ critical thinking skills.

Assessing critical thinking in STEM

Across STEM (i.e., science, technology, engineering, and mathematics) disciplines, several standardized assessments probe critical thinking skills. These assessments focus on aspects of critical thinking and ask students to evaluate experimental methods [ 7 – 11 ], form hypotheses and make predictions [ 12 , 13 ], evaluate data [ 2 , 12 – 14 ], or draw conclusions based on a scenario or figure [ 2 , 12 – 14 ]. Many of these assessments are open-response, so they can be difficult to score, and several are not freely available.

In addition, there is an ongoing debate regarding whether critical thinking is a domain-general or context-specific skill. That is, can someone transfer their critical thinking skills from one domain or context to another (domain-general) or do their critical thinking skills only apply in their domain or context of expertise (context-specific)? Research on the effectiveness of teaching critical thinking has found mixed results, primarily due to a lack of consensus definition of and assessment tools for critical thinking [ 15 , 16 ]. Some argue that critical thinking is domain-general—or what Ennis refers to as the “general approach”—because it is an overlapping skill that people use in various aspects of their lives [ 17 ]. In contrast, others argue that critical thinking must be elicited in a context-specific domain, as prior knowledge is needed to make informed decisions in one’s discipline [ 18 , 19 ]. Current assessments include domain-general components [ 2 , 7 , 8 , 14 , 20 , 21 ], asking students to evaluate, for instance, experiments on the effectiveness of dietary supplements in athletes [ 20 ] and context-specific components, such as to measure students’ abilities to think critically in domains such as neuroscience [ 9 ] and biology [ 10 ].

Others maintain the view that critical thinking is a context-specific skill for the purpose of undergraduate education, but argue that it should be content accessible [ 22 – 24 ], as “thought processes are intertwined with what is being thought about” [ 23 ]. From this viewpoint, the context of the assessment would need to be embedded in a relatively accessible context to assess critical thinking independent of students’ content knowledge. Thus, to effectively elicit critical thinking among students, instructors should use assessments that present students with accessible domain-specific information needed to think deeply about the questions being asked [ 24 , 25 ].

Within the context of STEM, current critical thinking assessments primarily ask students to evaluate a single experimental scenario (e.g., [ 10 , 20 ]), though compare-and-contrast questions about more than one scenario can be a powerful way to elicit critical thinking [ 26 , 27 ]. Generally included in the “Analysis” level of Bloom’s taxonomy [ 28 – 30 ], compare-and-contrast questions encourage students to recognize, distinguish between, and relate features between scenarios and discern relevant patterns or trends, rather than compile lists of important features [ 26 ]. For example, a compare-and-contrast assessment may ask students to compare the hypotheses and research methods used in two different experimental scenarios, instead of having them evaluate the research methods of a single experiment. Alternatively, students may inherently recall and use experimental scenarios based on their prior experiences and knowledge as they evaluate an individual scenario. In addition, evaluating a single experimental scenario individually may act as metacognitive scaffolding [ 31 , 32 ]—a process which “guides students by asking questions about the task or suggesting relevant domain-independent strategies [ 32 ]—to support students in their compare-and-contrast thinking.

Purpose and research questions

Our primary objective of this study was to better understand what features of assessment questions elicit student critical thinking using two existing instruments in STEM: the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and Physics Lab Inventory of Critical Thinking (PLIC). We focused on biology and physics since critical thinking assessments were already available for these disciplines. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they are individually evaluating one study at a time or comparing and contrasting two studies and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting.

Providing undergraduates with ample opportunities to practice critical thinking skills in the classroom is necessary for evidence-based critical thinking in their future careers and everyday life. While most critical thinking instruments in biology and physics contexts have undergone some form of validation to ensure they are accurately measuring the intended construct, to our knowledge none have explored how different question types influence students’ critical thinking. This research offers new insight on the types of questions that elicit critical thinking, which can further be applied by educators and researchers across disciplines to measure cognitive student outcomes and incorporate more effective critical thinking opportunities in the classroom.

Ethics statement

The procedures for this study were approved by the Institutional Review Board of Cornell University (Eco-BLIC: #1904008779; PLIC: #1608006532). Informed consent was obtained by all participating students via online consent forms at the beginning of the study, and students did not receive compensation for participating in this study unless their instructor offered credit for completing the assessment.

Participants and assessment distribution

We administered the Eco-BLIC to undergraduate students across 26 courses at 11 institutions (six doctoral-granting, three Master’s-granting, and two Baccalaureate-granting) in Fall 2020 and Spring 2021 and received 1612 usable responses. Additionally, we administered the PLIC to undergraduate students across 21 courses at 11 institutions (six doctoral-granting, one Master’s-granting, three four-year colleges, and one 2-year college) in Fall 2020 and Spring 2021 and received 1839 usable responses. We recruited participants via convenience sampling by emailing instructors of primarily introductory ecology-focused courses or introductory physics courses who expressed potential interest in implementing our instrument in their course(s). Both instruments were administered online via Qualtrics and students were allowed to complete the assessments outside of class. The demographic distribution of the response data is presented in Table 1 , all of which were self-reported by students. The values presented in this table represent all responses we received.

Instrument description

Question types.

Though the content and concepts featured in the Eco-BLIC and PLIC are distinct, both instruments share a similar structure and set of question types. The Eco-BLIC—which was developed using a structure similar to that of the PLIC [ 1 ]—includes two predator-prey scenarios based on relationships between (a) smallmouth bass and mayflies and (b) great-horned owls and house mice. Within each scenario, students are presented with a field-based study and a laboratory-based study focused on a common research question about feeding behaviors of smallmouth bass or house mice, respectively. The prompts for these two Eco-BLIC scenarios are available in S1 and S2 Appendices. The PLIC focuses on two research groups conducting different experiments to test the relationship between oscillation periods of masses hanging on springs [ 1 ]; the prompts for this scenario can be found in S3 Appendix . The descriptive prompts in both the Eco-BLIC and PLIC also include a figure presenting data collected by each research group, from which students are expected to draw conclusions. The research scenarios (e.g., field-based group and lab-based group on the Eco-BLIC) are written so that each group has both strengths and weaknesses in their experimental designs.

After reading the prompt for the first experimental group (Group 1) in each instrument, students are asked to identify possible claims from Group 1’s data (data evaluation questions). Students next evaluate the strengths and weaknesses of various study features for Group 1 (individual evaluation questions). Examples of these individual evaluation questions are in Table 2 . They then suggest next steps the group should pursue (next steps items). Students are then asked to read about the prompt describing the second experimental group’s study (Group 2) and again answer questions about the possible claims, strengths and weaknesses, and next steps of Group 2’s study (data evaluation questions, individual evaluation questions, and next steps items). Once students have independently evaluated Groups 1 and 2, they answer a series of questions to compare the study approaches of Group 1 versus Group 2 (group comparison items). In this study, we focus our analysis on the individual evaluation questions and group comparison items.

The Eco-BLIC examples are derived from the owl/mouse scenario.

Instrument versions

To determine whether the individual evaluation questions impacted the assessment of students’ critical thinking, students were randomly assigned to take one of two versions of the assessment via Qualtrics branch logic: 1) a version that included the individual evaluation and group comparison items or 2) a version with only the group comparison items, with the individual evaluation questions removed. We calculated the median time it took students to answer each of these versions for both the Eco-BLIC and PLIC.

Think-aloud interviews

We also conducted one-on-one think-aloud interviews with students to elicit feedback on the assessment questions (Eco-BLIC n = 21; PLIC n = 4). Students were recruited via convenience sampling at our home institution and were primarily majoring in biology or physics. All interviews were audio-recorded and screen captured via Zoom and lasted approximately 30–60 minutes. We asked participants to discuss their reasoning for answering each question as they progressed through the instrument. We did not analyze these interviews in detail, but rather used them to extract relevant examples of critical thinking that helped to explain our quantitative findings. Multiple think-aloud interviews were conducted with students using previous versions of the PLIC [ 1 ], though these data are not discussed here.

Data analyses

Our analyses focused on (1) investigating the alignment between students’ responses to the individual evaluation questions and the group comparison items and (2) comparing student responses between the two instrument versions. If individual evaluation and group comparison items elicit critical thinking in the same way, we would expect to see the same frequency of responses for each question type, as per Fig 1 . For example, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a strength, we would expect that students would respond that both groups were highly effective for this study feature on the group comparison item (i.e., data represented by the purple circle in the top right quadrant of Fig 1 ). Alternatively, if students evaluated one study feature of Group 1 as a strength and the same study feature for Group 2 as a weakness, we would expect that students would indicate that Group 1 was more effective than Group 2 on the group comparison item (i.e., data represented by the green circle in the lower right quadrant of Fig 1 ).

An external file that holds a picture, illustration, etc.
Object name is pone.0273337.g001.jpg

The x- and y-axes represent rankings on the individual evaluation questions for Groups 1 and 2 (or field and lab groups), respectively. The colors in the legend at the top of the figure denote responses to the group comparison items. In this idealized example, all pie charts are the same size to indicate that the student answers are equally proportioned across all answer combinations.

We ran descriptive statistics to summarize student responses to questions and examine distributions and frequencies of the data on the Eco-BLIC and PLIC. We also conducted chi-square goodness-of-fit tests to analyze differences in student responses between versions within the relevant questions from the same instrument. In all of these tests, we used a Bonferroni correction to lower the chances of receiving a false positive and account for multiple comparisons. We generated figures—primarily multi-pie chart graphs and heat maps—to visualize differences between individual evaluation and group comparison items and between versions of each instrument with and without individual evaluation questions, respectively. All aforementioned data analyses and figures were conducted or generated in the R statistical computing environment (v. 4.1.1) and Microsoft Excel.

We asked students to evaluate different experimental set-ups on the Eco-BLIC and PLIC two ways. Students first evaluated the strengths and weaknesses of study features for each scenario individually (individual evaluation questions, Table 2 ) and, subsequently, answered a series of questions to compare and contrast the study approaches of both research groups side-by-side (group comparison items, Table 2 ). Through analyzing the individual evaluation questions, we found that students generally ranked experimental features (i.e., those related to study set-up, data collection and summary methods, and analysis and outcomes) of the independent research groups as strengths ( Fig 2 ), evidenced by the mean scores greater than 2 on a scale from 1 (weakness) to 4 (strength).

An external file that holds a picture, illustration, etc.
Object name is pone.0273337.g002.jpg

Each box represents the interquartile range (IQR). Lines within each box represent the median. Circles represent outliers of mean scores for each question.

Individual evaluation versus compare-and-contrast evaluation

Our results indicate that when students consider Group 1 or Group 2 individually, they mark most study features as strengths (consistent with the means in Fig 2 ), shown by the large circles in the upper right quadrant across the three experimental scenarios ( Fig 3 ). However, the proportion of colors on each pie chart shows that students select a range of responses when comparing the two groups [e.g., Group 1 being more effective (green), Group 2 being more effective (blue), both groups being effective (purple), and neither group being effective (orange)]. We infer that students were more discerning (i.e., more selective) when they were asked to compare the two groups across the various study features ( Fig 3 ). In short, students think about the groups differently if they are rating either Group 1 or Group 2 in the individual evaluation questions versus directly comparing Group 1 to Group 2.

An external file that holds a picture, illustration, etc.
Object name is pone.0273337.g003.jpg

The x- and y-axes represent students’ rankings on the individual evaluation questions for Groups 1 and 2 on each assessment, respectively, where 1 indicates weakness and 4 indicates strength. The overall size of each pie chart represents the proportion of students who responded with each pair of ratings. The colors in the pie charts denote the proportion of students’ responses who chose each option on the group comparison items. (A) Eco-BLIC bass-mayfly scenario (B) Eco-BLIC owl-mouse scenario (C) PLIC oscillation periods of masses hanging on springs scenario.

These results are further supported by student responses from the think-aloud interviews. For example, one interview participant responding to the bass-mayfly scenario of the Eco-BLIC explained that accounting for bias/error in both the field and lab groups in this scenario was a strength (i.e., 4). This participant mentioned that Group 1, who performed the experiment in the field, “[had] outliers, so they must have done pretty well,” and that Group 2, who collected organisms in the field but studied them in lab, “did a good job of accounting for bias.” However, when asked to compare between the groups, this student argued that Group 2 was more effective at accounting for bias/error, noting that “they controlled for more variables.”

Another individual who was evaluating “repeated trials for each mass” in the PLIC expressed a similar pattern. In response to ranking this feature of Group 1 as a strength, they explained: “Given their uncertainties and how small they are, [the group] seems like they’ve covered their bases pretty well.” Similarly, they evaluated this feature of Group 2 as a strength as well, simply noting: “Same as the last [group], I think it’s a strength.” However, when asked to compare between Groups 1 and 2, this individual argued that Group 1 was more effective because they conducted more trials.

Individual evaluation questions to support compare and contrast thinking

Given that students were more discerning when they directly compared two groups for both biology and physics experimental scenarios, we next sought to determine if the individual evaluation questions for Group 1 or Group 2 were necessary to elicit or helpful to support student critical thinking about the investigations. To test this, students were randomly assigned to one of two versions of the instrument. Students in one version saw individual evaluation questions about Group 1 and Group 2 and then saw group comparison items for Group 1 versus Group 2. Students in the second version only saw the group comparison items. We found that students assigned to both versions responded similarly to the group comparison questions, indicating that the individual evaluation questions did not promote additional critical thinking. We visually represent these similarities across versions with and without the individual evaluation questions in Fig 4 as heat maps.

An external file that holds a picture, illustration, etc.
Object name is pone.0273337.g004.jpg

The x-axis denotes students’ responses on the group comparison items (i.e., whether they ranked Group 1 as more effective, Group 2 as more effective, both groups as highly effective, or neither group as effective/both groups were minimally effective). The y-axis lists each of the study features that students compared between the field and lab groups. White and lighter shades of red indicate a lower percentage of student responses, while brighter red indicates a higher percentage of student responses. (A) Eco-BLIC bass-mayfly scenario. (B) Eco-BLIC owl-mouse scenario. (C) PLIC oscillation periods of masses hanging on springs scenario.

We ran chi-square goodness-of-fit tests on the answers between student responses on both instrument versions and there were no significant differences on the Eco-BLIC bass-mayfly scenario ( Fig 4A ; based on an adjusted p -value of 0.006) or owl-mouse questions ( Fig 4B ; based on an adjusted p-value of 0.004). There were only three significant differences (out of 53 items) in how students responded to questions on both versions of the PLIC ( Fig 4C ; based on an adjusted p -value of 0.0005). The items that students responded to differently ( p <0.0005) across both versions were items where the two groups were identical in their design; namely, the equipment used (i.e., stopwatches), the variables measured (i.e., time and mass), and the number of bounces of the spring per trial (i.e., five bounces). We calculated Cramer’s C (Vc; [ 33 ]), a measure commonly applied to Chi-square goodness of fit models to understand the magnitude of significant results. We found that the effect sizes for these three items were small (Vc = 0.11, Vc = 0.10, Vc = 0.06, respectively).

The trend that students answer the Group 1 versus Group 2 comparison questions similarly, regardless of whether they responded to the individual evaluation questions, is further supported by student responses from the think-aloud interviews. For example, one participant who did not see the individual evaluation questions for the owl-mouse scenario of the Eco-BLIC independently explained that sampling mice from other fields was a strength for both the lab and field groups. They explained that for the lab group, “I think that [the mice] coming from multiple nearby fields is good…I was curious if [mouse] behavior was universal.” For the field group, they reasoned, “I also noticed it was just from a single nearby field…I thought that was good for control.” However, this individual ultimately reasoned that the field group was “more effective for sampling methods…it’s better to have them from a single field because you know they were exposed to similar environments.” Thus, even without individual evaluation questions available, students can still make individual evaluations when comparing and contrasting between groups.

We also determined that removing the individual evaluation questions decreased the duration of time students needed to complete the Eco-BLIC and PLIC. On the Eco-BLIC, the median time to completion for the version with individual evaluation and group comparison questions was approximately 30 minutes, while the version with only the group comparisons had a median time to completion of 18 minutes. On the PLIC, the median time to completion for the version with individual evaluation questions and group comparison questions was approximately 17 minutes, while the version with only the group comparisons had a median time to completion of 15 minutes.

To determine how to elicit critical thinking in a streamlined manner using introductory biology and physics material, we investigated (a) how students critically evaluate aspects of experimental investigations in biology and physics when they are individually evaluating one study at a time versus comparing and contrasting two and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting.

Students are more discerning when making comparisons

We found that students were more discerning when comparing between the two groups in the Eco-BLIC and PLIC rather than when evaluating each group individually. While students tended to independently evaluate study features of each group as strengths ( Fig 2 ), there was greater variation in their responses to which group was more effective when directly comparing between the two groups ( Fig 3 ). Literature evaluating the role of contrasting cases provides plausible explanations for our results. In that work, contrasting between two cases supports students in identifying deep features of the cases, compared with evaluating one case after the other [ 34 – 37 ]. When presented with a single example, students may deem certain study features as unimportant or irrelevant, but comparing study features side-by-side allows students to recognize the distinct features of each case [ 38 ]. We infer, therefore, that students were better able to recognize the strengths and weaknesses of the two groups in each of the assessment scenarios when evaluating the groups side by side, rather than in isolation [ 39 , 40 ]. This result is somewhat surprising, however, as students could have used their knowledge of experimental designs as a contrasting case when evaluating each group. Future work, therefore, should evaluate whether experts use their vast knowledge base of experimental studies as discerning contrasts when evaluating each group individually. This work would help determine whether our results here suggest that students do not have a sufficient experiment-base to use as contrasts or if the students just do not use their experiment-base when evaluating the individual groups. Regardless, our study suggests that critical thinking assessments should ask students to compare and contrast experimental scenarios, rather than just evaluate individual cases.

Individual evaluation questions do not influence answers to compare and contrast questions

We found that individual evaluation questions were unnecessary for eliciting or supporting students’ critical thinking on the two assessments. Students responded to the group comparison items similarly whether or not they had received the individual evaluation questions. The exception to this pattern was that students responded differently to three group comparison items on the PLIC when individual evaluation questions were provided. These three questions constituted a small portion of the PLIC and showed a small effect size. Furthermore, removing the individual evaluation questions decreased the median time for students to complete the Eco-BLIC and PLIC. It is plausible that spending more time thinking about the experimental methods while responding to the individual evaluation questions would then prepare students to be better discerners on the group comparison questions. However, the overall trend is that individual evaluation questions do not have a strong impact on how students evaluate experimental scenarios, nor do they set students up to be better critical thinkers later. This finding aligns with prior research suggesting that students tend to disregard details when they evaluate a single case, rather than comparing and contrasting multiple cases [ 38 ], further supporting our findings about the effectiveness of the group comparison questions.

Practical implications

Individual evaluation questions were not effective for students to engage in critical thinking nor to prepare them for subsequent questions that elicit their critical thinking. Thus, researchers and instructors could make critical thinking assessments more effective and less time-consuming by encouraging comparisons between cases. Additionally, the study raises a question about whether instruction should incorporate more experimental case studies throughout their courses and assessments so that students have a richer experiment-base to use as contrasts when evaluating individual experimental scenarios. To help students discern information about experimental design, we suggest that instructors consider providing them with multiple experimental studies (i.e., cases) and asking them to compare and contrast between these studies.

Future directions and limitations

When designing critical thinking assessments, questions should ask students to make meaningful comparisons that require them to consider the important features of the scenarios. One challenge of relying on compare-and-contrast questions in the Eco-BLIC and PLIC to elicit students’ critical thinking is ensuring that students are comparing similar yet distinct study features across experimental scenarios, and that these comparisons are meaningful [ 38 ]. For example, though sample size is different between experimental scenarios in our instruments, it is a significant feature that has implications for other aspects of the research like statistical analyses and behaviors of the animals. Therefore, one limitation of our study could be that we exclusively focused on experimental method evaluation questions (i.e., what to trust), and we are unsure if the same principles hold for other dimensions of critical thinking (i.e., what to do). Future research should explore whether questions that are not in a compare-and-contrast format also effectively elicit critical thinking, and if so, to what degree.

As our question schema in the Eco-BLIC and PLIC were designed for introductory biology and physics content, it is unknown how effective this question schema would be for upper-division biology and physics undergraduates who we would expect to have more content knowledge and prior experiences for making comparisons in their respective disciplines [ 18 , 41 ]. For example, are compare-and-contrast questions still needed to elicit critical thinking among upper-division students, or would critical thinking in this population be more effectively assessed by incorporating more sophisticated data analyses in the research scenarios? Also, if students with more expert-like thinking have a richer set of experimental scenarios to inherently use as contrasts when comparing, we might expect their responses on the individual evaluation questions and group comparisons to better align. To further examine how accessible and context-specific the Eco-BLIC and PLIC are, novel scenarios could be developed that incorporate topics and concepts more commonly addressed in upper-division courses. Additionally, if instructors offer students more experience comparing and contrasting experimental scenarios in the classroom, would students be more discerning on the individual evaluation questions?

While a single consensus definition of critical thinking does not currently exist [ 15 ], continuing to explore critical thinking in other STEM disciplines beyond biology and physics may offer more insight into the context-specific nature of critical thinking [ 22 , 23 ]. Future studies should investigate critical thinking patterns in other STEM disciplines (e.g., mathematics, engineering, chemistry) through designing assessments that encourage students to evaluate aspects of at least two experimental studies. As undergraduates are often enrolled in multiple courses simultaneously and thus have domain-specific knowledge in STEM, would we observe similar patterns in critical thinking across additional STEM disciplines?

Lastly, we want to emphasize that we cannot infer every aspect of critical thinking from students’ responses on the Eco-BLIC and PLIC. However, we suggest that student responses on the think-aloud interviews provide additional qualitative insight into how and why students were making comparisons in each scenario and their overall critical thinking processes.

Conclusions

Overall, we found that comparing and contrasting two different experiments is an effective and efficient way to elicit context-specific critical thinking in introductory biology and physics undergraduates using the Eco-BLIC and the PLIC. Students are more discerning (i.e., critical) and engage more deeply with the scenarios when making comparisons between two groups. Further, students do not evaluate features of experimental studies differently when individual evaluation questions are provided or removed. These novel findings hold true across both introductory biology and physics, based on student responses on the Eco-BLIC and PLIC, respectively—though there is much more to explore regarding critical thinking processes of students across other STEM disciplines and in more advanced stages of their education. Undergraduate students in STEM need to be able to critically think for career advancement, and the Eco-BLIC and PLIC are two means of measuring students’ critical thinking in biology and physics experimental contexts via comparing and contrasting. This research offers new insight on the types of questions that elicit critical thinking, which can further be applied by educators and researchers across disciplines to teach and measure cognitive student outcomes. Specifically, we recommend instructors incorporate more compare-and-contrast questions related to experimental design in their courses to efficiently elicit undergraduates’ critical thinking.

Supporting information

S1 appendix, s2 appendix, s3 appendix, acknowledgments.

We thank the members of the Cornell Discipline-based Education Research group for their feedback on this article, as well as our advisory board (Jenny Knight, Meghan Duffy, Luanna Prevost, and James Hewlett) and the AAALab for their ideas and suggestions. We also greatly appreciate the instructors who shared the Eco-BLIC and PLIC in their classes and the students who participated in this study.

Funding Statement

This work was supported by the National Science Foundation under grants DUE-1909602 (MS & NH) and DUE-1611482 (NH). NSF: nsf.gov The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Data Availability

Advertisement

Advertisement

An empirical analysis of the relationship between nature of science and critical thinking through science definitions and thinking skills

  • Original Paper
  • Open access
  • Published: 08 December 2022
  • Volume 2 , article number  270 , ( 2022 )

Cite this article

You have full access to this open access article

  • María Antonia Manassero-Mas   ORCID: orcid.org/0000-0002-7804-7779 1 &
  • Ángel Vázquez-Alonso   ORCID: orcid.org/0000-0001-5830-7062 2  

1996 Accesses

Explore all metrics

Critical thinking (CRT) skills transversally pervade education and nature of science (NOS) knowledge is a key component of science literacy. Some science education researchers advocate that CRT skills and NOS knowledge have a mutual impact and relationship. However, few research studies have undertaken the empirical confirmation of this relationship and most fail to match the two terms of the relationship adequately. This paper aims to test the relationship by applying correlation, regression and ANOVA procedures to the students’ answers to two tests that measure thinking skills and science definitions. The results partly confirm the hypothesised relationship, which displays some complex features: on the one hand, the relationship is positive and significant for the NOS variables that express adequate ideas about science. However, it is non-significant when the NOS variables depict misinformed ideas about science. Furthermore, the comparison of the two student cohorts reveals that two years of science instruction do not seem to contribute to advancing students’ NOS conceptions. Finally, some interpretations and consequences of these results for scientific literacy, teaching NOS (paying attention both to informed and misinformed ideas), for connecting NOS with general epistemic knowledge, and assessing CRT skills are discussed.

Similar content being viewed by others

scientific definition for critical thinking

Philosophical Inquiry and Critical Thinking in Primary and Secondary Science Education

scientific definition for critical thinking

Scientific Thinking and Critical Thinking in Science Education

Antonio García-Carmona

scientific definition for critical thinking

Using Socioscientific Issues to Promote the Critical Thinking Skills of Year 10 Science Students in Diverse School Contexts

Avoid common mistakes on your manuscript.

Introduction

Among other objectives, school science education perennially aims to improve scientific literacy for all, which involves being useful and functional for making adequate and sound personal and social daily life decisions. An essential component of scientific literacy is the knowledge “about” science, that is, knowledge about how science works and validates its knowledge and intervenes in the world (along with technology). This study focuses on the knowledge about science, which is often referred to in the literature as nature of science (NOS), scientific practice, ideas about science, etc., in turn, related to a continuous innovative teaching tradition (Vesterinen et al., 2014 ; Khishfe, 2012 ; Lederman, 2007 ; Matthews, 2012 ; McComas, 1996 ; Olson, 2018 ; among others).

On the other hand, some international reports and experts state that critical thinking (CRT) skills are key and transversal competencies for all educational levels, subjects and jobs in the 21st century. For instance, the European Union ( 2014 ) proposes seven key competencies that require developing a set of transversal skills, namely CRT, creativity, initiative, problem-solving, risk assessment, decision-making, communication and constructive management of emotions. In the same vein, the National Research Council ( 2012 ) proposes the transferable knowledge and skills for life and work, which explicitly details the following skills: argumentation, problem-solving, decision-making, analysis, interpretation, creativity, and others. In short, these and many other proposals converge in pointing out that teaching students to think and educating in CRT skills is an innovative and significant challenge for 21st century education and, of course, for science education. The CRT construct has been widely developed within psychological research. Yet, the field is complex, and terminologically bewildering (i.e., higher-order skills, cognitive skills, thinking skills, CRT, and other terms are used interchangeably), and some controversies are still unresolved. For instance, scholars do not agree on a common definition of CRT, and the most appropriate set of skills and dispositions to depict CRT is also disputed. As the differences among scholars still persist, the term CRT will be adopted hereafter to generally describe the variety of higher-order thinking skills that are usually associated in the CRT literature.

Further, some science education research currently suggests connections between NOS and CRT, arguing that CRT skills and NOS knowledge are related. Some claim that thinking skills are key to learning NOS (Erduran & Kaya, 2018 ; Ford & Yore, 2014 ; García-Mila & Andersen, 2008 ; Simonneaux, 2014 ), and specifically, that argumentation skills may enhance NOS understanding (Khishfe et al., 2017 ). In contrast, as argumentation skills are a key competence for the construction and validation of scientific knowledge, other studies claim that NOS knowledge (i.e., understanding the differences between data and claims) is also key to learning CRT skills such as argumentation (Allchin & Zemplén, 2020 ; Greene et al., 2016 ; Settlage & Southerland, 2020 ). Both directions of this intuitive relationship between CRT skills and NOS are fruitful ways to enhance scientific literacy and general learning. Hence, this study aims to empirically explore the NOS-CRT relationship, as the prior literature is somewhat mystifying and its contributions are limited, as will be shown below.

Theoretical contextualization

This study copes with two different, vast and rich realms of research, namely NOS and CRT, and their theoretical frameworks: the interdisciplinary context of philosophy, sociology, and history of science and science education for NOS; and psychology and general education for CRT skills. Both frameworks are summarized below to meet the journal space limitations.

Under the NOS label, science education has developed a fertile and vast realm of “knowledge about scientific knowledge and knowing”, which is obviously a particular case of human thinking, and probably the most developed to date. NOS represents the meta-cognitive, multifaceted and dynamic knowledge about what science is and how science works as a social way of knowing and explaining the natural world (knowledge construction and validation). This knowledge has been interdisciplinarily elaborated from history, philosophy, sociology of science and technology, and other disciplines. Scholars raised many and varied NOS issues (Matthews, 2012 ), which are relevant to scientific research and widely surpass the reduced consensus view (Lederman, 2007 ). Despite NOS complexity, it has been systematized across two broad dimensions: epistemological and social (Erduran & Dagher, 2014 ; Manassero-Mass & Vazquez-Alonso, 2019 ). The epistemological dimension refers to the principles and values underlying knowledge construction and validation, which are often described as the scientific method, empirical basis, observation, data and inference, tentativeness, theory and law, creativity, subjectivity, demarcation, and many others. The social dimension refers to the social construction of scientific knowledge and its social impact. It often deals with the scientific community and institutions, social influences, and general science-technology-society interactions (peer evaluation, communication, gender, innovation, development, funding, technology, psychology, etc.).

From its beginning, NOS research agrees that students (and teachers) hold inadequate and misinformed beliefs on NOS issues across different educational levels and contexts. Further, researchers agree that effective NOS teaching requires explicit and reflective methods to overcome the many learning barriers (Bennássarr et al., 2010 ; García et al., 2011 ; Cofré et al., 2019 ; Deng et al., 2011 ). These barriers relate to the basic processes of gathering (observation) and elaborating (analysis) data, decision-making in science, and specifically, the inability to differentiate facts and explanations and adequately coordinate evidence, justifications, arguments and conclusions; the lack of elementary meta-cognitive and self-regulation skills (i.e., the quick jump to conclusions as self-evident); the introduction of personal opinions, inferences, and reinterpretations and the dismissal of the counter-arguments or evidence that may contradict personal ideas (García-Mila & Andersen, 2008 ; McDonald & McRobbie, 2012 ).

As these barriers point directly to the general abilities involved in thinking (observation, analysis, answering questions, solving problems, decision-making and the like), researchers attribute those difficulties to the lack of the cognitive skills involved in the adequate management of the barriers, whose higher-order cognitive nature corresponds to many CRT skills (Kolstø, 2001 ; Zeidler et al., 2002 ). Thus, the solutions to overcome the barriers imply mastering the CRT skills, and, consequently, achieving successful NOS learning (Ford & Yore, 2014 ; McDonald & McRobbie, 2012 ; Simonneaux, 2014 ). Erduran and Kaya ( 2018 ) argue that the perennial aim of developing students’ and teachers’ NOS epistemic insights still remains a challenge for science education, despite decades of NOS research, due to the many aspects involved. They conclude that NOS knowledge critically demands higher-order cognitive skills. The paragraphs below elaborate on these higher-order cognitive skills or CRT skills.

Critical thinking

As previously stated, the CRT field shows many differences in scholarly knowledge on the conceptualization and composition of CRT. Ennis’ ( 1996 ) simple definition of CRT as reasonable reflective thinking focused on deciding what to believe or do is likely the most celebrated definition among many others. A Delphi panel of experts defined CRT as an intentional and self-regulated judgment, which results in interpretation, analysis, evaluation and inference, as well as the explanation of the evidentiary, conceptual, methodological, criterial or contextual considerations on which that judgment is based (American Psychological Association 1990 ).

However, the varied set of skills associated with CRT is controversial (Fisher, 2009 ). For instance, Ennis ( 2019 ) developed an extensive conception of CRT through a broad set of dispositions and abilities. Similarly, Madison ( 2004 ) proposed an extensive and comprehensive list of skills (Table 1 ).

The development of CRT tests has contributed to clarifying the relevance of the many CRT skills, as the test’s functionality requires concentrating on a few skills. For instance, Halpern’s ( 2010 ) questionnaire assesses, through everyday situations, problem-solving, verbal reasoning, probability and uncertainty, hypothesis-testing, argument analysis and decision-making. Watson and Glaser’s ( 2002 ) instrument assesses deduction, recognition of assumptions, interpretation, inference, and evaluation of arguments. The California Critical Thinking Skills Test assesses analysis, evaluation, inference, deduction and induction (Facione et al., 1998 ). It is also worth mentioning that most CRT tests target adults, although the Cornell Critical Thinking Tests (Ennis & Millman, 2005 ) were developed for a variety of young people and address several CRT skills (X test, induction, deduction, credibility, and identification of assumptions; Class Test, classical logical reasoning from premises to conclusion, etc.). The large number of CRT skills led scholars to perform efforts of synthesis and refinement that are summarized through some exemplary proposals (Table 1 ).

The CRT psychological framework presented above places the complex set of skills within the high-level cognitive constructs whose practice involves a self-directed, self-disciplined, self-supervised, and self-corrective way of thinking that presupposes conscious mastery of skills and conformity with rigorous quality standards. In addition to skills, CRT also involves effective communication and attitudinal commitment to intellectual standards to overcome the natural tendencies to fallacy and bias (self-centeredness and socio-centrism).

Science education and thinking skills

CRT skills mirror the scientific reasoning skills of scientific practice, and vice versa, based on their similar contents. This intuitive resemblance may launch expectations of their mutual relationship. Science education research has increased attention to CRT skills as promotors of meaningful learning, especially when involving NOS and understanding of socio-scientific issues (Vieira et al., 2011 ; Torres & Solbes, 2016 ; Vázquez-Alonso & Manassero-Mas, 2018 ; Yacoubian & Khishfe, 2018 , among others). Furthermore, Yacoubian ( 2015 ) elaborated several reasons to consider CRT a fundamental pillar for NOS learning.

Some authors stress the convergence between science and CRT based on the word critical , as thinking and science are both critical. Critical approaches have always been considered consubstantial to science (and likely a key factor of its success), as their range spreads from specific critical social issues (i.e., scientific controversies, social acceptance of scientific knowledge, social coping with a virus pandemic) to the socially organized scepticism of science (i.e., peer evaluation, scientific communication). The latter is considered a universal value of scientific practice to guarantee the validity of knowledge (Merton, 1968 ; Osborne, 2014 ). In the context of CRT research, the term critical involves normative ways to ensure the quality of good thinking, such as open-minded abilities and a disposition for relentless scrutiny of ideas, criteria for evaluating the goodness of thinking, adherence to the norms, standards of excellence, and avoidance of errors and fallacies (traits of poor thinking). These obviously also apply to scientific knowledge through peer evaluation practice, which represents a superlative form of good normative thinking (Bailin, 2002 ; Paul & Elder, 2008 ).

Another important feature of the convergence of CRT and science is the broad set of common skills sharing the same semantic content in both fields, despite that their names may seem different. Induction, deduction, abduction, and, in general, all kinds of argumentation skills, as well as problem-solving and decision-making, exemplify key tools of scientific practice to validate and defend ideas and develop controversies, discussions, and debates. Concurrently, they, too, are CRT skills (Sprod, 2014 ; Vieira et al., 2011 ; Yacoubian & Kishfe, 2018 ). In addition, Santos’ ( 2017 ) review suggests the following tentative list of skills: observation, exploration, research, problem-solving, decision-making, information-gathering, critical questions, reliable knowledge-building, evaluation, rigorous checks, acceptance and rejection of hypotheses, clarification of meanings, and true conclusions. Beyond skill names and focusing on their semantic content, (Manassero-Mas & Vázquez-Alonso, 2020a ) developed a deeper analysis of the skills usually attributed to scientific thinking and critical thinking, concluding that their constituent skills are deeply intertwined and much more coincident than different. This suggests that scientific and critical thinking may be considered equivalent concepts across the many shared skills they put into practice. However, equivalence does not mean identity, as important differences may still exist. For instance, the evaluation and judgment of ideas involved in organized scientific skepticism (i.e., peer evaluation) are much more demanding and deeper in scientific practice than in daily life thinking realms.

In sum, research on the CRT and NOS constructs is plural, as they draw from two different fields and traditions, general education and cognitive psychology, and science education, respectively. However, CRT and NOS share many skills, processes, and thinking strategies, as they both pursue the same general goal, namely, to establish the true value of knowledge claims. These shared features provide further reasons to investigate the possible relationships between NOS and CRT skills.

Research involving nature of science and thinking skills

The research involving both constructs is heterogeneous, as the operationalisations and methods are quite varied, given the pluralized nature of NOS and thinking. For example, Yang and Tsai ( 2012 ) reviewed 37 empirical studies on the relationship between personal epistemologies and science learning, concluding that research was heterogeneous along different NOS orientations: applications of Kuhn’s ( 2012 ) evolutionary epistemic categories, use of general epistemic knowledge categories, studies on epistemological beliefs about science (empiricism, tentativeness, etc.), and applications of other epistemic frameworks. The studies dealing with the epistemological beliefs about science were a minority. Another example of heterogeneity comes from Koray and Köksal’s ( 2009 ) study about the effect of laboratory instruction versus traditional teaching on creativity and logical thinking in prospective primary school teachers, where the laboratory group showed a significant effect in comparison to the traditional group. However, the NOS contents involved in laboratory instruction are still unclear. Dowd et al. ( 2018 ) examined the relationship between written scientific reasoning and eight specific CRT skills, finding that only three aspects of reasoning were significantly related to one skill (inference) and negatively to argument.

A series of studies suggest implicit relationships between NOS and thinking skills. Yang and Tsai ( 2010 ) interviewed sixth-graders to examine two uncertain science-related issues, finding that children who developed more complex (multiplistic) NOS knowledge displayed better reflective thinking and coordination of theory and evidence. Dogan et al. ( 2020 ) compared the impact of two epistemic-based methodologies (problem-based and history of science) on the creativity skills of prospective primary school teachers, finding that the problem-solving approach was more effective in increasing students’ creative thinking. Khishfe ( 2012 ) and Khishfe et al. ( 2017 ) found no differences in decision-making and argumentation in socio-scientific issues regarding NOS knowledge, but more participants in the treatment groups referred their post-decision-making factors to NOS than the other groups. Other studies found relationships between NOS understanding and variables that do not match CRT skills precisely. For instance, Bogdan ( 2020 ) found that inference and tentativeness relate to attitudes toward the role of science in social progress, but creativity does not, and the same applies to the acceptance of the evolution theory (Cofré et al., 2017 ; Sinatra et al., 2003 ).

Another set of studies comes from science education research on argumentation, which is based on the rationale that argumentation is a key scientific skill for validating knowledge in scientific practice. Thus, reasoning skills should be related to NOS understanding. Students who viewed science as dynamic and changeable were likely to develop more complex arguments (Stathopoulou & Vosnidou, 2007 ). In a floatation experience, Zeineddin and Abd-El-Khalick ( 2010 ) found that the stronger the epistemic commitments, the greater the quality of the scientific reasoning produced by the individuals. Accordingly, the term epistemic cognition of scientific argumentation has been coined, although specific research on argumentation and epistemic cognition is still relatively scarce (He et al., 2020 ).

Weinstock’s ( 2006 ) review suggested that people’s argumentation skills develop in proportion to their epistemic development, which Noroozi ( 2016 ) also confirmed. Further, Mason and Scirica ( 2006 ) studied the contribution of general epistemological comprehension to argumentation skills in two readings, finding that participants at the highest level of epistemic comprehension (evaluative) generated better quality arguments than participants at the previous multiplistic stage (Kuhn, 2012 ). In addition, the review of Rapanta et al. ( 2013 ) on argumentative competence proposed a three-dimensional hierarchical framework, where the highest level is epistemological (the ability to evaluate the relevance, sufficiency, and acceptability of arguments). Again, Henderson et al. ( 2018 ) discussed the key challenges of argumentation research and pointed to students’ shifting epistemologies about what might count as a claim or evidence or what might make an argument persuasive or convincing, as well as developing valid and reliable assessments of argumentation. On the contrary, Yang et al. ( 2019 ) found no significant associations between general epistemic knowledge and the performance of scientific reasoning in a controversial case with undergraduates.

From science education, González‐Howard and McNeill ( 2020 ) analysed middle-school classroom interactions in critique argumentation when an epistemic agency is incorporated, indicating that the development of students’ epistemic agency shows multiple and conflating approaches to address the tensions inherent to critiquing practices and to fostering equitable learning environments. This idea is further developed in the special section on epistemic tools of Science Education (2020), which highlights the continual need to accommodate and adapt the epistemic tools and agencies of scientific practices within classrooms while taking into account teaching, engineering, sustainability, equity and justice (González‐Howard & McNeill, 2020 ; Settlage & Southerland, 2020 ).

Finally, some of the above-mentioned research used a noteworthy concept of epistemic knowledge (EK) as “knowledge about knowledge and knowing” (Hofer & Pintrich, 1997 ), which has been developed in mainstream general education research and involves some meta-cognitions about human knowledge that research has largely connected to general learning and CRT skills (Greene et al., 2016 ). Obviously, EK and NOS knowledge share many common aspects (epistemic), suggesting a considerable overlap between them. However, it is noteworthy that NOS research is oriented toward CRT skills impacting NOS learning, while EK research orientates toward EK impacting CRT skills and general learning.

Regarding the Likert formats for research tools, test makers are concerned about the control of response biases that cause a lack of true reflection on the statement content and may damage the fidelity of data and correlations. Respondents’ tendency to agree with statements (acquiescence bias) is widespread. Further, neutrality bias and polarity bias reflect respondents’ propensity to choose fixed score points of the scale, either the midpoints (neutrality) or the extreme scores (polarity), either extreme high scores (positive bias) or extreme low scores (negative bias). To mitigate biases, experts recommend avoiding the exclusive use of positively worded statements within the instruments and combining positive and reversed items. This recommendation has been implemented here using three categories for NOS phrases that operationalize positive, intermediate and reversed statements (Vázquezr et al., 2006 ; Kreitchmann et al., 2019 ; Suárez-Alvarez et al., 2018 ; Vergara & Balluerka, 2000 ). However, the use of varied styles for phrases harms the instrument’s reliability and validity, and reliability is underestimated (Suárez-Alvarez et al., 2018 ).

All in all, the theoretical framework is twofold: CRT and NOS research. The above-mentioned research shares the hypothesis that the relationship between NOS and CRT skills matters. However, it displays a broad heterogeneity of research methods, variables, instruments and mixed results on the NOS-CRT relationship that do not allow a common methodological standpoint. Further, mainstream research focuses on college students and argumentation skills. In this regard, this study aims to empirically research the NOS-CRT relationship by applying standardized assessment tools for both constructs. This promotes comparability among researchers and provides quick diagnostic tools for teachers. Secondly, this study addresses younger students, which involves the creation of NOS and CRT tools adapted to young participants, for which some test validity and reliability data are provided. The research questions within this framework are: Do NOS knowledge and CRT skills correlate? What are the traits and limits conditions of this relationship, if any?

Materials and methods

The data gathering took place in Spain in the year 2018. At this time, the enacted school curriculum missed the international standards and specific curriculum proposals about CRT and NOS issues, so NOS issues could be implicitly related to some curricular contents about scientific research. Despite this lack of curricular emphasis, the principals of the participant Spanish schools expressed interest in diagnosing students’ thinking skills and NOS knowledge and agreed with the authors on the specific CRT and NOS-skills to be tested. As the Spanish school curriculum does not emphasize CRT and NOS issues, the students are expected to be equally trained, and this context conditioned the design of tentative tests through simple contents and an open-ended format, as they are cheap and easy to administer and interpret.

Participants

The participant schools (17) included some public (4) and state-funded private schools (13) that spread across mixed socio-cultural contexts and large, medium, and small Spanish townships. The participant students were tested in their natural school classes (29) of the two target grades. The valid convenience samples are two cohorts of students, each representing students of 6 th grade of Primary Education (PE6) ( n  = 434; 54.8% girls and 45.2% boys; mean age 11.3 years) and 8th grade of Secondary Compulsory Education (SCE8) ( n  = 347; 48.5% girls and 51.5% boys; mean age 13.3 years). In Spain, 6 th grade is the last year of the primary stage (11–12-year-old students), and the 8 th grade is the second year of the lower secondary compulsory stage (13–14-year-old students).

Instruments

Two assessment tools were tailored by researchers (a CRT skill test and a NOS scenario) to operationalise CRT and NOS to empirically check their relationships. As the Spanish school curriculum lacks CRT standards, the specific thinking skills that represent the CRT construct were agreed upon between principals and researchers. The design of the tool to assess NOS knowledge took into account that NOS was not explicitly taught in Spanish schools. Both tools were designed to match the schools’ interests and the students’ developmental level; the latter particularly led to choosing a simple NOS issue (definition of science) to match the primary students’ capabilities better.

Thinking challenge tests

Two CRT thinking skill test were developed for the two participant cohorts (PE6 and SCE8). The design aligns with the tradition of most CRT standardised tests that concentrate assessment on a few selected thinking skills (i.e., Ennis & Millman, 2005 ; Halpern, 2010 ). The test for the 6th-graders (PE6) assesses five skills: prediction, comparison and contrast, classification, problem-solving and logical reasoning. The test for the 8th-graders (SCE8) assesses causal explanation, decision-making, parts-all relationships, sequence and logical reasoning.

As most CRT tests are designed for adults, many tests and item pools were reviewed to select suitable items for younger students. The selection criteria were the fit of the items’ cognitive demand with students’ age, the addressed skill and the motivational challenge for students. Moreover, items must be readable, understandable, adequate, and interesting for the participant students. Then, two 45-item and 38-item tests were agreed on and piloted. Their results are described elsewhere (Manassero-Mas & Vázquez-Alonso, 2020b ). The items were examined by the authors according to their reliability, correlation and factor analysis to eliminate unfair items. Again, the former criteria were used to add new items to conform the two new 35-item Thinking Challenge Tests (TCT) to assess the CRT skills of this study.

The items of the first two skills were drawn from the Cornell (Nicoma) test, which evaluates four CRT skills through the information provided by a fictional story about some explorers of the Nicoma planet and asks questions about the story. Some items from prediction and comparison skills were drawn for the 6th-grade TCT (PE6), and some items from causal explanation and decision-making skills were drawn for the 8th-grade TCT (SCE8). The two TCT include three additional items on logical reasoning that were selected from the 78-item Class-Reasoning Cornell Test (Ennis & Millman, 2005 ). One item was also drawn from the 25-situation Halpern CRT test (Halpern, 2010 ) for the problem-solving skill of the PE6 test. The authors adapted the remaining figurative items (Table 2 ) to enhance students’ challenge, understanding, and motivation and make the TCT free of school knowledge (Appendix).

Overall, the TCT items pose authentic culture-free challenges, as their contents and cognitive demands are not related to or anchored in any prior school curricular knowledge, especially language and mathematics. Therefore, the TCT are intended to assess culture-free thinking skills.

The item formats involve multiple-choice and Likert scales with appropriate ranges and rubrics that facilitate quick and objective scoring and the elaboration of increasing adjustment between items’ cognitive demand and their corresponding skill, thereby leading to further revision based on validity and reliability improvement. This format also allows setting standardised baselines for hypothesis-testing through comparisons of research, educational programs, and teaching methodologies.

Nature of science assessment

A scenario on science definitions is used to assess the participants’ NOS understanding because this simple issue may better fit the lack of explicit NOS teaching and the developmental stage of the young students, especially the youngest 6th-graders. The scenario provides nine phrases that convey an epistemic, plural and varied range of science definitions, and respondents rate their agreement-disagreement with the phrases on a 9-point Likert scale (1 =  strongly disagree , 9 =  strongly agree ) to allow better nuancing of their NOS beliefs and avoid psychometric objections to the scale intervals. The scenario is drawn from the “Views on Science-Technology-Society” (VOSTS) pool that Aikenhead and Ryan ( 1992 ) developed empirically by synthesizing many students’ interviews and open answers into some scenarios, written in simple, understandable, and non-technical language. They consider that VOSTS items have intrinsic validity due to their empirical development, as the scenario phrases come from students, not from researchers or a particular philosophy, thus avoiding the immaculate perception bias and ensuring students’ understanding. Lederman et al. ( 1998 ) also consider VOSTS a valid and reliable tool for investigating NOS conceptions. Manassero et al. ( 2003 ) adapted the scenarios into the Spanish language and contexts, and developed a multiple-rating assessment rubric, based on the phrase scaling achieved through expert judges’ consensus. The rubric assigns indices whose empirical reliability has been presented elsewhere (Vázquezr et al., 2006 ; Bennássar et al., 2010 ).

The students completed the two tests through digital devices led by their teachers within their natural school classroom groups during 2018–19. To enhance students’ effort and motivation, the applications were infused into curricular learning activities, where students were encouraged to ask about problems and difficulties. During applications students did not ask questions to teacher that may reflect some difficulty to understand the tests. The database was processed with SPSS 25 and Factor program (Baglin, 2014 ) for exploratory and confirmatory factor analysis through polychoric correlations and Robust Unweighted Least Squares (RULS) method that lessen conditions on the score distribution of variables. Effect size statistics use a cut-off point ( d  = 0.30) to discriminate relevant differences.

There was no time limit for students to complete the tests, and the applications took between 25 and 50 min. Correct answers score one point, incorrect answers zero points, and no random corrections were applied. The skill scores were computed by adding the scores of the items that belong to each skill, which are independent. The addition of the five skill scores makes up a test score (thinking total) that estimates students’ global CRT competence and is dependent on the skill scores (Table 2 ).

The different types of validity maintain a reciprocal influence and represent the various parts of a whole, so they are not mutually independent. The Thinking Challenge tests’ validity relies on the quality of the CRT pools and tests examined by the authors, their agreement to choose the items that best matched the criteria, and the reviewed pilot results (Manassero-Mas & Vázquez-Alonso, 2020b ). The Factor program computes several reliability statistics (Cronbach alpha, EAP, Omega, etc.).

Nature of science scenario

The nine phrases describe different science definitions, and students rated each one on a 1–9 agreement scale. According to the experts’ current views on NOS, a panel of qualified judges reached a 2/3-consensus to categorize each phrase within a 3-level scheme (Adequate, Plausible, Naive), which has been widely used in NOS assessment (Khishfe, 2012 ; Liang et al., 2008 ; Rubba et al., 1996 ). The scheme means the phrases express informed (Adequate), partially informed (Plausible), or uninformed (Naive) NOS knowledge (see Appendix). According to this scheme, an evaluation rubric transforms the students’ direct ratings (1–9) into an index [− 1 to + 1], which is proportionally higher when the person agrees with an Adequate phrase, partially agrees with a Plausible phrase, or disagrees with a Naive phrase. All the rubric indices balance positive and negative scores, which are symmetrical for Adequate and Naïve phrases, but plausible indices are somewhat loaded toward agreement, as higher agreement would be expected. The index unifies the NOS measurements to make them homogeneous (positive indices mean informed conceptions), invariant (measurement independent of scenario/phrase/category), and standardised (all measures within the same interval [− 1, + 1]). The index proportionally values the adjustment of students’ NOS knowledge to the current views of science: the higher (or lower) the index, the better (or worse) informed is their NOS knowledge (Vázquezr et al., 2006 ).

Three category variables (Adequate, Plausible, and Naïve) are computed by averaging their phrase indices, which are mutually independent. The average of the three category variables computes a global NOS index representing the student’s overall NOS knowledge (Global). The use of three categories aligns with test makers’ recommendations to avoid using only positively worded phrases in order to elude the acquiescence bias, which harms reliability and validity (Suárez-Alvarez et al., 2018 ).

The links between thinking skills and NOS are empirically explored through correlational methods and one-way ANOVA procedures of the variables of the Thinking Challenge test and science definitions.

The results include the descriptive statistics of the target variables, twelve thinking variables (five skills plus thinking total for each group) and four variables of the science definitions (adequate, plausible, naive, and global), the analysis of the correlations, a linear regression analysis among these variables, and a comparison of thinking skills between NOS categorical groups through a one-way ANOVA.

Descriptive statistics

Most mean thinking variables scores fell near the midpoint of the scale range. Four skills (classification, problem-solving, causal explanation and sequence) scored above the midpoints of their ranges, whereas two variables (logical reasoning and decision) scored slightly below their midpoints. Overall, these results indicate the medium difficulty of the tests for the students, neither easy nor difficult, which means the CRT tests can be acceptable to assess young students’ thinking skills (Table 3 ).

The EAP reliability indices of classification, problem-solving, sequence, parts (mainly figurative items) and thinking scales were excellent, good for the remaining scales, but poor for logical reasoning. Low reliability indicates a need for item revision and limited applicability (i.e., inappropriate for individual diagnosis), but is insufficient to reject the test in research purposes (U.S. Department of Labor, 1999 ). As test reliability critically depends on the number of items, increasing the length of logical reasoning over its three current items will improve its reliability.

The descriptive results for the direct scores of the NOS variables (Table 4 ) showed a biased pattern toward agreement (average phrases between 4.9 and 7.4), which suggests some acquiescence bias in spite of presenting varied phrases. The average indices obtained positive scores for the adequate category, slightly negative ones for the naïve category, and close-to-zero for the plausible phrases (the effect size of the differences concerning a zero score was low). The overall weighted average index for the whole sample (global variable) was close-to-zero and slightly positive, meaning that the students’ overall epistemic conception of science definition was not significantly informed. The overall average index of Adequate phrases obtained the highest positive score for both samples of students, which means that most students agreed with the Adequate phrases (expressing informed beliefs about science). In contrast, the Naïve overall average index obtained the lowest negative mean score, indicating that the students agreed instead of disagreeing with phrases expressing uninformed views about science. The Plausible variable (phrases expressing partially informed beliefs, neither adequate nor naive) obtained a close-to-zero average score, meaning that the students’ beliefs about these variables were far from informed. Overall, the students presented slightly informed views on Adequate phrases, close-to-zero average indices scores (not informed views) for Plausible phrases and slightly uninformed views on Naive statements.

Polychoric correlations among NOS direct scores computed through Factor attained good scores on all NOS items, indicating a unidimensional structure (but Phrase I). The exploratory factor analysis (EFA) applied to phrase scores displayed a dominant eigenvalue, whose general factor had acceptable loadings for all phrases (only phrase I had low loading). The unidimensional model obtained fair statistics in the confirmatory factor analysis. These results suggest one general factor underlying students’ scores and justify a global score representing the variance of all the NOS phrases. The expected a posteriori (EAP) reliability scores for the entire NOS scale were good (Table 4 ).

The comparison of NOS scores between primary and secondary grades highlights that the four NOS variable scores on science definitions were significantly equal for both cohorts of students, despite the two years separation. So, the educational impact of the two-year period on NOS seems almost null, given the close-to-zero differences in science definitions. This result could be expected, as NOS is not explicitly planned in Spanish science curricula and is not usually taught in the classroom.

Both cohorts answered the same anchoring CRT item (see Appendix), whose correct answer rate (27% primary; 33% secondary) suggests a slight improvement in CRT skills that sharply contrasts with the former NOS comparison. Summing up, despite that CRT and NOS have not been taught to Spanish students, developmental learning may increase CRT skills but not improve NOS knowledge. This reinforces the claim for explicit and reflective teaching of NOS, as implicit developmental maturation alone seems ineffective.

Correlations between nature of science and thinking skills

The empirical analysis of the hypothesised relationships between thinking skills and NOS epistemic variables (Adequate, Plausible, Naive) was performed through correlational methods (Pearson’s bivariate correlation coefficients and linear regression analysis) and one-way analysis of variance.

The Pearson correlation coefficients revealed a pattern of the relationships between NOS and thinking skills (Table 5 ): all thinking skills positively correlated with the Adequate variable, and most were significant, except for prediction and logical reasoning in EP6, which were non-significant. However, the correlations with the Naive and Plausible variables were overall non-significant. However, there were some exceptions: first, the Plausible/problem-solving correlation in EP6 was significant (and negative); second, the correlations between Naïve and logical reasoning (positive in EP6) and also between decision-making, logical reasoning and the thinking total score (negative in SCE8) were significant.

Thus, the noteworthy pattern for the NOS-CRT relationship showed that the Adequate variable positively correlated with all the thinking variables and was mostly statistically significant (83%); the highest positive correlations corresponded to problem-solving (EP6), sequence and parts-all (ES8), and the thinking total skills for both groups ( p  < 0.01). This pattern means that students with higher (lower) thinking skill scores expressed higher (lower) agreement with Adequate phrases.

The correlation pattern between thinking skills and the Plausible and Naive variables was mainly non-significant (75%). Only two correlations were significant in the EP6 group; the Plausible-problem-solving correlation was negative (higher scorers on problem-solving did not recognize the intermediate value of Plausible science definitions), whereas the Naïve-logical reasoning correlation was positive (higher scorers on logical reasoning tended to disagree with Naive science definitions). Three Naïve correlations were significant and negative in the secondary group (SCE8): parts-all, logical reasoning skills and thinking total.

Overall, the positive and significant correlation pattern of the Adequate variable was stronger than the mainly non-significant and somewhat negative Naive and Plausible correlation pattern.

Linear regression analysis between nature of science and thinking skills

Regression analysis (RA) compares the power of a set of variables to predict a dependent variable and the common variance. Two linear regression analyses were carried out to test the mutual contribution of the CRT and NOS variables. The first RA uses the NOS variables (Adequate, Plausible, Naive and Global) as the dependent variables, and the five independent thinking skills as predictors (Table 6 ). The second RA (Table 7 ) reversed the roles of the variables, thus establishing the thinking skills as the dependent variables and the three independent NOS variables (Adequate, Plausible and Naive) as the predictors. Collinearity tests were negative for all RAs through tolerance, variance inflation factor and condition index statistics.

The first RA (Table 6 ) showed that the NOS Adequate variable achieved the highest proportion of common variance with thinking skill predictors at both educational levels (4.2% in PE6 and 9.2% in SCE8), whereas the other two NOS variables achieved much lower levels of explained variance. In PE6, the most significant predictor skill of NOS was problem-solving, whereas the other predictor skills did not reach statistical significance in any case. In SCE8, the most significant predictors were three skills (sequencing, reasoning, and parts-all), whereas the remaining skills did not reach statistical significance (the predictors of the Plausible variable were negative).

The second RA (Table 7 ) showed that the Adequate variable achieved the greatest predictive power, as most thinking skills displayed statistically significant standardised beta coefficients at the two educational levels, while Plausible and Naïve variables had a much lower predictive power, and Plausible standardised coefficients were non-significant for any skill predictor. The common variance displayed a similar amount to the first analysis; the thinking total variable displayed the largest variance at both educational levels (4.8% PE6; 9.6% SCE8), and the problem-solving skills at PE6 (5.3%) and parts-all at SCE8 (7.1%).

In summary, the Adequate variable and the classification and problem-solving skills (PE6) and sequencing and parts-all skills (SCE8) were the variables that presented the largest standardised coefficients and statistical significance regarding the research question raised in this study about the positive relationship between NOS and thinking skills.

Analysis of variance between nature of science and thinking skills

Further exploration of the NOS-skills relationship was conducted through one-way between-groups analysis of variance. According to performance on the Adequate, Plausible and Naive variables, the participants were allocated to four percentile groups (low group: 0–25%; medium–low: 25–50%; medium–high: 50–75%; high: 75–100%), which made up the independent variable of the ANOVA for testing the differences in thinking skills (dependent variable) among these four groups.

The Adequate groups yielded a statistically significant main effect for the thinking total in primary [ F (3, 429) = 7.745, p  = 0.000] and secondary education [ F (3, 343) = 2.607, p  = 0.052]. The effect size of the differences in the thinking total scores between the high and low groups was large for the primary ( d  = 0.69) and secondary ( d  = 0.86) cohorts. Furthermore, comparison, classification, and problem-solving skills also replicated this pattern of large differences between high-low groups that supports the NOS/CRT positive relationship. However, prediction ( p  = 0.069) and logical reasoning ( p  = 0.504) did not display differences among the Adequate groups.

Post-hoc comparisons (Scheffé test) showed that the low group achieved significantly lower scores than the other three Adequate groups. The Adequate low group scores on thinking total, comparison, classification, and problem-solving skills were significantly lower than the scores of the other three groups, whereas the differences among the Adequate groups on prediction and logical reasoning scores were non-significant.

The main effect of the Plausible groups on the thinking total variable did not reach statistical significance for the primary F (3, 430) = 1.805, p = 0.145] and secondary groups [ F (3, 343) = 2.607, p  = 0.052]. The effect size was small ( d  = − 0.31 primary; d  = − 0.32 secondary) and negative (the thinking total mean score of the low group was higher than that of the high group). Post-hoc comparisons (Scheffé test) confirmed the trend, as they did not yield significant differences among the Plausible groups, although the mean score of the Plausible high group was lower than the other three groups. Exceptionally, problem-solving skill (primary) displayed a statistically significant difference between the Plausible high group (the lowest mean score) and the remaining three groups.

The main effect of Naive groups on the thinking total variable did not reach statistical significance [ F (3, 430) = 1.075, p  = 0.367 primary; F (3, 343) = 1.642, p  = 0.179 secondary] and the effect size of the differences was small ( d  = 0.32 primary; d  = − 0.31 secondary). The opposite direction of the differences in primary (positive) and secondary education (negative) is noteworthy, as it means that the highest mean score corresponded to the Naive high group in primary (positive) or the Naive low group in secondary (negative). Post-hoc comparisons (Scheffé test) showed that there were no significant differences among the Naive groups. However, the league table of groups across the Naive groups revealed differences between primary and secondary cohorts. Overall, the primary Naive groups followed the pattern of the Adequate variable (the low group displayed the lowest score), whereas the secondary Naive groups followed the pattern of the Plausible variable (the high group tended to display the lowest score).

The empirical findings of this study quantify through correlations some significant and positive relationships between thinking skills and NOS beliefs about science definitions, as the main answer to the research question. However, the analysis shows a complex pattern of the relationship, which depends on the kind of the NOS variable under consideration: the NOS Adequate variable, which represents phrases expressing informed views on science, is positively and significantly related to most thinking skills, whereas the uninformed Naive and intermediate Plausible variables show a lower predictive power of thinking skills. Summing up, the positive significant CRT-NOS relationship is not displayed by all NOS variables, as it is limited to those NOS variables that express an Adequate view of science, while the other NOS variables do not significantly correlate with CRT skills.

The implications of this study for research are twofold. On the one hand, the variables of this study specifically operationalise the two constructs under investigation, namely, CRT skills and NOS knowledge, which has been a challenge throughout their mixed operationalisation in the reviewed research. On the other hand, via Pearson correlations and regression analysis, this study quantifies the amount of the common variance between specific CRT skills and specific NOS knowledge, which is significant in many cases. Both contributions improve the features of previous studies, as most of them investigated the relationship from varied methodological frameworks: some reported group comparison, fewer analysed correlations, and most of the latter used a diversity of variables, which often did not match either CRT skills or NOS variables. For instance, Vieira et al. ( 2011 ) correlated thinking skills with science literacy (not NOS) and reported Pearson correlations that were lower than the correlations obtained herein, even though they used a smaller sample, which favours higher correlations.

The findings reveal the complexity of the NOS-CRT relationship, which limits the positive and relevant relationship to the NOS Adequate variables about science definitions, but not to the Plausible or Naive conceptualizations, which mainly display non-significant and somewhat negative correlations. The positive relationship between thinking and Adequate science definitions is a remarkable finding, which empirically supports the hypothesis that better thinking skills involve better NOS knowledge and confirms the concomitant intuitions and claims of some studies about the importance of thinking skills for learning NOS epistemic topics (Erduran & Kaya, 2018 ; Ford & Yore, 2014 ; Simonneaux, 2014 ; Torres & Solbes, 2016 ; Yacoubian, 2015 ). The findings also contribute to establishing the limit of the significant relationship, which applies when the NOS is conveyed by informed statements (Adequate phrases) and does not apply for non-adequate NOS statements, which are a minority in the face of most NOS literature, which conveys informed statements on NOS (Cofré et al., 2019 ).

The implications of the collateral finding on the lack of differences in science definitions between primary and secondary cohorts deserve further comments. Obviously, the finding confirms that two educational years have a scarce impact on improving Spanish students’ understanding of science definitions; that is, NOS teaching seems ineffective and stagnated, probably due to poor curriculum development and the lack of teacher training and educational resources. Besides, the students’ higher performance on adequate phrases than on plausible and naïve phrases also suggests that Spanish students may achieve some mild knowledge about the informed traits of science because they are implicitly displayed in teaching, textbooks and media. However, plausible and naïve knowledge is not usually available from those sources, as it requires explicit and reflective teaching, which Spanish students usually lack. Both findings suggest the need for further attention to misinformed NOS knowledge to invigorate explicit and reflective NOS teaching (Cofré et al., 2019 ; McDonald & McRobbie, 2012 ).

The unexpected non-significant/negative relationships between thinking and Plausible and Naive variables may need some elaboration due to the complexity of students’ NOS conceptions. For instance, Bennássar et al. ( 2010 ) described the students’ inconsistent agreements when rating opposite statements. Bogdan ( 2020 ) found that epistemic conceptions of science creativity did not relate to attitudes to science, and Khishfe ( 2012 ) reported complex relationships between epistemic aspects of science and decision-making about genetically modified organisms or the acceptance of the evolution theory (Cofré et al., 2017 ; Sinatra, et al., 2003 ). Thus, a tentative interpretation of those paradoxical relationships is elaborated.

Higher-thinking-skill students might develop better quality reflections that elicit more confident and higher scores on NOS phrases than lower-thinking-skill students. The latter tend toward less confident and low-quality reflection, which may elicit intermediate, less polarized scores. On average, this differential pattern explains the complex pattern of relationships between CRT and NOS variables. For the Adequate phrases (where the rubric assigns the best indices to the highest scores), higher-thinking students will achieve higher NOS indices than lower-thinking students, explaining the observed positive CRT-NOS correlations in the Adequate variables and the ANOVA results. On the other hand, when Naive and, especially, Plausible phrases are involved (which obtain their highest indices at low and intermediate scores, respectively), the differential response pattern would lead the lower-thinking students to achieve higher NOS indices than the higher-thinking students, thus shifting to the observed non-significant or negative correlations for Naive and Plausible phrases. In short, unconfident/confident and lower/higher quality reflection on NOS knowledge of the lower-/higher-thinking students would explain the shift from the positive and significant relationship of CRT-Adequate phrases to the non-significant correlations of Plausible and Naive phrases. This interpretation agrees with the striking finding of O’Brien et al. ( 2021 ) about a similar unexpected higher adherence to pseudoscientific claims in students with higher trust in science, which the authors attributed to the acritical acceptation of any scientific contents. Similarly, mastery of CRT skills is a desirable learning outcome, but it may make master students vulnerable to positive polarization in science definitions. However, further research is needed to confirm the non-significant correlations and the interpretation of the differential response pattern.

As the previous reference suggests, the findings about the complex CRT-NOS relationship connect with some pending controversies about NOS teaching, namely, the marginalized attention paid to misinformed ideas or myths about science, in favour of the informed ideas, which reveal implicit and non-reflective NOS teaching, as obviously misinformed ideas contribute to triggering more reflection than informed ideas (Acevedo et al., 2007 ; McComas, 1996 ). The effect of this under-exposure is students’ under-training about misinformed NOS ideas, which may act as obstacles to authentic NOS epistemic learning, explaining the differences presented herein. The remedy to this situation and the unconfident bias may lie in devoting more time and explicit attention to uninformed or incomplete NOS claims through reflective teaching.

This study is determined and limited by the contextual conditions of its correlational methodology. First, the research question implied measurements of thinking skills and NOS knowledge; second, the young participants (12–14-year-olds) required measurement tools appropriate to this age; third, the thinking skill tests had to match the thinking skills demanded by the participant school; fourth, the selected NOS tool was conditioned by the students’ age and the lack of appropriate NOS assessment tools. Thus, further suggestions to overcome these limitations are focused on expanding empirical support for the NOS-CRT relationship. On the one hand, some new NOS issues, such as additional epistemological and social aspects of science, should be explored to extend the representativeness of NOS knowledge. Similar reflections apply to including new skills to expand the scope of the CRT tool. Furthermore, the number of items of the logical reasoning scale should be increased to improve its reliability. Overall, the perennial debate between open-ended and closed formats is also noteworthy for future research, as quantitative methods could be complemented with qualitative methods (such as students’ interviews and the like).

Finally, the main educational implication of this study is that students may need to master some competence in CRT skills to learn NOS knowledge or general epistemic knowledge. Conversely, mastery of CRT skills may foster learning NOS knowledge. Although this study focuses on epistemic NOS knowledge drawn from science education, educational research has parallelly elaborated the epistemic knowledge (EK) construct for general education (Hofer & Pintrich, 1997 ), which opens further prospective research developments for NOS comprehension and CRT skills. On the one hand, the study of the NOS-EK relationship may shed light on convergent epistemic teaching and learning, both in science and in general education. On the other hand, the importance of CRT skills for NOS, and vice versa, may help coordinate teaching NOS-EK issues (Erduran & Kaya, 2018 ; Ford & Yore, 2014 ; McDonald & McRobbie, 2012 ; Simonneaux, 2014 ). This joint prospective of NOS-EK elaboration may also provide new answers to two aspects: the mutual connections between CRT skills and NOS-EK issues and the EK assessment tools that may also contribute to advancing the evaluation of CRT skills and NOS.

Data availability

The Spanish State Research Agency and the University of the Balearic Islands hold the property of all data and materials of this study, which may be available under reasonable request to them.

Code availability

Not applicable.

Acevedo JA, Vázquez A, Manassero MA, Acevedo P (2007) Consensus on the nature of science: epistemological aspects. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias 4:202–225. http://www.apac-eureka.org/revista/Larevista.htm

Aikenhead GS, Ryan AG (1992) The development of a new instrument: “Views on Science-Technology-Society” (VOSTS). Sci Educ 76:477–491

Article   Google Scholar  

Allchin D, Zemplén GÁ (2020) Finding the place of argumentation in science education: Epistemics and whole science. Sci Educ 104(5):907–933. https://doi.org/10.1002/sce.21589

American Psychological Association (1990) Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. Executive Summary “The Delphi Report”. www.insightassessment.com/dex.html

Baglin J (2014) Improving your exploratory factor analysis for ordinal data: a demonstration using factor. Pract Assess Res Eval 19(5):2

Google Scholar  

Bailin S (2002) Critical thinking and science education. Sci Educ 11:361–375

Bennássar A, Vázquez A, Manassero MA, García-Carmona A (Coor.). (2010) Ciencia, tecnología y sociedad en Iberoamérica [Science, technology society in Latin America]. Organización de Estados Iberoamericanos. http://www.oei.es/salactsi/DOCUMENTO5vf.pdf

Bogdan R (2020) Understanding of epistemic aspects of NOS and appreciation of its social dimension. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias, 17, Article 2303. https://doi.org/10.25267/Rev_Eureka_ensen_divulg_cienc.2020.v17.i2.2303

Cofré H, Cuevas E, Becerra B (2017) The relationship between biology teachers’ understanding of the NOS and the understanding and acceptance of the theory of evolution. Int J Sci Educ 39:2243–2260. https://doi.org/10.1080/09500693.2017.1373410

Cofré H, Nuñez P, Santibáñez D, Pavez JM, Valencia M, Vergara C (2019) A critical review of students’ and teachers’ understandings of NOS. Sci Educ 28:205–248. https://doi.org/10.1007/s11191-019-00051-3

Deng F, Chen D-T, Tsai C-C, Chai C-S (2011) Students’ views of the NOS: a critical review of research. Sci Educ 95:961–999

Dogan N, Manassero MA, Vázquez A (2020) Creative thinking in prospective science teachers: effects of problem and history of science based learning, 48. https://doi.org/10.17227/ted.num48-10926

Dowd JE, Thompson RJ Jr, Schiff LA, Reynolds JA (2018) Understanding the complex relationship between critical thinking and science reasoning among undergraduate thesis writers. CBE Life Sci Educ. https://doi.org/10.1187/cbe.17-03-0052

Ennis RH (1996) Critical thinking. Prentice, Hoboken

Ennis RH, Millman J (2005) Cornell Critical Thinking Test Level X. The Critical Thinking Company.

Ennis, R. H. (2019). Long definition of critical thinking . http://criticalthinking.net/definition/long-definition-of-critical-thinking/

Erduran S, y Dagher, Z. R. (eds) (2014) Reconceptualizing the Nature of Science for Science Education. Scientific Knowledge, Practices and Other Family Categories. Springer, Dordrecht

Erduran S, Kaya E (2018) Drawing nature of science in pre-service science teacher education: epistemic insight through visual representations. Res Sci Educ 48(6):1133–1149. https://doi.org/10.1007/s11165-018-9773-0

European Union (2014). Key competence development in school education in Europe. KeyCoNet’s review of the literature: A summary . http://keyconet.eun.org

Facione PA, Facione RN, Blohm SW, Howard K, Giancarlo CAF (1998) California Critical Thinking Skills Test: Manual (Revised). California Academic Press, California

Fisher A (2009) Critical thinking An introduction. Cambridge University Press, Cambridge

Fisher A (2021) What critical thinking is. In: Blair JA (ed) Studies in critical thinking, 2nd edn. University of Windsor, Canada, pp 7–26

Ford CL, Yore LD (2014) Toward convergence of critical thinking, metacognition, and reflection: Illustrations from natural and social sciences, teacher education, and classroom practice. In: Zohar A, Dori YJ (eds) Metacognition in science education. Springer, Berlin, pp 251–271

García-Mila M, Andersen C (2008) Cognitive foundations of learning argumentation. In: Erduran S, Jiménez-Aleixandre MP (eds) Argumentation in science education: perspectives from classroom-based research. Springer, Berlin, pp 29–45

García-Carmona A, Vázquez A, Manassero MA (2011) Current status and perspectives on teaching the nature of science: a review of teachers’ beliefs obstacles. Enseñanza de las Ciencias 28:403–412

González-Howard M, McNeill KL (2020) Acting with epistemic agency: characterizing student critique during argumentation discussions. Sci Educ 104:953–982

Greene JA, Sandoval WA, Bråten I (2016) Handbook of epistemic cognition. Routledge, London

Book   Google Scholar  

Halpern DF (2010) Halpern Critical Thinking Assessment. Schuhfried, Modling

He X, Deng Y, Saisai Y, Wang H (2020) The influence of context on the large-scale assessment of high school students’ epistemic cognition of scientific argumentation. Sci Educ 29:7–41. https://doi.org/10.1007/s11191-019-00088-4

Henderson JB, McNeill KL, Gonzalez-Howard M, Close K, Evans M (2018) Key challenges and future directions for educational research on scientific argumentation. J Res Sci Teach 55(1):5–18. https://doi.org/10.1002/tea.21412

Hofer BK, Pintrich PR (1997) The development of epistemological theories: beliefs about knowledge and knowing and their relation to learning. Rev Educ Res 67:88–140. https://doi.org/10.3102/00346543067001088

Khishfe R (2012) Nature of science and decision-making. Int J Sci Educ 34:67–100. https://doi.org/10.1080/09500693.2011.559490

Khishfe R, Alshaya FS, BouJaoude S, Mansour N, Alrudiyan KI (2017) Students’ understandings of nature of science and their arguments in the context of four socio-scientific issues. Int J Sci Educ 39:299–334

Kolstø SD (2001) Scientific literacy for citizenship: Tools for dealing with the science dimension of controversial socio-scientific issues. Sci Educ 85:291–310

Koray Ö, Köksal MS (2009) The effect of creative and critical thinking based laboratory applications on creative logical thinking abilities of prospective teachers. Asia-Pacific Forum Sci Learn Teach 10, Article 2. https://www.eduhk.hk/apfslt/download/v10_issue1_files/koksal.pdf

Kreitchmann RS, Abad FJ, Ponsoda V, Nieto MD, Morillo D (2019) Controlling for response biases in self-report scales: Forced-choice vs psychometric modeling of Likert items. Front Psychol. https://doi.org/10.3389/fpsyg.2019.02309

Kuhn D (2012) Enseñar a pensar [Education for thinking]. Amorrortu, Argentina

Lederman NG (2007) Nature of science: past, present, and future. In: Abell SK, Lederman NG (eds) Handbook of research on science education. Lawrence Erlbaum Associates, USA, pp 831–879

Lederman NG, Wade PD, Bell RL (1998) Assessing understanding of the NOS: A historical perspective. In: McComas WF (ed) The NOS in science education: Rationales and strategies. Kluwer, Netherland, pp 331–350

Liang LL, Chen S, Chen X, Kaya ON, Adams AD, Macklin M, Ebenezer J (2008) Assessing preservice elementary teachers’ views on the nature of scientific knowledge: a dual-response instrument. Asia- Pacific Forum Sci Learn Teach 9(1). http://www.ied.edu.hk/apfslt/v9_issue1/liang/index.htm

Madison, J. (2004). James Madison Critical Thinking Course . The Critical Thinking Co. https://www.criticalthinking.com/james-madison-critical-thinking-course.html

Manassero MA, Vázquez A, Acevedo JA (2003) Cuestionario de opiniones sobre ciencia, tecnologia y sociedad (COCTS) [Questionnaire of opinions on science, technology and society]. Educational Testing Service. https://store.ets.org/store/ets/en_US/pd/ThemeID.12805600/productID.39407800

Manassero-Mas MA, Vázquez-Alonso A (2019) Conceptualization and taxonomy to structure knowledge about science. Revista Eureka sobre Enseñanza y Divulgación de las Ciencias 16(3):3104. http://www.10.25267/Rev_Eureka_ensen_divulg_cienc.2019.v16.i3.3104

Manassero-Mas M, Vázquez-Alonso Á (2020a) Scientific thinking and critical thinking: transversal competences for learning. Indag Didact 12(4):401–420. https://doi.org/10.34624/id.v12i4.21808

Manassero-Mas MA, Vásquez-Alonso Á (2020b) Assessment of critical thinking skills: validation of free-culture tools. Tecné, Epistemé y Didaxis, 47:15–32. https://doi.org/10.17227/ted.num47-9801

Mason L, Scirica F (2006) Prediction of students’ argumentation skills about controversial topics by epistemological understanding. Learn Instr 16:492–509. https://doi.org/10.1016/j.learninstruc.2006.09.007

Matthews MR (2012) Changing the focus: From nature of science (NOS) to features of science (FOS). In: Khine MS (ed) Advances in nature of science research Concepts and methodologies. Springer, Berlin, pp 3–26

Chapter   Google Scholar  

McComas WF (1996) Ten myths of science: reexamining what we think we know about the NOS. Sch Sci Math 96:10–16

McDonald CV, McRobbie CJ (2012) Utilising argumentation to teach NOS. In: Fraser BJ, Tobin KG, McRobbie CJ (eds) Second international handbook of science education. Springer, Berlin, pp 969–986

Merton RK (1968) Social theory and social structure. Simon and Schuster, Newyork

National Research Council (2012) Education for life and work: Developing transferable knowledge and skills in the 21st century. The National Academies Press, USA

Noroozi O (2016) Considering students’ epistemic beliefs to facilitate their argumentative discourse and attitudinal change with a digital dialogue game. Innov Educ Teach Int 55(3):357–365. https://doi.org/10.1080/14703297.2016.1208112

O’Brien TC, Palmer R, Albarracin D (2021) Misplaced trust: When trust in science fosters belief in pseudoscience and the benefits of critical evaluation. J Exp Soc Psychol 96:104184. https://doi.org/10.1016/J.JESP.2021.104184

Olson JK (2018) The inclusion of the NOS in nine recent international science education standards documents. Sci Educ 27:637–660. https://doi.org/10.1007/s11191-018-9993-8

Osborne J (2014) Teaching critical thinking? new directions in science education. Sch Sci Rev 95:53–62

Paul R, Elder L (2008) The miniature guide to critical thinking: concepts and tools (5th ed.). Foundation for Critical Thinking Press

Rapanta C, Garcia-Mila M, Gilabert S (2013) What is meant by argumentative competence? an integrative review of methods of analysis and assessment in education. Rev Educ Res 83:483–520

Rubba PA, Schoneweg CS, Harkness WL (1996) A new scoring procedure for the views on science-technology-society instrument. Int J Sci Educ 18(4):387–400. https://doi.org/10.1080/0950069960180401

Santos LF (2017) The role of critical thinking in science education. J Educ Pract 8:159–173

Settlage J, Southerland SA (2020) Epistemic tools for science classrooms: the continual need to accommodate and adapt. Sci Educ 103(4):1112–1119. https://doi.org/10.1002/sce.21510

Simonneaux L (2014) From promoting the techno-sciences to activism – A variety of objectives involved in the teaching of SSIS. In: Bencze L, Alsop S (eds) Activist science and technology education. Springer, Berlin, pp 99–112

Sinatra GM, Southerland SA, McConaughy F, Demastes JW (2003) Intentions and beliefs in students’ understanding and acceptance of biological evolution. J Res Sci Teach 40:510–528. https://doi.org/10.1002/tea.10087

Sprod T (2014) Philosophical Inquiry and Critical Thinking in Science Education. In: Matthews MR (ed) International Handbook of Research in History, Philosophy and Science Teaching. Springer, Berlin, pp 1531–1564

Stathopoulou C, Vosnidou S (2007) Conceptual change in physics and physics-related epistemological beliefs: A relationship under scrutiny. In: Vosnidou S, Baltas A, Vamvakoussi X (eds) Re-framing the problem of conceptual change in learning and instruction. Elsevier, Amsterdam, pp 145–163

Suárez-Alvarez J, Pedrosa I, Lozano LM, García-Cueto E, Cuesta M, Muñiz J (2018) Using reversed items in likert scales: a questionable practice. Psicothema 30:149–158. https://doi.org/10.7334/psicothema2018.33

Torres N, Solbes J (2016) Contributions of a teaching intervention using socio-scientific issues to develop critical thinking. Enseñanza De Las Ciencias 34:43–65. https://doi.org/10.5565/rev/ensciencias.1638

U.S. Department of Labor Employment and Training Administration (1999). Understanding test quality-concepts of reliability and validity . https://hr-guide.com/Testing_and_Assessment/Reliability_and_Validity.htm

Vázquez-Alonso Á, Manassero-Mas MA (2018) Beyond science understanding: science education to develop thinking. Revista Electrónica de Enseñanza de las Ciencias 17:309–336. http://www.saum.uvigo.es/reec

Vázquez A, Manassero MA, Acevedo JA (2006) An analysis of complex multiple-choice science-technology-society items: Methodological development and preliminary results. Sci Educ 90: 681–706

Vergara AI, Balluerka N (2000) Methodology in cross-cultural research: current perspectives. Psicothema 12:557–562

Vesterinen VM, Manassero-Mas MA, Vázquez-Alonso Á (2014) History, philosophy, and sociology of science and science-technology-society traditions in science education: continuities and discontinuities. In Matthews MR (ed) International Handbook of Research in History, Philosophy and Science Teaching (pp 1895–1925). Springer

Vieira RM, Tenreiro-Vieira C, Martins IP (2011) Critical thinking: Conceptual clarification and its importance in science education. Sci Educ Int 22:43–54

Watson G, Glaser EM (2002) Watson-Glaser Critical Thinking Appraisal-II Form E. Pearson, London

Weinstock MP (2006) Psychological research and the epistemological approach to argumentation. Informal Logic 26:103–120

Yacoubian HA (2015) A framework for guiding future citizens to think critically about NOS and socioscientific issues. Can J Sci Math Technol Educ 15:248–260

Yacoubian HA, Khishfe R (2018) Argumentation, critical thinking, NOS and socioscientific issues: a dialogue between two researchers. Int J Sci Educ 40:796–807

Yang FY, Tsai CC (2010) Reasoning on the science-related uncertain issues and epistemological perspectives among children. Instr Sci 38:325–354

Yang FY, Tsai CC (2012) Personal epistemology and science learning: A review of studies. In: Fraser BJ, Tobin KG, McRobbie CJ (eds) Second international handbook of science education. Springer, Berlin, pp 259–280

Yang F-Y, Bhagat KK, Cheng C-H (2019) Associations of epistemic beliefs in science and scientific reasoning in university students from Taiwan and India. Int J Sci Educ 41:1347–1365. https://doi.org/10.1080/09500693.2019.1606960

Zeidler DL, Walker KA, Ackett WA, Simmons ML (2002) Tangled up in views: beliefs in the NOS and responses to socioscientific dilemmas. Sci Educ 86:343–367

Zeineddin A, Abd-El-Khalick F (2010) Scientific reasoning and epistemological commitments: coordination of theory and evidence among college science students. J Res Sci Teach 47:1064–1093. https://doi.org/10.1002/tea.20368

Download references

Acknowledgments

Grant EDU2015-64642-R of the Spanish State Research Agency and the European Regional Development Fund, European Union.

Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature. This study is part of a research project funded by Grant No EDU2015-64642-R of the Spanish State Research Agency and the European Regional Development Fund, European Union.

Author information

Authors and affiliations.

Department of Psychology, University of the Balearic Islands, Palma, Spain

María Antonia Manassero-Mas

Centre for Postgraduate Studies, University of the Balearic Islands, Edificio Guillem Cifre de Colonya, Carretera de Valldemossa, Km. 7.5, 07122, Palma, Spain

Ángel Vázquez-Alonso

You can also search for this author in PubMed   Google Scholar

Contributions

Both authors declare their contribution to this study, their agreement with the content, their explicit consent to submit and that they obtained consent from the responsible authorities at the organization where the work has been carried out before the work was submitted. All authors contributed to the study conception and design, material preparation, data collection and analysis of the first draft of the manuscript, and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Ángel Vázquez-Alonso .

Ethics declarations

Conflict of interest.

The authors have no conflicts of interest or competing interests to declare regarding this article.

Ethical approval

This study was performed in accordance with the Declaration of Helsinki and the Ethics Committee of the University of the Balearic Islands approved the whole research project. Participants’informed consent was deemed not necessary because only the participants’ teachers developed the tasks involved in the study as ordinary learning classroom tasks, without any intervention of researchers. This manuscript is original, has not been published elsewhere and has not been submitted simultaneously to any other journal for consideration.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

Reprints and permissions

About this article

Manassero-Mas, M.A., Vázquez-Alonso, Á. An empirical analysis of the relationship between nature of science and critical thinking through science definitions and thinking skills. SN Soc Sci 2 , 270 (2022). https://doi.org/10.1007/s43545-022-00546-x

Download citation

Received : 11 December 2021

Accepted : 10 October 2022

Published : 08 December 2022

DOI : https://doi.org/10.1007/s43545-022-00546-x

Share this article

Anyone you share the following link with will be able to read this content:

Sorry, a shareable link is not currently available for this article.

Provided by the Springer Nature SharedIt content-sharing initiative

  • Nature of science
  • Critical thinking skills
  • Scientific literacy
  • Assessment of thinking skills
  • Epistemic assessment
  • Find a journal
  • Publish with us
  • Track your research

IMAGES

  1. why is Importance of Critical Thinking Skills in Education

    scientific definition for critical thinking

  2. How to promote Critical Thinking Skills

    scientific definition for critical thinking

  3. Critical Thinking Definition, Skills, and Examples

    scientific definition for critical thinking

  4. 6 Main Types of Critical Thinking Skills (With Examples)

    scientific definition for critical thinking

  5. Science-Based Strategies For Critical Thinking

    scientific definition for critical thinking

  6. Critical Thinking Skills

    scientific definition for critical thinking

VIDEO

  1. Breaking the Bias: Understanding Confirmation Bias in Decision-Making

  2. Critical thinking and deferring to experts

  3. Scientific Reasoning/Critical Thinking in Labs PER Interest Group Feb 23, 2024

  4. Critical Thinking Hacks! #facts #shorts

  5. Shortest Definition of Critical and Logical Thinking

  6. Shortest Definition of Critical and Logical Thinking

COMMENTS

  1. Critical Thinking

    Critical Thinking. Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms ...

  2. Critical thinking

    Critical thinking is the analysis of available facts, evidence, observations, and arguments in order to form a judgement by the application of rational, skeptical, and unbiased analyses and evaluation. The application of critical thinking includes self-directed, self-disciplined, self-monitored, and self-corrective habits of the mind, thus a critical thinker is a person who practices the ...

  3. What Is Critical Thinking?

    Critical thinking is the ability to effectively analyze information and form a judgment. To think critically, you must be aware of your own biases and assumptions when encountering information, and apply consistent standards when evaluating sources. Critical thinking skills help you to: Identify credible sources. Evaluate and respond to arguments.

  4. Critical thinking

    From the turn of the 20th century, he and others working in the overlapping fields of psychology, philosophy, and educational theory sought to rigorously apply the scientific method to understand and define the process of thinking. They conceived critical thinking to be related to the scientific method but more open, flexible, and self ...

  5. Defining Critical Thinking

    Critical thinking is, in short, self-directed, self-disciplined, self-monitored, and self-corrective thinking. It presupposes assent to rigorous standards of excellence and mindful command of their use. It entails effective communication and problem solving abilities and a commitment to overcome our native egocentrism and sociocentrism.

  6. PDF The Nature of Scientific Thinking

    science shifts and changes over time and some patterns specific to current day cutting edge science. You might consider using these lessons at the beginning of the school year before the first science unit is taught (when the scientific method is usually presented). The lessons also might be infused during the school year, making connections

  7. Scientific Thinking and Reasoning

    Abstract. Scientific thinking refers to both thinking about the content of science and the set of reasoning processes that permeate the field of science: induction, deduction, experimental design, causal reasoning, concept formation, hypothesis testing, and so on. Here we cover both the history of research on scientific thinking and the different approaches that have been used, highlighting ...

  8. Critical Thinking: A Model of Intelligence for Solving Real-World

    4. Critical Thinking as an Applied Model for Intelligence. One definition of intelligence that directly addresses the question about intelligence and real-world problem solving comes from Nickerson (2020, p. 205): "the ability to learn, to reason well, to solve novel problems, and to deal effectively with novel problems—often unpredictable—that confront one in daily life."

  9. Critical Thinking Definition, Skills, and Examples

    Critical thinking refers to the ability to analyze information objectively and make a reasoned judgment. It involves the evaluation of sources, such as data, facts, observable phenomena, and research findings. Good critical thinkers can draw reasonable conclusions from a set of information, and discriminate between useful and less useful ...

  10. Applying Critical Thinking

    Critical thinking refers to deliberately scrutinizing and evaluating theories, concepts, or ideas using reasoned reflection and analysis. The act of thinking critically implies moving beyond simply understanding information, but questioning its source, its production, and its presentation in order to expose potential bias or researcher subjectivity [i.e., being influenced by personal opinions ...

  11. What is Critical Thinking?

    Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action. Paul and Scriven go on to suggest that ...

  12. Critical Thinking

    Diane F. Halpern defined critical thinking as an attempt to increase the probability of a desired outcome (e.g., making a sound decision, successfully solving a problem) by using certain cognitive skills and strategies. Critical thinking is more than just a collection of skills and strategies: it is a disposition toward engaging with problems.

  13. Science and the Spectrum of Critical Thinking

    Both the scientific method and critical thinking are applications of logic and related forms of rationality that date to the Ancient Greeks. The full spectrum of critical/rational thinking includes logic, informal logic, and systemic or analytic thinking. This common core is shared by the natural sciences and other domains of inquiry share, and ...

  14. A Crash Course in Critical Thinking

    Here is a series of questions you can ask yourself to try to ensure that you are thinking critically. Conspiracy theories. Inability to distinguish facts from falsehoods. Widespread confusion ...

  15. Redefining Critical Thinking: Teaching Students to Think like

    Scientific thinking is the ability to generate, test, and evaluate claims, data, and theories (e.g., Bullock et al., 2009; Koerber et al., 2015 ). Simply stated, the basic tenets of scientific thinking provide students with the tools to distinguish good information from bad. Students have access to nearly limitless information, and the skills ...

  16. Revisiting the origin of critical thinking

    Interestingly, Robert Ennis, whose definition of critical thinking is widely discussed and accepted, suggested that 'critical thinking' originated with the progressive educators who decided to give a different name to Dewey's reflective thinking (Ennis, Citation 2015, p. 31). It is not clear why Ennis did not acknowledge the fact that ...

  17. Understanding the Complex Relationship between Critical Thinking and

    Critical thinking and scientific reasoning are similar but different constructs that include various types of higher-order cognitive processes, metacognitive strategies, and dispositions involved in making meaning of information. ... Although various other definitions of critical thinking have been proposed, researchers have generally coalesced ...

  18. Evidenced-Based Thinking for Scientific Thinking

    As Hyytinen, Toom, and Shavelson discussed in Chapter 3 of this book, critical thinking can be defined in many ways (Lai, 2011) and involves complex skills to follow reasons and evidence, question information, tolerate new ideas and clarity of thought, and interpret information and perspectives (Pascarella & Terenzini, 2005).It is one important dimension of scientific thinking because with ...

  19. PDF A Miniature Guide To Scientific Thinking

    This miniature guide is designed for administrators, faculty, and students. It consists of the essence of scientific thinking concepts and tools. For faculty it provides a shared concept of scientific thinking. For students it is a scientific thinking supplement to any textbook for any science course. Faculty can use it to design science ...

  20. Scientific Thinking and Critical Thinking in Science Education

    Scientific thinking and critical thinking are two intellectual processes that are considered keys in the basic and comprehensive education of citizens. For this reason, their development is also contemplated as among the main objectives of science education. However, in the literature about the two types of thinking in the context of science education, there are quite frequent allusions to one ...

  21. (PDF) Critical thinking: Definition and Structure

    Critical thinking is a vital skill for the 21st century, involving using rational standards to analyze and evaluate information, thoughts and situations. It aims to create new knowledge ...

  22. Science, method and critical thinking

    Science is founded on a method based on critical thinking. A prerequisite for this is not only a sufficient command of language but also the comprehension of the basic concepts underlying our understanding of reality. This constraint implies an awareness of the fact that the truth of the World is not directly accessible to us, but can only be ...

  23. Scientific thinking and critical thinking in science education · Two

    Scientific thinking and critical thinking are two intellectual processes that are considered keys in the basic and comprehensive education of citizens. For this reason, their development is also ...

  24. What influences students' abilities to critically evaluate scientific

    Research on the effectiveness of teaching critical thinking has found mixed results, primarily due to a lack of consensus definition of and assessment tools for critical thinking [15, 16]. Some argue that critical thinking is domain-general—or what Ennis refers to as the "general approach"—because it is an overlapping skill that people ...

  25. An empirical analysis of the relationship between nature of science and

    Critical thinking (CRT) skills transversally pervade education and nature of science (NOS) knowledge is a key component of science literacy. Some science education researchers advocate that CRT skills and NOS knowledge have a mutual impact and relationship. However, few research studies have undertaken the empirical confirmation of this relationship and most fail to match the two terms of the ...