Teaching critical thinking: The first step is to pause and reflect

© 2021 Gwen Dewar, all rights reserved


Why do we fall for fallacies? Why do we get duped by lies? It isn’t because we lack brain power, and it isn’t because we are helpless to overcome our own biases. Instead, what’s really crucial is whether we take the time to pause and reflect: to consciously question our assumptions and analyze the evidence.


Here’s a test question for you to answer. Want to take a look?

“On an abandoned field, there is a patch of dandelions. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire field, how long would it take for the patch to cover half of the field?”

Go ahead. Give it a try.

Scroll down when you’re ready to continue.


[Image: a patch of dandelions in the grass]

The question is a variant of a test item invented by Shane Frederick, a psychologist who studies decision-making. Back in 2005 he created a tidy little packet of such questions, and, over the years, researchers have administered his test to hundreds of thousands of people. 

How did you respond?

Since you know this article is about critical thinking, you may have  approached the question with wariness. You stopped a moment to reflect and analyze. You wanted to be sure you were thinking clearly about the solution.

But if you didn’t pause and reflect — if you responded quickly, if you were distracted, if you felt anxious about being tested  — you might have listened to the part of your mind that provides you with intuitive, knee-jerk answers.  

This intuitive mind may have told you, “This is easy. Half the field, 48 days. The answer must be one-half of 48, or 24!”

And that’s wrong. If the patch of dandelions doubles each day, then the moment the field reaches 50% coverage, it’s merely 24 hours away from reaching 100% coverage. We should expect the patch to cover half the field on Day 47. Not Day 24.
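
If you like seeing the reasoning spelled out, here is a tiny simulation that simply applies the doubling rule and walks backward from full coverage. It’s just an illustrative sketch (written in Python), not anything from Frederick’s test.

    # Walk backward from full coverage. The starting size of the patch
    # doesn't matter; the doubling rule is all we need.
    coverage = 1.0   # fraction of the field covered on day 48 (all of it)
    day = 48

    while coverage > 0.5:
        coverage /= 2   # one day earlier, the doubling patch was half as big
        day -= 1

    print(f"The patch covers half the field on day {day}")   # prints day 47

Halving once from day 48 lands on day 47, which is why the intuitive answer of 24 never shows up.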

What exactly is being measured by this question?

Mathematical skills, to be sure. But the skills required are pretty basic. And most people — even most college students — don’t do very well on Frederick’s test questions. They have the mental tools to solve the problems, but they still get them wrong.

Moreover, when people get things wrong, they don’t come up with just any old wrong answer. Most come up with the same wrong answer.

For example, when asked about the patch that doubles every day, most people who get it wrong say the answer is 24.

And in fact, many of the people who get the right answer do so only after considering — and rejecting — that same, incorrect-but-appealing answer (Frederick 2005).

So there’s something intuitive about the answer 24. And getting the correct answer seems to depend on overriding this intuition.

When people screw up on this question, it’s usually because they fail to stop and think. They fail to engage in cognitive reflection.


Human beings everywhere are facing a terrible crisis of misinformation and gullibility. And it might seem like we’re doomed because we lack intelligence, or because we’re too committed to our political biases to approach things with an open mind.

After all, intelligence is linked with cognitive reflection. People who score higher on tests of cognitive ability (IQ) tend to perform better on Shane Frederick’s Cognitive Reflection Test (CRT).

Moreover, it’s a well-known fact: we’re all prey to “confirmation bias” — the tendency to seek out and favor information that confirms our own, pre-existing beliefs. When we encounter information that challenges those beliefs, we may try to ignore it. Or scramble to explain it away (Kahneman 2011).

But while intelligence is linked with cognitive reflection, the correlation isn’t very strong.

Yes, being good with numbers helps you perform well on the CRT, because the test questions often require mathematical thinking (Szaszi et al 2017). But general cognitive ability?

High IQ is no guarantee that you’ll ace the Cognitive Reflection Test. In one study, fewer than 50% of MIT students scored high on the CRT (Frederick 2005).

Nor is high cognitive ability a prerequisite for doing well. When it comes to the CRT, the most important thing isn’t the computational power of your brain. It’s whether you choose to use your brain resources to reason analytically — to switch away from an easy, intuitive, automatic mode of thinking, and into a mode that is slower, more deliberate, and effortful (Toplak et al 2014).

We also know that our motivations and biases — by themselves — don’t prevent us from discovering the truth.

When people score high on cognitive reflection, they are resistant to accepting and sharing false news headlines — even if those headlines appeal to their political biases (Pennycook and Rand 2019; Sindermann et al 2020).

And open-mindedness?

We need open-mindedness to learn new things, to discover better solutions, to recognize mistakes. But open-mindedness doesn’t ensure that we’ll think rationally. Not on its own.

What if you are open-minded to the point of accepting many claims uncritically?

Without critical thinking, open-mindedness can lead to the same errors and absurdities that we associate with stubborn closed-mindedness (Pennycook and Rand 2020).

So no. The most important ingredient for critical thinking isn’t high cognitive ability. Our prior beliefs and biases aren’t the biggest roadblocks to battling misinformation. Open-mindedness doesn’t protect us from getting duped.

What’s really crucial — the most important first step for critical thinking — is to pause and reflect. Be ready to consider new claims, but don’t accept them on their face. Take the time to question your assumptions, and weigh the evidence.

What does this look like? What specific things should we do when we encounter a new puzzle, claim, or story in the news?


I don’t have any special studies to cite here. But researchers and fact-checking experts tend to agree on these basic tips.

1. Realize that you need to shift into an active, skeptical mode.

When a new claim or story comes your way, make the conscious decision to turn on your deliberative, analytical mode of thinking. And be especially wary if the item in question evokes strong emotions. That’s the stuff that makes us especially gullible, and bad actors know it. 

2. Investigate the source.

Who is saying this? If it’s a claim or story, is there an author’s byline? Who is this person? Conduct an independent search on the web. Can you verify that this person exists? Are his or her credentials real? Who published this? Did it come from a reputable website or media organization?

3. Analyze.

Never accept claims on the basis of a slogan, headline, or tweet. These brief statements are meant to get your attention — to manipulate you, to get a click — and they often leave out crucial details, details that can altogether change your impression of the underlying story.

In addition, as factcheck.org points out, some stories are actually intended as parodies. The headlines don’t give them away. You have to read the story itself to realize it’s a joke.

4. Check for common manipulative ploys.

Is the item sensational, shocking, or emotional? Is the item in question merely an ad hominem attack, or a bit of trolling? Does its effect depend on activating an “us versus them” response? None of these things make the claim true. But they encourage us to be gullible. Don’t fall for it. Look for the hard evidence.

5. Watch your assumptions and check the facts.

You might be thinking this is breaking news. But is it really? Check the date of the material to be sure.

You might also assume that the claims are supported by evidence. But did the author cite any reputable sources? If so, do those sources actually say what the author claims they do? 

And use your common sense. If this is a big story, and it’s true, then you should be able to find it mentioned in the major media.

Not sure where to look for answers? Try searching the keywords along with the name of a reputable fact-checking organization — like Snopes.com or factcheck.org.

So what learning resources are available for teaching these habits to children, teenagers, and others? 


Cornell University offers this collection of articles, videos, and infographics about learning to evaluate media sources.

Mike Caulfield of the University of Washington has written a book called Web Literacy for Student Fact Checkers. It is free and available online here.

And here’s something innovative and pretty fun: Researchers have created a game that teaches players how to spot fake news and misinformation.

The game is called Bad News, and it’s available online, in a variety of languages.

It was created by researchers at the Cambridge Social Decision-Making Lab at Cambridge University, and it’s appropriate for people ages 12 and up. You can access the Bad News game here.

The researchers have also created a similar game, Bad News Junior, for kids ages 8-11. It’s available online in American English and British English versions.

In all versions of the games, you play the role of an unscrupulous web publisher interested in getting as much attention as possible. Along the way, you are coached in the techniques that real-world publishers use to spread misinformation.

Is the game effective? Does it improve your ability to spot fake news?

There’s reason to think so.

In a recent study, Melisa Basol and her colleagues randomly assigned approximately 200 young adults to play either (1) Bad News (the experimental condition), or (2) Tetris (the control condition), for 15 minutes.

Each participant was tested — before and after gaming — on his or her ability to detect unreliable social media posts.

And the results? Just 15 minutes of playing Bad News made a difference.

Compared with people in the Tetris control group, the folks who played Bad News improved their ability to spot misinformation techniques (Basol et al 2020). 

What else should we tell our kids about the need to pause and reflect?

The key point is that there isn’t any one lesson that is going to turn you into a good critical thinker.

Critical thinking doesn’t work like that. It isn’t something that you master one day, like learning to ride a bike, and then forever after excel at.

Instead, it’s an ongoing, effortful practice. You have to keep reminding yourself to turn on your conscious, reflective, deliberative brain. 

It’s also important for adults to become aware of the ways in which we — sometimes unintentionally — teach our kids to avoid critical thinking.

First, there’s the obvious. When we take an authoritarian approach to parenting — or teaching — we are sending a chilling message: your job is to follow orders. No questions allowed.

It’s the defining feature of authoritarianism — an insistence on unquestioning obedience. And that simply isn’t consistent with critical thinking.

But you don’t have to practice authoritarian discipline to deter critical thinking. 

As I’ve written elsewhere, I’ve seen preschool television shows teach the sort of sloppy thinking that can hold children back from achieving in STEM.

It happens at home and in the classroom, too — when adults unwittingly perpetuate fallacies or misconceptions. Read more about it in my article, “Critical thinking: Are we teaching our kids to be dumb?”

And I see other problems. Here in the United States, I’ve noticed a worrying trend in some schools. In the name of teaching argumentation — or persuasive writing — kids are being given some very dubious assignments.

Take, for instance, this question:

“Do violent video games make people more violent in real life?”

It’s the kind of question that kids are sometimes asked to debate about, or write about. And there’s nothing wrong with that. Not if kids are informed about the best-available scientific evidence. 

But in some cases, kids aren’t informed. And then what happens? What happens when we ask students to pick a side, and argue it without ever learning the facts?

Instead of teaching critical thinking, we’re teaching something else, something antithetical to critical thinking. 

Don’t gather data. Don’t analyze. Don’t test. Your knee-jerk intuitions are enough.

So what should we do about these problems?

  • We need to question — politely but assertively — cultural practices that reward kids for parroting misconceptions and reaching hasty conclusions.
  • We need to encourage kids to bravely speak up when something doesn’t make sense.
  • We need to welcome their questions, value their skepticism, and show them how to find — and weigh — the evidence.

Where does this start? It starts in our everyday interactions. 

If a child stumps you with a question (like, “Why do people cry when they are sad?” or “How do airplanes fly?” or “Is earth the only planet with life?”) be candid.

Tell your child, “That’s a good question! I don’t know.”

Then show your child how to track down answers — how to find out what scientists or scholars have learned about the subject.

And if it turns out nobody knows? 

Now you have even more to talk about. What could scientists do to solve this mystery? What kind of data would they need to collect? Or is this a puzzle that scientists might never be able to solve? 


More Parenting Science articles about critical thinking


References: Pause and reflect

Basol M, Roozenbeek J, van der Linden S. 2020. Good News about Bad News: Gamified Inoculation Boosts Confidence and Cognitive Immunity Against Fake News.  J Cogn. 3(1):2.

Frederick S. 2005. Cognitive Reflection and Decision Making. Journal of Economic Perspectives 19(4): 25-42.

Kahneman D. 2011. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.

Pennycook G and  Rand  DG. 2020. Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity, and analytic thinking. J Pers 88(2):185-200.

Pennycook G and Rand DG. 2019. Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition. 188:39-50.

Sindermann C, Cooper A, Montag C. 2020. A short review on susceptibility to falling for fake political news. Curr Opin Psychol 36: 44-48.

Szaszi B, Szollosi A, Palfi B, and Aczél B. 2017. The cognitive reflection test revisited: exploring the ways individuals solve the test. Thinking and Reasoning 23(3): 207-234. 

Title image of silhouette of thinking child by harshvardhanroy / istock

Banner image of girl wearing glasses by STUDIO GRAND WEB / shutterstock

Image of dandelions by frentusha / istock

Image of boy with pencil by Nikhil Patil / istock

Image of kids on their phones by Rido / shutterstock

Image of girl looking at frog by Melanie DeFazio / shutterstock
