Introduction

An educational strategy for promoting critical thinking needs to grow out of a diagnosis of why critical thinking is lacking. It is one thing if you believe that people don’t know how to be good critical thinkers – in which case the educational goal is to provide them with the skills and dispositions to be better critical thinkers (preferably starting at as early an age as possible). It is a different thing if you believe that even if people have the capability to be critical thinkers, certain societal influences (advertising, for example) interfere with, or even disincentivize, actually enacting those capabilities – in which case the educational goal, apart from fostering the capabilities of critical thinking, is to help learners recognize and resist these countervailing influences or situations. But it is yet a third thing to confront a social climate that is actively hostile to the idea of critical thinking, in which there is a concerted effort to promote false information and an anti–critical thinking ethos. (In 2012 the Texas Republican Party in the United States adopted a platform calling for a ban on teaching critical thinking in schools, though they reversed this two years later.) In such a climate, it is not enough simply to teach the skills and dispositions of critical thinking if one wants students (and the adults they grow into) to actually be and act as critical thinkers. Defending critical thinking today requires a diagnosis of the very real agencies and social processes working against it.

The Assault on Critical Thinking

This climate of hostility toward critical thinking needs to be understood in relation to three interdependent factors, which we see not only in the US but in countries around the world.

Post-Truth Politics

It is hardly a revelation to recognize that politicians have a slippery relationship with the truth. Spinning, deflecting, and outright lying are as old as the practice of politics itself. What is new is a wholesale rejection of any factual basis for resolving political disagreements – everything is partisan, to be used in advancing one’s political interests – and, even more strikingly, an open admission of a willingness to lie when necessary. The keynote for this trend was sounded some 20 years ago, in an infamous remark by a senior Bush adviser (widely identified as Republican strategist Karl Rove), recounted by journalist Ron Suskind:

A cluster of particularly vivid qualities was shaping George W. Bush’s White House through the summer of 2001: a disdain for contemplation or deliberation, an embrace of decisiveness, a retreat from empiricism … In the summer of 2002 … I had a meeting with a senior adviser to Bush … The aide said that guys like me were “in what we call the reality-based community,” which he defined as people who “believe that solutions emerge from your judicious study of discernible reality.” I nodded and murmured something about enlightenment principles and empiricism. He cut me off. “That’s not the way the world really works anymore,” he continued. “We’re an empire now, and when we act, we create our own reality. And while you’re studying that reality – judiciously, as you will – we’ll act again, creating other new realities, which you can study too.”

Suskind, 2004

The open admission by a major establishment political party that facts don’t matter, and that the fabrication of what would later be called “alternative facts” is a legitimate tool of political strategy, marked a fundamental shift in political discourse for any democratic society. Of course dissimulation happens in politics, by politicians of every political stripe, and in every nation; but it is a step further to claim that the very expectation that politicians ought to tell the truth, and can be held accountable when they do not, is no longer operative.[1]

The nadir of this trend (so far) is the recent political career of former American President Donald Trump. We have never seen an American political figure lie so much, and so blatantly. In an interview, he admitted as much:

We asked him why, as president, he thought it was OK for him to continually tell the American people things that were not true, to lie again and again and again … And he said to us, “You know, there’s a beautiful word, and it’s called disinformation.”

Moran, 2021

About COVID, Trump has said, “Coronavirus numbers are looking MUCH better, going down almost everywhere,” “99%” of COVID-19 cases are “totally harmless,” and “We now have the lowest fatality (mortality) rate in the world” – all statements that, as we now know, his own advisors were telling him were false even as he said them (Paz, 2020).

The effect of all this is corrosive to critical thinking because the overt claim is that critical thinking is itself partisan. In a context like this, it is: when a political party or movement aligns itself against “the reality-based community,” any opposition to that stance will be represented as motivated by political bias against that party or movement – which is then put forward as further proof that, as the saying goes, “the facts have a liberal bias.” Reality-based institutions that seek to promote critical thinking, like fact-based journalism, higher education, scientific research, and nonpartisan public agencies and branches of government, need to be attacked and discredited, not only to blunt their public legitimacy and impact, but to drag them into the post-truth framework: they are recast as agencies of elitism whose claims threaten to make the gullible feel uncomfortable and inadequate – which, in this framing, in itself entitles them to ignore those claims. The rejection of COVID vaccinations and of simple safety advice like mask wearing, in defiance of all scientific evidence and advice from medical experts, exemplifies the consequences of a systematic attempt to foster an anti-intellectual attitude for the sake of advancing a political agenda.

The Media, and Social Media

The attack on reality-based news institutions is not just a political ploy; it is also a marketing strategy. In the US, Fox News (along with smaller, more fringe television and print outlets) has actively modelled itself as the counterbalance to the ostensibly “liberal” news media. These outlets traffic not only in blatant falsehoods, but also in constant attacks on other news networks, newspapers, and reporters, purporting to expose their “bias” and thereby reinforcing the idea that there is no reason to watch those rivals or take them seriously.

Today, it is impossible to talk about the ecosystem of disinformation and propaganda without looking at the pervasive influence of social media, which has become the primary source of political ideas and information for more and more people. The quip often attributed to Churchill, “A lie gets halfway around the world before the truth has a chance to get its pants on,” fits the Internet perfectly: the speed and scope of connectivity, a culture of instantly reposting and promoting items you like, a pervasive susceptibility to scandalous and sensationalistic “clickbait,” and coordinated efforts by domestic and international provocateurs mean that disinformation and misinformation can “go viral” in a matter of moments; meanwhile, efforts to recall or correct such misinformation are at a huge psychological and technological disadvantage once it is “out there.”

There is a great deal of talk about “the algorithm,” but not always an appreciation of its consequences. The core idea of the Internet’s algorithms, developed first around commercial advertising (which generates enormous revenue for social media companies), is that once someone shows an interest in a product – through their clicks, their web searches, their purchases, and so on – they are bombarded with further information about that or similar products, with an eye toward getting them to buy more. The problem is what happens when this same technical capacity gets applied to news, information, and political commentary: apart from the personal choices people make to follow, view, or read sources of news and information that align with their political views and preferences, the algorithm pushes additional material into their feeds and inboxes, accelerating the creation of a “bubble” in which more and more of what they see reinforces what they already think and believe. The phrase “a consumer of information” captures this problem: the idea that people should be able to choose the information they want to receive, and have more and more of it pushed to them, as if information were just another product picked to fit their preferences and desires.
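To make this dynamic concrete, the following deliberately simplified sketch in Python (not any platform’s actual ranking system; the catalogue, item names, and topics are invented for illustration) shows what happens when recommendations are driven purely by prior engagement: each click feeds back into the ranking, so the simulated feed quickly converges on a single kind of content.

```python
from collections import Counter

# Hypothetical catalogue: forty items, each tagged with a single topic.
CATALOGUE = [
    ("item%02d" % i, topic)
    for i, topic in enumerate(["conspiracy", "politics", "science", "sports"] * 10)
]

def recommend(interest, seen, n=3):
    """Rank unseen items by how often the user has engaged with their topic."""
    unseen = [(name, topic) for name, topic in CATALOGUE if name not in seen]
    return sorted(unseen, key=lambda item: interest[item[1]], reverse=True)[:n]

# Feedback loop: the user starts with a single "conspiracy" click and then always
# clicks the top-ranked recommendation, so each round reinforces the same topic.
interest, seen, shown = Counter(["conspiracy"]), set(), []
for _ in range(10):
    name, topic = recommend(interest, seen)[0]
    seen.add(name)
    interest[topic] += 1       # the click feeds straight back into the ranking
    shown.append(topic)

print(Counter(shown))          # Counter({'conspiracy': 10}): nothing else gets through
```

Even in this toy version, one initial click is enough to crowd everything else out of the top of the feed; real recommendation systems are vastly more sophisticated, but the reinforcing loop described above works on the same principle.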

Cognitive Biases

The third factor, interacting with the previous two, is our growing understanding of the psychology of belief formation, and the recognition that, for a variety of reasons including our cognitive makeup, this process is often much less rational than we might imagine it to be. One explanation for cognitive bias comes from recent research on “fast and slow thinking” – the interplay between “System 1” (S1) and “System 2” (S2):

Daniel Kahneman described S1 as “fast, automatic, frequent, emotional, stereotypic, and unconscious,” describing what are more colloquially known as “gut reactions.” S2 is “slow, effortful, infrequent, logical, calculating, and conscious,” which is closer to what we tend to conceive of as thinking – taking a step back, slowing down, consciously assessing and reasoning … S1 is where we find what these days are called “implicit biases,” the kinds of preconscious predilections that shape how you react to a situation before “you,” the conscious, thinking you, is even fully aware of what’s going on.

Roberts, 2020

It is important to realize that, from an evolutionary and pragmatic perspective, “fast thinking” is often beneficial: not every decision or choice can receive, or needs to receive, careful, detailed analysis; sometimes there is simply not enough time. The difficulty, in this context, is how fast thinking intersects with information overload:

We are bombarded by information. It comes from other people, the media, our experience, and various other sources. Our minds must find means of encoding, storing, and retrieving the data we are exposed to. One way we do this is by developing cognitive shortcuts and models. These can be either useful or unhelpful. Confirmation bias is one of the less-helpful heuristics which exists as a result … Confirmatory data is taken seriously, while disconfirming data is treated with skepticism.

Farnam Street, 2017

A related dimension of cognitive bias is what is called “motivated reasoning”: we often have a pre-existing desire to reach a particular conclusion, and so we are highly selective, favouring evidence that supports that conclusion and dismissing counterevidence. This motivation is especially strong when a system of belief is closely tied to one’s sense of identity and place in a community. Willard V. Quine and J. S. Ullian describe this kind of bias:

The desire to be right and the desire to have been right are two desires, and the sooner we separate them the better off we are. The desire to be right is the thirst for truth. On all counts, both practical and theoretical, there is nothing but good to be said for it. The desire to have been right, on the other hand, is the pride that goeth before a fall. It stands in the way of our seeing we were wrong, and thus blocks the progress of our knowledge.

Quine and Ullian, 1978

But it gets even worse. Another potential cognitive bias is the “backfire effect”: in certain situations, being presented with counterevidence and counterarguments actually makes people defensive, so that they hold onto their current (mistaken) beliefs even more strongly:

Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point … Suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before. Indeed, he may even show a new fervor about convincing and converting people to his view.

Festinger, Riecken & Schachter, 2017

For advocates of critical thinking, these dimensions of cognitive bias, and their pervasiveness, are particularly disturbing because they suggest that the anti–critical thinking social and political trends described above interact strongly with tendencies in our very psychological makeup. Each exacerbates the other: strategies of disinformation exploit our tendencies to believe things that we should not, while conversely the strong desire to preserve our beliefs drives us in the direction of “consuming” information that reassures and gratifies us that we were right all along.

Two Examples

I want to describe here two kinds of distorted belief systems that are rampant today and show how these three factors (a post-truth environment, the media and information systems we have created, and our predispositions toward cognitive biases) interact and reinforce each other, especially during a stressful experience like COVID.

Conspiracy Theories

The basic appeal of a conspiracy theory is that a person is convinced that they, as individuals and as members of a group, know something that others do not: “We see the pattern, we see the proof.” Feeling part of a special group of initiates to the secret is key to this appeal. Usually the conspiracy is based on a sensationalistic premise (aliens have visited earth; COVID vaccines contain microchips that allow people to be tracked by the government, or rewrite your DNA, or make you magnetic) (Cassata, 2021). In a certain sense, the very outlandishness of these conspiracies reinforces the desire to know more about them: they are a kind of “clickbait” for the imagination. Conspiracy theories also grow out of a particular mindset: suspicious, sometimes even paranoid, and thoroughly mistrusting of official sources of information.

As a result, conspiracy theory true believers are highly susceptible to lies and disinformation, and impervious to counterevidence, for all three of the reasons noted above: they trust their sources of information over anybody else’s; they are tied into media and information systems, including social media feeds, that continually reinforce their beliefs and build up their sense that they, and only they, know what is really going on; and they begin from the certainty that they are right and then find arguments and evidence to reinforce that certainty, perfectly illustrating the concept of “motivated reasoning.” Moreover, there are places they can go to find “proof” of these conspiracies, often full of pseudo-facts and pseudo-evidence, confirming them in their convictions. Traditional critical thinking interventions will therefore not work: conspiracy advocates believe that they are the true critical thinkers and that those who do not see the conspiracy are the dupes. Anyone who questions or challenges them, whatever evidence they might provide, is simply dismissed as a perpetrator of the conspiracy. The paradox is clear.

We Understand and Respect You; They Do Not

Especially for groups consumed by resentment, who feel mistreated, misunderstood, and threatened, it can be very appealing when political figures, media personalities, or others tell them: “Those people have contempt for you. They don’t understand you or your grievances. But I (or we) do.” This appeal takes many forms and is targeted toward many different kinds of groups, but in the context of faux-populist politics the dynamic is invariably anti-elite, anti-establishment, anti-science, and anti–fact-based journalism. This strategy both draws disaffected groups toward the sympathetic figure who claims to appreciate them and reinforces their sense of resentment and insecurity toward those others.

In the context of COVID, for example, Trump said: “People are tired of COVID. I have the biggest rallies I’ve ever had, and we have COVID.” He continued: “People are saying whatever. Just leave us alone. They’re tired of it. People are tired of hearing Fauci and all these idiots” (Collins & Liptak, 2020). This sort of appeal overlaps with the mindset of conspiracy theorizing, and it poses an especially difficult challenge for critical thinking interventions: when you present evidence or arguments against such a person’s point of view, you reinforce and feed the underlying resentment, confirming that you are just another of those elites who do not respect or understand them. Indeed, the more compelling the evidence or arguments, the more threatening they feel, and so all the more reason to reject them – not because of their content, but because of their source. It is another paradox.

Conclusion

There is much more to be said about the anti–critical thinking tendencies of today’s society, the sources of such influences, and the social and political motivations behind them. They are a threat to democracy, to public deliberation and debate, and to the ethos of a fact-based polity. Under the conditions of COVID, schools have faced an especially challenging dilemma: we expect them to be sites that promote critical thinking, but instead they have become ground zero for the militant refusal to have children vaccinated or wear masks. Sixteen US states prohibit schools from requiring vaccinations or masks for students (Perez, 2021). In one Florida school, students who get vaccinated are required to stay home for 30 days and miss school (Qamar, 2021).

While some manifestations of these trends are all too visible in the rise of certain contemporary public and political figures, understanding what is going on requires recognizing elements of these threats across the political spectrum, in those we might agree with as well as in those we do not (otherwise we risk making the very same mistakes ourselves). COVID denialism, for example, is not only found on the Trumpian fringe; there are left-wing versions as well (Christou, 2020).

My main purpose here is to suggest that the traditional model of teaching and promoting critical thinking – fostering the skills and dispositions of being a critical thinker, advocating for and trying to model critical thinking in our own speech and behaviour – is no longer enough. Responding to the kinds of threats to critical thinking recounted here requires a different kind of intervention – one that is sensitive to the psychological, emotional, and tribal dynamics that have combined to create a culture actively hostile to critical thinking, and one that recognizes that traditional assumptions and approaches often exacerbate the very resistance and hostility they aim to overcome. In some cases this may require a more indirect, circuitous approach, involving strategies of intervention that do not look very much like traditional critical thinking instruction. In other cases it may look more like a kind of therapy, trying to identify and address the underlying sources of and motivations for resistance to critical thinking. And in some other cases it just may not be possible at all.