More than six years have passed since Richard Arum and Josipa Roksa rocked the academic world with their landmark book, Academically Adrift: Limited Learning on College Campuses. Their study of more than 2,300 undergraduates at colleges and universities across the country found that many of those students improved little, if at all, in key areas—especially critical thinking.
Since then, some scholars have disputed the book’s findings—notably, Roger Benjamin, president of the Council for Aid to Education, in a 2013 article entitled “Three Principal Questions about Critical Thinking Tests.” But the fact remains that the end users, the organizations that eventually hire college graduates, continue to be unimpressed with their thinking ability.
In 2010, the Noel-Levitz Employer Satisfaction Survey of over 900 employers identified “critical thinking [as] the academic skill with the second largest negative gap between performance satisfaction and expectation.” Four years later, a follow-up study conducted by the Association of American Colleges and Universities found little progress, concluding that “employers…give students very low grades on nearly all of the 17 learning outcomes explored in the study”—including critical thinking—and that students “judge themselves to be far better prepared for post-college success than do employers.”
As recently as May 2016, the firms PayScale and Future Workplace reported, based on a survey of over 76,000 managers and executives, that 60 percent of employers believe new college graduates lack critical thinking skills.
Clearly, colleges and universities across the country aren’t adequately teaching thinking skills, despite loudly insisting, to anyone who will listen, that they are.
How do we explain that disconnect? Is it simply that colleges are lazily falling down on the job? Or is it, rather, that they’re teaching something they call “critical thinking” but which really isn’t?
I would argue the latter.
Traditionally, the “critical” part of the term “critical thinking” has referred not to the act of criticizing, or finding fault, but rather to the ability to be objective. “Critical,” in this context, means “open-minded,” seeking out, evaluating and weighing all the available evidence. It means being “analytical,” breaking an issue down into its component parts and examining each in relation to the whole.
Above all, it means “dispassionate,” recognizing when and how emotions influence judgment and having the mental discipline to distinguish between subjective feelings and objective reason—then prioritizing the latter over the former.
I wrote about all this in a recent post on The Chronicle of Higher Education’s Vitae website, mostly as background for a larger point I was trying to make. I assumed that virtually all the readers would agree with this definition of critical thinking—the definition I was taught as a student in the 1980s and which I continue to use with my own students.
To my surprise, that turned out not to be the case. Several readers took me to task for being “cold” and “emotionless,” suggesting that my understanding of critical thinking, which I had always taken to be almost universal, was mistaken.
I found that puzzling, until one helpful reader clued me in: “I share your view of what critical thinking should mean,” he wrote. “But a quite different operative definition has a strong hold in academia. In this view, the key characteristic of critical thinking is opposition to the existing ‘system,’ encompassing political, economic, and social orders, deemed to privilege some and penalize others. In essence, critical thinking is equated with political, economic, and social critique.”
Suddenly, it occurred to me that the disconnect between the way most people (including employers) define critical thinking and the way many of today’s academics define it can be traced back to the post-structuralist critical theories that invaded our English departments about the time I was leaving grad school, in the late 1980s. I’m referring to deconstruction and its poorer cousin, reader response criticism.
Both theories hold that texts have no inherent meaning; rather, meaning, to the extent it exists at all, is entirely subjective, based on the experiences and mindset of the reader.
Thomas Harrison of UCLA, in his essay “Deconstruction and Reader Response,” refers to this as “the rather simple idea that the significance of the text is governed by reading.”
That idea has been profoundly influential, not only on English faculty but also on their colleagues in the other humanities and even the social sciences. (Consider, for example, the current popularity of ethnography, a form of social science “research” that combines fieldwork with subjective story-telling.)
Unfortunately, those disciplines are also where most critical thinking instruction supposedly occurs in our universities. (Actually, other fields, such as the hard sciences and engineering, probably do a better job of teaching true thinking skills—compiling and evaluating evidence, formulating hypotheses based on that evidence, testing those hypotheses for accuracy before arriving at firm conclusions. They just don’t brag about it as much.)
The result is that, although faculty in the humanities and social sciences claim to be teaching critical thinking, often they’re not. Instead, they’re teaching students to “deconstruct”—to privilege their own subjective emotions or experiences over empirical evidence in the false belief that objective truth is relative, or at least unknowable.
That view runs contrary to the purposes of a “liberal arts” education, which holds the search for truth as the academy’s highest aim. Indeed, the urge to deconstruct everything is fundamentally illiberal. The Heritage Foundation’s Bruce Edwards calls it “liberal education’s suicide note” in that it suggests the only valid response to any idea or situation is the individual’s own—how he or she “feels” about it.
Unfortunately, such internalization of meaning does not culminate in open-mindedness and willingness to examine the facts and logic of differing views. Rather, it leads to the narrow-minded, self-centered assumption that there is a “right” way to feel, which automatically delegitimizes the responses of any and all who may feel differently.
All of this has a profound impact on students and explains a great deal of what is happening on college campuses today, from the disinvitation (and sometimes violent disruption) of certain speakers to the creation of “safe spaces” complete with Play-Doh and “adult coloring books” (whatever those are—I shudder to think). Today’s students are increasingly incapable of processing conflicting viewpoints intellectually; they can only respond to them emotionally.
More to the point, that explains why employers keep complaining that college graduates can’t think. They’re not being taught to think. They’re being taught, in too many of their courses, to “oppose existing systems”—without regard for any objective appraisal of those systems’ efficacy—and to demonstrate their opposition by emoting.
That may go over just fine on the quad, but it does not translate well to the workplace.
—
This article is republished with permission from the James G. Martin Center for Academic Renewal.