With lots of kids heading to school this week, an old question comes back to the fore: Can thinking be separated from knowing?
Many people, and not a few educators, believe that the answer is yes. Schools, they suggest, should focus on developing students’ “critical thinking skills” rather than on helping them beef up their memories with facts and other knowledge about the world. With the Internet, they point out, facts are always within easy reach. Why bother to make the effort to cram stuff into your own long-term memory when there’s such a capacious store of external, or “transactive,” memory to draw on? A kid can google the facts she needs, plug them into those well-honed “critical thinking skills,” and – voila! – brilliance ensues.
That sounds good, but it’s wrong. The idea that thinking and knowing can be separated is a fallacy, as the University of Virginia psychologist Daniel Willingham explains in his book Why Don’t Students Like School? This excerpt from Willingham’s book seems timely:
I defined thinking as combining information in new ways. The information can come from long-term memory — facts you’ve memorized — or from the environment. In today’s world, is there a reason to memorize anything? You can find any factual information you need in seconds via the Internet. Then too, things change so quickly that half of the information you commit to memory will be out of date in five years — or so the argument goes. Perhaps instead of learning facts, it’s better to practice critical thinking, to have students work at evaluating all that information available on the Internet, rather than trying to commit some small part of it to memory.
This argument is false. Data from the last thirty years lead to a conclusion that is not scientifically challengeable: thinking well requires knowing facts, and that’s true not simply because you need something to think about. The very processes that teachers care about most — critical thinking processes such as reasoning and problem solving — are intimately intertwined with factual knowledge that is in long-term memory (not just found in the environment).
It’s hard for many people to conceive of thinking processes as intertwined with knowledge. Most people believe that thinking processes are akin to those of a calculator. A calculator has available a set of procedures (addition, multiplication, and so on) that can manipulate numbers, and those procedures can be applied to any set of numbers. The data (the numbers) and the operations that manipulate the data are separate. Thus, if you learn a new thinking operation (for example, how to critically analyze historical documents), it seems like that operation should be applicable to all historical documents, just as a fancier calculator that computes sines can do so for all numbers.
But the human mind does not work that way. When we learn to think critically about, say, the start of the Second World War, it does not mean that we can think critically about a chess game or about the current situation in the Middle East or even about the start of the American Revolutionary War. Critical thinking processes are tied to background knowledge. The conclusion from this work in cognitive science is straightforward: we must ensure that students acquire background knowledge in parallel with practicing critical thinking skills.
Willingham goes on to explain that once a student has mastered a subject — once she’s become an expert — her mind will become fine-tuned to her field of expertise and she’ll be able to fluently combine transactive memory with biological memory. But that takes years of study and practice. During the K-12 years, developing a solid store of knowledge is essential to learning how to think. There’s still no substitute for a well-furnished mind.
“Before you become too entranced with gorgeous gadgets and mesmerizing video displays, let me remind you that information is not knowledge, knowledge is not wisdom, and wisdom is not foresight. Each grows out of the other, and we need them all” — Arthur C. Clarke
Seeing the term “well-furnished mind” reminded me of the late, great Jacques Barzun, who I believe used those words to express what he took to be the proper aim of education. His book of essays on teaching and learning, Begin Here, is worth looking at in this age of standardized testing, school “reform,” and educational fads such as “21st-century skills.”
Also [WARNING: Shameless commercial plug ahead], for more of Willingham on this subject, a couple of posts on the Encyclopaedia Britannica blog:
“Education for the 21st Century: Balancing Content Knowledge with Skills”
“Flawed Assumptions Undergird . . . 21st-Century Skills”
Disclosure: I work at Britannica. (The opinions here are my own.)
Nick, having read The Shallows, it is a little bit comical to see the word cram in the same sentence as long-term memory. Besides Willingham’s explanations and research, from what I remember in your book, information can’t be drawn on as long-term memory until it moves from working memory into long-term memory, which doesn’t happen in the amount of time it takes for a student to read the information that needs to be “plugged into” the critical thinking portion of his/her brain. (I am just stating what you already know, but it feels good to say it, anyway.)
I think the best distinction would be between facts and truths. Truths predict facts; it’s the difference between being smart and being wise.
In some ways, this reminds me of Orwell’s Newspeak. By removing specific words from language, the Oceanian government was able to severely restrict dissident thought. While individual citizens undoubtedly had latent ideas of disagreement or discontentment, those ideas remained inchoate at best.
Even if “critical thinking skills” are separable from “knowledge”, they must be rooted in (if not tested through) something literal for digestion and especially future application. These skills need a “language” – and that language is knowledge. Case study curricula, such as those offered by Harvard Business School, are rooted in this understanding.
That being said, there was a great Atlantic piece several years ago extolling the benefits of a philosophy degree versus an MBA.
http://www.theatlantic.com/magazine/archive/2006/06/the-management-myth/304883/
If there is any need for more proof that advocates of the idea that Google brings automatic brilliance are themselves highly prone to neglecting their long-term memory, look no further than Aleks Krotoski (author of Untangling the Web: What the Web Is Doing to You) being interviewed for the Guardian. She flatly says that The Shallows presents “no empirical evidence” for the hypothesis that our brains are working less hard because of new digital networked technologies such as the Internet: http://www.theguardian.com/science/audio/2013/sep/10/tech-science-podcast-untangling-web (12:07 and onward).
BTW, Nicholas, you are “not a stupid chap” according to the interviewer. So that’s good.
She may have just read a summary of The Shallows on Twitter.