Stanovich shows what science can tell us about myside bias: how common it is, how to avoid it, and what purposes it serves. Stanovich explains that although myside bias is ubiquitous, it is an outlier among cognitive biases. It is unpredictable. Intelligence does not inoculate against it, and myside bias in one domain is not a good indicator of bias shown in any other domain. Stanovich argues that because of its outlier status, myside bias creates a true blind spot among the cognitive elite—those who are high in intelligence, executive functioning, or other valued psychological dispositions.
They may consider themselves unbiased and purely rational in their thinking, but in fact they are just as biased as everyone else. Stanovich investigates how this bias blind spot contributes to our current ideologically polarized politics, connecting it to another recent trend: the decline of trust in university research as a disinterested arbiter.
How to assess critical aspects of cognitive functioning that are not measured by IQ tests: rational thinking skills. Why are we surprised when smart people act foolishly? Smart people do foolish things all the time. Misjudgments and bad decisions by highly educated bankers and money managers, for example, brought us the financial crisis of 2008. In The Rationality Quotient, we described our attempt to create the first comprehensive test of rational thinking. The book is very much an academic volume, full of statistics and technical details.

Keith E. Stanovich, "The Social Science Monoculture Doubles Down": Over the past 18 months, a number of significant events have occurred that were interpreted through two entirely different worldviews: COVID-19 lockdowns; the rise of the BLM movement; the riots and violence in major cities; the US election process and its aftermath; and vaccine safety.

"The Bias That Divides Us": As we sit here over six months after the initial lockdown provoked by COVID, the United States has moved out of a brief period of national unity into distressingly predictable and bitter partisan division. The return to this state of affairs has been fuelled by a cognitive trait that divides us.

David Boulton: So you wanted to do something that would have a practical benefit to the world from the point of view of cognitive psychology?
Keith Stanovich: Of a cognitive psychologist, right. When I entered the field, it was very exciting. I mean, we were going through the cognitive revolution, right? We had discovered information processing and the information processing framework was replacing a behaviorist framework. That was all very exciting, but it was still a lot of reaction time and micro-milliseconds of memory.
I think we kind of naturally gravitated to something where we thought we could make a difference. So, that book really was exciting for us. We took on the problem of studying context effects at the word recognition stage and, in some sense, with a bias.
The bias was that we were going to put some empirical meat on these conjectures of Frank Smith. So, the idea was that contextual reliance was supposed to be characteristic of the most fluent readers.
When we started running readers of different abilities in various context paradigms from information processing psychology, what we found was just the opposite. It was the poorer, struggling reader who was relying on semantic and syntactic context. So that sent us back for a big rethink. It was kind of that early work with West that led to what I later termed the interactive-compensatory model. Now this is an old and well-known story, but we have about twenty-five years of hindsight.
It was very…. He [Charles Perfetti] developed, basically, a parallel notion. I called mine the interactive-compensatory notion and he called his verbal efficiency theory. Not having to use capacity for word recognition is efficacious, because that capacity can be used for higher-level processing.
Keith Stanovich: Yes, for comprehension. One of your questions summarized the story quite well. It was, in a sense, very Popperian, because it was that shock that kind of sent us back to the drawing board that turned out to be so productive.

David Boulton: Some of it.

Keith Stanovich: Oh, yeah. A number of false starts. Jeanne Chall published one of them, and Bertram Bruce in Britain. I mean, they would be recognizable today, very contemporary. Then in 1974, you have Isabelle Liberman and her famous paper in the Journal of Experimental Child Psychology with the tapping, with the phoneme tapping task, phoneme tapping being so much harder than syllable tapping.
Keith Stanovich: Yes, and that was the next one. So, you can just see the volume getting turned up. Now the interest in phonological processes is a little louder but still a bit attenuated. So, we then published in that new flurry of papers where, again, these kinds of trends were coming together.
Then the field was more ripe for the phonological awareness work than it had been back then.

David Boulton: Maybe you could describe some of the sub-processors, some of the modules that are interacting here.

Keith Stanovich: Of course, Anne has given you part of that: foundational skills in phonological awareness.
But no one denies that a substrate of cognitive abilities in the phonological awareness domain is critical. And that alone will get you nowhere without the basic alphabetic insight that print maps speech, and it maps it, in English at least, at a fairly abstract and analytic level.
Keith Stanovich: Exactly. And then, what is the structure of this code, as you rightly point out, and the complexity of this code? I saw you talking with Anne about Matthew Effects, and very early in that process those effects start to kick in. I see that part of your interest in this series is the consequences of the acquisition of literacy. Those consequences feed back on the act of acquisition itself, and they ripple out into other cognitive structures, processes and tools, like vocabulary.
That is what we started to try to capture in a program of research that we called the print exposure program. After I published the Matthew Effects article (that was essentially a model synthesizing a lot of literature), we tried to study empirically some of the effects of differential exposure to print. For about a decade we were quite involved with studies looking at the effects of print exposure. One aspect of that program was in part methodological: how do you measure differences in exposure to print?
And we thought that we had done that in a variety of studies that illustrate your statement. Of course, we focused immediately on some of the obvious candidates, and certainly vocabulary was something we focused on a lot.

Keith Stanovich: We showed, with various correlational techniques and some longitudinal designs, that vocabulary growth was independently predicted by the amount of print exposure. I think Anne referred you to some of the work by Hayes on lexical density, looking at the tremendous differences between print and oral language.
Hayes, D. P. (1988). Speaking and writing: Distinct patterns of word choice. Journal of Memory and Language, 27, 572-585.

He basically has a program that rates the relative rarity of words. So, you put a corpus in (he has some oral transcripts from television shows and things, and hospital conversation) and it compares that to print. Then he has various statistical measures of lexical density. So, you take some arbitrary definition of a rare word, say a word ranked greater than 10,000 in the Kucera and Francis count or ranked greater than 5,000 in the Carroll, Davies and Richman count.
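Hayes's actual program isn't shown here, but the measure being described (look up each word's rank in a frequency count and tally the words beyond a rarity cutoff) is straightforward to sketch. A minimal Python illustration, with an invented rank table, cutoff, and function name standing in for the real Kucera-Francis or Carroll-Davies-Richman tables:

```python
def rare_word_proportion(text, rank_by_word, rare_cutoff=10_000):
    """Proportion of tokens that count as 'rare' under a frequency-rank table.

    rank_by_word maps a word to its rank in some frequency count
    (1 = most common). Words absent from the table are treated as rare;
    rare_cutoff is the arbitrary rank threshold for 'rare'.
    """
    tokens = [w.strip(".,;:!?\"'()").lower() for w in text.split()]
    tokens = [w for w in tokens if w]
    rare = sum(1 for w in tokens
               if rank_by_word.get(w, rare_cutoff + 1) > rare_cutoff)
    return rare / len(tokens)

# Toy data: a speech-like sentence vs. a print-like sentence.
ranks = {"the": 1, "dog": 850, "ran": 400, "fast": 300,
         "perambulated": 45_000, "briskly": 12_000}
print(rare_word_proportion("The dog ran fast", ranks))              # 0.0
print(rare_word_proportion("The dog perambulated briskly", ranks))  # 0.5
```

Running the same function over a large oral transcript and a large printed corpus, with a real rank table, is the comparison Hayes's lexical-density statistics formalize.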
What he did was to take text that had been made into movies, things like To Kill a Mockingbird, or something like that. He has the transcript of the movie and then he goes letter by letter: here are all the words beginning with U in the movie and here are all the words beginning with U in the book.
Of course, one is a huge long list and the other is this emaciated list of fairly frequent words.

Keith Stanovich: There was a tenfold difference in print. In this work we did a lot of analyses, a lot of regression-type analyses where we tried to partial out a lot of alternative candidates.
In the longitudinal studies, we partialled out the autocorrelation. So, we partial out the vocabulary at an earlier point in time to see if differential growth can be predicted. In the cross-sectional studies, we partial out the obvious cognitive candidates, again, like intelligence. So, if you have another variable that can predict vocabulary over and above intelligence, that is saying something.
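The "over and above" logic here is hierarchical regression: enter intelligence first, then test whether print exposure adds explained variance. A minimal sketch on simulated data; the variable names, coefficients, and sample are invented to show the technique, not to reproduce the actual studies:

```python
import numpy as np

def r_squared(y, X):
    """R^2 from an ordinary least-squares fit of y on the columns of X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

rng = np.random.default_rng(0)
n = 500
iq = rng.normal(size=n)                    # stand-in for an intelligence measure
print_exp = 0.4 * iq + rng.normal(size=n)  # print exposure, correlated with IQ
vocab = 0.5 * iq + 0.3 * print_exp + rng.normal(size=n)

ones = np.ones(n)
base = r_squared(vocab, np.column_stack([ones, iq]))
full = r_squared(vocab, np.column_stack([ones, iq, print_exp]))
print(f"R^2, intelligence alone:      {base:.3f}")
print(f"R^2, adding print exposure:   {full:.3f}")
print(f"Unique variance for exposure: {full - base:.3f}")
```

The longitudinal version is the same move: enter Time-1 vocabulary as the first block (removing the autocorrelation), then test whether print exposure predicts the residual growth.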
What we mean by "over and above" is that, of course, part of earlier print exposure is probably in the intelligence measure. All the earlier developmental effects (some of these studies, by the way, we did with adults)…

Keith Stanovich: So, all the earlier developmental effects of print exposure, okay, tons of those effects were already in the intelligence measure.
So, what do we go and do statistically? We take intelligence out. So we were quite excited to be able to find that. We studied an elderly group of people, Richard West and I.
I think they had a mean age of about seventy-eight. We replicated, in that study, kind of the classic thing you show with fluid and crystallized intelligence. Fluid intelligence drops with age. So we have our university sample and our forty-year-olds and our seventy-nine-year-olds. Fluid intelligence, of course, drops way off. The seventy-nine-year-olds are quite low.
Crystallized intelligence, the vocabulary and real-world knowledge, the declarative knowledge of the world, keeps rising across the age span. Then we add our measures of print exposure. We do a series of statistical analyses showing that the growth that you see in elderly people, the growth in crystallized intelligence, is almost all due to reading.

David Boulton: As an adjunct to the work that you were doing here, did you engage in, or know anybody else who engaged in, looking at the vocabulary count level of oral-language-only people whose native language had never developed a writing system?
Keith Stanovich: No.

David Boulton: Then the other thing is that in addition to vocabulary, did you ever get to the granularity level of being able to assess for abstraction: the capacity for abstraction, the dimensional extent of abstraction, or generalization, or some of the other things that come with learning to be an abstract processor in the ways that reading requires?
Keith Stanovich: We did several studies that started to address that, and one was published. We focused on (and again, this comes a bit from my interest in critical thinking) decontextualization ability: the ability to decontextualize, to stand aside from the immediate context and process abstractly. So again, the classic paradigm would be one of syllogistic reasoning with unbelievable conclusions.
You create situations where logical validity is in conflict with the real-world content of the conclusion. So, you give people a syllogism that is valid but has a conclusion that is unbelievable in the real world. Or conversely, you give people a syllogism that is invalid but has a conclusion that is very congruent with world knowledge. We found a small effect of print exposure on that type of ability.

Keith Stanovich: Yes, yes.
But that correlation, again, diminishes when you start partialling out other alternatives.
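The syllogism paradigm crosses two factors, so every item lands in one of four cells, and the diagnostic cells are the ones where logic and belief conflict. A small sketch of that validity-by-believability layout; the example syllogisms are invented stand-ins, not items from the published studies:

```python
from itertools import product

# Validity x believability design for belief-bias syllogisms.
items = {
    ("valid", "believable"): (
        "All mammals are animals. Dogs are mammals.",
        "Therefore, dogs are animals."),
    ("valid", "unbelievable"): (
        "All mammals can walk. Whales are mammals.",
        "Therefore, whales can walk."),
    ("invalid", "believable"): (
        "All flowers need water. Roses need water.",
        "Therefore, roses are flowers."),
    ("invalid", "unbelievable"): (
        "All mammals can walk. Chickens can walk.",
        "Therefore, chickens are mammals."),
}

for validity, belief in product(("valid", "invalid"),
                                ("believable", "unbelievable")):
    premises, conclusion = items[(validity, belief)]
    # Conflict cells: logic says accept but belief says reject, or vice versa.
    conflict = (validity == "valid") == (belief == "unbelievable")
    tag = "conflict" if conflict else "congruent"
    print(f"[{validity}/{belief}, {tag}] {premises} {conclusion}")
```

Decontextualized responding means judging the conflict items by validity alone; the small print-exposure effect mentioned above was on that kind of accuracy.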