When it comes to avoiding nasty surprises, most of us rely on our experience to get by. We understand less of the physical world than we should, but we have our hunches, our clinical judgment, our gut feelings, and sometimes we’re successful. When we link vaccines to autism or fluoridated water to common cancers, we don’t do so well.
Getting along with people, we operate a little differently. The late Jerome Bruner suggested that we use narrative thinking to predict human behavior more often than scientific, paradigmatic thinking. That is, in accounting for the way someone behaves, storytelling is more convincing to parents, juries, and newspaper readers than models from molecular biology. Tales of bad blood are more likely to circulate than predictions about mutated genes.
These two ways of understanding are not mutually exclusive. We use them both, depending on the circumstances. A few people have excelled at using both, such as Sir Peter Medawar, a Nobel prize-winning biologist who could write about science like a detective story. Among themselves, though, neuroscience and psychology researchers communicate through paradigmatic journals that are scientific but hard to read. Because the researchers share the same conceptual background, the reading is enlightening even if it is chewy.
That paradigmatic style has been a mistake in communicating to the public, which responds better to narrative thinking. Even from your own psychology classes, what do you remember best? If the time since you enrolled in psychology classes is measured in decades rather than months, you may remember the Kitty Genovese story about 38 bystanders who listened to her cries and supposedly failed to call the police; the Milgram study about lethal shocks; and two others, Zimbardo’s prison experiment and Sherif’s Robber’s Cave study, that had to be stopped early because they triggered violent behavior in kids or college students. People of refined sensitivity would add others. They were all of questionable value, though memorable.
The Kitty Genovese murder did occur, but it did not feature 38 silent bystanders. Conclusions from the Milgram study have reportedly been distorted. Even Phineas Gage of tamping iron fame needs reassessing. In the intro course in psychology that I took, long ago, it was the Jukes and the Kallikaks that misled faculty and students. You don’t see them anymore, fortunately, but they made great stories.
It’s interesting to ask why we absorb stories and are satisfied with them to predict the behavior of people around us. One suggestion has been that we need the Who of behavior as much as the What, Where, and How of it to make sense of what others will do next. A narrative account may not rule out chance as an explanation any better than a paradigmatic account does, but it may feel truer or more realistic.
However, the assurance that our brains are prediction machines has to include making sense of our friends and enemies, too. How we do this can be described in probabilistic terms or in neural terms or, of course, both. For a practical example, consider our reactions to the COVID-19 pandemic. It is unprecedented in our experience, and its outcome depends on how our friends and their viral shedding circulate.
Whereas for tips on the spread of a virus we rely on epidemiologists, for other people we mentalize. From our friends we build a sense of who we are and then use that sense to predict how others will behave. Some have argued that mirror neurons are responsible, while others have implicated the default mode network.
PSYCHO: Will it hurt? If you are starting a new drug, where do you turn for understanding? Do you read package inserts full of facts, or news accounts featuring others, maybe a biased sample, who have volunteered their experiences?
It’s amazing that we can learn so much from the experiences of others. Starting in childhood, at around three or four years, our earlier observations of people coalesce into a theory of mind. The awareness of other minds is not present at birth, nor is it a logical inference; it develops poorly in some people who are logical but autistic. Nonetheless, whether labeled theory of mind or mentalizing, this awareness is the basis for much of our guesswork about how others will behave.
We vary in our use of this mode of understanding. Simon Baron-Cohen has argued for years that men are rather paradigmatic in their thinking, and in the extreme tend to be autistic, while women are more narratively oriented empathizers. As with most dual-process theories in psychology, it’s a good idea to keep in mind that most of us are probably lumped in the middle. Genes are not destiny when it comes to behavior, which is agreeably plastic. Baron-Cohen has freely admitted that men and women overlap, although this does not assure us that empathizing and systemizing evolved together.
SOCIAL: It has been suggested that mentalizing may lead us into the fundamental attribution error, attributing the behavior of others to the traits in their personalities while we ascribe our own behavior to the situations we confront.
I would not consider this one of evolution’s many flawed outcomes, because the fundamental attribution error is arguably a false dichotomy. The distinction between one’s disposition and the situations one confronts obscures a single interaction, analogous to the false dichotomy of nature and nurture, which overlooks the fact that the two always interact in influencing behavior. Recognizing this removes the paradoxical sting from efforts to localize such processes in the brain.