Dreams Help You Mind-Read, Science Says

Dear CF,

Thought you might be interested in the following finding on dreaming, which I read today in Yahoo! News as I blearily sipped my tea:

A recent study by Walker and his colleagues examined how rest – specifically, rapid eye movement (REM) sleep – influences our ability to read emotions in other people’s faces. In the small analysis of 36 adults, volunteers were asked to interpret the facial expressions of people in photographs, following either a 60- or 90-minute nap during the day or with no nap. Participants who had reached REM sleep (when dreaming most frequently occurs) during their nap were better able to identify expressions of positive emotions like happiness in other people, compared with participants who did not achieve REM sleep or did not nap at all. Those volunteers were more sensitive to negative expressions, including anger and fear.

They noticed that sleep-deprived volunteers had reduced prefrontal activity. (Haven’t seen the actual paper, but this is obviously a different prefrontal area than the part which, when damaged, removes your ability to inhibit imitative actions set up by mirror neurons.) The article suggests that, without sleep, there’s an evolutionary advantage to remaining hyper-sensitive to negative emotion … all your available RAM goes into processing situations that might lead to harm.

It bolsters, in an interesting way, Vaillant’s claim in the Grant Study (which I wrote about here) that negative emotion has a more immediate payoff than positive emotion (remember the doctor who wouldn’t open the box full of his own testimonials?).

I’m interested, too, in how sleep, then, must affect what Gallese would call our “mind-reading” ability, “the activity of representing specific mental states of others, for example, their perceptions, goals, beliefs, expectations, and the like.”

I thought you’d be especially interested in the dreaming aspect of all this. The authors of the study suggest that dreams strip memories of their negative emotion. Which makes a sort of sense… particularly since (speaking totally subjectively, and I think you kind of agree with me) long periods of aimless or boring dreaming seem to me to correlate to periods of negative emotion. As soon as a little drama enters the picture, even if it’s an invasion of Indian ladies with casseroles, things seem to take a turn for the better. This makes me wonder, totally irresponsibly, about possible links between storytelling—narrative—and empathy.



Brain Mirrors, Brain Motors

Brains, brains, glorious brains. I wrote about Sapolsky’s lecture on how an interest in religion might be neurologically motivated because I feel R.S. galumphs rather irresponsibly into dangerous ideological territory. As neuroscience creeps closer to the murkily catholic discipline known as cognitive science, critics often point to what they see as the sinister goal of the field—to someday “see” thought, demystify experience and offer a biologically plausible but totally inadequate account of what it means to think and be alive.

Sandra Aamodt, former editor of Nature Neuroscience, writes the following in a book review about the tendency in “pop” neuroscience to speculate about the disorders of various historical figures:

wherever the historical Mt. Sinai actually was and whatever occurred either at its base or at its summit, Mosaic hypoxia is not likely to explain much about his subsequent teachings. Although seizures may have produced the visions of St. Theresa of Avila, they do not explain her capacity for vivid communication or, for that matter, her organizational abilities as a religious leader. Trying to explain too much with reductive biomedical arguments is an occupational hazard of popular science; despite the examples adduced here, the authors largely avoid this pitfall (for example, there is no mention of El Greco’s putative astigmatism).

To wit: there are things that neuroscience should not do. Trying to retroactively construct Shakespeare’s brain based on his writings is one of those things (it has been tried). But there are things it can help with: understanding the neural bases of empathy, for instance. Or tracing the neuroanatomical links between action, intention and spectatorship. It can help us, in other words, think a little more critically (and yes, biologically) about what it means to be an audience and how we participate in the stories we hear or read.

The skeleton key to this sort of enterprise is the now-famous phenomenon of mirror neurons, a cortical system which, when you watch someone grasp a baseball, increases the excitability of your own motor cortex and baseball-grasping muscles. Basically, this is a class of neurons that discharges both when you grasp a baseball and when you watch someone else grasp one. In their 1998 paper describing this discovery, Gallese and Goldman write that “every time we are looking at someone performing an action, the same motor circuits that are recruited when we ourselves perform that action are concurrently activated.”

Although the phenomenon was first observed in monkeys, it’s also been found to operate in humans, and on a much broader scale. Gallese and Goldman suggest that the mirror neuron system enables a kind of “mind-reading,” which they define as “the activity of representing specific mental states of others, for example, their perceptions, goals, beliefs, expectations, and the like.”

G and G take it as read that humans are pretty good at developing some internal representation of a conspecific’s mental state, and offer two mechanisms through which we might do so. The first, amusingly named “theory theory” (TT), speculates that we operate according to strict logical principles and create a “commonsense theory of mind” which acknowledges that only one’s own mind is knowable through introspection, and that we have no direct access to the minds of others. All we can do is posit causal relationships and explanations. It’s an “if A, then B” sort of model that will give us the subject’s emotional state as output. For example, if we see John get hit in the face by Sam, we’ll assume that since people don’t usually like to get punched, especially not in the face, it is probably safe to conclude that John is angry at Sam.

The second, “simulation theory” (ST), suggests that we actually guess at someone’s emotional state by “putting ourselves in their shoes,” creating an entire system of pretend desires and beliefs through which we filter the available information and come up with an output. Gallese and Goldman offer the following example: Mr. A and Mr. B are sharing a cab to the airport and get caught in a traffic jam. They get to the airport 30 minutes after their scheduled departure times. Mr. A is told his flight left on time; Mr. B is told his left just five minutes ago. Who was more upset? 96% of people say Mr. B was more upset. According to TT, we would need to come up with some sort of psychological law for why this should be so. According to ST, the “law,” even if we could find it, would be a product of our own internal decision-making mechanism, with all its preferences and beliefs.

Now here’s where it gets tricky. Yesterday Slate published an article on whether or not Judge Sotomayor really would be a better judge because of her Latinaness and womanity. Dahlia Lithwick mentions a phenomenon called “imaginative identification” which she explains as follows:

The gist of it is that in order to get ahead in the world, you learn to see life through the eyes of those who have already succeeded. According to at least some anthropologists, women have had to get awfully good at understanding what it would be like to be a man. Men, on the other hand, are rarely forced to think about life in a woman’s Manolos.

A commenter on Lithwick’s article raises the important question: granted that perhaps we do pretend to step into each other’s shoes, how good are we at doing it? Or, as Gallese and Goldman put it, “if simulation is going to make accurate predictions of targets’ decisions, pretend desires or beliefs must be sufficiently similar to genuine desires and beliefs that the decision-making system operates on them the same way as it operates on genuine desires and beliefs. Are pretend states really similar enough to the genuine articles that this will happen?”

This is their answer, which stays in the realm of motor activity and doesn’t pretend to go into the more nuanced problems of reading emotion:

Homologies between pretend and natural (i.e. non-pretend) mental states are well documented in the domains of visual and motor imagery. (We assume here that visual and motor imaging consist, respectively, in pretending to see and pretending to do; see Currie and Ravenscroft.) These visual and motor homologies do not show, of course, that other pretend mental states, for example, desires and beliefs, also functionally resemble their natural counterparts, but informal evidence suggests this.

The mirror system can also go too far when impaired. One of the curious things about it is that although it increases the excitability of the muscles involved in a particular movement, it very rarely produces the movements themselves. An important exception: imitation behavior—people with prefrontal lesions who compulsively mimic actions performed in front of them.

G and G interpret this as a failure of the inhibitory system. They theorize that when one sees an action done, one forms a “plan” to perform the action oneself, which gets inhibited by the prefrontal lobe. When the relevant part of that lobe is damaged, the inhibition is lifted and the plan goes into effect.

Chilling, isn’t it?



Why Sapolsky’s Take on Schizotypal Personality Disorder and Religion is Problematic

Dear CF,

BoingBoing posted one of Robert Sapolsky’s (Stanford neurobiologist and author of Monkeyluv, The Trouble with Testosterone and Why Zebras Don’t Get Ulcers) lectures on schizophrenia and schizotypal personality disorder today. It’s an hour long, but makes for pretty interesting listening if you have the time to give it. In this installment he starts off speculating about the possible selective evolutionary advantages of schizophrenia, which—unlike cystic fibrosis and sickle-cell anemia, which protect heterozygotes (carriers, usually with one good copy of the gene) from cholera and malaria, respectively—hasn’t been thought to confer any kind of selective advantage.

He suggests an advantage exists, and that it lies in schizotypal personality disorder—sufferers who display milder schizophrenic symptoms and are labeled “half-crazy.” A group of scientists studying adoptive and biological schizophrenics in Denmark discovered, after interviewing all the parties concerned over a period of (I think) ten years, that many relatives of schizophrenics display this attenuated version of the disease, which he characterizes as “movie-projector syndrome.” These people tend toward the antisocial; they prefer isolated occupations and are guilty of “metamagical thinking,” a near-schizophrenic kind of mental process that protects the sufferer from ostracism by successfully channeling odd or schizophrenic qualities into their proper contexts.

I haven’t tracked down his lecture on schizophrenia itself yet and I’d like to, because that definition of schizotypal personality disorder is rhetorically a bit too pat and makes it easy for him to (for example) retroactively ascribe it to shamans, witch doctors, medicine-men and religious founders generally. Anyone who thought he heard a burning bush talk or believed he was talking to a man who’d risen from the dead (or indeed claimed to have risen from the dead himself) would, today, be diagnosed with schizotypal personality disorder.

This is clever, of course, but it’s the argumentation I’m objecting to. I realize this is just a lecture, but it’s disappointingly poor logic from a defender of rationalism. To suggest that a newly developed (and rather hazy) diagnosis, rooted in a spectrum of sane vs. insane behaviors and defined only by a list of symptoms that have a priori been categorized as “schizotypal” or “insane,” can be applied to someone thousands of years ago who has precisely those milder “insane” symptoms is a textbook example of petitio principii, begging the question. I have developed this definition, it says, and look! someone a thousand years ago fits it!

(The difficulty lies, I think, in locating the definitional limits of schizotypal metamagical thinking. Is there any irrational or metamagical belief that wouldn’t be automatically classified as schizoid/schizotypal? Is it a matter of cumulative weight? Sapolsky mentions that 50% of Americans believe in UFOs, but wouldn’t (I assume) classify half the population as half-crazy. Is it then a matter of authorship—it’s one thing to hold an irrational belief that’s been culturally transmitted, another to create an entirely new one of your own? I think he’s getting at the latter, and suggesting that your evolutionary “fitness” depends on your ability to persuade other, more rational creatures of the truth of your idiosyncratic vision.)

Having established (which he hasn’t, at least not in this lecture) that important religious figures in different societies were schizotypal, he uses this to prove that in fact people who suffer from schizotypal personality disorder actually wield a hefty amount of power and had no trouble reproducing and passing on their genes. No data is cited to support this, and he dismisses the fact that many religious figures (both in shamanistic cultures and mainstream religions) were proscribed from marrying and asserts that indeed schizotypal personalities (unlike their schizophrenic counterparts) were and are reproductively quite successful.

I’m skeptical about both retrospective claims for a couple of reasons. One, I’d be interested to see hard statistics on the reproductive success of major religious founders. It seems to me that anecdotally, at least, they fall into two extremes: celibacy or some version of cult-leader polygamy. Two, the line he draws between schizotypal and schizophrenic is the second case where he uses the conclusion to prove the premise. His argument goes thusly:

  1. Schizophrenic people are not reproductively successful and can’t behave appropriately according to context.
  2. Schizophrenics are therefore ostracized from society.
  3. People with schizotypal personality disorder are milder cases that can channel their putative schizophrenic experience properly (for example, they’ll have an epiphany in church, not on a street corner).
  4. Schizotypals are not ostracized from society.
  5. Therefore, because religious founders who claimed to converse with bushes, etc., were not totally ostracized from society, they must be schizotypal personalities.

This is logical and historical nonsense.

Schizophrenia, Hyper-Mentalism, and the Happy Puppet

Couldn’t stop thinking about it.

What to make of the Firecracker’s attraction to schizophrenia as a word and lifestyle? Why did it become the writer-singer-songwriter’s passport into a different kind of world? Schizophrenia, after all, goes beyond the mere desire for altered states of mind. Yeah, Coleridge loved opium, but this exceeds drugs and hallucinations, trumps the scope and governance of the will. Is this why it’s appealing? Is it a release from an oppressive hyper-consciousness? Is it a kind of Fate?

As evidence that what I’m saying actually happens, and that the word crops up in oddly reverential ways, some examples:

  • Talking about Lynch’s union of the banal and the grotesque, DFW says, admiringly, that “there’s a certain schizophrenia about it.”
  • From “In the Company of Creeps”, an article in Publishers Weekly:

    Wallace characterizes the public reception of both Infinite Jest and a followup essay collection, A Supposedly Fun Thing I’ll Never Do Again (Little, Brown, 1997) as a ‘schizophrenia of attention.’

  • The Firecracker I married described his turmoil over whether or not his desires were compatible with being married to me as being sliced in half while in the shower. He called this his schizophrenia, and declared finally that his interest in madness isn’t intellectual, but religious. In my lower moments I think he yearns for it.

My sister is schizophrenic. She’s plagued daily by origami devils and monster faces in her food. She spends hours tracking down hackers breaking into her computer, scratches strips of skin off to get at the bugs beneath, turns sly and calculating whenever a collection agency calls to collect on one of the forty cell phone accounts she’s opened and closed and left unpaid. She resents that no one will believe that the doctor removed her temporal lobe during one of the many unnecessary surgeries she’s convinced them to perform. She’s tried to kill herself three times.

I mention this to justify—or at least disclose—what might be an unreasonably rigid sense of what schizophrenia means. For me, it’s always meant a clinical condition.

So I thought I should check and see what it actually means. The word was coined in 1910 (or 1896, depending on whether you ask the OED or The Guardian). The OED defines it thusly:

A mental disorder occurring in various forms, all characterized by a breakdown in the relation between thoughts, feelings, and actions, usu. with a withdrawal from social activity and the occurrence of delusions and hallucinations.
Used in the U.S. with a broader meaning than in Britain (cf. quots. 1979, 1980).

The earlier term was “dementia praecox,” the premature unraveling of the mind. Schizophrenia means “split mind,” a term coined by Eugen Bleuler to describe the splitting of mental functions. (It’s kind of ironic that these days “split-brain” patients are epileptic survivors whose corpora callosa—the bundles of fibers connecting the two cerebral hemispheres—have been surgically cut.)

In fact, the word seems to be losing status in the scientific community. The romance is unfelt in this quarter, and some people are trying to get rid of it as a category altogether. From Kate Hilpern’s article “Muddy Thinking” in The Guardian:

“As a single word, schizophrenia can ruin a life as surely as any bullet,” says Hammersley. “I know of one woman whose psychiatrist told her it would have been better for her to have cancer. Our desire to dump schizophrenia in the diagnostic dustbin is therefore not just about the poor science that surrounds it, but the immense damage that this label brings about. Lives are being ruined on the basis of a highly suspect diagnostic system.”

Other scientists defend the label: vague and bland as it is, to dispose of it would eliminate research funding. They’ve pressed on, and two in particular have come to a pretty awesome conclusion about a possible genetic basis for autism and schizophrenia.

Turns out the quest for a baby’s mental health is the ultimate Boy vs. Girl genetic free-for-all, the egg-and-sperm version of the bedroom scene in A Pocketful of Miracles. Nature recently published an opinion piece by Christopher Badcock (heh) and Bernard Crespi called “Battle of the Sexes”.

Here is what they found.