All the trouble starts when people forget they’re human.
– Oliver Sacks, A Leg to Stand On

 
Growing up, one of my favorite shows was My Favorite Martian. If you’re not sufficiently ancient or addicted to terrible — I mean, retro-cool — TV to remember, Ray Walston’s title character looked like a human but had knitting-needle antennae he could raise from the back of his head, plus an aluminum foil spacesuit and other unspecial effects. Bill Bixby, his Hulk days still ahead, spent three seasons in the 60s trying to conceal from the neighbors that Uncle Martin was an alien. Hijinks ensued.

I was in it for the hijinks, of course, but Martin could do some terrific things beyond raising and lowering those antennae. He could make himself invisible, like Jeannie a few years later or Sue Storm of the Fantastic Four — making him a rare exception to the invisible-woman trope — and he could levitate objects by pointing at them, and freeze people or speed them up like a bored film projectionist at a Tuesday matinee.

The best part for me, though, was the mind reading. Martin could project his thoughts and read other people’s, and that’s what I wanted to do. It still is.

But as with the X-Ray Spex advertised in every comic book of the time, I had no interest in how it worked. Why would Spex show you the bones of your hand but only peek under a girl’s clothes? I didn’t care. At that age I wanted to penetrate minds more than underwear, although to be honest it’s not a choice I was ever asked to make.

If anything, mind reading is even bigger now than during Martin’s brief stint on our backward planet, but it goes by a different name. Telepathy still survives as a subcategory of ESP, roughly on a level with alien abduction, but empathy is everywhere: Psychology Today (“Are We Entering the Age of Empathy?”), Medical News Today, Harvard Business Review (“Empathy is Key to a Great Meeting”), articles scientific and otherwise, and of course the web in all its glory: 6 Habits of Highly Empathic People, How to Show Empathy in 13 Steps, Testosterone Lowers Empathy, Music Cultivates Empathy, Yawning Reveals Your Level of Empathy, and my personal favorite, Chickens Are Capable of Empathy.

A podcast I occasionally listen to asks guests what they regard as overrated and underrated that week. It’s a fun segment, and the answers are often surprising. How can that be overrated, I find myself thinking; it’s the best ever. That’s underrated? It’s all I ever hear about. It isn’t easy to get inside the head of someone who sees the world differently.

I haven’t heard the hosts ask about empathy, but I’d pull the car over to listen to that one. Because while we all sense that empathy’s important, no one seems to agree on what it means. Being kind and thoughtful? Feeling bad when someone else does? But sympathy covers that, doesn’t it, as do pity, compassion, commiseration, fellow feeling. No, when we’re empathetic we’re supposed to do more than just recognize someone else’s feelings; we’re supposed to share them.

Ah, but there’s the rub: does sharing mean imagining what it’s like to be you, as a trained counselor might, or an actor, or a novelist? Or does it mean truly taking part in someone’s happiness or pain, overlapping with that person, becoming a human Venn diagram? That option, as sci-fi as it sounds — literally sensing another’s pain or pleasure — might just be possible. What’s less clear is whether such a deep dive would render us more human, or less, or something else altogether.

This state has been given multiple names, no doubt because “empathy” was becoming too clear. One is simulpathity, psychiatrist Bernard Beitman’s term for a journey beyond sympathy (an awareness of suffering) to the actual experience of it, even from afar. His research grew from an incident in which he began choking over the sink one night in San Francisco, and the next day was told that his father had choked to death at roughly the same time in Connecticut.

As arresting as this anecdote is — and you may have a similar one, or know someone who does — Beitman did not markedly advance his claims by founding the field of Coincidence Studies, or writing the book Connecting With Coincidence, which invokes the “psycho-sphere” (a “mental atmosphere that surrounds Earth”).

Whatever else you might think empathy is, it isn’t new. The word Einfühlung (in-feeling) was coined in 1858 by German philosopher Rudolf Lotze and entered English 50 years later courtesy of British psychologist Edward Titchener. We’ve been trying to read each other for millennia, of course, but that tumultuous half-century brought fresh attention to the project of fathoming our fellow humans. Newly named is newly noticed. Titchener first used it as a term of art appreciation. Regarding a painting or statue, he writes, “Not only do I see gravity and modesty and pride and courtesy and stateliness, but I feel or act them in the mind’s muscles.” Freud referred to it in his book Jokes and Their Relation to the Unconscious, and both Darwin and Adam Smith talked about it using the word sympathy — which Smith called a faculty of “changing places in fancy with the sufferer,” distinct (but not very) from compassion.

If the idea of empathy isn’t new, though, it’s newly ubiquitous. Treatments of it cover the waterfront. “Empathy is not enough in reporting,” Ira Glass claimed in a Columbia School of Journalism commencement address in 2018. “Some stories are not about empathy.” This might be true from a reporter’s standpoint, but from a reader’s it is harder to defend. Type “empathy” into Amazon Books and watch as 1,500 titles unfold. Oxytocin is the “empathy gene,” a commentator informs us, although genes are on a very long list of things that neither oxytocin nor empathy is. A Wharton professor defines it as “emotional contagion… We are literally infected with other people’s emotions.” (Well, no.) Historian Doris Kearns Goodwin says that even as a child Abraham Lincoln “possessed an uncanny ability to empathize with other people’s point of view… to feel other people’s feelings.” A review of William Trevor’s final book begins, “If the engine of accomplished fiction truly is empathy” — which assumes it is — and Obama speechwriter Sarada Peri tells us that FDR, before fireside chats, would envision a mason working on a building or a farmer in a field, because he understood that “everything starts with empathy.”

It does seem that way, once you’ve tuned your mental antennae to the E-word. Listen to people discuss neurobiology, novels, business ethics or almost anything else and you’ll hear empathy described as universal, non-existent, irreplaceable, invasive, whatever the speaker needs it to be. But as much as it’s talked about, empathy still isn’t well understood. Women may have more of it than men, workers more than the wealthy, identical twins more than fraternal. Progesterone boosts it, testosterone doesn’t, but there aren’t clear gender differences early in life. Babies do without empathy until age 4 — or, if it’s genetic, they only express it then — and psychopaths have none, period.

The numbers associated with this, such as heritability estimates ranging from 10% to 70%, can’t set our nature/nurture questions to rest, and even that Silence of the Lambs truism doesn’t quite hold up. Psychologists Paul Ekman and Daniel Goleman make distinctions between two kinds of empathy, cognitive and emotional (others would add compassionate), with psychopaths said to have plenty of the first but none of the second. Severe autism is characterized by the reverse. By the time I came across articles devoted to “self-empathy” I had begun to wonder, not for the first time, if we’re ever all talking about the same thing.

Speaking of psychopaths (there’s an opener for your next dinner party), consider Thalia Goldstein, a Pace University researcher who studies Theory of Mind, the ability to understand what others are thinking or feeling. Though this sounds like still another synonym, Goldstein says empathy isn’t a kind of understanding but rather “a feeling we get that is appropriate… to someone else’s emotion.” It’s the difference between gauging a person’s mental state and inhabiting it; between analyzing a pair of shoes and strapping them on.

Goldstein says her research subjects demonstrate a great deal of the first skill but little of the second. They’re “extraordinarily socially attuned” but don’t feel the emotions their antennae pick up; they simply use them for their own purposes. Before you dismiss this as irrelevant since you aren’t, presumably, Hannibal Lecter, I should note that Goldstein studies actors, not psychopaths. If either happens to be sitting next to you at that party, someone’s about to be uncomfortable.

Let’s say you are sitting near Lecter or one of his ilk. Would you know it, just sense something? Would your antennae quiver? Probably not. Actor or not, he’ll be acting normal. “He seemed so nice” is the obligatory evening-news refrain when body parts turn up in someone’s basement, and it’s borne out by research. A study of 250 serial killers revealed that just 20% had been diagnosed with a psychiatric disorder before being found out. That’s two hundred strolling the grocery store aisles, looking up from laptops in coffee shops, their eyes, smiles, coloring, speech, demeanor giving away nothing.
 

*****

 
Empathy is declining, it seems, even as we talk about it obsessively. Maybe that’s why we do. Recent news articles inform us, for instance, that today’s college students have less empathy than past generations — 40% less, according to one study, than during the 1980s and 1990s. One problem with this, beyond arguing about what the word even means, is that these results are self-reported. The same holds for EQ (Empathy Quotient), which aspires to measure that quality with “brief self-reports.” Are we really the best judges of our own states of mind and emotions, much less those of others? I believe I’m empathetic, and I think others might agree. (If I were truly empathetic, I guess I’d know.) But which matters more, intent or effect? My perception or theirs? It seems worth noting that 93% of American college students also believe they’re better-than-average drivers, a position it doesn’t take college statistics to label impossible.

Rating ourselves above average, incidentally — way above — is not restricted to the young or data-ignorant. In a 1977 study, 94% of college professors thought themselves above-average teachers, and a recent survey found that one-third of software company employees place themselves in the top 5% of their peers.

It doesn’t help that empathy is one of those attributes, like intelligence or a sense of humor, that we all think we have more of than most people. In the case of humor this is slightly comical, which I suppose qualifies as irony. With empathy, though, the irony becomes textbook since the whole point, of course, is to understand the world from someone else’s perspective.

Think of all the images we employ for this: standing in the other guy’s shoes, I’m in his head, she’s an open book, we’re on the same wavelength, see through someone’s eyes, feel your pain. This demonstrates our uniquely human facility with metaphor, but to me it also signals the presence of a phenomenon more obsessed about than understood. Think about sex (as we supposedly do every few seconds), and consider the hundreds, the thousands of euphemisms for it. It’s difficult to utter a phrase that couldn’t, with a sufficiently focused puerile imagination, be construed as sexual.

The same seems to be true of empathy: the more you look, the more fractal it becomes. Drop another firecracker into this confusion. Supposing for a moment we could agree on a definition, is empathy even good? It sounds like a ridiculous question, but some very thoughtful people have asked. Paul Bloom, a psychologist at Yale, caused a stir recently with a New Yorker article called “The Baby in the Well,” later expanded into a book with the blunt title Against Empathy. To Bloom, empathy, however benevolent in intent, risks subverting true altruism because it’s both biased (directed mainly toward people who look and sound like me) and innumerate (one from Group A is worth 100 from Group B).

Bloom’s thesis brought a storm of objections, from people like Simon Baron-Cohen of Cambridge and the charity Empathy for Peace, for whom “evil is the erosion of empathy,” and from people born without the irony gene who left death threats on his voicemail. Bloom is backed up, though, by recent evidence of a “racial empathy gap.” Researchers showed subjects a video clip of a needle entering skin, meanwhile measuring activity in the brain’s pain matrix. When the needle pierced black skin, white subjects had a markedly lower response. Another study revealed that white people, even medical personnel, assume that blacks feel less pain.

Bloom prefers “rational compassion” as a way forward, a method of caring for others that values “conscious, deliberate reasoning” over easily manipulated emotion. That head-or-heart debate won’t be ending soon either, but some would say even asking whether empathy is good is a waste of time. The asking assumes it’s real, rather than a metaphor or synonym for wishing, judging, guessing, goodwill, best intentions. But what if it were real? What if we could cross that line from unclear to Uncle, from metaphor to Martin?

Things always get more interesting with a shift from embodied-in-language to language of the body. But interesting, like new or different, doesn’t necessarily mean good. The idea of melding with other people in more than a figurative way, taking the meta out of metaphysical, has inspired more complex reactions than Kumbaya. Star Trek, for example, imagined it bleakly. In the 1968 episode “The Empath” a beautiful mute alien absorbs the crew’s injuries, taking them on herself, then is forced to suffer as Kirk is tortured by other aliens. In later Trek incarnations, Deanna Troi, another beautiful woman and half-alien — a step closer to human — has empathic powers (“the psionic ability to sense emotion”). Things don’t work out so well for Troi either. Plot lines repeatedly pit her against other aliens who assault her, mentally and physically, and “talk through her body.” This sounds much like the “secondary trauma” or “empathy burnout” suffered by health care and disaster relief workers who invest too much of their bodies and minds in their patients’ torment.

Even getting away from disasters and psychopaths, as I do whenever possible, it turns out we’re awful at this. In normal life, among normal people (whatever that might mean) we can’t read minds or faces either. “The eye is traitor of the heart,” wrote poet Thomas Wyatt in the 1500s, but Shakespeare wasn’t buying it. He has Macbeth’s King Duncan say, after executing a traitor, “There’s no art to find the mind’s construction in the face.” The king is dead right; he says it moments before warmly greeting the man about to murder him. Bringing the dark side up to date, Neal Ascherson writes in the London Review of Books that anyone who lived in Soviet-era communist countries had a secret police file filled with denunciations by people they knew, trusted, even loved: “The crowning mercy of human relations is that we don’t know what other people are really thinking about us.”

Dan McNeill, author of The Face, sides with the Bard here. Study after study has shown that we’re terrible at detecting whether someone is lying, and the rare subjects who can outperform chance don’t know how they do it. He cites Paul Ekman, who writes in Telling Lies that the few with higher success rates rely on no conscious algorithm, pattern or cue. “Rules of thumb don’t work,” says Ekman. “Anyone who can say, ‘I know what to look for’ is going to be wrong most of the time.”

There are some facial and vocal tipoffs — a rise in pitch, slower speech, “micro expressions,” brief answers, fewer gestures — but there’s simply too much variation in the list, and in people, for any of that to be definitive. Liars often smile early, for example, or late, or asymmetrically, but so do the rest of us. Even judges, detectives and Secret Service officers can’t dependably tell a liar. Nevertheless, McNeill says, “In all groups, the subjects’ self-assessments of skill at lie detection bore no relationship to actual score.”

Maybe this is not too surprising, given that it isn’t only our neighbors, with or without basements, who can fool us. We’re perfectly capable of fooling ourselves. Dan Gilbert, author of Stumbling on Happiness, says most of us believe we understand what makes us happy, often make predictions about it, and are usually wrong. Or there’s the familiar placebo effect: give a patient a sugar pill with a reputation attached, and even if you tell him what you’re doing, it’ll help. Tell another you’ve done arthroscopic repair when all you’ve done is knock her out and sew a few stitches, and her knee will improve. In a condition called hemispatial neglect, patients with a parietal lobe injury behave as if half the world doesn’t exist, including half their own body: they might eat from only one side of the plate, shave only one side of their face, even deny that an arm or leg is theirs. The flip side of this would be phantom limb syndrome in amputees, or deaf or paralyzed people who dream themselves intact, even if the condition is congenital. “It is as if the dream has access to a different person,” says a neurologist who studies the phenomenon.

This disparity between what we think we see and feel and what’s actually there could arise partly because of the “reigning common-sense belief,” as researcher Lisa Feldman Barrett says, that “emotions are reflexes,” that we’re born with “pre-wired emotion circuits, that anger, or sadness, or fear, looks and feels the same in all people, and even in some animals.” Barrett, author of How Emotions Are Made, flatly debunks this. “Emotions are not universal. They’re not even historically static in time. If we assume people wrinkle their nose only in disgust, scowl only when they’re angry, that we can read emotions like words on a page, it can wreak havoc in your life.”

Women over 65, for instance, are more likely to die from heart attacks because their doctors (and they themselves) read the symptoms as anxiety and send them home. Judges and juries similarly try to read guilt or remorse in a defendant. “As a scientist,” Barrett says, “I have to tell you that jurors do not and cannot detect remorse or any other emotion in anybody else. Neither can I, neither can you. Emotions are guesses.”

Can machines guess any better? And if so, would that make them more like us, or merely superior? The “emotional machines” that Rosalind Picard predicted in the 1990s are apparently here as “Emotional AI.” A company called SoftBank offers Pepper, “the first emotional robot,” with this marketing tag: “Pepper is here to make people happy, help them grow and enhance their lives. Think of it as high tech you can high-five.” Pepper is meant to elicit emotions, perceive them and (maybe) experience them. Some find Pepper’s eye contact comforting, a felt connection; one researcher commented, “I sense that he cares about me, tries to understand me.” Others find it disturbing, less connection than absorption.

If an artificial you is creepy, why should it be that the more alike it is, the creepier it gets? A vacuuming Roomba isn’t especially unsettling, but Rosie from The Jetsons, and Pepper and whatever’s next, risk entering the so-called uncanny valley, an aesthetic space in which the closer human-ish objects come to us, the more they elicit feelings of eeriness or revulsion. The first robots we imagined and named, Karel Čapek’s Rossum’s Universal Robots, came fitted with emotions and skipped quickly from servant-partner to overlord. Even the movie The Polar Express has this effect on some people. Despite the warm Christmas story, and Tom Hanks’s voice and a soundtrack full of Crosby and Sinatra, one reviewer thought its motion-capture animation, half cartoon half actor, made “the human characters come across as downright… well, creepy.” He claimed the film ought to be subtitled “The Night of the Living Dead.”

That review came back to me when I saw an interview with Stephen Asma, a philosophy professor at Columbia College Chicago and author of On Monsters, who argued that the loaded word still fits some people, among them Stephen Paddock, who killed 58 people in the 2017 Las Vegas shooting. Monster is a term, Asma said, that we reserve for those whose “behavior, their motives, their mind are almost impossible to understand. Our theory of mind doesn’t work on these people.” But a moment after placing monsters across this chasm of incomprehension, Asma’s train of thought detours into the uncanny valley. “There’s something violating our sense of bodily barriers,” he muses. A monster is “the animal that you should be afraid of, or that you can become.”

In a world with more IoT devices than people — something over 20 billion, including smart phones, smart locks, smart gloves and even smart mirrors — maybe we do want to meld with Pepper, or with each other. Perhaps it’s more surprising that we haven’t, given the mirror neurons we carry around. These were discovered in the early 1990s, when researchers at the University of Parma noticed that their macaque monkeys’ brains were responding in the same manner — down to specific neurons — whether the monkeys were performing an action or watching it. If an experimenter put a peanut in her mouth or threw a ball, the appropriate neurons in an observing monkey’s brain would fire.

Initially called “monkey see, monkey do” neurons, predictably enough, before being dubbed mirror neurons in a 1996 paper, they’ve since been documented in humans. Watch someone cut her finger, or hit a backhand or eat a banana, and if you happen to be hooked up to a brain scanner it will record your neurons firing as if you were doing the cutting, hitting or biting. V.S. Ramachandran, a pioneer in this research, describes these cells as “dissolving the barrier between self and other,” so it will come as no surprise that he prefers to call them “empathy neurons.”

Neuroscientists Robert Burton and Marco Iacoboni have recently debunked some of the monkey-see claims, Iacoboni noting that “a diversity of mental states can generate the same motor action.” In a poker game, for instance, “understanding that your opponent will soon push his chips forward tells you nothing about the purpose behind the motion.” Burton imagines a photograph of a mother and her young child:

I look at the mother and see a combination of love and awe. But with a moment’s reflection, I realize that I have gathered together some general assumptions about what humans have in common and dropped them into her mind. I have no way of knowing if she is also worrying that her husband might feel neglected by her single-minded devotion to her child, wondering when to enroll her child in preschool…. I can imagine her mind at the most universal and generic, but not at the particular.

Still, in a few people these mirror cells pull us further into Trek territory. Perhaps 1 in 60 of us experience a condition called mirror-touch synesthesia in which such neuronal aping rises toward the real — a “mental simulacrum so strong that it crosses a threshold into near-tactile sensation,” as one investigator puts it. When Dr. Joel Salinas, a neurologist, thumps a patient’s knee with one of those tiny rubber tomahawks, he feels a tingle nudge his own knee. Performing a spinal tap, he feels the needle slip into his lower back. If he sees someone get pinched or slapped, he feels a hint of the same on his arm or face. The overlap is more than skin deep. “When a psychotic patient goes into a rage,” a profile of the doctor notes, he “feels himself getting worked up.” More unsettling still: “Even when patients die, Salinas feels an involuntary glimmer of the event firsthand. His body starts to feel vacant — empty, like a limp balloon.”

This “heightened empathic ability,” as some neurologists refer to it, comes in handy in Dr. Salinas’s work (one article about him is titled “This Doctor Knows Exactly How You Feel”), but there are gradations of the experience, with some finding it more curse than gift. An English woman describes passing out in her car after seeing a man get punched. She takes daily medication to calm the constant sensory tsunami.

The first documented case of mirror-touch involved a 41-year-old British woman who was surprised to discover that her ability was abnormal. Dr. Salinas, too, realized only during medical school that he was unusual. “Isn’t that everybody?” he replied when a colleague asked him what it was like. This seems worth noting. Neither could sense other people’s inability and so had no way of feeling different, even though their difference resides specifically in that ability to feel. “I just figured it was like being human,” Salinas says.

For the rest of us, or those who want to augment our neuro-powers, there are sensors like Fitbit and the Apple Watch to monitor heartbeats or how much we sweat. But what about sensing other people’s bodies? With or without mirror neurons, we have only so much natural bandwidth. Our senses take in something like 11 million bits of information per second, of which we can consciously process about 40. The notion of “bounded rationality,” which won a Nobel Prize for economist Herbert Simon, plays on this astounding disparity. If we’re going to join with Martin or Pepper, we’re going to need some help.

Researchers at MIT are doing their part. In an effort to update X-Ray Spex they’ve trained a “neural network” to see people behind walls using radio waves from broadcast antennae: “It’s X-ray vision without the need for harmful X-rays.” We now have Google Glass, and brain-scan decoders, and AlterEgo, a wearable device that loops around your ear and, according to marketers, can “recognize non-verbal prompts, essentially reading your mind.” Or you might choose O2Amps, glasses that enhance color vision so as to “perceive oxygenation and hemoglobin variation in another person’s face… a date’s emotional state would be effortless to identify; you could easily tell if they’re blushing or aroused (or not).”

Or the breakthrough might come with “deep learning,” autonomous neural systems that will ultimately, predicts one AI scientist, become as complex as our own brains, “joining the myriad living creatures on our planet.” He scoffs at “fearmongers” who worry this might make humans obsolete. Similarly, neuroscientist Poppy Crum in a TED talk enthuses about new technology that will tell real smiles from fake — there’ll be “no such thing as a poker face” — or detect linguistic changes that signal honesty, or analyze the chemical composition of our breath, all of which will “let us feel more, connect more, become closer and more authentic. I believe it is the era of the empath.”

Is it permissible, without sounding like a Luddite yelling at kids on his lawn, to wonder if this is really what we want? Or what we’ll get? O2Amps is described as “a mind-reading game-changer,” and fair enough. But are we sure that afterward we’ll still be us? Don DeLillo has a character reflect in Point Omega, “It’s what no one knows about you that allows you to know yourself.” It’s hard not to think about slippery slopes here; “open mind” takes on different connotations.

Instead of real or artificial people, what if we looked to imaginary ones? Could fiction facilitate a Martinesque mind-meld while avoiding the uncanny valley? After all, we’ve been making up stories since not long after our own banana-eating era, and author Francine Prose pictures on her personal cave wall “the sentence or paragraph that the reader can not only comprehend instantly but see straight through to the writer’s intention, so that reader and writer are communicating directly, brain to brain, like aliens in science fiction.” This seems more aspirational than descriptive, however, as it arrives in an essay titled “It’s Harder Than It Looks to Write Clearly,” accompanied by a rueful admission from the voice of experience:

Let’s say that you have written something, and it turns out that no one has the faintest idea what in the world you could possibly mean — no one but you, the writer. And in the absence of clarity even the writer may forget the formerly obvious purpose that has somehow managed to burrow and hide beneath a fuzzy blanket of language.

Artists, however accomplished, will nod their heads at that. Doris Lessing devoted her novel The Golden Notebook to the theme of madness and healing but later complained, “Nobody so much as noticed…. Handing the manuscript to publisher and friends, I learned that I had written a tract about the sex war.” Lewis Carroll wrote Alice’s Adventures in Wonderland as a protest against abstract math; Georgia O’Keeffe insisted her irises weren’t meant to evoke female genitalia. Ray Bradbury was once shouted out of a lecture hall for claiming that Fahrenheit 451 was about television, not censorship.

Clearly all readers aren’t obtuse, or all writers oblique. Maybe it’s simpler than that and every narrator, fictional and real, is unreliable. Novelist Adam O’Fallon Price calls this “less a structural feature of storytelling and more a structural feature of the human condition. We lie to ourselves, we lie to others, and even if we mean to tell our story with complete honesty, we can never fully understand it. The proof that we’re unreliable narrators is the fact that everyone is the star of their own story.” Colum McCann in “Thirteen Ways of Looking” puts it another way: “So many things are unexplainable, and how is it that we know a life, except that we know our own.”

Maybe empathy is one more story, a cave painting in which we intend to depict others but inevitably draw ourselves. Milan Kundera calls the characters in his novels “my own unrealized possibilities. Each one has crossed a border that I myself have circumvented.” George Saunders thinks stories make us “empathize with people we don’t know, and if the storytelling is good enough we imagine them as being, essentially, like us.” He calls the best stories “complex and baffling and ambiguous,” and I submit it would be hard to come up with a better description of humans. Nate DiMeo of The Memory Palace captures this perfectly in “Other Bodies”:

When a running back jukes left, we don’t know the numbness in his knee from the cortisone shot that masks the pain in his patellar tendon. We just see the 4-yard gain. When the actor accepts her Emmy, we don’t know the blister forming in her lower heels; we just see the grateful smile. We don’t know our neighbor’s lupus, or seem to be able to remember our sister’s or our parent’s or our partner’s pain, in their back or their wrist or whatever, even though it’s with them all the time and it’s all they’d really want to talk about if anyone really wanted to hear. We can’t hold it in our heads, because they are our heads. The best we can do is pause to imagine, and try to remember.

As solitary as that sounds, I have a feeling he might be right. Maybe the trying is what matters, and the failing, and that’s all there is. As Kelly Corrigan, author of Tell Me More, puts it, “My whole thesis in life is, all people care about is they want to feel they’ve been felt.”

Felt, yes, but not scanned, not recorded. Not sensed in a way that reduces the mystery of identity to circuits and bytes. John Edgar Wideman has a lovely phrase somewhere, “the ambiguous edges of other lives,” and I think that’s why we find the near-human both fascinating and frightening, why we can’t look away. It isn’t that Rossum and Pepper and Terminator and Atlas by Boston Dynamics and other novel humanoids — for god’s sake don’t Google “dentistry training head” — border on the unreal. It’s more that they’re strangely familiar, with each quality amplifying the other. Their edges are ours.
 
