In Pursuit of Silence

  In the chapel, my eyes were drawn to the candle in the glass. Though I couldn’t feel a breath of air where I was, the chain was being tugged gently this way and that by an otherwise imperceptible draft; the reflection of fire on the glass doubled the image of its burning orange glow and made it look like two wings fluttering tremulously open and closed, as though the proverbial moth had actually become the flame.

  So far from being taken outside my comfort zone, I found myself wanting to remain and sink deeper into it.

  The monks at New Melleray each, in one way or another, described themselves as listening to silence for self-knowledge. Yet the self-knowledge that the silence of the monastery promotes is, in the end, less about discovering who one really is, in our conventional use of the term, than about acknowledging the limitations of our grasp on what lies within and without us. Indeed, the self-knowledge the monks advocate—and which they believe the quiet of monastic life reveals to them—is the knowledge that there’s something beyond the self.

  All the time I’d been in the monastery, I’d been searching for some kind of clear, encapsulated lesson in the silence—something that I could take home with me. But what I’d received instead was a powerful reminder of the good that can come from not knowing, from lingering where the mind keeps reaching outward. I remembered speaking earlier to Vinod Menon, a neuroscientist who has done extensive fMRI studies of people listening to music. Menon discovered that the peak of positive brain activity actually occurs in the silent pauses between sounds, when the brain is striving to anticipate what the next note will be. The burst of neural firing that takes place in the absence of sound stimulus enables the mind to perform some of its most vital work of maintaining attention and encoding memories. I asked Menon what he took from this finding. His small, dark eyes twinkled. “Silence is golden,” he said. “Silence in the right contexts.”

  Even brief silence, it seems, can inject us with a fertile unknown: a space in which to focus and absorb experience—a reminder that the person we are with may yet surprise us; a reflection that some things we cannot put into words are yet resoundingly real; a reawakening to our dependency on something greater than ourselves.

  I wanted to stay in the chapel. But Alberic was already rising to his feet and beckoning me forward. I resisted his summons a few moments longer, then stood. I don’t know how long we remained in the end. It wasn’t long enough, and I felt overcome with sadness as we stepped away—rising back up from the depths of darkness and stillness, into the light and echoing footsteps of the abbey’s upper stories.

  CHAPTER TWO

  Why We Hear

  “You hear a snap off in the distance, what do you do?”

  The question was put to me—in a gruff voice that made me feel I should be paying better attention—by Rickye Heffner, a professor of psychology at the University of Toledo.

  “I turn my head and try to see where the snap is coming from?” I ventured.

  “You look for it! You want to know what made the sound and where it is. You want to orient your eyes toward it. Among mammals, the ears are animal detectors and they tell your eyes where to go to hunt for the animals they detect. So the ears have to have sufficient range and be able to hear the right frequencies to locate a sound source. That seems to be what’s driving the evolution of hearing.”

  Heffner and her husband, Henry Heffner, have dedicated the last forty-plus years of their lives to producing a vast profusion of studies with titles like “Hearing in Glires: Domestic Rabbit, Cotton Rat, Feral House Mouse, and Kangaroo Rat.” They seem to have explored the auditory mechanisms and evolution of hearing in every warm-blooded, young-suckling creature in existence. I telephoned Heffner because, after my initial ruminations on silence and noise, I realized that if there was something meaningful in the idea of listening to the unknown, I needed a better sense of what is known. It’s impossible to understand how noise or silence affects us without getting a handle on why we began to hear in the first place.

  After all, on the face of it, the very notion of a pursuit of silence seems a bit of sensory nonsense. Who, by way of comparison, would want to pursue an existence in which there was nothing to touch? We don’t go in search of a place without taste or scent (however many fragrances we might prefer not to sniff). Why would so many people have come to believe there was something not just enjoyable but beneficial in pursuing a state where one sense was exercised as little as possible? What is it about the sense of hearing that makes the idea of having next to nothing to hear so appealing?

  Heffner broke off our Socratic dialogue to announce that a family of deer had just stepped in front of her window despite the fact that a train was blasting past on the nearby rails (I could faintly hear its roar). She told me that she also regularly saw groundhogs and rabbits, along with the odd fox, the “dreaded muskrat,” and “platoons of evil raccoons.” They’re habituated to trains, she said, “so the animals don’t get frightened away—even though the noise is terrible.” Almost all of the Heffners’ research has been directed toward exploring an overarching, two-part theory: animals hear what they need to hear in order to survive; animals with the same kinds of lifestyles hear the same kinds of things. As it turns out, this amounts to a theory wherein skull size is destiny and sound localization is the raison d’être of the auditory apparatus.

  In the 1960s, the Heffners were both students of the evolutionary neuroscientist R. Bruce Masterton. Henry co-authored and Rickye helped with statistical analysis on Masterton’s first landmark paper, “The Evolution of Human Hearing,” a comparative study of the hearing of eighteen mammals. They discovered that if you could measure the space between an animal’s ears, you could predict its high-frequency hearing with tremendous accuracy. This is because we graph the location of a sound source in space on the basis of the differences in the way sound waves strike each of our ears. The amount of inter-ear distance available to an animal will be the main factor determining what cues it can leverage to track which way a beating wing or falling paw is moving.

  Since that study, the Heffners have looked at close to seventy additional species, and while the theory has been expanded and tweaked, the correlation remains unchanged: if you’ve got a big head, your high-frequency hearing is going to be less sensitive than if you have a small head. Most mammals take advantage of both time cues and “spectral differences”—changes in the intensity of the sound striking each ear successively—to identify the position of a sound source.

  Temporal signals are straightforward. Given sufficient distance between the ears, a sound on one side is going to hit one ear before the other. Time delay is the strongest, most reliable cue. “But,” Heffner noted, “not everybody has that luxury.” I found myself patting my hands to the sides of my head as she spoke to try and gauge my own head span. “If you’re a mouse, think about how far apart the ears are,” she said. “Sound travels from one ear to the next in perhaps thirty microseconds and the nervous system just can’t calculate a sound source being off at three o’clock or two o’clock in that interval.” So what does the mouse do? “When the chips are down you use what you’ve got—intensity. There are plenty of good reasons for being a small animal and you want to occupy that niche as effectively as you can. So you hear really high frequencies to make as much use as you can of the intensity signals.”
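
  Heffner’s thirty-microsecond figure can be checked with simple arithmetic (a back-of-the-envelope sketch, assuming an inter-ear distance of about one centimeter for a mouse, roughly twenty centimeters around the head for a human, and sound traveling at about 343 meters per second): the maximum interaural time delay is just the distance divided by the speed of sound,

\[
\Delta t = \frac{d}{v}, \qquad
\Delta t_{\text{mouse}} \approx \frac{0.01\ \text{m}}{343\ \text{m/s}} \approx 29\ \mu\text{s}, \qquad
\Delta t_{\text{human}} \approx \frac{0.20\ \text{m}}{343\ \text{m/s}} \approx 580\ \mu\text{s}.
\]

  On those assumed dimensions, a human nervous system has roughly twenty times as long as a mouse’s to register which ear a sound reached first, which is why the mouse has to fall back on intensity cues.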

  The usefulness of the “sound shadow” cast by the head in localizing sound has been recognized since the mid-1870s. It was then that Lord Rayleigh, the indefatigable English physicist who discovered argon and explained the irregular flight of tennis balls, stood with his eyes shut at the center of a lawn in Cambridge surrounded by a ring of assistants brandishing tuning forks. When one assistant set his tuning fork vibrating at a sufficient intensity, Rayleigh could accurately identify that man’s position in the circle. If sound-wave cycles are sufficiently close together, he found, sound is louder in the first ear it hits, since the head blocks out the upper frequencies of the waves en route to the second ear. Rayleigh dubbed this variation in intensity the “binaural ratio,” with “binaural” signifying the employment of both ears. In the last century, the extent of this shadowing was calculated electronically. At one thousand cycles, the ear closest to the sound source receives the wave at a level eight decibels more intense than that of the farther ear. At ten thousand cycles this ratio jumps to a thirty-decibel difference.
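
  To put those numbers in perspective (a sketch using only the standard decibel scale, on which every ten decibels marks a tenfold difference in acoustic intensity):

\[
10^{8/10} \approx 6.3, \qquad 10^{30/10} = 1000,
\]

  so the eight-decibel gap at one thousand cycles means the nearer ear receives roughly six times the intensity of the farther one, while the thirty-decibel gap at ten thousand cycles means roughly a thousand times as much.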

  Spectral difference is vital for creatures with tiny skulls, but that cue doesn’t work at low frequencies. When a sound wave is long enough, the whole wave can just “hang ten” around the skull and strike the second ear without being blocked at all. Masterton predicted that the smaller the skull, the higher the upper range of an animal’s hearing would be.

  Exceptions to the correlation between skull size and high-frequency hearing appear most notably in the case of pea-headed subterranean creatures like pocket gophers, which have poor high-frequency hearing. In these instances, the Heffners argue, the animals have adapted themselves to “the one-dimensional world of an underground habitat,” where sound localization becomes an empty exercise. Throughout the animal kingdom, the selective pressures on hearing concern the need to detect the nature and location of what’s out there snapping the twig. Rickye Heffner argues that much of the time when we try to use noise either to protect or frighten animals, we end up doing so on the basis of human psychology rather than evolutionary realities. Particular bugbears for her in this regard are ultrasonic deer whistles and flea collars. “Like the sound of a deer whistle on a truck is going to scare an animal more than the sound of the truck!” she scoffed. Our awareness that a sound is pitched outside our hearing range triggers in us an association with danger, she contends. Flea collars blast eighty decibels directly beneath the cat’s ear at a frequency the felines hear perfectly well and quite possibly find agonizing—despite the fact that there’s no evidence the fleas themselves even perceive, let alone are affected by, the sound. “There’s one born every minute,” Heffner sighed, “and most of them seem to own cats.”

  THE EVOLUTIONARY PURSUIT OF SILENCE

  A cross-section of the ear suggests an improbable patent application: a bagpipe, several models for the St. Louis Arch, and a couple of snapped rubber bands grafted onto a sea snail. Until recently, the ear was understood to operate on a model that might be abridged to the CBCs of hearing—channel, boost, convert—with those three steps corresponding to the external, middle, and inner ear respectively. The outer ear channels and condenses sound waves from outside so that those waves strike against the eardrum. The eardrum then sends the mechanical energy of the sound into the middle ear, with its three tiny bones: the hammer, anvil, and stirrup. The wave amplifies as it passes along these vibration-friendly ossicles, the last of which is pressed flush against the oval window of the liquid-filled coil of the cochlea. This entrance to the cochlea marks the threshold of the inner ear. At the point where the stirrup goads the oval window, the pressure of the original force will have multiplied dramatically. The energized vibration now ripples into the fluid in the cochlea, triggering thousands of hair cells into motion. The movement of those cells transduces the vibration into an electrical signal that enters the auditory nerve, which, in turn, sends the sound into the brain. But this is not the whole story.

  Complications with the model begin at the innocent-seeming flaps on the sides of our heads. For all the confidence produced by three decades of work demonstrating correlation between skull size and high-frequency hearing, the visible ear throws a spanner into Heffner’s work. When she spoke about it, her voice took on a bitterness otherwise reserved for deer whistles and flea collars. “The pinnas,” she said, using the technical name for the external ear, “act as independent sound shadowers. They alter the degree of the sound shadow cast by the skull. This is part of their work as directional amplifiers. Animals point their ears at something and then they can hear it better.” But the extent of the pinna’s impact as a frontline amplification system continues to defy researchers. “The head is basically a lumpy sphere with two big funnels on it,” Heffner said. “Those funnels intensify a sound as it drops down toward the eardrums. But we’ve never tried to measure pinna dimensions because—what do you measure? Ideally, you’d get some of these animals and take several of their pinnas and measure the physical properties of what the pinnas do to sound coming into the sound canal, but it’s not remotely practical. Pinnas are very complicated shapes. Some are kind of flat. Some have big openings. There are all kinds of folds. And while animals with big heads generally have big pinnas, that’s not always the case. It’s known that the external folds help to augment sound and create a difference between what’s heard in each ear. So if you have a sound off to the right quadrant somewhere, and you’re a little bat with big ears full of fancy convoluted folds, sound coming in is going to have very different features than it does for a cow.”

  The mysteries of the inner ear are still more pronounced. Jim Hudspeth, who works at Rockefeller University studying the molecular and biological basis of hearing, has shown that the motion of the hair cells not only converts the mechanical wave into an electrical signal that can be carried by the auditory nerve to the brain; the various reactions set in motion by the oscillation of the hair cells also serve to magnify the sound. A huge “power gain” takes place, he says, within the inner ear itself. How exactly this happens is still not understood.

  Regardless, we now know that all three parts of the ear play a dynamic role in boosting sound. If our auditory mechanism is working normally, Hudspeth told me, by the time we realize we’ve heard a sound it’s a hundred times louder than it was before it began bouncing around inside our ears. When you consider how little energy is released by a pin falling onto the floor, the amplification power of our ears is clear. Indeed, since so much of what the ear accomplishes involves making noise louder, it’s unsurprising that the majority of hearing problems stem not from an inability to perceive sound but from a failure to amplify it properly.

  People like to distinguish between the ears and the eyes by saying that the latter have lids. But, in fact, the amplification function of the middle ear is complemented by a series of equivalent mechanisms that mitigate the effects of a loud noise. Our middle-ear bones have small muscles attached to them that are part of a reflex to reduce the vibration of the bones under the impact of a loud sound. One of them jerks on the eardrum itself so that it tightens and vibrates less violently. Another yanks the stirrup back from the oval window. The eustachian tube performs a complicated maneuver to equalize air pressure. But why amplify to begin with if you’re only going to end up deadening the noise?

  Because in nature, there aren’t very many loud sounds.

  “Most animals don’t announce their presence if they can help it,” Heffner told me. “Even the famous roar of the lion is an exceptional event to threaten an intruder.” For the most part, animals move through space as quietly as possible. Today people make noise to reassert their importance, but for our predecessors silence was almost always the secret to survival. “That’s why kids today are at such risk,” Heffner added. “I can guarantee you they’re going to have hearing loss, because when you’re listening to headphones you don’t realize the volume. Continuing loud sounds put a stress on the auditory system because the middle-ear reflexes are constantly trying to protect you from them … If you’ve got a generally noisy environment, you don’t hear the twig snap. But really loud sounds are just going to knock you off your perch no matter how preoccupied you are.”

  In 1961, Dr. Samuel Rosen, a New York ear specialist, wanted to measure the hearing of a people who had not “become adjusted to the constant bombardment of modern mechanization.” Rosen went off to visit the Mabaan tribe, some 650 miles southeast of Khartoum, in what was then one of the most noise-free regions of Africa. The Mabaans were notable among their neighbors for having neither drums nor guns. He went armed with 1,000 bottle caps, which he planned to distribute to tribe members as rewards for their participation in audio tests: Mabaan women, he had heard, liked to fix the caps to their ears and make necklaces from them.

  Rosen discovered that the hearing of Mabaan tribe members at the age of seventy was often superior to that of Americans in their twenties. Some 53 percent of Mabaan villagers could discern sounds that only 2 percent of New Yorkers could hear. “Two Mabaans standing 300 feet apart, or the length of a football field, can carry on a conversation in a soft voice with their backs turned,” he reported. Rosen attributed the extraordinary preservation of hearing among the Mabaans to both their low-fat diet—which along with eliminating heart disease kept the cochlea well nourished—and the fact that they heard so little noise. The imbalance between noise and silence to which most of us who don’t live in remote tribal areas are subject dramatically accelerates the aging process of our hearing.

  Even without the hearing power of the Mabaans, a few groups of people still use their ears in a manner consistent with the evolutionary pursuit of silence.

  When Jason Everman spearheaded the Third Infantry in the invasion of Iraq, he and his team were fitted with state-of-the-art noise-canceling headphones that had two radios, one internal and one external. He was never comfortable with them because of the auditory isolation they created. He always wanted to be, he said, “totally tuned in to the ambient sound.” Every noise a soldier hears on a special operation can be a clue to the situation he’s approaching. Everman didn’t want to miss a single auditory clue, and so he often just cupped one side of the headphones to his ear as he walked. He was also never comfortable with the modified Toyota trucks he and his team drove around in because, he said, “they were like an enclosed bubble, and if someone wanted to shoot at me, I wanted to hear it.”