Think of some of your favorite sounds: the voice of a loved one, the jingle of an ice cream truck, the song you danced to on your wedding day. How exactly do all those different sounds—music, laughter, birdsong, voices—travel through the air and wind through our ears to reach our brains?
All the sounds we hear begin as waves or vibrations. These sound waves come in all kinds of shapes and sizes. If you ring a tiny bell, for instance, the rapid vibrations caused by the metal being struck will create a high-frequency sound wave that produces a high-pitched ring. If you strike a bass drum, the relatively slower vibrations of the drum’s surface will send out a low-frequency sound wave that produces a low-pitched boom.
The human ear can distinguish as many as 7,000 distinct pitches, each created by a sound wave with its own unique frequency. When we hear sounds, it's the result of these sound waves entering our ear canals and beginning an intricate process of transformation into signals our brains can interpret and understand. To understand how that process works, we first need to get familiar with the basic anatomy of the ear.
The ear is made up of three main parts: the outer ear, the middle ear and the inner ear. These three parts work together to capture sound waves and transform them into signals our brains can recognize, and each has a distinct function in the overall process of how we hear. For more in-depth information on the different parts of the ear and how they work, visit our page dedicated to the anatomy of the ear.
Let’s follow a sound wave through the ear to get a better understanding of how ears work and what role they play in the hearing process.
Step one: The outer part of the ear captures a sound wave and funnels it through the ear canal, where it strikes the tympanic membrane (or outer layer of the eardrum).
Step two: The sound wave causes the eardrum and the three small bones within the middle ear, known as the ossicles, to vibrate. The movement of the ossicles amplifies the sound waves as they are transmitted to the entrance of the inner ear, known as the oval window.
Step three: The vibrations travel through the fluid in the cochlea and move the tiny hair cells there. The movement of these hair cells creates electrical impulses that travel along the auditory nerve to the brain’s hearing center, where they are interpreted as sounds.
As the electrical impulses from the inner ear travel to the brain, their first stop is in an area of the brain stem known as the cochlear nucleus. The cochlear nucleus organizes the electrical signals according to duration, intensity and pitch, and then passes the organized signals on to other areas of the brain for further processing.
The thalamus, at the base of the brain, collects and sorts sensory information from all of the senses and directs it to the appropriate areas of the cortex to prepare a response, such as a verbal reply or the fight-or-flight reflex, depending on the content of the signal.
In the temporal lobes of the brain (right above the ears) is the auditory cortex, the brain's sound-processing center. This is where the decoded and organized auditory signals are fully unpacked into individual words, recognizable voices and other identifying information such as location, volume and tone.
Finally, the prefrontal cortex puts all the information from the auditory cortex and other brain sites together. This is where the brain forms a cohesive understanding of sensory data, integrating the content and meaning of a sound with other information such as facial expressions, body language, memory and emotion.
Not all sound waves follow the process described above. If we heard and processed every sound wave traveling through the air, we could easily be overwhelmed by sensory overload. Instead, humans evolved to perceive sound waves within a limited range of frequencies.
The frequency range of human hearing is generally between 20 Hz and 20,000 Hz. Low-frequency sounds below 20 Hz are called infrasound, while high-frequency sounds above 20,000 Hz are considered ultrasound.
(Examples of infrasound include noises made by whales, elephants and giraffes to communicate over long distances. And dogs are notorious for their ability to pick up on ultrasound frequencies that are too high for humans to hear—up to 46 kHz!)
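For readers who like to see these cutoffs laid out concretely, the ranges above can be expressed as a simple classification. This is just an illustrative sketch in Python using the commonly cited approximate limits of 20 Hz and 20,000 Hz; the function name and thresholds are our own shorthand, not a scientific standard:

```python
def classify_frequency(hz):
    """Classify a sound frequency relative to the typical human hearing range.

    Uses the commonly cited approximate limits of 20 Hz and 20,000 Hz.
    """
    if hz < 20:
        return "infrasound"   # below human hearing, e.g. elephant rumbles
    elif hz <= 20_000:
        return "audible"      # within the typical human hearing range
    else:
        return "ultrasound"   # above human hearing, e.g. a dog whistle

print(classify_frequency(10))      # infrasound
print(classify_frequency(440))     # audible (concert pitch A)
print(classify_frequency(46_000))  # ultrasound (audible to dogs, not to us)
```

Real hearing limits vary from person to person and shift with age, so these boundaries are best read as rough guides rather than hard lines.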
High frequencies are usually the first to be affected by hearing loss as we get older. Those with high-frequency hearing loss may have trouble hearing sounds in the 2,000 to 8,000 Hz range, making it difficult to understand women's or children's voices or to perceive consonants such as S, F or H.
An audiogram can help determine whether there’s been any loss of hearing perception and be the first step toward counteracting any loss of function in the inner ear.