Hearing Explained: How Hearing Works

How do we hear?

Think of some of your favorite sounds: the voice of a loved one, the jingle of an ice cream truck, the song you danced to on your wedding day. How exactly do all those different sounds—music, laughter, birdsong, voices—travel through the air and wind through our ears to reach our brains?

Brian Hill, MS, MBA, CCC/A, FAAA

Audiologist, Director of Professional Services and Training

All the sounds we hear begin as waves or vibrations. These sound waves come in all kinds of shapes and sizes. If you ring a tiny bell, for instance, the rapid vibrations caused by the metal being struck will create a high-frequency sound wave that produces a high-pitched ring. If you strike a bass drum, the relatively slower vibrations of the drum’s surface will send out a low-frequency sound wave that produces a low-pitched boom.

The human ear can detect as many as 7,000 distinct pitches, each created by a sound wave with its own unique frequency. When we hear sounds, it’s a result of these sound waves entering our ear canals and beginning an intricate process of transforming into signals that our brains can interpret and understand. To understand how that process works, we first need to get familiar with the basic anatomy of the ear.

Parts of the ear

The ear is made up of three main parts: the outer ear, middle ear and inner ear. These three parts all work together to capture sound waves and transform them into signals our brains can recognize.

The outer ear includes:

  • The auricle and earlobe (the protruding parts of the ear)
  • The ear canal
  • The outer layer of the eardrum (or tympanic membrane)

The outer ear collects sound waves from the air and directs them through the ear canal to the middle ear.

The middle ear includes:

  • The tympanic cavity
  • The ossicular chain (three tiny bones that transmit sound to the inner ear)
  • The Eustachian tube, which helps maintain air pressure in the middle ear

The inner ear includes:

  • Fluid-filled semicircular ducts that help maintain balance
  • The cochlea, which transforms sound into signals that get sent to the brain
  • The auditory nerve, which carries the cochlea’s signals to the brain

Hair cells in the cochlea transform the mechanical pressure waves sent from the middle ear into electrical impulses that travel via the auditory nerve to the hearing center of the brain.

Noise becomes signal

Let’s follow a sound wave through the ear and see how it transforms on its journey to the brain.

Step one: The outer part of the ear captures a sound wave and funnels it through the ear canal, where it strikes the tympanic membrane (or outer layer of the eardrum).

Step two: The sound wave causes the eardrum and the three tiny bones of the middle ear (the ossicles) to vibrate. The movement of the ossicles amplifies the vibrations as they are transmitted to the entrance of the inner ear, known as the oval window.

Step three: The vibrations travel through the fluid in the cochlea and move the tiny hair cells there. The movement of these hair cells creates electrical impulses that travel along the auditory nerve to the brain’s hearing center, where they are interpreted as sounds.

How we hear: The brain’s role

As the electrical impulses from the inner ear travel to the brain, their first stop is in an area of the brain stem known as the cochlear nucleus. The cochlear nucleus organizes the electrical signals according to duration, intensity and pitch, and then passes the organized signals on to other areas of the brain for further processing.

The thalamus, at the base of the brain, collects and sorts sensory information from all of the senses and directs it to the appropriate areas of the cortex to prepare a response, such as a verbal reply or the fight-or-flight response, depending on the content of the signal.

In the temporal lobes of the brain (just above the ears) is the auditory cortex, the brain’s sound-processing center. This is where the organized auditory signals are fully unpacked into individual words, recognizable voices and other identifying information such as location, volume and tone.

Finally, the prefrontal cortex puts all the information from the auditory cortex and other brain sites together. This is where the brain forms a cohesive understanding of sensory data, integrating the content and meaning of a sound with other information such as facial expressions, body language, memory and emotion.

Hearing sound: Our window of perception

Not all sound waves follow the above process. If we heard and processed every sound wave traveling through the air, we could easily be overwhelmed by sensory overload. Instead, humans evolved to perceive sound waves only within a limited range of frequencies.

The frequency range of human hearing is generally between 20 Hz and 20,000 Hz. Low-frequency sounds below 20 Hz are called infrasound, while high-frequency sounds above 20,000 Hz are considered ultrasound.

(Examples of infrasound include the calls whales, elephants and giraffes use to communicate over long distances. Dogs, meanwhile, are famous for picking up ultrasound frequencies far too high for humans to hear—up to 46 kHz!)
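For readers who like to think in numbers, the thresholds above amount to a simple classification. Here is a minimal, purely illustrative sketch; the function and constant names are our own labels, and the limits are the commonly cited ones from the text:

```python
# Commonly cited limits of human hearing (see text above).
INFRASOUND_LIMIT_HZ = 20        # below this: infrasound
ULTRASOUND_LIMIT_HZ = 20_000    # above this: ultrasound

def classify_frequency(hz: float) -> str:
    """Return whether a frequency falls below, within, or above
    the typical range of human hearing (20 Hz to 20,000 Hz)."""
    if hz < INFRASOUND_LIMIT_HZ:
        return "infrasound"
    if hz > ULTRASOUND_LIMIT_HZ:
        return "ultrasound"
    return "audible"

print(classify_frequency(10))      # elephant rumble: "infrasound"
print(classify_frequency(440))     # concert pitch A: "audible"
print(classify_frequency(46_000))  # dog-whistle range: "ultrasound"
```

Real-world hearing limits vary from person to person, of course; these cutoffs are population-level rules of thumb, not hard boundaries.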

High frequencies are usually the first to be affected by hearing loss as we get older. Those with high-frequency hearing loss may have trouble hearing sounds in the 2,000 to 8,000 Hz range, making it difficult to understand women’s and children’s voices or to distinguish consonants such as S, F or H.

An audiogram can reveal whether any hearing perception has been lost and serve as the first step toward counteracting a loss of function in the inner ear.
