(Original Link - http://www.chicagotribune.com/health/sc-health-1201-music-20101201,0,3647950.story)
On her last night at the hospital after undergoing a series of spine surgeries, Susan Mandel lay in bed listening to Pachelbel's Canon in D.
For days, Mandel's positive attitude had kept any anxiety at bay, so she was surprised when she noticed her face was wet, and then her pillow, which slowly soaked through. She sobbed silently, listening to the familiar violins, until the tears stopped coming. Then she felt peace.
"It wasn't a cry of anguish, it was a cry of relief," Mandel said, recalling the night more than 20 years ago. "It's very tender, evocative music, and I think it gave me permission to release the pent-up emotions."
For millenniums, philosophers have marveled at the power of music to speak to our souls, to inspire joy, melancholy, aggression or calm with visceral insight beyond the grasp of our rational minds. Thanks to advances in neuroscience, researchers are beginning to understand what it is about music that touches us so deeply, and how to harness that power to soothe, uplift, comfort and heal — to use music as medicine for emotional and physical health.
Mandel, a music therapist and research consultant at Lake Health Wellness Institute in Cleveland, this month released "Manage Your Stress and Pain Through Music" (Berklee Press Publications, $29.99), with co-author Suzanne Hanser, chairwoman of the music therapy department at Berklee College of Music in Boston. The book explains how to choose and use music to cope with challenges in your life.
Not what you'd guess
It can seem obvious which songs would bring you up and which might bring you down. And indeed, there are structural components to songs that are meant to communicate joy, such as a fast tempo in major mode, or sadness, such as a slower tempo in minor mode. But there's a difference between the emotion communicated through music and the emotion actually induced in the listener. Our memories, personal preferences and mood at the time can have a heavier influence than the intent of the musical structure in how music makes us feel.
"You could have a really positive emotional experience with a song that structurally communicates sadness," said Meagan Curtis, assistant professor of psychology at State University of New York at Purchase, who does research in music psychology.
What matters most in reaping the health benefits of music, from pain reduction to stress relief, is that you listen to music you enjoy, research shows. In a study on cardiac rehabilitation patients, Mandel found that the patients who liked a therapeutic music CD she put together experienced a reduction in blood pressure and reported feeling calmer, while patients who didn't like the music actually felt worse.
While there are structural components that convey soothing, such as consonant harmonies and a narrow pitch range, the music with the most positive associations for an individual produces the most positive emotional and physiological response. Such music activates the parasympathetic nervous system, which slows the heart rate, lowers blood pressure and relaxes muscles.
"I have found people who love punk rock and find that it helps them to sleep," Hanser said. "It's likely that they have learned it truly speaks to them and expresses a part of who they are."
Music and pain
Music also has been found to help people tolerate pain longer and to perceive it as less intense.
Studies using a cold pressor task, which simulates chronic pain by submerging subjects' hands in a bucket of freezing cold water, found that people were able to leave their hands in the water longer when they were listening to music they enjoyed, Curtis said.
That could be because people take comfort in the familiar, or because the music distracts them: between recalling memories, tapping our fingers, conjuring up images and performing other tasks, the brain releases so many chemicals to process music that they interfere with our perception of pain.
How the brain processes
There's some evidence that we feel music viscerally because it goes straight to the amygdala, the part of the limbic system that manages our emotions, and the hippocampus, where long-term memories are stored, Hanser said.
Music that gives people chills or shivers up the spine has been found to activate the same reward areas of the brain stimulated by food, sex and certain types of recreational drugs, Curtis said. While different people get chills from different songs, often those shiver-producing songs have an unexpected tonal structure, like a chord that isn't part of the harmonic progression, she said.
Impact of lyrics
While structure is less important than personal experience in a song's ability to induce emotion, lyrics may be even less important than structure, Curtis said. We don't need to consciously attend to structure to process its emotional content, but we do have to pay attention to lyrics, so structure's impact is stronger and easier to process.
People are usually very intuitive about what songs are useful to them and often choose music appropriate for the state they're in, Curtis said. That explains one of the great ironies of human behavior: that many people like to listen to sad music when they're sad.
We might like the affirmation, as we create a bond with the singer or composer because they, too, have felt what we feel, Curtis said. Another theory is that wallowing is a kind of emotional catharsis, helping us fully experience the sadness so that we go through the stages of grief more quickly.
And it can be a healthy thing. A central tenet of music therapy, known as the iso principle, is to meet people where they are. So if people are very depressed and lonely, you would start them with music that matches their mood before introducing something more uplifting.
"You first affirm and allow the person to reflect, and then move on to more positive things and hopeful outlooks," Hanser said.
Some researchers hope to nail down the precise combination of pitch, tone, tempo, rhythm, timbre, melody and lyrics that makes a piece of music ideal for regulating people's moods or helping to reduce pain. A study under way at Glasgow Caledonian University aims to develop a "comprehensive mathematic model" that identifies how music communicates emotions, which eventually could help doctors prescribe music.
Hanser is skeptical that a sweeping formula exists, and if it does, "I hope we don't find it," she said. "I don't know anyone who is the mean, the normal. If we can recognize our own unique characteristics and what makes us each respond so differently, that I think is really fascinating and what humanity is all about."
aelejalderuiz@tribune.com
Emotional impact
While a person's emotional reaction to a song is based largely on his or her history with the song, the song's structure also can communicate emotions, mostly through mode (major or minor chords) and tempo, said Meagan Curtis, assistant professor of psychology at State University of New York at Purchase.
A fast tempo (up to 120 beats per minute) tends to heighten physiological arousal, while slower tempos (down to 60 beats per minute) tend to reduce arousal. Major chords tend to evoke positive emotions, such as joy and contentment, and minor chords negative emotions, like fear, anger or sadness.
Curtis offered some examples:
• Major mode, fast tempo. Example: "Shiny Happy People," by R.E.M. Emotion conveyed: happy.
• Major mode, slow tempo. Example: "(Sittin' On) The Dock of the Bay," by Otis Redding. Emotion conveyed: soothing, tenderness.
• Minor mode, fast tempo. Example: "Smells Like Teen Spirit," by Nirvana. Emotion conveyed: angst, anger.
• Minor mode, slow tempo. Example: "Eleanor Rigby," by the Beatles. Emotion conveyed: sadness.
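Curtis's four examples fall along just two structural dimensions, mode and tempo, so they can be summarized as a simple lookup. The short Python sketch below is illustrative only and is not drawn from the article or Curtis's research; in particular, the 90-beats-per-minute boundary between "slow" and "fast" is an assumption, since the article gives only rough endpoints (around 60 bpm for slow tempos, up to 120 bpm for fast ones).

# Minimal, illustrative sketch of the mode/tempo quadrants described above.
# Assumption: 90 bpm as the boundary between "slow" and "fast" tempos.

EMOTIONS = {
    ("major", "fast"): "happy",
    ("major", "slow"): "soothing, tenderness",
    ("minor", "fast"): "angst, anger",
    ("minor", "slow"): "sadness",
}

def conveyed_emotion(mode: str, tempo_bpm: float) -> str:
    """Emotion a song's structure tends to convey, per the quadrants above.

    Structural cues only; as the article stresses, a listener's memories
    and preferences can override them entirely.
    """
    speed = "fast" if tempo_bpm >= 90 else "slow"  # assumed cutoff
    return EMOTIONS[(mode.lower(), speed)]

# A hypothetical song in a minor key at 70 beats per minute:
print(conveyed_emotion("minor", 70))  # -> sadness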
Thursday, October 21, 2010
Scientists Closer to Grasping How the Brain's 'Hearing Center' Spurs Responses to Sound
Just as we visually map a room by spatially identifying the objects in it, we map our aural world based on the frequencies of sounds. The neurons within the brain's "hearing center" -- the auditory cortex -- are organized into modules that each respond to sounds within a specific frequency band. But how responses actually emanate from this complex network of neurons is still a mystery.
A team of scientists led by Anthony Zador, M.D., Ph.D., Professor and Chair of the Neuroscience program at Cold Spring Harbor Laboratory (CSHL), has come a step closer to unraveling this puzzle. The scientists probed how the functional connectivity among neurons within the auditory cortex gives rise to a "map" of acoustic space.
"What we learned from this approach has put us in a position to investigate and understand how sound responsiveness arises from the underlying circuitry of the auditory cortex," says Zador. His team's findings appear online, ahead of print, on October 17th in Nature Neuroscience.
Neuronal organization within the auditory cortex fundamentally differs from the organization within brain regions that process sensory inputs such as sight and sensation. For instance, the relative spatial arrangement of sight receptors in the retina (the eyes' light-sensitive inner surface) is directly represented as a two-dimensional "retinotopic" map in the brain's visual cortex.
In the auditory system, however, the organization of sound receptors in the cochlea -- the snail-like structure in the ear -- is one-dimensional. Cochlear receptors near the outer edge recognize low-frequency sounds, whereas those near the inside of the cochlea are tuned to higher frequencies. This low-to-high distribution, called 'tonotopy,' is preserved along one dimension in the auditory cortex, with neurons tuned to high and low frequencies arranged in a head-to-tail gradient.
"Because sound is intrinsically a one-dimensional signal, unlike signals for other senses such as sight and sensation which are intrinsically two-dimensional, the map of sound in the auditory cortex is also intrinsically one-dimensional," explains Zador. "This means that there is a functional difference in the cortical map between the low-to-high direction and the direction perpendicular to it. However, no one has been able understand how that difference arises from the underlying neuronal circuitry."
To address this question, Zador and postdoctoral fellow Hysell Oviedo compared neuronal activity in mouse brain slices that were cut to preserve the connectivity along the tonotopic axis vs. activity in slices that were cut perpendicular to it.
To precisely stimulate a single neuron within a slice and record from it, Oviedo and Zador, working in collaboration with former CSHL scientists Karel Svoboda and Ingrid Bureau, used a powerful tool called laser-scanning photostimulation. This method allows the construction of a detailed, high-resolution picture that reveals the position, strength and number of inputs converging on a single neuron within a slice.
"If you did this experiment in the visual cortex, you would see that the connectivity is the same regardless of which way you cut the slice," explains Oviedo. "But in our experiments in the auditory cortex slices, we found that there was a qualitative difference in the connectivity between slices cut along the tonotopic axis vs. those cut perpendicular to it."
There was an even more striking divergence from the visual cortex -- and presumably the other cortical regions. As demonstrated by a Nobel Prize-winning discovery in 1962, in the visual cortex, the neurons that share the same input source (or respond to the same signal) are organized into columns. As Oviedo puts it, "all neurons within a column in the visual cortex are tuned to the same position in space and are more likely to communicate with other neurons from within the same column."
Analogously, in the auditory cortex, neurons within a column are expected to be tuned to the same frequency. So the scientists were especially surprised to find that for a given neuron in this region, the dominant input signal didn't come from within its column but from outside it.
"It comes from neurons that we think are tuned to higher frequencies," elaborates Zador. "This is the first example of the neuronal organizing principle not following the columnar pattern, but rather an out-of-column pattern." Discovering this unexpected, out-of-column source of information for a neuron in the auditory complex adds a new twist to their research, which is focused on understanding auditory function in terms of the underlying circuitry and how this is altered in disorders such as autism.
"With this study, we've moved beyond having only a conceptual notion of the functional difference between the two axes by actually finding correlates for this difference at the level of the neuronal microcircuits in this region," he explains.
This work was supported by grants from the US National Institutes of Health, the Patterson Foundation, the Swartz Foundation and Autism Speaks.
Monday, September 20, 2010
What part of the brain interprets music?
Music lights up almost every area of the brain, which shouldn’t be a surprise since it makes people tap their feet, encourages the recollection of vivid memories and has the potential to lighten the mood.
Around the outside
1. Prefrontal cortex: This brain region plays a role in the creation, satisfaction and violation of expectations. It may react, for instance, when a beat goes missing. Recent work has shown that during improvisation a part of the prefrontal cortex involved in monitoring performance shuts down, while parts involved in self-initiated thoughts ramp up.
2. Motor cortex: Music is not independent of motion. Foot-tapping and dancing often accompany a good beat, meaning the motor cortex gets involved. And playing an instrument requires carefully timed physical movements. In some cases, this area of the brain is engaged when a person simply hears notes, suggesting a strong link to the auditory cortex.
3. Sensory cortex: Playing an instrument sends tactile messages to the sensory cortex, as keys are hit, for example.
4. Auditory cortex: Hearing any sound, including music, involves this region, which contains a map of pitches for the perception and analysis of tones.
5. Visual cortex: Reading music or watching a performer’s movements activates the visual cortex.
The inside track
6. Cerebellum: Movements such as foot-tapping and dancing activate this part of the brain. This could be because of the cerebellum’s role in timing and synchrony; it helps people track the beat. The cerebellum is also involved in the emotional side of music, lighting up with likable or familiar music, and appears to sense the difference between major and minor chords.
7. Hippocampus: Known to play a role in long-term memory, the hippocampus may help the brain retrieve memories that give a sound meaning or context. It also helps people link music they have heard before to an experience and to a given context, possibly explaining why it is activated during pleasant or emotionally charged music.
8. Amygdala: The amygdala seems to be involved in musical memories. It reacts differently to major and minor chords, and music that leads to chills tends to affect it. Studies suggest the skillful repetition heard in music is emotionally satisfying.
9. Nucleus accumbens: This brain structure is thought to be the center of the reward system. It reacts to emotional music, perhaps through the release of dopamine.
(Original Link - http://www.sciencenews.org/view/feature/id/61593/title/Your_brain_on_music )