Posts tagged "learning"

I heard about this study in Nature Neuroscience. These are scientists with some serious intestinal fortitude. It takes a lot of guts to apply for research grants, submit a manuscript, and publish in the face of outside claims that your research is fanciful and based on bunk science. Now that the behavioral data exists, they should run an fMRI follow-up to implicate the association regions that underlie this phenomenon. Is it simple Pavlovian conditioning, or something else?
scinerds

LEARNING IN YOUR SLEEP
Sleeping and learning go hand in hand, studies have shown for years. Even a brief nap can boost your memory and sharpen your thinking. But the relationship goes deeper than that. In a new study, scientists report that the brain can actually learn something new during sleep.
Scientists used to believe that a sleeping brain was taking a break. But it turns out it can be taught a thing or two, researchers reported in a study published in August.
“The brain is not passive while you sleep,” neuroscientist Anat Arzi told Science News. “It’s quite active. You can do quite a lot of things while you are asleep.” Arzi researches olfaction, or the sense of smell, at the Weizmann Institute of Science in Rehovot, Israel. She worked on the new study.
Arzi and her coworkers didn’t try to teach the sleeping volunteers any complex information, like new words or facts. (So sleeping on top of your study notes won’t boost your grades.) Instead, the scientists taught snoozing volunteers to make new connections between smells and sounds.
When we smell something nice, like a flower, we automatically take deep breaths. When we smell something bad, like the stench of a dumpster, we automatically take short breaths. These natural reactions maximize our exposure to good smells and minimize our exposure to bad ones. Arzi and her coworkers based their experiment on these reactions and the knowledge that our senses don’t turn off while we slumber.
Once the volunteers fell asleep in the lab, the scientists went to work. They gave the volunteers a whiff of something pleasant, like shampoo, and at the same time played a particular musical note. The volunteers didn’t wake up, but they did hear — and sniff deeply. Then the scientists gave the volunteers a whiff of something repulsive, like rotten fish, and played a different musical note. Again, the volunteers heard and smelled — a short snort this time — but didn’t wake up. The researchers repeated the experiment while the volunteers slept.
After just four repetitions, volunteers made a connection between the musical notes and their paired smells. When the scientists played the musical tone that went with good smells, the sleepers inhaled deeply — even though there was no good smell to sniff. And when the scientists played the musical tone that went with foul odors, the sleepers inhaled briefly — despite there being no bad smell.
“They learned what the tone signified,” Arzi concluded.
The next day, the volunteers woke up with the sound-smell connection intact. They inhaled deeply when hearing one tone and cut their breaths short when hearing the other. Which must have been odd for them: Imagine walking down the street and taking a deep breath upon hearing a particular sound!
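
In conditioning terms, the protocol pairs each tone with an odor until the tone alone evokes the matching sniff. Here is a minimal Python sketch of that pairing-and-test logic; the tone labels, sniff magnitudes, and learning rate are invented for illustration and are not values from the study.

```python
# Toy model of the tone-odor pairing protocol described above.
# All numbers (sniff volumes, learning rate) are made up for illustration.

PAIRINGS = {"tone_A": "pleasant_odor", "tone_B": "foul_odor"}

# Unconditioned sniff response to each odor (arbitrary units); None = neutral baseline.
ODOR_SNIFF = {"pleasant_odor": 1.0, "foul_odor": 0.2, None: 0.6}

def run_protocol(n_pairings=4, learning_rate=0.5):
    # Learned (conditioned) sniff response to each tone, starting at baseline.
    conditioned = {tone: ODOR_SNIFF[None] for tone in PAIRINGS}

    # Conditioning phase: tone and odor are presented together while asleep.
    for _ in range(n_pairings):
        for tone, odor in PAIRINGS.items():
            # Simple delta-rule update: the tone's response drifts toward
            # the unconditioned response evoked by its paired odor.
            conditioned[tone] += learning_rate * (ODOR_SNIFF[odor] - conditioned[tone])

    # Test phase: play each tone alone and read out the sniff it evokes.
    return {tone: round(resp, 3) for tone, resp in conditioned.items()}

if __name__ == "__main__":
    print(run_protocol())  # e.g. {'tone_A': 0.975, 'tone_B': 0.225}
```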

Learning Requires Rhythmical Activity of Neurons
The hippocampus is an important brain structure for learning. Scientists at the Max Planck Institute of Psychiatry in Munich have discovered how it filters electrical neuronal signals through an input and output control, thereby regulating learning and memory processes.

According to their findings, effective signal transmission requires so-called theta-frequency impulses from the cerebral cortex. With a frequency of three to eight hertz, these impulses generate waves of electrical activity that propagate through the hippocampus. Impulses at other frequencies evoke no transmission, or only a much weaker one. Moreover, the strengthening of signal transmission to other brain areas through long-term potentiation (LTP), which is essential for learning, occurs only when the activity waves persist for a certain time. The scientists even have an explanation for why we are mentally more productive after a cup of coffee or in an acute stress situation: in their experiments, caffeine and the stress hormone corticosterone boosted this activity flow.
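
One way to picture this frequency- and duration-dependent gating is as a toy filter that passes input only in the theta band and triggers potentiation only once such input has persisted. The 3-8 Hz band comes from the article; the persistence threshold and weight values below are invented for illustration.

```python
# Toy sketch of frequency- and duration-gated transmission, loosely inspired
# by the description above: only theta-band input (3-8 Hz, from the article)
# is passed on, and "LTP" is triggered only after such input has persisted.
# The 2.0 s persistence threshold and weight values are assumptions.

THETA_BAND = (3.0, 8.0)        # Hz, from the article
PERSISTENCE_NEEDED = 2.0       # seconds of sustained theta input (assumed)

def hippocampal_gate(stimulus_trains):
    """stimulus_trains: list of (frequency_hz, duration_s) input episodes."""
    synaptic_weight = 1.0
    theta_time = 0.0
    for freq, duration in stimulus_trains:
        if THETA_BAND[0] <= freq <= THETA_BAND[1]:
            theta_time += duration            # theta input accumulates
            if theta_time >= PERSISTENCE_NEEDED:
                synaptic_weight = 1.5         # toy "LTP": strengthened output
        else:
            theta_time = 0.0                  # non-theta input is not passed on
    return synaptic_weight

# A brief theta burst alone does not potentiate; a sustained one does.
print(hippocampal_gate([(5.0, 0.5), (20.0, 1.0)]))   # -> 1.0
print(hippocampal_gate([(5.0, 1.0), (6.0, 1.5)]))    # -> 1.5
```
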
When we learn and recall something, we have to concentrate on the relevant information and experience it again and again. Electrophysiological experiments in mice now show why this is the case. Scientists in Matthias Eder's research group measured the transmission of electrical impulses between neurons in the mouse hippocampus. Under the fluorescence microscope, they were able to observe in real time how the neurons forward signals.
Jens Stepan, a junior scientist at the Max Planck Institute of Psychiatry in Munich, stimulated the input region of the hippocampus and showed for the first time that specifically theta-frequency stimulations produce effective impulse transmission across the hippocampal CA3/CA1 region. This finding is very important, as it is known from previous studies that theta-rhythmical neuronal activity in the entorhinal cortex always occurs when new information is taken up in a focused manner. With this finding, the researchers demonstrate that the hippocampus reacts highly selectively to entorhinal signals. Evidently, it can distinguish important, and thus potentially memory-worthy, information from unimportant information and process it in a physiologically specific manner.
One possible reaction is the formation of long-term potentiation (LTP) of signal transmission at CA3-CA1 synapses, which is often essential for learning and memory. The present study documents that this CA1-LTP occurs only when the activity waves travel through the hippocampus for a certain time. Translated to our learning behavior: to commit an image to memory, for instance, we should view it intently for a while, since only then do we produce the described activity waves long enough to store the image in our brains.
With this study, Matthias Eder and colleagues have closed a knowledge gap. “Our investigation of neuronal communication via the hippocampal trisynaptic circuit provides us with a new understanding of learning in the living organism. We are the first to show that long-term potentiation depends on the frequency and persistence of incoming sensory signals in the hippocampus,” says Eder.

Want to solve a problem? Don’t just use your brain, but your body, too

When we’ve got a problem to solve, we don’t just use our brains but the rest of our bodies, too. The connection, as neurologists know, is not uni-directional. Now there’s evidence from cognitive psychology of the same fact. “Being able to use your body in problem solving alters the way you solve the problems,” says University of Wisconsin psychology professor Martha Alibali. “Body movements are one of the resources we bring to cognitive processes.”
These conclusions, of a new study by Alibali and colleagues—Robert C. Spencer, also at the University of Wisconsin, and Lucy Knox and Sotaro Kita of the University of Birmingham—are augmented by another, counterintuitive one: even when we are solving problems that have to do with motion and space, the inability to use the body may force us to come up with other strategies, and these may be more efficient.
The findings will be published in an upcoming issue of Psychological Science, a journal of the Association for Psychological Science.
The study involved two experiments. The first recruited 86 American undergraduates, half of whom were prevented from moving their hands using Velcro gloves that attached to a board. The others were prevented from moving their feet, using Velcro straps attached to another board. The latter thus experienced the strangeness of being restricted, but also had their hands free. From the other side of an opaque screen, the experimenter asked questions about gears in relation to each other—e.g., “If five gears are arranged in a line, and you move the first gear clockwise, what will the final gear do?” The participants solved the problems aloud and were videotaped.
The videotapes were then analyzed for the number of hand gestures the participants used (hand rotations or “ticking” movements, indicating counting); verbal explanations indicating the subject was visualizing those physical movements; or the use of more abstract mathematical rules, without reference to perceptual-motor processes.
The results: The people who were allowed to gesture usually did so—and they also commonly used perceptual-motor strategies in solving the puzzles. The people whose hands were restrained, as well as those who chose not to gesture (even when allowed), used abstract, mathematical strategies much more often.
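
The gear question itself shows what an abstract, mathematical strategy looks like in practice: because adjacent gears always turn in opposite directions, only the parity of the gear count matters, and no mental rotation is needed. A short Python illustration (the function name is mine, purely for illustration):

```python
def final_gear_direction(n_gears, first_direction="clockwise"):
    """Direction of the last gear in a line of meshed gears.

    Adjacent gears turn in opposite directions, so the answer depends
    only on whether the gear count is odd or even (a parity rule),
    not on simulating each rotation in turn.
    """
    opposite = {"clockwise": "counterclockwise", "counterclockwise": "clockwise"}
    return first_direction if n_gears % 2 == 1 else opposite[first_direction]

print(final_gear_direction(5))  # -> 'clockwise', matching the example question
```
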
In a second experiment, 111 British adults solved the same problems silently, were videotaped, and described their strategies afterwards. The results were the same.
The findings evince deeper questions about the relationship of mind and body and their relationship to space, says Alibali. “As human thinkers, we use visual-spatial metaphors all the time to solve problems and conceptualize things—even in domains that don’t seem physical on their face. Adding is ‘up,’ subtracting is ‘down.’ A good mood is ‘high,’ a bad one is ‘low.’ This is the metaphoric structuring of our conceptual landscape.”
Alibali, who is also an educational psychologist, asks: “How can we harness the power of action and perception in learning?” Or, conversely: What about the cognitive strategies of people who cannot use their bodies? “They may focus on different aspects of problems,” she says. And, it turns out, they may be onto something the rest of us could learn from.

http://www.eurekalert.org/pub_releases/2011-06/afps-wts060211.php

Brain-Like Computing a Step Closer to Reality

The development of ‘brain-like’ computers has taken a major step forward with the publication of research led by the University of Exeter.
Published in the journal Advanced Materials, the study involved the first ever demonstration of simultaneous information processing and storage using phase-change materials. This new technique could revolutionize computing by making computers faster and more energy-efficient, as well as making them more closely resemble biological systems.
Computers currently deal with processing and memory separately, resulting in a speed and power ‘bottleneck’ caused by the need to continually move data around. This is totally unlike anything in biology, for example in human brains, where no real distinction is made between memory and computation. To perform these two functions simultaneously, the University of Exeter research team used phase-change materials, a kind of semiconductor that exhibits remarkable properties.
Their study demonstrates conclusively that phase-change materials can store and process information simultaneously. It also shows experimentally for the first time that they can perform general-purpose computing operations, such as addition, subtraction, multiplication and division. More strikingly perhaps it shows that phase-change materials can be used to make artificial neurons and synapses. This means that an artificial system made entirely from phase-change devices could potentially learn and process information in a similar way to our own brains.
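
The appeal is easiest to see in the accumulator-like behavior such a device can show: each input pulse nudges its stored state, and a readout fires once a threshold is crossed, so the same element both remembers and computes. Below is a toy software analogue; the threshold and pulse sizes are arbitrary choices, not device parameters from the Exeter study.

```python
# Toy software analogue of an accumulator-style phase-change cell:
# each input pulse nudges the stored state, and the cell "fires" and
# resets once a threshold is crossed, so storage and computation happen
# in the same element. Threshold and step size are assumptions, not
# device parameters from the study.

class ToyPhaseChangeCell:
    def __init__(self, threshold=1.0):
        self.state = 0.0          # stands in for the degree of crystallization
        self.threshold = threshold

    def pulse(self, amplitude):
        """Apply one input pulse; return True if the cell fires."""
        self.state += amplitude
        if self.state >= self.threshold:
            self.state = 0.0      # reset, like re-amorphizing the cell
            return True
        return False

# Used this way, the cell counts pulses: with a threshold of 1.0 and pulses
# of 0.25, it fires on every fourth pulse, an addition performed by the same
# element that stores the running total.
cell = ToyPhaseChangeCell()
fires = [cell.pulse(0.25) for _ in range(8)]
print(fires)  # [False, False, False, True, False, False, False, True]
```
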
Lead author Professor David Wright of the University of Exeter said: “Our findings have major implications for the development of entirely new forms of computing, including ‘brain-like’ computers. We have uncovered a technique for potentially developing new forms of ‘brain-like’ computer systems that could learn, adapt and change over time. This is something that researchers have been striving for over many years.”
This study focused on the performance of a single phase-change cell. The next stage in Exeter’s research will be to build systems of interconnected cells that can learn to perform simple tasks, such as identification of certain objects and patterns.

Editor’s Note: Skynet Lives

(Science Daily) 
