The hippocampus is a key brain structure for learning. Scientists at the Max Planck Institute of Psychiatry in Munich have discovered how it filters electrical neuronal signals through an input and output control, thereby regulating learning and memory processes.
Effective signal transmission requires so-called theta-frequency impulses from the cerebral cortex. With a frequency of three to eight hertz, these impulses generate waves of electrical activity that propagate through the hippocampus. Impulses at other frequencies evoke no transmission, or only a much weaker one. Moreover, the strengthening of signal transmission to other brain areas through long-term potentiation (LTP), which is essential for learning, occurs only when the activity waves persist for a certain time. The scientists even have an explanation for why we are mentally more productive after drinking a cup of coffee or in an acute stress situation: in their experiments, caffeine and the stress hormone corticosterone boosted the activity flow.
When we learn and recall something, we have to concentrate on the relevant information and experience it again and again. Electrophysiological experiments in mice now show why this is the case. Scientists in Matthias Eder's research group measured the transmission of electrical impulses between neurons in the mouse hippocampus. Under the fluorescence microscope, they were able to observe in real time how the neurons forward signals.
Jens Stepan, a junior scientist at the Max Planck Institute of Psychiatry in Munich, stimulated the input region of the hippocampus and demonstrated for the first time that specifically theta-frequency stimulations produce effective impulse transmission across the hippocampal CA3/CA1 region. This finding is important, as previous studies have shown that theta-rhythmical neuronal activity in the entorhinal cortex always occurs when new information is taken up in a focused manner. With this finding, the researchers demonstrate that the hippocampus reacts highly selectively to entorhinal signals. Evidently, it can distinguish important, and thus potentially memory-worthy, information from unimportant information and process it in a physiologically specific manner.
One possible reaction is the formation of so-called long-term potentiation (LTP) of signal transmission at CA3-CA1 synapses, which is often essential for learning and memory. The present study documents that this CA1-LTP occurs only when the activity waves travel through the hippocampus for a certain time. Translated to our learning behavior, this means that to commit, for instance, an image to memory, we should view it intently for a while: only then do the described activity waves last long enough to store the image in our brain.
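The two conditions described, a theta-band input frequency and sufficient duration, can be pictured as a simple gate. The sketch below is an illustrative toy, not the authors' biophysical model: the 3–8 Hz band comes from the article, while the duration threshold is an arbitrary placeholder, since the study only says "a certain time".

```python
def ltp_induced(freq_hz: float, duration_s: float,
                theta_band=(3.0, 8.0), min_duration_s=30.0) -> bool:
    """Toy gate: LTP occurs only for theta-band input sustained long enough.

    The 3-8 Hz band is taken from the article; min_duration_s is an
    arbitrary placeholder value, not a figure from the study.
    """
    in_theta = theta_band[0] <= freq_hz <= theta_band[1]
    return in_theta and duration_s >= min_duration_s

print(ltp_induced(5.0, 60.0))   # theta-band and sustained: True
print(ltp_induced(5.0, 5.0))    # too brief: False
print(ltp_induced(20.0, 60.0))  # wrong frequency: False
```

In this toy, viewing an image only briefly fails the duration condition even at the right frequency, matching the article's point that both selectivity and persistence are required.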
With this study, Matthias Eder and colleagues succeeded in closing a knowledge gap. “Our investigation of neuronal communication via the hippocampal trisynaptic circuit provides us with a new understanding of learning in the living organism. We are the first to show that long-term potentiation depends on the frequency and persistence of incoming sensory signals in the hippocampus,” says Matthias Eder.
Want to solve a problem? Don’t just use your brain, but your body, too
When we’ve got a problem to solve, we don’t just use our brains but the rest of our bodies, too. The connection, as neurologists know, is not uni-directional. Now there’s evidence from cognitive psychology of the same fact. “Being able to use your body in problem solving alters the way you solve the problems,” says University of Wisconsin psychology professor Martha Alibali. “Body movements are one of the resources we bring to cognitive processes.”
These conclusions, from a new study by Alibali and colleagues (Robert C. Spencer, also at the University of Wisconsin, and Lucy Knox and Sotaro Kita of the University of Birmingham), are augmented by another, counter-intuitive one: even when we are solving problems that have to do with motion and space, the inability to use the body may force us to come up with other strategies, and these may be more efficient.
The findings will be published in an upcoming issue of Psychological Science, a journal of the Association for Psychological Science.
The study involved two experiments. The first recruited 86 American undergraduates, half of whom were prevented from moving their hands using Velcro gloves that attached to a board. The others were prevented from moving their feet, using Velcro straps attached to another board. The latter thus experienced the strangeness of being restricted, but also had their hands free. From the other side of an opaque screen, the experimenter asked questions about gears in relation to each other—e.g., “If five gears are arranged in a line, and you move the first gear clockwise, what will the final gear do?” The participants solved the problems aloud and were videotaped.
The videotapes were then analyzed for the number of hand gestures the participants used (hand rotations or “ticking” movements, indicating counting); verbal explanations indicating the subject was visualizing those physical movements; or the use of more abstract mathematical rules, without reference to perceptual-motor processes.
The results: The people who were allowed to gesture usually did so—and they also commonly used perceptual-motor strategies in solving the puzzles. The people whose hands were restrained, as well as those who chose not to gesture (even when allowed), used abstract, mathematical strategies much more often.
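The "abstract, mathematical strategy" for the gear task has a well-known form, the parity rule: adjacent gears in a line alternate direction, so the final gear's direction depends only on whether the number of gears is odd or even. A minimal sketch of that rule (the function name and interface are illustrative, not from the study):

```python
def final_gear_direction(n_gears: int, first: str = "clockwise") -> str:
    """Parity rule: gears in a line alternate direction, so the final
    gear turns the same way as the first when the count is odd."""
    other = "counterclockwise" if first == "clockwise" else "clockwise"
    return first if n_gears % 2 == 1 else other

# The five-gear question posed in the experiment:
print(final_gear_direction(5))  # clockwise: odd count matches the first gear
print(final_gear_direction(4))  # counterclockwise: even count reverses it
```

A gesturer might mime each rotation in turn; the parity rule reaches the answer in one step, which is why the abstract strategy can be the more efficient one.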
In a second experiment, 111 British adults solved the same problems silently while being videotaped, and described their strategies afterwards. The results were the same.
The findings raise deeper questions about the relationship of mind and body and their relation to space, says Alibali. “As human thinkers, we use visual-spatial metaphors all the time to solve problems and conceptualize things—even in domains that don’t seem physical on their face. Adding is ‘up,’ subtracting is ‘down.’ A good mood is ‘high,’ a bad one is ‘low.’ This is the metaphoric structuring of our conceptual landscape.”
Alibali, who is also an educational psychologist, asks: “How can we harness the power of action and perception in learning?” Or, conversely: What about the cognitive strategies of people who cannot use their bodies? “They may focus on different aspects of problems,” she says. And, it turns out, they may be onto something the rest of us could learn from.
Brain-Like Computing a Step Closer to Reality
The development of ‘brain-like’ computers has taken a major step forward with the publication of research led by the University of Exeter.
Published in the journal Advanced Materials, the study involved the first ever demonstration of simultaneous information processing and storage using phase-change materials. This new technique could revolutionize computing by making computers faster and more energy-efficient, as well as making them more closely resemble biological systems.
Computers currently deal with processing and memory separately, resulting in a speed and power ‘bottleneck’ caused by the need to continually move data around. This is totally unlike anything in biology, for example in human brains, where no real distinction is made between memory and computation. To perform these two functions simultaneously, the University of Exeter research team used phase-change materials, a kind of semiconductor that exhibits remarkable properties.
Their study demonstrates conclusively that phase-change materials can store and process information simultaneously. It also shows experimentally for the first time that they can perform general-purpose computing operations, such as addition, subtraction, multiplication and division. More strikingly perhaps it shows that phase-change materials can be used to make artificial neurons and synapses. This means that an artificial system made entirely from phase-change devices could potentially learn and process information in a similar way to our own brains.
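One way to picture arithmetic in a single memory cell is an accumulator: each electrical pulse nudges the cell's internal state, and when the accumulated state crosses a threshold the cell "fires" and resets, so counting pulses implements addition in a chosen base. The sketch below is a software toy built on that idea, not the device physics or measurement procedure from the Exeter paper:

```python
class PhaseChangeCell:
    """Toy accumulator: each pulse adds to an abstract 'crystal fraction';
    crossing the threshold emits a carry and resets the cell."""

    def __init__(self, base: int = 10):
        self.base = base   # pulses needed for the cell to fire
        self.state = 0     # abstract internal state, in pulse units

    def pulse(self) -> bool:
        """Apply one write pulse; return True if the cell fires (a carry)."""
        self.state += 1
        if self.state >= self.base:
            self.state = 0  # 'reset' back to the initial state
            return True
        return False

def add(a: int, b: int, base: int = 10):
    """Add two digits by pulsing one cell a+b times; the carries fired
    plus the residual state encode the sum in the given base."""
    cell = PhaseChangeCell(base)
    carries = sum(cell.pulse() for _ in range(a + b))
    return carries, cell.state  # (carry digit, units digit)

print(add(7, 5))  # (1, 2): 7 + 5 = 12 in base 10
```

The appeal of such a scheme is that the same cell both stores the running total and performs the computation, which is the processing-plus-memory combination the article describes.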
Lead author Professor David Wright of the University of Exeter said: “Our findings have major implications for the development of entirely new forms of computing, including ‘brain-like’ computers. We have uncovered a technique for potentially developing new forms of ‘brain-like’ computer systems that could learn, adapt and change over time. This is something that researchers have been striving for over many years.”
This study focused on the performance of a single phase-change cell. The next stage in Exeter’s research will be to build systems of interconnected cells that can learn to perform simple tasks, such as identification of certain objects and patterns.
Editor’s Note: Skynet Lives