ELearning/Foundations/Processes supporting learning

From Encyclopedia of Science and Technology

Introduction

Research into the physical processes involved in learning and memory has brought much to light during the past several years, and provides important foundational lessons for designing and managing learning experiences. Here we describe the basic brain processes, or building blocks, that are essential to learning.

The good news is that most theories and practices around learning are supported by this new research, and many theories have been extended and refined as a result. The news is mixed, however, in that many practices, including lecture and lock-step teaching, are not adequate. We also now know that our brains are in large measure shaped by the technologies we use – thus the internet is having an enormous impact on the modern brain.

Neuroplasticity

Conventional wisdom has said that neurons within the brain connect into circuits during childhood and, as we reach maturity, the circuitry becomes fixed – hard-wired and changeless, except through deterioration. We now know that this conception is false.

In fact, the brain can undergo rapid and extensive restructuring at the cellular level – neuroplasticity (Carr, 2010). Virtually all of our neural circuits, whether they involve feeling, seeing, moving, thinking, learning, perceiving, or remembering are subject to change. The brain is not just plastic; it is very plastic. Plasticity does diminish with age, but it never goes away.

Infancy: Chaos to order

The newborn brain is a chaotic place. A baby's early months are marked by the growth and development of brain components, beginning at the back of the brain and progressing forward (LeWinn et al., 2017), with individual neurons firing haphazardly, much like a crowd in which everyone is talking at once (Hensch, 2016). Different components are disconnected from each other and some, especially the temporal association cortex where recognition memories are stored, remain in an extended state of dormancy, disconnected until the brain brings them "online" as needed (Chomiak et al., 2016). In essence, the brain does not store memories until it has sufficiently matured.

Order begins during decisive critical periods in the child's life, lasting months to years, during which the brain's molecular structures are laid down. Most occur during infancy but some come as late as the teenage and young adult years. For example, the axon bundle running between the posterior temporal lobe and the ventromedial and dorsomedial cortices matures between the ages of three and four, during which time children begin to understand what other people think and how they may see the world differently (Weisman et al., 2017). This ability to attribute mental states—beliefs, intents, desires, pretending, knowledge, etc.—to oneself and others, and to understand that others have beliefs, desires, intentions, and perspectives different from our own, is referred to as theory of mind.

If these critical periods come too early or too late, there are serious consequences. These periods initiate when sensory pathways mature sufficiently to interact with the outside world, and chemical switches cause GABA-producing neurons, which send inhibitory signals, to quiet the storm of activity by extending their axons throughout the relevant pathways and increasing their GABA emission. Cells that fire in unison become wired together while those that fire out of sync get pruned. "Neurons that fire together wire together." This continues until an excitatory-inhibitory balance is reached, allowing for clear neuronal communication. The failure to achieve this excitatory-inhibitory balance has recently been implicated in autism (Selimbeyoglu et al., 2017). Critical periods have thus far been identified for vision, hearing, memory, language, self-identity, susceptibility to depression, and various forms of social interaction.
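The "fire together, wire together" rule and the pruning that follows can be sketched as a deterministic toy model. This is a didactic illustration only, not a biological simulation; the spike trains, learning rate, and pruning threshold are invented for the example:

```python
def hebbian_update(w, pre_spikes, post_spikes, lr=0.1):
    """Hebbian rule: the connection weight grows on every time window
    in which pre- and post-synaptic neurons spike together."""
    for pre, post in zip(pre_spikes, post_spikes):
        w += lr * pre * post
    return w

# Spike trains over ten time windows (1 = fired, 0 = silent).
driver      = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]
in_sync     = driver[:]                        # fires in unison with the driver
out_of_sync = [0, 1, 0, 0, 1, 0, 1, 0, 0, 1]  # fires only in the gaps

w_sync  = hebbian_update(0.0, driver, in_sync)      # strengthens
w_async = hebbian_update(0.0, driver, out_of_sync)  # never strengthens

# Pruning: connections below threshold are removed ("get pruned").
PRUNE_THRESHOLD = 0.3
survivors = [w for w in (w_sync, w_async) if w >= PRUNE_THRESHOLD]
```

Only the synchronized connection survives pruning, mirroring how an excitatory-inhibitory balance leaves behind circuits whose cells fired in unison.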

Adult neuroplasticity

1. Brain regions can take on new functions when adjacent areas are damaged 2. Neuron migration 3. Typing changed Nietzsche's writing

Old neuronal connections are always fading and new ones are forming, and brand new cells are always being created. While different regions of the brain are associated with different mental and motor functions, the cells never form a permanent structure. Thus, the brain is capable of reprogramming itself by changing synaptic communication, altering its functions (Saez et al., 2016). An excellent example of neuroplasticity was demonstrated in London taxi drivers, who were shown to have increased grey matter in the back part of their hippocampus – related to spatial representation of the environment – when compared to those who weren't professional drivers. This capacity is also evident in trauma that damages the nervous system. When a sighted person loses their sight, the occipital lobe takes on some of the functions of hearing and touch, making those senses more sensitive and accurate (Figure 1 above). Stroke victims learn to walk again after the primary motor cortex is damaged and other areas assume its functions (Frost et al., 2003; Ward, 2005).

Remembering is based on neuronal plasticity between and within neurons. As memories form, the connections between groups of neurons strengthen due to chemical and physical changes in member neurons. Repeated experience further strengthens the connection. Chemical changes also make possible a second form of plasticity within the neurons, making memory formation quicker than would otherwise occur. See #Physical basis below.

Forgetting is important to learning new things that differ from existing memory. Forgetting occurs for a number of reasons. First, there is an adaptive forgetting mechanism in which the act of recalling some memories suppresses other, competing memories and reduces their likelihood of future recall (Wimber et al., 2015). "The idea that the very act of remembering can cause forgetting is surprising, and could tell us more about selective memory and even self deception."

Next, there is life-long neurogenesis within the hippocampus (see below), serving to remodel hippocampal circuits and degrade short-term memories (Akers, et al., 2014). We call this interference, where one memory displaces another. Interference can occur in two directions: new memories can displace old ones, as we might expect. However, newly recalled (old) memories from long-term memory can also displace newly acquired ones. It becomes a matter of which memory was recalled most recently.

Forgetting also involves weakening old memory traces through disuse (fading) so that others can form (Mills et al., 2014). "In certain situations, you have to be able to 'forget' to learn." Synapses that are too strong or "sticky" hinder cognitive flexibility, as when we add to or correct our understanding of concepts, tasks, etc. Excessive levels of β-catenin (beta-catenin), a protein that is part of the molecular glue that holds synapses together, are associated with Alzheimer's and Huntington's diseases, and have been demonstrated to block new learning in mice.

Strengthening. In addition to connections forming and fading, they can also grow stronger. As experience is repeated, synaptic signal strength grows, additional dendrites grow, and new synaptic terminals are formed on dendrites and axons. This is the essence of expertise, where the brain fine tunes itself. Even the most difficult routines are carried out ever more quickly, smoothly, and effectively. This efficiency manifests itself as skilled habits of mind and action. All the while, unused skills fade away as these circuits are pruned. The survival of the busiest (Carr, 2010).

Neurogenesis. Another form of brain plasticity is seen in the production and storage of new brain cells. Under normal conditions, progenitor cells are produced in the hippocampus and migrate to their destination using glial cell pathways (Figure 2), then undergo a process of differentiation into neurons and glial cells. Only about one-third of these cells make it to their destination while the others die off en route. During times of stress, the hippocampus produces fewer progenitor cells and more stem cells, which appear to be “stockpiled” and then released during better times (Dranovsky, 2011). Chronic stress also reduces the "adhesion" between neurons (a significantly reduced number of nectin-3 cell adhesion protein molecules), resulting in reduced sociability and cognitive function (Sandi et al., 2014). When progenitor cells differentiate into neurons, they quickly grow a large number of dendrites reaching out to other neurons, most of which are then pruned into lasting connections (Tiago et al., 2016). "The more dendrites a neuron starts with, the more flexibility it has to prune back exactly the right branches." More moderate neurogenesis also takes place within the adult amygdala (Jhaver et al., 2017) and olfactory bulb (Ming & Song, 2005).

To complicate things a bit more, the brain’s plasticity allows it to be reconstructed according to the tools we use, which can also be for good or bad. Friends of philosopher Friedrich Nietzsche detected important changes in his writing when he moved from pen to typewriter in 1882 (Figure 3). “There was a new forcefulness to it – tighter and more telegraphic.” Nietzsche responded in kind: “You’re right. Our writing equipment takes part in the forming of our thought.” One hundred thirty years later, editor and teacher Robert Bliwise (2013) chimes in, “Computer-powered writing pushes me to be a better writer – to add, subtract, embellish, tone down, move around, and otherwise polish my prose.” Text itself brought about more disciplined and linear thinking as it developed over the millennia. The effects of technology – and text is a technology – do not occur at the level of opinions or concepts. Rather, they alter patterns of perception and thought steadily and without awareness (McLuhan, 2003). In the long run, the content is less important than the medium in influencing how we think and act. Consider how maps help form our perceptions of the physical world, or how clocks drive our time perception.

For a modern take on the Nietzsche phenomenon, watch 5 Crazy Ways Social Media Is Changing Your Brain Right Now

We now find ourselves in the age of the network. Internet, intranets, data networks, social networks. There is strong and growing evidence that our brains are reorganizing around these tools with real consequences including quicker decision-making, better eye-hand coordination, and better multi-tasking, but also the propensity to scan rather than read, shortened attention spans, and the desire for constant input. Meanwhile, deep reading and quiet contemplation are on the wane. And so it has gone throughout history. Every medium develops some skills at the expense of others.

Refer to #Sleep, below, for more on neuroplasticity.

Memory

Memory is the basis of learning; without it we would not be able to change our thoughts or behaviors. We could not adapt to change, nor create change. We would be stuck in the forever present without past or future. Luckily, we do have memory and lots of it. Here we see the whole picture. (Note that there are multiple models of memory, especially working memory. We like this one because it makes the fewest unproven assumptions.)

MemoryTable.png
4. Memory in all its forms

Temporary memory

Sensory memory is the very brief impression left by our sense organs on the brain: sight, hearing, smell, touch, and taste. Unless we turn our attention to them, as when looking at those around us or tasting that first bite of chocolate, they disappear almost instantaneously – about two-tenths of a second. The impression is so short that many consider sensory memory as part of perception.

Working memory can be thought of as containing three layers, or circuits, including short-term memory. The core central executive holds verbal and/or nonverbal (visuo-spatial) information in mind and mentally works with it. The executive is critical for making sense of anything that unfolds over time, such as language, conversations, explanations, directions, and considering alternatives (Diamond, 2013). Consider holding the numerous clauses in mind as you read the opening statement to the U.S. Declaration of Independence:

"When in the course of human events it becomes necessary for one people [to dissolve the political bands [which have connected them with another]] and [to assume among the powers of the earth, the separate and equal station [to which the laws of Nature and of Nature’s God entitle them]], a decent respect to the opinions of mankind requires [that they should declare the causes [which impel them to the separation]]."

An important component, or "helper application", is inhibitory control: resisting temptations and distractions, both internal and environmental, preventing errors of impulsivity. Types of inhibitory control include cognitive inhibition (e.g., resisting jumping to conclusions), response inhibition (resisting the urge to interrupt), and behavioral inhibition (e.g., stopping yourself when, about to cross the street, the light suddenly changes). This core also retrieves items from the other layers and updates itself as new input is received and aggregated. Working memory is centered in the lateral prefrontal cortex, just anterior to the motor cortex, with participation from the anterior cingulate cortex and the posterior parietal cortex (Minamoto et al., 2017). Inhibitory control also includes the subthalamic nucleus, part of the basal nuclei (Diamond, 2013).

ExecutiveFunction.png
5. Central executive processes and individual variation

Figure 5 illustrates the results of tests of executive performance, conducted by Nomi et al. (2016) with 189 subjects. Multiple cognitive tests were performed, and the three above were found significant to overall performance. We see a greater range of scores among subjects for processing speed and working memory capacity than cognitive flexibility. The color bands represent standard deviations of scores. This distribution of scores tells us that humans vary considerably in their executive functioning, with approximately 16% above and below the normal range.

As emphasized by Diamond (2013), executive functions often serve as an early-warning system for personal health. They are the first to suffer, and suffer disproportionately, when something is wrong: stress, anxiety, lack of sleep, physical unfitness, sadness, and loneliness.

6. Cognitive flexibility: How easy is it to switch from stating the words to stating the colors, ignoring the words? (credit: John Stroop, 1935)

Processing speed - the number of cognitive operations performed within a timeframe (e.g., one minute).

Cognitive flexibility - a complex of abilities that allow for near- and long-term adaptation:

  • Task switching - the ease or speed at which the brain switches operational modes (e.g., math functions to reading comprehension; telephone to computer)
  • Changing perceptions spatially - how an object or person would look from a different angle
  • Changing perspectives interpersonally - understanding others' points of view and, perhaps, adapting them as our own, or admitting we were wrong and someone else was right
  • Thinking outside the box - tackling a problem from a fresh perspective; going beyond ready answers
  • Adjusting - Accepting and acting on changed demands or priorities
  • Capitalizing - taking advantage of sudden, unexpected opportunities
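The Stroop task in Figure 6 can be sketched as a toy stimulus generator. Congruent trials print a color word in its own ink; incongruent trials force the reader to inhibit the word and name the ink, which taxes both task switching and inhibitory control. The four-color set and the alternation scheme are illustrative assumptions:

```python
import random

COLORS = ["red", "green", "blue", "yellow"]

def make_stroop_trial(congruent, rng):
    """One Stroop stimulus: a color word and the ink color it is shown in.
    Incongruent trials deliberately mismatch word and ink."""
    word = rng.choice(COLORS)
    if congruent:
        ink = word
    else:
        ink = rng.choice([c for c in COLORS if c != word])
    return word, ink

# Alternate congruent and incongruent trials; seed for reproducibility.
rng = random.Random(42)
trials = [make_stroop_trial(congruent=(i % 2 == 0), rng=rng) for i in range(10)]
```

In a real experiment, the difference in response time between the two trial types indexes cognitive flexibility.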

Working memory capacity - the number of separate pieces of information that can be held simultaneously in the first two layers of working memory. Capacity is directly related to attentional control and general intelligence (Minamoto et al., 2017).

Learning is reinforced (made more permanent) through a complex interplay of “prediction success and error” occurring when outcomes confirm or deviate from expectations. This process, known as "reward prediction error", involves dopamine and GABA neurons centered in the cingulate cortex but recruits up to forty different regions. Dopamine neurons, carrying the feel-good neurotransmitter, fire in response to correct and especially “corrected” expectations, reinforcing what has been learned.
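The reward-prediction-error idea can be sketched with a minimal update rule in the spirit of Rescorla–Wagner learning. This is an abstraction, not a model of the actual dopamine circuitry; the learning rate and reward values are arbitrary:

```python
def rpe_update(value, reward, alpha=0.2):
    """One learning step: the teaching signal is the reward prediction
    error, i.e., actual reward minus expected reward. A positive error
    ('better than expected') nudges the expectation upward."""
    error = reward - value
    return value + alpha * error, error

# Repeated confirming outcomes: the expectation converges on the reward,
# and the prediction error (the dopamine-like signal) shrinks toward zero.
value = 0.0
for _ in range(20):
    value, error = rpe_update(value, reward=1.0)
```

Once the outcome is fully expected, the error signal is near zero and little further reinforcement occurs, which is why "corrected" expectations drive the strongest learning.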

The second working memory function holds three or more active elements available for immediate retrieval. The neural basis of this storage lies in ensembles of neurons in the medial frontal and medial temporal lobes (Kamiński et al., 2017) repeatedly firing in brief bursts within the gamma range, with the most rapid rates occurring at the beginning of a task, when the memory is encoded, and again at the end when the memory is recalled (Lundqvist, et al., 2016). Different items involve different groups of neurons, each with its own rhythm.

The third, more diffuse circuit referred to as short-term memory holds passive items, "indexed" by the hippocampus for later retrieval or storage. As memories move from the concentrated areas of working memory into the more diffuse short-term memory, learning is evolving toward greater permanence. Thus, the "first pass" in working memory quickly fades, whereas additional processing like rehearsal, self-testing (active recall), and rumination (e.g., critical thinking) involving the more diffuse cells in short-term memory fades more slowly. Although apparently rooted in the lateral prefrontal cortex, working and short-term memory are not located in a specific structure, but are rather functional networks of brain structures working in concert.

Working memory is the primary driver of cognition, and many scientists believe it to be the seat of consciousness. See below.

Long-term memory

Explicit ("knowledge") long-term memory is dependent on the hippocampus, which stores memories for as long as two weeks before moving them to long-term storage in various parts of the cerebral cortex. Implicit memory ("skill"), on the other hand, is attained through a different, largely unconscious, process. Formal skills training almost always involves both.

Explicit, or declarative memory is memory of facts and events that we can consciously experience and recall. Think of explicit memory as “knowing what” that comes in two forms, episodic and semantic. Episodic memory is based in experience and all that is involved in it – the sights, sounds, tastes, touch, and emotions. They are rich in detail and follow a timeline. “We went to the movies and then . . .” We tend to see ourselves as actors in an event. Semantic memory, on the other hand, is a more structured record of facts, meaning, concepts, and knowledge about the outside world – factual knowledge rather than experienced knowing within its context.

Contrary to earlier belief, episodic memory eventually transitions to semantic memory through generalization. Neurons in the medial PFC develop codes to help store relevant, general information from multiple experiences while, over time, losing the more irrelevant, minor details unique to each experience. The findings provide new insight into how the brain collects and stores useful knowledge about the world that can be adapted and applied to new experiences. "Memories of recent experiences are rich in incidental detail but, with time, the brain is thought to extract important information that is common across various past experiences" (Morrissey et al., 2017). "The medial prefrontal cortex activity thus transitions from representing incidental details to representing abstract relationships. This may relate to how the brain extracts commonality across experiences."

Individuals tend toward one or the other based on differences of connectivity within the brain (Sheldon et al., 2016). Individuals who easily remember facts and recall episodes from life as matters of fact show higher connectivity between the medial temporal lobe and the left inferior and right superior cortices. They tend to integrate higher-order facts and information when recalling memories or thinking about events. Those with strong episodic memory, having vivid sensory recall involving stories and emotion – virtually re-experiencing events, show higher connectivity between the medial temporal lobe and the parietal and occipital lobes at the back of the cortex. Remember that these structural differences result in tendencies and not strict behavioral dichotomies.

Implicit, or procedural memory is the unconscious memory of skills and how to do things, particularly the use of objects and movements of the body, such as tying a shoelace, playing a guitar or riding a bike. These memories are typically acquired through repetition and practice, and are composed of automatic sensorimotor behaviors that are so deeply embedded that we become unaware of them. Once learned, these "body memories" allow us to carry out ordinary motor actions more or less automatically. This type of memory is referred to as implicit because previous experiences aid in the performance of a task without conscious awareness of these previous experiences.

Implicit memories do not appear to involve the hippocampus at all, and are encoded and stored by the cerebellum, putamen, caudate nucleus and the motor cortex, all of which are involved in motor control. Learned skills such as riding a bike are stored in the putamen; instinctive actions such as grooming are stored in the caudate nucleus; and the cerebellum is involved with timing and coordination of body skills. Thus, without the medial temporal lobe, a person is still able to form new procedural memories, such as playing the piano, but cannot remember the events during which they happened or were learned.

Tidbits

  • Females generally possess better memories of all types than similarly aged males (Rentz et al., 2016).

  • Those with larger working memories tend to tire of experiences more quickly. "People with larger working memory capacities actually encode information more deeply. They remember more details about the things they've experienced, and that leads them to feel like they've had it more. That feeling then leads to the large capacity people getting tired of experiences faster" (Nelson & Redden, 2017).

  • When asked about memorable things and events in their lives, most people best remember those from the ages of 15 to 25 (Moulin et al., 2016). "It doesn't matter if it's current affairs, sporting or public events. It can be Oscar winners, hit records, books or personal memories." Researchers call this the reminiscence bump – in reference to the shape it gives when we plot a curve of memories over a person's lifespan. There are multiple theories about why this happens, but the evidence points to the fact that our long-term identities are shaped during this important developmental period. See critical periods above.

  • Memory loss, unfortunately, is a well-documented consequence of the late-stage aging process.

Physical basis

The physical basis of learning and memory centers on the synapse - that minuscule gap between signal carrying axons and signal receiving dendrites. Short-term memory within the hippocampus occurs where proteins make the cell more receptive (TARP γ-8) or less receptive (CKAMP44) to input from other neurons, and also impact the number of receptors available for communication. These changes can last hours to days or longer (Khodesevich et al., 2014). A long-term memory mechanism within the cortex occurs with the small bumps, called spines, on signal-receiving dendrites. When neurotransmitters are released from axons and absorbed by dendritic spines, the spines themselves release a protein (BDNF) into the synapse which is then captured by BDNF receptors on the same spine - a process called autocrine signaling - causing the spine to grow (Harward et al., 2016). This autocrine process leads to release of three additional proteins from the spine, two of which spread to other spines on the dendrite, further reinforcing the memory (Hedrick et al., 2016). Thus long-term memory traces are physically manifested in the cortex.

Individual memories are stored as engrams, groups of neurons with an increased strength of connection, as described above. New thinking (Titley, 2017) refers to this as "synaptic plasticity", based on repeated conditioning - that is, we have to experience something several times to learn and form a memory. The chemical changes within the engram also make individual neurons more sensitive to each other, making possible a second form of plasticity and memory formation, "intrinsic plasticity". This type of memory is a nearly instantaneous mechanism with a lower threshold, taking fewer, or even a single, experience to initiate memory formation. A useful analogy is the electrical wiring and a dimmer switch for a specific light in your home. Synaptic plasticity forms the wiring for a memory while intrinsic plasticity acts as the dimmer switch varying the expression of the memory. Intrinsic plasticity may serve a range of functions, some of which complement synaptic plasticity, whereas others point toward roles independent from synaptic plasticity.

The CREB gene, expressed as the CREB protein, orchestrates both short-term and long-term memory formation by temporally linking individual memories (Silva, 2017). Memories formed closer in time (within the same day) are more likely to be linked than when they are separated by longer periods. The effect is that a second memory triggered by a previous memory is likely to be from the same time period. Generally, memories much more than a day apart remain unlinked. Memories couple when they are initially stored in overlapping populations of neurons. Remembering one is more likely to cause memories of the others. There are higher concentrations of CREB in the young, which decrease with age, helping to explain why so many of our early memories remain intact over the years.
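The temporal-linking rule above can be sketched as a toy predicate. The one-day window follows the text; the timestamps and the binary linked/unlinked cutoff are illustrative simplifications of what is really a graded probability:

```python
from datetime import datetime, timedelta

LINK_WINDOW = timedelta(days=1)  # memories within about a day tend to link

def linked(t1, t2, window=LINK_WINDOW):
    """Toy predicate: two memories are 'linked' (stored in overlapping
    neuron populations) when they were formed close together in time."""
    return abs(t1 - t2) <= window

morning   = datetime(2024, 5, 1, 9, 0)
evening   = datetime(2024, 5, 1, 20, 0)
next_week = datetime(2024, 5, 8, 9, 0)

# Same-day memories link; memories a week apart do not.
```

Recalling the morning memory would then be more likely to trigger the evening one than the week-later one.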

Temporal memories (e.g., chronologically ordered) are characterized by bursts of gamma waves ordered along a slower theta oscillation so that the gamma activity for object 1 precedes that for object 2 and so on (Heussen et al., 2016).

Interesting new research suggests a physical mechanism for long-term memory storage. For every memory trace, there is an equal but opposite "anti-memory" trace, an exact opposite pattern of electrical activity (Barron et al., 2016). This precise mirroring of the excitation of the new memory with its inhibitory anti-memory prevents a runaway storm of brain activity, ensuring that the system stays in balance. While the memory is still present, the activity it caused has been subdued. In this way, anti-memories work to silence the original memory without erasing it. Anti-memories are critical to prevent a potentially dangerous build-up of electrical excitation in the brain, something that could lead to epileptic-like seizures. It's thought anti-memories may also play an important role in stopping memories from spontaneously activating each other, which would lead to confusion and severely disordered thought processes.

Emotions: Valence and arousal

Emotions and feelings, more accurately termed affect (ahh-fect, not uh-fect), are centered in the limbic system and arise out of basic survival instincts – approach and avoidance. As such, affect is more basic than emotion, with only two dimensions, positive and negative. Science refers to this as valence. Specific emotions, it turns out, are more a cognitive phenomenon in which the cortex evaluates and labels signals coming from the amygdala in the context of physical and social circumstances, one's internal state, and relevant memories that may be triggered (Duncan & Barrett, 2008; Barrett, 2006). So, for example, if the weather is cold and cloudy and I am feeling down, news that my spouse was fired from her job may be sad or depressing. On the other hand, if the weather is hot and muggy and I am feeling tense, I may become angry and outraged at the same news. In both cases, the amygdala has sent the same negatively valenced message, but the brain mixes the message with other concurrent phenomena and memories and may come to widely varying conclusions – emotions.

A bi-directional valence makes good sense when we think of the life-or-death consequences for ancient humans in hostile environments – determining what is safe and what is dangerous. It also makes sense when we look at the physical manifestations of affect, which can be measured in three ways (Voosen, 2013):

  • Bodily responses such as heartbeat, sweating, blinking, and the brain's electrical firing. These measures can't tell us the emotion being experienced, not even the difference between positive and negative valence - only arousal. Bodily responses are very difficult to consciously manipulate.
  • Behavior, what we do in a situation. This measure can certainly tell us the direction of valence, basically approach and reward-seeking versus fight or flight. Behavioral responses can be subject to conscious control with effort and training.
  • Language, expressing your emotion; putting a label on it. This measure is filtered through consciousness, evaluation, motivation, and language. As such, it is readily subject to bias and deception.
7. Valence and arousal create affect

Valence, then, is a phenomenon describing the direction and extent of positive and negative affect. We realize, however, that valence by itself doesn't quite capture the entire experience. Human differences tell us that the intensity of experience matters as much as direction. Strong emotions exert much more impact on the body and are much more motivating than weak ones. The strength of valence is referred to as arousal, from placid to highly agitated. We see the interaction between the two dimensions in Figure 7. The words in quotes are examples of words that elicit the different combinations of valence and arousal. Try saying the words to yourself and see how and to what extent they impact your feelings. Because you're an individual, results will vary.
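The two-dimensional space of Figure 7 can be sketched as a simple quadrant lookup. The example emotion labels are illustrative assumptions (Figure 7 itself is not reproduced here), and the -1 to +1 scaling is an arbitrary convention:

```python
def affect_quadrant(valence, arousal):
    """Map a (valence, arousal) pair to its quadrant in the affect space.
    valence runs from -1 (negative) to +1 (positive); arousal runs from
    -1 (placid) to +1 (highly agitated)."""
    if valence >= 0 and arousal >= 0:
        return "high-arousal positive (e.g., excited)"
    if valence >= 0:
        return "low-arousal positive (e.g., content)"
    if arousal >= 0:
        return "high-arousal negative (e.g., angry)"
    return "low-arousal negative (e.g., gloomy)"
```

The point of the two-axis model is that the same valence at different arousal levels yields very different felt experiences, as in the fired-spouse example above.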

In experiments, stimuli with negative valence or high arousal are associated with the fight or flight response. Stimuli with positive valence or low arousal are associated with approach tendencies. These two responses are separately initiated at the preconscious level and then integrated to evaluate the stimulus for further action (Citron, 2014).

Finally, it is well established that "bad" has more impact than "good" - bad emotions, bad parents, and bad feedback. It has more impact and is processed more thoroughly in the cortex and limbic system than good. Bad impressions and bad stereotypes are quicker to form and more resistant to change. Shermer (2016) suggests that these tendencies arise from loss aversion, in which, on average, losses hurt twice as much as gains feel good. We have a tendency to value what we have more than what we do not have. "A bird in the hand is worth two in the bush." Shermer ascribes this tendency to our evolutionary past, in which it paid to be highly risk-averse and highly sensitive to threats because the consequences were life and death. While the threat of physical harm or death is much less today than in the past, our physical brain remains highly attuned to physical as well as psychological threats.

Impact on attention and memory

Arousal and attention are intimately related. Focusing attention, our bodies become more erect and rigid - sitting up, thrusting the head forward, eyebrows raised. Pupils dilate. Cognitively, we perceive more details, process them faster, and remember them better.

It's probably no surprise that emotional stimuli, whether live, in movies and pictures, spoken or written language, are much more likely to gain our attention, even when they are not initially the focus of attention. They are also processed faster, through prioritized processing, and more accurately, with enhanced perceptual processing (Citron et al., 2014), and more likely to be remembered.

Negative stimuli elicit stronger arousal than positive ones. Subjects viewing a pastoral scene identify emotional elements such as snakes and spiders more quickly and accurately than neutral ones like flowers and mushrooms (Kensinger, 2014). Further, this focus on emotional elements crowds out perception of other elements – attention is narrowed to the emotional stimuli at the expense of others. We call this selective attention. Interestingly, not only are concurrent peripheral phenomena screened out, but also events following closely behind the arousing one – the attentional blink. We may see a man break a bottle over another man's head, but miss the fact that the offender has quickly exited the front door! Beyond the attentional blink is the phenomenon of emotional hangover, in which stimuli presented after an emotional event are better remembered than stimuli presented before it. This carry-over effect is said to last 20–30 minutes (Tambini et al., 2016).

A related emotional phenomenon is the carry-over effect from one situation to another (Barrett, 2006). Whether positive or negative, the phone call just before physics class will color our anticipation and attention at the learning event. It's only human. This impact occurs not only during attention and memory encoding, but during memory consolidation and recall as well. Affect permeates everything, and there is no escaping it.

Stimuli containing personal relevance are more likely to be remembered than neutral stimuli (Kensinger, 2014). Consolidation of these memories is much less prone to interference than that of neutral ones, so they are more likely to be retained over the long term. We are much more likely to remember information when we apply it to ourselves than when we remember it for meaning only – the so-called self-reference effect (Fossati, 2003). During learning, we are more likely to remember what we have learned when it is associated with reward (Gruber et al., 2016); the stronger the reward, the stronger the memory.

Stress is known to have an inverted U-shaped relationship with learning and memory: very low or high levels of stress impair, whereas moderate elevations facilitate, the acquisition and retention of learning and memories (Mateo, 2008). Consider stress as a continuum, from indifference to panic. Stress also interferes with the brain's reward system, reducing the response of serotonin and dopamine neurons (Weixin et al., 2017), and high levels have a major effect on our senses (Dinse et al., 2016). In times of stress, the stress hormone cortisol is produced by the adrenal glands and spread throughout the body via the bloodstream. Within the hippocampus, high levels of cortisol suppress the strengthening of synaptic connections – synaptic plasticity – and therefore the plasticity of the brain: its ability to learn.
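The inverted-U relationship can be sketched with a simple quadratic, in the spirit of the classic Yerkes-Dodson curve. The functional form and the optimum at 0.5 are illustrative assumptions, not parameters from the cited research:

```python
# Illustrative inverted-U: performance peaks at moderate stress and
# falls off toward either extreme (indifference or panic).

def performance(stress: float, optimum: float = 0.5) -> float:
    """Performance on a 0..1 scale for stress on a 0..1 scale."""
    return max(0.0, 1.0 - ((stress - optimum) / optimum) ** 2)

for s in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"stress={s:.2f} -> performance={performance(s):.2f}")
```

Running the loop shows performance rising from 0 at no stress to a peak at moderate stress, then collapsing again at the panic end of the continuum.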

Brain rhythms (cortical state fluctuations) naturally occur throughout our waking hours, with different locales cycling independently between active and inactive states (Engel et al., 2016). These changes can counter our conscious efforts to pay attention. "Sometimes you think you are paying attention, but you will still miss things." If the visual cortex, for example, is cycling into a passive state, we will miss visual details even though we may be concentrating intently. Energy conservation and waste removal are thought to be the underlying cause. "There is a metabolic cost associated with neurons firing all the time. The brain uses a lot of energy and may be giving the cells a chance to do the energetic equivalent of sitting down, allowing the brain to save energy ... and clear out metabolic waste."

Individual differences

Just as people differ in their focus on and sensitivity to pain, we also differ in our focus on and sensitivity to valence, and this difference seems to be a stable characteristic throughout life. “One person's potent stimulus is another person's background feature” (Barrett, 2006). Think of this sensitivity as emotional granularity, in which some make fine categorical distinctions between emotions while others do not. Individuals high in emotional granularity tend to make fine distinctions in an effort to communicate the precise valence and arousal level: “I'm a little peeved with you, but not angry.” Those low in emotional granularity tend to use broad labels to characterize their experience: “I'm upset with you.”

These differences are more than the labels we use; they manifest themselves physically in the form of sympathetic and parasympathetic activation – sweaty palms, rapid breathing, quickening heart. The usual result is that those with high emotional granularity find themselves on emotional roller coasters, with each new stimulus taking them to new highs and lows. An even temperament is more characteristic of those with lower emotional granularity.

Another aspect of emotional sensitivity lies in judging a stimulus as self-relevant or not. As you probably suspect, higher sensitivity is strongly correlated with the tendency to take everything personally – as personally relevant. Highly sensitive people place more importance on everything from the daily tasks of living to what other people say and how they look, and so are more prone to higher levels of arousal both when everything is going their way and when something goes awry. We can also note that those with high emotional granularity tend to have more self-esteem issues than the less sensitive.

Tidbit: People who chronically suppress their emotions are more likely to suffer from chronic fatigue syndrome (Rimes et al., 2016).

Physical basis

Incoming stimuli are routed through the thalamus to the relevant perceptual cortices and subcortical bodies. The insular and anterior cingulate cortices show activation even when the task requires only a minimal degree of processing depth – that is, for neutral stimuli.

Emotionally-laden stimuli evoke broader responses in the amygdala, anterior cingulate cortex, insular cortex, orbitofrontal cortex, hippocampus, and parietal cortex. The orbitofrontal and anterior cingulate cortices respond more to valence, whereas the amygdala and anterior insular cortex respond more to arousal. A significant cluster of activation extends from the right insular cortex to the superior temporal lobe in response to the interaction between valence and arousal.

The anterior insular cortex integrates physiological information from the body with cognitive and evaluative processes, which signals the posterior insular cortex, giving rise to emotional awareness.

In learning experiments, the level of activity in the prefrontal cortex and hippocampus during encoding predicts the likelihood that information will be later remembered.

Summary

Emotions – affect – arise out of basic survival instincts, and can best be understood as having two dimensions: valence and arousal. Generally, negative valence is more acutely experienced than positive. Personal relevance is perhaps the largest determinant of impact. The higher the valence and arousal, the more powerful its impact on the rest of the brain. Strongly felt, affect can commandeer attention and overpower cognition and logic. As such, it is a powerful influence in the learning process, memory encoding, consolidation, and retrieval.

People differ in their attention and sensitivity to affect. Those paying more attention to valence tend to express their feelings more accurately and fine-grained, whereas the less attuned speak in more general terms and less emphatically. Highly sensitive (more easily aroused) people assign more personal importance to their feelings, events, the world and other people. They are, then, more susceptible to stimuli than those less sensitive. This generally means more volatility, more physical symptoms, and more self-esteem issues - all of which can interfere with learning if not under control.

The unconscious and conscious mind


We now move to mental activity that depends on the interconnectivity between brain regions - the networked brain. It would be convenient to say that consciousness is more basic and therefore required for cognition, but this is not the case. Unconscious processes have considerably larger resources and more computing ability than consciousness (Allakhverdov, 2009). They are two different, but intertwined, functions of the fascinating brain.

Daniel Kahneman, in his famous book Thinking, Fast and Slow (2011), described the unconscious and conscious minds as System 1 and System 2. System 1 is automatic, fast, and efficient, and typically operates outside the realm of conscious awareness, making it devoid of deliberation and planning. System 1 processes require only a simple stimulus, such as the words on your screen, to connect effortlessly in your mind with their meaning. System 2 processes are just the opposite. They require purposeful and relatively slow engagement of the conscious mind. Consider the effort required to complete a tax return. Similar to Freud's id and superego, the automatic and controlled systems complement each other, but they can also conflict. We need to act without thinking to avoid that oncoming car, but we also need to check ourselves when we feel like screaming out at the other driver (Bargh, 2014).

The unconscious mind

8. Freud's conception of the unconscious mind.

The unconscious mind comprises mental processes that are inaccessible to conscious awareness but that influence thoughts, judgements, feelings, and behavior (Wilson, 2004). According to Freud, the unconscious mind is the primary source of human behavior. Like an iceberg, the most important part of the mind is the part you cannot see. As mentioned above, unconscious processes are amazingly quick and can direct behavior without our notice. These “shortcut processes” save us (individually) from having to figure out from scratch what is helpful and what is dangerous to our well-being. The unconscious evolved as a behavioral guidance system and as a source of adaptive and appropriate – and sometimes maladaptive – actional impulses (Bargh & Morsella, 2008).

The stronger the unconscious influence, the harder we have to work consciously to overcome it. This is especially so with habitual behaviors (eating in response to stress, compulsive masturbation) and thought patterns (cynicism, suspiciousness) since they have established themselves in our physical brain by way of established neural circuits.

Unconscious mental processes occur in relation to the environment and also to our inner thoughts and intentions.

Priming

Unconscious interaction with the environment most often occurs through priming – unnoticed factors exerting influence on perception and action (Bargh, 2014). Our perceived well being is influenced by sunny and cloudy days, the presence of food ads on TV increases consumption of available snacks, placing fruits and vegetables first in a cafeteria line increases their consumption, holding a cup of hot coffee increases the expression of “warm” thoughts.

Unconscious perception of others’ behavior increases imitation of their behaviors – smiling, use of gestures, even cooperation. Remembering past events, such as hurting a friend’s feelings, increases caring behavior with another person in the present. In contextual priming, the mere presence of certain events and people automatically activates our representations of them, and concomitantly, all of the internal information (goals, knowledge, affect) stored in those representations that is relevant to responding back. Past experience guides current thoughts and behavior to the extent that the present arouses similarities with the past, good and bad. Our present preferences are derived from those that served adaptive ends in the past.

Feelings – emotions – can exert an irresistible force on our thoughts and behaviors. Fear, anger, depression, and hopelessness all have a confounding effect on our thoughts, actions, and reactions. The stronger they are, the more impact they have. We must exert willful, conscious effort to put aside the unexplained and sometimes unwarranted feelings that arise.

Unconscious information processing

Whereas priming operates in a bottom-up fashion, with unconscious processes influencing conscious ones, unconscious information processing works in a top-down manner. In this instance, the unconscious is tasked by the executive function with information processing based on conscious intentions (attentional sensitization). Unconscious information is processed, and influences responding to a target stimulus, only to the extent that it matches current intentions (Kiefer, 2012). Attentional influences originating from the task at hand enhance task-relevant unconscious processes while reducing task-irrelevant ones. Consider how a question that was unsolvable the night before is suddenly answered after a night's sleep. Or you're trying to remember something, give up, and turn to other things, only to have it come to mind later on.

"Executive function influence over unconscious processing demonstrates the adaptability of the cognitive system in optimizing ongoing processing toward the pursuit of an intended goal. Research suggests that preemptive executive control of unconscious processes as proposed by the attentional sensitization model coordinates even the unconscious processing streams in congruency with higher-level task representations. This considerably reduces the effort of the cognitive control system to organize behavior because task-incongruent processes are dampened at relatively early stages. Hence, attentional sensitization of unconscious information processing contributes to an effective goal-related adaptation of our cognitive system" (Kiefer, 2012).

Consciousness

Consciousness is a surprisingly unsettled phenomenon in science. Is it attention? Awareness? Wakefulness? It's actually none of these, but something more basic. We haven't found a satisfactory definition, but there does seem to be some agreement about its components (Godwin et al., 2015; Tononi & Koch, 2008; Bargh & Morsella, 2008). First, there is general agreement on the qualities of conscious thought processes: they are intentional, controllable, serial in nature (one thing at a time), and accessible to awareness (i.e., verbally reportable). Second, consciousness requires the integration of activities among brain regions. "Consciousness requires the 'binding' together of a multitude of attributes within a single experience, as when we see a visual scene containing multiple objects and attributes that is nevertheless perceived as a unified whole" (Singer, 1999). Additionally, fMRI imaging reveals a breakdown of activity between brain regions as we fall asleep or undergo anesthesia. Third, there are levels of consciousness, ranging from hyper-attentiveness to the REM sleep of dreams, approximately corresponding with brainwave patterns. Fourth, it is a subjective experience not dependent on bodily or external reality; a sort of "summary" of the current subjective state or, as Allakhverdov (2009) asserts, "the things we understand; a human necessity to live in a rational world that is not rational." Consciousness is the realm of suppression, distortion, and rationalization that provides solace in a confusing world. Finally, there is a minimum of time required for consciousness to unfold, ranging from 400 milliseconds up to two seconds depending on the complexity of the required cognition. Reflex reactions such as pulling away from a hot stove occur in under 10 milliseconds and enter consciousness only after the fact. Go to Human Benchmark to measure your conscious reaction time.

Theories of the neural basis of consciousness fall generally into two camps: focal and global. Focal theories contend there are specific areas of the brain (e.g., the prefrontal and sensory cortices) that are critical for generating consciousness, while global theories argue consciousness arises from the myriad processes occurring during waking hours. Recent studies (Lahav et al., 2016; Tagliazucchi et al., 2016; Godwin et al., 2015; Bressler & Menon, 2010) using fMRI imaging give support to the global view:

"We know there are numerous brain networks that control distinct cognitive functions such as attention, language and control, with each node of a network densely interconnected with other nodes of the same network, but not with other networks. Consciousness appears to break down the modularity of these networks, as we observed a broad increase in functional connectivity between these networks with awareness." Consciousness appears to be an emergent property of how information that needs to be acted upon gets propagated throughout the brain. "We take for granted how unified our experience of the world is. We don't experience separate visual and auditory worlds, it's all integrated into a single conscious experience. This widespread cross-network communication makes sense as a mechanism by which consciousness gets integrated into that singular world." Godwin et al., 2015

Herzog et al. (2016) propose a two-stage model of information processing. First comes the unconscious stage: The brain processes specific features of objects (e.g. color or shape) and quickly analyzes them quasi-continuously. However, the model suggests that there is no perception of time during this unconscious processing. Even time features, such as duration or color change, are not consciously perceived during this period.

Then comes the conscious stage: unconscious processing is completed, and the brain simultaneously renders all the features conscious. This produces the final "picture", which the brain presents to our consciousness, making us aware of the stimulus. The whole process, from stimulus to conscious perception, takes at least 400 milliseconds – a considerable delay from a physiological point of view. "The reason is that the brain wants to give you the best, clearest information it can, and this demands a substantial amount of time," explains Herzog.

Loosely speaking, consciousness is generated by a "central" network structure with a high capacity for information integration, with the contribution of sub-networks that contain specific and segregated information without being part of the central structure. In other words, certain parts of the brain are more involved than others in the conscious complex of the brain, yet other connected parts still contribute, working quietly outside the conscious complex (Lahav et al., 2016).


We know that most brain processes occur without our awareness. It's obvious we are not aware of breathing, sweating, or the beating of our hearts. Moreover, the act of paying attention to, say, this computer screen necessarily blocks out conscious awareness of most of our environment (selective attention). Our physical reaction to an acrid smell or a loud noise actually occurs prior to our awareness. Perhaps most interesting is the fact that we can bring most of our brain functions into conscious awareness if we decide to pay attention to them. We can break that reaction to the loud noise into its constituent parts by learning the physiology of the startle response and then applying it to ourselves. On the other hand, the conscious mind can also fool us by creating plausible, though false, explanations. Consider retrospective sense-making: acting first and then trying to make the action sensible. For example, we might buy a new phone on impulse, lulled by its many features, even though the phone we have is only six months old and fully functional. Later we explain to ourselves and others that the old one doesn't do enough to meet our needs. We can also reject external facts about the world in favor of mystical explanations.

Cognition and metacognition

Cognition and metacognition constitute the mind brought to life - they are what the unconscious and conscious mind do: the processes they undertake, the brain in action.

Cognition

"Cogito ergo sum. I think, therefore I am.” Descartes

A truly universal experience, and a driver of all human creation, cognition is defined as a series of mental events, operations, or manipulations that alter the content of thought in some way. It is a stream of consciousness, imagination, mindfulness, believing, a chain of thought (Moseley et al., 2005). Cognition can be as simple as recognizing an object and as complex as Einstein creating his equations. Physically, it consists of the coordinated activity of nuclei organized into small networks connected by nodes into larger networks. Consider the range of cognitive tasks:

  • Perception
  • Attention
  • Comprehension
  • Reasoning
  • Memory encoding and retrieval
  • Boredom
  • Use of symbols, languages, rules
  • Production of sentences, ideas, etc.
  • Inference
  • Introspection
  • Conversation
  • Body language
  • Judgement
  • Decisions
  • Remembering
  • Pattern recognition
  • Empathy
  • Drawing conclusions

Internal dialogue

Inner speech is the silent dialogue we have with ourselves. It is the major conduit of cognition, the tool with which we evaluate and motivate our behavior, regulate our emotions, and think creatively (Fernyhough, 2017). As such it is key to motivation and self-regulation. Vygotsky (1978) suggested that this silent self-talk is an internalized version of conversations with others as we were developing as children. Indeed, fMRI imaging shows that both our language system and social cognition system are active during internal dialogue. Thus it appears that when we talk to ourselves, we are having an actual conversation - often between different points of view. This speech is not fully formed, however, flowing in condensed form and without full articulation (i.e., "shadows of words").

We can say with certainty that cognition is fundamental to learning. One of the most replicable findings in psychology is the positive manifold: that individual differences in cognitive abilities are universally positively correlated (Kievit, 2017). That is, on average, people who are better at a skill like reasoning are also better at a skill like vocabulary. Ability in one aspect of cognition reinforces ability in other aspects. This is referred to as the "g" factor in intelligence.

Salient features

Here we briefly discuss some of the most salient aspects of cognition.

Perception. Assigning meaning to sensory information by matching it with known information, referred to as pattern recognition. It is the active process of selecting, organizing, and interpreting incoming signals. Note that perception not only establishes cognitive awareness but also basic personal meaning (approach/avoidance).

Perception is not generally thought of as a skill, but it is of deepest importance to many aspects of human endeavor. As a skill, it is the ability to discriminate finer and finer distinctions between similar phenomena. As such, it is an essential element of expertise. Playing the violin requires that musicians perceive the difference between musical notes. Interpreting x-rays requires the radiologist to tease out subtle visual differences on a screen. Geller (2011) describes several characteristics of perceptual learning:

  • Optimization of attention
  • Increased specificity of discrimination
  • Increasing efficiency in identifying relevant features
  • Increasing efficiency in screening out irrelevant features

Attention. Concentrating mental resources and effort on a specific stimulus, internal or external. It can be voluntary or involuntary, conscious or implicit. Think of attention as a continuum, beginning with the involuntary orienting response, emanating from the hindbrain, in which a strong stimulus, like a boom or flash, captures our attention and raises our alertness and arousal. Next comes spotlight attention, selecting the focus of attention, filtering out other, distracting, stimuli, and preparing to encode information. This may or may not be intentional. Then we have sustained attention under conscious control. Here we are encoding incoming information and perhaps preparing a response. Sustained attention is under control of the executive function in working memory, centered in the lateral prefrontal cortex (Tanji & Hoshi, 2008).

Comprehension. Grasping the meaning, nature, or importance of a thing, person, or idea: understanding. Rather than a horizontal continuum, comprehension is more akin to a deep pool, having a surface and depth. Thus we speak of superficial understanding, in which we grasp only the narrowest surface characteristics; deep understanding, a profound realization of not only the facts but the history, subtleties, implications, relationships, and variations; and everything in between. Comprehension can be an active process in the moment, as during a conversation or while reading, or it can be accumulated over time as we build schemas, or mental models, that we use as a resource and information store. Inherent in both aspects of comprehension is depth of processing. Shallow processing focuses only on surface features like color, shape, and sound. Deeper processing can be seen in the following (Benchmark Education, 2016):

  • Interpreting and evaluating events, dialogue, ideas, and information
  • Connecting new information to what we already know
  • Adjusting current understanding to include new ideas or look at those ideas in a different way
  • Determining and remembering the most important points
  • Reading “between the lines” to understand underlying meanings and implications

Keep in mind that affect and consciousness can and do distort comprehension.

Reasoning. Thinking that is coherent and logical, maintaining internal consistency and following the rules of logic. This type of cognition is guided by tools like deductive and inductive reasoning, critical thinking, logic, evidence, and argumentation. Consider four types of reasoning:

  • Spatial reasoning, used by artists and engineers for example, is the capacity to think about objects in three dimensions, and perhaps four or more, and to draw conclusions about those objects from limited information.
  • Logical reasoning uses rules of logic such as deductive reasoning (applying general principles to specific instances; hypothesis testing), inductive reasoning (drawing general principles from specific instances; theory building), and argumentation (giving reasons to support or negate a proposition). Bayes's theorem is often held up as the epitome of logical reasoning.
  • Numerical reasoning is the ability to interpret, analyze, compute, and draw logical conclusions based on numerical data: accounting, mathematics, and statistics.
  • Verbal reasoning is the ability to analyze and evaluate spoken and written material, synthesize information obtained from it, recognize and analyze relationships among component words and concepts.
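Since Bayes's theorem is held up above as the epitome of logical reasoning, a short worked example may help. The numbers (a 1% base rate, a test with 90% sensitivity and 95% specificity) are invented for illustration:

```python
# Bayes's theorem: update a prior probability in light of evidence.
# Here: what is P(condition | positive test)?

def posterior(prior: float, sensitivity: float, specificity: float) -> float:
    """P(condition | positive test) via Bayes's theorem."""
    true_pos = sensitivity * prior               # P(+ and condition)
    false_pos = (1 - specificity) * (1 - prior)  # P(+ and no condition)
    return true_pos / (true_pos + false_pos)

p = posterior(prior=0.01, sensitivity=0.90, specificity=0.95)
print(f"{p:.3f}")  # 0.154 - a positive test is far from conclusive
```

The counterintuitive result (a positive test means only about a 15% chance of having the condition) shows why following the rules of logic takes deliberate effort.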

Decision making is the process of identifying and choosing alternatives based on information and the values and preferences of the decision-maker. We make hundreds of decisions each day, from the mundane ("Grapes or apples?") to the profound ("Should I quit my job?"). Inherent in each of those decisions is the backdrop of interests, biases, mood, personal history, action orientation; essentially our entire history. Contrary to the wish of economists, humans are not inherently rational - in terms of following the rules of reason. Humans can make rational decisions based on reasoning, but it takes effort. Refer to the Physical brain for a description of decision making on the physical level.

Production. Generating original thoughts and combining existing ideas to produce new ones, observed in creativity, imagination, brainstorming, generating alternatives, producing proposals, problem-solving, conceptualizing, provocation, managing change, and making connections (Moseley et al., 2005).

Productive thinking involves both creative and critical thinking. On the creative side, we see things anew, plan what to do, imagine situations, make predictions, generate new perspectives, conceive new combinations, synthesize seemingly unrelated concepts. However, without the accompanying critical side, we fail to test our creations against the real world. Analysis, evaluation, problem-solving, considering options, considering opinions, making judgements, making discriminations all serve to test creative ideas for their application, truthfulness, completeness, consistency, and implications for the larger world.

It is widely accepted that the two aspects of productive thinking should be kept separate. Creative, or divergent, thinking is governed by a lack of restrictions and evaluation – piggybacking on ideas, stream-of-consciousness production of ideas. Critical, or convergent, thinking considers, compares, evaluates, rank-orders, eliminates, combines, or otherwise reduces the number of choices to one.

“Remembering is not the re-excitation of innumerable fixed, lifeless and fragmentary traces. It is an imaginative reconstruction.” Frederic Bartlett, 1932

9. Memory reconstruction is a creative process

Remembering. As Dr. Bartlett foretold many years ago, we now know that memories are rebuilt every time they are accessed (Figure 9). There is no permanent memory per se, but rather memory reconsolidation that comes with a natural updating mechanism to integrate the current state of the individual - less like a movie; more like a play. For example, several hundred 9/11 witnesses were interviewed shortly after the tragedy, and annually thereafter for four years. Within one year, 37 percent of the details had changed, and 50 percent changed within three years (Hirst, Phelps, Budson, et al., 2009). Clearly, memories are not permanent, although there may be permanent aspects (towers collapsing). Recent research (Nakamura & Sauvage, 2015) tells us that the same hippocampal cells that encode memories also retrieve them. Additionally, these cells are essential for distinguishing similar or overlapping memories for all senses (Suthana, et al., 2015). We also know that recalled memories are vulnerable to interference in the same way that recent memories are until they have been reconsolidated in long-term memory (Gluck et al., 2014). Controlling the conditions under which memories are recalled can also change their content. Horrifying experiences can be largely neutralized if recalled under optimal conditions, and can even be erased by an injection (CaMKII-alpha inhibitor).

Metacognition

Metacognition is the introspective ability to monitor, regulate, and sequence one's own mental processes (McCurdy et al., 2013). Thinking about our thinking. It is an essential component of mental and behavioral strategies; deciding our current strategy is not working and switching to another, for example. As such, it is a key component of effective self-regulation. We can break metacognition into two separate processes (Brick, et al., 2015):

  • Metacognitive strategies such as planning for future performance and evaluating past performance.
  • Metacognitive experiences, allowing for concurrent, or "online" monitoring during task performance, especially subjective judgments of feelings, "knowingness", and comparing ourselves to others during competition.

Metacognition has been measured in terms of learning and memory, decision making, perceptual performance, and athletic performance. The following measures can be used to compare subjective judgments with actual performance, alongside the metacognitive strategies typical of each stage (modified from Brick et al., 2015; Fleming, 2012).

  • Prospective – Measures: judgement of capabilities, feeling of knowing, estimates of required effort and level of difficulty. Strategies: plan, study, practice, self-test, mentally prepare.
  • Concurrent – Measures: feelings of knowing, familiarity, confidence, difficulty, fatigue. Strategies: monitor internal sensory signals, monitor external factors (including others' performance in competition), manage distractions.
  • Retrospective – Measures: confidence, wagering (willingness to bet you are correct), judgement of the level of difficulty. Strategies: review and evaluate; judge performance level, capabilities, and effort.

Accurate self-knowledge is not a sure thing, however. Many experiments tell us that confabulation is quite common. For example, two studies asked subjects to make judgments about facial attractiveness and supermarket goods (need source). Unknown to the subjects, the experimenters reversed the original decisions and then asked subjects to explain them. Most remained unaware of the switch and went on to convincingly explain their (reversed) decisions. Other studies have shown that assessments of one's own attributes are poorer than assessments of others' attributes. Despite these tendencies, we know that accurate self-assessment is possible, especially when assessing specific performances - less so when assessing general capabilities. The ability to accurately predict and judge performance increases as task performance itself increases, a result of accumulating evidence and more accurate, fine-tuned judgement.

See Metacognitive activities in the Learning activities article for further discussion.

Physical basis

Within the brain, the lateral prefrontal cortex (PFC) is involved in sustained cognitive control, while the medial PFC works with it to adjust control dynamically according to changing task demands, reflected in the level of conflict elicited during performance. "Conflict control loop" theory hypothesizes that during dynamic control, the anterior cingulate cortex detects conflict and signals lateral prefrontal areas to more strongly represent context, supporting higher levels of control. The conflict control loop is rooted in the dorsal anterior cingulate cortex and reaches to the medial and lateral prefrontal cortices, the insular and parietal cortices, and the basal nuclei.

The anterior cingulate cortex is also the seat of “prediction error learning,” which arises out of conflict. It occurs when outcomes (of choices, decisions, performance) match or deviate from our expectations based on previous learning. This type of learning involves an interplay of dopamine and GABA neurons centered in the anterior cingulate cortex but involving up to forty different locations. It performs both “error detection” and “error correction” in the form of identifying and selecting more appropriate responses (Holroyd & Coles, 2002). Dopamine neurons, carrying the feel-good neurotransmitter, fire in response to correct and especially “corrected” responses, reinforcing what has been learned.
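The error-detection-and-correction cycle described above is often modeled computationally as a "delta rule," in which an expectation is nudged toward each outcome in proportion to the prediction error. The sketch below shows that abstract model only, with invented values; it is not a claim about the biological mechanism.

```python
# Delta-rule sketch of prediction-error learning (illustrative values).
# The prediction error plays the role of the dopamine teaching signal.

def update_expectation(expected, actual, learning_rate=0.1):
    """Nudge the expectation toward the actual outcome."""
    error = actual - expected                # "error detection"
    return expected + learning_rate * error  # "error correction"

expected = 0.0
for trial in range(100):
    expected = update_expectation(expected, actual=1.0)
# With repeated outcomes of 1.0, the expectation approaches 1.0
# and the prediction error shrinks toward zero.
```

When outcomes match expectations, the error term is zero and nothing changes; when they deviate, learning is driven by the size of the surprise.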

Anterodorsal aspects of the PFC are central to making retrospective judgments of performance, while prospective judgments depend more on the medial PFC. Dorsolateral and anterior PFC subregions interact with the cingulate and insular cortices to promote judgments of active performance (Baird et al., 2013; Fleming, 2012). Metacognitive ability for perceptual decisions is associated with greater activity between the lateral anterior PFC and the right dorsal anterior cingulate cortex, putamen, caudate nucleus, and thalamus; ability for memory recall is associated with greater activity between the medial PFC and the medial and inferior parietal lobe. Palmer et al. (2014) found a marked decrease in perceptual metacognitive accuracy with age, and a non-significant decrease in memory metacognitive accuracy.

Sleep

Sleep has recently emerged as a critical component of learning and memory. First, memory consolidation is most efficient during periods of rest following learning (Whitehurst et al., 2016; Studte, et al., 2015; Schlichting & Preston, 2014). Rest includes taking a break between periods of study, taking naps, and especially six to eight hours of nighttime sleep. During waking hours, when a synapse is frequently activated within the cerebral cortex, the surface area of the synapse between axons and dendritic spines increases, increasing its signal strength - important for learning and memory.

10. How sleep sharpens learning. Credit: University of California San Francisco

Nighttime sleep then shrinks about 80% of synapses by approximately 18%, preparing the brain for a new day of activity and learning. The physical process is triggered during REM (rapid eye movement, associated with dreaming) sleep by the release of the protein Homer1A from the nerve cell into the synapse, which prunes dendritic spines and removes glutamate receptors. The 20% of synapses that do not shrink are associated with more stable memories, linked to sleep spindles (Seibt et al., 2017; de Vito et al., 2017; Diering et al., 2017; Li et al., 2017).

Sleep spindles, electrical impulses that, together with hippocampal sharp-wave ripples, travel from the hippocampus through the thalamus to regions of the cerebral cortex during dreamless (non-REM) sleep, occur up to 1,000 times a night (Wei et al., 2016; Tamminen et al., 2010). This hippocampal input activates selective memory traces during deep sleep and causes them to replay. During such memory replay, the corresponding synapses are strengthened for long-term storage in the cortex: proteins within the target cells are produced in larger quantities (Levy et al., 2016), preventing the dendritic spine shrinkage described above. Additional research has identified a specific mechanism for strengthening motor learning in this way (Gulati et al., 2017). This activity is highest during the first third of sleep. Watch a demonstration of sleep spindles below (UC Berkeley, 2011). Note that the time period has been stretched in the animation and covers a mere 400 milliseconds, or about 0.4 seconds. Watch for the pattern of activity that identifies individual spindles.

<youtube width="300">2zQNWSLackk</youtube>

Sleep also clears short-term memory. The mechanism occurs within the hippocampus through the destruction and replacement of specific histone proteins called H3.3 inside hippocampal neurons (Maze et al., 2015), with the effect that neural pathways (linkages) within the hippocampus are deconstructed and reconstructed on a regular basis.

Other sleep-related phenomena impacting long-term memory involve the reduced presence of dopamine during sleep. First, specialized neurons "turn off" dopamine-producing neurons; reduced arousal results in less signaling by dopamine cells, stopping interference caused by mental and behavioral activity. That is, sleep essentially isolates the brain from the stimuli that can interfere with memory consolidation. Second, as sleep progresses to deeper levels, dopamine neurons become less reactive to stimuli. Less dopamine in the system and reduced reactivity of dopamine cells lead to more stable memories (Berry et al., 2015).

Studies demonstrate that chronic lack of sleep lowers grade averages in high school and college students (Heijden et al., 2016; Jacobson, 2014). The most recent research (Fattinger et al., 2017) shows that even one night without deep sleep interferes with the ability to learn new motor tasks. The physical basis for this is a daytime loss of connectivity between neurons in the hippocampus due to shrinkage and loss of dendritic spines, the receivers of signals from other neurons (Havekes et al., 2016).

Tidbit: Although not specific to memory, an unexpected recent finding tells us that not only sleep, but our sleep position, impacts the brain's ability to clear waste products and toxic proteins (Lee et al., 2015). Sleeping on one's side increases the efficiency of waste removal, including amyloid beta (Aβ), a known component of Alzheimer's disease. Waste is removed from the brain via lymphatic vessels, running parallel to blood vessels (Absinta et al., 2017).

Multi-tasking and parallel processing

11. Multitasking is really a process of disengagement, transition, and re-engagement

Multitasking has become synonymous with modern society (Wallis, 2006; Keim, 2009). Grade-schoolers to adults multitask between their gadgets, between their gadgets and nearby friends, between eating and watching TV. The list goes on.

Within the brain, multitasking is actually a process of disengaging from one task within working memory, managed by the thalamic reticular nucleus (TRN), switching to and engaging in another, and so forth (Benbunan et al., 2011). What was formerly the focus of attention is suppressed, and focus is moved to the circuit handling the new task. In effect, the TRN acts as a switchboard, suppressing some signals while allowing others through, then reversing itself to allow formerly suppressed signals through and to halt formerly flowing signals (Wimmer et al., 2015). This capacity is dependent on a number of factors (Borst et al., 2010).

Cognitive resources. One well-researched approach to multitasking describes the many cognitive resources available to us, all of which can operate in parallel with the others, but each of which can process only one operation at a time (see Figure 12 below). For example, we can only recall one fact at a time. Thus, a particular task may first call on visual processing, then manual operations, then declarative memory, then manual operations again, and so forth. An additional task can be performed in parallel with the original, with its own series of cognitive demands, as long as the two tasks do not demand the same resource at the same time. When they do, it’s first-come-first-served: the task currently using the resource keeps it as long as necessary, while the other task must wait for the resource to become available. This contention acts as interference, and multitasking becomes less efficient and more error prone.

Interestingly, the different cognitive resources operate at different time scales, with procedural processing taking only 50ms (milliseconds), declarative memory taking 200 to 500ms, and a resource called “problem state” taking up to thousands of milliseconds to complete processing. Problem state is especially important because it is what allows us to multitask. Think of problem state as a snapshot of the current state of the task. Before switching to a different task, the brain must create a representation of the task as it exists in the moment. Add another 200ms to make the attentional switch to the other task.
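The resource-contention idea above can be sketched in a few lines of code. This is a toy scheduler for illustration only, not a cognitive model from the literature; the resource names and durations are invented.

```python
# Toy sketch of cognitive-resource contention (illustrative only).
# Each task is a sequence of (resource, duration_ms) steps. Resources
# run in parallel, but each serves one step at a time, first come,
# first served; a task waits whenever its next resource is busy.

def run_tasks(tasks):
    """Return total completion time (ms) for all tasks."""
    resource_free = {}                  # resource -> time it becomes free
    task_ready = [0.0] * len(tasks)     # time each task can take its next step
    steps = [list(t) for t in tasks]
    total = 0.0
    while any(steps):
        # serve the ready task with the earliest ready time (FCFS)
        i = min((t for t in range(len(steps)) if steps[t]),
                key=lambda t: task_ready[t])
        resource, dur = steps[i].pop(0)
        start = max(task_ready[i], resource_free.get(resource, 0.0))
        resource_free[resource] = start + dur
        task_ready[i] = start + dur
        total = max(total, task_ready[i])
    return total

# Tasks using different resources overlap fully; tasks that need the
# same resource serialize, doubling the time to completion.
parallel = run_tasks([[("visual", 300)], [("declarative", 300)]])        # 300.0
contended = run_tasks([[("declarative", 300)], [("declarative", 300)]])  # 600.0
```

The two example calls reproduce the article's point: only steps that use different resources can truly run in parallel.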

Consider the common occurrence of talking on the phone while completing a task on your computer: Task 1 is the computer task, and Task 2 is the phone conversation. We can't attend to the computer and the phone at the same time, so we attend to the screen while the voice on the phone has to wait. And so it goes, with each cognitive resource attending to one task at a time (Figure 12).

12. Only different processes using different resources can operate in parallel

Errors creep in when we must frequently switch tasks and problem state representations accumulate. In other words, we switch tasks more quickly than the individual memories fade. This increases the chances that an old snapshot will be retrieved rather than the most recent one, causing errors and decreasing accuracy.
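To make the stale-snapshot idea concrete, here is a toy sketch; the fade interval and task names are invented for illustration.

```python
# Toy illustration: problem-state snapshots decay over time. If we
# switch back to a task before old snapshots have faded, several
# snapshots are still retrievable and we may grab a stale one.

def active_snapshots(snapshots, now_ms, fade_ms=2000):
    """Return the snapshot states still retrievable at time now_ms."""
    return [state for t, state in snapshots if now_ms - t < fade_ms]

history = [(0, "email: draft v1"), (500, "email: draft v2")]
# Rapid re-switching: both snapshots still active, stale retrieval possible.
ambiguous = active_snapshots(history, 600)
# After the old snapshot fades, only the current state remains.
settled = active_snapshots(history, 2400)
```

When switching is slower than the fade interval, only the freshest snapshot survives and retrieval is unambiguous.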

Concentration. Attention is dependent on the ability to screen out distractions while focusing on the task at hand. Keeping irrelevant stimuli at bay increases efficiency and processing power. The brain’s capacity to ignore distractions varies between individuals and also diminishes with age, affecting working memory and the ability to swap attention and memory, thus decreasing our ability to multitask. This ability (or increasing inability) to screen out distractions is one of the greatest concerns of researchers and philosophers regarding the networked world we now live in (Carr, 2010).

Predictability. Task switching is a matter of swapping attention and memory (Ophir et al., 2009). Predictable tasks increase the ease and speed with which the swap is made. For example, switching between computer-based tasks is more efficient than switching between the computer and answering the telephone. This example also illustrates another key factor in multitasking - the cognitive load required by the different tasks.

Cognitive load. Handling two tasks that both require significant mental resources is much less efficient than handling tasks with low cognitive load, or pairing one demanding task with others that are not. Thus, eating a sandwich while working at the computer poses little challenge for most people. However, as the number of simultaneous tasks increases, the brain’s ability to keep them separate diminishes - leading to decreased efficiency and increased errors, another function of increasing cognitive load.

Prior activation. Multitasking is accomplished most readily when the demands of the tasks involved coincide with brain areas that are already activated, and less so when those areas were previously quiet. Thus it’s easier to switch between tasks you are already doing than to add a new one with new demands.

Sasai et al. (2016) demonstrated what they refer to as functionally split-brain operations among different networks during a split-attention task. Subjects first drove a car while listening to GPS instructions (an integrated task), with fMRI images showing significant integration (cross-communication) between the "driving" network and the "listening" network. The "driving" network consisted of the occipital, parietal, and motor cortices; the "listening" network consisted of the temporal lobe and dorsolateral PFC. Conversely, when subjects drove while listening to a radio talk show (a split-attention task), integration was markedly reduced; one network was driving while the other was listening. We should note that both tasks were low in cognitive load.

What It Means

We can see that these findings have several implications for the design of tasks.

  • Multiple demanding tasks will lead to more errors and extended time to completion.
  • Pairing demanding and easier tasks or multiple easy tasks is more likely to produce efficient error-free work.
  • Largely routine (predictable) tasks can more easily be executed while multitasking.
  • Tasks that display the current state (e.g., what's on the computer screen) when switching to a different one can be picked up again more readily because the brain is largely freed from maintaining the problem state.
  • Firing up inactive parts of the brain is time consuming, so ongoing tasks are more efficiently handled than a succession of new tasks (Dye, 2008).

See Training for multitasking in the learning activities article.

Tidbit: Women are generally better at multitasking than men. "Recent research reveals that male brains appear to consume more energy when they need to shift attention. In addition to this, in men there is greater activity in the dorsolateral prefrontal areas of the brain compared to women, as well as activation in some other areas which is not usually observed in women" (Kuptsova, et al., 2016). These differences fade after age 50.

Habits and expertise

Habits and expertise both involve subconscious or preconscious processing, automatically carrying out formerly conscious behaviors. They allow the brain to function with immensely increased economy and efficiency, freeing up and enhancing working memory for other tasks. Habits most often refer to the behaviors of everyday life: our morning routine, how we eat, how we drive. Expertise, on the other hand, involves skills that have been practiced to the point of automaticity, or nearly so. Habit and expertise can be both blessing and curse, as we see below.

Habits: Acting without thinking

As we repeat a behavior, physical or mental, it becomes laid down in special "habit circuits", involving the infralimbic cortex (at the base of the prefrontal cortex) and striatum within the basal nuclei (Graybiel & Smith, 2014). Habits are not an either-or phenomenon, but rather a continuum of behavior. Their essence lies in the transition of deliberate behavior into self-reinforcing auto-piloted routines. Habits begin in the prediction error process discussed above - when the outcome of a particular behavior is rewarding, dopamine is released and positive feedback loops form. The striatum chunks the sequence of actions that bring reward, after which they become a single 'unit'. It is the same form of chunking we see in expertise, discussed next. Although this chunking is perceived by the cortex, physical changes there come only after extended repetition of chunked behaviors. At this point, the habit is formed and control transferred to the infralimbic cortex. Habits are fast, but they are inflexible. Once begun, they play out regardless of consequence. Doing things deliberately is slow but flexible.

Even though habits are nearly automatic, they can actually be under the control of working memory, which must be "online" for the habit to be triggered. In other words, working memory signals the infralimbic cortex that the context is right for the habitual behavior. Working memory can also intervene to stop an initiated chunk of behavior, as when we work to break the habit by switching to slow, deliberate behavior. This can be exceedingly difficult for long-existing habits because habits are fast and inflexible. Mark Twain said it best: "Habit is habit, and not to be flung out the window by any man, but coaxed downstairs one step at a time."

Expertise

Expertise is a very interesting phenomenon because it illustrates the shortcuts the brain is capable of in order to make brainwork more efficient, just as habits do. The difference is that experts not only know the rules and procedures very well, they also know when to sidestep them and intervene with their own strategies (Farrington-Darby & Wilson, 2006). They have thousands of hours of experience and have been exposed to a multitude of variations. As a result, they think differently from the rest of us - at least within their specialty.

“One challenge of this … is deciding what’s signal and what’s noise. Much of the time, it’s my intuition that helps figure out which is which.” Gurpreet Dhaliwal, a clinical diagnostician (in Hafner, 2012).

Two recent studies back Dr. Dhaliwal’s intuition. Dane (2012) found that intuitive decision making is more successful for experts than novices, for whom analytical decisions are more successful. Further, experts did not outperform novices when making their decisions analytically! Why is this so? Experts are known to perceive patterns others do not see; their knowledge is structured into larger “chunks” for more efficient access, and they are better able to screen out irrelevant information and focus on what’s important in the moment. As such, expertise is very domain specific, and highly dependent on perceptual learning (Geller, 2011) as well as cognitive learning (Ballester et al., 2008).

How does the brain organize itself as expertise builds? We already know about memory circuits and how individual neurons participate in multiple circuits in multiple ways. This is the physical manifestation of complex schema built up over time within the expert’s mind. The efficient use of knowledge and experience comes about when it becomes intuitive and second nature – habitualized. The net result is that the brain can redirect immediate neural tasks to the more intuitive infralimbic cortex, freeing up working memory for higher-order tasks. This shift is quick and unconscious.

Experts perceive patterns quickly by matching them to a series of stereotyped arrangements – chunks – so they can use them as starting points for responding to the problem. Novices, on the other hand, must first work through the process of recognizing and making sense of patterns, and deciding what is important and what is not – tasks requiring slower processing in the cerebral cortex.
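The contrast can be caricatured in code. In this invented diagnostic example (not from the article), the expert matches the whole configuration against a stored chunk in a single lookup, while the novice must reason element by element, intersecting possibilities one finding at a time.

```python
# Illustrative sketch of "chunking" (all names and patterns invented).
# Expert: whole configuration -> stored chunk, one fast lookup.
# Novice: serial, element-by-element narrowing of candidates.

expert_chunks = {
    ("fever", "photophobia", "stiff neck"): "suspect meningitis",
}

def expert_recognize(findings):
    # single pattern match against stored chunks; frees working memory
    return expert_chunks.get(tuple(sorted(findings)))

def novice_recognize(findings, knowledge):
    # slow serial reasoning: intersect what each finding could mean
    candidates = None
    for f in findings:
        linked = knowledge.get(f, set())
        candidates = linked if candidates is None else candidates & linked
    return candidates
```

Both routes can reach the same answer; the difference is that the expert's chunk collapses many reasoning steps into one retrieval.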

12. Expertise can plateau.

Experience: Hours of practice. How, then, does one build expertise? One popular notion is that it takes 10,000 hours of practice to become expert, with steady progression from novice to intermediate to expert (Ericsson & Towne, 2010). Actually, it’s not that simple. Although practice does matter, it does not ensure progression. In fact, it’s quite easy to stall out unless the practice is “deliberate.”

Training: Deliberate practice. Deliberate practice requires full concentration on improving a specific aspect of performance during practice activities (Figure 12). We can distinguish this from fluency building which focuses on the accuracy and speed of learned behaviors. This deliberate practice approach to expertise asserts that it is trainable (see deliberate practice), and not something that comes about only after years of experience. Proponents of this approach have identified a number of techniques that mediate expertise, all of which can be taught and learned.

  • Superior ability to monitor and control aspects of performance execution.
  • Maintenance of highly complex and sophisticated schema of domain-specific situations, leading to superior anticipation skills and the attendant ability to respond quickly.
  • Superior control over motor actions in order to perform very complex physical actions consistently.
  • An acquired ability to retrieve information from long-term memory virtually as quickly as if it were stored in short-term memory, giving rise to the concept of “long-term working memory.”
  • During training, individuals are given tasks with well-defined goals concentrating on specific aspects of performance, given immediate and informative feedback, opportunities for subsequent correction, and opportunities for deliberate practice.
  • Restricting the amount of practice without a break, with daily limits of 4-5 hours.
  • Assigning metacognitive tasks to help learners diagnose how they need to change their mental representations in order to produce and reproduce the correct action in similar future situations. In other words, constantly refining and correcting mental schema.

Assumptions about expertise proven to be false include such supposed endowments as innate talent, faster physiological processes, larger brains, and an inborn ability to gain more from practice. Expertise is not a gift but a hard-earned achievement.

Impact of aging

Cognitive changes

13. Aspects of cognition and aging (Hartshorne & Germine, 2015)

While aging moves like clockwork, there is no direct correlation between age and mental functioning, just as there is no direct correlation between aging and physical health. Cognitive function is maintained through a combination of genes, physical health, and continued mental effort. None of this is news, of course, but there is a new emerging picture of age-related cognition.

The most popular view focuses on "fluid" versus "crystallized" intelligence: the capacity to think logically and solve problems in novel situations, independent of acquired knowledge, versus the ability to use skills, knowledge, and experience. Fluid intelligence is thought to peak around age 20 followed by a slow decline as we age, whereas crystallized intelligence is thought to accumulate during a lifetime, peaking in old age. The current research also finds these general patterns; however, "this dichotomy is way too coarse. There are a lot more patterns going on..." (Hartshorne & Germine, 2015). Although the research is considered preliminary, their findings suggest that different abilities peak at different ages and decline in different ways. Summarizing Figure 13:

  • Social cognition (purple) such as the ability to read others' emotions and moods, peaks in the 40s and remains relatively stable through the 60s.
  • Vocabulary (black) builds through the lifespan, peaking around 70.
  • Processing speed (green), the quickness of manipulating words, numbers, etc., generally peaks in the late teens and declines steadily after the 30s.
  • Visual-spatial working memory (blue) tasks such as the ability to find, recognize, and remember visual elements, especially the ability to recall faces and some mental manipulation, peaks around age 30 and declines slowly until the late 60s, after which there is a quick decline. The researchers note that visual memory responds minimally to practice and experience.
  • Verbal working memory (red) tasks such as remembering word and number lists peaks a bit later than visual memory, and declines a bit slower.

"The picture that emerges from these findings is of an older brain that moves more slowly than its younger self, but is just as accurate in many areas and more adept in others ... on top of being more knowledgeable" (Carey, 2015).

More recent research (Ankudowich et al., 2016) suggests that some signs of aging, such as poorer contextual memory, may be a matter of changes in priorities and thus processing of incoming stimuli as we age. Comparing young and older subjects in a visual memory task, researchers found that younger adults accessed their visual cortex while older participants accessed their medial prefrontal cortex, known to be involved with one's own life and introspection. In other words, younger and older were focusing on different aspects of the same memory.

Physical changes

In the past several decades, investigators have learned much about what happens in the brain when people have a neurodegenerative disease such as Parkinson’s disease, Alzheimer's disease, or other dementias. Their findings also have revealed much about what happens during healthy aging. As a person gets older, changes occur in all parts of the body, including the brain (Alzheimer's Disease Education and Referral Center):

  • Certain parts of the brain shrink, especially the prefrontal cortex and the hippocampus. Recent research (Erickson et al., 2016) indicates that aerobic exercise training increases hippocampal volume by 2%, effectively reversing age-related loss in volume by 1 to 2 years and improving memory.
  • Communication between the hemispheres increases with age, a compensating mechanism for reduced functioning in the left hemisphere on cognitive tasks (Davis et al., 2017).
  • Reduction in volume of the right posterior parietal cortex is associated with increased risk aversion in the elderly (Grubb et al., 2016).
  • Glial cells are impacted by aging much more than neurons (Soreq et al., 2017). Two types of glia that support neural metabolism and waste removal show the greatest reduction in number, especially in the hippocampus and substantia nigra.
  • For years, science has known that the brains of animals and humans who regularly exercise are different from the brains of those who are sedentary (Reynolds, 2016). Research has recently discovered that strenuous exercise causes genes within the hippocampus to produce more brain-derived neurotrophic factor (BDNF), in turn increasing the production of new neurons and their survival rate. Scientists sometimes refer to BDNF as "Miracle-Gro" for the brain.
  • Changes in neurons and neurotransmitters affect communication between neurons. In certain brain regions, communication between neurons can be reduced because white matter (myelin-covered axons) is degraded or lost.
  • The dopamine system in the hippocampus, key to long-term episodic memory, degrades with age and is associated with impaired cognitive functioning in the elderly (Nyberg et al., 2016).
  • Changes in the brain’s blood vessels occur. Blood flow can be reduced because arteries narrow and less growth of new capillaries occurs.
  • In some people, structures called plaques and tangles develop outside of and inside neurons, respectively, although in much smaller amounts than in Alzheimer's Disease (see "The Hallmarks of AD" for more information on plaques and tangles).
  • Damage by free radicals increases (free radicals are a kind of molecule that reacts easily with other molecules; see "The Aging Process" for more on these molecules).
  • Inflammation increases (inflammation is the complex process that occurs when the body responds to an injury, disease, or abnormal situation).

Summary: The process of learning and memory

The physical manifestation of learning, which is adaptation and memory, is quite fascinating and provides learners and instructors alike with important insight into how learning is best accomplished.

We know that the human brain continuously, though briefly, records sensory experience, and that differing sensory inputs are routed to different brain regions for processing. We also know about different types of memory – sensory, working, short-term, and long-term. How then do all the parts of the brain work together to create learning? The following description is an approximation of the actual physical process of learning and memory, based on current research, some of which does not fit neatly together. Therefore, some poetic license has been taken.

As mentioned previously, the recognition network is the source of all input. Perception is routed through the thalamus, a switching station of sorts, to the different cortical regions for processing. The thalamus is coated with a thin layer of cells, called the thalamic reticular nucleus, which acts as an attentional screen determining the specific signals that get passed to the thalamus and thus the cortex (Ahrens et al., 2014). Located within the limbic system, the thalamus is also part of the affective network, explaining the role of affect in perception. Learning that is accompanied by affective experience, like anticipation and concern, is much more “sticky” than when affect is absent. This is so because, first, the thalamus “speaks louder” with increased signal strength when input contains personal relevance. Second, memories with associated affect are distributed more broadly within the brain – logic circuits and affective circuits.

During intentional learning, our experiences are routed to working memory where they are processed and manipulated in conjunction with other cortical regions and the basal nuclei. The prefrontal cortex (PFC) is regarded as the hub of the brain's working memory system (Markowitz et al., 2015), with sensory centers (e.g., visualizing) and the cerebellum (i.e., motor learning) called upon as necessary (Mason, 2014). This processing appears as theta-band oscillations beginning within the striatum and eventual synchronized patterns in the PFC, which pieces together the incoming stream of input (referred to as dynamic functional connectivity). "The striatum learns very simple things really quickly, and then its output trains the PFC to gradually pick up on the bigger picture." In other words, the striatum is thought to rapidly acquire simple information (single associations, decision alternatives, etc.) in piecemeal fashion, while the PFC knits together such details into more elaborate and generalized representations (Antzoulatos & Miller, 2014). It has become generally accepted that synchronization of brainwaves between differing regions of the brain is the primary mechanism for learning, memory, adaptation, and decision making (Polanía et al., 2015; Yamamoto et al., 2014; Narayanan et al., 2013; Sejnowski & Delbruck, 2012).

Short-term memories remain vulnerable to interference by succeeding experience until they make their way into long-term memory through a process of consolidation, although recent evidence tells us that memories reside in the hippocampus as long as six weeks (Attardo et al., 2015). Another recent finding reveals that different "grades" of memories are stored in different locations within the hippocampus (Collin et al., 2015), with less detailed memories stored toward the front and more detailed ones toward the back. Different aspects of the same event can be stored in the different locations with, for example, the fact that you had lunch with a friend last week stored toward the front, and a specific conversation you had stored toward the back.

Beginning with the hippocampus emanating signals, routed through the thalamus, to relevant cortices (e.g., the occipital lobe for visual information), a cascade of gene activations and protein synthesis takes place within neurons. The gene activations change the neuron's structure, primarily within the signal-receiving dendrites (Cichon & Gan, 2015; Kandel et al., 2015), and the proteins are expelled into and congregate around the synapse. All this activity makes the cells involved more sensitive to each other, making it easier to pass along their excitation.

Conclusion

Our brief look at the physical brain raises both philosophical and practical issues. Can the mind really be simply an accumulation of neural activity? We can't claim to settle this tough question, but this article tells us much about how the brain works and how it learns. Any attempt we make to design and teach real learning events must account for this reality.

Suggestion: Review this article and jot down the "rules" we must account for when we design and teach.

