16 Cognitive Development

Take a few moments to sit quietly and think about anything you want.

What just went on in your mind? If you are like many people, you heard a little “voice” in your head articulating your thoughts. Maybe it said, “Hmm, I wonder what I should think about.” What happens inside the minds of infants and children when they think? It seems pretty likely that, at least for infants, there is no “voice in the head” speaking to them, so at some level, their thinking, or cognition, must differ from ours.

But how different is it? And how could you find out? Maybe infants do not think at all. Perhaps newborns’ behavior is fixed by simple reflexes. Over time, they begin to react to the external world, and through experience (i.e., learning) they begin to think. At the other extreme is the possibility that infant thought is exactly like adult thought (with the obvious exception that infants cannot think in words). Psychologists who study cognitive development in children have been asking these questions and answering them through research for many years. As you will see, particularly in the case of infants, psychologists have come up with some very ingenious methods of finding out what goes on inside the mind of a child.

What about preschoolers and older children? Do their thought processes differ from ours? Why might these be important questions? If you have not already, many of you will have the opportunity to teach children one day, perhaps as an educator, but more likely as a parent. No one would try to teach children subjects for which they lacked sufficient background knowledge. For example, you would not teach calculus to a child who had not yet learned arithmetic. But what if you ignore the way the child’s mind works, assuming that it works the same way that yours does? You run the same risk of being unable to reach the child. You need not be teaching an academic subject to a child to be concerned about this. Many common parenting difficulties can be improved through an understanding of child cognition. One goal of this module is to introduce you to what psychologists have discovered about child cognition and to hint at some of the ways you can use this knowledge to understand and improve relations with children.

And what about when we get older? Do we continue to think the same way? Or are we destined to break down, to decline? Modern developmental psychologists ask these questions about people’s changes in cognition throughout their whole lifespan. And their answers can help us to understand and interact with each other and can teach us what to expect for ourselves as the years go by.

16.1 begins our coverage of cognitive development with a description of what might be the greatest intellectual feat of our lives, developing and using language.

16.2 introduces us to the work of the most famous and influential researcher in the history of cognitive development, Jean Piaget. Because his work dealt exclusively with children and adolescents, this is the one section that will not include any reference to development and cognition in adulthood.

16.3 expands upon the work of Piaget and introduces some other topics in cognitive development.

16.4 concludes our coverage with another age-limited topic, namely cognitive disorders associated with aging.

16.1 Developing and Using Language

Activate

  • How many different words would you estimate that you know?
  • At what age do you think children develop the ability to formulate unique utterances that they have never heard previously?
  • Try to think back to your conversations over the past 24–48 hours. What kinds of things do you typically talk about? How do your conversation topics differ when you are talking to friends versus romantic partners versus family versus people you encounter at a place of work?

Language Development

We have bad news. All of us may have already had our most impressive cognitive achievement. We did most of it by the time we were three, so it has been downhill ever since. Think about it. Between birth and age three, nearly all children worldwide have developed from being completely unable to communicate beyond a reflexive cry to having a solid working knowledge of one language or more. This knowledge includes a vocabulary of 1,000 words (in English, for example), the ability to understand almost any utterance they hear, and the ability to produce an infinite variety of complex and unique utterances that other speakers of the language can understand. By one estimate, fifth graders have learned 40,000 words (Anglin, 1993). Assume that an average college textbook has 500 key terms in it. By the time an average child reaches eleven, then, they have acquired a vocabulary equivalent to the key terms in 80 college textbooks.

Many mental abilities once thought to separate us from the “lower” animals have turned out, under close scrutiny, not to be uniquely human. Using tools, planning, solving problems, categorizing, and many other cognitive feats have been observed in non-human primates. What about language, though? Without a doubt, non-human animals communicate with each other. But so far, no other species has been found that can duplicate our ability in language. A few apes have been taught sign language, but those cases have been largely oversold. Although the apes’ learning is impressive, it takes a massive effort to get them to reach a level of facility that an average three-year-old reaches with no special instruction whatsoever.

The traditional view of language is that it is the ultimate cultural invention; it is as if we are unlike other animals in profound ways because we have developed large vocabularies, grammar, and syntax. In other words, language has transcended our biological roots and shows that humans have become something more than a really smart animal. We see this kind of thinking as another example of believing in nature versus nurture. “If something is an invention of culture, it cannot be a product of nature,” the falsely dichotomous thinking goes.

We prefer to think of language as perhaps the greatest example of nature and nurture working together to produce a uniquely human ability (Pinker, 1994). The nurture of language is that each culture develops its own unique language, and children learn the language to which they are exposed. The nature of language is that children will learn a language if they are exposed to one. In other words, all children are born with the ability to learn language. In a striking parallel between humans and monkeys, Poremba et al. (2004) demonstrated that the left hemispheres of monkeys are active when they are listening to monkey vocalizations, but not to other similar sounds. Human beings process human vocalizations in much the same way. This evidence suggests that human language may have evolved as an innate capability.

Another indication of the nature of language is the universality of the language-learning process. Throughout the world, infants progress through the same stages in the same order on the road to learning a language, although they may reach a specific stage at different times (this is true within a given culture, as well). By the second or third month, infants quickly progress from simple crying to producing sounds commonly known as cooing. Coos consist mostly of vowel sounds (“aah”), sometimes with consonants in front (“goo”). Infants coo when they are interacting with other people, and parents often learn that they can increase the frequency of cooing by responding to the sounds. Around six to nine months, infants begin the babbling stage. When babbling, infants string together syllables consisting of vowels and consonants. Early in babbling, the infant produces nearly all possible human speech sounds. You can see the influence of nurture as infants progress through the stage; they begin to focus on only the speech sounds present in the language they are beginning to learn (de Boysson-Bardies et al., 1989).

Incidentally, deaf children who are learning sign language develop very similarly. Just like hearing children, they begin babbling as infants, although often a little later. They also “babble” with their hands, producing hand formations that will eventually be used in signs. Some researchers have reported that deaf infants often begin producing signs earlier than hearing infants produce words (Bonvillian et al., 1983; Orlansky & Bonvillian, 1988). Others have found that the age at which signs and words are produced are similar for deaf and hearing children (Acredolo & Goodwyn, 1991; Petitto & Marentette, 1991).

Infants enter a one-word stage at about one year old. The magical appearance of the first word is not as much of a milestone as you might expect. First, it probably is not all that important developmentally. Throughout the development of language, infants and young children consistently understand far more than they can produce, so infants know many words before they produce their first one. Second, the first word is hard to catch. Many parents miss it (even if they don’t realize it) because it is difficult to distinguish the first real word from a coincidental babble. For example, imagine a parent who hears their child say “ba” one time in the presence of a ball when the child is 8 months old. Of course, many parents would believe that it was their child’s first word, and they might also be quite proud of them for being so advanced and of themselves for being such good parents. It does not count as a real word, though, unless the infant produces it consistently. And much of the time, they do not, especially at 8 months old. Most children’s actual first spoken word probably occurs sometime around one year of age.

Once children begin to use words consistently, parents and other people familiar with a particular child can understand what they are saying, even if the pronunciation often makes the speech unintelligible to an outsider. The learning of new words begins slowly at first and picks up speed dramatically at around 18 months. A typical 18-month-old may be able to say 50 words, while a two-year-old can say 300 or more.

During the height of the new word explosion at age two, children begin a sentence production stage. Now that they have large sets of words with which to express themselves, producing a single word at a time is too restrictive. In order to express complex thoughts, a child needs to learn how to string together words to form meaningful sentences (Bloom, 1998). At first, sentences are simple two-word utterances consisting of a noun and a verb or another descriptive word (e.g., “mommy go” or “daddy home”). These sentences resemble the way messages used to be sent by telegram, including only the essential words, and the child is said to be at the telegraphic speech stage. During the third and fourth years, children’s sentences become longer, as the words missing from the telegraphic speech get inserted (for example, “daddy home” becomes “daddy is coming home”), and sentences are used to express more complex ideas (for example, “mommy is going to the store to buy a cake”). From this point, language continues to grow more complex, and children still have a rapidly increasing vocabulary. As they progress through childhood, they are able to understand and produce more and more complex sentences.

 

[table id=U4M16-1 /]

 

 

You can also access this video directly at: https://youtu.be/d0FGHFrMRXI

How Does Language Work for Adults?

Let’s skip ahead a few years and consider how we communicate with each other through language as adults. In essence, a speaker and a listener are trying to build the same situation model. A situation model is a mental representation of the time, space, causes, intentions, and individuals and objects related to a conversation. In other words, it is a mental representation of the topics (Kashima, 2020; Pickering & Garrod, 2004). One key part of the process is prediction. While each person is planning what to say next, they are also actively trying to predict what the other person is going to say next, which can only happen accurately if the two share the same understanding of the situation, that is, the same situation model (Pickering & Garrod, 2013). Have you ever been in a conversation where the other person tries to finish your sentences for you? Although that can be annoying, it does reflect a natural outcome of the normal processing.

One way to construct the same situation model could be for both people to literally say everything. And how many conversations have you had where that happens? Right, none. Instead, we rely on the other person making the correct inferences (see Modules 5 and 7). Again, an inference is assuming that something is true based on previous knowledge or reasoning. So the two people speaking are trying to ensure that each is making the same inferences to build the same situation model while avoiding stating unnecessary things. How? Two ways are through two related ideas: audience design and common ground. Both require making an assessment of the knowledge that the other person has about the topics of the conversation. With common ground, we make a judgment of the knowledge shared between two people, which allows certain information to go unstated and unexplained. For example, two baseball fans can meaningfully share that WHIP and OPS are vastly superior to ERA and batting average for determining the value of an MLB player. If you have no idea what we just said, then you understand the need for audience design, in which a speaker assesses that different listeners require that different information be provided in order to make an utterance understandable. As a result, we tailor our utterances to the specific audience we are talking to. So, to someone new to baseball, we would begin by explaining that WHIP is a measure of the quality of a pitcher; it stands for the number of walks plus the number of hits that a pitcher gives up per inning pitched (Walks plus Hits per Inning Pitched, or WHIP). And of course, we would continue by defining the other abbreviations and terms.

We rely on priming to do much of the work for us. You can think of priming as reminding, the activation of some idea or concept from memory by another related concept. So if we say Y-M-C-A, and you instantly think of the Village People, you can thank priming for that. And, of course, you can thank the Village People (The Village People was a musical group that had a few gigantic hits back in the 1970s, none more gigantic than YMCA).

 

 

You can also access the video directly at: https://youtu.be/CS9OO0S5w2k

audience design: in conversation, a speaker’s assessment that different listeners require different information in order to make an utterance understandable; as a result, we tailor our utterances to the specific audience we are talking to

common ground: judgment of the knowledge shared between two people, which allows certain information to go unstated and unexplained

priming: the activation of some idea or concept from memory by another related concept

situation model: a mental representation of the time, space, causes, intentions, and individuals and objects related to a conversation

 

Debrief

  • Think about some different-aged infants and young children that you know. Can you recognize the stage of language development in each child? Try to recall some representative utterances or sentences from each child.

16.2. Where It All Started: Jean Piaget

Activate

  • Imagine each of the following situations:
  • You are playing with a six-month-old infant and suddenly leave the room to answer the telephone.
  • You take a four-year-old child’s small cup of juice and empty it into a larger cup.
  • While trying to settle a fight over the TV between a seven-year-old and a 12-year-old, you let a coin flip decide. The seven-year-old loses.

For each situation: How do you think the child will react? What is going on inside the child’s mind?

The Swiss psychologist Jean Piaget (1896 – 1980) was the most influential theorist in the history of developmental psychology. His thinking has forever changed our view of how the minds of children work, and he basically invented the field of cognitive development as we know it today. Piaget’s insight was to notice that children understand the world differently than adults do. Like so many brilliant and creative ideas, Piaget’s insight came about by thinking about a commonplace phenomenon in a new way. He was working in Paris in the 1920s for the Binet Laboratory, a publisher of intelligence tests, helping them adapt for French children a reasoning test that had been developed in English. It is almost trivially true that children get questions wrong when taking reasoning tests. After all, that is how psychologists measure individual differences in reasoning ability or intelligence. What Piaget noticed is that children’s errors were not haphazard; if a child missed one specific question, they were likely to miss other specific questions, as well. Moreover, children at similar ages tended to err the same way. It was his realization that children made errors not because of a lack of intelligence but because of their developmental state that led to the vastly different conception of childhood cognition that we have today.

So, Piaget first raised the question, do children think differently from adults? Most people are surprised to learn that prior to Piaget, children were more or less considered miniature adults, whose cognitive processes were essentially the same as those of adults. The “obviousness” with which many people observe that children certainly think differently from adults is itself a testament to Piaget’s influence. His answer, of course, was “yes, they think differently,” and it explains many everyday observations about children. If you understand the differences between adult and child thought, it can help you to correct misunderstandings and miscommunications that might otherwise occur. As you will see, however, things are not so simple. Although there is good agreement among psychologists that the cognitive processes of children differ from those of adults, there is also agreement that the differences are not as dramatic as Piaget proposed. So, let us take a look at some of Jean Piaget’s major ideas and try to use them to understand the minds of children (particularly children you may know).

Piaget sought to explain two main aspects of the development of cognition:

  • How conceptual schemes are used to interpret some new experiences, and how the schemes are changed to account for others
  • How cognitive development proceeds through four stages over the first 12-15 years of life

Conceptual Schemes, Assimilation, and Accommodation

Piaget believed that throughout life, our goal is to build up an understanding of the world through establishing and modifying conceptual schemes. For Piaget (and many others who followed him), a scheme is a framework that a child uses to organize knowledge about the world and interpret new information; it is essentially the same idea as a concept (see Module 7.1). Schemes are the mental frames that allow us to comprehend the vast amount of information to which we are constantly exposed.

Schemes for infants are very simple; they are frameworks for understanding actions or simple sensory input. For example, Piaget would have called the newborn’s reflexive sucking a scheme. Later in life, conceptual schemes include frameworks for understanding more complex actions, such as laughing or walking, as well as other entities in the world, such as objects, people, and animals. It is the infant’s cognitive task to come to an understanding of what the world is and how it works by using and modifying schemes. As the child develops, conceptual schemes become more complex through the processes of assimilation and accommodation.

Sometimes, a new experience or piece of information is understood as an example of an already established scheme, the process that Piaget called assimilation. For example, suppose a child has a conceptual scheme for dogs that has been built through experiences with the family dog, a Labrador retriever. Some time later, they encounter a corgi and are told that this, too, is a dog. The new example is assimilated into the scheme for dogs, allowing the child to understand what the new animal is.

At other times, a new experience or piece of information may not fit into a preexisting scheme. The child may initially try to assimilate it but will fail. In order to arrive at a satisfactory understanding of the world, the child will need to use the process of accommodation, modifying the initial scheme to allow for separate concepts. For example, upon seeing a wolf at the zoo, our child may assimilate at first and think that it is a dog. They will need to accommodate, that is, change this too-inclusive scheme for dog and divide it into dogs and wolves.

Accommodation need not follow an inappropriate assimilation, as in the previous example; it can also be used to help the child make subtle distinctions between similar concepts. Again, think about our child who has formed their scheme of dog from their encounters with their very friendly family dog. Perhaps the neighbor’s dog is not so friendly. The child will need to learn to distinguish between friendly and unfriendly dogs, so that they can figure out which ones are safe to approach and which ones they should avoid. That is, they will need to accommodate and create new sub-conceptual schemes, one for friendly dogs and one for unfriendly dogs.

Assimilation and accommodation often occur at the same time. For example, while the child is assimilating the neighbor’s dog, correctly realizing that it is another example of the same conceptual scheme (i.e., it is also a dog), they can also accommodate, or subdivide their scheme of dog into friendly dogs and unfriendly dogs.

Assimilation and accommodation occur for all of the conceptual schemes that we hold, including social categories. You should realize that forming and modifying conceptual schemes are not trivial processes and can be quite difficult for a child to carry out.

These processes do not end in childhood. Rather, they continue throughout life. For example, you have a scheme for how to learn in a classroom. It is an organized set of connected ideas that helps you figure out what is going on and how you are supposed to act in that setting. For example, when you walk into a classroom, you know that the person standing at the front of the room is the teacher and that you should probably be quiet while they are talking. You also know which actions will help you succeed in that classroom. For instance, you should complete assignments to help you learn and raise your hand to ask questions in class.

Now, think about what might have happened to that scheme when many colleges moved to remote instruction at the beginning of the Covid-19 pandemic in March 2020. Initially, you may have thought that learning remotely would be similar to learning in a face-to-face classroom, so you may have tried to assimilate online classes into your preexisting classroom scheme. Maybe you did not check your email or log into the course website very frequently because in face-to-face classrooms, your teacher reminded you about upcoming deadlines. Maybe you did not distribute your workload evenly across the week because in face-to-face classes, this happens naturally when your class meets multiple times per week. As many of us learned, however, online instruction is very different from face-to-face instruction, and assimilation may lead to difficulties with learning and staying on top of your workload. To be successful, you need to accommodate and create a new scheme for online classes. The new scheme includes new information, like being attentive to emails, planning out your workload on your own, and using Zoom to meet remotely. The concepts of assimilation and accommodation are constantly at work as you learn and build mental representations that are better and better matches for reality.

You can also access the video directly at: https://youtu.be/Xj0CUeyucJw

scheme: mental framework for organizing knowledge about the world and interpreting new information; same idea as concept (Module 7)

assimilation: interpreting a new experience or piece of information by understanding that it is an example of an existing scheme

accommodation: changing an existing scheme to account for a new experience or piece of information that does not fit into it

Piaget’s Stage Theory of Cognitive Development

Piaget proposed that children progress through four broad stages of cognitive development. Within each stage, children continue to use the processes of assimilation and accommodation with their conceptual schemes; in fact, these processes continue throughout life. The key idea for Piaget’s stage theory, however, is mental operations. These are mental procedures that can be reversed and are used for thinking, understanding, reasoning, and problem solving (see Piaget, 1942; 1957; 1970). For example, one mental operation is multiplying two numbers to obtain a product. If you start with the product, you can run the operation in reverse; of course, this is division. According to Piaget, children younger than about 2 are nowhere near using these mental operations; these children are at what he called the sensorimotor stage. From about 2 to 7, they are close, but still unable to use mental operations in most situations; hence, he called these children preoperational. Between about 7 and 11 or so, children can use mental operations, but only in certain situations; these children are at the concrete operations stage. Finally, adolescents (and adults) beyond age 11 can use the mental operations in any situation; he called this stage formal operations.

mental operations: reversible mental procedures that can be used to solve problems or reason about the world

 

[table id=U4M16-2 /]

Sensorimotor stage (ages 0 – 2)

Piaget believed that during the first two years of life, the main cognitive tasks for the infant were to learn about how the physical world works and how to interact with the world. During the sensorimotor stage, the child learns how to coordinate sensory input and movements, and thus learns how the world works and their place in it. The infant progresses from being able to produce simple reflexes only, such as sucking when a nipple is placed into the mouth, through more complex motor responses to more complex sensations.

Early in the sensorimotor stage, the infant makes many random movements. As some of these movements lead to pleasurable sensations, the infant learns over time to produce them. For example, infants will typically insert their hands into their mouths by accident during the first couple of months after birth. Although they probably find this pleasant, as evidenced by the vigorous sucking that they do, these young infants do not yet purposely put their fingers in their mouths. It is not until a bit later that the infant “discovers” their own fingers and can move them to their own mouth to produce pleasurable feelings. Thus, it is a coordination of sensation (the pleasurable feeling of the fingers) with the motor response (moving the fingers to the mouth) that is a hallmark of sensorimotor development.

At the beginning of the sensorimotor stage, the infant’s attention is essentially focused on their own body. They gradually shift their focus to the outside world throughout the first two years. At around one year, the infant begins actively exploring the world by manipulating objects—for example, picking objects up, stacking them, putting things inside of other things. Parents are often frustrated as children progress through the sensorimotor stage, because small objects like television remote controls go missing when the young children delight in discovering the countless spaces into which the small rectangular devices fit.

One essential conceptual scheme that develops during the second half of the first year is that of an object. It is the key scheme that allows a rapid movement of thought beyond the child themself to the outside world and is an important basis of nearly all future cognitive development. Just imagine, if you did not even realize that there was such an entity as an object, how could you even think about the world? The sensorimotor child must realize that objects are separate from and independent of the self. In other words, they have to learn that objects are in the world and that the objects do not depend on the child to be there.

The realization that objects (including other people) continue to exist after the child stops looking at them is called object permanence, and it develops between about six months and one year of age. Picture a six-month-old infant sitting in a high chair and playing with a rattle. The rattle is slippery from all of the saliva on it (because, as you know, “playing with” for a six-month-old probably means putting it in their mouth), so they drop it. The typical six-month-old infant will not even look for the toy and will immediately become interested in something else, as if they forgot that they just had a rattle in their hands. That is essentially what Piaget proposed; more precisely, he proposed that they forgot that the rattle had ever existed. As the child advances through the second six months, you would see a developing awareness that the rattle exists after it falls. At eight months, they might strain to look or reach for it for a few seconds, but will quickly lose interest. By one year, most infants have a very clear understanding that the rattle still exists. They will very obviously look for and try to reach the rattle, and do not soon forget about it. It is at this age that infants first understand the concept of hiding, and they can begin to play games like hide-and-seek. Prior to that time, the infant would simply forget about the hiding person and fail to seek.

 

You can also access the video directly at: https://youtu.be/rVqJacvywAQ

This realization about what an object is allows the child to make great strides in understanding the world. By the end of the sensorimotor stage, infants have learned a great deal more about objects, about the way the physical world works, and how they can interact with the world. As the infants get ready to move into the next stage, they begin to think more in symbols, meaning that they can now represent information from the world in their minds. For example, they can form a mental picture, or image, of a dog, and their conceptual schemes contain a great deal of information for individual concepts (for example, dogs bark, they have fur, they have four legs, and so on). And, of course, perhaps the greatest accomplishment related to the developing child’s use of symbols is their growing language ability.

object permanence: the realization that objects exist even when you cannot see them

Preoperational stage (ages 2 – 7)

To a large extent, children in the preoperational stage are defined by what they do not have, namely, mental operations. Although they have mastered the coordination of their sensory experiences and motor responses, learned many of the important principles related to physical causality, can represent the world in their minds, and are quite adept at using language, preoperational children lack the ability to apply the reversible mental operations in most cases. For example, although some very advanced five-year-olds may be able to multiply two numbers together, at least for some simple problems, few of them understand that division is the reverse of multiplication. Piaget believed that preoperational children lacked most important mental operations that allow older children and adults to reason logically.

One key type of mental operation that preoperational children lack pertains to physical manipulations of substances. For example, if you take a glob of clay and flatten it out so that it looks bigger, you can simply run the flattening in reverse to realize that the amount of clay has not changed. As a result, you realize that the amount of some substance does not change, or is conserved, when it is subjected to various physical manipulations. Piaget called this understanding conservation, and you can easily imagine many examples of how the form or shape of something might change without changing how much of the substance there is. For example, imagine pouring water from one container to another, spreading out pizza dough before cooking it, cutting spaghetti into small pieces, even tearing a sheet of paper into pieces. In each case, we can mentally reverse the action and know that it is still the same water, pizza dough, spaghetti, or paper. Preoperational children, on the other hand, because they have not yet acquired this operation, are bound by their senses; if something looks like it has more, it has more; if there are more pieces, there is more.

Preoperational children’s failure to conserve shows up in many different everyday reasoning situations. Picture a seven-year-old deciding to use a larger-than-usual bowl for their breakfast cereal. Their three-year-old brother thinks the larger bowl is an excellent idea until they see a normal amount of cereal in their own large bowl. When the three-year-old sees that the cereal does not fill up the bowl, they cry because they do not have enough cereal. Unable to “mentally pour” the cereal back into a normal-sized bowl, the youngster does not realize that it is the same amount of cereal that they get every day.

 

You can also access the video directly at: https://www.youtube.com/watch?v=GLj0IZFLKvg

Piaget went further than simply describing what preoperational children cannot do; he also described the characteristics of the reasoning processes that these children do have. Recall that they have just left the sensorimotor stage, in which the children develop a basic understanding of the way the physical world works as a consequence of coordinating their senses and actions. You can think of preoperational reasoning as a step beyond this. Preoperational children are beginning to reason about the world, but in a way that is still tied to their own sensations or perceptions. As a consequence, they are egocentric, able to reason using their own point of view only. Piaget’s most famous demonstration of children’s perceptual egocentrism was through the use of the mountain-view problem (Piaget & Inhelder, 1969). He set up a model of some mountains and placed a doll in the display. Piaget then asked the children what view the doll saw. The children answered by pointing to one of several pictures that showed different views of the mountains. Preoperational children usually chose the picture that showed the view that they themselves saw, regardless of the doll’s position. You can see preoperational children’s egocentrism frequently. For example, preoperational children are not very good at hide-and-seek. As long as they cannot see the seeker, they think they are hidden.

conservation: the realization that the amount of a given substance does not change, even though its appearance might

egocentrism: the tendency to reason from one’s own point of view only

Concrete operations stage (ages 7–11)

The concrete operations stage, lasting from approximately age 7 until adolescence, marks the beginning of the child’s consistent, though still limited, use of mental operations. For example, if you test a nine-year-old on a conservation task, they are likely to get it right; they are able to mentally reverse an activity such as pouring liquid from one container to another. The use of these operations is limited, however, to situations involving tangible, or concrete, concepts. Just as the preoperational child’s thinking was tied to current perceptions, the concrete operational child’s use of mental operations is also tied to perceptions.

Concrete operational children, then, have acquired mental operations whose absence had formerly led them to make errors as preoperational children. There are several operations that pertain to mathematical reasoning. For example, you might recall from elementary school the transitive property of numbers: if A is larger than B and B is larger than C, then A is larger than C. Concrete operational children can understand transitivity. If you tell them that Jack is taller than Jill and Jill is taller than Jim, they can verify mentally that Jack is taller than Jim. You can also see that the development of these mental operations allows the concrete operational child to begin reasoning in a more logical manner (the Jack, Jill, and Jim problem is a simple logical reasoning problem).

You can also access the video directly at: https://youtu.be/gA04ew6Oi9M

Concrete operational children’s use of the operations is limited to situations in which the reasoning context is concrete. For example, although they would have no difficulty with the Jack-Jill-Jim problem, some concrete operational children might fail at the abstract A-B-C version of it. You can also see the limitations of concrete operational children by examining other aspects of their math reasoning ability. Although they may be quite skilled at using arithmetic operations—for example, understanding that addition and subtraction or multiplication and division are reverses of each other—most have difficulty understanding algebra concepts. In algebra, a symbol (e.g., the letter x) is an abstract variable that can assume any specific, or concrete, number. An understanding of this idea, which may be beyond most concrete operational children, develops in the final of Piaget’s stages, formal operations.

Formal operations stage (over age 11)

Piaget suggested that children’s thinking undergoes its last major change beginning around age 11-14 when they enter the formal operations stage. The shift from concrete to formal operational thinking is marked by a release of reasoning from perceptions. Formal operational thinkers begin reasoning about abstract concepts, such as justice and fairness, in a qualitatively different, more sophisticated way than concrete operational children. To an eight-year-old, “unfair” may mean they did not get the most, or a coin flip is “unfair” if the child loses. A formal operational thinker realizes that “fairness” requires one to consider the perspectives of all of the people involved.

Along with the adolescent’s new way of thinking about abstract concepts comes an increase in hypothetical reasoning, that is, reasoning about things that are possible or that are untrue. They can imagine a better world and often wonder why we cannot achieve it. Their wondering and reasoning are also marked by an increasing skill at logical thought.

 

You can also access the video directly at: https://youtu.be/zjJdcXA1KH8

Evaluation of Piaget’s Theory

As we have said, Jean Piaget has been the most influential theorist in cognitive development by far. Actually, it is fair to say that he is the most influential developmental psychologist, period. Virtually all of the cognitive development research that has been conducted since Piaget’s work became widely known in the US around 1960 has been a reaction to it. What have the researchers found? Although Piaget gave us a profound new understanding of how children may understand the world differently from adults, he misjudged many specific aspects of children’s reasoning.

Is children’s thinking really so primitive?

If you have ever had the opportunity to spend time with young children, you might wonder what other psychologists think that Piaget got wrong. After all, the examples we have given you are real; young children really do make these kinds of reasoning errors. Two-year-olds really are bad at hide-and-seek; we did not make that up. Well, one way you can begin to see the problem with Piaget’s ideas is to realize that children do not always make these kinds of errors.

For example, preoperational children’s egocentrism often does not extend beyond simple perceptions. On the contrary, they can sometimes show a remarkable sensitivity to other people’s point of view. For example, four-year-olds will typically use simpler speech when talking to two-year-olds than when talking to older children or adults, something that requires them to take into account the perspective of the other person (Shatz & Gelman, 1973). Other violations of preoperational egocentrism, even the perceptual variety, are common, as well. For example, we recently held up a cereal box and asked a three-year-old to point to what they saw; they pointed to the picture on their side of the box. When we asked them to point to what we saw, they turned the box around and pointed at the side we had been looking at.

Piaget underestimated children in other respects as well. For example, Renee Baillargeon has demonstrated that infants show some understanding of object permanence as early as three months of age. In one of her experiments (Baillargeon, 1987), three-month-old infants watched a screen move 180 degrees from horizontal to vertical and to horizontal again; the infants were positioned at one end, so the screen moved away from them the whole time. While the screen was still low enough, a block was visible behind it; as the screen continued to move, the block was soon hidden behind the screen. At this point, Baillargeon was able to demonstrate that the infant had some understanding that the block was still there (object permanence). A trap door allowed the block to slip down so that the screen could continue to move. From the vantage point of the infant, though, this was an “impossible event;” the block should have stopped the screen. Infants stared longer at this event, as if surprised at what happened, than they did at a “possible event” in which the block actually did stop the screen.

Do all people develop the highest levels of thinking ability?

In one very important respect, Piaget probably overestimated people. Think again about our description of formal versus concrete operational thinkers’ conceptions of fairness. Perhaps you had the same reaction that we do when thinking about this example. To be blunt, we know a few adults whose definition of “fairness” sounds an awful lot like the eight-year-old we described.

The situation looks even worse when we consider the proposal that logical reasoning is a natural part of development. Some researchers have shown that logical reasoning ability is much more common in technologically advanced societies, suggesting that its development is dependent on educational experiences (Super, 1980). Even more dramatically, it is, in fact, very difficult for even highly educated people to reason logically (Module 7). It looks as if ascension to formal operational thinking is not exactly a sure thing. Although it is true that adolescents get better than younger children at reasoning logically (Müller et al., 2001), that is not the same thing as saying that they get good at it.

Later in his career, Piaget reevaluated his position on formal operations, concluding that many adolescents fail to use their formal operational thinking in many situations (Piaget, 1972). Others have made the more extreme assertion that many people never develop the ability to use formal operations (Leadbeater, 1991).

Is cognitive development stagelike or continuous?

The reality that children can perform some reasoning tasks earlier and others later than Piaget believed (or not at all) indicates that development may be much more continuous than stagelike. Piaget believed that all of the operations within a stage developed together, resulting in a very rapid acquisition of abilities across a wide variety of domains. For example, concrete operational children who have acquired conservation would be able to use it in all appropriate situations and would be able to use all of the other concrete operations as well. If this were true, it would make sense to characterize concrete operations as a stage, a period of development that is different in kind (i.e., qualitatively) from other periods.

It is easy to find cases in which this is not true, however. Imagine a four-year-old who fails a standard “liquid in bottle” conservation test. The same child asks you for a cookie. When you hand over the cookie, the child, pressing their advantage, asks for two cookies. You take the cookie back, break it in half, and return it to the child. Although this trick often works on a two-year-old, it will not fool many four-year-olds. Thus, the child in this example conserves in the cookie domain, but not in the liquid-in-bottle domain.

In general, development seems to be domain-specific. Skills or abilities acquired in one area do not automatically transfer over to others. The resulting view of cognitive development is one of more continuous change, as an operation such as conservation is applied to different situations at different times.

It is also important to remember the cases of adults and older children who fail at reasoning tasks that they should have mastered years earlier. You may be surprised to realize that the abilities required for these tasks do not even always come from the formal operational stage. For example, the last time you called someone “self-centered” or “egocentric,” you were probably not talking about a four-year-old. A great many adults have difficulty understanding other people’s perspectives and consequently show a lack of empathy. Indeed, when asked to explain why violence occurred in the world, His Holiness the Dalai Lama once explained, “There is too much cruelty … or lack of compassion and empathy with our fellow human beings” (quoted in McCool, 1999).

After all of the criticisms of important aspects of Piaget’s theory, you might wonder what is left. The truth is, not many of the details of his original theory have survived without significant modification (but recall from Unit 1 that this is the way science is supposed to work). The major principle that children think differently at different points in development is alive and well, however. It is probably useful for us to describe some more recent discoveries about cognitive development in the next section. By paying attention to the similarities to and differences from Piagetian ideas, you should be able to see his continuing influence on the field of cognitive development.

16.3 Other Topics in Cognitive Development

Developing a Theory of Mind: Understanding What Other People Think

We hope that you are getting the impression that children’s cognitive abilities are often closer to those of adults than Piaget believed. Again, though, we are not saying that their abilities are identical. Think about words such as believe, want, intend, pretend, and, for that matter, think. When you use these words to describe other people (as in, “My psychology professor believes that psychology is the most important subject in the world”), you are actually doing something quite remarkable. You are assuming that other people have minds just like yours, ones that lead them to engage in certain behaviors. It is something that psychologists have called a theory of mind (Wellman, 1990).

A theory of mind may not seem all that remarkable at first, but if you think about it, it really is pretty impressive. Having a theory of mind is a form of mind-reading, the ability to know how other people’s thoughts direct their behavior. It is not the psychic variety, of course, but it is mind-reading nonetheless. Our closest relatives in the animal kingdom, chimpanzees, despite their many notable cognitive achievements, such as tool use and problem-solving, apparently do not understand the inner states of other minds as well as an average four-year-old human (Povinelli & Vonk, 2003; Tomasello et al., 2003). Although computer scientists have created a computer program powerful enough to defeat the world chess champion, they cannot come up with one that has a theory of mind as advanced as that of an average chimpanzee.

Here is another excellent opportunity to ask the question, “How would you know if a young child (or a chimpanzee) has a theory of mind?” We could focus on one key concept, that of belief. Three philosophers separately suggested the following kind of test to see if young children realize that other people hold beliefs (Bennett, 1978; Dennett, 1978; Harman, 1978). Suppose you are in a room with a red box and a blue box, and you hide a ten-dollar bill under the red box and leave the room. While you are gone (and are completely unable to see the room), a prankster enters and moves the money to under the blue box. Now suppose an observer has been watching all of this. If, when you come back to retrieve your ten-dollar bill, the first place you look is under the blue box, would the observer be surprised? Most adults would be; they realize that you should have looked under the red box, where you believed the money still was. They have a theory of mind.

Actually, this is quite an advanced theory of mind. One study that used a version of this task found that some four-year-olds (and no three-year-olds) demonstrated this level of theory of mind (Wimmer & Perner, 1983). Other researchers have found evidence for more primitive theories of mind in younger children. Three-year-olds can reason about other people’s desires and about some beliefs (Stein & Levine, 1989). There is even evidence that infants as young as 9 months old can express their understanding of other people’s minds through their gestures (Bretherton et al., 1981).

One reason these observations are important is that they, too, reveal that Piaget underestimated children. If Piaget was correct that children under 7 (preoperational or lower) could only see the world through their own eyes (i.e., if they were egocentric), then they would not have a theory of mind, which requires an understanding of another person’s perspective. Another reason these observations are important is that they show us that very young children are far more perceptive than we might imagine. In a sense, they are able to “read our minds,” at least for simple messages, at a very early age.

Although most psychologists agree that infants do not have an adult-like understanding of other people’s minds, they do have at least a primitive version of a theory of mind. Even very young infants will follow the gaze of another person. For example, if the infant sees a parent looking at a toy, the infant will look at the toy, too. From these early beginnings, the child’s theory of mind develops as the child realizes more about the internal states of other people. They eventually come to realize that people have perceptions, desires, and beliefs.

Let us think about an interesting implication of the development of children’s theory of mind, namely children’s ability to deceive. Have you ever heard the claim that very young children are unable to lie? Is that true? If it is true, why is it so? When two-year-old Juliana is asked, “Who broke the flowerpot?” “Who made a mess on the family room floor?” or “Who put the sock in the toilet bowl?” Juliana will often reply, “Ben (Juliana’s fifteen-year-old brother) did it.” In most cases, Ben is probably innocent, but is it fair to say that Juliana is telling a lie? How would you know? Clearly, we would need some insight into Juliana’s intention. A lie is a lie because it is told with the intention to deceive. Does Juliana intend to deceive when they blame the flooding in the basement after a heavy rain on their brother?

You see, what is happening is that all of the adults that Juliana spends time with think it is hilarious when they say, “Ben did it.” Juliana gets very happy when they make people laugh, and the laughter often leads the adults to forget about any transgressions that Juliana may have committed. So, Juliana’s untruths are more accurately seen as examples of operant conditioning. Juliana gets positive reinforcement (laughter) and avoids punishment by saying “Ben did it.”

But that does not really prove that Juliana is not lying. Perhaps two-year-olds are very cunning and skilled prevaricators. Well, one clue that two-year-olds are not really trying to deceive is that they are not very good at it. Ask Juliana who made it rain and ruin the family picnic, and Juliana might very well say, “Ben did it.” If Juliana were really trying to deceive people with the lie, they would be a bit more selective about when they used the statement.

A second and more important reason to think that Juliana is not really lying is that very young children appear to lack an essential element of a theory of mind, the absence of which renders them literally unable to deceive. In order to deceive, you must have quite a sophisticated conception of what is going on in the other person’s head. You have to realize that other people have beliefs before you can try to trick them into adopting a false belief. This level of theory of mind is fairly late to develop. According to Henry Wellman (1993), two-year-olds know that other people can perceive and want, but they do not yet understand that other people hold beliefs. This appears to be an important developmental change in the theory of mind of children between around three and four years of age. Before then, children’s lack of understanding about beliefs makes it impossible for them to understand that the goal of a lie is to lead someone else to adopt a false belief.

You can also access the video directly at: https://youtu.be/YGSj2zY2OEM

 

theory of mind: the realization that other people have thoughts, beliefs, desires, etc. that guide their behavior

Developing Memory

It is very likely that memory begins before birth, and infant and child memory is quite impressive. At the same time, we can remember virtually none of the events from the first three years of our lives. How can we reconcile these seemingly contradictory points?

First, let us talk about how we know that newborns and infants can remember and that memory begins before birth. Of course, you cannot just ask the infants. Instead, researchers had to develop ingenious techniques that allowed them to get at this information indirectly. The basic idea is simple: if you can consistently get an infant to respond differently to two different stimuli, you can conclude that the infant perceives, or remembers, a difference between the stimuli. That simple idea allows us to draw many conclusions about the capabilities of infants. So, for example, consider the famous Cat in the Hat studies (DeCasper & Fifer, 1980; DeCasper & Prescott, 1984; DeCasper & Spence, 1986, 1991). Anthony DeCasper and his colleagues were able to demonstrate that newborns can recognize their mother’s voice, and, even more impressively, they can recognize a story that had been read to them before they were born. The researchers had a “wired” pacifier that could record the rate at which an infant sucked. Newborns sucked on the pacifier faster when listening to a tape of their mother than to a tape of another woman. In one version of an experiment, one group of newborns had been read Dr. Seuss’s The Cat in the Hat by their mothers several times over the last weeks of pregnancy. After birth, they sucked faster only when their mothers read the familiar story.

A similar research technique called the habituation paradigm has led to many additional discoveries about infant abilities. The important observation that underlies this technique is that infants become bored easily. When infants are shown a new object, they stare at it with apparent interest. Then, they get used to it, or habituate, and their attention is easily drawn to other things. By exposing infants to different stimuli and keeping track of whether the infant has habituated, you can tell whether the infant recognizes a stimulus as familiar. Researchers using the habituation paradigm have demonstrated that infants from three to six months old could remember visual information for periods from two weeks to a couple of months (Fagan, 1974; Bahrick & Pickens, 1995). Even newborns have demonstrated brief memories using the habituation technique (Slater et al., 1991).

Researchers have also shown that infants have an impressive memory for associations. For example, researchers placed infants in a crib and attached a mobile to one foot with a ribbon. The infants quickly learned to associate moving their foot with the movement of the mobile. Even an eight-week-old infant could remember the association for up to two weeks if the training was spread out over time (Rovee-Collier & Fagen, 1981; Vander Linde et al., 1985). Six-month-olds could remember the association for six weeks if they were briefly reminded during the interval by being placed in the same situation again. The infants’ memories are heavily dependent on context; if you change the situation slightly, for example, by changing the color scheme of the crib in which the experiment is conducted, the infants are much worse at remembering the association (Rovee-Collier et al., 1992).

The types of memories we have described so far are implicit memories (memories for skills and procedures without conscious recall), and it is clear that young infants have them. Older infants begin to show signs that they have explicit memory (memory with conscious encoding and recall). Suppose an adult shows an infant how to play with a novel toy, but does not let the infant play with it. After a delay, the infant is given the opportunity to play; the researchers look for the infant to imitate the behaviors previously modeled by the adult. Bauer and Wewerka (1995) used a procedure similar to this to show that one-year-old infants could remember an event 12 months later. Bauer and her colleagues have also shown that nine-month-old infants can remember events for one month if the modeled behavior is repeated one week after the first experience (Bauer et al., 2001). Researchers have demonstrated very impressive memory abilities in children of many ages, particularly if the events are meaningful to the child. For example, one study found that three- and four-year-olds could remember events from a trip to Disneyworld a year and a half earlier (Hamond & Fivush, 1991).

That is not to say that children’s memories are entirely reliable. Recall that the memories of adults can be easily distorted (Module 5). It turns out that children are even more susceptible to these kinds of memory distortions. In an extensive review of the psychological research on the issue, Bruck and Ceci (1999) concluded that children under 10, and especially preschoolers, are more easily misled into falsely remembering events than adults are. In one dramatic demonstration, Stephen Ceci and his colleagues (1994) were able to get almost 60% of preschoolers to falsely remember an event like getting their fingers caught in a mousetrap, simply by repeating a set of leading questions over an 11-week period.

In addition to the increased susceptibility to distortion, there are some clear differences between children’s and adults’ memories, with older people generally having better memory; an important source of these differences is the speed of mental processing. Other improvements seem more related to differences in the use of memory strategies or in some non-memory processes, rather than to a fundamental difference in the way memory works. For example, short-term or working memory capacity increases with age, but the improvement likely reflects the role of background knowledge in memory (Dempster, 1978; 1985). One way you can see this is by observing children who are experts at chess; their working memory for chessboard positions is better than it is for unrelated strings of numbers, and more like the working memory of adults (Chi, 1978; Schneider et al., 1993).

So, infants and children have quite good, but by no means perfect, memories, which brings us back to the question with which we began this section: why do adults have almost no memories of events and episodes from early childhood? We have very few memories from before age six or seven, and memories of events that happened before three and a half are extremely rare. They are so rare that it is more likely that they result from memory distortions than from actual memory. No one really knows why we have infantile amnesia, as it is called. There are two good candidate explanations we would like to share with you; both are related to the principles of encoding through recoding from Module 5:

  • Because children younger than three and a half are still developing their language abilities, they often try to encode events into memory verbatim (in other words, exactly as they occur). A verbatim memory trace, being relatively disconnected from the rest of the knowledge in memory, may be difficult to access later on, so these memory traces die away through disuse. Older children and adults, because their language skills are more flexible, can encode an event in a richer narrative form, which makes it easier to access in the future (Ceci, 1993; Fivush & Hamond, 1990; Nelson, 1993). In essence, the memories are embedded in a network of other knowledge, so they can be retrieved again.
  • The second possibility is related to the suggestion that in order to improve your memory for material, you can make it meaningful by applying it to yourself. Because children’s self-concepts are developing over the first few years, the adult self may be trying to retrieve an event that happened to a different, child self (Fivush, 1988; Howe & Courage, 1993).

habituation (research technique): a technique that researchers use to demonstrate infant memory by showing that infants look longer at new objects than familiar ones

infantile amnesia: adults’ near complete lack of memory for events from early childhood

We do not need to really talk about memory in adults because that is essentially what you saw in Module 5. Let us turn, then, to what may happen to our memories as we age. Many people fear “mental decline” as they age, and most probably do not mean intelligence or reasoning when they talk about it. Rather, they are referring to the dreaded “senior moment,” the unfortunate and apparently inevitable memory loss that we all have to look forward to. People in their 40’s and many in their 50’s often complain of senior moments. The truth of the matter is that memory decline for most people is very minor as they age. The decline is more of a perception than a reality. Although three-quarters of people older than fifty in the US report that they suffer from memory problems, only around one-third of the over-fifty population actually does (Arnst, 2003). It is true that the older one gets, the more likely memory problems become, but very few fifty-somethings are actually suffering from age-related memory problems, as that one-third rate also includes people in their 60’s, 70’s, 80’s and beyond.

Two factors that contribute to our inflated perception of age-related memory declines are confirmation bias and expectation effects in perception (Module 1, Module 13). Because of the confirmation bias, we tend to notice and remember cases that confirm our belief, such as when a 60-year-old loses their car keys. We fail to notice and remember cases that do not confirm our belief, such as a 22-year-old student who misses class because they lost their car keys. Expectation effects lead us to perceive forgetful behavior exhibited by people of different ages in a way that is consistent with our expectations. For example, an older faculty friend of ours reports that when they were a child, they used to forget a lot; they would forget to bring home their homework, and they lost several watches until their parents gave up and stopped buying them. Their childhood forgetfulness was perceived as a reflection of the fact that they were careless and irresponsible. When they lost their wedding ring at age 25, it was because they were (still) irresponsible. In their early 30’s, their forgetfulness was seen as evidence that they had become the prototypical “absent-minded professor.” In their early 40’s, when they used to go to the wrong parking lot at the end of the day because they forgot where they parked their car, it was a consequence of the amount of stress in their life. Now that they are in their late 50’s and have just locked themself out of their office because they forgot their keys for the fifth time this semester, it is a senior moment. Although we think you get the idea, we should emphasize that our colleague’s forgetfulness has been constant throughout their life; it is only our expectations that lead us to perceive it as something different at different ages.

We will say this: memory decline in old age is big business. At the risk of being cynical, we might suggest that this fact contributes to the salience of “senior moments.” According to Consumer Reports, sales of supplements to aid memory doubled from 2006 to 2015 (Calderone, 2018). Consider the sadly typical story of the herb ginkgo biloba. This herb, an extract from leaves of the ginkgo tree, has been shown to slightly improve cognitive functioning in patients suffering from mild to moderate cognitive impairment, usually patients in the early stages of Alzheimer’s disease. When the herb was tested on people experiencing normal, age-related memory problems, however, the effects were weak and inconsistent (Gold et al., 2002). Undeterred by the lack of support for ginkgo’s effectiveness, many manufacturers have sold the supplement to millions of normal individuals; it has been especially popular in Europe. Still today, it is readily available, despite the current consensus that there is no conclusive evidence that ginkgo helps for ANY condition (National Center for Complementary and Integrative Health, 2016).

Still, the quest is on for a serious cure for age-related memory decline. Business, having conquered those other scourges of the aged, impotence and baldness, has turned its attention to cognition, and it continually comes to the party with a new cure. But when these supposed remedies are held up to the light of research, the results have been equally bleak for other popular supplements, such as B vitamins and omega-3 fatty acids (Kivipelto et al., 2017; Meng-Meng et al., 2014).

Many people have a preference for the quick fix, the easy solution. For example, many will take a diet drug rather than exercising and changing their eating behavior in order to control their weight. This is true even though most people know how important exercise is for controlling weight. In the case of age-related cognitive decline, however, many people do not even realize that there are two non-drug solutions to the problem. The first solution is to “use it or lose it.” Quite simply, people who continue to use their cognitive abilities as they age continue to be able to use them. In fact, a growing body of evidence suggests that many different activities, even non-intellectual ones, can help people retain their cognitive functioning as they age (Kramer et al., 2004; Richards et al., 2003; Singh-Manoux et al., 2003).

The second solution is physical exercise, both aerobic exercise and strength training (Busse et al., 2009; Robitaille et al., 2014). Hmm, controlling weight and stemming cognitive decline with one solution. Maybe we should bottle this exercise thing and sell it.

Changes in Reasoning and Intelligence

Earlier in this module, we observed that reasoning develops in different domains, as children (and for that matter, adolescents and adults) gain knowledge and experience in those domains. For example, when children learn about biological categories, they begin to be able to reason about the types of properties that are essential for different animal types (Gelman & Markman, 1986). As people gain more knowledge and experience in different subject areas, their reasoning often gets closer to what we would recognize as logical reasoning (Müller et al., 2001).

Perhaps because of this increased knowledge and experience, adolescents and adults are better than younger children at other types of reasoning and thinking as well. For example, they are better able to use analogies to solve problems (Moshman, 1998). In analogical reasoning, one understands a concept or solves a problem by noting similarities to another concept or problem. For example, an individual might learn about the behavior of the parts of an atom by realizing that an atom is like the solar system (Gentner, 1983). More broadly, adolescent thinkers become more deliberate in their reasoning, and their metacognitive skills improve (“thinking about thinking,” see Module 7; Campbell & Bickhard, 1986; Inhelder & Piaget, 1958; Moshman, 1990; 1994; 1998).

Popular wisdom holds that we reach various peaks in cognitive ability around age 30, followed by a gradual, but accelerating decline, a near-perfect parallel to the common beliefs about physical changes associated with aging. As we have said before, however, popular wisdom is not always correct. In Module 12 you learned that in reality, the physical declines are barely noticeable before age 50, and they can be dramatically slowed through physical activity. The news is even better with respect to cognitive changes. Many of the declines that people talk about reflect illness. In normal, healthy adults, some aspects of reasoning, intelligence, and memory do not decline at all and may even improve throughout the lifespan.

Many instructors who teach diverse groups of undergraduates (for example, at a community college) notice a difference between “traditional” (i.e., 19 or 20 years old) and “non-traditional,” or returning, students (i.e., older). In short, the older students seem better able to apply the course content, and this seems particularly true of those in their 40’s and 50’s. These instructors’ casual observations are supported by research.

Several decades ago, Raymond Cattell (1963) proposed a distinction between fluid and crystallized intelligence, a distinction that has withstood the test of time. Fluid intelligence refers to your speedy reasoning ability. Think of it as your ability to solve logic and math problems or brain teasers. It does look as if fluid intelligence reaches a peak at around age 30 and then begins its long decline, a result of a reduction in speed of mental processing (Salthouse, 1991; 1996). Crystallized intelligence is your accumulated store of knowledge and your ability to apply that knowledge to solve problems (Lemme, 2002; Sternberg, 1996). Crystallized intelligence continues to increase, at least through the 50’s and perhaps throughout the lifespan (Baltes, 1987; Schaie, 1996). We do not believe it would be a stretch to note that crystallized intelligence is closely related to what we think of as wisdom. Thus the popular image of the wise elder may well be grounded in truth. It is also worth noting that fluid and crystallized intelligence constitute one key dimension of the CHC Theory of Intelligence you read about in Module 8.

analogical reasoning: a problem-solving technique that involves noting similarities between concepts or problems

fluid intelligence: an individual’s speedy reasoning ability

crystallized intelligence: an individual’s accumulated store of knowledge and the ability to apply the knowledge to solve problems

 

Debrief

  • What is your earliest memory? Are you willing to admit that the memory might not be a genuine one? How would you explain the persistence of this particular memory?
  • Are you a good judge of other peoples’ goals, intentions, and beliefs, or do you often misconstrue them? How sophisticated is your theory of mind?

16.4 Cognitive Disorders of Aging

For some people, however, aging is associated with cognitive decline. As people age, their risk of developing serious disorders that can affect their cognitive functioning does increase. One key risk is a reduction of blood flow to areas of the brain. Although these reductions can be minor, the extreme version, a stroke, is very severe. When a blood vessel that feeds an area of the brain is blocked or bursts, the neurons die in the sections of the brain that normally receive blood from that vessel. Strokes can be deadly; they are one of the leading causes of death in the US, accounting for over 160,000 deaths per year. People who survive a stroke suffer from brain damage and a consequent loss of abilities, such as memory, movement, and speech. When the lost functions include cognitive abilities, a person is said to be suffering from dementia. Some stroke victims are able to regain some functions lost through the damage, however, a demonstration of adult brain plasticity.

stroke: a loss of blood flow to an area of the brain as a result of the blockage or bursting of a blood vessel. The brain areas die from lack of oxygen, and the consequence is brain damage and some loss of abilities.

dementia: a serious loss of cognitive abilities as a result of disease or disorder

One severe disorder of aging, and the most important source of dementia, is Alzheimer’s disease, a fatal and incurable affliction. There are two types of Alzheimer’s disease. Early-onset Alzheimer’s is quite rare and can strike as early as age 30. Late-onset Alzheimer’s disease by definition strikes after 65. Alzheimer’s is a progressive disease; the symptoms start slowly and gradually worsen. Its most famous symptom is memory loss. A person in the early stages of Alzheimer’s may occasionally forget the names of common objects or get lost in a familiar place. As the disease progresses, the memory problems become more severe, and Alzheimer’s patients eventually wind up unable to recognize even their closest family members. Additional symptoms may include other cognitive problems such as confusion, loss of language and judgment skills, and personality changes. Eventually, patients become unable to care for themselves and completely unresponsive. Death occurs an average of 8 years after the disease is diagnosed.

Because of people’s awareness of the disorder and its tragic consequences, many middle-aged and elderly people fear that they are entering the early stages of Alzheimer’s after occasional memory lapses. They are probably not. Although estimates of the incidence of Alzheimer’s can vary, the Alzheimer’s Association (2023) reported that 5% of people between 65 and 74, 13% of people between 75 and 84, and 33% of people 85 or older have Alzheimer’s disease.

Still, although the percentages of people affected are low for the younger groups and they never reach a majority for any group, they translate into enormous numbers. For example, one estimate suggested that 32 million people worldwide currently suffer from Alzheimer’s disease-related dementia, with an additional 69 million experiencing mild cognitive decline, and a staggering 315 million in early, not-yet-diagnosed stages (Gustavson et al., 2022). This is more than the entire population of the US (334 million in 2023). In the US alone, nearly 14 million people are expected to suffer from Alzheimer’s disease by 2060 (Alzheimer’s Association, 2023).

Because of these projections, a great deal of research effort is currently being devoted to discovering the causes of and treatments for Alzheimer’s disease. Scientists have a pretty good handle on what happens to the brains of Alzheimer’s patients, thanks to autopsies of patients and advanced brain-scanning techniques. They are still figuring out the causes, however. One key abnormality in the brains of Alzheimer’s patients is an excess of a protein known as amyloid beta in the hippocampus. In 2006, an extremely important study was conducted that appeared to confirm that amyloid beta was indeed the cause of Alzheimer’s disease (Lesné, Koh, Kotilinek, et al., 2006). The paper has been cited by other researchers more than 2,500 times and has had enormous influence over theory about the cause and research into possible treatments. There is only one problem, and unfortunately, it is a big one. In 2022, it was discovered that key images in the paper had been faked, and in 2024, the authors (all but one of them) agreed to retract the paper (Piller, 2024); it was officially retracted in June 2024. The field is currently trying to sort things out, as some researchers still believe that the amyloid beta hypothesis is correct, while many others do not.

We can make one key observation that probably speaks to this debate. Simply put, the large majority of treatments developed on the assumption that amyloid beta is the cause of Alzheimer’s symptoms have been rather ineffective; only two have been approved by the FDA, and both approvals were marked by controversy. One key problem is that the research that led to the approvals used removal of amyloid beta from the brain as the key outcome, and that is, most assuredly, not the same thing as improving symptoms (Karran & De Strooper, 2022; NIH, 2024).

So, we should definitely consider other causes. Even if the amyloid beta hypothesis turns out to be correct, it will not be the whole story. For example, why is the extra amyloid there? And are there additional contributing causes? Let’s consider genetics first. Genetics are likely a significant part of the Alzheimer’s story, as researchers estimate that they contribute from 58% to 79% of Alzheimer’s risk (Gouveia Roque, Phatnani, & Hengst, 2024). Researchers have also made great progress at identifying possible genes that are associated with Alzheimer’s disease, having identified more than 40 separate candidates (Bellenguez, Grenier-Boley, & Lambert, 2020). Note that, as always, having a genetic component, even one as substantial as we observe for Alzheimer’s disease, is not a guarantee that an at-risk individual will develop the disease. It does suggest that they should do everything they can to reduce the risk by taking care of lifestyle risk factors (see below).

The genetic explanation still does not tell us how Alzheimer’s disease actually develops, and that, too, is an active area of research. One intriguing piece of the puzzle comes from the gut-brain connection that you might have heard about. It probably does not surprise you to learn that the brain and digestive system are connected. Both the sympathetic and parasympathetic nervous systems can send signals to the gut, changing the movements and secretions of the digestive system and its immune system activity. The connections run both directions, though. The connections that run from gut to brain may affect emotions and memory (Mayer, 2011). So it is at least a reasonable idea that the gut could influence the development of Alzheimer’s. Indeed, a growing body of research suggests that this is the case (see, for example, Kowalski & Mulak, 2019).

In one amazing study, Stephanie Grabrucker and her colleagues (2023) injected rats with fecal extracts from the guts of Alzheimer’s patients or control patients (ok, we would not want to do this research ourselves, but it certainly is fascinating!). The rats injected with the extracts from Alzheimer’s patients were less able to create new neurons in the hippocampus and developed memory deficits. In other words, the researchers were able to transfer some Alzheimer’s symptoms to the rats by injecting them with fecal material from the gut of human patients suffering from the disease. Intriguing? Yes! End of the story? Not even close. But if you are interested, pay attention to research developments on the causes of Alzheimer’s disease over the next few years. 

Let us finish with some good news. While we are waiting for researchers to figure out the exact cause and develop effective treatments, we should all realize that there are a great many environmental and behavioral factors that contribute to Alzheimer’s disease. By paying attention to these lifestyle factors, individuals, especially at-risk ones, can greatly reduce their chances of developing Alzheimer’s disease, as well as other types of dementia (Livingston et al., 2017). Luckily, they are the sorts of things that will help in other areas of life as well. These factors include:

  • Get enough exercise
  • Lose weight if obese
  • Do not smoke
  • Control high blood pressure
  • Remain socially and intellectually engaged
  • Treat hearing loss (hearing loss might be responsible for up to 9% of Alzheimer’s cases)

You can also access the video directly at: https://youtu.be/0GXv3mHs9AU

Alzheimer’s disease: a progressive, fatal disorder characterized by memory loss, other cognitive symptoms, and personality change

Amyloid beta: a protein that surrounds the brain’s neurons in Alzheimer’s disease patients. Although controversial, it might be responsible for the symptoms of the disease.


License

Introduction to Psychology, 4th Edition Copyright © 2022 by Ken Gray; Elizabeth Arnott-Hill; Or'Shaundra Benson; and Maureen Gray is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.