

CHAPTER 4: Developing Through the Life Span

Life is a journey, from womb to tomb. So it is for me, and so it will be for you. My story, and yours, began when a man and a woman contributed 20,000+ genes to an egg that became a unique person. Those genes coded the protein building blocks that, with astonishing precision, formed our bodies and predisposed our traits. My grandmother bequeathed to my mother a rare hearing-loss pattern, which she, in turn, gave to me (the least of her gifts). My father was an amiable extravert, and sometimes I forget to stop talking. As a child, my talking was impeded by painful stuttering, for which Seattle Public Schools gave me speech therapy.

Along with my parents’ nature, I also received their nurture. Like you, I was born into a particular family and culture, with its own way of viewing the world. My values have been shaped by a family culture filled with talking and laughter, by a religious culture that speaks of love and justice, and by an academic culture that encourages critical thinking (asking, What do you mean? How do you know?).

We are formed by our genes, and by our contexts, so our stories will differ. But in many ways we are each like nearly everyone else on Earth. Being human, you and I have a need to belong. My mental video library, which began after age 4, is filled with scenes of social attachment. Over time, my attachments to parents loosened as peer friendships grew. After lacking confidence to date in high school, I fell in love with a college classmate and married at age 20. Natural selection disposes us to survive and perpetuate our genes. Sure enough, two years later a child entered our lives and I experienced a new form of love that surprised me with its intensity.


But life is marked by change. That child now lives 2000 miles away, and one of his two siblings has found her calling in South Africa. The tight rubber bands linking parent and child have loosened, as yours likely have as well.

Change also marks most vocational lives, which for me transitioned from a teen working in the family insurance agency, to a premed chemistry major and hospital aide, to (after discarding my half-completed medical school applications) a psychology professor and author. I predict that in 10 years you, too, will be doing things you do not currently anticipate.

Stability also marks our development. When I look in the mirror I do not see the person I once was, but I feel like the person I have always been. I am the same person who, as a late teen, played basketball and discovered love. A half-century later, I still play basketball and still love (with less passion but more security) the life partner with whom I have shared life’s griefs and joys.

We experience a continuous self, but that self morphs through stages—growing up, raising children, enjoying a career, and, eventually, life’s final stage, which will demand my presence. As I wend my way through this cycle of life and death, I am mindful that life’s journey is a continuing process of development, seeded by nature and shaped by nurture, animated by love and focused by work, begun with wide-eyed curiosity and completed, for those blessed to live to a good old age, with peace and never-ending hope.

Across the life span we grow from newborn to toddler, from toddler to teenager, and from teen to mature adult. At each stage of life’s journey there are physical, cognitive, and social milestones. Let’s begin at the very beginning.

Developmental Psychology’s Major Issues

4-1: What three issues have engaged developmental psychologists?

Developmental psychology examines our physical, cognitive, and social development across the life span, with a focus on three major issues:

  • 1. Nature and nurture: How does our genetic inheritance (our nature) interact with our experiences (our nurture) to influence our development? How have your nature and your nurture influenced your life story?

  • 2. Continuity and stages: What parts of development are gradual and continuous, like riding an escalator? What parts change abruptly in separate stages, like climbing rungs on a ladder?

  • 3. Stability and change: Which of our traits persist through life? How do we change as we age?

developmental psychology a branch of psychology that studies physical, cognitive, and social change throughout the life span.

“Nature is all that a man brings with him into the world; nurture is every influence that affects him after his birth.”

Francis Galton, English Men of Science, 1874

We will reflect on these three developmental issues throughout this chapter.

Prenatal Development and the Newborn

4-2: What is the course of prenatal development, and how do teratogens affect that development?

Conception

Nothing is more natural than a species reproducing itself. And nothing is more wondrous. With humans, the process starts when a woman’s ovary releases a mature egg—a cell roughly the size of the period at the end of this sentence. Like space voyagers approaching a huge planet, the 200 million or more deposited sperm begin their race upstream, approaching a cell 85,000 times their own size. The relatively few reaching the egg release digestive enzymes that eat away its protective coating. As soon as one sperm penetrates that coating and is welcomed in, the egg’s surface blocks out the others. Before half a day elapses, the egg nucleus and the sperm nucleus fuse. The two have become one.


Consider it your most fortunate of moments. Among 200 million sperm, the one needed to make you, in combination with that one particular egg, won the race. And so it was for innumerable generations before us. If any one of our ancestors had been conceived with a different sperm or egg, or died before conceiving, or not chanced to meet the partner or … the mind boggles at the improbable, unbroken chain of events that produced you and me.

Prenatal Development

Fewer than half of all fertilized eggs, called zygotes, survive beyond the first 2 weeks (Grobstein, 1979; Hall, 2004). But for you and me, good fortune prevailed. One cell became 2, then 4—each just like the first—until this cell division had produced some 100 identical cells within the first week. Then the cells began to differentiate—to specialize in structure and function. How identical cells do this—as if one decides “I’ll become a brain, you become intestines!”—is a puzzle that scientists are just beginning to solve.

zygote the fertilized egg; it enters a 2-week period of rapid cell division and develops into an embryo.

About 10 days after conception, the zygote attaches to the mother’s uterine wall, beginning approximately 37 weeks of the closest human relationship. The zygote’s inner cells become the embryo. The outer cells become the placenta, the life-link that transfers nutrients and oxygen from mother to embryo. Over the next 6 weeks, the embryo’s organs begin to form and function. The heart begins to beat.

embryo the developing human organism from about 2 weeks after fertilization through the second month.

By 9 weeks after conception, an embryo looks unmistakably human. It is now a fetus (Latin for “offspring” or “young one”). During the sixth month, organs such as the stomach have developed enough to give the fetus a chance of survival if born prematurely.

fetus the developing human organism from 9 weeks after conception to birth.

At each prenatal stage, genetic and environmental factors affect our development. By the sixth month, microphone readings taken inside the uterus reveal that the fetus is responsive to sound and is exposed to the sound of its mother’s muffled voice (Ecklund-Flores, 1992; Hepper, 2005). Immediately after birth, newborns prefer her voice to another woman’s or to their father’s (Busnel et al., 1992; DeCasper et al., 1984, 1986, 1994). They also prefer hearing their mother’s language. If she spoke two languages during pregnancy, they display interest in both (Byers-Heinlein et al., 2010). And just after birth, the melodic ups and downs of newborns’ cries bear the tuneful signature of their mother’s native tongue (Mampe et al., 2009). Babies born to French-speaking mothers tend to cry with the rising intonation of French; babies born to German-speaking mothers cry with the falling tones of German. Would you have guessed? The learning of language begins in the womb.

In the two months before birth, fetuses demonstrate learning in other ways, as when they adapt to a vibrating, honking device placed on their mother’s abdomen (Dirix et al., 2009). Like people who adapt to the sound of trains in their neighborhood, fetuses get used to the honking. Moreover, four weeks later, they recall the sound (as evidenced by their blasé response, compared with the reactions of those not previously exposed).

Sounds are not the only stimuli fetuses are exposed to in the womb. In addition to transferring nutrients and oxygen from mother to fetus, the placenta screens out many harmful substances, but some slip by. Teratogens, agents such as toxins, viruses, and drugs, can damage an embryo or fetus. This is one reason pregnant women are advised not to drink alcoholic beverages. A pregnant woman never drinks alone. As alcohol enters her bloodstream, and her fetus’, it depresses activity in both their central nervous systems. Alcohol use during pregnancy may prime the woman’s offspring to like alcohol and may put them at risk for heavy drinking and alcohol use disorder during their teens. In experiments, when pregnant rats drank alcohol, their young offspring later displayed a liking for alcohol’s taste and odor (Youngentob et al., 2007, 2009).

teratogens (literally, “monster maker”) agents, such as toxins, chemicals, and viruses, that can reach the embryo or fetus during prenatal development and cause harm.

Even light drinking or occasional binge drinking can affect the fetal brain (Braun, 1996; Ikonomidou et al., 2000; Sayal et al., 2009). Persistent heavy drinking puts the fetus at risk for birth defects and for future behavior problems, hyperactivity, and lower intelligence. For 1 in about 800 infants, the effects are visible as fetal alcohol syndrome (FAS), marked by a small, misproportioned head and lifelong brain abnormalities (May & Gossage, 2001). The fetal damage may occur because alcohol has what Chapter 2 called an epigenetic effect: It leaves chemical marks on DNA that switch genes abnormally on or off (Liu et al., 2009).

fetal alcohol syndrome (FAS) physical and cognitive abnormalities in children caused by a pregnant woman’s heavy drinking. In severe cases, symptoms include noticeable facial misproportions.

Prenatal development

  •  zygote: conception to 2 weeks

  •  embryo: 2 weeks through 8 weeks

  •  fetus: 9 weeks to birth

“You shall conceive and bear a son. So then drink no wine or strong drink.”

Judges 13:7

“I felt like a man trapped in a woman’s body. Then I was born.”

Comedian Chris Bliss

The Competent Newborn

4-3: What are some newborn abilities, and how do researchers explore infants’ mental abilities?

Babies come with software preloaded on their neural hard drives. Having survived prenatal hazards, we as newborns came equipped with automatic reflex responses ideally suited for our survival. We withdrew our limbs to escape pain. If a cloth over our face interfered with our breathing, we turned our head from side to side and swiped at it.

New parents are often in awe of the coordinated sequence of reflexes by which their baby gets food. When something touches their cheek, babies turn toward that touch, open their mouth, and vigorously root for a nipple. Finding one, they automatically close on it and begin sucking—which itself requires a coordinated sequence of reflexive tonguing, swallowing, and breathing. Failing to find satisfaction, the hungry baby may cry—a behavior parents find highly unpleasant and very rewarding to relieve.

The pioneering American psychologist William James presumed that newborns experience a “blooming, buzzing confusion,” an assumption few people challenged until the 1960s. Then scientists discovered that babies can tell you a lot—if you know how to ask. To ask, you must capitalize on what babies can do—gaze, suck, turn their heads. So, equipped with eye-tracking machines and pacifiers wired to electronic gear, researchers set out to answer parents’ age-old questions: What can my baby see, hear, smell, and think?

Prepared to feed and eat

Consider how researchers exploit habituation—a decrease in responding with repeated stimulation. We saw this earlier when fetuses adapted to a vibrating, honking device placed on their mother’s abdomen. The novel stimulus gets attention when first presented. With repetition, the response weakens. This seeming boredom with familiar stimuli gives us a way to ask infants what they see and remember.

habituation decreasing responsiveness with repeated stimulation. As infants gain familiarity with repeated exposure to a visual stimulus, their interest wanes and they look away sooner.
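To make the logic of such looking-time methods concrete, here is a brief, hypothetical sketch in Python of how habituation and renewed interest to a novel stimulus might be scored. The looking times, the 50 percent habituation criterion, and the recovery margin are all assumed values for illustration, not the procedure of any particular study.

```python
# A minimal, hypothetical sketch of the scoring logic behind a looking-time
# habituation study. All numbers (trial durations, the 50 percent criterion,
# the recovery margin) are invented for illustration; real infant studies use
# more elaborate procedures and statistics.

# Looking times (seconds) across successive presentations of the SAME stimulus:
familiar_trials = [12.0, 9.5, 7.8, 6.1, 4.9, 4.2, 3.8]

# Looking time when a NOVEL stimulus is then presented:
novel_trial = 10.4

def has_habituated(times, criterion=0.5):
    """Habituation: looking time has dropped below some fraction
    (here, half) of the initial looking time."""
    return times[-1] < criterion * times[0]

def shows_recovery(times, novel, margin=1.5):
    """Renewed interest (dishabituation): looking noticeably longer at the
    novel stimulus than at the last familiar trial."""
    return novel > margin * times[-1]

if has_habituated(familiar_trials) and shows_recovery(familiar_trials, novel_trial):
    # The infant treats the new stimulus as different from the old one, which
    # is taken as evidence that the old one was perceived and remembered.
    print("Looking pattern suggests the infant discriminates the two stimuli.")
```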

Indeed, even as newborns, we prefer sights and sounds that facilitate social responsiveness. We turn our heads in the direction of human voices. We gaze longer at a drawing of a face-like image. We prefer to look at objects 8 to 12 inches away, which—wonder of wonders—just happens to be the approximate distance between a nursing infant’s eyes and its mother’s (Maurer & Maurer, 1988).

Within days after birth, our brain’s neural networks were stamped with the smell of our mother’s body. Week-old nursing babies, placed between a gauze pad from their mother’s bra and one from another nursing mother, have usually turned toward the smell of their own mother’s pad (MacFarlane, 1978). What’s more, that smell preference lasts. One experiment capitalized on the fact that some nursing mothers in a French maternity ward used a chamomile-scented balm to prevent nipple soreness (Delaunay-El Allam, 2010). Twenty-one months later, their toddlers preferred playing with chamomile-scented toys! Their peers who had not sniffed the scent while breast feeding showed no such preference. (This makes me wonder: Will adults, who as babies associated chamomile scent with their mother’s breast, become devoted chamomile tea drinkers?)

Infancy and Childhood

As a flower unfolds in accord with its genetic instructions, so do we humans. Maturation—the orderly sequence of biological growth—decrees many of our commonalities. We stand before walking. We use nouns before adjectives. Severe deprivation or abuse can retard our development, but the genetic growth tendencies are inborn. Maturation (nature) sets the basic course of development; experience (nurture) adjusts it. Once again, we see genes and scenes interacting.

maturation biological growth processes that enable orderly changes in behavior, relatively uninfluenced by experience.

“It is a rare privilege to watch the birth, growth, and first feeble struggles of a living human mind.”

Annie Sullivan, in Helen Keller’s The Story of My Life, 1903

Physical Development

4-4: During infancy and childhood, how do the brain and motor skills develop?

Brain Development

The formative nurture that conspired with nature began at conception, with the prenatal environment in the womb. Nurture continues outside the womb, where our early experiences foster brain development.

In your mother’s womb, your developing brain formed nerve cells at the explosive rate of nearly one-quarter million per minute. From infancy on, brain and mind—neural hardware and cognitive software—develop together. On the day you were born, you had most of the brain cells you would ever have. However, the wiring among these cells—your nervous system—was immature: After birth, these neural networks had a wild growth spurt, branching and linking in patterns that would eventually enable you to walk, talk, and remember.

From ages 3 to 6, the most rapid brain growth was in your frontal lobes, which enable rational planning. During those years, your ability to control your attention and behavior developed rapidly (Garon et al., 2008; Thompson-Schill et al., 2009).

Frontal lobe development continues into adolescence and beyond. The last cortical areas to develop are the association areas—those linked with thinking, memory, and language. As they develop, mental abilities surge (Chugani & Phelps, 1986; Thatcher et al., 1987). The neural pathways supporting language and agility proliferate into puberty. Then, a use-it-or-lose-it pruning process shuts down unused links and strengthens others (Paus et al., 1999; Thompson et al., 2000).

Stringing the circuits young

Your genes dictated your overall brain architecture, rather like the lines of a coloring book, but experience fills in the details (Kenrick et al., 2009). So how do early experiences leave their “marks” in the brain? Mark Rosenzweig and David Krech opened a window on that process when they raised some young rats in solitary confinement in an impoverished environment, and others in a communal playground that simulated a natural environment. When the researchers later analyzed the rats’ brains, those who died with the most toys had won. The rats living in the enriched environment had usually developed a heavier and thicker brain cortex.


Rosenzweig was so surprised by this discovery that he repeated the experiment several times before publishing his findings (Renner & Rosenzweig, 1987; Rosenzweig, 1984). So great are the effects that, shown brief video clips, you could tell from the rats’ activity and curiosity whether their environment had been impoverished or enriched (Renner & Renner, 1993). After 60 days in the enriched environment, the rats’ brain weights increased 7 to 10 percent and the number of synapses mushroomed by about 20 percent (Kolb & Whishaw, 1998).

Such results have motivated improvements in environments for laboratory, farm, and zoo animals—and for children in institutions. Stimulation by touch or massage also benefits infant rats and premature babies (Field et al., 2007). “Handled” infants of both species develop faster neurologically and gain weight more rapidly. By giving preemies massage therapy, neonatal intensive care units help them to go home sooner (Field et al., 2006).

Nature and nurture together sculpt our synapses. Brain maturation provides us with an abundance of neural connections. Experiences—sights and smells, touches and tugs—activate and strengthen some neural pathways while others weaken from disuse. Like forest pathways, popular tracks are broadened and less-traveled ones gradually disappear. The result by puberty is a massive loss of unemployed connections.

Here at the juncture of nurture and nature is the biological reality of early childhood learning. During early childhood—while excess connections are still on call—youngsters can most easily master such skills as the grammar and accent of another language. We seem to have a critical period for some skills. Lacking any exposure to spoken, written, or signed language before adolescence, a person will never master any language. Likewise, lacking visual experience during the early years, a person whose vision is restored by cataract removal will never achieve normal perceptions. Without stimulation, the brain cells normally assigned to vision will die during the pruning process or be diverted to other uses. The maturing brain’s rule: Use it or lose it.

critical period an optimal period early in the life of an organism when exposure to certain stimuli or experiences produces normal development.

Although normal stimulation during the early years is critical, the brain’s development does not end with childhood. As we saw in Chapter 2’s discussion of brain plasticity, our neural tissue is ever changing and new neurons are born. If a monkey pushes a lever with the same finger several thousand times a day, brain tissue controlling that finger changes to reflect the experience. Human brains work similarly. Whether learning to keyboard or skateboard, we perform with increasing skill as our brain incorporates the learning (Ambrose, 2010).


“Genes and experiences are just two ways of doing the same thing—wiring synapses.”

Joseph LeDoux, The Synaptic Self, 2002

Motor Development

The developing brain enables physical coordination. As an infant’s muscles and nervous system mature, skills emerge. With occasional exceptions, the sequence of physical (motor) development is universal. Babies roll over before they sit unsupported, and they usually crawl on all fours before they walk. These behaviors reflect not imitation but a maturing nervous system; blind children, too, crawl before they walk.

There are, however, individual differences in timing. In the United States, for example, 25 percent of all babies walk by 11 months of age, 50 percent within a week after their first birthday, and 90 percent by age 15 months (Frankenburg et al., 1992). The recommended infant back-to-sleep position (putting babies to sleep on their backs to reduce the risk of a smothering crib death) has been associated with somewhat later crawling but not with later walking (Davis et al., 1998; Lipsitt, 2003).

In the eight years following the 1994 launch of a U.S. Back to Sleep educational campaign, the number of infants sleeping on their stomach dropped from 70 to 11 percent—and SIDS (sudden infant death syndrome) deaths fell by half (Braiker, 2005).

Genes guide motor development. Identical twins typically begin walking on nearly the same day (Wilson, 1979). Maturation—including the rapid development of the cerebellum at the back of the brain—creates our readiness to learn walking at about age 1. Experience before that time has a limited effect. The same is true for other physical skills, including bowel and bladder control. Before necessary muscular and neural maturation, neither pleading nor punishment will produce successful toilet training.

Brain Maturation and Infant Memory

Can you recall your first day of preschool or your third birthday party? Our earliest memories seldom predate our third birthday. We see this infantile amnesia in the memories of some preschoolers who experienced an emergency fire evacuation caused by a burning popcorn maker. Seven years later, they were able to recall the alarm and what caused it—if they were 4 to 5 years old at the time. Those experiencing the event as 3-year-olds could not remember the cause and usually misrecalled being already outside when the alarm sounded (Pillemer, 1995). Other studies have confirmed that the average age of earliest conscious memory is 3.5 years (Bauer, 2002, 2007). As children mature, from 4 to 6 to 8 years, childhood amnesia is giving way, and they become increasingly capable of remembering experiences, even for a year or more (Bruce et al., 2000; Morris et al., 2010). The brain areas underlying memory, such as the hippocampus and frontal lobes, continue to mature into adolescence (Bauer, 2007).

Although we consciously recall little from before age 4, our brain was processing and storing information during those early years. In 1965, while finishing her doctoral work in psychology, Carolyn Rovee-Collier observed an infant memory. She was a new mom, whose colicky 2-month-old, Benjamin, could be calmed by moving a crib mobile. Weary of hitting the mobile, she strung a cloth ribbon connecting the mobile to Benjamin’s foot. Soon, he was kicking his foot to move the mobile. Thinking about her unintended home experiment, Rovee-Collier realized that, contrary to popular opinion in the 1960s, babies are capable of learning. To know for sure that her son wasn’t just a whiz kid, she repeated the experiment with other infants (Rovee-Collier, 1989, 1999). Sure enough, they, too, soon kicked more when hitched to a mobile, both on the day of the experiment and the day after. They had learned the link between moving legs and moving mobiles. If, however, she hitched them to a different mobile the next day, the infants showed no learning, indicating that they remembered the original mobile and recognized the difference. Moreover, when tethered to the familiar mobile a month later, they remembered the association and again began kicking.

Traces of forgotten childhood languages may also persist. One study tested English-speaking British adults who had no conscious memory of the Hindi or Zulu they had spoken as children. Yet, up to age 40, they could relearn subtle sound contrasts in these languages that other people could not learn (Bowers et al., 2009). What the conscious mind does not know and cannot express in words, the nervous system and our two-track mind somehow remembers.

Cognitive Development

4-5: From the perspectives of Piaget, Vygotsky, and today’s researchers, how does a child’s mind develop?

Cognition refers to all the mental activities associated with thinking, knowing, remembering, and communicating. Somewhere on your life journey, you became conscious. When was that, and how did your mind unfold from there? Developmental psychologist Jean Piaget [pee-ah-ZHAY] spent his life searching for the answers to such questions. His interest began in 1920, when he was in Paris developing questions for children’s intelligence tests. While administering the tests, Piaget became intrigued by children’s wrong answers, which were often strikingly similar among same-age children. Where others saw childish mistakes, Piaget saw intelligence at work.

cognition all the mental activities associated with thinking, knowing, remembering, and communicating.

A half-century spent with children convinced Piaget that a child’s mind is not a miniature model of an adult’s. Thanks partly to his work, we now understand that children reason differently than adults, in “wildly illogical ways about problems whose solutions are self-evident to adults” (Brainerd, 1996).

Jean Piaget (1896–1980)

Piaget’s studies led him to believe that a child’s mind develops through a series of stages, in an upward march from the newborn’s simple reflexes to the adult’s abstract reasoning power. Thus, an 8-year-old can comprehend things a toddler cannot, such as the analogy that “getting an idea is like having a light turn on in your head,” or that a miniature slide is too small for sliding, and a miniature car is much too small to get into.


Piaget’s core idea is that the driving force behind our intellectual progression is an unceasing struggle to make sense of our experiences. To this end, the maturing brain builds schemas, concepts or mental molds into which we pour our experiences. By adulthood we have built countless schemas, ranging from cats and dogs to our concept of love.

schema a concept or framework that organizes and interprets information.

To explain how we use and adjust our schemas, Piaget proposed two more concepts. First, we assimilate new experiences—we interpret them in terms of our current understandings (schemas). Having a simple schema for dog, for example, a toddler may call all four-legged animals dogs. But as we interact with the world, we also adjust, or accommodate, our schemas to incorporate information provided by new experiences. Thus, the child soon learns that the original dog schema is too broad and accommodates by refining the category.

assimilation interpreting our new experiences in terms of our existing schemas.

accommodation adapting our current understandings (schemas) to incorporate new information.

Piaget’s Theory and Current Thinking

Piaget believed that children construct their understanding of the world while interacting with it. Their minds experience spurts of change, followed by greater stability as they move from one cognitive plateau to the next, each with distinctive characteristics that permit specific kinds of thinking. TABLE 4.1 summarizes the four stages in Piaget’s theory.

Table 4.1: Piaget’s Stages of Cognitive Development

Sensorimotor (birth to nearly 2 years): Experiencing the world through senses and actions (looking, hearing, touching, mouthing, and grasping)

  •  Object permanence

  •  Stranger anxiety

Preoperational (about 2 to about 6 or 7 years): Representing things with words and images; using intuitive rather than logical reasoning

  •  Pretend play

  •  Egocentrism

Concrete operational (about 7 to 11 years): Thinking logically about concrete events; grasping concrete analogies and performing arithmetical operations

  •  Conservation

  •  Mathematical transformations

Formal operational (about 12 through adulthood): Abstract reasoning

  •  Abstract logic

  •  Potential for mature moral reasoning


Sensorimotor Stage

In the sensorimotor stage, from birth to nearly age 2, babies take in the world through their senses and actions—through looking, hearing, touching, mouthing, and grasping. As their hands and limbs begin to move, they learn to make things happen.

sensorimotor stage in Piaget’s theory, the stage (from birth to about 2 years of age) during which infants know the world mostly in terms of their sensory impressions and motor activities.

Very young babies seem to live in the present: Out of sight is out of mind. In one test, Piaget showed an infant an appealing toy and then flopped his beret over it. Before the age of 6 months, the infant acted as if the toy ceased to exist. Young infants lack object permanence—the awareness that objects continue to exist when not perceived. By 8 months, infants begin exhibiting memory for things no longer seen. If you hide a toy, the infant will momentarily look for it. Within another month or two, the infant will look for it even after being restrained for several seconds.

object permanence the awareness that things continue to exist even when not perceived.

So does object permanence in fact blossom at 8 months, much as tulips blossom in spring? Today’s researchers think not. They believe object permanence unfolds gradually, and they see development as more continuous than Piaget did. Even young infants will at least momentarily look for a toy where they saw it hidden a second before (Wang et al., 2004).

Researchers also believe Piaget and his followers underestimated young children’s competence. Consider these simple experiments:

  •  Baby physics: Like adults staring in disbelief at a magic trick (the “Whoa!” look), infants look longer at an unexpected and unfamiliar scene of a car seeming to pass through a solid object, a ball stopping in midair, or an object violating object permanence by magically disappearing (Baillargeon, 1995, 2008; Wellman & Gelman, 1992).

  •  Baby math: Karen Wynn (1992, 2000) showed 5-month-olds one or two objects. Then she hid the objects behind a screen, and visibly removed or added one. When she lifted the screen, the infants sometimes did a double take, staring longer when shown a wrong number of objects. But were they just responding to a greater or smaller mass of objects, rather than a change in number (Feigenson et al., 2002)? Later experiments showed that babies’ number sense extends to larger numbers, to ratios, and to such things as drumbeats and motions (Libertus & Brannon, 2009; McCrink & Wynn, 2004; Spelke & Kinzler, 2007). If accustomed to a Daffy Duck puppet jumping three times on stage, they showed surprise if it jumped only twice.


Clearly, infants are smarter than Piaget appreciated. Even as babies, we had a lot on our minds.

Preoperational Stage

Piaget believed that until about age 6 or 7, children are in a preoperational stage—too young to perform mental operations (such as imagining an action and mentally reversing it). For a 5-year-old, the milk that seems “too much” in a tall, narrow glass may become an acceptable amount if poured into a short, wide glass. Focusing only on the height dimension, this child cannot perform the operation of mentally pouring the milk back. Before about age 6, said Piaget, children lack the concept of conservation—the principle that quantity remains the same despite changes in shape.

preoperational stage in Piaget’s theory, the stage (from about 2 to about 6 or 7 years of age) during which a child learns to use language but does not yet comprehend the mental operations of concrete logic.

conservation the principle (which Piaget believed to be a part of concrete operational reasoning) that properties such as mass, volume, and number remain the same despite changes in the forms of objects.


PRETEND PLAY A child who can perform mental operations can think in symbols and therefore begins to enjoy pretend play. Contemporary researchers have found that symbolic thinking appears at an earlier age than Piaget supposed. Judy DeLoache (1987) showed children a model of a room and hid a miniature stuffed dog behind its miniature couch. The 2½-year-olds easily remembered where to find the miniature toy, but they could not use the model to locate an actual stuffed dog behind a couch in a real room. Three-year-olds—only 6 months older—usually went right to the actual stuffed animal in the real room, showing they could think of the model as a symbol for the room. Piaget did not view the stage transitions as abrupt shifts. Even so, he probably would have been surprised to see symbolic thinking at such an early age.

EGOCENTRISM Piaget contended that preschool children are egocentric: They have difficulty perceiving things from another’s point of view. Asked to “show Mommy your picture,” 2-year-old Gabriella holds the picture up facing her own eyes. Three-year-old Gray makes himself “invisible” by putting his hands over his eyes, assuming that if he can’t see his grandparents, they can’t see him. Children’s conversations also reveal their egocentrism, as one young boy demonstrated (Phillips, 1969, p. 61):

“Do you have a brother?”

“Yes.”

“What’s his name?”

“Jim.”

“Does Jim have a brother?”

“No.”

egocentrism in Piaget’s theory, the preoperational child’s difficulty taking another’s point of view.

Like Gabriella, TV-watching preschoolers who block your view of the TV assume that you see what they see. They simply have not yet developed the ability to take another’s viewpoint. Even we adults may overestimate the extent to which others share our opinions and perspectives, a trait known as the curse of knowledge. We assume that something will be clear to others if it is clear to us, or that e-mail recipients will “hear” our “just kidding” intent (Epley et al., 2004; Kruger et al., 2005). Children are even more susceptible to such egocentrism.

THEORY OF MIND When Little Red Riding Hood realized her “grandmother” was really a wolf, she swiftly revised her ideas about the creature’s intentions and raced away. Preschoolers, although still egocentric, develop this ability to infer others’ mental states when they begin forming a theory of mind (a term first coined by psychologists David Premack and Guy Woodruff [1978], to describe chimpanzees’ seeming ability to read intentions).

theory of mind people’s ideas about their own and others’ mental states—about their feelings, perceptions, and thoughts, and the behaviors these might predict.

As the ability to take another’s perspective gradually develops, preschoolers come to understand what made a playmate angry, when a sibling will share, and what might make a parent buy a toy. And they begin to tease, empathize, and persuade. Between about 3½ and 4½, children worldwide come to realize that others may hold false beliefs (Callaghan et al., 2005; Sabbagh et al., 2006). Jennifer Jenkins and Janet Astington (1996) showed Toronto children a Band-Aids box and asked them what was inside. Expecting Band-Aids, the children were surprised to discover that the box actually contained pencils. Asked what a child who had never seen the box would think was inside, 3-year-olds typically answered “pencils.” By age 4 to 5, the children’s theory of mind had leapt forward, and they anticipated their friends’ false belief that the box would hold Band-Aids. Children with autism spectrum disorder have difficulty understanding that another’s state of mind differs from their own.

Concrete Operational Stage

By age 6 or 7, said Piaget, children enter the concrete operational stage. Given concrete (physical) materials, they begin to grasp conservation. Understanding that change in form does not mean change in quantity, they can mentally pour milk back and forth between glasses of different shapes. They also enjoy jokes that use this new understanding:

concrete operational stage in Piaget’s theory, the stage of cognitive development (from about 6 or 7 to 11 years of age) during which children gain the mental operations that enable them to think logically about concrete events.

Mr. Jones went into a restaurant and ordered a whole pizza for his dinner. When the waiter asked if he wanted it cut into 6 or 8 pieces, Mr. Jones said, “Oh, you’d better make it 6, I could never eat 8 pieces!” (McGhee, 1976)

Piaget believed that during the concrete operational stage, children become able to comprehend mathematical transformations and conservation. When my daughter, Laura, was 6, I was astonished at her inability to reverse simple arithmetic. Asked, “What is 8 plus 4?” she required 5 seconds to compute “12,” and another 5 seconds to then compute 12 minus 4. By age 8, she could answer a reversed question instantly.

Formal Operational Stage

By about age 12, our reasoning expands from the purely concrete (involving actual experience) to encompass abstract thinking (involving imagined realities and symbols). As children approach adolescence, said Piaget, many become capable of thinking more like scientists. They can ponder hypothetical propositions and deduce consequences: If this, then that. Systematic reasoning, what Piaget called formal operational thinking, is now within their grasp.

formal operational stage in Piaget’s theory, the stage of cognitive development (normally beginning about age 12) during which people begin to think logically about abstract concepts.

Although full-blown logic and reasoning await adolescence, the rudiments of formal operational thinking begin earlier than Piaget realized. Consider this simple problem:

If John is in school, then Mary is in school. John is in school. What can you say about Mary?

Formal operational thinkers have no trouble answering correctly. But neither do most 7-year-olds (Suppes, 1982).

An Alternative Viewpoint: Lev Vygotsky and the Social Child

As Piaget was forming his theory of cognitive development, Russian psychologist Lev Vygotsky (1896–1934) was also studying how children think and learn. He noted that by age 7, they increasingly think in words and use words to solve problems. They do this, he said, by internalizing their culture’s language and relying on inner speech (Fernyhough, 2008). Parents who say “No, no!” when pulling a child’s hand away from a cake are giving the child a self-control tool. When the child later needs to resist temptation, he may likewise say “No, no!” Second-graders who muttered to themselves while doing math problems grasped third-grade math better the following year (Berk, 1994). Whether out loud or inaudibly, talking to themselves helps children control their behavior and emotions and master new skills.

Lev Vygotsky (1896–1934)

Where Piaget emphasized how the child’s mind grows through interaction with the physical environment, Vygotsky emphasized how the child’s mind grows through interaction with the social environment. If Piaget’s child was a young scientist, Vygotsky’s was a young apprentice. By mentoring children and giving them new words, parents and others provide a temporary scaffold from which children can step to higher levels of thinking (Renninger & Granott, 2005). Language, an important ingredient of social mentoring, provides the building blocks for thinking, noted Vygotsky (who was born the same year as Piaget, but died prematurely of tuberculosis).

Reflecting on Piaget’s Theory

What remains of Piaget’s ideas about the child’s mind? Plenty—enough to merit his being singled out by Time magazine as one of the twentieth century’s 20 most influential scientists and thinkers and rated in a survey of British psychologists as the last century’s greatest psychologist (Psychologist, 2003). Piaget identified significant cognitive milestones and stimulated worldwide interest in how the mind develops. His emphasis was less on the ages at which children typically reach specific milestones than on their sequence. Studies around the globe, from aboriginal Australia to Algeria to North America, have confirmed that human cognition unfolds basically in the sequence Piaget described (Lourenco & Machado, 1996; Segall et al., 1990).

However, today’s researchers see development as more continuous than did Piaget. By detecting the beginnings of each type of thinking at earlier ages, they have revealed conceptual abilities Piaget missed. Moreover, they view formal logic as a smaller part of cognition than he did. Piaget would not be surprised that today, as part of our own cognitive development, we are adapting his ideas to accommodate new findings.

“Assessing the impact of Piaget on developmental psychology is like assessing the impact of Shakespeare on English literature.”

Developmental psychologist Harry Beilin (1992)

CLOSE UP: Autism Spectrum Disorder and “Mind-Blindness”

Diagnoses of autism spectrum disorder (ASD), a disorder marked by social deficiencies, have been increasing. Once believed to affect 1 in 2500 children, ASD now affects 1 in 110 American children and about 1 in 100 in Britain (CDC, 2009; Lilienfeld & Arkowitz, 2007; NAS, 2011). The increase in ASD diagnoses has been offset by a decrease in the number of children considered “cognitively disabled” or “learning disabled,” which suggests a relabeling of children’s disorders (Gernsbacher et al., 2005; Grinker, 2007; Shattuck, 2006). A massive $6.7 billion National Children’s Study now under way aims to enroll 100,000 pregnant women in 105 counties and to follow their babies until they turn 21. Researchers hope this study will help explain the rising rates of ASD, as well as premature births, childhood obesity, and asthma (Belluck, 2010; Murphy, 2008).

autism spectrum disorder (ASD) a disorder that appears in childhood and is marked by deficient communication, social interaction, and understanding of others’ states of mind.

The underlying source of ASD’s symptoms seems to be poor communication among brain regions that normally work together to let us take another’s viewpoint. This effect appears to result from ASD-related genes interacting with the environment (State & Šestan, 2012). People with ASD are therefore said to have an impaired theory of mind (Rajendran & Mitchell, 2007; Senju et al., 2009). They have difficulty inferring others’ thoughts and feelings. They do not appreciate that playmates and parents might view things differently. Mind reading that most of us find intuitive (Is that face conveying a smirk or a sneer?) is difficult for those with ASD. Most children learn that another child’s pouting mouth signals sadness, and that twinkling eyes mean happiness or mischief. A child with ASD fails to understand these signals (Frith & Frith, 2001). In hopes of a cure, desperate parents have sometimes subjected children to ineffective therapies (Shute, 2010).

Autism spectrum disorder

This speech-language pathologist is helping a boy with ASD learn to form sounds and words. ASD is marked by deficient social communication and difficulty grasping others’ states of mind.


ASD (formerly referred to as “autism”) has differing levels of severity. “High-functioning” individuals have normal intelligence, and they often have an exceptional skill or talent in a specific area. But they lack social and communication skills, and they tend to become distracted by minor and unimportant stimuli (Remington et al., 2009). Those at the spectrum’s lower end are unable to use language at all.

ASD afflicts four boys for every girl. Psychologist Simon Baron-Cohen believes this hints at one way to understand this disorder. He has argued that ASD represents an “extreme male brain” (2008, 2009). Although there is some overlap between the sexes, he believes that boys are better “systemizers.” They tend to understand things according to rules or laws, for example, as in mathematical and mechanical systems. Children exposed to high levels of the male sex hormone testosterone in the womb may develop more masculine and autistic traits (Auyeung et al., 2009).

In contrast, girls are naturally predisposed to be “empathizers,” Baron-Cohen contends. They are better at reading facial expressions and gestures, though less so if given testosterone (van Honk et al., 2011).

Biological factors, including genetic influences and abnormal brain development, contribute to ASD (State & Šestan, 2012). Childhood MMR vaccinations do not (Demicheli et al., 2012). Based on a fraudulent 1998 study—“the most damaging medical hoax of the last 100 years” (Flaherty, 2011)—some parents were misled into thinking that the childhood MMR vaccine increased risk of ASD. The unfortunate result was a drop in vaccination rates and an increase in cases of measles and mumps. Some unvaccinated children suffered long-term harm or even death.

Twin and sibling studies provide some evidence for biology’s influence. If one identical twin is diagnosed with ASD, the chances are 50 to 70 percent that the co-twin will also receive this diagnosis (Lichtenstein et al., 2010; Sebat et al., 2007). A younger sibling of a child with ASD also is at a heightened risk (Sutcliffe, 2008). Random genetic mutations in sperm-producing cells may also play a role. As men age, these mutations become more frequent, which may help explain why an over-40 man has a much higher risk of fathering a child with ASD than does a man under 30 (Reichenberg et al., 2007). Researchers are now sleuthing ASD’s telltale signs in the brain’s synaptic and gray matter (Crawley, 2007; Ecker et al., 2010; Garber, 2007).

“Autism” case number 1

In 1943, Donald Gray Triplett, an “odd” child with unusual gifts and social deficits, was the first person to receive the diagnosis of a previously unreported condition, which psychiatrist Leo Kanner termed “autism.” (After a 2013 change in the diagnosis manual, his condition is now called autism spectrum disorder.) In 2010, at age 77, Triplett was still living in his childhood home in his Mississippi hometown, where he often played golf (Donvan & Zucker, 2010).

Biology’s role in ASD also appears in brain-function studies. People without ASD often yawn after seeing others yawn. And as they view and imitate another’s smiling or frowning, they feel something of what the other is feeling. Not so among those with ASD, who are less imitative and show much less activity in brain areas involved in mirroring others’ actions (Dapretto et al., 2006; Perra et al., 2008; Senju et al., 2007). When people with ASD watch another person’s hand movements, for example, their brain displays less-than-normal mirroring activity (Oberman & Ramachandran, 2007; Théoret et al., 2005). Scientists are continuing to explore and vigorously debate the idea that the brains of people with ASD have “broken mirrors” (Gallese et al., 2011).

Seeking to “systemize empathy,” Baron-Cohen and his Cambridge University colleagues (2007; Golan et al., 2010) collaborated with Britain’s National Autistic Society and a film production company. Knowing that television shows with vehicles have been popular among kids with ASD, they created animations with toy vehicle characters in a pretend boy’s bedroom, grafting emotion-conveying faces onto toy trams, trains, and tractors. After the boy leaves for school, the characters come to life and have experiences that lead them to display various emotions (www.thetransporters.com). Children with ASD who watched the animations were surprisingly able to generalize what they had learned to a new, real context. By the intervention’s end, their previously deficient ability to recognize emotions on real faces equaled that of children without ASD.

Implications for Parents and Teachers

Future parents and teachers, remember this: Young children are incapable of adult logic. Preschoolers who block one’s view of the TV simply have not learned to take another’s viewpoint. What seems simple and obvious to us—getting off a teeter-totter will cause a friend on the other end to crash—may be incomprehensible to a 3-year-old. Also remember that children are not passive receptacles waiting to be filled with knowledge. Better to build on what they already know, engaging them in concrete demonstrations and stimulating them to think for themselves. Finally, accept children’s cognitive immaturity as adaptive. It is nature’s strategy for keeping children close to protective adults and providing time for learning and socialization (Bjorklund & Green, 1992).

“Childhood has its own way of seeing, thinking, and feeling, and there is nothing more foolish than the attempt to put ours in its place.”

Philosopher Jean-Jacques Rousseau, Émile, 1762

Social Development

4-6: How do parent-infant attachment bonds form?

From birth, babies are social creatures, developing an intense bond with their caregivers. Infants come to prefer familiar faces and voices, then to coo and gurgle when given a parent’s attention. After about 8 months, soon after object permanence emerges and children become mobile, a curious thing happens: They develop stranger anxiety. They may greet strangers by crying and reaching for familiar caregivers. “No! Don’t leave me!” their distress seems to say. Children this age have schemas for familiar faces; when they cannot assimilate the new face into these remembered schemas, they become distressed (Kagan, 1984). Once again, we see an important principle: The brain, mind, and social-emotional behavior develop together.

stranger anxiety the fear of strangers that infants commonly display, beginning by about 8 months of age.

Origins of Attachment

One-year-olds typically cling tightly to a parent when they are frightened or expect separation. Reunited after being apart, they shower the parent with smiles and hugs. No social behavior is more striking than the intense and mutual infant-parent bond. This attachment bond is a powerful survival impulse that keeps infants close to their caregivers. Infants become attached to those—typically their parents—who are comfortable and familiar. For many years, psychologists reasoned that infants became attached to those who satisfied their need for nourishment. It made sense. But an accidental finding overturned this explanation.

attachment an emotional tie with another person; shown in young children by their seeking closeness to the caregiver and showing distress on separation.

Stranger anxiety

Body Contact

During the 1950s, University of Wisconsin psychologists Harry Harlow and Margaret Harlow bred monkeys for their learning studies. To equalize experiences and to isolate any disease, they separated the infant monkeys from their mothers shortly after birth and raised them in sanitary individual cages, which included a cheese-cloth baby blanket (Harlow et al., 1971). Then came a surprise: When their blankets were taken to be laundered, the monkeys became distressed.

The Harlows recognized that this intense attachment to the blanket contradicted the idea that attachment derives from an association with nourishment. But how could they show this more convincingly? To pit the drawing power of a food source against the contact comfort of the blanket, they created two artificial mothers. One was a bare wire cylinder with a wooden head and an attached feeding bottle, the other a cylinder wrapped with terry cloth.

When raised with both, the monkeys overwhelmingly preferred the comfy cloth mother. Like other infants clinging to their live mothers, the monkey babies would cling to their cloth mothers when anxious. When exploring their environment, they used her as a secure base, as if attached to her by an invisible elastic band that stretched only so far before pulling them back. Researchers soon learned that other qualities—rocking, warmth, and feeding—made the cloth mother even more appealing.


Human infants, too, become attached to parents who are soft and warm and who rock, feed, and pat. Much parent-infant emotional communication occurs via touch (Hertenstein et al., 2006), which can be either soothing (snuggles) or arousing (tickles). Human attachment also consists of one person providing another with a secure base from which to explore and a safe haven when distressed. As we mature, our secure base and safe haven shift—from parents to peers and partners (Cassidy & Shaver, 1999). But at all ages we are social creatures. We gain strength when someone offers, by words and actions, a safe haven: “I will be here. I am interested in you. Come what may, I will support you” (Crowell & Waters, 1994).

Familiarity

Contact is one key to attachment. Another is familiarity. In many animals, attachments based on familiarity form during a critical period—an optimal period when certain events must take place to facilitate proper development (Bornstein, 1989). As noted earlier, humans seem to have a critical period for language. Goslings, ducklings, and chicks have a critical period for attachment, called imprinting, which falls in the hours shortly after hatching, when the first moving object they see is normally their mother. From then on, the young fowl follow her, and her alone.

imprinting the process by which certain animals form attachments during a critical period very early in life.

Konrad Lorenz (1937) explored this rigid attachment process. He wondered: What would ducklings do if he was the first moving creature they observed? What they did was follow him around: Everywhere that Konrad went, the ducks were sure to go. Although baby birds imprint best to their own species, they also will imprint on a variety of moving objects—an animal of another species, a box on wheels, a bouncing ball (Colombo, 1982; Johnson, 1992). Once formed, this attachment is difficult to reverse.

Children—unlike ducklings—do not imprint. However, they do become attached to what they’ve known. Mere exposure to people and things fosters fondness. Children like to reread the same books, rewatch the same movies, and reenact family traditions. They prefer to eat familiar foods, live in the same familiar neighborhood, and attend school with the same old friends. Familiarity is a safety signal. Familiarity breeds content.


Attachment Differences

4-7: How have psychologists studied attachment differences, and what have they learned?

What accounts for children’s attachment differences? To answer this question, Mary Ainsworth (1979) designed the strange situation experiment. She observed mother-infant pairs at home during their first six months. Later she observed the 1-year-old infants in a strange situation (usually a laboratory playroom). Such research has shown that about 60 percent of infants display secure attachment. In their mother’s presence they play comfortably, happily exploring their new environment. When she leaves, they become distressed; when she returns, they seek contact with her.

Other infants show insecure attachment, marked by either anxiety or avoidance of trusting relationships. They are less likely to explore their surroundings; they may even cling to their mother. When she leaves, they either cry loudly and remain upset or seem indifferent to her departure and return (Ainsworth, 1973, 1989; Kagan, 1995; van IJzendoorn & Kroonenberg, 1988).

Ainsworth and others found that sensitive, responsive mothers—those who noticed what their babies were doing and responded appropriately—had infants who exhibited secure attachment (De Wolff & van IJzendoorn, 1997). Insensitive, unresponsive mothers—mothers who attended to their babies when they felt like doing so but ignored them at other times—often had infants who were insecurely attached. The Harlows’ monkey studies, with unresponsive artificial mothers, produced even more striking effects. When put in strange situations without their artificial mothers, the deprived infants were terrified.

But is attachment style the result of parenting? Or are other factors also at work?

Temperament and Attachment

How does temperament—a person’s characteristic emotional reactivity and intensity—affect attachment style? Temperament is genetically influenced. Shortly after birth, some babies are noticeably difficult—irritable, intense, and unpredictable. Others are easy—cheerful, relaxed, and feeding and sleeping on predictable schedules (Chess & Thomas, 1987).

temperament a person’s characteristic emotional reactivity and intensity.

The genetic effect appears in physiological differences. Anxious, inhibited infants have high and variable heart rates and a reactive nervous system. When facing new or strange situations, they become more physiologically aroused (Kagan & Snidman, 2004). One form of a gene that regulates the neurotransmitter serotonin predisposes a fearful temperament and, in combination with unsupportive caregiving, an inhibited child (Fox et al., 2007).

Temperament differences typically persist. Consider:

  •  The most emotionally reactive newborns have tended also to be the most reactive 9-month-olds (Wilson & Matheny, 1986; Worobey & Blajda, 1989).

  •  Exceptionally inhibited and fearful 2-year-olds often were still relatively shy as 8-year-olds; about half became introverted adolescents (Kagan et al., 1992, 1994).

  •  The most emotionally intense preschoolers have tended to be relatively intense young adults (Larsen & Diener, 1987). In one long-term study of more than 900 New Zealanders, emotionally reactive and impulsive 3-year-olds developed into somewhat more impulsive, aggressive, and conflict-prone 21-year-olds (Caspi, 2000).

Such evidence supports the conclusion that our biologically rooted temperament helps form our enduring personality (McCrae et al., 2000, 2007; Rothbart et al., 2000).

Parenting studies that neglect such inborn differences, noted Judith Harris (1998), do the equivalent of “comparing foxhounds reared in kennels with poodles reared in apartments.” To separate the effects of nature and nurture on attachment, we would need to vary parenting while controlling temperament. (Pause and think: If you were the researcher, how might you have done this?)

Full-time dad

Dutch researcher Dymphna van den Boom’s solution was to randomly assign 100 temperamentally difficult 6- to 9-month-olds to either an experimental group, in which mothers received personal training in sensitive responding, or to a control group, in which they did not. At 12 months of age, 68 percent of the experimental group infants were rated securely attached, as were only 28 percent of the control group infants. Other studies have confirmed that intervention programs can increase parental sensitivity and, to a lesser extent, infant attachment security (Bakermans-Kranenburg et al., 2003; Van Zeijl et al., 2006).

As many of these examples indicate, researchers have more often studied mother care than father care, but fathers are more than just mobile sperm banks. Despite the widespread attitude that “fathering a child” means impregnating, and “mothering” means nurturing, nearly 100 studies worldwide have shown that a father’s love and acceptance are comparable to a mother’s love in predicting an offspring’s health and well-being (Rohner & Veneziano, 2001). In one mammoth British study following 7259 children from birth to adulthood, those whose fathers were most involved in parenting (through outings, reading to them, and taking an interest in their education) tended to achieve more in school, even after controlling for other factors such as parental education and family wealth (Flouri & Buchanan, 2004). Fathers matter.

Children’s anxiety over separation from parents peaks at around 13 months, then gradually declines. This happens whether they live with one parent or two, are cared for at home or in a day-care center, live in North America, Guatemala, or the Kalahari Desert. Does this mean our need for and love of others also fades away? Hardly. Our capacity for love grows, and our pleasure in touching and holding those we love never ceases. The power of early attachment does nonetheless gradually relax, allowing us to move into a wider range of situations, communicate with strangers more freely, and stay emotionally attached to loved ones despite distance.

“Out of the conflict between trust and mistrust, the infant develops hope, which is the earliest form of what gradually becomes faith in adults.”

Erik Erikson (1983)


Attachment Styles and Later Relationships

Developmental theorist Erik Erikson (1902–1994), working with his wife, Joan Erikson, believed that securely attached children approach life with a sense of basic trust—a sense that the world is predictable and reliable. He attributed basic trust not to environment or inborn temperament, but to early parenting. He theorized that infants blessed with sensitive, loving caregivers form a lifelong attitude of trust rather than fear.

basic trust according to Erik Erikson, a sense that the world is predictable and trustworthy; said to be formed during infancy by appropriate experiences with responsive caregivers.

Although debate continues, many researchers now believe that our early attachments form the foundation for our adult relationships (Birnbaum et al., 2006; Fraley, 2002). Our adult styles of romantic love tend to exhibit secure, trusting attachment; insecure-anxious attachment; or insecure-avoidant attachment (Feeney & Noller, 1990; Rholes & Simpson, 2004; Shaver & Mikulincer, 2007). Feeling insecurely attached to others during childhood, for example, may take two main forms in adulthood (Fraley et al., 2011). One is anxiety, in which people constantly crave acceptance but remain vigilant to signs of possible rejection. The other is avoidance, in which people experience discomfort getting close to others and use avoidant strategies to maintain distance from others.

Adult attachment styles can also affect relationships with one’s own children. Avoidant people’s discomfort with closeness makes parenting more stressful and unsatisfying (Rholes et al., 2006). But say this for those (nearly half of all humans) who exhibit insecure attachments: Anxious or avoidant tendencies have helped our groups detect or escape dangers (Ein-Dor et al., 2010).

Deprivation of Attachment

4-8: How does childhood neglect or abuse affect children’s attachments?

If secure attachment fosters social trust, what happens when circumstances prevent a child’s forming attachments? In all of psychology, there is no sadder research literature. Babies locked away at home under conditions of abuse or extreme neglect are often withdrawn, frightened, even speechless. The same is true of those reared in institutions without the stimulation and attention of a regular caregiver, as was tragically illustrated during the 1970s and 1980s in Romania. Having decided that economic growth for his impoverished country required more human capital, Nicolae Ceauşescu, Romania’s Communist dictator, outlawed contraception, forbade abortion, and taxed families with fewer than five children. The birthrate indeed skyrocketed. But unable to afford the children they had been coerced into having, many families abandoned them to government-run orphanages with untrained and overworked staff. Child-to-caregiver ratios often were 15 to 1, so the children were deprived of healthy attachments with at least one adult. When tested after Ceauşescu was executed in 1989, these children had lower intelligence scores and double the 20 percent rate of anxiety symptoms found in children assigned to quality foster care settings (Nelson et al., 2009). Dozens of other studies across 19 countries have confirmed that orphaned children tend to fare better on later intelligence tests if raised in family homes. This is especially so for those placed at an early age (van IJzendoorn et al., 2008).

“What is learned in the cradle lasts to the grave.”

French proverb

Most children growing up under adversity (as did the surviving children of the Holocaust) are resilient; they become normal adults (Helmreich, 1992; Masten, 2001). So do most victims of childhood sexual abuse, notes Harvard researcher Susan Clancy (2010), while emphasizing that using children for sex is revolting and never the victim’s fault.

But others, especially those who experience no sharp break from their abusive past, don’t bounce back so readily. The Harlows’ monkeys raised in total isolation, without even an artificial mother, bore lifelong scars. As adults, when placed with other monkeys their age, they either cowered in fright or lashed out in aggression. When they reached sexual maturity, most were incapable of mating. If artificially impregnated, females often were neglectful, abusive, even murderous toward their first-born. Another primate experiment confirmed the abuse-breeds-abuse phenomenon in rhesus monkeys: 9 of 16 females who had been abused by their mothers became abusive parents, as did no female raised by a nonabusive mother (Maestripieri, 2005).


In humans, too, the unloved may become the unloving. Most abusive parents—and many condemned murderers—have reported being neglected or battered as children (Kempe & Kempe, 1978; Lewis et al., 1988). Some 30 percent of people who have been abused later abuse their children—a rate lower than that found in the primate study, but four times the U.S. national rate of child abuse (Dumont et al., 2007; Kaufman & Zigler, 1987).

Although most abused children do not later become violent criminals or abusive parents, extreme early trauma may nevertheless leave footprints on the brain. Abused children exhibit hypersensitivity to angry faces (Pollak, 2008). As adults, they exhibit stronger startle responses (Jovanovic et al., 2009). If repeatedly threatened and attacked while young, normally placid golden hamsters grow up to be cowards when caged with same-sized hamsters, or bullies when caged with weaker ones (Ferris, 1996). Such animals show changes in the brain chemical serotonin, which calms aggressive impulses. A similarly sluggish serotonin response has been found in abused children who become aggressive teens and adults. “Stress can set off a ripple of hormonal changes that permanently wire a child’s brain to cope with a malevolent world,” concluded abuse researcher Martin Teicher (2002).

Such findings help explain why young children who have survived severe or prolonged physical abuse, childhood sexual abuse, or wartime atrocities are at increased risk for health problems, psychological disorders, substance abuse, and criminality (Freyd et al., 2005; Kendall-Tackett et al., 1993, 2004; Wegman & Stetler, 2009). Abuse victims are at considerable risk for depression if they carry a gene variation that spurs stress-hormone production (Bradley et al., 2008). As we will see again and again, behavior and emotion arise from a particular environment interacting with particular genes.

We adults also suffer when our attachment bonds are severed. Whether through death or separation, a break produces a predictable sequence. Agitated preoccupation with the lost partner is followed by deep sadness and, eventually, the beginnings of emotional detachment and a return to normal living (Hazan & Shaver, 1994). Newly separated couples who have long ago ceased feeling affection are sometimes surprised at their desire to be near the former partner. Deep and longstanding attachments seldom break quickly. Detaching is a process, not an event.

Day Care

4-9: How does day care affect children?

Developmental psychologists’ research has uncovered no major impact of maternal employment on children’s development, attachments, and achievements (Friedman & Boyle, 2008; Goldberg et al., 2008; Lucas-Thompson et al., 2010).

Contemporary research now focuses on the effects of differing quality of day care on different types and ages of children (Vandell et al., 2010). Sandra Scarr (1997) explained: Around the world, “high-quality child care consists of warm, supportive interactions with adults in a safe, healthy, and stimulating environment.…Poor care is boring and unresponsive to children’s needs.” Even well-run orphanages can produce healthy, thriving children. In Africa and Asia, where more and more children are losing parents to AIDS and other diseases, orphanages typically are unlike those in Ceauşescu’s Romania, and the children living in quality orphanages fare about as well as those living in communities (Whetten et al., 2009).

Children thrive under varied types of responsive caregiving. Westernized attachment features one or two caregivers and their offspring, but multiple caregivers are the norm in other cultures, such as the Efe of Zaire (Field, 1996; Whaley et al., 2002). Even before an Efe mother holds her newborn, the baby is passed among several women. In the weeks to come, the infant will be constantly held (and fed) by other women and will form strong multiple attachments.

An example of high-quality day care

One ongoing study in 10 American cities has followed 1100 children since the age of 1 month. The researchers found that at ages 4½ to 6, children who had spent the most time in day care had slightly advanced thinking and language skills. They also had an increased rate of aggressiveness and defiance (NICHD, 2002, 2003, 2006). But the child’s temperament, the parents’ sensitivity, and the family’s economic and educational level influenced aggression more than did the time spent in day care.

There is little disagreement that the children who merely exist for nine hours a day in understaffed centers deserve better. What all children need is a consistent, warm relationship with people they can learn to trust. The importance of such relationships extends beyond the preschool years, as Finnish psychologist Lea Pulkkinen (2006) observed in her career-long study of 285 individuals tracked from age 8 to 42. Her finding—that adult monitoring of children predicts favorable outcomes—led her to undertake, with support from Finland’s parliament, a nationwide program of adult-supervised activities for all first and second graders (Pulkkinen, 2004; Rose, 2004).

Parenting Styles

4-10: What are three parenting styles, and how do children’s traits relate to them?

Some parents spank, some reason. Some are strict, some are lax. Some show little affection, some liberally hug and kiss. Do such differences in parenting styles affect children?

The most heavily researched aspect of parenting has been how, and to what extent, parents seek to control their children. Investigators have identified three parenting styles:

  1. Authoritarian parents impose rules and expect obedience: “Don’t interrupt.” “Keep your room clean.” “Don’t stay out late or you’ll be grounded.” “Why? Because I said so.”

  2. Permissive parents submit to their children’s desires. They make few demands and use little punishment.

  3. Authoritative parents are both demanding and responsive. They exert control by setting rules, but, especially with older children, they encourage open discussion and allow exceptions.

Too hard, too soft, and just right, these styles have been called. Research indicates that children with the highest self-esteem, self-reliance, and social competence usually have warm, concerned, authoritative parents (Baumrind, 1996; Buri et al., 1988; Coopersmith, 1967). Those with authoritarian parents tend to have less social skill and self-esteem, and those with permissive parents tend to be more aggressive and immature. The participants in most studies have been middle-class White families, and some critics suggest that effective parenting may vary by culture. Yet studies with families of other races and in more than 200 cultures worldwide have confirmed the social and academic correlates of loving and authoritative parenting (Rohner & Veneziano, 2001; Sorkhabi, 2005; Steinberg & Morris, 2001). For example, two studies of thousands of Germans found that those whose parents had maintained a curfew exhibited better adjustment and greater achievements in young adulthood than did those with permissive parents (Haase et al., 2008).

A word of caution: The association between certain parenting styles (being firm but open) and certain childhood outcomes (social competence) is correlational. Correlation is not causation. Perhaps you can imagine possible explanations for this parenting-competence link.

Cultures vary

Parents who struggle with conflicting advice should also remember that all advice reflects the advice giver’s values. For parents who prize unquestioning obedience or whose children live in dangerous environments, an authoritarian style may have the desired effect. For those who value children’s sociability and self-reliance, authoritative firm-but-open parenting is advisable.

Culture and Child Rearing

Child-rearing practices reflect not only individual values, but also cultural values, which vary across time and place. Should children be raised to be independent or to comply? If you live in a Westernized culture, you likely prefer independence. “You are responsible for yourself,” Western families and schools tell their children. “Follow your conscience. Be true to yourself. Discover your gifts. Think through your personal needs.” A half-century ago and more, however, Western cultural values placed greater priority on obedience, respect, and sensitivity to others (Alwin, 1990; Remley, 1988). “Be true to your traditions,” parents then taught their children. “Be loyal to your heritage and country. Show respect toward your parents and other superiors.” Cultures can change.

Children across time and place have thrived under various child-rearing systems. Many Americans now give children their own bedrooms and entrust them to day care. Upper-class British parents traditionally handed off routine caregiving to nannies, then sent their 10-year-olds off to boarding school. These children generally grew up to be pillars of British society, as did their parents and their boarding-school peers.

Many Asian and African cultures place less value on independence and more on a strong sense of family self—a feeling that what shames the child shames the family, and what brings honor to the family brings honor to the self. These cultures also value emotional closeness, and infants and toddlers may sleep with their mothers and spend their days close to a family member (Morelli et al., 1992; Whiting & Edwards, 1988). In the African Gusii society, babies nurse freely but spend most of the day on their mother’s back—with lots of body contact but little face-to-face and language interaction. When the mother becomes pregnant again, the toddler is weaned and handed over to someone else, often an older sibling. Westerners may wonder about the negative effects of this lack of verbal interaction, but then the African Gusii may in turn wonder about Western mothers pushing their babies around in strollers and leaving them in playpens (Small, 1997).

Parental involvement promotes development

Such diversity in child rearing cautions us against presuming that our culture’s way is the only way to rear children successfully. One thing is certain, however: Whatever our culture, the investment in raising a child buys many years not only of joy and love but of worry and irritation. Yet for most people who become parents, a child is one’s biological and social legacy—one’s personal investment in the human future. To paraphrase psychiatrist Carl Jung, we reach backward into our parents and forward into our children, and through their children into a future we will never see, but about which we must therefore care.

“You are the bows from which your children as living arrows are sent forth.”

Kahlil Gibran, The Prophet, 1923

Reflections on Nature and Nurture

The unique gene combination created when our mother’s egg engulfed our father’s sperm helped form us as individuals. Genes predispose both our shared humanity and our individual differences.

But it is also true that our experiences form us. In the womb, in our families, and in our peer social relationships, we learn ways of thinking and acting. Even differences initiated by our nature may be amplified by our nurture. We are not formed by either nature or nurture, but by the interaction between them. Biological, psychological, and social-cultural forces interact.

Mindful of how others differ from us, however, we often fail to notice the similarities stemming from our shared biology. Regardless of our culture, we humans share the same life cycle. We speak to our infants in similar ways and respond similarly to their coos and cries (Bornstein et al., 1992a,b). All over the world, the children of warm and supportive parents feel better about themselves and are less hostile than are the children of punishing and rejecting parents (Rohner, 1986; Scott et al., 1991). Although Hispanic, Asian, Black, and White Americans differ in school achievement and delinquency, the differences are “no more than skin deep.” To the extent that family structure, peer influences, and parental education predict behavior in one of these ethnic groups, they do so for the others as well. Compared with the person-to-person differences within groups, the differences between groups are small.

Adolescence

4-11: How is adolescence defined, and how do physical changes affect developing teens?

Many psychologists once believed that childhood sets our traits. Today’s developmental psychologists see development as lifelong. As this life-span perspective emerged, psychologists began to look at how maturation and experience shape us not only in infancy and childhood, but also in adolescence and beyond. Adolescence—the years spent morphing from child to adult—starts with the physical beginnings of sexual maturity and ends with the social achievement of independent adult status. In some cultures, where teens are self-supporting, this means that adolescence hardly exists.

adolescence the transition period from childhood to adulthood, extending from puberty to independence.

How will you look back on your life 10 years from now? Are you making choices that someday you will recollect with satisfaction?

G. Stanley Hall (1904), one of the first psychologists to describe adolescence, believed that the tension between biological maturity and social dependence creates a period of “storm and stress.” Indeed, after age 30, many who grow up in independence-fostering Western cultures look back on their teenage years as a time they would not want to relive, a time when their peers’ social approval was imperative, their sense of direction in life was in flux, and their feeling of alienation from their parents was deepest (Arnett, 1999; Macfarlane, 1964).

But for many, adolescence is a time of vitality without the cares of adulthood, a time of rewarding friendships, heightened idealism, and a growing sense of life’s exciting possibilities.

Physical Development

Adolescence begins with puberty, the time when we mature sexually. Puberty follows a surge of hormones, which may intensify moods and which trigger a series of bodily changes outlined in Chapter 5, Gender and Sexuality.

puberty the period of sexual maturation, during which a person becomes capable of reproducing.


Just as in the earlier life stages, the sequence of physical changes in puberty (for example, breast buds and visible pubic hair before menarche—the first menstrual period) is far more predictable than their timing. Some girls start their growth spurt at 9, some boys as late as age 16. Though such variations have little effect on height at maturity, they may have psychological consequences: It is not only when we mature that counts, but how people react to our physical development.

For boys, early maturation has mixed effects. Boys who are stronger and more athletic during their early teen years tend to be more popular, self-assured, and independent, though also more at risk for alcohol use, delinquency, and premature sexual activity (Conley & Rudolph, 2009; Copeland et al., 2010; Lynne et al., 2007). For girls, early maturation can be a challenge (Mendle et al., 2007). If a young girl’s body and hormone-fed feelings are out of sync with her emotional maturity and her friends’ physical development and experiences, she may begin associating with older adolescents or may suffer teasing or sexual harassment (Ge & Natsuaki, 2009).


An adolescent’s brain is also a work in progress. Until puberty, brain cells increase their connections, like trees growing more roots and branches. Then, during adolescence, comes a selective pruning of unused neurons and connections (Blakemore, 2008). What we don’t use, we lose.

As teens mature, their frontal lobes also continue to develop. The growth of myelin, the fatty tissue that forms around axons and speeds neurotransmission, enables better communication with other brain regions (Kuhn, 2006; Silveri et al., 2006). These developments bring improved judgment, impulse control, and long-term planning.

Maturation of the frontal lobes nevertheless lags behind that of the emotional limbic system. Puberty’s hormonal surge and limbic system development help explain teens’ occasional impulsiveness, risky behaviors, and emotional storms—slamming doors and turning up the music (Casey et al., 2008). No wonder younger teens (whose unfinished frontal lobes aren’t yet fully equipped for making long-term plans and curbing impulses) so often succumb to the lure of tobacco, a habit most adult smokers could tell them they will later regret. Teens actually don’t underestimate the risks of smoking—or fast driving or unprotected sex. They just, when reasoning from their gut, weigh the immediate benefits more heavily (Reyna & Farley, 2006; Steinberg, 2007, 2010). They seek thrills and rewards, but they can’t yet locate the brake pedal controlling their impulses.

So, when Junior drives recklessly and academically self-destructs, should his parents reassure themselves that “he can’t help it; his frontal cortex isn’t yet fully grown”? They can at least take hope: The brain with which Junior begins his teens differs from the brain with which he will end his teens. Unless he slows his brain development with heavy drinking—leaving him prone to impulsivity and addiction—his frontal lobes will continue maturing until about age 25 (Beckman, 2004; Crews et al., 2007).


In 2004, the American Psychological Association joined seven other medical and mental health associations in filing U.S. Supreme Court briefs arguing against the death penalty for 16- and 17-year-olds. The briefs documented the teen brain’s immaturity “in areas that bear upon adolescent decision making.” Teens are “less guilty by reason of adolescence,” suggested psychologist Laurence Steinberg and law professor Elizabeth Scott (2003; Steinberg et al., 2009). In 2005, by a 5-to-4 margin, the Court concurred, declaring juvenile death penalties unconstitutional.

“If a gun is put in the control of the prefrontal cortex of a hurt and vengeful 15-year-old, and it is pointed at a human target, it will very likely go off.”

National Institutes of Health brain scientist Daniel R. Weinberger, “A Brain Too Young for Good Judgment,” 2001

Cognitive Development

4-12: How did Piaget, Kohlberg, and later researchers describe adolescent cognitive and moral development?

“When the pilot told us to brace and grab our ankles, the first thing that went through my mind was that we must all look pretty stupid.”

Jeremiah Rawlings, age 12, after a 1989 DC-10 crash in Sioux City, Iowa

During the early teen years, reasoning is often self-focused. Adolescents may think their private experiences are unique, something parents just could not understand: “But, Mom, you don’t really know how it feels to be in love” (Elkind, 1978). Capable of thinking about their own thinking, and about other people’s thinking, they also begin imagining what others are thinking about them. (They might worry less if they understood their peers’ similar self-absorption.) Gradually, though, most begin to reason more abstractly.

Developing Reasoning Power

When adolescents achieve the intellectual summit that Jean Piaget called formal operations, they apply their new abstract reasoning tools to the world around them. They may think about what is ideally possible and compare that with the imperfect reality of their society, their parents, and even themselves. They may debate human nature, good and evil, truth and justice. Their sense of what’s fair changes from simple equality to what is proportional to merit (Almas et al., 2010). Having left behind the concrete images of early childhood, they may now seek a deeper conception of God and existence (Elkind, 1970; Worthington, 1989). Reasoning hypothetically and deducing consequences also enables adolescents to detect inconsistencies and spot hypocrisy in others’ reasoning, sometimes leading to heated debates with parents and silent vows never to lose sight of their own ideals (Peterson et al., 1986).

Demonstrating their reasoning ability

Developing Morality

Two crucial tasks of childhood and adolescence are discerning right from wrong and developing character—the psychological muscles for controlling impulses. To be a moral person is to think morally and act accordingly. Jean Piaget and Lawrence Kohlberg proposed that moral reasoning guides moral actions. A more recent view builds on psychology’s game-changing new recognition that much of our functioning occurs not on the “high road” of deliberate, conscious thinking but on the “low road” of unconscious, automatic thinking.


Moral Reasoning

Piaget (1932) believed that children’s moral judgments build on their cognitive development. Agreeing with Piaget, Lawrence Kohlberg (1981, 1984) sought to describe the development of moral reasoning, the thinking that occurs as we consider right and wrong. Kohlberg posed moral dilemmas (for example, whether a person should steal medicine to save a loved one’s life) and asked children, adolescents, and adults whether the action was right or wrong. His analysis of their answers led him to propose three basic levels of moral thinking: preconventional, conventional, and postconventional (TABLE 4.2). Kohlberg claimed these levels form a moral ladder. As with all stage theories, the sequence is unvarying. We begin on the bottom rung and ascend to varying heights. Kohlberg’s critics have noted that his postconventional stage is culturally limited, appearing mostly among people who prize individualism (Eckensberger, 1994; Miller & Bersoff, 1995).

Table 4.2: Kohlberg’s Levels of Moral Thinking

  •  Preconventional morality (before age 9). Focus: Self-interest; obey rules to avoid punishment or gain concrete rewards. Example: “If you save your dying wife, you’ll be a hero.”

  •  Conventional morality (early adolescence). Focus: Uphold laws and rules to gain social approval or maintain social order. Example: “If you steal the drug for her, everyone will think you’re a criminal.”

  •  Postconventional morality (adolescence and beyond). Focus: Actions reflect belief in basic rights and self-defined ethical principles. Example: “People have a right to live.”

Moral Intuition

Psychologist Jonathan Haidt (2002, 2006, 2010) believes that much of our morality is rooted in moral intuitions—“quick gut feelings, or affectively laden intuitions.” In this intuitionist view, the mind makes moral judgments as it makes aesthetic judgments—quickly and automatically. We feel disgust when seeing people engaged in degrading or subhuman acts. Even a disgusting taste in the mouth heightens people’s disgust over various moral transgressions (Eskine et al., 2011). We feel elevation—a tingly, warm, glowing feeling in the chest—when seeing people display exceptional generosity, compassion, or courage. Such feelings in turn trigger moral reasoning, says Haidt.

One woman recalled driving through her snowy neighborhood with three young men as they passed “an elderly woman with a shovel in her driveway. I did not think much of it, when one of the guys in the back asked the driver to let him off there.…When I saw him jump out of the back seat and approach the lady, my mouth dropped in shock as I realized that he was offering to shovel her walk for her.” Witnessing this unexpected goodness triggered elevation: “I felt like jumping out of the car and hugging this guy. I felt like singing and running, or skipping and laughing. I felt like saying nice things about people” (Haidt, 2000).

“Could human morality really be run by the moral emotions,” Haidt wonders, “while moral reasoning struts about pretending to be in control?” Consider the desire to punish. Laboratory games reveal that the desire to punish wrongdoings is mostly driven not by reason (such as an objective calculation that punishment deters crime) but rather by emotional reactions, such as moral outrage (Darley, 2009). After the emotional fact, moral reasoning—our mind’s press secretary—aims to convince us and others of the logic of what we have intuitively felt.

This intuitionist perspective on morality finds support in a study of moral paradoxes. Imagine seeing a runaway trolley headed for five people. All will certainly be killed unless you throw a switch that diverts the trolley onto another track, where it will kill one person. Should you throw the switch? Most say Yes. Kill one, save five.

Now imagine the same dilemma, except that your opportunity to save the five requires you to push a large stranger onto the tracks, where he will die as his body stops the trolley. Kill one, save five? The logic is the same, but most say No. Seeking to understand why, a Princeton research team led by Joshua Greene (2001) used brain imaging to spy on people’s neural responses as they contemplated such dilemmas. Only when given the body-pushing type of moral dilemma did their brain’s emotion areas activate. Despite the identical logic, the personal dilemma engaged emotions that altered moral judgment.

While the new research illustrates the many ways moral intuitions trump moral reasoning, other research reaffirms the importance of moral reasoning. The religious and moral reasoning of the Amish, for example, shapes their practices of forgiveness, communal life, and modesty (Narvaez, 2010). Joshua Greene (2010) likens our moral cognition to a camera. Usually, we rely on the automatic point-and-shoot mode. But sometimes we use reason to manually override the camera’s automatic impulse.

Moral Action

Our moral thinking and feeling surely affect our moral talk. But sometimes talk is cheap and emotions are fleeting. Morality involves doing the right thing, and what we do also depends on social influences. As political theorist Hannah Arendt (1963) observed, many Nazi concentration camp guards during World War II were ordinary “moral” people who were corrupted by a powerfully evil situation.

“It is a delightful harmony when doing and saying go together.”

Michel Eyquem de Montaigne (1533–1592)

Today’s character education programs tend to focus on the whole moral package—thinking, feeling, and doing the right thing. Research has demonstrated that as children’s thinking matures, their behavior also becomes less selfish and more caring (Krebs & Van Hesteren, 1994; Miller et al., 1996). Programs now also teach children empathy for others’ feelings, and the self-discipline needed to restrain one’s own impulses—to delay small gratifications now to enable bigger rewards later. Those who have learned to delay gratification have become more socially responsible, academically successful, and productive (Funder & Block, 1989; Mischel et al., 1988, 1989). In service-learning programs, where teens have tutored, cleaned up their neighborhoods, and assisted older adults, their sense of competence and desire to serve has increased, and their school absenteeism and dropout rates have diminished (Andersen, 1998; Piliavin, 2003). Moral action feeds moral attitudes.

Social Development

4-13: What are the social tasks and challenges of adolescence?

“Somewhere between the ages of 10 and 13 (depending on how hormone-enhanced their beef was), children entered adolescence, a.k.a. ‘the de-cutening.’”

Jon Stewart et al., Earth (The Book), 2010

Theorist Erik Erikson (1963) contended that each stage of life has its own psychosocial task, a crisis that needs resolution. Young children wrestle with issues of trust, then autonomy (independence), then initiative. School-age children strive for competence, feeling able and productive. The adolescent’s task is to synthesize past, present, and future possibilities into a clearer sense of self (TABLE 4.3). Adolescents wonder, “Who am I as an individual? What do I want to do with my life? What values should I live by? What do I believe in?” Erikson called this quest the adolescent’s search for identity.

Table 4.3: Erikson’s Stages of Psychosocial Development

  •  Infancy (to 1 year). Issue: Trust vs. mistrust. Task: If needs are dependably met, infants develop a sense of basic trust.

  •  Toddlerhood (1 to 3 years). Issue: Autonomy vs. shame and doubt. Task: Toddlers learn to exercise their will and do things for themselves, or they doubt their abilities.

  •  Preschool (3 to 6 years). Issue: Initiative vs. guilt. Task: Preschoolers learn to initiate tasks and carry out plans, or they feel guilty about their efforts to be independent.

  •  Elementary school (6 years to puberty). Issue: Competence vs. inferiority. Task: Children learn the pleasure of applying themselves to tasks, or they feel inferior.

  •  Adolescence (teen years into 20s). Issue: Identity vs. role confusion. Task: Teenagers work at refining a sense of self by testing roles and then integrating them to form a single identity, or they become confused about who they are.

  •  Young adulthood (20s to early 40s). Issue: Intimacy vs. isolation. Task: Young adults struggle to form close relationships and to gain the capacity for intimate love, or they feel socially isolated.

  •  Middle adulthood (40s to 60s). Issue: Generativity vs. stagnation. Task: In middle age, people discover a sense of contributing to the world, usually through family and work, or they may feel a lack of purpose.

  •  Late adulthood (late 60s and up). Issue: Integrity vs. despair. Task: Reflecting on his or her life, an older adult may feel a sense of satisfaction or failure.

Forming an Identity

To refine their sense of identity, adolescents in individualistic cultures usually try out different “selves” in different situations. They may act out one self at home, another with friends, and still another at school or on Facebook. If two situations overlap—as when a teenager brings friends home—the discomfort can be considerable. The teen asks, “Which self should I be? Which is the real me?” The resolution is a self-definition that unifies the various selves into a consistent and comfortable sense of who one is—an identity.

identity our sense of self; according to Erikson, the adolescent’s task is to solidify a sense of self by testing and integrating various roles.

“Self-consciousness, the recognition of a creature by itself as a ‘self,’ [cannot] exist except in contrast with an ‘other,’ a something which is not the self.”

C. S. Lewis, The Problem of Pain, 1940

For both adolescents and adults, group identities are often formed by how we differ from those around us. When living in Britain, I become conscious of my Americanness. When spending time with my daughter in Africa, I become conscious of my minority White race. When surrounded by women, I am mindful of my gender identity. For international students, for those of a minority ethnic group, for people with a disability, for those on a team, a social identity often forms around their distinctiveness.

social identity the “we” aspect of our self-concept; the part of our answer to “Who am I?” that comes from our group memberships.

Erikson noticed that some adolescents forge their identity early, simply by adopting their parents’ values and expectations. (Traditional, less individualistic cultures teach adolescents who they are, rather than encouraging them to decide on their own.) Other adolescents may adopt the identity of a particular peer group—jocks, preps, geeks, goths.

Most young people do develop a sense of contentment with their lives. When American teens were asked whether a series of statements described them, 81 percent said Yes to “I would choose my life the way it is right now.” The other 19 percent agreed that “I wish I were somebody else” (Lyons, 2004). Reflecting on their existence, 75 percent of American collegians say they “discuss religion/spirituality” with friends, “pray,” and agree that “we are all spiritual beings” and “search for meaning/purpose in life” (Astin et al., 2004; Bryant & Astin, 2008). This would not surprise Stanford psychologist William Damon and his colleagues (2003), who have contended that a key task of adolescence is to achieve a purpose—a desire to accomplish something personally meaningful that makes a difference to the world beyond oneself.

Who shall I be today?

Several nationwide studies indicate that young Americans’ self-esteem falls during the early to mid-teen years, and, for girls, depression scores often increase. But then self-image rebounds during the late teens and twenties (Erol & Orth, 2011; Robins et al., 2002; Twenge & Nolen-Hoeksema, 2002). Late adolescence is also a time when agreeableness and emotional stability scores increase (Klimstra et al., 2009).

These are the years when many people in industrialized countries begin exploring new opportunities by attending college or working full time. Many college seniors have achieved a clearer identity and a more positive self-concept than they had as first-year students (Waterman, 1988). Collegians who have achieved a clear sense of identity are less prone to alcohol misuse (Bishop et al., 2005).

Erikson contended that adolescent identity formation (which continues into adulthood) is followed in young adulthood by a developing capacity for intimacy, the ability to form emotionally close relationships. Romantic relationships, which tend to be emotionally intense, are reported by some two in three North American 17-year-olds, but fewer among those in collectivist countries such as China (Collins et al., 2009; Li et al., 2010). Those who enjoy high-quality (intimate, supportive) relationships with family and friends tend also to enjoy similarly high-quality romantic relationships in adolescence, which set the stage for healthy adult relationships. Such relationships are, for most of us, a source of great pleasure.

intimacy in Erikson’s theory, the ability to form close, loving relationships; a primary developmental task in late adolescence and early adulthood.

Parent and Peer Relationships

4-14: How do parents and peers influence adolescents?

As adolescents in Western cultures seek to form their own identities, they begin to pull away from their parents (Shanahan et al., 2007). The preschooler who can’t be close enough to her mother, who loves to touch and cling to her, becomes the 14-year-old who wouldn’t be caught dead holding hands with Mom. The transition occurs gradually, but this period is typically a time of diminishing parental influence and growing peer influence.

“Men resemble the times more than they resemble their fathers.”

Ancient Arab proverb

As Aristotle long ago recognized, we humans are “the social animal.” At all ages, but especially during childhood and adolescence, we seek to fit in with our groups and are influenced by them (Harris, 1998, 2000):

  •  Children who hear English spoken with one accent at home and another in the neighborhood and at school will invariably adopt the accent of their peers, not their parents. Accents (and slang) reflect culture, “and children get their culture from their peers,” noted Judith Rich Harris (2007).

  •  Teens who start smoking typically have friends who model smoking, suggest its pleasures, and offer cigarettes (J. S. Rose et al., 1999; R. J. Rose et al., 2003). Part of this peer similarity may result from a selection effect, as kids seek out peers with similar attitudes and interests. Those who smoke (or don’t) may select as friends those who also smoke (or don’t).

  •  When Mihaly Csikszentmihalyi [chick-SENT-me-hi] and Jeremy Hunter (2003) used a beeper to sample the daily experiences of American teens, they found them unhappiest when alone and happiest when with friends.

By adolescence, parent-child arguments occur more often, usually over mundane things—household chores, bedtime, homework (Tesser et al., 1989). Conflict during the transition to adolescence tends to be greater with first-born than with second-born children, and greater with mothers than with fathers (Burk et al., 2009; Shanahan et al., 2007).

For a minority of parents and their adolescents, differences lead to real splits and great stress (Steinberg & Morris, 2001). But most disagreements are at the level of harmless bickering. And most adolescents—6000 of them in 10 countries, from Australia to Bangladesh to Turkey—have said they like their parents (Offer et al., 1988). “We usually get along but …,” adolescents often reported (Galambos, 1992; Steinberg, 1987).

“I love u guys.”

Emily Keyes’ final text message to her parents before dying in a Colorado school shooting, 2006

Positive parent-teen relations and positive peer relations often go hand in hand. High school girls who had the most affectionate relationships with their mothers tended also to enjoy the most intimate friendships with girlfriends (Gold & Yanof, 1985). And teens who felt close to their parents have tended to be healthy and happy and to do well in school (Resnick et al., 1997). Of course, we can state this correlation the other way: Misbehaving teens are more likely to have tense relationships with parents and other adults.

Although heredity does much of the heavy lifting in forming individual temperament and personality differences, parents and peers influence teens’ behaviors and attitudes.

Most teens are herd animals, talking, dressing, and acting more like their peers than their parents. What their friends are, they often become, and what “everybody’s doing,” they often do. In teen calls to hotline counseling services, peer relationships have been the most discussed topic (Boehm et al., 1999). In 2008, according to a Nielsen study, the average American 13- to 17-year-old sent or received more than 1700 text messages a month (Steinhauer & Holson, 2008). Many adolescents become absorbed by social networking, sometimes with a compulsive use that produces “Facebook fatigue.” Online communication stimulates intimate self-disclosure—both for better (support groups) and for worse (online predators and extremist groups) (Subrahmanyam & Greenfield, 2008; Valkenburg & Peter, 2009).

For those who feel excluded by their peers, the pain is acute. “The social atmosphere in most high schools is poisonously clique-driven and exclusionary,” observed social psychologist Elliot Aronson (2001). Most excluded “students suffer in silence.…A small number act out in violent ways against their classmates.” Those who withdraw are vulnerable to loneliness, low self-esteem, and depression (Steinberg & Morris, 2001). Peer approval matters.

Teens have tended to see their parents as having more influence in other areas—for example, in shaping their religious faith and in thinking about college and career choices (Emerging Trends, 1997). A Gallup Youth Survey revealed that most shared their parents’ political views (Lyons, 2005).

THINKING CRITICALLY ABOUT: How Much Credit or Blame Do Parents Deserve?

In procreation, a woman and a man shuffle their gene decks and deal a life-forming hand to their child-to-be, who is then subjected to countless influences beyond their control. Parents, nonetheless, feel enormous satisfaction in their children’s successes, and feel guilt or shame over their failures. They beam over the child who wins an award. They wonder where they went wrong with the child who is repeatedly called into the principal’s office. Freudian psychiatry and psychology encouraged such ideas, by blaming problems from asthma to schizophrenia on “bad mothering,” and society has reinforced parent blaming. Believing that parents shape their offspring as a potter molds clay, people readily praise parents for their children’s virtues and blame them for their children’s vices. Popular culture endlessly proclaims the psychological harm toxic parents inflict on their fragile children. No wonder having and raising children can seem so risky.

But do parents really produce future adults with an inner wounded child by being (take your pick from the toxic-parenting lists) overbearing—or uninvolved? Pushy—or ineffectual? Overprotective—or distant? Are children really so easily wounded? If so, should we then blame our parents for our failings, and ourselves for our children’s failings? Or does talk of wounding fragile children through normal parental mistakes trivialize the brutality of real abuse?

Parents do matter. The power of parenting is clearest at the extremes: the abused children who become abusive, the neglected who become neglectful, the loved but firmly handled who become self-confident and socially competent. The power of the family environment also appears in the remarkable academic and vocational successes of children of people who fled from Vietnam and Cambodia—successes attributed to close-knit, supportive, even demanding families (Caplan et al., 1992).

Yet in personality measures, shared environmental influences from the womb onward typically account for less than 10 percent of children’s differences. In the words of behavior geneticists Robert Plomin and Denise Daniels (1987; Plomin, 2011), “Two children in the same family are [apart from their shared genes] as different from one another as are pairs of children selected randomly from the population.” To developmental psychologist Sandra Scarr (1993), this implied that “parents should be given less credit for kids who turn out great and blamed less for kids who don’t.” Knowing children are not easily sculpted by parental nurture, perhaps parents can relax a bit more and love their children for who they are.

Howard Gardner (1998) has concluded that parents and peers are complementary:

Peer power

Parents are more important when it comes to education, discipline, responsibility, orderliness, charitableness, and ways of interacting with authority figures. Peers are more important for learning cooperation, for finding the road to popularity, for inventing styles of interaction among people of the same age. Youngsters may find their peers more interesting, but they will look to their parents when contemplating their own futures. Moreover, parents [often] choose the neighborhoods and schools that supply the peers.

This power to select a child’s neighborhood and schools gives parents an ability to influence the culture that shapes the child’s peer group. And because neighborhood influences matter, parents may want to become involved in intervention programs that aim at a whole school or neighborhood. If the vapors of a toxic climate are seeping into a child’s life, that climate—not just the child—needs reforming. Even so, peers are but one medium of cultural influence. As an African proverb declares, “It takes a village to raise a child.”

Emerging Adulthood

4-15: What is emerging adulthood?

In the Western world, adolescence now roughly corresponds to the teen years. At earlier times, and in other parts of the world today, this slice of life has been much briefer (Baumeister & Tice, 1986). Shortly after sexual maturity, young people would assume adult responsibilities and status. The event might be celebrated with an elaborate initiation—a public rite of passage. The new adult would then work, marry, and have children.

When schooling became compulsory in many Western countries, independence was put on hold until after graduation. From Europe to Australia, adolescents are now taking more time to establish themselves as adults. In the United States, for example, the average age at first marriage has increased more than 4 years since 1960 (to 28 for men, 26 for women). In 1960, three in four women and two in three men had, by age 30, finished school, left home, become financially independent, married, and had a child. Today, fewer than half of 30-year-old women and one-third of men have achieved these five milestones (Henig, 2010).

Delayed independence has overlapped with an earlier onset of puberty. Together, later independence and earlier sexual maturity have widened the once-brief interlude between biological maturity and social independence. In prosperous communities, the time from 18 to the mid-twenties is an increasingly not-yet-settled phase of life, which some now call emerging adulthood (Arnett, 2006, 2007; Reitzle, 2006). No longer adolescents, these emerging adults, having not yet assumed full adult responsibilities and independence, feel “in between.” After high school, those who enter the job market or go to college may be managing their own time and priorities more than ever before. Yet they may be doing so from their parents’ home—unable to afford their own place and perhaps still emotionally dependent as well. Recognizing today’s more gradually emerging adulthood, the U.S. government now allows dependent children up to age 26 to remain on their parents’ health insurance (Cohen, 2010).


In the 1890s, the average interval between a woman’s first menstrual period and marriage, which typically marked a transition to adulthood, was about 7 years; in industrialized countries today it is about 12 years (Guttmacher, 1994, 2000). Although many adults are unmarried, later marriage combines with prolonged education and earlier menarche to help stretch out the transition to adulthood.

emerging adulthood for some people in modern cultures, a period from the late teens to mid-twenties, bridging the gap between adolescent dependence and full independence and responsible adulthood.

Reflections on Continuity and Stages

Let’s pause now to reflect on the second developmental issue introduced at the beginning of this chapter: continuity and stages. Do adults differ from infants as a giant redwood differs from its seedling—a difference created by gradual, cumulative growth? Or do they differ as a butterfly differs from a caterpillar—a difference of distinct stages?

Generally speaking, researchers who emphasize experience and learning see development as a slow, continuous shaping process. Those who emphasize biological maturation tend to see development as a sequence of genetically predisposed stages or steps: Although progress through the various stages may be quick or slow, everyone passes through the stages in the same order.

Are there clear-cut stages of psychological development, as there are physical stages such as walking before running? We have considered the stage theories of Jean Piaget on cognitive development, Lawrence Kohlberg on moral development, and Erik Erikson on psychosocial development. And we have seen their stage theories criticized: Young children have some abilities Piaget attributed to later stages. Kohlberg’s work reflected an individualistic worldview and emphasized thinking over feeling and acting. And, as you will see in the next section, adult life does not progress through a fixed, predictable series of steps. Chance events can influence us in ways we would never have predicted.

Although research casts doubt on the idea that life proceeds through neatly defined, age-linked stages, the concept of stage remains useful. The human brain does experience growth spurts during childhood and puberty that correspond roughly to Piaget’s stages (Thatcher et al., 1987). And stage theories contribute a developmental perspective on the whole life span, by suggesting how people of one age think and act differently when they arrive at a later age.

Adulthood

The unfolding of people’s adult lives continues across the life span. It is, however, more difficult to generalize about adulthood stages than about life’s early years. If you know that James is a 1-year-old and Jamal is a 10-year-old, you could say a great deal about each child. Not so with adults who differ by a similar number of years. The boss may be 30 or 60; the marathon runner may be 20 or 50; the 19-year-old may be a parent who supports a child or a child who receives an allowance. Yet our life courses are in some ways similar. Physically, cognitively, and especially socially, we differ at age 50 from our 25-year-old selves. In the discussion that follows, we recognize these differences and use three terms: early adulthood (roughly twenties and thirties), middle adulthood (to age 65), and late adulthood (the years after 65). Within each of these stages, people will vary widely in physical, psychological, and social development.

Physical Development

4-16: What physical changes occur during middle and late adulthood?

Like the declining daylight after the summer solstice, our physical abilities—muscular strength, reaction time, sensory keenness, and cardiac output—all begin an almost imperceptible decline in our mid-twenties. Athletes are often the first to notice. World-class sprinters and swimmers peak by their early twenties. Women—who mature earlier than men—also peak earlier. But most of us—especially those of us whose daily lives do not require top physical performance—hardly perceive the early signs of decline.

Physical Changes in Middle Adulthood

Post-40 athletes know all too well that physical decline gradually accelerates. During early and middle adulthood, physical vigor has less to do with age than with a person’s health and exercise habits. Many of today’s physically fit 50-year-olds run 4 miles with ease, while sedentary 25-year-olds find themselves huffing and puffing up two flights of stairs.

Adult abilities vary widely

Aging also brings a gradual decline in fertility, especially for women. For a 35- to 39-year-old woman, the chances of getting pregnant after a single act of intercourse are only half those of a woman 19 to 26 (Dunson et al., 2002). Men experience a gradual decline in sperm count, testosterone level, and speed of erection and ejaculation. Women experience menopause, as menstrual cycles end, usually within a few years of age 50. Expectations and attitudes influence the emotional impact of this event. Is it a sign of lost femininity and growing old? Or is it liberation from menstrual periods and fears of pregnancy? For men, too, expectations can influence perceptions. Some experience distress related to a perception of declining virility and physical capacities, but most age without such problems.

menopause the time of natural cessation of menstruation; also refers to the biological changes a woman experiences as her ability to reproduce declines.

With age, sexual activity lessens. Nevertheless, most men and women remain capable of satisfying sexual activity, and most express satisfaction with their sex life. This was true of 70 percent of Canadians surveyed (ages 40 to 64) and 75 percent of Finns (ages 65 to 74) (Kontula & Haavio-Mannila, 2009; Wright, 2006). In another survey, 75 percent of respondents reported being sexually active into their eighties (Schick et al., 2010). And in an American Association of Retired Persons sexuality survey, it was not until age 75 or older that most women and nearly half of men reported little sexual desire (DeLamater & Sill, 2005). Given good health and a willing partner, the flames of desire, though simmered down, live on. As Alex Comfort (1992, p. 240) jested, “The things that stop you having sex with age are exactly the same as those that stop you riding a bicycle (bad health, thinking it looks silly, no bicycle).”

Physical Changes in Later Life

Is old age “more to be feared than death” (Juvenal, Satires)? Or is life “most delightful when it is on the downward slope” (Seneca, Epistulae ad Lucilium)? What is it like to grow old?

Strength and Stamina

Although physical decline begins in early adulthood, we are not usually acutely aware of it until later life, when the stairs get steeper, the print gets smaller, and other people seem to mumble more. Muscle strength, reaction time, and stamina diminish in late adulthood. As a lifelong basketball player, I find myself increasingly not racing for that loose ball. But even diminished vigor is sufficient for normal activities. Moreover, exercise slows aging. Active older adults tend to be mentally quick older adults. Physical exercise not only enhances muscles, bones, and energy and helps to prevent obesity and heart disease, it also stimulates brain cell development and neural connections, thanks perhaps to increased oxygen and nutrient flow (Erickson et al., 2010; Pereira et al., 2007).

“For some reason, possibly to save ink, the restaurants had started printing their menus in letters the height of bacteria.”

Dave Barry, Dave Barry Turns Fifty, 1998

Sensory Abilities

With age, visual sharpness diminishes, and distance perception and adaptation to light-level changes are less acute. The eye’s pupil shrinks and its lens becomes less transparent, reducing the amount of light reaching the retina: A 65-year-old retina receives only about one-third as much light as its 20-year-old counterpart (Kline & Schieber, 1985). Thus, to see as well as a 20-year-old when reading or driving, a 65-year-old needs three times as much light—a reason for buying cars with untinted windshields. This also explains why older people sometimes ask younger people, “Don’t you need better light for reading?”

Most stairway falls taken by older people occur on the top step, precisely where the person typically descends from a window-lit hallway into the darker stairwell (Fozard & Popkin, 1978). Our knowledge of aging could be used to design environments that would reduce such accidents (National Research Council, 1990).

The senses of smell and hearing also diminish. In Wales, teens’ loitering around a convenience store has been discouraged by a device that emits an aversive high-pitched sound almost no one over 30 can hear (Lyall, 2005). Some students have also used that pitch to their advantage with cell-phone ringtones their instructors cannot hear (Vitello, 2006).

Health

For those growing older, there is both bad and good news about health. The bad news: The body’s disease-fighting immune system weakens, making older adults more susceptible to life-threatening ailments, such as cancer and pneumonia. The good news: Thanks partly to a lifetime’s accumulation of antibodies, people over 65 suffer fewer short-term ailments, such as common flu and cold viruses. One study found they were half as likely as 20-year-olds and one-fifth as likely as preschoolers to suffer upper respiratory flu each year (National Center for Health Statistics, 1990).

The Aging Brain

Up to the teen years, we process information with greater and greater speed (Fry & Hale, 1996; Kail, 1991). But compared with teens and young adults, older people take a bit more time to react, to solve perceptual puzzles, even to remember names (Bashore et al., 1997; Verhaeghen & Salthouse, 1997). The neural processing lag is greatest on complex tasks (Cerella, 1985; Poon, 1987). At video games, most 70-year-olds are no match for a 20-year-old.

Slower neural processing combined with diminished sensory abilities can increase accident risks. Fatal accident rates per mile driven increase sharply after age 75. By age 85, they exceed the 16-year-old level. Nevertheless, because older people drive less, they account for fewer than 10 percent of crashes (Coughlin et al., 2004).


Slowing reactions contribute to increased accident risks among those 75 and older, and their greater fragility increases their risk of death when accidents happen (NHTSA, 2000). Would you favor driver exams based on performance, not age, to screen out those whose slow reactions or sensory impairments indicate accident risk?

How old does a person have to be before you think of him or her as old? It depends on whom you ask. For 18- to 29-year-olds, 67 was old. For those 60 and over, old was 76 (Yankelovich, 1995).

Brain regions important to memory begin to atrophy during aging (Schacter, 1996). In young adulthood, a small, gradual net loss of brain cells begins, contributing by age 80 to a brain-weight reduction of 5 percent or so. Earlier, we noted that late-maturing frontal lobes help account for teen impulsivity. Late in life, atrophy of the inhibition-controlling frontal lobes seemingly explains older people’s occasional blunt questions and comments (“Have you put on weight?”) (von Hippel, 2007).

As noted earlier, exercise helps counteract some effects of brain aging. It aids memory by stimulating the development of neural connections and by promoting neurogenesis, the birth of new nerve cells, in the hippocampus. Sedentary older adults randomly assigned to aerobic exercise programs exhibit enhanced memory, sharpened judgment, and reduced risk of significant cognitive decline (Colcombe et al., 2004; Liang et al., 2010; Nazimek, 2009).

“I am still learning.”

Michelangelo, 1560, at age 85

Exercise also helps maintain the telomeres, which protect the ends of chromosomes (Cherkas et al., 2008; Erickson, 2009; Pereira et al., 2007). With age, telomeres wear down, much as the tip of a shoelace frays. This wear is accentuated by smoking, obesity, or stress. As telomeres shorten, aging cells may die without being replaced with perfect genetic replicas (Epel, 2009).

The message for seniors is clear: We are more likely to rust from disuse than to wear out from overuse.

Cognitive Development

4-17: How does memory change with age?

Among the most intriguing developmental psychology questions is whether adult cognitive abilities, such as memory, intelligence, and creativity, parallel the gradually accelerating decline of physical abilities.

As we age, we remember some things well. Looking back in later life, people asked to recall the one or two most important events over the last half-century tend to name events from their teens or twenties (Conway et al., 2005; Rubin et al., 1998). Whatever people experience around this time of life—the events of 9/11, the civil rights movement, World War II—becomes pivotal (Pillemer, 1998; Schuman & Scott, 1989). Our teens and twenties are a time of so many memorable “firsts”—first date, first job, first day at college or university, first meeting of in-laws.

Early adulthood is indeed a peak time for some types of learning and remembering. In one test of recall, people (1205 of them) watched videotapes as 14 strangers said their names, using a common format: “Hi, I’m Larry” (Crook & West, 1990). Then those strangers reappeared and gave additional details. For example, saying “I’m from Philadelphia” provided visual and voice cues for remembering the person’s name. After a second and third replay of the introductions, everyone remembered more names, but younger adults consistently surpassed older adults.

Perhaps it is not surprising, then, that nearly two-thirds of people over age 40 say their memory is worse than it was 10 years ago (KRC, 2001). In fact, how well older people remember depends on the task. In another experiment (Schonfield & Robertson, 1966), when asked to recognize 24 words they had earlier tried to memorize, people showed only a minimal decline in memory. When asked to recall that information without clues, the decline was greater.

In our capacity to learn and remember, as in other areas of development, we differ. Younger adults vary in their abilities to learn and remember, but 70-year-olds vary much more. “Differences between the most and least able 70-year-olds become much greater than between the most and least able 50-year-olds,” reports Oxford researcher Patrick Rabbitt (2006). Some 70-year-olds perform below nearly all 20-year-olds; other 70-year-olds match or outdo the average 20-year-old.

If you are within 5 years of 20, what experiences from the past year will you likely never forget? (This is the time of your life you may best remember when you are 50.)

No matter how quick or slow we are, remembering seems also to depend on the type of information we are trying to retrieve. If the information is meaningless—nonsense syllables or unimportant events—then the older we are, the more errors we are likely to make. If the information is meaningful, older people’s rich web of existing knowledge will help them to hold it. But they may take longer than younger adults to produce the words and things they know: Quick-thinking game show winners are usually young or middle-aged adults (Burke & Shafto, 2004). Older people’s capacity to learn and remember skills declines less than their verbal recall (Graf, 1990; Labouvie-Vief & Schell, 1982; Perlmutter, 1983).

Chapter 9, Thinking, Language, and Intelligence, explores another dimension of cognitive development: intelligence. As we will see, cross-sectional studies (comparing people of different ages) and longitudinal studies (restudying people over time) have identified mental abilities that do and do not change as people age. Age is less a predictor of memory and intelligence than is proximity to death. Tell me whether someone is 8 months or 8 years from death and, regardless of age, you’ve given me a clue to that person’s mental ability. Especially in the last three or four years of life, cognitive decline typically accelerates (Wilson et al., 2007). Researchers call this near-death drop terminal decline (Backman & MacDonald, 2006).

cross-sectional study a study in which people of different ages are compared with one another.

longitudinal study research in which the same people are restudied and retested over a long period.

Social Development

4-18: What themes and influences mark our social journey from early adulthood to death?

Many differences between younger and older adults are created by significant life events. A new job means new relationships, new expectations, and new demands. Marriage brings the joy of intimacy and the stress of merging two lives. The three years surrounding the birth of a child bring increased life satisfaction for most parents (Dyrdal & Lucas, 2011). The death of a loved one creates an irreplaceable loss. Do these adult life events shape a sequence of life changes?

Adulthood’s Ages and Stages

As people enter their forties, they undergo a transition to middle adulthood, a time when they realize that life will soon be mostly behind instead of ahead of them. Some psychologists have argued that for many the midlife transition is a crisis, a time of great struggle, regret, or even feeling struck down by life. The popular image of the midlife crisis is an early-forties man who forsakes his family for a younger girlfriend and a hot sports car. But the fact—reported by large samples of people—is that unhappiness, job dissatisfaction, marital dissatisfaction, divorce, anxiety, and suicide do not surge during the early forties (Hunter & Sundel, 1989; Mroczek & Kolarz, 1998). Divorce, for example, is most common among those in their twenties, suicide among those in their seventies and eighties. One study of emotional instability in nearly 10,000 men and women found “not the slightest evidence” that distress peaks anywhere in the midlife age range (McCrae & Costa, 1990).

“Midway in the journey of our life I found myself in a dark wood, for the straight way was lost.”

Dante, The Divine Comedy, 1314

For the 1 in 4 adults who report experiencing a life crisis, the trigger is not age but a major event, such as illness, divorce, or job loss (Lachman, 2004). Some middle-aged adults describe themselves as a “sandwich generation,” simultaneously supporting their aging parents and their emerging adult children or grandchildren (Riley & Bowen, 2005).

Life events trigger transitions to new life stages at varying ages. The social clock—the definition of “the right time” to leave home, get a job, marry, have children, or retire—varies from era to era and culture to culture. The once-rigid sequence for many Western women—of student to worker to wife to at-home mom to worker again—has loosened. Contemporary women occupy these roles in any order or all at once. The social clock still ticks, but people feel freer about being out of sync with it.

social clock the culturally preferred timing of social events such as marriage, parenthood, and retirement.

Even chance events can have lasting significance, by deflecting us down one road rather than another (Bandura, 1982). Albert Bandura (2005) recalls the ironic true story of a book editor who came to one of Bandura’s lectures on the “Psychology of Chance Encounters and Life Paths”—and ended up marrying the woman who happened to sit next to him. The sequence that led to my authoring this book (which was not my idea) began with my being seated near, and getting to know, a distinguished colleague at an international conference. Chance events can change our lives.

“The important events of a person’s life are the products of chains of highly improbable occurrences.”

Joseph Traub, “Traub’s Law,” 2003

Adulthood’s Commitments

Two basic aspects of our lives dominate adulthood. Erik Erikson called them intimacy (forming close relationships) and generativity (being productive and supporting future generations). Researchers have chosen various terms—affiliation and achievement, attachment and productivity, connectedness and competence. Sigmund Freud (1935) put it most simply: The healthy adult, he said, is one who can love and work.

Love

We typically flirt, fall in love, and commit—one person at a time. “Pair-bonding is a trademark of the human animal,” observed anthropologist Helen Fisher (1993). From an evolutionary perspective, relatively monogamous pairing makes sense: Parents who cooperated to nurture their children to maturity were more likely to have their genes passed along to posterity than were parents who didn’t.

Adult bonds of love are most satisfying and enduring when marked by a similarity of interests and values, a sharing of emotional and material support, and intimate self-disclosure. Couples who seal their love with commitment—via (in one Vermont study) marriage for heterosexual couples and civil unions for homosexual couples—more often endure (Balsam et al., 2008). Marriage bonds are especially likely to last when couples marry after age 20 and are well educated. Compared with their counterparts of 50 years ago, people in Western countries are better educated and marrying later. Yet, ironically, they are nearly twice as likely to divorce. (Both Canada and the United States now have about one divorce for every two marriages, and in Europe, divorce is only slightly less common.) The divorce rate partly reflects women’s lessened economic dependence and men’s and women’s rising expectations. We now hope not only for an enduring bond, but also for a mate who is a wage earner, caregiver, intimate friend, and warm and responsive lover.


Might test driving life together in a “trial marriage” minimize divorce risk? In one Gallup survey of American twenty-somethings, 62 percent thought it would (Whitehead & Popenoe, 2001). In reality, in Europe, Canada, and the United States, those who cohabit before marriage have had higher rates of divorce and marital dysfunction than those who did not cohabit (Jose et al., 2009). The risk appears greatest for those cohabiting prior to engagement (Goodwin et al., 2010; Rhoades et al., 2009).

American children born to cohabiting parents are about five times more likely to experience their parents’ separation than are children born to married parents (Osborne et al., 2007). Two factors contribute. First, cohabiters tend to be initially less committed to the ideal of enduring marriage. Second, they become even less marriage supporting while cohabiting.

What do you think? Does marriage correlate with happiness because marital support and intimacy breed happiness, because happy people more often marry and stay married, or both?

Nonetheless, the institution of marriage endures. Worldwide, reports the United Nations, 9 in 10 heterosexual adults marry. And marriage is a predictor of happiness, sexual satisfaction, income, and physical and mental health (Scott et al., 2010). National Opinion Research Center surveys of nearly 50,000 Americans since 1972 reveal that 40 percent of married adults, though only 23 percent of unmarried adults, have reported being “very happy.” Lesbian couples, too, report greater well-being than those who are alone (Peplau & Fingerhut, 2007; Wayment & Peplau, 1995). Moreover, neighborhoods with high marriage rates typically have low rates of social pathologies such as crime, delinquency, and emotional disorders among children (Myers & Scanzoni, 2005).

“Our love for children is so unlike any other human emotion. I fell in love with my babies so quickly and profoundly, almost completely independently of their particular qualities. And yet 20 years later I was (more or less) happy to see them go—I had to be happy to see them go. We are totally devoted to them when they are little and yet the most we can expect in return when they grow up is that they regard us with bemused and tolerant affection.”

Developmental psychologist Alison Gopnik, “The Supreme Infant,” 2010

Marriages that last are not always devoid of conflict. Some couples fight but also shower each other with affection. Other couples never raise their voices yet also seldom praise each other or nuzzle. Both styles can last. After observing the interactions of 2000 couples, John Gottman (1994) reported one indicator of marital success: at least a five-to-one ratio of positive to negative interactions. Stable marriages provide five times more instances of smiling, touching, complimenting, and laughing than of sarcasm, criticism, and insults. So, if you want to predict which newlyweds will stay together, don’t pay attention to how passionately in love they are. The couples who make it are more often those who refrain from putting down their partners. To prevent a cancerous negativity, successful couples learn to fight fair (to state feelings without insulting) and to steer conflict away from chaos with comments like “I know it’s not your fault” or “I’ll just be quiet for a moment and listen.”

Often, love bears children. For most people, this most enduring of life changes is a happy event. “I feel an overwhelming love for my children unlike anything I feel for anyone else,” said 93 percent of American mothers in a national survey (Erickson & Aird, 2005). Many fathers feel the same. A few weeks after the birth of my first child I was suddenly struck by a realization: “So this is how my parents felt about me?”

When children begin to absorb time, money, and emotional energy, satisfaction with the marriage itself may decline (Doss et al., 2009). This is especially likely among employed women who, more than they expected, carry the traditional burden of doing the chores at home. Putting effort into creating an equitable relationship can thus pay double dividends: a more satisfying marriage, which breeds better parent-child relations (Erel & Burman, 1995).

Although love bears children, children eventually leave home. This departure is a significant and sometimes difficult event. For most people, however, an empty nest is a happy place (Adelmann et al., 1989; Gorchoff et al., 2008). Many parents experience a “postlaunch honeymoon,” especially if they maintain close relationships with their children (White & Edwards, 1990). As Daniel Gilbert (2006) has said, “The only known symptom of ‘empty nest syndrome’ is increased smiling.”

If you have left home, did your parents suffer the “empty nest syndrome”—a feeling of distress focusing on a loss of purpose and relationship? Did they mourn the lost joy of listening for you in the wee hours of Saturday morning? Or did they seem to discover a new freedom, relaxation, and (if together) renewed satisfaction with their own relationship?

Work

For many adults, the answer to “Who are you?” depends a great deal on the answer to “What do you do?” For women and men, choosing a career path is difficult, especially during bad economic times. Even in the best of times, few students in their first two years of college or university can predict their later careers.

In the end, happiness is about having work that fits your interests and provides you with a sense of competence and accomplishment. It is having a close, supportive companion who cheers your accomplishments (Gable et al., 2006). And for some, it includes having children who love you and whom you love and feel proud of.

Well-Being Across the Life Span

4-19: Do self-confidence and life satisfaction vary with life stages?

“When you were born, you cried and the world rejoiced. Live your life in a manner so that when you die the world cries and you rejoice.”

Native American proverb

To live is to grow older. This moment marks the oldest you have ever been and the youngest you will henceforth be. That means we all can look back with satisfaction or regret, and forward with hope or dread. When asked what they would have done differently if they could relive their lives, people’s most common answer has been “Taken my education more seriously and worked harder at it” (Kinnier & Metha, 1989; Roese & Summerville, 2005). Other regrets—“I should have told my father I loved him,” “I regret that I never went to Europe”—have also focused less on mistakes made than on the things one failed to do (Gilovich & Medvec, 1995).

From the teens to midlife, people typically experience a strengthening sense of identity, confidence, and self-esteem (Huang, 2010; Robins & Trzesniewski, 2005). In later life, challenges arise: Income shrinks. Work is often taken away. The body deteriorates. Recall fades. Energy wanes. Family members and friends die or move away. The great enemy, death, looms ever closer. And for those in the terminal decline phase, life satisfaction does decline as death approaches (Gerstorf et al., 2008).

Small wonder that most presume that happiness declines in later life (Lacey et al., 2006). But worldwide, Gallup researchers have found, the over-65 years are not notably unhappy. If anything, positive feelings, supported by enhanced emotional control, grow after midlife and negative feelings subside (Stone et al., 2010; Urry & Gross, 2010). Older adults increasingly use words that convey positive emotions (Pennebaker & Stone, 2003), and they attend less and less to negative information. Compared with younger adults, for example, they are slower to perceive negative faces and more attentive to positive news (Carstensen & Mikels, 2005; Scheibe & Carstensen, 2010). Older adults also have fewer problems in their social relationships (Fingerman & Charles, 2010), and they experience less intense anger, stress, and worry (Stone et al., 2010).

The aging brain may help nurture these positive feelings. Brain scans of older adults show that the amygdala, a neural processing center for emotions, responds less actively to negative events (but not to positive events), and it interacts less with the hippocampus, a brain memory-processing center (Mather et al., 2004; St. Jacques et al., 2009; Williams et al., 2006). Brain-wave reactions to negative images also diminish with age (Kisley et al., 2007).

“At 20 we worry about what others think of us. At 40 we don’t care what others think of us. At 60 we discover they haven’t been thinking about us at all.”

Anonymous

Moreover, at all ages, the bad feelings we associate with negative events fade faster than do the good feelings we associate with positive events (Walker et al., 2003). This contributes to most older people’s sense that life, on balance, has been mostly good. Given that growing older is an outcome of living (an outcome most prefer to early dying), the positivity of later life is comforting. Thanks to biological, psychological, and social-cultural influences, more and more people flourish into later life.

“The best thing about being 100 is no peer pressure.”

Lewis W. Kuester, 2005, on turning 100

Death and Dying

4-20: A loved one’s death triggers what range of reactions?

Warning: If you begin reading the next paragraph, you will die.

But of course, if you hadn’t read this, you would still die in due time. Death is our inevitable end. Most of us will also suffer and cope with the deaths of relatives and friends. Usually, the most difficult separation is from a spouse—a loss suffered by five times more women than men. When, as usually happens, death comes at an expected late-life time, grieving may be relatively short-lived.

“Love—why, I’ll tell you what love is: It’s you at 75 and her at 71, each of you listening for the other’s step in the next room, each afraid that a sudden silence, a sudden cry, could mean a lifetime’s talk is over.”

Brian Moore, The Luck of Ginger Coffey, 1960

Grief is especially severe when a loved one’s death comes suddenly and before its expected time on the social clock. The sudden illness or accident claiming a 45-year-old life partner or a child may trigger a year or more of memory-laden mourning that eventually subsides to a mild depression (Lehman et al., 1987).

For some, however, the loss is unbearable. One Danish long-term study of more than 1 million people found that about 17,000 of them had suffered the death of a child under 18. In the five years following that death, 3 percent of them had a first psychiatric hospitalization. This rate was 67 percent higher than the rate recorded for parents who had not lost a child (Li et al., 2005).

Even so, reactions to a loved one’s death range more widely than most suppose. Some cultures encourage public weeping and wailing; others hide grief. Within any culture, individuals differ. Given similar losses, some people grieve hard and long, others less so (Ott et al., 2007). Contrary to popular misconceptions, however,

  •  terminally ill and bereaved people do not go through identical predictable stages, such as denial before anger (Friedman & James, 2008; Nolen-Hoeksema & Larson, 1999). A Yale study following 233 bereaved individuals through time did, however, find that yearning for the loved one reached a high point four months after the loss, with anger peaking, on average, about a month later (Maciejewski et al., 2007).

  •  those who express the strongest grief immediately do not purge their grief more quickly (Bonanno & Kaltman, 1999; Wortman & Silver, 1989).

  •  bereavement therapy and self-help groups offer support, but there is similar healing power in the passing of time, the support of friends, and the act of giving support and help to others (Baddeley & Singer, 2009; Brown et al., 2008; Neimeyer & Carrier, 2009). Grieving spouses who talk often with others or receive grief counseling adjust about as well as those who grieve more privately (Bonanno, 2004; Stroebe et al., 2005).

“Consider, friend, as you pass by, as you are now, so once was I. As I am now, you too shall be. Prepare, therefore, to follow me.”

Scottish tombstone epitaph

We can be grateful for the waning of death-denying attitudes. Facing death with dignity and openness helps people complete the life cycle with a sense of life’s meaningfulness and unity—the sense that their existence has been good and that life and death are parts of an ongoing cycle. Although death may be unwelcome, life itself can be affirmed even at death. This is especially so for people who review their lives not with despair but with what Erik Erikson called a sense of integrity—a feeling that one’s life has been meaningful and worthwhile.

“At 70, I would say the advantage is that you take life more calmly. You know that ‘this, too, shall pass’!”

Eleanor Roosevelt, 1954

Reflections on Stability and Change

It’s time to reflect on the third big developmental issue: As we follow lives through time, do we find more evidence for stability or change? If reunited with a long-lost grade-school friend, do we instantly realize that “it’s the same old Andy”? Or do people we befriend during one period of life seem like strangers at a later period? (At least one acquaintance of mine would choose the second option. He failed to recognize a former classmate at his 40-year college reunion. The aghast classmate eventually pointed out that she was his long-ago first wife.)

Research reveals that we experience both stability and change. As we have seen, some of our characteristics, such as temperament, are very stable:

  •  One study followed 1000 New Zealanders through time, beginning at age 3. Those who had scored low in conscientiousness and self-control as preschoolers were more vulnerable to ill health, substance abuse, arrest, and single parenthood as 32-year-olds (Moffitt et al., 2011).

  •  Another research team interviewed adults who, 40 years earlier, had their talkativeness, impulsiveness, and humility rated by their elementary school teachers (Nave et al., 2010). To a striking extent, their traits persisted.

“As at 7, so at 70,” says a Jewish proverb. The widest smilers in childhood and college photos are, years later, the ones most likely to enjoy enduring marriages (Hertenstein et al., 2009). While 1 in 4 of the weakest college smilers eventually divorced, only 1 in 20 of the widest smilers did so. As people grow older, personality gradually stabilizes (Ferguson, 2010; Hopwood et al., 2011; Kandler et al., 2010). The struggles of the present may be laying a foundation for a happier tomorrow.

We cannot, however, predict all of our eventual traits based on our early years of life (Kagan et al., 1978, 1998). Some traits, such as social attitudes, are much less stable than temperament (Moss & Susman, 1980). Older children and adolescents learn new ways of coping. Although delinquent children have elevated rates of later work problems, substance abuse, and crime, many confused and troubled children blossom into mature, successful adults (Moffitt et al., 2002; Roberts et al., 2001; Thomas & Chess, 1986). Happily for them, life is a process of becoming.

In some ways, we all change with age. Most shy, fearful toddlers begin opening up by age 4, and most people become more conscientious, stable, agreeable, and self-confident in the years after adolescence (Lucas & Donnellan, 2009; Roberts et al., 2003, 2006, 2008; Shaw et al., 2010). Many irresponsible 18-year-olds have matured into 40-year-old business or cultural leaders. (If you are the former, you aren’t done yet.) Such changes can occur without changing a person’s position relative to others of the same age. The hard-driving young adult may mellow by later life, yet still be a relatively driven senior citizen.

Smiles predict marital stability

Life requires both stability and change. Stability provides our identity. It enables us to depend on others and be concerned about the healthy development of the children in our lives. Our trust in our ability to change gives us our hope for a brighter future. It motivates our concerns about present influences and lets us adapt and grow with experience.