The Neuroscience of 20-Somethings
Ferris Jabr • August 29, 2012

In the opening scene of Lena Dunham's HBO series Girls, the Horvaths tell their 24-year-
old daughter Hannah that they will no longer
support her—or, as her mother puts it: "No.
More. Money." A recent college graduate,
Hannah has been living in Brooklyn,
completing an unpaid internship and
working on a series of personal essays. The
Horvaths intend to give Hannah "one final
push" toward, presumably, a lifestyle that
more closely resembles adulthood. Hannah
protests. Her voice quavers. She tells her
parents that she does not want to see them
the following day, even though they are
leaving town soon: "I have work and then I
have a dinner thing and then I am busy—
trying to become who I am."

Across the United States—and in developed nations around the world—twenty-
somethings like Hannah are taking longer to
finish school, leave home, begin a career,
get married and reach other milestones of
adulthood. These trends are not just
anecdotal; sociologists and psychologists
have gathered supporting data. Robin
Marantz Henig summarizes the patterns in
her 2010 New York Times Magazine feature:

"One-third of people in their 20s move to


a new residence every year. Forty
percent move back home with their
parents at least once. They go through
an average of seven jobs in their 20s,
more job changes than in any other
stretch. Two-thirds spend at least some
time living with a romantic partner
without being married. And marriage
occurs later than ever. The median age at
first marriage in the early 1970s, when
the baby boomers were young, was 21
for women and 23 for men; by 2009 it
had climbed to 26 for women and 28 for
men, five years in a little more than a
generation."

These demographic shifts have transformed the late teens through the mid-twenties into a
distinct stage of life according to Jeffrey
Arnett of Clark University, who calls the new
phase "emerging adulthood." Arnett
acknowledges that emerging adulthood is
relevant only to about 18 percent of the
global population, to certain groups of
twenty-somethings in developed nations
such as the United States, Canada, Western
Europe, Japan and Australia. To make some
broad generalizations, people living in the
rest of the world—particularly in developing
countries—are much more likely to finish
formal education in their teens and marry by
their early twenties.

Although Arnett primarily studies how society, culture and the economy have
created emerging adulthood, some
scientists and journalists have wondered
whether biology is involved as well. Henig
writes that some researchers think a lengthy
preamble to adulthood might be "better-
suited to our neurological hard-wiring" and
that the general ambivalence of twenty-
somethings—feeling that they are sort of
adults, but not really adults— "reflects what
is going on in the brain, which is also both
grown-up and not-quite-grown-up." Most
recently, The Wall Street Journal ran an
article recommending that concerned
parents of twenty-somethings "chill out"
because "recent research into how the brain
develops suggests that people are better
equipped to make major life decisions in
their late 20s than earlier in the decade. The
brain, once thought to be fully grown after
puberty, is still evolving into its adult shape
well into a person's third decade, pruning
away unused connections and
strengthening those that remain, scientists
say."

After I read The Wall Street Journal article, a flock of questions began flapping in my
own twenty-something mind. What does it
mean to have a "fully grown" adult brain
anyway and, if my peers and I do not yet
have such a brain, exactly how un-grown-up
are our noggins, how uncooked our
noodles? Were we neurologically unfit to
make the important decisions about careers
and marriage that some of us have already
made? If emerging adulthood is itself an
emerging phenomenon, what is its precise
relationship to the biology of an organ
whose defining characteristics began
evolving millions of years ago? And does a
Peter Pan brain have any redeeming
qualities?

Budding brains

In an ongoing study that kicked off in 1991, Jay Giedd of the National Institute of Mental
Health has been tracking the brain
development of nearly 4,000 people ranging
in age from a few days to 96 years. Every
two years, Giedd invites his volunteers to
the lab to scan their brains with magnetic
resonance imaging (MRI). Giedd and his
colleagues have learned that, contrary to
neuroscientists' earliest assumptions, the
brain continues to markedly rewire itself
even after puberty.

Between 12 and 25, the brain changes its structure in a few important ways. Like an
overeager forest, neurons in the early
adolescent brain become bushier, growing
more and more overlapping branches whose
twigs reach toward one another, nearly
touching except for tiny gaps known as
synapses. When an electrical impulse—or
action potential—reaches a twig, the neuron
flings spurts of chemical messages across
the synapse. Over time, depending on how
teens busy their minds, twigs around the
least used synapses wither, while twigs
flanking the most trafficked synapses grow
thicker, strengthening those connections.
Meanwhile, as neurons in the adolescent
brain make and break connections, glia—
non-firing brain cells—set to work wrapping
neurons in a fatty white tissue known as
myelin, considerably increasing the speed at
which electrical impulses travel along
neurons' branches.

Although these developmental changes continue far longer than researchers initially
thought, they are not as dramatic in the
twenties as they are in the teens. "In the
twenties, the brain is definitely still
changing, but it's not rampant biological
change," explains Beatriz Luna of the
University of Pittsburgh. "Most of the
brain's systems are good to go in one's
twenties." In an email message, B.J. Casey
of Weill Cornell Medical College made a
similar remark: "Most of my functional
imaging work shows the greatest brain
changes between 13 and 17 with relative
stability across 20s."

In her own studies, Luna has found that, at least on certain cognitive tasks, people in
their early twenties perform just as well as
people in their late twenties. She often asks
her volunteers to deliberately look away
from a flashing light on a screen—a test of
impulse inhibition, since flickers attract our
attention. "Ten-year-olds stink at it, failing
about 45 percent of the time," as David
Dobbs put it in his National Geographic
feature. "Teens do much better. In fact, by
age 15 they can score as well as adults if
they're motivated, resisting temptation
about 70 to 80 percent of the time…And by
age 20, their brains respond to this task
much as the adults' do."

In Luna's studies, brain behavior changed in parallel with improving scores. Older
volunteers showed higher activity in brain
regions involved in identifying errors, such
as the anterior cingulate cortex. Related
research has shown that older adolescents
have stronger bridges of neural tissue
connecting the emotional and motor centers
of their brains with the prefrontal cortex, an
"executive" brain region known for, among
many other things, inhibiting impulses and
tempering bubbling emotions. Luna and
other researchers now think that, more than
the growth of any single brain region, this
increasing interconnectedness
characterizes brain development in the
twenties. Of course, that doesn't mean that
once someone leaves behind their twenties
they will never again lose their cool or act
thoughtlessly instead of prudently.
Individual variation makes all the difference.
Some teens and twenty-somethings are
simply more cautious and composed than
some adults.

To reflect the ongoing structural changes in the adolescent and twenty-something brain,
many journalists and scientists use words
and phrases like "unfinished," "work in
progress," "under construction" and "half-
baked." Such language implies that the brain
eventually reaches a kind of ideal state when
it is "done." But there is no final, optimal
state. The human brain is not a soufflé that
gradually expands over time and finally
finishes baking at age 30. Yes, we can
identify and label periods of dramatic
development—or windows of heightened
plasticity—but that should not eclipse the
fact that the brain changes throughout life.

Studies have confirmed, for example, that London taxicab drivers grow larger
hippocampi as they learn to navigate
London's convoluted roadways. This growth
in the hippocampus, a brain region essential
for forming new memories, is not explained
by youth: According to the Public Carriage
Office, 98 percent of London taxi drivers are over the age of 30. Granted, the
hippocampus is one of only two regions
thought to grow new neurons in adulthood,
but the brain remains remarkably plastic in
other ways too. When one part of the brain
shrivels—say, from stroke or traumatic injury
—nearby regions often adopt their deceased
neighbor's functions. When blind people
learn to use echolocation, areas of their
brains usually devoted to vision learn to
interpret the echoes of clicks and taps
instead. Neuroplasticity is an everyday
phenomenon as well. The adult brain
constantly strengthens and weakens
connections between its cells. In fact,
learning and memory are dependent on such
flexibility. Learning a new language or
picking up an instrument may be easier
when one is young, but adaptability and
creativity do not expire on one's 30th
birthday.

Brains of our past and present

Mapping structural changes in the brain over time tells us how the brain matures, but not
why it matures that way. To answer why, we
have to think about the benefits that
prolonged brain development would have
offered our ancestors. After all, the human
brain's fundamental phases of development
could not have popped into existence in the
last 50 years or even thousands of years ago
—more likely, they evolved at least a couple
million years ago in the Paleolithic, when the
human brain began to increase in size and
morph into the organ we know today.
Keeping the brain extra flexible for a longer
period of time may have provided our
ancestors with more opportunities to quickly
master new skills and adapt to a changing
environment. But taking too long to learn
how to manage emotions, control impulses
and plan ahead may also have impeded
survival.

Painting an accurate tableau of the Paleolithic lifestyle is difficult because the
evidence is scant, but we can say a few
things with confidence. First, although the
exact lifespans of early humans are not
certain, evidence from the fossil record—as
well as death rates among modern hunter-
gatherer societies—suggests that most
early humans did not live as long as people
in developed nations today. Children
frequently died in their first years of life. If
you made it to 15, you were more likely to
live at least another 15 or 20 years, but
people who lived past their forties were
probably in a minority. Second, early
humans likely started having children far
sooner than people in industrialized
countries today. Paleolithic twenty-
somethings, we can safely assume, did not
have the luxury of spending a few years
after college "finding themselves" while
backpacking through India and volunteering
on organic farms. Rather, people who
survived to their twenties in the Paleolithic
probably had to bear the responsibilities of
parenthood as well as contribute
substantially to their community's survival.
These are not exactly circumstances that
favor leisurely cognitive development late
into one's twenties.

When I described this scenario to Giedd, however, he suggested that widening the
window of heightened neuroplasticity to
encompass one's twenties may have helped
Homo sapiens adapt to rapid shifts in the
climate. Unfortunately, as with many
hypotheses in evolutionary psychology,
scientists do not have a way to objectively
test these ideas. Still, if we want to fully
understand the brain, we cannot ignore the
fact that it evolved in circumstances very
different from our own.

For now, let's put the brains of ancient twenty-somethings out of our minds. What
about the twenty-somethings of today?
Even if the brain's developmental changes
are more dramatic in the teens than in the
twenties, the best available evidence
suggests that a twenty-something's brain
boasts a little more adaptability than an
older brain. Our twenties might represent a
final opportunity to begin mastering a
particular skill with a kind of facility we
cannot enjoy in later decades. Should
people in their twenties buckle down and
choose something, anything, to practice
while their brains are still nimble? Does the
neuroscience suggest that, for all their
freedom and fun, gallivanting twenty-
somethings neglect their last years of
heightened plasticity? Should parents
encourage their 20-year-olds to shirk adult
responsibilities lest they hamper an
advantageous period of self-discovery and
wild experimentation?

Solid neuroscience that can directly answer these questions does not yet exist. "It's too
soon to tell," Giedd says, "but we're
wondering." He and his colleagues plan to
compare the brain development of girls who
become pregnant in their teens to girls who
do not. "Teen pregnancy changes all your
priorities and what you do with your time—
how do those experiences change the
brain?" Arnett agrees that such
neuroimaging studies would be useful.
"Even in industrialized countries, a lot of
people still get married pretty early. You
could do brain studies comparing people
who experience their twenties differently
and contrast how their brains develop."

In the meantime, twenty-somethings can expect increasingly frequent waves of sage
advice from academics, bloggers and
concerned parents alike. "Watching talking
cats on YouTube isn't as good for cognitive
development as reading or taking classes,"
Laurence Steinberg of Temple University
told The Wall Street Journal. Truth. In the
same article, Jennifer Tanner, co-chair of
the Society for the Study of Emerging
Adulthood, provides her own pearl: "My
advice is, if your parents are currently doing
things for you that you could do for yourself,
take the controls. Say, 'No. Mom, let me get
my own shampoo.'" Thanks for the tip, Ms.
Tanner. I mean, if I were living at home to
save money, I wouldn't mind sharing the
jumbo size 2-in-1 shampoo and conditioner
with my siblings. But I'm pretty sure the vast
majority of my peers have a handle on
shampoo selection by now. Because we're
worth it.

Emerging adulthood is real—it's happening, albeit to a small percentage of the world's
population. Whether we can, at this moment
in time, meaningfully link this life stage to
neuroscience seems a tenuous proposition
at best. By itself, brain biology does not
dictate who we are. The members of any one
age group are not reducible to a few
distinguishing structural changes in the
brain. Ultimately, the fact that a twenty-
something has weaker bridges between
various brain regions than someone in their
thirties is not hugely important—it's just one
aspect of a far more complex identity.
