A Powerful Idea About Our Brains Stormed Pop Culture and Captured Minds. It’s Mostly Bunk.

This is a simplified archive of the page at https://slate.com/technology/2022/11/brain-development-25-year-old-mature-myth.html

The strange history of a persistent myth.

[Illustration: A brain, partly obscured in shadow, spins around. Illustrations by Rey Velasquez Sagcal.]

When Leonardo DiCaprio’s relationship with model/actress Camila Morrone ended three months after she celebrated her 25th birthday, the lifestyle site YourTango turned to neuroscience. DiCaprio has a well-documented history of dating women under 25. (His current flame, who is 27, is a rare exception.) “Given that DiCaprio’s cut-off point is exactly around the time that neuroscientists say our brains are finished developing, there is certainly a case to be made that a desire to date younger partners comes from a desire to have control,” the article says. It quotes a couples therapist, who says that at 25, people’s “brains are fully formed and that presents a more elevated and conscious level of connection”—the type of connection, YourTango suggests, that DiCaprio wants to avoid.

YourTango was parroting a factoid that has gained a chokehold on pop science over the past decade: that 25 marks the age at which our brains become “fully developed” or “mature.” The assertion has been used to explain a vast range of phenomena. After 25, it’s harder to learn, a Fast Company piece claimed. Because “the risk management and long-term planning abilities of the human brain do not kick into high gear” until 25, an op-ed in Mint argued, people shouldn’t get married before then. In early 2020, Slate’s sex columnists Jessica Stoya and Rich Juzwiak fielded a reader question about the ethics of having sex with people under 25. “I am told, at least once every couple weeks, that if you’re under 25, you’re incapable of consent because your ‘frontal lobes are still developing,’ ” the distressed reader wrote.

Even some young people now regard age 25 as a turning point with seemingly magical properties. In one Reddit thread, a 24-year-old asks whether older, presumably wiser Redditors noticed changes after 25. (“I suddenly stopped finding Leonardo DiCaprio attractive,” one commenter quipped.) Others use the factoid to explain a range of bad decisions, from why college kids continued hosting keggers at the height of COVID to why some men are terrible at texting.

But this notion has taken on more urgent stakes, too. After the school massacre in Uvalde, Texas, a Washington Post article noted that the shooter had just turned 18 and been allowed to purchase guns legally, following a long line of men under 25 who’d committed similar atrocities. (The Post notes that the Parkland school shooter was 19, the Newtown shooter 20, and the Virginia Tech shooter 23.) Shouldn’t gun laws, experts argued, reflect that these young men don’t yet have fully developed brains?

So, what does happen to your brain at 25? And how did so many people get the idea that something profound happens at that specific age? The past two decades of neuroscience research provide some clues. A huge breakthrough in how we study brains and a few intriguing kernels from studies seem to have become the basis for a powerful idea that reaches far beyond the facts. The real answer to these questions may lie in a culture that’s uneasily grappling with what science can (and can’t) tell us about ourselves.

About three decades ago, a new brain-imaging technique changed the face of neuroscience. The technology came from the more familiar MRI, which you might know is used in medicine to create a static image of the organs, tissues, and bones inside your body. (For example, you might get an MRI for an injured knee joint or to get a closer look at a tumor.) In 1991 scientists developed a new technique using the hulking, cylindrical machines to measure changes in blood flow, which they called functional magnetic resonance imaging, or fMRI. If MRIs were photos of the inside of your body, fMRIs were like video.

Over the next few years, scientists refined the technique, opening up a new horizon for neuroscience: the potential to peer into the brain as people think.

To say people were excited about fMRI is an understatement. In 1993 scientists gushed about fMRI’s potential to the New York Times. “This is the wonder technique we’ve all been waiting for,” said one researcher. Another called it “the most exciting thing to happen in the realm of cognitive neuroscience in my lifetime.” A third said he thought fMRI would do for neuroscience “what the discovery of genetic code did for molecular biology.”

The advent of fMRI was undeniably a boon for understanding the brain’s inner workings—and the public loved learning about the findings. Neuroscience validated things we already intuitively felt were true: that our minds process emotional pain similarly to physical pain, that we all have knee-jerk reactions to faces of people who are a different race than us, that something special happens when we look at someone we love. With these new imaging tools, we could better understand ourselves and how we change over our life spans. As part of this, a new subarea of research grew: Why is it that teenagers seem especially prone to bad decisions?

The bulk of adolescent imaging work proceeded on two fronts: capturing brain structure with detailed anatomical images, and probing brain function by recording activity in real time as people watched or listened to stimuli. On the structural front, researchers discovered that as children grew older, the prefrontal cortex, a brain area responsible for cognitive control, underwent physical changes. In particular, they found that white matter—bundles of nerve fibers that facilitate communication across brain areas—increases, suggesting a greater capacity for learning. Those changes continued well into people’s 20s.

They also found important clues to brain function. For instance, a 2016 study found that when faced with negative emotion, 18- to 21-year-olds had brain activity in the prefrontal cortices that looked more like that of younger teenagers than that of people over 21. Alexandra Cohen, the lead author of that study and now a neuroscientist at Emory University, said the scientific consensus is that brain development continues into people’s 20s.

But, she wrote in an email, “I don’t think there’s anything magical about the age of 25.”

Yet we’ve seen that many people do believe something special happens at 25. That’s the result of a game of pop-culture telephone: As people pass along the takeaways from Cohen’s and other researchers’ work, the nuance gets lost. For example, to add an air of credibility to its DiCaprio theory, YourTango excerpts a passage from a 2012 New York Times op-ed written by the psychologist Larry Steinberg: “Significant changes in brain anatomy and activity are still taking place during young adulthood, especially in prefrontal regions that are important for planning ahead, anticipating the future consequences of one’s decisions, controlling impulses, and comparing risk and reward,” he wrote.

Steinberg is a giant in the field of adolescent development, well known for his four decades of research on adolescents and young adults. The passage YourTango quoted accurately describes the science, but it’s a stretch to imply that it explains Leonardo DiCaprio’s dating history. When we spoke, I told Steinberg his work had been referenced in this way. “Oh no,” he said, laughing. I then asked whether he had insights about where the figure 25 came from, and he said roughly the same thing as Cohen: There’s consensus among neuroscientists that brain development continues into the 20s, but there’s nothing close to consensus on a specific age that defines the boundary between adolescence and adulthood. “I honestly don’t know why people picked 25,” he said. “It’s a nice-sounding number? It’s divisible by five?”

Kate Mills, a developmental neuroscientist at the University of Oregon, was equally puzzled. “This is funny to me—I don’t know why 25,” Mills said. “We’re still not there with research to really say the brain is mature at 25, because we still don’t have a good indication of what maturity even looks like.”

Maturity is a slippery concept, especially in neuroscience. A banana can be ripe or not, but there’s no single metric to examine to determine a brain’s maturity. In many studies, though, neuroscientists define maturity as the point at which changes in the brain level off. This is the metric researchers considered in determining that the prefrontal cortex continues developing into people’s mid-20s.

[Illustration: A clock with a spinning hand, located where a brain would be.]

That means that for some people, changes in the prefrontal cortex really might plateau around 25—but not for everyone. And the prefrontal cortex is just one area of the brain; researchers homed in on it because it’s a major player in coordinating “higher thought,” but other parts of the brain are also required for a behavior as complex as decision making. The temporal lobe helps process others’ speech and language so you can understand what’s going on, while the occipital lobe allows you to watch for social cues. According to a 2016 Neuron paper by Harvard psychologist Leah Somerville, the structure of these and other brain areas changes at different rates throughout our life span, growing and shrinking; in fact, structural changes in the brain continue far past people’s 20s. “One especially large study showed that for several brain regions, structural growth curves had not plateaued even by the age of 30, the oldest age in their sample,” she wrote. “Other work focused on structural brain measures through adulthood show progressive volumetric changes from ages 15–90 that never ‘level off’ and instead changed constantly throughout the adult phase of life.”

To complicate things further, there’s a huge amount of variability between individual brains. Just as you might stop growing taller at 23, or 17—or, if you’re like me, 12—the age at which the brain plateaus can differ greatly from person to person. In one study of participants ranging from 7 to 30 years old, researchers tried to predict each person’s “brain age” by mapping the connections in their brain. Their age predictions accounted for about 55 percent of the variance among the participants, but far from all of it. “Some 8-year-old brains exhibited a greater ‘maturation index’ than some 25 year old brains,” Somerville wrote in her Neuron review. Some of those differences might be random genetic variation, but people’s behavior and lived experience contribute as well. “Childhood experiences, epigenetics, substance use, genetics related to anxiety, psychosis, and ADHD—all that affects brain development as well,” said Sarah Mallard Wakefield, a forensic psychiatrist.

All this means that people’s brains can look very different from one another at 25. If we’re leaving it up to neuroscience to define maturity, the answer is clear as mud. The concept of adulthood has been around much longer than neuroscience has been able to weigh in on it. Ultimately, we are the ones who must define the shift from adolescence to adulthood.

It’s no coincidence that the “mature brains at 25” factoid’s popularity has grown during a period of massive cultural change. Since the first crop of brain-development studies was published in the 2000s, the U.S. has experienced two recessions and major shifts in how young people approach traditional “adulthood” milestones. Whereas people in 1950 got married in their early 20s, people are now marrying around 30, if at all. College, now strongly tied to economic mobility, is more expensive than ever; people graduate with massive debt while wages have stagnated, making it very difficult to achieve financial independence and stability. As people spend more time in school and marry and find stable jobs later, they are also having children later, if at all. Pundits have called it the Great Delay. And in a 2010 New York Times piece, journalist Robin Marantz Henig asked, “Why are 20-somethings taking so long to grow up?”

Henig pointed to a theory from sociologist Jeffrey Arnett. Classic models of development jumped straight from adolescence to adulthood, but Arnett proposed a new stage in between: “emerging adulthood.” From the ages of 18 to 25, Arnett argued at the turn of the millennium, people are not yet fully fledged adults. Researchers have debated whether the phenomenon is new, or even a developmental stage at all; unlike other developmental stages, there don’t seem to be negative consequences for people who don’t go through it. Plus, unlike classic stages like infancy or toddlerhood, it is not a universal phenomenon—rather, it appears mostly in Western societies. Though many people experience it as a period of doubt and instability, emerging adulthood is still a sort of privilege; many other young people don’t get a shot at education or at exploring their career and marriage options.

Nonetheless, the idea provided a tantalizing explanation for twentysomethings’ apparent “failure to launch,” and the neuroscience of the 2010s seemed to further support the idea of a prolonged path to adulthood. Sure enough, the Times piece pointed to the “new understanding” that “children’s brains are not fully mature until at least 25.” With brain research as evidence, the theory of emerging adulthood seemed unassailable. There were pictures of the brain—what evidence could be harder than that?

Believing that neuroscience reveals all is a trap many people fall into. In the 2005 Supreme Court case Roper v. Simmons, attorney Seth Waxman exemplified this bias toward neuro research while arguing that people younger than 18 should not face the death penalty. “The very fact that science—and I’m not just talking about social science here, but the important neurobiological science that has shown that these adolescents are—their character is not hard-wired,” Waxman argued. He contrasted social science with “neurobiological science,” implying that the latter is more important—irrefutable.

But the takeaways from neuroscience are rarely ironclad, which complicates the question of what role these studies should have in shaping policy around the rights and responsibilities of young people. Contrary to what Waxman and many others might believe, neuroscience can be just as squishy as psychology, a field some snobs argue isn’t even a science. Just like psychologists, neuroscientists must make judgment calls about how to collect and interpret data, and there are no right answers for how best to do that. Studying people is messy. “Despite being popularly viewed as revealing the ‘objective truth,’ neuroimaging techniques involve an element of subjectivity,” three health researchers who study adolescents wrote in a 2009 paper.

[Illustration: A brain hovers and spins over a ruler.]

The choices researchers make in their methodology and data analysis affect their results. Even the choice of which participants to study biases the data set. (It’s well established that most research skews toward people in “WEIRD”—Western, educated, industrialized, rich, and democratic—countries.) Plus, the researchers said, “the cognitive or behavioral implications of a given brain image or pattern of activation are not necessarily straightforward.”

In other words, researchers might be able to take a picture or video of the brain, but it’s not always clear what this really shows. The interpretation of neuroimaging is the most difficult and contentious part; in a 2020 study, 70 different research teams analyzed the same data set and came away with wildly different conclusions. Now that tens of thousands of fMRI studies have been published, researchers are identifying flaws in common neuroscience methods and questioning the reliability of their measures.

That’s not to say we should disregard the neuroscience—we just need to acknowledge its limitations. “We are giving neuroscience a starring role where it should have a supporting role,” Steinberg said.

The hard work of defining what maturity or adulthood really is falls on us as a society. How we talk about maturity and adulthood—and the evidence we use to support that—has real-world consequences for our behavior and self-concept. It’s impossible to measure the full effect of the “maturity at 25” factoid, but the fact that some poor 24-year-old Redditor believes that something magical might happen to them in the coming year could very well affect how they think about themselves and what they’re capable of. Mills told me she’s heard from middle and high school students that their teachers often point to “brain science” as justification for students’ bad decisions. (Mills is currently working on a study to interview young people about what they think and feel when they hear those kinds of assertions.)

The myth that brains are fully mature at 25 has a flimsy basis at best, yet its real-world consequences are only beginning to emerge. Some of those are relatively harmless; using this half-truth to explain Leonardo DiCaprio’s dating habits primarily hurts DiCaprio, who hardly needs our sympathy. But as people continue to cite this factoid, it has the power to create serious societal change. In some cases, the result might literally save lives—for instance, by keeping dangerous weapons out of the hands of young people or preventing executions. In other cases, it could cost lives; anti-trans activists cite it as evidence that young people should not be allowed to access lifesaving, gender-affirming care. The ultimate trajectory of this growing belief—and the profound effect it could have on young lives—is impossible to know, but it’s clear that neuroscience has been, and will continue to be, deployed to shape policy.

Perhaps the whole enterprise needs a reframe. It’s unrealistic to expect people to appreciate all the nuances of neuroscience, and naive to believe that scientific evidence won’t be weaponized for political purposes. It feels inevitable that people will gravitate toward a neat, simple story that feels intuitively true: We’re adults at 25. But rather than using that factoid to defend bad decisions, why not use its lessons to reframe youth as an opportunity? As the brain develops in adolescence and early adulthood, it stays open to change; that’s what allows us to learn. “Children and adolescents are not broken adults, but rather, they’re functioning perfectly well for their developmental period,” Mills said. They’re exactly where they need to be; the extra malleability in youth prepares us to figure out our surroundings. “This is the time we’re learning about our identity, other people, how we fit into the world—we need the brain to be malleable,” she said. And while adolescence is typically a time of big changes, reaching adulthood doesn’t mean the end of that growth. You can make good or bad decisions at any age; you’ll mature and regress throughout your life. You, like your brain, are endlessly complex, and we’re so much more than brain scans will ever reveal.