For much of the 20th century, humanity appeared to be experiencing an unprecedented cognitive boom, a phenomenon so consistent and widespread that it became known simply as the Flynn Effect. Named after the late New Zealand political scientist James Robert Flynn, the effect described the finding that average scores on intelligence quotient (IQ) tests had been rising steadily across populations in industrialized nations, typically at a rate of about three IQ points per decade. This suggested that each successive generation was fundamentally smarter than the last, an intellectual ascent driven by rapid societal change. However, in the last two decades, a disturbing new trend has emerged. In several developed countries, including Norway, the UK, and France, the upward march has not merely plateaued; it has reversed. This “Reverse Flynn Effect,” a measurable decline in cognitive test scores, challenges long-held assumptions about human progress and forces a difficult examination of what IQ tests truly measure and how modern life is reshaping the human mind.
The Enigma of the Flynn Effect
The IQ test is a standardized metric designed to measure general intelligence, or g, by comparing a person’s performance to the average of their peer group. By convention, the average score is set at 100, with a standard deviation of 15 points. When new versions of IQ tests are standardized, the average score for the new cohort is calibrated back to 100. The Flynn Effect was the discovery that, when members of the new cohort took the old, un-recalibrated test, their scores were significantly higher—sometimes by nearly 30 points over a century.
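To make the arithmetic behind this concrete, the short Python sketch below shows how raw test scores are converted to IQ points against a norming sample (mean fixed at 100, standard deviation at 15), and how scoring a newer cohort against an older cohort’s norms surfaces the Flynn Effect. The cohort names and raw scores are invented purely for illustration; they are not real test data.

```python
# Minimal sketch of IQ standardization; all raw scores are invented for illustration.
from statistics import mean, stdev

def iq_scores(raw_scores, norm_mean, norm_sd):
    """Convert raw test scores to IQ points relative to a norming sample,
    with the norming sample fixed at mean 100 and standard deviation 15."""
    return [100 + 15 * (raw - norm_mean) / norm_sd for raw in raw_scores]

# Hypothetical raw scores of two cohorts on the *same* (older) test.
cohort_1950 = [38, 42, 45, 47, 50, 52, 55, 58, 60, 63]
cohort_2000 = [47, 51, 54, 56, 59, 61, 64, 67, 69, 72]  # higher raw performance

# Norms derived from the older cohort.
old_mean, old_sd = mean(cohort_1950), stdev(cohort_1950)

# By construction, the norming cohort averages 100 against its own norms...
print(round(mean(iq_scores(cohort_1950, old_mean, old_sd)), 1))  # 100.0

# ...but the newer cohort, scored against the old, un-recalibrated norms,
# lands well above 100 -- the Flynn Effect made visible.
print(round(mean(iq_scores(cohort_2000, old_mean, old_sd)), 1))  # ~116.6
```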
This astonishing improvement was evident on every major IQ test, spanned all age ranges, and was observed across dozens of modern, industrialized countries, from Western Europe to Japan and South Korea. The rise was continuous and roughly linear from the earliest days of mass testing until the mid-1990s. For example, British children’s scores on the Raven’s Progressive Matrices test—a measure of abstract reasoning—rose by a remarkable 14 IQ points between 1942 and 2008. This global trend presented a deep paradox to scientists: if we were getting that much smarter, where was the cultural renaissance to show for it?
What Drove the Century of Cognitive Gain?
The cause of the Flynn Effect was never definitively pinned down to a single factor, but scientists agree that the rise was too rapid to be genetic, pointing instead to environmental factors. Multiple interconnected societal improvements are cited as likely drivers. One significant factor is improved nutrition and health. Better prenatal and early childhood nutrition, alongside the control of once-common childhood diseases and the introduction of simple public health measures like iodized salt, led to healthier, more fully developed brains and nervous systems, resulting in better cognitive performance.
Equally important are changes in education and lifestyle. The 20th century saw a massive increase in educational attainment, with more complex and prolonged schooling becoming the norm. Furthermore, the intellectual environment of modern life fundamentally changed the way people think. As society industrialized and work, schooling, and daily life grew more abstract, individuals were constantly required to deal with hypothetical scenarios, logical classification, and scientific rather than literal reasoning, precisely the skills that IQ tests demand. Flynn himself noted that the largest score increases were concentrated not on basic arithmetic or vocabulary, but on the abstract, problem-solving components of the tests, such as pattern recognition.
The Limits of the Intelligence Quotient
To resolve the paradox of skyrocketing IQ scores without a proportional surge in genius-level intellectual achievement, James Flynn eventually concluded that IQ tests do not measure general intelligence (g) as accurately as previously assumed. Instead, the remarkable gains primarily reflected growing mastery of abstract problem-solving, a specific cognitive skill set that the modern world has taught us to prize and practice relentlessly.
Flynn argued that the vast improvement was a cultural victory: modern society had trained its members to become highly skilled at tackling abstract classification and hypothetical logic, skills that were not necessary for daily survival in a more concrete, agricultural past. When presented with a question like “What do a dog and a rabbit have in common?”, a person from an older, pre-scientific culture might answer, “They are not alike; a dog has four legs and a rabbit has small ears.” A modern person, trained in abstraction, immediately answers, “They are both mammals/animals.” The IQ score reflects the adoption of this abstract, scientific worldview, rather than a fundamental rewiring of the brain’s raw potential.
The Slowdown and the Reverse Flynn Effect
The long-running global trend of cognitive ascent appears to have reached its peak in many developed countries around the mid-1990s. Subsequent data has revealed a consistent and troubling decline, a phenomenon now termed the Reverse Flynn Effect. A notable Norwegian study, using data from military conscription, found that men born before 1975 showed the expected Flynn Effect gain, but those born after 1975 demonstrated a steady decline in IQ scores, resulting in a difference of about seven points between generations.
Similar declines have been documented in cohorts from the UK, Sweden, France, and, more recently, parts of the United States. A Northwestern University study noted consistent negative slopes in verbal reasoning, matrix reasoning, and mathematical abilities between 2006 and 2018. This reversal is particularly concerning because family-based analyses suggest the cause is environmental rather than genetic. Researchers who controlled for shared family factors, such as parental education, found that the decline appeared even within families (e.g., younger siblings scoring lower than their older siblings), pointing strongly to shifting external social and environmental forces as the culprit.
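The within-family logic can be made concrete with a toy sketch. The snippet below compares each later-born sibling with an earlier-born sibling from the same household; the family records and scores are invented for illustration, whereas the actual studies draw on national conscription and registry data.

```python
# Toy within-family comparison; every record below is invented for illustration.
from collections import defaultdict

records = [
    # (family_id, birth_year, iq_score)
    ("A", 1972, 103), ("A", 1980, 99),
    ("B", 1970, 108), ("B", 1979, 104),
    ("C", 1974, 96),  ("C", 1983, 93),
    ("D", 1971, 112), ("D", 1981, 107),
]

# Group siblings by family.
families = defaultdict(list)
for family_id, birth_year, iq in records:
    families[family_id].append((birth_year, iq))

# Compare the later-born sibling with the earlier-born one in each family.
# Siblings share parents and home background, so a consistent gap cannot be
# explained by changing family composition and points to external conditions.
gaps = []
for siblings in families.values():
    siblings.sort()  # order by birth year
    (_, older_iq), (_, younger_iq) = siblings[0], siblings[-1]
    gaps.append(younger_iq - older_iq)

print(sum(gaps) / len(gaps))  # negative mean gap: later-born siblings score lower
```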
Exploring the Causes of the Great Reversal
The scientific community is currently grappling with a range of possible causes for the sudden decline in cognitive scores, all of which reflect a change in the modern environment. One hypothesis centers on changes in the education system. As schools shifted focus away from certain types of abstract reasoning or rote knowledge, or adopted new teaching methodologies, students may have become less skilled at the specific tasks required by IQ tests, even if they gained other valuable skills.
A more contemporary theory gaining traction involves digital media and changes in information consumption. Critics suggest that the switch from consuming complex, dense print material to rapid, fragmented digital content may be degrading the capacity for deep, sustained, and logical thought. The IQ reversal seems to have accelerated right around 2010, the period of the smartphone’s rapid ascent, with the steepest declines often seen in the youngest cohorts, the heaviest users of digital devices. Other environmental factors, such as the potential impact of chemical pollution (like endocrine disruptors) or a decline in nutritional quality in certain populations, are also being explored as biological explanations for the decreasing scores. Ultimately, the reversal serves as a potent warning: even if gains in abstract problem-solving have simply hit a ceiling of saturation, a continued decline suggests that the environment we’ve created is no longer an unambiguous engine of cognitive growth.