The Complexity Trap: Abundance, AI, and the Fallacy of Misplaced Concreteness


In their book Abundance, Ezra Klein and Derek Thompson argue that smart governance and innovation can solve most of society’s problems. Their vision emphasizes building—more housing, infrastructure, clean energy, and scientific capacity—rather than merely redistributing or regulating. The core idea is clear and optimistic: abundance is achievable and essential.

They champion an eco-modernist ideal, believing we can overcome climate crisis and social stagnation through technology, deregulation, and expansive development. They argue for a progressive shift—from simply preventing harm through regulation to actively promoting growth.

Yet this optimistic vision misses a deeper truth about complexity. Energy underlies growth, and society’s failure to deliver a better future isn’t a product of poor governance or excessive regulation. It is inherent to complexity itself. As systems expand and interconnect, they grow harder to manage, more prone to breakdowns, and increasingly costly to sustain. The belief that government can swiftly reform or steer such complexity isn’t just overly optimistic—it’s fundamentally naive.

Figure 1. Modern Civilization is Complex and Energy-Intensive. Source: Labyrinth Consulting Services, Inc.

This misreading isn’t limited to Klein and Thompson; it’s a broader civilizational blind spot. Eco-modernism misunderstands the constraints imposed by entropy, ecological limits, and energy dynamics. What seems like institutional failure might actually be systemic exhaustion caused by overshoot and declining returns on complexity. The solution isn’t endless acceleration toward abundance, but deliberate simplification—focusing on local, low-energy, resilient community systems.

Adam Becker echoes a similar critique of the progress narrative which frames AI as a mythic savior promising transcendent solutions to human challenges—death, scarcity, and ecological ruin. In his book More Everything Forever, Becker portrays Silicon Valley’s elites as architects of a future in which AI paves the way to utopia, space colonies replace Earth, and mortality itself becomes optional. Yet beneath this promise lurks a darker impulse: a pursuit of control masked as progress. Influential figures such as Musk, Thiel, and Andreessen flirt with authoritarian and even eugenic ideas, embracing thinkers like Curtis Yarvin who propose technocratic rule. AI in their hands could become not a tool for human empowerment but a digital sovereign loyal only to its creators.

While Becker’s warnings risk conflating profit-driven opportunists with conspiratorial despots, the deeper issues he highlights remain valid: unaccountable power, unchecked market forces, and eroding democratic oversight. The essential danger isn’t AI becoming a godlike entity but humanity submitting to oversimplified narratives, mistaking efficiency and optimization for meaningful progress.

Projections about AI’s impact span a wide range—from visions of technological abundance and human liberation to fears of mass unemployment, inequality, and erosion of democratic governance. A recent report by the AI Futures Project paints a plausible scenario: rapid automation, driven by U.S.-China rivalry, that accelerates economic output but sidelines human agency. By 2030, AI may dominate critical decisions, intensifying geopolitical tensions and marginalizing human influence.

Figure 2. AI could control key decisions, escalate geopolitical tensions and reduce human agency. Source: Labyrinth Consulting Services, Inc.

Joe Lonsdale, a co-founder of Palantir, argues that AI-driven disruption is simply a new iteration of historical technological advances—inevitable, necessary, and ultimately beneficial. Like previous industrial leaps, he sees AI as clearing away inefficiencies, promoting productivity, and fostering innovation. Yet this perspective neglects a critical reality: that complexity itself—rather than mere managerial or technological shortcomings—might represent our fundamental challenge.

Society’s problems aren’t simply failures of policy or governance but symptoms of deeper systemic realities. Joseph Tainter’s work, The Collapse of Complex Societies, illustrates that civilizations typically fail not from an inability to solve problems but from the unsustainable complexity each solution adds. The escalating interconnectedness and scale of today’s systems strain our ability to manage, or even comprehend, them effectively. Yet the dominant narrative insists that smarter management or better technology will restore balance—an appealing but misguided belief.

Alfred North Whitehead warned precisely against such errors, identifying the fallacy of misplaced concreteness—mistaking abstractions for concrete realities—as a recurring misstep in human thinking. Gravity, GDP, and intelligence are simplified conceptual tools, not actual tangible entities. Yet we habitually speak and act as if these abstractions embody reality itself, mistaking simplified maps for the infinitely messier territory.

Figure 3. Information isn’t understanding and GDP isn’t well-being. Source: Labyrinth Consulting Services, Inc.

AI isn’t true understanding; GDP isn’t genuine well-being; forecasts aren’t guaranteed futures. Treating these simplifications as concrete realities risks surrendering meaningful human judgment to abstractions. The greater danger isn’t that AI surpasses us, but that we willingly confuse our simplified tools for genuine truths, overlooking the fragile complexity that defines civilization itself.

Ultimately, the eco-modernist vision of abundance through acceleration, however compelling, risks becoming a detached fantasy—disconnected from the physical limits, thermodynamic realities, and ecological boundaries that govern our world.

Art Berman is anything but your run-of-the-mill energy consultant. With a résumé boasting over 40 years as a petroleum geologist, he’s here to annihilate your preconceived notions and rearm you with unfiltered, data-backed takes on energy and its colossal role in the world’s economic pulse.


4 Comments

  1. John McDonald on June 17, 2025 at 2:37 pm

Thank you for your thoughtful articles, which I just discovered. Your up-to-date analysis of our ecological/economic overshoot problems clearly advances the insights of my late friend and PhD advisor, Herman Daly.
    Keep up the good work!

    • Art Berman on June 17, 2025 at 6:28 pm

      John,

      Herman Daly is a continuous source of inspiration.

      All the best,

      Art

  2. Greg Hunter on June 16, 2025 at 3:15 am

The Old Gray Lady, the Paper of Record, or as I have learned to discern, The Manhattan Enquirer. They benefit from toeing the corporate line and have been “successful” at shielding the rich and powerful from accountability since their founding. If they are not seeing the obvious, then assume they are operating on the paradigm that Wall Street and DC have maintained “control” over commerce and power that was cemented during the Progressive Era. Relying on technology to maintain that advantage is the only way they know to keep it going, as it has “worked” so many times.

    • Art Berman on June 16, 2025 at 5:17 pm

      Greg,

      I’m not sure what your comment is trying to say or how it relates to the post on The Complexity Trap.

      Your opening line seems like a dig at The New York Times, but it’s wrapped in insider references that make it hard to follow. Since I didn’t mention the Times in my post, I’m even more confused.

If you have a point, please make it clearly and directly; otherwise, I’ll have to block you. I don’t have time for cryptic or performative commentary.

      Art
