    Course correction: Don't fear the AI bubble bursting


    “Bubbles are great. May the bubbles continue,” Eric Schmidt, Google’s former chief executive, recently said. For artificial intelligence to advance, companies must continue to pour record-breaking investments into AI infrastructure — or so the thinking goes. Build more data centres, and AI will find a cure for cancer, reach artificial general intelligence, and beat China.

    But progress usually happens under pressure. When energy gets expensive, people invent energy-saving methods. When there’s a worker shortage, they invent labour-saving machines. A deflating AI bubble may be just what the tech industry needs: As funding dries up, companies will have to build models that do more with fewer chips and less power.

    Economists have a name for innovation in times of scarcity: directed technical change. In 1977, as Americans waited in long gas lines, President Jimmy Carter likened the energy crisis to a war, and businesses responded, developing technologies we now take for granted: more-efficient engines, better-insulated homes, early electric and hybrid vehicle technologies, and early forms of renewable energy.

    Similar circumstances transformed agriculture. In the early 20th century, abundant, low-wage labour dulled the incentive to mechanize. Then, in 1927, the Mississippi River burst through the levees, turning cotton country into an inland sea. Many residents took refuge in Red Cross camps; in some counties, up to four-fifths of families left. With fewer hands for planting and harvesting, landlords turned to machines: Tractors replaced teams, and mechanical tools spread faster there than in neighbouring counties.

    Generative AI needs its own course correction — both for energy efficiency and its own advancement. Large language models, for all their wonders, can only predict the next thing a human would say. Train one on texts from the late 1800s, and it won’t invent airplanes or rockets. It will channel ideas from that period, when leading scientists thought human flight was impossible. If we only scale up our current approach, wasting money on fast-obsolete chips and energy-guzzling data centres, we won’t progress beyond current technology, which still yields limited, mediocre results.

    Better AI would remember what it learns and squeeze more work from each watt. Tech companies spend billions running large language models that don’t learn while they run. A tool that does both simultaneously would come closer to approximating the human brain, allowing it to innovate more readily.

    The boom-and-bust pattern has long shaped AI. In the 1980s, the blossoming AI industry tried to replicate human reasoning by feeding systems thousands of “if-then” rules. The approach proved expensive and limited. But the shock pushed researchers toward models that learned from examples and dealt better with uncertainty, while neural networks, then unfashionable, quietly improved.

    Progress became easier to measure, and the field stopped betting everything on a single big approach. Jobs were lost, labs closed, but that slowdown taught scientists and developers better habits — empirical, flexible, and results-focused — that set the stage for modern AI.

    Scarcity still pushes AI forward, as companies with fewer resources learn to do more with less. In 2018, Europe’s General Data Protection Regulation imposed strict rules, backed by heavy fines, on how personal data could be collected and stored. In response, tech firms fine-tuned existing models and used artificially generated data. More recently, DeepSeek, a Chinese company, has worked around US export controls on advanced chips. Its models show how scarcity breeds ingenuity.

    Around 1900, electric vehicles had promise; New York and London even ran electric taxi fleets. But underinvestment in the power grid, coupled with cheap oil, led to a system that favoured internal combustion. It isn’t hard to imagine a different century had we priced carbon early and kept building the grid. Without changing course, AI could face the same fate — a technology with immense promise but trapped in an outdated paradigm that saps resources.

    Humans are astonishingly energy-efficient. A child can pick up cause and effect, how the world behaves, and basic social norms with a brain that runs on a mere 20 watts of power. Today’s AI models burn through mountains of data and electricity to approximate that performance, yet still misfire when handling unfamiliar problems. A course-corrected AI would help us tackle new challenges rather than merely refining what we already know.

    Bubbles are noisy while they inflate. When they burst, the froth clears, and you can see which ideas hold up without subsidy. If the AI boom cools, what survives will be the systems that do more with less.


    The New York Times

    Carl Benedikt Frey