AI is Eating the World: Why Ubiquitous Intelligence is Inevitable and How It Will Happen

Youngjin Yoo, Associate Dean of Research, Weatherhead School of Management
Co-Faculty Director, xLab

More than three decades ago, Mark Weiser envisioned a world where computing would seamlessly integrate into everyday life, becoming invisible and omnipresent. He called this vision ubiquitous computing. In Weiser’s world, technology would no longer demand our attention but would instead adapt to our needs, embedding itself into the fabric of our lives. It was a profound shift in thinking—from standalone devices to an interconnected, context-aware environment where computing operates in the background, empowering human activity without intrusion.

Weiser’s vision was primarily technical, focused on embedding computational power everywhere to create harmonious interactions between humans and machines. Decades later, Marc Andreessen articulated the economic consequences of such ubiquitous computing. In his now-famous declaration that “software is eating the world,” Andreessen emphasized how software, built on global cloud-based digital infrastructure, could scale infinitely, serving as a deflationary force across industries. Software, in Andreessen’s view, didn’t just integrate into life—it dominated and transformed industries by enabling unprecedented scalability and efficiency.

Scholars such as Carl Shapiro, Hal Varian, and Ajay Agrawal have highlighted the role of marginal cost as a driver of digital and AI transformations. Shapiro and Varian, in their seminal book Information Rules, noted how digital goods enjoy near-zero marginal costs of reproduction and communication. Agrawal and his collaborators, in their book Prediction Machines, extended this framework to Artificial Intelligence, identifying the near-zero marginal cost of prediction as the defining economic force of the AI era. These perspectives weave together the technical vision of Weiser, the economic disruption highlighted by Andreessen, and the deeper economic mechanisms driving the digital economy.

Today, these foundational ideas converge in the age of Artificial Intelligence. AI is eating the world, but it is doing so by becoming ubiquitous, distributed, and embedded at scale. It builds on Weiser’s vision of pervasive, invisible technology and Andreessen’s insight into economic disruption, creating a world where intelligence operates at the margins of time and space. The result is a new paradigm: the era of ubiquitous intelligence, powered by the near-zero marginal cost of pattern recognition, prediction, and generation.

The Economics of Marginal Costs

The internet age was built on the near-zero marginal cost of communication. Connecting billions of people and transmitting information across the globe became virtually free, laying the foundation for the digital economy. Then came the era of “software eating the world.” Software, with its near-zero marginal cost of reproduction, automated tasks, created new services, and disrupted traditional industries. This was powerful, but it was still fundamentally about automating what was already known or understood.
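The arithmetic behind this claim is simple, and a rough sketch makes it concrete. Using purely illustrative numbers (the fixed and marginal costs below are assumptions, not data), the average cost of serving one more unit converges to the marginal cost as volume grows, so when that marginal cost is near zero, scale becomes almost free:

    # Illustrative only: how per-unit cost behaves when marginal cost is near zero.
    # The fixed and marginal costs are hypothetical numbers chosen to show the shape.

    def average_cost(fixed_cost, marginal_cost, units):
        # Average cost per unit = (fixed cost + marginal cost * units) / units
        return (fixed_cost + marginal_cost * units) / units

    for units in (1_000, 1_000_000, 1_000_000_000):
        physical = average_cost(1_000_000, 5.0, units)    # a physical good: $5 per extra unit
        digital = average_cost(1_000_000, 0.001, units)   # a digital good: ~$0 per extra copy
        print(f"{units:>13,} units: physical ${physical:,.2f}/unit, digital ${digital:,.4f}/unit")

At small volumes the fixed cost dominates both cases; at scale the physical good never falls below its $5 marginal cost, while the digital good's per-unit cost collapses toward zero. The same logic now extends to prediction and generation.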

The next wave, already upon us, is the near-zero marginal cost of prediction. Machine learning algorithms, trained on vast datasets, can predict everything from consumer preferences to financial market movements with startling accuracy. This has unlocked new levels of efficiency and personalization, giving rise to recommendation engines, fraud detection systems, and targeted advertising. Companies like Amazon and Netflix have leveraged this power to dominate their respective markets, demonstrating the disruptive force of cheap prediction.

But we are now entering a new epoch: the near-zero marginal cost of generation. Generative AI models, like GPT-4 and DALL-E 3, can create new content—text, images, code, music, even video—that is often indistinguishable from human-created work. This is not just about automating the known; it’s about generating the new. And, crucially, they can do it at a marginal cost that is rapidly approaching zero.

AI is Eating Software

AI is not merely augmenting software. It is fundamentally reshaping it from within. The ability to predict and generate at near-zero marginal cost enables AI to become the core engine driving software’s capabilities, effectively “eating” the traditional software stack from the inside out.

AI is becoming the new foundation for how software is created and operates. Traditional software development relies on explicit programming, where developers write specific instructions for every task. AI-driven software, on the other hand, learns from data and can adapt, improve, and even generate new code autonomously. In essence, AI is not just another layer on top of the existing software stack. It’s becoming the foundational layer upon which all future software will be built. AI is eating software by transforming it into something more dynamic, adaptable, and intelligent.
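To make that contrast concrete, consider a minimal, hypothetical sketch (the refund scenario and function names are invented for illustration, not drawn from any particular system): the first function encodes a rule a developer wrote by hand, while the second learns a comparable decision from examples and can keep adapting as new data arrives.

    # Hypothetical contrast: the same decision expressed as hand-written rules
    # versus a model learned from data.
    from sklearn.linear_model import LogisticRegression

    def approve_refund_rules(order_value, days_since_purchase):
        # Traditional software: every condition is written explicitly by a developer.
        return order_value < 100 and days_since_purchase <= 30

    # AI-driven software: the decision boundary is learned from labeled examples
    # and can be retrained as customer behavior and context change.
    X = [[20, 5], [80, 10], [150, 3], [60, 45], [30, 12], [200, 60]]
    y = [1, 1, 0, 0, 1, 0]  # past outcomes: 1 = refund approved, 0 = declined
    model = LogisticRegression().fit(X, y)

    print(approve_refund_rules(80, 10))   # rule-based answer: fixed until a developer edits it
    print(model.predict([[80, 10]])[0])   # learned answer: changes as it is retrained on new data

The rule-based version does only what it was told; the learned version's behavior comes from data, which is why, as such models generalize and begin generating code themselves, the learning layer starts to displace the hand-written layer.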

The Firm at the Edge

Traditional organizational structures, including conventional firms, exist to identify opportunities, find solutions, and deliver those solutions at scale and at competitive cost. Historically, firms achieved this through centralized coordination and control: by allocating resources through careful, centralized decision-making, they sought to minimize risk and maximize efficiency.

The rise of digital technology has disrupted these centralized paradigms. Digital tools and distributed intelligence now enable organizations—whether they be traditional firms, digital platforms, nascent decentralized autonomous organizations (DAOs), or other emerging forms—to experiment continuously, testing new ideas across diverse contexts and creating offerings for the here and now. Every engagement with a customer becomes an opportunity to learn, experiment, and iterate. Intelligence is moving outward, toward the edges of organizations and systems, enabling real-time, hyper-local decision-making and execution. Just as software always operates at run-time, firms are becoming run-time organizations.

Economics at the margin plays a critical role in this transformation. The near-zero marginal cost of communication, prediction, and now generation has drastically reduced the barriers to experimentation and scaling. Firms can now iterate rapidly, testing new solutions at minimal incremental cost, while leveraging distributed platforms and third-party partners to expand their reach. This evolution transforms the locus of economic activity in two ways. First, the temporal dimension of decision-making and execution will continue to shrink. Firms can no longer rely on static, long-term strategies; whoever can seize the opportunity of the moment and generate a solution at a lower marginal cost will win. Second, decentralization will accelerate. Decentralization allows firms to embed intelligence directly into devices, systems, and environments. By moving decision-making closer to the point of action, firms can generate new offerings that work in that particular context.
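A rough sketch of what this can look like, with hypothetical device logic and thresholds (nothing here describes a specific firm's system): decision-making that once lived in a central system is embedded in the device closest to the customer, which acts on local context in the moment and defers to the center only when it is unsure.

    # Hypothetical sketch of edge decision-making: a small model embedded in a device
    # acts locally and in real time; only ambiguous cases are escalated centrally.
    from dataclasses import dataclass

    @dataclass
    class EdgeAgent:
        threshold: float = 0.8  # confidence required to decide locally (assumed value)

        def score(self, context: dict) -> float:
            # Stand-in for an embedded model; here, a toy heuristic on local signals.
            return 0.9 if context.get("repeat_customer") else 0.5

        def decide(self, context: dict) -> str:
            confidence = self.score(context)
            if confidence >= self.threshold:
                return "offer_local_promotion"      # decided at the edge, in the moment
            return "escalate_to_central_service"    # defer when local confidence is low

    agent = EdgeAgent()
    print(agent.decide({"repeat_customer": True, "store_id": "CLE-01"}))

The point is not the toy heuristic but the placement of the decision: intelligence runs where the interaction happens, at the temporal and spatial margin.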

The Future of Ubiquitous Intelligence

Of course, the era of ubiquitous intelligence raises new challenges and risks, including upfront capital investment at an unprecedented scale, energy consumption, security, equitable access, privacy, ethical use, and volatility. Notwithstanding these challenges, the trajectory is clear: we are entering a world where intelligence is abundant, accessible, and embedded in the fabric of life. Just as the near-zero marginal cost of reproduction and communication gave rise to the internet and the software revolution, the near-zero marginal cost of prediction and generation is driving the age of ubiquitous intelligence.

As AI eats the world, it creates a new one—one where intelligence is woven into every interaction, every environment, and every moment. It is a world shaped by the economics of the margin, the resilience of distributed systems, and the creativity of intelligent generation. Happy Year 2025, the Year Zero of Ubiquitous Intelligence.