This article originally ran in Sherwood News on December 26, 2024.
In 1959, physicist Robert Noyce and his colleagues at the Fairchild Semiconductor Corporation created a nifty little piece of technology: a wafer of silicon with an entire electronic circuit printed on it. As a technical breakthrough, it was an impressively clever feat, but what made Noyce’s achievement historically important was the cascade of innovations that followed. By making it possible to progressively shrink computing power into smaller and cheaper packages, the integrated circuit gave rise to wave after wave of transformative products, from the pocket calculator to the personal computer, the internet, and mobile computing. It’s no exaggeration to say that Noyce’s invention engendered the entire digital world as we know it today.
Noyce and some of his collaborators left Fairchild to form their own company, Intel, in 1968. Like OpenAI today, Intel had a head start in a new field of almost unlimited promise, and it used its advantage to great effect. In 1971 the company invented the microprocessor, a development that would later enable personal computers to bring the information age into the homes of ordinary consumers. In 1997, Time magazine named Intel CEO Andrew Grove its Man of the Year. By 2000, the company had a market cap of $250 billion and ranked as the sixth-most-valuable business in the world.
That, however, proved to be the company’s high-water mark. In the years that followed, Intel learned that being present at the birth of a technology and riding it to great heights doesn’t automatically mean that you’ll be able to sniff out market trends and outcompete newcomers indefinitely. In 1999, Nvidia, a much smaller rival, began shipping its first graphics processing unit, or GPU. Over the following decade, these chips proved essential to the growth of PC gaming and multimedia applications, and later became equally vital to both cryptocurrency mining and the large language models that power OpenAI. This time it was Nvidia, not Intel, whose chips were powering the revolution, and the changing of the guard was reflected in the two companies’ share prices. Ten years ago, Intel’s market cap was 15x bigger than Nvidia’s; today Nvidia’s is 30x bigger.
Investor Warren Buffett once famously said that he likes to invest in companies whose business has an “economic moat” around it — some factor, whether brand strength, network effects, patents, or economies of scale, that prevents rivals from swooping in and poaching its business. A head start in a technology can function as a moat for a while. But success inevitably draws rivals, and sooner or later some of them will catch up, or the demands of the market will change and favor a different technology. For now, OpenAI has a first-mover advantage in generative AI, and it has every reason to be optimistic about its fortunes in the years ahead. But if history is any guide, its halcyon days will eventually go the way of all flesh.