When Linus Torvalds posted his now-legendary 1991 announcement about a “hobby” operating system kernel, no one would have predicted that Linux would become the backbone of modern computing. In a speech at Open Source Summit North America, Jonathan Corbet, executive editor of LWN and longtime kernel developer, recounted the Linux kernel’s remarkable journey, highlighting its disruptive beginnings, its unique development model, and the challenges that have shaped its evolution.
Linux just keeps growing
Today, as Corbet said, “We are up to just over 40 million lines of code at this point. It’s fair to say that, indeed, Linux has become big and professional.” The first thing he noted was that, no matter what else was happening in the world, Linux had just kept growing.
In a graph of Linux kernel growth, Corbet observed, “If you look at this line, it’s really monotonically upward. It’s always increasing. We’re always building on the kernel, despite the fact that a lot of things were happening in the world over these three decades. We had the dotcom crash in 2000. We had the SCO lawsuit. In 2008, there was the global economic crisis. And of course, we had the COVID pandemic. But you don’t really see an effect on the development speed of Linux from any of these events. We have somehow managed to sustain everything we can do despite all the stuff that has happened in the world.”
How has Linux managed this?
Corbet believes that what has set Linux apart is its radical openness.
Unlike the centralized, exclusive development of other Unix-like systems, now largely forgotten, Linux welcomed contributions from anyone willing to submit code. This openness, combined with the GNU General Public License (GPL) 2.0, fostered a unified codebase and prevented fragmentation. “Anybody could be a part of it,” Corbet emphasized, “and as a result, Linux developed a community that you didn’t find in other open projects at that time.” You really won’t find it anywhere else, even now.
Of course, for years, no one took Linux seriously. It was dismissed as a toy in an era when Unix fragmentation and the rise of Windows NT dominated industry thinking. The prevailing wisdom held that only large corporations could build operating system kernels, so a community-driven project attracted little serious attention. Yet, as Corbet noted, Linux exemplified Clayton Christensen’s concept of disruptive innovation: a technology dismissed as inferior that quietly matures until it overtakes established players.
Another factor, Corbet explained, was that in the early 1990s, the BSD Unix systems were much more mature than Linux; they could do more and were more usable. Still, their permissive BSD license led to a whole bunch of forks, none of which gained the critical mass, in either development community or adoption, needed to outpace Linux.
Instead, the Linux kernel stayed one thing. It stayed together, in part, because its GPLv2 licensing means every contributor retains their copyright under the same license. As a result, nobody owns Linux, or, put another way, everybody owns Linux. It is not a company project. It’s not something somebody can pull out from under you, and that makes a huge difference.
Linux was also successful, Corbet believes, because “Linus had no pride. He threw open the door to everybody, and anybody who could send him a patch could participate. And so we’d take it all; we’d throw away a lot of stuff that didn’t work. It seemed wasteful in ways, but there were no boundaries. Anybody could be a part of it. It was a lot more fun, and it was a lot more open.”
You can see how that approach was successful, he added, after the late 1990s when “Linux caught the attention of industry giants. IBM’s 2001 billion-dollar investment marked a turning point, shifting perceptions from skepticism to serious engagement. The dot-com boom fueled a Linux bubble, with startups and venture capital flooding the ecosystem. Yet, when the bubble burst, kernel development continued unabated, underscoring Linux’s independence from any single corporate patron.”
Corbet continued, “Much of the commercial structure around Linux self-destructed over the course of about a month in 2000, but development of the Linux kernel did not slow down. Nothing really changed there, which was perhaps the first object lesson that Linux is truly independent of its corporate patrons.”
Linux’s modular approach
Another reason Linux has been successful is its modular approach. By focusing solely on the kernel while leaving user-space utilities and distributions to others, the project accelerated innovation and allowed parallel experimentation. Corbet cited the emergence of Beowulf clusters in the late 1990s. By stringing together commodity PCs running Linux to create supercomputers, Linux began the rise that would make it the operating system of virtually every supercomputer, and the dominant operating system of today’s data centers and clouds.
Corbet remembered, “I worked in a supercomputing center when this was happening, and I went to them and I said, ‘Hey, we should really be looking at this.’ And they said, ‘No, no, we have these Crays over here, and that’s all that we’re ever going to need.’ That really didn’t age very well. Now, people don’t really talk about Beowulf clusters anymore, for a simple reason. We just call them data centers.”
All of this development was happening over email lists. Today, almost all of Linux’s key development still takes place on the Linux Kernel Mailing List (LKML). Sometimes, old technology is the best technology.
Beginning with the first Linux Kernel Summit, held in San Jose, California, on March 30 and 31, 2001, developers began meeting face-to-face. That was when it became clear that while mailing lists are invaluable, personal connections are still vital. However, Corbet worries that current US visa policies will hamper such gatherings going forward.
A major shift
Corbet then turned to the technical side of the kernel’s history, noting that its development model underwent a major shift. “At the 2004 Kernel Summit, we adopted what we called the new kernel development model. Now it’s just the kernel development model, where the first two weeks of every development cycle are what’s called the merge window, when all the new code and new features go in; then, for the following weeks, we fix the problems. This works well enough at this point that every release takes nine or 10 weeks in total. You can set your clock by it; in the last 15 years, there have been exactly two exceptions.”
As Linux scaled, its development process faced bottlenecks. The biggest was the reliance on Torvalds to manually apply every patch. The adoption of BitKeeper, a proprietary source code management tool, temporarily alleviated these issues but introduced new dependencies.
Then, in April 2005, things went badly wrong. The BitKeeper “license was abruptly withdrawn, and, overnight, we found ourselves without the software tool on which we had built the entire development process. This brought everything to a halt. We were all kind of sitting there looking at each other, saying, ‘Now what?’” Torvalds responded by creating Git in 10 days. Today, this free and open-source version control system has revolutionized not only Linux development but software collaboration worldwide.
This development model, now emulated by many projects, enables rapid innovation without sacrificing reliability. Each year, 4,000 to 5,000 developers contribute over 80,000 commits, supported by a diverse array of companies, none of which dominates the project.
Corbet also highlighted the importance of embracing new technologies, such as the Rust programming language, to ensure the kernel’s long-term health and attract new contributors. “If you come back in five or ten years,” he predicted, “you’re going to see a very different looking kernel source base, and I think that’s really important for our sustainability.”
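To get a sense of how different that source base may look, here is a minimal sketch of what a Rust kernel module can look like, loosely modeled on the rust_minimal sample shipped in the kernel’s samples/rust directory. The module name, struct name, and author string below are illustrative, and the exact macro fields and trait signatures have shifted between kernel releases, so treat this as an outline rather than a drop-in module.

```rust
// SPDX-License-Identifier: GPL-2.0
// Illustrative sketch of a Rust kernel module, loosely based on the
// in-tree samples/rust/rust_minimal.rs example. Exact APIs vary by
// kernel version; the names here (hello_rust, HelloRust) are made up.

use kernel::prelude::*;

module! {
    type: HelloRust,
    name: "hello_rust",
    author: "Example Author",
    description: "Minimal Rust kernel module sketch",
    license: "GPL",
}

struct HelloRust;

impl kernel::Module for HelloRust {
    fn init(_module: &'static ThisModule) -> Result<Self> {
        // Runs when the module is loaded (or at boot, if built in).
        pr_info!("hello_rust: loaded\n");
        Ok(HelloRust)
    }
}

impl Drop for HelloRust {
    fn drop(&mut self) {
        // Runs when the module is unloaded with rmmod.
        pr_info!("hello_rust: unloaded\n");
    }
}
```

Even this small sketch shows the shift Corbet is pointing to: memory-safe, idiomatic Rust sitting alongside the kernel’s traditional C code.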