How Linux Conquered the Data Center
A sign with Linux penguins is seen in front of Oracle headquarters in Redwood Shores, California, in 2007. (Photo by Justin Sullivan/Getty Images)

Linux’s ubiquity in the data center wasn’t by design. It was the effect of a powerful and unpredictable undercurrent.

Some of the people who worked to create the original Linux operating system kernel remember this time with almost crystalline clarity, as though a bright flashbulb had indelibly etched its image on the canvases of their minds.

In 1998, Red Hat was continuing to gather the names of new allies and prospective supporters for its enterprise Linux.  Several more of the usual suspects had joined the party: Netscape, Informix, Borland’s Interbase, Computer Associates (now CA), Software AG.  These were the challengers in the Windows software market, the names to which the VARs attached extra discounts.  As a single glimpse of the Softsel Hot List or the Ingram Micro D sales chart would tell any CIO studying the market, none of these names was the leader in its software category, nor was any expected to become one.

One Monday in July of that year, Oracle added its name to Red Hat’s list.

“That was a seminal moment,” recalled Dirk Hohndel, now VMware’s Chief Open Source Officer.  He happened to be visiting the home of his good friend and colleague, Linus Torvalds — the man for whom Linux was named.  A colleague of theirs, Jon “Maddog” Hall, burst in to deliver the news: Their project was no longer a weekend hobby.

“Linus and I looked at each other and said, ‘Wow!  That will change the world!’”

“When Oracle started to support their database on Red Hat Linux,” related Mark Hinkle, the Linux Foundation’s vice president of marketing, “that was a signal to the industry that you could trust your financial data to a Linux operating system.”

“It was an announcement of something that Oracle had planned to do,” noted Hohndel.  “We all have seen how these announcements often play out.”

The advocates of free and open software, Hall among them, had long spoken about the inexorable march of progress and the alignment of free software with free expression.  But history will show that the progenitors of Linux first perceived victory not when some debate opponent conceded the merits of the GNU license, but rather when a major commercial database vendor gave Linux its tacit stamp of approval.

The Facilitator

The tech journalists of the day — veterans of the battle between MS-DOS and OS/2, and between Windows and Macintosh — tended to view any prospective platform battle in the enterprise market as political warfare, with key supporters enlisting in one camp or the other.

Oracle’s news did not have a preassigned camp.  A casual admission that Oracle was testing the Linux support waters appeared in the thirteenth paragraph of a page-5 InfoWorld story about Informix’s announcement that it would support Linux.  Maybe Informix’s move was prompted, the piece suggested, by rumors of Linux running in Oracle’s labs.

The cover of Canada’s weekly Maclean’s magazine for July 20, featuring Red Hat founder Robert Young, was titled, “The War for the Desktop.”  Right scale, wrong geography.

 

Oracle's then-CEO, Larry Ellison, delivers a keynote address at the 2006 Oracle OpenWorld conference, October 25, 2006, in San Francisco. (Photo by Justin Sullivan/Getty Images)

 

Critical mass for enterprise Linux had already been achieved.  Oracle was aware of this before almost everyone else, even before the people who created Linux, which is one reason they remember the event as a historical milestone.  But they weren’t thinking only of what it meant for Linux.

“There was this sort of greenfield opportunity with the explosion of the Internet and connected technologies over a non-private network,” Hinkle said.  “So the Web [HTTP], e-mail, and DNS became opportunities for Linux.  The Apache Web Server and BIND running on Linux started to grow and really exploded, because the need for these services exploded.”

Together, these services formed a flexible, adaptable stack.  A technician could easily build a server, install Linux, install the stack, and deploy the server.  Later, an operator could instantiate a virtual server with just as few steps, but in a few minutes rather than a few hours.
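
The steps were simple enough to script.  What follows is a minimal provisioning sketch, purely illustrative and not the tooling of the era: it assumes a modern Debian-style Linux host with apt and systemd, run as root, and installs the web, DNS, and database services Hinkle describes.

    #!/usr/bin/env python3
    """Minimal provisioning sketch: install and enable the classic open source
    services on a Debian-style host.  Assumes apt, systemd, and root access."""
    import subprocess

    # The services described above: web (Apache), DNS (BIND),
    # and a queryable database (MariaDB as the MySQL-compatible package).
    PACKAGES = ["apache2", "bind9", "mariadb-server"]
    SERVICES = ["apache2", "named", "mariadb"]

    def run(cmd):
        """Run a command and fail loudly if it returns non-zero."""
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    def provision():
        run(["apt-get", "update"])
        run(["apt-get", "install", "-y"] + PACKAGES)
        for service in SERVICES:
            # Start each service now and have it come back on every boot.
            run(["systemctl", "enable", "--now", service])

    if __name__ == "__main__":
        provision()

The same script, pointed at a freshly instantiated virtual machine instead of a physical box, is the minutes-instead-of-hours version of the story.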

This was the breakthrough:  Organizations could build infrastructure — first physical, then virtual — when and where it was needed.  Linux was just a facilitator.

The Tide

Hohndel, one of the Linux kernel’s original midwives, has honed a theory over the years as to the true underpinnings of Linux’s success.  Though he cites events such as the December 1999 IPO of VA Linux Systems as critical to establishing Linux as a premier OS in the public mind, a more significant trend was running beneath the headlines.

The ascent of open source services brought with it DNS, and with DNS the ability for data centers to decide which servers were delegated which workloads, according to their internet domains.  Then MySQL brought along the ability to simply query information — especially server logs — without invoking some gargantuan data warehouse on a per-seat licensing basis.  The rise of MySQL’s stature — especially in the tech press — may have precipitated Oracle’s strategy.
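
For a sense of what that meant in practice, here is a toy sketch of the kind of ad hoc log query a free database made routine.  It is illustrative only, and uses Python’s standard-library sqlite3 module as a stand-in so it runs anywhere; against MySQL the SQL would read essentially the same.

    #!/usr/bin/env python3
    """Toy example: ask which web servers are throwing errors, straight from a
    log table, with no data warehouse and no per-seat license involved."""
    import sqlite3

    conn = sqlite3.connect(":memory:")  # stand-in for a MySQL connection
    conn.execute("CREATE TABLE access_log (ts TEXT, host TEXT, status INTEGER)")
    conn.executemany(
        "INSERT INTO access_log VALUES (?, ?, ?)",
        [("1998-07-20 09:01", "web01", 200),
         ("1998-07-20 09:02", "web01", 500),
         ("1998-07-20 09:03", "web02", 200)],
    )

    # One query answers the operational question directly.
    query = """
        SELECT host, COUNT(*) AS errors
        FROM access_log
        WHERE status >= 500
        GROUP BY host
    """
    for host, errors in conn.execute(query):
        print(host, errors)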

“The Apache Web server, and MySQL, and certainly PHP played a huge role in the success of open source, and Linux is part of this,” said Hohndel.

Open source alternatives threatened the comfortable position of databases and productivity packages in the enterprise, not because they were attacking the desktop as Maclean’s suggested, but rather because they were rendering the desktop irrelevant.  Oracle was among the first to respond seriously.  That response garnered the attention of CIOs and IT managers.

The moment the enterprise witnessed Oracle bestowing its blessing upon Linux as a legitimate option for hosting business databases, the CIO began taking Linux as seriously as the IT manager already did.

“If it hadn’t been for the early free software movement and the open source movement in the late ‘90s, Linux by itself would not have changed the world,” Hohndel continued.  “Open source is a software development methodology.  It is fantastic at creating collaboration, innovation, and shared APIs.  But it actually has a very poor track record at creating enterprise, production-ready software.”

When open source met virtualization, the resulting chemistry changed the world.  Infrastructure became virtual, and workloads became portable.  The cloud was born.  Enterprise software publishers couldn’t come up with anything competitive, because they had not yet fathomed what the cloud actually meant.

“I often compare software development to biological processes,” explained Linus Torvalds, during a recent Open Source Leadership Summit, “where it really is evolution.  It is not intelligent design.  I’m there in the middle of the thing, and I can tell you, it is absolutely not intelligent design.”

Software, as Torvalds perceives it, is a congealing, coalescing nebula of contributed ideas and trial-and-error functions, only some of which prove successful.  He argued that perhaps as important to the ascent of open source in the enterprise as Linux itself, if not more so, was the creation of Git (which he also spearheaded): the version control system that formalized and automated the contribution of bits and pieces of code upstream.  It is the guarantor of improvements and disentanglements, such as there are, not only in Linux but in all the components of the stack that Linux facilitates.
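
What Git formalized can be seen in miniature below.  This is a self-contained sketch, not the kernel community’s actual workflow or repositories: it fabricates a tiny local “upstream” tree so the whole clone-branch-commit-patch cycle can run anywhere git is installed, with placeholder names throughout.

    #!/usr/bin/env python3
    """Sketch of the upstream contribution cycle Git formalized: clone, branch,
    commit, and package the change as an e-mailable patch.  All names are
    placeholders; a throwaway local repository stands in for upstream."""
    import subprocess
    from pathlib import Path

    def git(*args, cwd="."):
        """Run a git subcommand and fail loudly if it returns non-zero."""
        subprocess.run(["git", *args], cwd=cwd, check=True)

    def identify(repo):
        """Give the repository a local identity so commits work anywhere."""
        git("config", "user.name", "Example Contributor", cwd=repo)
        git("config", "user.email", "contributor@example.org", cwd=repo)

    # A throwaway stand-in for a maintainer's upstream tree.
    git("init", "upstream")
    identify("upstream")
    Path("upstream/README").write_text("the project\n")
    git("add", "README", cwd="upstream")
    git("commit", "-m", "Initial commit", cwd="upstream")

    # The contributor clones it, isolates a change on a branch, and packages
    # the change as a patch file ready to send upstream for review.
    git("clone", "upstream", "work")
    identify("work")
    git("checkout", "-b", "add-notes", cwd="work")
    Path("work/NOTES").write_text("a small contribution\n")
    git("add", "NOTES", cwd="work")
    git("commit", "-m", "Add contributor notes", cwd="work")
    git("format-patch", "origin/HEAD", cwd="work")  # writes 0001-Add-contributor-notes.patch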

 

Linux creator Linus Torvalds speaking at the Open Source Leadership Summit, February 15, 2017

 

“All the really stressful times for me personally have been about process,” said Torvalds.  “They’ve never been about code.  When code doesn’t work, that can actually be exciting.  That can be fun!  You hit your head on a wall for a couple of weeks and that can be very frustrating; but then when you solve it, the feeling of accomplishment is huge.”

Git enabled an ecosystem for software that enlisted the customer as a participant, not just a beneficiary.  It spurred the development of containerization platforms such as Docker and CoreOS, orchestration platforms such as Kubernetes, and schedulers such as Mesosphere DC/OS.  While Linux is the progenitor of Git, it is Git that may be responsible for the ultimate conquest of the data center.

The Reformation

Like Keurig with its encapsulated coffee and HP with its impregnable ink cartridges, Microsoft believed it had a permanent supply line to its own gravy train.  Virtualization blew up that train.  Until then, if an enterprise had a key business process or an application relying upon a critical database, it needed a Windows-based platform to run it.  Windows was tied to processors.  Thus there appeared to be an unbreakable chain linking databases to operating systems to processors.

Once that chain was broken, the concept of composable infrastructure was born.  But the world still looked to Microsoft to provide it.

Instead, in 2007, Microsoft began a campaign of lowered expectations for what Windows Server could deliver.  Incapable of replicating the live migration feature that VMware demonstrated with aplomb, Microsoft postponed deadlines into the stratosphere and in the interim tried convincing its customers (through the press) that such a feature wasn’t something anyone wanted anyway.

 

Microsoft's then-CEO, Steve Ballmer, delivers the keynote address for the global launch of Windows Server 2003, April 24, 2003, in San Francisco. (Photo by Justin Sullivan/Getty Images)

 

The coup de grâce against Microsoft’s mountaintop position in enterprise operating systems had been delivered… but not by Linux.  Leveraging so much functionality upon a single design point — the OS/processor connection — ended up immobilizing Windows Server at the time it needed to evolve most.  This, in turn, forced the server industry to implement a massive work-around, paving the way for cloud platforms.

Linux marched in, along with that conquering force.  Today, Microsoft is in the midst of “re-imagining,” if you will, Windows Server as a service provider — as a key component, but certainly one component among many.  For Microsoft, the platform has become Azure.

“As a platform, specifically a virtualization platform, we need to ensure that we can host both Windows Server and Linux workloads equally well,” Erin Chapple, Microsoft’s Windows Server General Manager, said in a message to Data Center Knowledge.  “In retrospect, we should have started that work sooner for Linux.”

Note how Chapple relocates “we” and “platform” outside of Windows Server, which is no longer presented as the ecosystem in its totality.

“We believe that customers look at the overall value proposition of data center products in their procurement decisions, from licensing to innovation roadmap and beyond,” she said.  “In the last couple decades, this approach has led to the mixed approach to operating systems in the data center that many organizations, including Microsoft, have today. . . As part of our approach to listening to customers and our learnings from running one of the two largest commercial clouds in the world, we’ve invested in an open and flexible platform that is all about choice and supporting customers to run all their workloads, including Linux, and in Windows Server 2016, a product designed to meet our customers’ unique needs in the modern data center.”

Oracle’s stamp of approval, the rise of the stack, the triumph of virtualization, the easy success of Git, the empowerment of the cloud, and the collapse of the tower that Microsoft built — all these independent factors resulted in today’s state of affairs in the enterprise data center.

And what is that state, exactly?  There’s a good argument to be made that Windows Server has not really been vanquished — indeed, that its place in the enterprise remains guaranteed, even as just the support platform for certain applications.  Comparing Windows Server to Linux in today’s server environment may be akin to comparing a piston engine to a square yard of asphalt in today’s highway environment.  The only folks who still see a need to pair them together may be the ones who publish headlines for a living.

“I think that there’s always going to be — especially for coordinated, massive services — a need for an operating system and a scheduler,” remarked the Linux Foundation’s Hinkle.  “But I do think you are going to see in the future containers running directly on the chip without the operating systems that you see today.  With more IT, there’s just going to be more use cases.  I don’t know that any time soon you’re going to see a situation where the operating system isn’t important.  It’s just going to be more and more abstracted.”

Hohndel and Torvalds were right that the world would change.  And in this new and altered realm, the part that scored the first blow in a triumphant battle may continue to exist.  But historians — if there are any — may have a difficult time identifying it.
