My first job out of school was in 1995 working for Silicon Graphics on high end supercomputers. In particular, I was working on the Irix kernel scheduler.
In fact, my first paper was about a piece of technology that is now viewed as very archaic: batch scheduling. At the time the problem was an area of active research…
Supercomputers were about to hit a brick wall that year: a combination of the end of the Cold War, which killed DARPA funding, and the increased performance of x86 processors and networks, which made clustering technologies good enough for an increasingly large share of the computational pie.
In parallel with the implosion of SGI's server business, the high end graphics business imploded as well.
The Octane was supposed to be the next generation high end workstation, only for SGI to discover that the combination of AGP and Nvidia made a completely custom design… well, a lot less valuable.
1996 was the last profitable year for SGI.
One of the more vivid recollections I have about the era is the discussion of the Top 500 supercomputer sites. Folks in the supercomputer biz bemoaned that Intel would soon dominate the list with commodity computing systems… that the entire era of supercomputers, with their amazing underlying technologies, was about to go away.
Nearly a decade after I left SGI, I attended a conference at Vail where I heard the exact same speech.
Ten years later, the Top 500 list still has a collection of eclectic system designs.
And that got me thinking about supercomputers and their markets and the business economics …
The most important part of this blog is to tell you that there are other people who have written about this elsewhere. The most famous, and the most brilliant, is a book titled The Innovator's Dilemma by Clayton Christensen. If you haven't read that book, stop and go read it.
Waiting.
Did you read it?
Waiting.
Good, you’re done.
Alright.
So what is a Supercomputer market? A supercomputer market is a market where the computational requirements are increasing faster than Moore’s law or are inherently so large that conventional computing systems are too slow at present and for the foreseeable future.
A much greater Systems Architect than I described it this way: the customer wants performance to increase 4x every two years.
To deliver that kind of performance, vendors have to deliver exotic computational architectures that are at the limits of what humanity can create at this time.
The price the customer pays for that kind of horsepower is determined by the business value of the problem being solved.
And as long as the customer's computational needs remain beyond what conventional computer systems can deliver, every subsequent generation will command about the same price or more.
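To make that concrete, here is a tiny back-of-the-envelope sketch. The growth rates and starting point are illustrative assumptions, not vendor data: a customer whose needs grow 4x every two years against conventional systems that roughly double over the same period.

```python
# Back-of-the-envelope comparison: customer demand growing 4x every two years
# vs. conventional systems improving at a Moore's-law-like 2x every two years.
# All starting points and growth rates here are illustrative assumptions.

def compound(start, factor, periods):
    """Compound growth: start * factor ** periods."""
    return start * factor ** periods

for year in range(0, 11, 2):
    periods = year // 2
    demand = compound(1.0, 4.0, periods)       # what the customer needs
    mainstream = compound(1.0, 2.0, periods)   # what commodity systems deliver
    print(f"year {year:2d}: demand {demand:7.0f}x  mainstream {mainstream:4.0f}x  "
          f"gap {demand / mainstream:4.0f}x")
```

As long as that gap keeps widening, the customer has no real alternative to the exotic architecture, which is exactly what lets each generation command the same price or more.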
The nice property of supercomputer markets is that it’s practically impossible for new entrants to compete in the space unless they can figure out how to fundamentally disrupt the incumbent. Especially if the market has been evolving for a long time.
If you had to build a new supercomputer from scratch the capital cost would be staggering, never mind the challenge of finding the brain power necessary to build it outside of the established vendors.
The amazing sweet spot for a business is when a supercomputer market is a broad market with huge computational needs that can only be satisfied with supercomputers. Basically you are building these amazing computers that people will pay a large chunk of cash for, because the choice is to buy them or not be in the business that requires them. And there are a lot of those people.
A key indicator of not being in a supercomputer market is if the customer doesn't have to buy the next generation of hardware and can still remain in the business that uses it. In other words, the next generation is an improvement but no longer essential.
How is this different from The Innovator's Dilemma?
The Innovator's Dilemma focuses on individual vendors and how they get disrupted but ignores the broader market trend. In point of fact, the Innovator's Dilemma focuses on how new architectures can disrupt alternative architectures while the broader supercomputer economics trend remains.
If we look at disk drives, the canonical example in the book, the Innovator's Dilemma observes that disk drive vendors came and went. My observation is that there was a macro need for more storage, and as long as that remained true, supercomputer economics would hold. The broader supercomputer economics trend held true even as many vendors got disrupted… It was only in 1996, when capacity prices collapsed, that the average computer user had their storage needs satisfied.
My favorite example is the PC because it’s not a supercomputer 🙂
For almost 18 years, consumers would spend about $5k on a new PC, because the next generation PC was so much better than the last generation PC. The computational needs of consumers were inherently greater than what computers could deliver in 1981 and remained so until 1999. For 18 years, all you had to do if you were Dell was build a computer, and people would buy it because their needs were unmet by the current generation of computers.
And that brings me to the problem with supercomputer markets …
It turns out that there are two kinds of supercomputer problems. The first is what I call inherently computationally hard. These are the kinds of problems where you are trying to simulate physical processes or tackle problems that are so hard they are inherently computationally expensive and will remain so indefinitely. The second kind is what I call capped computational problems.
And maybe this is my second insight.
A capped computation problem is one where humans are consuming the computation directly. If you are building something for people, eventually you run into that bottleneck – the human ability to perceive and interact.
Put differently, supercomputer markets can exist indefinitely as long as you are processing machine level interactions that are not gated by human processing.
So what ends this kind of macro trend?
1. Solving the problem
2. Lack of interest in the problem
Some examples of 1 include things like extreme high-end 3D graphics and PCs. An example of 2 is the Cold War dividend: when Clinton cut military funding, that cut funding for the purchase of supercomputers, because a class of problems was simply no longer that interesting.
The problem with supercomputer markets that are constrained by human perception is that eventually the computers get too fast. And the reality is that these markets eventually end. This doesn't mean that the market for the product goes away, but the very nature of the market for the product changes.
So what happens when supercomputer markets end?
Typically the incumbent vendor goes out of business super-fast. Basically no one wants to buy their next generation hardware because the last one solved the problem.
And at that point the market transitions to an enterprise market with different economics. The most important being that customers want the next generation to be twice the performance at half the cost.
Visually:
What this picture tries to show is that while customer demand is unmet by supercomputer technology, the supercomputer technology continues to thrive, selling each new generation of hardware. Once the supercomputer technology meets customer demand, customer demand shifts to mainstream technologies.
This is not a case of the mainstream getting good enough; instead it's a case of the customer no longer caring about incremental improvements because the problem is solved.
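Here is a rough toy model of that picture, with made-up numbers purely for illustration: demand grows quickly but is capped by human perception, while mainstream technology keeps doubling. The period where the mainstream curve crosses the capped demand curve is where the supercomputer market ends and the enterprise market begins.

```python
# Toy model of the picture above. All numbers are illustrative assumptions:
# demand grows 4x per two-year period until it hits a human-perception cap,
# while mainstream technology doubles every two years. The crossover is the
# point where the supercomputer market ends.

DEMAND_CAP = 500.0        # human-perception ceiling (arbitrary performance units)
DEMAND_START = 4.0        # demand starts out well ahead of what mainstream delivers
DEMAND_GROWTH = 4.0       # demand multiplier per two-year period, until capped
MAINSTREAM_GROWTH = 2.0   # commodity-system multiplier per two-year period

def demand(period):
    return min(DEMAND_CAP, DEMAND_START * DEMAND_GROWTH ** period)

def mainstream(period):
    return MAINSTREAM_GROWTH ** period

for period in range(12):                # twelve two-year periods = 24 years
    year = 2 * period
    d, m = demand(period), mainstream(period)
    market = "supercomputer market" if m < d else "enterprise market"
    print(f"year {year:2d}: demand {d:6.0f}  mainstream {m:6.0f}  -> {market}")
```

In this toy model the flip happens in a single generation once the curves cross, which matches the "goes out of business super-fast" observation above.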