Are Supercomputers Obsolete?

from the just-string-together-some-Playstations dept

With all the talk recently about this or that new computing “cluster” technology, some researchers are suggesting that the US government is wasting its money trying to fund the next great supercomputer. Instead, they argue, the money should go directly to research on improving storage systems. The argument is that new cluster technologies and distributed computing make the idea of a dedicated supercomputer increasingly obsolete. There might be some truth to those assertions. Of course, the article doesn’t point out that the researchers leading the charge are a bit biased. It talks about Gordon Bell from Microsoft’s Bay Area Research Center, who has gotten a lot of publicity for the “MyLifeBits” backup-brain project, which would (surprise, surprise) require some changes in data storage technology to become really feasible.



Comments on “Are Supercomputers Obsolete?”

Steve Janss says:

Supercomputer vs Supercluster

Interesting thought about superclusters making supercomputing obsolete…

Google, of course, runs on one of the largest and longest-running superclusters in the world; it has been going for several years now.

But is it a supercomputer?

Well, that depends upon the task at hand.

Some tasks can be broken down into many smaller tasks which require little, if any, intercommunication. The SETI screensaver project was one of those. This type of task lends itself to distributed computing, and the SETI project was (and still is, I believe) the largest, most powerful, most widely distributed supercluster on the planet.

Was it a supercomputer? Yes, since the task was to analyze radio frequency data for patterns: a single task, distributed across millions of computers all over the planet. In this case, each chunk of data can be analyzed independently of the others.
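
(To make that concrete, here is a minimal Python sketch of the same “embarrassingly parallel” pattern: independent chunks, each analyzed with no communication between them. The analyze_chunk function and the fake data are purely illustrative stand-ins, not anything from the actual SETI code.)

from multiprocessing import Pool
import random

def analyze_chunk(chunk):
    # Pretend "analysis": find the strongest sample in this chunk.
    # Nothing here depends on any other chunk.
    return max(chunk)

if __name__ == "__main__":
    # Fake radio-telescope samples, split into independent work units.
    data = [random.random() for _ in range(1_000_000)]
    chunks = [data[i:i + 10_000] for i in range(0, len(data), 10_000)]

    # The worker processes stand in for volunteer PCs scattered around the world.
    with Pool() as pool:
        peaks = pool.map(analyze_chunk, chunks)

    print("strongest signal seen:", max(peaks))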

Even the new weather supercomputer, as powerful as it is, would be many, many times slower if it were tasked to do what the SETI project did. Yet the distributed computer continued chugging away, day and night, using excess processor cycles!

To me, the technology is fairly simple, yet the concept is still amazing.

Other tasks require more frequent communication with shared data (Google’s search index, for example), so that kind of planet-wide distribution would NOT work. However, a super-cluster like the one Google runs works just fine, because each sub-task still has little (if anything) to do with the other sub-tasks. John’s search really has nothing to do with Sarah’s search, other than the fact that both are accessing the same database.
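
(A rough sketch of that case, assuming a toy inverted index standing in for Google’s: many independent queries reading the same shared, read-only data, never needing to talk to each other.)

from concurrent.futures import ThreadPoolExecutor

# Shared, read-only "index": word -> documents containing it. Purely illustrative.
INDEX = {
    "supercomputer": {"doc1", "doc3"},
    "cluster": {"doc2", "doc3"},
}

def search(query):
    # Each search only reads the shared index; it never coordinates
    # with any other search running at the same time.
    return INDEX.get(query, set())

with ThreadPoolExecutor() as pool:
    johns = pool.submit(search, "supercomputer")   # John's search
    sarahs = pool.submit(search, "cluster")        # Sarah's search
    print(johns.result(), sarahs.result())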

Yet other tasks require a high level of communication between the sub-tasks (weather supercomputer and the DoD’s nuclear simulation supercomputer). Thus, even a super-cluster connected with 100Base-T isn’t fast enough, as the lag time between nodes becomes the bottleneck.

This is when a real supercomputer is called for: one where the processors can talk to each other and to memory at something close to the speed your Pentium 4 talks to its L2 cache.

Now THAT’s fast! And it’s necessary in finite element analysis, where every element affects not only the elements immediately surrounding it, but nearby elements as well.
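
(Here is a toy illustration of that tightly coupled case, assuming a one-dimensional diffusion-style update as a stand-in for real weather or FEA code: every point’s new value depends on its neighbours at every single step, so nodes splitting this grid across a cluster would have to exchange boundary values on every iteration, and interconnect latency, not raw CPU speed, becomes the limit.)

def step(values):
    # One time step: each interior point is nudged toward its neighbours.
    new = values[:]
    for i in range(1, len(values) - 1):
        # The update needs BOTH the left and right neighbours.
        new[i] = values[i] + 0.25 * (values[i - 1] - 2 * values[i] + values[i + 1])
    return new

grid = [0.0] * 50
grid[25] = 100.0          # a hot spot in the middle
for _ in range(100):      # neighbour data is needed on every iteration
    grid = step(grid)
print(round(grid[25], 2), round(grid[20], 2))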

So, the correct answer is: A supercomputer is not defined merely by the hardware and software, but by the task it’s performing, as well.

Consider, for example, trying to task the SETI project with the weather data… The inter-node communication jam would bottleneck the entire project, and the overall speed of the supercomputer would probably bog down to something like that of a Cray-1 (or less).

Also consider trying to task the weather supercomputer with the SETI project – it would be many times slower than the millions of computers slogging away in the actual project.

Well, there you have it.

– Steve Janss
