Utility Computing Still The Next Next Big Thing
from the it-doesn't-matter-that-IT-doesn't-matter dept
Computing as a utility has been heralded as the “next big thing” for several years now. Although it has yet to make it big, some, like Nicholas Carr, are convinced that it’s finally starting to take off. For Carr, it’s the logical extension of the idea that a company’s IT setup doesn’t provide any competitive advantage, and thus isn’t worth keeping in house. It’s a point he’s been making for some time, and moves by Microsoft, Google and Sun would appear to back him up that things are happening now. But companies making moves does not in itself vindicate the vision.

While broadband and cheap hardware may make it easier to build a gigantic server farm, other factors may work against its ascent. Hackers are moving upstream, attacking centralized points like DNS servers, and concentrated computing “power plants” will make particularly inviting targets. The rise of p2p architecture, as a way of reducing bandwidth congestion, also reinforces the importance of having intelligence and power at the edge of the network. Advocates of utility computing like to draw an analogy to electricity (which is definitely a commodity), but even there the industry is moving in a different direction: instead of building massive power plants, a lot of R&D is focused on distributed, on-site power generation. Also, unlike actual power plants, which are heavily regulated and enjoy enormous barriers to entry, an equivalent computing plant will have very little protection from competition. Instead, companies will be pouring money into rapidly depreciating equipment that will be built and sold for much less in a few years. While there are likely to be some useful applications of the model, it’s way too early to say that it’s finally arrived. As for the companies betting heavily on it, it may turn out to be a costly mistake.