High Speed Data Compression – Cold Fusion Of The Internet?
from the hyperbole-of-the-internet? dept
Here’s an article about some researchers who are claiming they’ve created a new way to compress data that they think can revolutionize data transfer and storage. They say it challenges theoretical assumptions about compression, and have even called it the “cold fusion” of the internet – which could be more accurate than they realize. I’m waiting until I see/hear more about this. The language people are using to describe this seems way too extreme at this point.
Comments on “High Speed Data Compression – Cold Fusion Of The Internet?”
be skeptical... very skeptical..
“The company has thus far demonstrated the technology only on very small bit strings, but if the formula can be scaled up to handle massive amounts of data, the consequences could be enormous.”
WHAT?!? Who tests “revolutionary” compression algorithms on “very small bit strings”? How much processor time does this new technique take? There’s gotta be some trade-off here, like it takes 10 supercomputers an average human lifespan to compress one 650MB disk… (if they’re going to exaggerate the benefits, I might as well exaggerate some drawbacks.)
Re: be skeptical... very skeptical..
You’re just a pessimist – they said it was “high-speed”. I mean, c’mon, we had the same development hurdles when the microprocessor was first invented. The earliest ones had a 4-bit bus and were very expensive to make (thousands of dollars per processor). 8-bit buses didn’t happen (cheaply) until shortly before the Intel 8086.
So even if this technology is currently expensive (in CPU cycles), if it proves scalable it will no doubt revolutionize the entire industry.
I have a feeling the scalability challenge is more related to the actual underlying principles involved (i.e. quantum mechanics, physics, etc.) rather than the CPU cycles used to achieve the result. What if, for example, the mathematics being used has limitations when it comes to larger chunks of data?
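There is, in fact, a well-known counting (pigeonhole) argument for why no lossless scheme can shrink every possible input, which is exactly the kind of theoretical limit that makes “works on very small bit strings” a red flag. Here’s a minimal sketch in plain Python that just counts possible inputs versus shorter outputs – the function names are illustrative, not anything from the company’s claimed method:

# Pigeonhole counting argument: no lossless compressor can shorten every input.
# There are 2**n distinct bit strings of length n, but only 2**n - 1 strings
# of length strictly less than n, so an "always shrinks" codec would have to
# map two different inputs to the same output and could never decode both.

def strings_of_length(n: int) -> int:
    """Number of distinct bit strings of exactly n bits."""
    return 2 ** n

def strings_shorter_than(n: int) -> int:
    """Number of distinct bit strings of fewer than n bits (lengths 0..n-1)."""
    return sum(2 ** k for k in range(n))  # equals 2**n - 1

if __name__ == "__main__":
    for n in (8, 64, 1024):
        inputs = strings_of_length(n)
        outputs = strings_shorter_than(n)
        assert inputs > outputs  # always short by exactly one slot
        print(f"{n}-bit inputs: {inputs} possibilities, only {outputs} shorter outputs")

So if their formula really does “challenge theoretical assumptions,” either it’s lossy, or it only works on inputs with exploitable structure – or it’s cold fusion in the bad sense.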