High Speed Data Compression – Cold Fusion Of The Internet?

from the hyperbole-of-the-internet? dept

Here’s an article about some researchers who are claiming they’ve created a new way to compress data that they think can revolutionize data transfer and storage. They say it challenges theoretical assumptions about compression, and some have even called it “cold fusion” for the internet – which could be more accurate than they realize. I’m waiting until I see/hear more about this. The language that people are using to describe this seems way too extreme at this point.
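For what it’s worth, the “theoretical assumptions” such claims run into include a simple counting fact: no lossless compressor can shrink every input, because there aren’t enough shorter strings to go around. Here’s a minimal Python sketch of that pigeonhole argument (the function names are just illustrative, not from the article):

```python
# The pigeonhole argument against "compress everything" claims:
# there are 2**n distinct bit strings of length n, but only
# 2**n - 1 distinct bit strings of length strictly less than n.
# So any lossless scheme must map at least one length-n input
# to an output that is the same length or longer.

def strings_of_length(n):
    """Number of distinct bit strings of exactly length n."""
    return 2 ** n

def strings_shorter_than(n):
    """Number of distinct bit strings of length 0 through n-1."""
    return sum(2 ** k for k in range(n))  # equals 2**n - 1

for n in (8, 16, 32):
    inputs = strings_of_length(n)
    outputs = strings_shorter_than(n)
    # Fewer available short outputs than inputs: compression
    # of *every* input is impossible without losing information.
    assert outputs < inputs
    print(f"n={n}: {inputs} inputs, only {outputs} shorter outputs")
```

None of which rules out great compression of *typical* data, of course – it just means any claim to compress arbitrary data is mathematically off the table.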


Comments on “High Speed Data Compression – Cold Fusion Of The Internet?”

mhh5 says:

be skeptical... very skeptical..

“The company has thus far demonstrated the technology only on very small bit strings, but if the formula can be scaled up to handle massive amounts of data, the consequences could be enormous. “

WHAT?!? Who tests “revolutionary” compression algorithms on “very small bit strings”? How much processor time does this new technique take? There’s gotta be some trade-off here, like it takes 10 supercomputers an average human lifespan to compress one 650MB disc…. (if they’re going to exaggerate the benefits, I might as well exaggerate some drawbacks.)

free says:

Re: be skeptical... very skeptical..

you’re just a pessimist – they said it was “high-speed”. i mean, c’mon, we had the same development hurdles when the microprocessor was first invented. When they were first being developed they had a 1-bit or 2-bit bus and were very expensive to make (in the thousands/proc). 8-bit buses didn’t happen (cheaply) until shortly before the Intel 8086.

so even if this technology is expensive (in CPU cycles) currently, if scalable, it will no doubt revolutionize the entire industry.

i have a feeling the scalability challenge is more related to the actual underlying principles involved (ie quantum mechanics, physics, etc) rather than the CPU cycles used to achieve the result. what if, for example, the mathematics being used has limitations when it comes to larger chunks of data?

