To Boost Its New Crappy DRM, Hollywood Tries Giving Away Free Movies

from the free-sometimes-isn't-worth-the-cost dept

We’re always told that the reason there’s so much piracy out there is that “people just want stuff for free.” The facts don’t actually support this: we see people pay for things they could get for free all the time, and we know that those who get the most free stuff also tend to buy the most. In other words, price may be one component of how people decide to obtain content (and free is certainly an appealing price), but it is hardly the only one. One of the key issues, for many, is the freedom and/or convenience in how they can make use of that content, an area where DRM takes value away from the end user (which, by definition, lowers the price the average person is willing to pay).

Given all that, there’s something rather amusing about Hollywood’s new pitch for its UltraViolet platform. As you may recall, this is the kinder, gentler DRM for video content that the industry has been pushing. It does let you watch content on multiple devices (within limits), but it’s still DRM. And, as such, it’s no surprise that the reception to UltraViolet has been somewhat lukewarm.

In order to deal with that, the movie studios are trying something different: giving away free movies. Yes, there’s something bizarre about Hollywood using “free” movies as the incentive to get people to buy into its UltraViolet DRM, which is meant to pull them away from the “free” movies they were getting through unauthorized means. While it may attract a few people, it seems likely that the industry is about to (once again) discover the point many of us have been making for ages: it’s not just about free. If free comes with massive strings attached, such as annoying DRM, it’s just not going to attract that many people. If they were strategic thinkers, perhaps they’d finally realize that it’s about the overall package, and then maybe they’d stop making that package so annoying in an effort to stop some people from accessing the same content… for free.



Comments on “To Boost Its New Crappy DRM, Hollywood Tries Giving Away Free Movies”

89 Comments
Chuck Norris' Enemy (deceased) (profile) says:

Who do we fear?

And it comes down to trust. I can download a DRM-free film from a torrent site and hope it doesn’t contain device-crippling malware/trojans (you would be wise to scan the file after downloading). Or I can download a free, DRM-laden film from the studios that will probably not work how I want, or will cripple my device. The sad thing for Hollywood is that more people trust the wares and goodwill of generous ‘pirates’ than the producers of the original content.

Machin Shin (profile) says:

Re: Re: Who do we fear?

I think you’re a bit mistaken. After all, why do you think programs like VLC keep putting out updates? The video formats are not changing that much, and a lot of updates don’t add features.

It is not common, but it is not impossible for someone to make a “movie file” that is really a collection of malicious code written to exploit a flaw in a video player.

Zos (profile) says:

Re: Re: Re: Who do we fear?

First off, that’s nonsense. In 5 years of torrenting, I’ve never found a virus, outside of a few dodgy game cracks, which was mostly my own fault for not sticking to trusted uploaders. Never anything in a movie or music file.
Then again, I use a bit of common sense and actually check comments before downloading random crap, and I stick to streamed porn. (If you’re going to find something dodgy, it’s most likely in porn; for some reason porn downloaders all seem to be morons when it comes to computers.)

Secondly, it’s not like it’s hard to run Malwarebytes and fix it if you do get teh internet herpes.

Thirdly, most communities are pretty damned quick to pull dodgy files.

PaulT (profile) says:

Re: Re: Re:2 Who do we fear?

“In 5 years of torrenting, I’ve never found a virus”

In 20 years of owning a PC, I’ve never had a virus. The only one I ever got was before that, back before I knew what viruses were: a silly thing on the Atari ST that reversed your mouse cursor movement every tenth time you booted from an infected floppy. Cute.

But that doesn’t mean viruses aren’t a real problem. To give another example: when reports of malicious code automatically running from websites first appeared, I rejected the idea because it sounded like people blaming the site instead of their own badly secured PC. It’s a confirmed, real threat now.

“Then again, I use a bit of common sense”

A rare commodity, I’m afraid.

“Secondly, it’s not like it’s hard to run Malwarebytes and fix it if you do get teh internet herpes.”

Having worked in tech support facing the general public, I can safely say that most people don’t know what that is, and would probably put up with the problems caused by an infection rather than fix it, so long as the problems weren’t too intrusive.

“Thirdly, most communities are pretty damned quick to pull dodgy files.”

Key word: communities. More mainstream sites might not, and honeypot-style sites might even be set up deliberately to catch the clueless user who just googles for a file instead of using a trusted community site.

Rich says:

Re: Re: Who do we fear?

This isn’t true. Viruses can lurk in any kind of file, regardless of its “executability.” Malware can, and has, taken advantage of bugs in the software that processes a file, using techniques such as stack-smashing to trick the computer into executing code of the malware’s choosing. For example, a video file has a particular structure to it, with fields that have defined ranges of values. A poorly written program that displays the file may not check the ranges of those values, and so something bad happens, like overwriting code. Now, modern OSes and languages can guard against a lot of these types of attacks, but nothing is foolproof.
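A minimal sketch of the kind of parsing bug described above. The struct layout, field names, and function names here are all invented for illustration; real container formats are far more involved, but the failure mode is the same: trusting a length field that came from the file.

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical frame header: the length field comes straight from the
 * (possibly malicious) file, so it cannot be trusted. */
struct frame {
    uint8_t payload_len;     /* claimed payload size, 0..255 */
    uint8_t payload[255];
};

/* Unsafe: trusts the file's length field. If payload_len exceeds the
 * destination's capacity, memcpy writes past the end of dst. */
void copy_frame_unsafe(uint8_t *dst, const struct frame *f) {
    memcpy(dst, f->payload, f->payload_len);   /* no range check */
}

/* Safe: validates the field against the destination's real capacity
 * before copying, rejecting out-of-range values from the file. */
size_t copy_frame_safe(uint8_t *dst, size_t dst_cap, const struct frame *f) {
    size_t n = f->payload_len;
    if (n > dst_cap)
        n = dst_cap;          /* clamp (or treat the file as corrupt) */
    memcpy(dst, f->payload, n);
    return n;
}
```

The unsafe version is the one-line mistake the comment describes; nothing in the language stops it from compiling or running.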

Mason Wheeler (profile) says:

Re: Re: Re: Who do we fear?

Now, modern OSes and languages can guard against a lot of these types of attacks, but nothing is foolproof.

…especially when the people doing the coding are fools. Most software like this is written in C (or derivatives such as C++ or Objective-C), which is in no way a “modern language that can guard against these types of attacks.” By having no language-level safety features whatsoever, anything written in a C dialect practically comes with a big “HACK ME” sign as a standard feature.

We’ve known for decades that the C language is one big security hole (one that can’t be fixed, because too much existing code relies on these flaws to do “clever” things a microsecond or two more quickly than doing them in a sane way would take). But people keep using it, and people keep getting hacked because of it, leading to billions upon billions of dollars’ worth of damage. In any sane world, it would today be considered an act of criminal negligence to write any operating system or other network-facing software in a C dialect. But people still keep doing it…

Robert (profile) says:

Re: Re: Re:2 Who do we fear?

As someone experienced on both sides (embedded and .Net), I am sorry, but C is not that bad of a language. As someone else mentioned, the real problem is bad programmers.

Personally, I find Java and the .Net platform too bloated: you have way too many libraries added, swelling your binary (and memory footprint) well beyond what should be required.

And those “safe” platforms have their own security faults. Again, bad programmers are the problem. There’s no excuse for memory leaks, other than laziness or bad management that won’t let the developer do things properly. That’s not a C problem either!

C#’s garbage collector is not infallible, neither is Java’s.

C has had a LONG life and a LOT of use, which means its pitfalls are well known. C#, for example, is far from being totally understood; not all of its pitfalls are known. The same goes for the .Net framework, or any of the web frameworks.

Criminal negligence for using a non-flavour-of-the-month framework? That’s quite extreme.

Code properly and you won’t have a problem; there’s less risk with a well-known language like C (which is a lot more than a few microseconds faster!) than with newer frameworks like .Net 4.5.

You should really revisit C, or at the minimum, take some embedded courses and see the power of such a language.

I’d take C any day over C#. C# has its place and C has its place. Those places don’t necessarily need to overlap and in some cases they should not overlap at all!

Mason Wheeler (profile) says:

Re: Re: Re:3 Who do we fear?

You’re making a lot of assumptions here, and putting a lot of words in my mouth. I never said anything about Java or .NET, or memory leaks for that matter. And I never said anything about “criminal negligence for using a non-flavour-of-the-month framework”. I said criminal negligence for using a tool that is known and has been widely known since 19-freaking-88 to be unsuitable. (In my original post I mentioned 1988, with a link to the Wikipedia article on the Morris Worm. Not sure what happened, but apparently something in Techdirt’s system didn’t like the link.)

If a contractor knowingly built a building using shoddy materials, and that building later collapsed, causing significant harm and property damage, the contractor would be held liable. How is this any different?

Personally, I avoid Java and .NET whenever I can, and I’m well aware of the problems, both conceptual and practical, with using a garbage collector. (See http://programmers.stackexchange.com/questions/129530/what-are-the-complexities-of-memory-unmanaged-programming/129555#129555 for my thoughts on the matter.) And I “revisit C” fairly often, generally to fix stupid bugs in open-source libraries I’m using. And about half the time, it’s a bug that would have been impossible to make in a sane language.

Contrary to popular belief, it’s actually possible to do native code, with all the benefits thereof, without C. Here’s a fun fact for you: by the time the Morris Worm came around and conclusively proved that C is unsuitable for its original intended purpose, namely OS development, Apple had already been busy for several years reinventing the concept of the operating system, laying the foundations of the modern OS design that the entire home computer revolution has been built on ever since. In Pascal.

Robert (profile) says:

Re: Re: Re:4 Who do we fear?

I responded to what you wrote. “Criminal negligence” was your phrase!

As for Pascal, what is Apple using now for OS X? Are they still using Pascal? What embedded systems run with Pascal at the core of their OS design?

I am actually asking. Just because everyone uses C doesn’t mean it’s flawless, I know, but it doesn’t necessarily mean it is totally absurd.

I also find it disingenuous to compare a standard OS for a typical user (most critical systems have been hardened against the failures you are referring to, though I am certain vulnerabilities exist) to shoddy construction materials. There’s no threat of loss of life in using a standard computer, by comparison with shoddy craftsmanship or improper use of available materials.

And “I never said anything about Java or .Net…” You didn’t say anything about Pascal or anything, you just bashed C.

http://en.wikipedia.org/wiki/Morris_worm

How does this explain why C is not the best language for operating systems?

According to the article, the worm exploited known vulnerabilities. How does that prove C was the problem, or that it was not the best choice for an OS?

Some other reading:
http://stackoverflow.com/questions/520068/why-is-the-linux-kernel-not-implemented-in-c

I do wonder though, Apple used assembly and an extended form of Pascal.

Perhaps the frequency of hardware changes, variety of hardware available, and productivity requirements are the reason people use C and not assembly as the core?

Please explain how it is unsuitable.

All I can find is:
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.170.9818&rep=rep1&type=pdf

Which states that:
- Data hiding above the function level is not supported in C
- Free use of global variables is common (a programmer problem, not a language problem) and results in mayhem when global variables are manipulated externally, which makes C unsuitable for component-based systems
- C lacks concurrency support and has no language constructs for synchronization, which makes it difficult to port kernel extensions (i.e., device drivers) from one OS to another

Is that what you meant?

Mason Wheeler (profile) says:

Re: Re: Re:5 Who do we fear?

Yes, I did say “criminal negligence,” and I meant it. What I didn’t say was “flavour-of-the-month framework.” You make it sound like I’m in favor of pushing some fad on developers through legal means, which is obviously ridiculous. What I am actually in favor of is recognizing the willful use of shoddy materials for what it is, and imposing a standard of liability, just like we have for physical construction.

Yes, admittedly, the Pascal language has lost a lot of popularity since those days. But that has nothing to do with technical merit. You can sum up the reason why in two words: Brian Kernighan. He was a very talented, very persuasive writer, and people have been treating his paper “Why Pascal is not my favorite programming language” practically as gospel ever since it was published, despite the facts that:

1) a lot of the objective language-level problems (as opposed to the purely stylistic gripes) he criticized were already obsolete when he wrote about them, and the rest no longer apply and haven’t for a long time now, and
2) as the co-author of the definitive book on C programming, Kernighan had an obvious direct monetary interest in getting people to not use C’s competitors, and therefore cannot be trusted as an objective source.

And “I never said anything about Java or .Net…” You didn’t say anything about Pascal or anything, you just bashed C.

That’s because, even though Pascal is my language of choice, I don’t think it’s the only suitable thing out there. But out of all the languages out there, the only ones I see that are in widespread use and actively causing damage are the C family. (And PHP, but that’s a completely different topic.)

According to the article, it was related to known vulnerabilities. How does that prove C was the problem and it was not the best design for an OS?

Because the “known vulnerability” was a buffer overflow, a flaw in the C language, and one that C makes very easy to create.

It’s possible to create a buffer overflow bug in Pascal, but you have to really go out of your way to do it. (Including, in modern Pascal dialects, turning off the compiler’s bounds checking that’s designed to make this impossible.) In managed languages, it’s even harder, since you don’t have the option to turn off the bounds checking. But in C, it’s trivial, as there are no bounds checks, either generated by the compiler at runtime or enforced by the type system. (It’s perfectly legal to declare a char[20] and then write to index 28. In Pascal, that’s a compiler error.)
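The char[20] claim is easy to demonstrate. A minimal sketch (hypothetical, for illustration only); the offending store stays commented out because executing it is undefined behavior:

```c
#include <stddef.h>

/* The out-of-bounds store below is accepted under the C language
 * rules: there is no bounds check in the type system and none
 * generated at run time. (Some modern compilers can optionally warn,
 * but the language itself requires no diagnostic.) */
static char buf[20];

void overrun(void) {
    /* buf[28] = 'X'; */   /* accepted by C; a compile-time range error in Pascal */
}

/* The compiler knows the real size perfectly well; it just doesn't
 * enforce it on accesses. */
size_t buf_capacity(void) {
    return sizeof buf;
}
```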

If you want to truly understand why the C family is unsuitable for operating systems, try monitoring Windows Update (or the update system on your OS of choice) for a while. Have a look at how often security patches show up to fix cases where “a carefully crafted [foo] could allow a malicious user to take control of the system.” Those are buffer overrun exploits, and they keep coming. They keep happening, over and over and over again.

We’re coming up on the 25-year anniversary of the Morris Worm. A quarter century later, we’re still making the same mistake again and again and again, because the language is flawed and makes it very easy to make that mistake! One of my coworkers likes to say that Dennis Ritchie’s true legacy is the buffer overrun, and you know what? He’s right.

Robert (profile) says:

Re: Re: Re:6 Who do we fear?

Is a buffer overrun not a programmer problem?

Why is it the fault of the language that developers don’t understand its limitations?

There’s a reason people chose C, and so far you have only highlighted a limitation. The primary problem can be boiled down to two human factors:
1) Buffer overruns are due to bad coding (design, implementation, typos, errors).
2) Developers don’t understand 1), or don’t care, or are told not to care.

I won’t fault C because of code design.

People have to understand what they are doing with the tools they use.

I don’t agree with the philosophy that compilers should protect against such behaviour. In some cases it may be desirable. If someone does it, well, then it is their fault, and I would hesitate to blame the language for the coder’s mistake.

Man, you are bitter about C.

Out of curiosity, what language did you develop? No disrespect meant, but you sound like, “Everyone is using this POS language with this flaw instead of mine, which is flawless.”

Your words indicate you’re really pissed, and it just seems like a redirection of anger, aimed not at the coders but at the tool they used.

Robert (profile) says:

Re: Re: Re:7 Who do we fear?

Another thing, in your example of arrays..

Indexing past the array is permissible because of the flexibility. You create an array for a contiguous memory allocation. You can access it with pointers or the index, which I will hazard a guess works out to the same thing when converted to assembly?
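That guess is correct at the language level: the C standard defines a[i] as *(a + i), so the index form and the pointer form are the same access. A quick sanity check:

```c
#include <assert.h>

/* a[i] is defined by the C standard as *(a + i), so indexing and
 * pointer arithmetic are literally the same operation. */
void index_pointer_demo(void) {
    int a[5] = {10, 20, 30, 40, 50};
    int *p = a;                 /* the array decays to &a[0] */

    assert(a[3] == *(a + 3));   /* index form vs. pointer form */
    assert(p[3] == *(p + 3));   /* identical through a pointer */
    assert(&a[3] == a + 3);     /* same address either way */
}
```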

How is that a design flaw? Restricting is not useful.

You have to keep your pointers in check! That’s a developer problem, not a language problem.

Why should the language restrict such behaviour?

If MS would push their developers to dot the i’s and cross their t’s and actually check their code for such mistakes in coding practices, this would not be a problem.

Unless the hardware caused this: corruption of pointers due to some underlying problem in the context switch, or a misconfiguration of a CPU register?

I don’t know, but aside from pointer corruption caused by a glitch in the hardware (timing?), the only problem with accessing beyond the bounds of an array is due to bad coding!

Easy mistake, but what would be the loss to restrict C to protect against accessing beyond bounds? How can you tell what will happen at run-time? How do you know it’s not intentional?

Mason Wheeler (profile) says:

Re: Re: Re:8 Who do we fear?

You create an array for a contiguous memory allocation. You can access it with pointers or the index, which I will hazard a guess works out to the same thing when converted to assembly?

How is that a design flaw? Restricting is not useful.

Of course restricting is useful, because it provides proof that your code will not index beyond the allocation. If you want to get a pointer to an element, you have to start from an index anyway, so any sane type system will ensure that your starting point is inside the array. (But C’s won’t.) So the real problem here isn’t using the pointer, but moving the pointer.

If I had to guess, I’d say you were thinking about iteration when you wrote that. And iteration with a pointer can actually produce slightly more efficient ASM than iteration with an index variable, especially if you’re doing something non-trivial inside your loop. It’s a trick I’ve used myself a time or two, when efficiency was at a premium. But there are two points to keep in mind here.

First, most of the time, efficiency is not at a premium. Even modern “embedded systems” often have hundreds of MHz and tens or hundreds of MB of RAM at their disposal thanks to Moore’s Law.

And second, even with pointer-based indexing, you can still stay within bounds when iterating an array. Just off the top of my head, a for..in loop could easily be implemented this way.
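A sketch of that pattern in C (not taken from any particular codebase): the loop condition itself keeps the pointer inside the allocation, which is essentially what a for..in or range-for construct lowers to.

```c
#include <stddef.h>

/* Pointer-based iteration that cannot walk past the allocation:
 * 'end' is the one-past-the-end pointer (which C explicitly permits
 * to be formed, though not dereferenced), and the loop stops exactly
 * there. No index variable, no per-iteration bounds arithmetic. */
long sum(const int *a, size_t n) {
    long total = 0;
    for (const int *p = a, *end = a + n; p != end; ++p)
        total += *p;
    return total;
}
```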

If MS would push their developers to dot the i’s and cross their t’s and actually check their code for such mistakes in coding practices, this would not be a problem.

…except that Microsoft is hardly the only source of these errors. They show up in Linux. They show up in Firefox and Chrome. They show up in major open-source projects, exactly the place where “Linus’s Law” predicts they should not, because it’s such an easy mistake to make, and such an easy mistake to miss when reviewing it.

How can you tell what will happen at run-time?

With run-time bounds checking, generated automatically by the compiler. That’s how modern languages ensure that buffer overruns do not occur in dynamic arrays, whose sizes are not known at compile time. This is not a hard problem, no matter how confusing you try to make it sound.
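What such a compiler emits can be sketched in C itself (the struct and function names here are invented for illustration): the length travels with the data, and every indexed access goes through a comparison, so an out-of-range index becomes a range error instead of a silent out-of-bounds write.

```c
#include <stdio.h>
#include <stdlib.h>

/* A dynamic array as a bounds-checking language sees it: the length
 * is stored alongside the data. */
struct dyn_array {
    size_t len;
    int *data;
};

/* Roughly the check that a Pascal/Java/C# compiler inserts
 * automatically around every indexed access. */
int checked_get(const struct dyn_array *a, size_t i) {
    if (i >= a->len) {
        fprintf(stderr, "range check error: index %zu, length %zu\n",
                i, a->len);
        abort();               /* raise instead of touching bad memory */
    }
    return a->data[i];
}
```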

How do you know it’s not intentional?

I addressed this in my other response. Please provide a legitimate scenario for doing this intentionally, otherwise this argument has no validity.

Robert (profile) says:

Re: Re: Re:9 Who do we fear?

“With run-time bounds checking, generated automatically by the compiler. That’s how modern languages ensure that buffer overruns do not occur in dynamic arrays, whose sizes are not known at compile time. This is not a hard problem, no matter how confusing you try to make it sound.”

Can you give examples of such languages?

Are any used for drivers? Could any be used for OS development?

Also, not all embedded systems have that much RAM. I firmly do not believe MHz and MB are excuses to bloat your code, not that you’re implying it, but I don’t think you should encourage it.

Bytes are free, yes, but you should not use them just because you can!

I have many MSP430 dev boards, 2012s (not the year, the model), which have only 128B of RAM. That makes it fun! I could code in assembly, and have, but I prefer C so I can focus on the algorithm. I don’t import libraries because of the limited flash size (2kB).

I am aware of the Raspberry Pi and would love to have it, but back orders, having an infant at home, house maintenance, the desire to play/write music, and the expectation of my employer that I will become a CRM Dynamics developer… those prevent me from having the fun I would love.

Though I do ask: can you give examples, actual code samples, of C code that leaves a buffer overrun wide open for someone to exploit or trigger themselves?

And examples of languages that prevent such possibilities.

Remember, questions like “What language is OS X written in?” are not the same as sarcastic “And how’d that work out for you?” comments!

Mason Wheeler (profile) says:

Re: Re: Re:10 Who do we fear?

Can you give examples of such languages?

On the native side, Delphi, or other dialects of Object Pascal. (Also Haskell, I think.)

On the non-native side, anything that runs on the CLR or the JVM.

Are any used for drivers? Could any be used for OS development?

Delphi used to be used for drivers, until Microsoft changed the driver format and Borland neglected to update the special linker required. Were that to change, there’s no reason why it couldn’t be used for drivers or OS work.

Also, not all embedded systems have that much RAM. I firmly do not believe MHz and MB are excuses to bloat your code, not that you’re implying it, but I don’t think you should encourage it.

Bytes are free, yes, but you should not use them just because you can!

Oh, I definitely agree with you there, I just think that there need to be limits. When people talk about not using bounds checking, they invariably attempt to justify it by talking about the performance impact, as if their code was still going to be run on 1980s hardware, and that’s just plain ridiculous in this day and age! This is exactly the sort of scenario that the old adage about “premature optimization” was designed for.

Though I do ask, can you give examples, actual code samples, that leave buffer overruns wide open in C that someone can exploit or cause themselves?

Personally, off the top of my head? No. I try to avoid C whenever possible. However, I know where you can find plenty of real-world examples. Go to the bugtracker for Mozilla Firefox, or for any major open-source *nix project, and run a search for “buffer overflow” or “buffer overrun”, and you’ll get results, complete with code samples.

And examples of languages that prevent such possibilities.

See above.

Mason Wheeler (profile) says:

Re: Re: Re:7 Who do we fear?

Yes, a buffer overrun bug existing in code is a programmer problem. But a buffer overrun bug successfully executing is not a programmer problem. It’s a language problem, as evidenced by the simple fact that languages exist in which it’s possible to create buffers but not possible to execute buffer overruns!

Imagine a chainsaw that had a button that would cause the chain to disconnect. Clearly, there’s a legitimate use for this feature: maintenance. But now imagine that this button was located right by the most convenient place for the user to put his thumb while he was holding the trigger down, and that there was no safety system whatsoever in place to prevent the chain-disconnection feature from being activated while the motor was active.

And then someone’s using it one day, and they accidentally bump the button with their thumb, and the chain flies off right into their face. Would you say “oh, that’s the user’s fault; he should have understood the tool he was using” or would you say “why in the world was he using such an unsafe tool in the first place?!?”

Thousands of people end up in the hospital because of this, but people keep buying this model of chainsaw, because the slick salesmen at the hardware store keep reassuring them that this chainsaw is just fine. “Oh, no, that thing they talk about on the news, that’s just a bunch of sensationalism. You know how journalists are, always jumping on any story they can find. They need ratings, y’know? But this is perfectly safe; all you need to do is understand the tool and know what you’re doing. Those guys they’re talking about on the news, well, I hate to say it, but that’s really their own fault. They weren’t careful. They weren’t smart enough to understand what they’re doing. But you’re not like that… right?”

I’m sure you can agree how ridiculous this scenario would be with a physical tool. Why, then, should it be considered acceptable in the virtual development world?

You say it’s all just developer error. If it had only happened the one time, I’d be inclined to agree with you. But when there’s a clearly-established pattern of the same error over and over and over again, spanning multiple decades and thousands of developers, all of them making the same mistake even when they know about it, simply because that mistake is so easy to make, that’s not developer error. That’s a bad tool that makes it far too easy to make a certain mistake. (Are you familiar with “Unsafe At Any Speed”? What I’m describing is analogous to the difference between an unsafe driver and an unsafe car.)

You assert that “in some cases it may be desirable.” I find this idea absurd. Extraordinary claims require extraordinary evidence. In what situations would it be desirable to deliberately create a buffer overrun in your own code? (Legitimate scenarios only, please. The obvious answer, “writing malware,” does not count.)

And no, I didn’t develop any language. But I’ve been working with computers, both as a programmer and as a user, for decades, and I’ve seen the damage that widespread acceptance and usage of C has done, from both sides. I’m well aware that there’s no such thing as a flawless language, but there’s a huge distance between “no language is perfect” and “therefore we should not reject a language known to have serious safety flaws, because competing languages which do not have serious safety flaws are not perfect either.” But apparently your mind is simply better-equipped than mine at making leaps of logic of that magnitude…

Robert (profile) says:

Re: Re: Re:8 Who do we fear?

Any particular reason you’re attempting to flame?

I’m trying to have an intelligent discussion and you’re attempting to behave like a troll. When you hurl insults, not disagree, then that’s not debating.

You could say, “You sound like a naive programmer, and here’s why…” and explain. But you didn’t. And your analogies are really horrible too. People could hold the chainsaw incorrectly and harm themselves as well! People using radial arm saws have lost their hands, and many shops have stopped using them as a result, but is that the tool’s fault?

A chain release button, to me, is not even on the same level as a buffer overrun.

Please explain how following proper rules of programming results in buffer overflows. If the developer guarded against that with if-statements, how can a buffer overflow result from accessing an array? Please also clarify the data structures used in your example, such as an array with an index, an array of pointers, or a simple buffer and pointer.

Then please explain: if C is so dangerous, why are these other languages you say are safer not being used? Why is no one using them? Is it really the result of K&R, or is it ease of development and portability?

How would you design the OS, and what language would you use? I’m not Tanenbaum, I am not Torvalds. I’ve not been lucky enough to edit the Linux kernel, though I’d love to. I’ve modified drivers when I could and edited board support packages on embedded systems. But no, I didn’t write a compiler or an OS. I studied EE. I have done many projects at home, from the ground up in hardware and software, and my language of choice is C. But that’s me, that’s my experience. I’ve been warned to protect my code and not leave it open to the problems buffer overflows cause.

My primary roles were testing, system testing, debugging, etc… but I listened to their warnings.

Why do thousands of developers do that? Because of poor coding standards? Because of laziness? Because they don’t test their code enough? They are focused on producing instead of quality? They tested their own work without trying to break it?

So yes, if you want to be productive in this debate, rather than sarcastic and inflammatory, you could explain why other languages are not used. Give logical reasons please. I don’t know if I can fully believe it was the sales pitch of C that resulted in its adoption for just about every hardware interfacing software development project.

You can’t be the only one, and I’ve seen books on avoiding buffer overflows, so why has no one else adopted it?

And if C were unsafe at any speed, why would it be so widely adopted? Why would it not have been dropped? It cannot be a conspiracy! I don’t buy that; unless you can provide some proof of it, it just does not seem logical.

If you choose to flame again, I won’t bother replying. That’s highly counter productive. I do not feel I have flamed you at all, questioned, yes, but I think you’re inserting tones that are not present.

If I wanted to flame you, it would be blatantly obvious.

Mason Wheeler (profile) says:

Re: Re: Re:9 Who do we fear?

This sudden defensiveness is a bit surprising. I wasn’t trying to flame anyone either. Sure, I snarked at you a little there at the end, but was it any worse than the armchair psychoanalysis attempt where you claimed, with no evidence, that I was most likely a bitter language designer whose language had failed? That’s not flaming, but me saying that you apparently have a superior capacity for making huge leaps of logic is? Color me confuzzled here. As you said, if I wanted to flame you, it would be blatantly obvious.

Then please explain, if C was so dangerous, why are these other languages you say are safer not being used? Why is it no one is using them? Is it really the result of K&R or is it ease of development and portability?

Well, let’s analyze this one objectively:

Ease of development? Only if compared against ASM. Just look at the focus of this thread: In C, you have to manually write your own bounds checks! And that is demonstrably something that people get wrong, a lot. Nothing easy there.

Portability? Not for OSes, which is what we’re talking about here. OSes have many design goals, but they’re very low-level software, designed to interface with hardware. Portability is not one of them, by definition.

How would you design the OS and what language would you use? I'm not Tanenbaum, I am not Torvalds.

And neither am I, nor would I like to be. Their ideas have found some acceptance in the server world, but outside of that particular niche, both of their philosophies have largely been a failure. The computer revolution has been driven by home users, built on the OS principles pioneered by Apple and popularized by Microsoft.

But that's me, that's my experience. I've been warned to protect my code and not leave it open to the problems buffer overflows cause.

Are you sure? Like I said in my other reply, these bugs have a long history of showing up in exactly the places where you would think they wouldn’t.

You know FreeBSD? Widely renowned for its obsessive attention to security? Known for its strict standards and review practices? Well, have a look at http://securitytracker.com/id/1026460. Just last year, someone found a buffer overflow exploit in FreeBSD's Telnet implementation. This is freaking Telnet we're talking about! A well-understood protocol that's been around for decades. I bet whoever wrote that didn't think they'd left their code open to buffer overflow problems either. But they did.

Why do thousands of developers do that? Because of poor coding standards? Because of laziness? Because they don’t test their code enough? They are focused on producing instead of quality?

Because they’re human, and humans make mistakes. And when you see people making the exact same mistakes so consistently, even when they know not to, even when they’re trying not to but they just slipped up at some point and then someone ends up getting hacked because of an honest mistake, shouldn’t you at least consider that maybe, just possibly, it might be happening because of a design flaw in the tool that just makes it too easy to make that particular mistake?

I don’t know if I can fully believe it was the sales pitch of C that resulted in its adoption for just about every hardware interfacing software development project.

That’s what got it adopted at first. C came out of Bell Labs, with the full force of AT&T’s marketing and branding behind it. Never underestimate the power of a good sponsor. (Do you really think anyone would be using Java today if it had been invented by J. Random Hacker in his garage, and not by Sun?)

Nowadays, I’d say a bigger factor is inertia, the idea that “this is how we do it because this is how we’ve always done it.” Everyone knows that C and its derivatives are the only way to write an OS, so that’s how everyone does it. People talk about portability. They say that it’s because a C compiler is one of the first things anyone writes for any new platform. But that’s just a chicken-and-egg argument; people write C compilers for new platforms because everyone expects there to be one, because that’s how it’s always been for other new platforms.

And if C was unsafe at any speed, why would it be so widely adopted? Why would it not have been dropped? It cannot be a conspiracy!

Umm… what’s with the C word? I never said anything about a conspiracy. Again, please don’t put words in my mouth.

And I note that you haven't actually provided a single example of a case where a developer might legitimately want to create a buffer overflow on purpose. As expected.

JEDIDIAH says:

Re: Re: Re: Who do we fear?

What you are describing is a generic buffer overflow bug. In order to exploit such a bug, a large number of assumptions have to be true. An “inert data” format is not executed under normal circumstances. So any condition that allows a movie to be an executable requires a bit of planetary alignment. Things like platform, player, payload all have to line up just so.

This is even trickier when you don't have a "standard" decoder, due to the fact that a format is not associated with a particular software vendor or their exclusive proprietary application.

Even PDF vulnerabilities can be blunted by not using the “official” decoder.

An uninviting environment slows down the spread of pathogens of all kinds.

Jon B. (profile) says:

Re: Who do we fear?

You have to TRY to download a movie from a torrent site with malware.

Like AC said, if it’s a movie file, it can’t contain malware…

However, if you go to some shady sites that have a promise of good quality, recent releases, they’ll try to sucker you in by getting you to install their download manager, or fill out an offer for a credit card before you get a download link… but still, you have to be really committed to downloading malware if you make it that far.

The downside to downloaded movies is the lack of metadata. You sometimes have to do some manual labor to get the movie put into your library where all your media players and devices will recognize what it is. For me at home, though, that has all been solved with Plex Server. Plex just figures out what movie it is based on the name of the file, downloads cover art, lots of metadata, and appropriate video thumbnails automatically as soon as it's done downloading, and it automatically plays on any device in my house.

For me, the sad thing for Hollywood is that I now get a better PACKAGED experience from my downloaded movies – the cover art, plot summaries, etc, all directly onscreen. The official means of playing an Ultraviolet movie don’t give me that experience.

Rich says:

Re: Re: Re:2 Who do we fear?

As I said in my post above, executability has nothing to do with it. Malware in a non-executable file takes advantage of flaws in the software that uses it. Imagine a hypothetical text editor that didn't handle files with extremely long lines, or very large file sizes, etc., very well. A malicious programmer would look for a way to exploit this, so that when his "text file" is loaded, the editor would clobber its own code with that embedded within the file. Now, modern OSes will mark sections of memory containing code (ironically called text sections!) as non-writable, but older ones didn't. You can even find examples of JPEG files that had viruses in them because of this.

PaulT (profile) says:

Re: Re: Re:4 Who do we fear?

“You also have to factor in OS, OS version, and microprocessor architecture.”

That's true of any malware, though. OSX and Linux users have traditionally been able to sit smug and laugh at Windows users with their malware problems, even if they're using the same infected files. Meanwhile, the vast majority of malware takes advantage of flaws that have already been patched – they just take advantage of those who haven't bothered updating/patching yet. That won't be 100%, but it might be a non-trivial number above 0%.

There's no such thing as malware that hits every machine, but that's not relevant. A successful zombie botnet might only need a few thousand infections, and as we've seen, even technically proficient users might let their guard down when opening non-executable files. You don't need to get everybody, just enough to make your aims successful.

gorehound (profile) says:

Re: Who do we fear?

Fuck the MAFIAA! At this point in the game I would not give a shit what they are working on or putting out. I will purposely ignore them. I will make sure they never can get into my Wallet.
If I need to see something so bad (which I doubt I have to) I will just Opt to Buy A Used Physical Copy. That way they do not get a dime from me.

I will Buy and Support Local and Indie Non-Hollywood and Non-DRM Art !!!

art guerrilla (profile) says:

Re: Who do we fear?

first things first: WHO does NOT like ‘free’ stuff ? ? ?
EVERYONE likes free stuff, EVERY-fucking-PERSON on this planet…

so i REALLY hate the copy maximalists constant tsk-tsking about freetards who only want free…
EVERYONE wants free…
(AND the copy maximalists MOST of all: they want free rent for NO effort…)

lastly, for any kampers here who have dealt with entitled rich people (which is most ALL rich people), they will KNOW that rich people are the FIRST in line to not only get ‘free’ stuff, but to stiff underlings, hirelings, and other bidnesses ALL THE FUCKING TIME…

’cause -you know- they deserve to have all their shit free, unlike us li’l peeps of no means…

art guerrilla
aka ann archy
eof

Anonymous Coward says:

Re: Re: Re:

The funny part is that it will fail because the DRM system they’ve put in place is so impossibly frustrating to get working.

You would hope that they would learn something when it fails, but I suspect it will instead be twisted around as evidence that “pirates” won’t purchase from legal channels even when those channels are available for free.

weneedhelp (profile) says:

Free?

Free with DRM or free without it? Well, that took all of a millisecond. Free without DRM, duh.
I'll pay for DRM-free stuff.

Fuhk man, how many times do we have to tell you?
I want to be able to use my content however I damn well please. I want to play it on my laptop, desktop, iPad, or Windows phone… whatever.

You can take my money and use it how you wish. I will use my legally purchased content as I wish, and if I can't, I just won't buy it. There is your lost sale.

Michael (profile) says:

Re: Re:

I cut the cable cord a couple of years ago. I have several VUDU-enabled devices (Blu-Ray player, PS3, a smart TV, and a Sony streaming box) and I use it pretty regularly.

VUDU is really easy to use. Wal-Mart set it up pretty well and I find it more convenient for rentals than anything else. Once I had entered my credit card number, it became a click it and rent/buy it (well, license it) process.

I had not originally realized it was UV behind the ‘purchases’ until I installed Flixter on my phone and the movies just appeared there.

Now, the bad. You need a UV license for the content AND it has to be available on the service. So, I can watch some movies on my phone but not on any of my TVs, and I can watch some movies on the TVs, but not my phone – that's pretty annoying.

Overall, I would say for the average consumer, it is much easier than my DLNA server setup (to ditch those plastic disk things) because the DLNA ‘standard’ is still so un-standard that unless things are encoded correctly, they may only play on some of the devices (BTW – Sony is the worst, their Blu-Ray player, streaming box, and PS3 all support different file types).

JEDIDIAH says:

Re: Re: Re: A really wrong approach (DLNA)

If you are going to have a media server, DLNA is about the worst thing you can use it for. Most DLNA clients suck really badly. They tend to be the best argument for HTPCs.

Alternately, you can use a better streamer like a Roku and run 3rd party software on it. Apps like Plex and XBMC are light years ahead of the DLNA clients baked into most TVs, Stereos, or BluRay players.

Zos (profile) says:

Re: Re: Re:2 A really wrong approach (DLNA)

roku + plex ftw. i can play pretty much anything, anywhere, anytime. I’ve never seen a virus in a torrent outside of a couple dodgy game cracks from random unknown uploaders.

so long as you're not running random .exe's while attempting to download from shady sites, you're probably never going to see a virus in a download.

explicit coward (profile) says:

Consumer’s Free <> Hollywood’s Free – it’s that simple.

To me entertainment freedom means I get to watch what I want, when I want, where I want, on the device I want, without any advertisement, without any FBI warning, and with the means to make backups – this sort of freedom is certainly worth a few of my bucks.

A DRM-, advertisement-, FBI-warning-laden construct is worthless to me.

Dear Hollywood, get back to me when you've got an offer that is worth the name…

That One Guy (profile) says:

While I’m sure this’ll initially draw in a few people, lured in by the idea of ‘free movies’, if the service is even half as bad as I keep hearing, I get the feeling the vast majority will decide that it’s just not worth it.

This of course will lead to the rather funny point that the service is so bad they can’t even give it away.

Also, you missed the best part of the source article: to get the movies, the customer has to buy either a new TV or a Blu-ray player, which makes them anything but 'free'.

Zakida Paul says:

DRM – so pointless. Amazon use DRM on their e-books. I took them off my Kindle, stripped the DRM using Calibre, converted them to EPUB format and have them sitting on my external hard drive.

This is the problem with DRM. It is easily circumvented, it does not stop piracy and the only result is pissing off paying customers. Why can’t the entertainment industries see this?

Vincent Clement (profile) says:

Re: Re:

DRM is completely pointless. My wife recently got into the whole Resident Evil movie series, so while she was out one day she bought a bunch of Resident Evil DVDs. 3 of the 4 discs played fine in our desktop computer. The fourth wouldn’t play.

I opened DVDFab, ripped a DRM-free, region-free copy, burned that to a blank DVD (for the time being it's just easier to do it this way for my wife), and voila, she could watch the movie on the computer.

The only thing DRM achieved in this case is frustrating my wife, inconveniencing me and using up a blank DVD.

hegemon13 says:

Not worth it. At all.

I got the ten free movies by mistake. I was impressed with the quality of the one free HDX rental from Vudu that came with my Blu-ray player, so I decided to try out actually registering the only Ultraviolet-compatible Blu-ray I owned.

What did I get? Well, the Vudu app doesn’t work on any of my mobile devices, so I got the ability to watch a digital version of my Blu-ray disc through the app…wait for it…on my Blu-ray player. Because the extra ten steps to put the disc in is totally worth losing HD Audio for.

Oh, and I also got 10 crappy movies in SD that I will never, ever watch.

Anonymous Coward says:

Pirate on

I’m sure that Hollywood has some smart people that can tell them how to make a boatload of money on the Internet and yet all we get is Ultraviolet.

I’ve looked at it from different angles trying to figure out just what their hold up is, and all I can figure out is that they want people to pirate…and then it hit me!

BLANK DVD SALES!

When pirates download their booty, they usually make a backup on DVD, and in many places Hollywood gets a cut of all the DVD sales. So…
Ultraviolet=
increased pirating=
increase in DVD sales=
$,$$$,$$$,$$$

All that stuff about infringement and DMCA notices and all that anti-pirate rhetoric is just a smoke screen!

What they really want, is for you to be a pirate!

Yeah, that’s it…has to be!

Robert (profile) says:

@Mason Wheeler

You can be quite snarky. I didn't put words into your mouth; if I ask a question, it is actually that: a question.

And I said you SOUND like a bitter person because all you did was trash talk C.

Only after many posts does your reasoning come out. You can’t say “it’s obvious” because how many people here honestly know C? Or write code that talks to hardware?

Anyway, I’ll look for myself for code examples then. I figured, based upon your word choice against C, that you had specific coding examples.

I am not a hacker, no time, and I’m not a CS major, so I don’t know all the ins-and-outs of compilers or how OS’s manage how an application runs. I do understand the processor level though, with pipelines and such, and the DMA etc…

That’s why I asked for examples, of some code that can be exploited and how it is exploited.

Generally, if I am against something to the point of bashing it, I would prefer to be able to give examples off the top of my head, so during a conversation I could back it up. That’s NOT A DIG AT YOU! It’s just how I debate.

I can’t think of any time I’ve written code that would, by itself, overflow a buffer.

That’s why I asked for examples.

I’ll consult the web for examples.

Thanks.

Cheers!

Wally (profile) says:

Wow

UltraViolet is so useless in general. The movies you “own” can only be watched once for 48 hours every two weeks whether it’s streamed to you or on any device that “supports” UltraViolet.

Outside of the DRM of UltraViolet, it's a pain in the arse to set up as well. First you have to make an account, then you have to register the movie to your account from the computer. Next you have to register your device exclusively to UltraViolet…otherwise no dice. Finally you have to put in the code that comes with your movie each and every time you wish to watch the movie you purchased away from the TV set…

The fact that they are giving away free movies (with a catch) is a sign of desperation that the UltraViolet format is dying…but when you see how much of a hassle it is, why should anyone be surprised that it is?

Fergal (profile) says:

Re: Wow

What a bunch of whiners! I have an UltraViolet account with about 30 movies. I can watch any of them on my home theater with the Vudu app in my BD player. Most of them came free with DVDs/BDs I bought, but I paid for a few directly on Vudu and Sony's site. I watched one on my Kindle Fire the other day, and one on my iPad a few weeks ago. I can download or stream them to my Android phone using the Flixster app (although I don't really care about watching movies on my phone). I've never tripped over any of those "massive strings" from DRM. It just works (except for frequent bugs in the Flixster service). I had to create a new UltraViolet account and Flixster account and it was such a pain to set up … took me a whole 30 seconds! Then another unbearable 20 seconds to link my Vudu account (already had it). I'm still recovering from the mental anguish of multiple accounts. (But no complaints that I had to create a Techdirt account. 😉)

I just don’t get all this negativity about evil DRM, but maybe that’s just because I don’t use Linux in my home theater. But why deal with the pain when my BD player and my cheap Roku box stream UV just fine?

BTW, the post I’m responding to is full of shite. There’s no 48-hour limit, I never had to register any player, and I never had to enter a movie code more than once.

kyle clements (profile) says:

I really wish the film and music industries would offer a simple service where I could just go to their website, pick a piece of content, and pay $5 to support those involved in the creation of that content, then I’d get an email saying I was now licensed to own a copy of that content in any form.

Then I can go to pirate bay, iso hunt, or wherever, download the content, and it would be perfectly legal and paid for, and that would be the end of it.

Tom R says:

Industry Imbeciles

The answer is dead simple. I will not buy anything (film, music, game) that uses DRM.

Fortunately I am not interested in most of the infantile BS that comes out of Hollywood and the so-called "Music Industry" these days (which mainly consists of issuing soft-porn videos, all with the same inhuman and inflexible beat, boring two-chord backing, and a three-note melody – if there is any melody at all).

I pity the younger generation that does not know any better than to consume the puerile rubbish that makes up mainstream entertainment in the 21st century.
