Why Making Developers Liable For Security Vulnerabilities Won't Work

from the it's-a-problem dept

It seems you can’t go more than a few months without someone, somewhere raising the question of whether developers of software products should be liable for the security vulnerabilities in those products. Frustration with buggy software runs high, and so out come the suggestions that lemon laws should apply and that execs at companies that make buggy software should go to jail. The problem, though, is that software will remain insecure no matter what’s done, and adding liability might actually make the problem worse. Now along comes Howard Schmidt saying that instead of companies being held responsible, the individual developers should be held liable for security vulnerabilities. This is the same Howard Schmidt who announced 10 months ago that technology would solve the phishing problem within a year. He’s got two months to go, and last we checked, phishing was still a growing problem. Should we hold him liable for falsely claiming that phishing would be gone? That’s the crux of the problem: no matter how careful a developer is, there will always be holes he can’t foresee. Making developers liable will mainly cause a few things to happen: vastly fewer programmers will be willing to work on security issues, since it’s just not worth the risk, and fewer companies will even try to make security products. Beyond that, just about every product you buy will be wrapped in pages and pages of legalese declaring that the product isn’t at all secure, as vendors try to shield themselves from liability. None of that helps anyone actually build more secure applications. Schmidt is right that developers need better training in computer security, but that doesn’t mean adding liability without looking at the unintended consequences of such an action. Update: There’s an update to this story, in which Schmidt clarifies that he was talking about accountability, not liability, and that the ZDNet article misconstrued his comments.


Comments on “Why Making Developers Liable For Security Vulnerabilities Won't Work”

Tom says:

Why Making Developers Liable For Security Vulnerab

Well, they should not be responsible for security per se, but they should be responsible for poorly written software that was not tested properly or was rushed out the door. Too many times in my day have I had to explain to a customer that they purchased sub-standard software to save a buck. The people who wrote it should be liable for the headaches caused by saving a couple of weeks or a few dollars.

AC says:

Re: Why Making Developers Liable For Security Vuln

Yeah, right, because it’s the developers’ idea to skip all QA and ship before the end of the quarter to boost profits. Devs usually want to do good work, but the suits only care about making money. There’s bound to be a conflict, and who do you think usually wins? That’s right: the folks who sign the checks.

What really kills me about this is how biased it is. In effect it says: let’s punish all the techno-serfs instead of holding the corporation’s management responsible for their decisions.

John says:

Re: Why Making Developers Liable For Security Vuln

Sure, blame the developer… As a developer I can say that this is BS. Bugs will live forever. But the number of bugs is always directly related to management. Some clients don’t want to pay the extra dollars for QA and full testing, nor do they want to wait the time it takes to properly test, fix, and retest software.

And if you really want to get rid of bugs, you have to start with the operating system, and maybe as low as the hardware. All of this technology is built in layers, and if a lower layer has a bug, it will show through to the programs on top.

Bill says:

Read the EULA

At best, you should be able to get back the cost of the software. That’s how it works for most manufactured products. If I sell you a $1.00 nut and bolt, and because it’s defective you want me to buy you a new car, or whatever else failed, you’re crazy. I could build some liability into the price, but then each nut and bolt is going to cost you $10.

Happy user says:

Re: What about the underlying code?

Truth is, no one held a gun to your head and told you that “you must use XX Software or you will die.”

It was your choice to use XX Software. By not doing your research on the company before purchasing the software, you remain the only one to blame here.

And if there is no other software that will get the job done except the one with “bugs” or “security holes”, then that is part of the price you pay for wanting to use that software.

Humans are too quick to blame others when things don’t work 100% as expected. I am sure it was hardly the intent of the developer to create “flawed” software… or maybe it was?

Signed, Happy user.

S says:

No Subject Given

The points above about cutting out QA, and about security issues in underlying software and hardware, are absolutely real. We face the “we don’t have money for QA” problem every year when the budget is outlined.

Here’s another example of the security-versus-bugs debate. Our help system uses compiled HTML help files. Suddenly our help systems stopped displaying the HTML when accessed across the network. It turns out the problem was a security fix implemented in an OS patch. So now what? Do we hold the OS company liable? Their developers, for not foreseeing this problem years before we developed our help system? No.

We will deal with it and make things work eventually; for now we have a workaround. Times change, businesses change, software changes, security issues change. The technology world is not static, and nobody has a crystal ball. Instead of wasting time and money pointing fingers, put effort and money into positive change and solutions, and MOVE ON!

(stepping down from soap box…)

Y Pennog Coch (profile) says:

A second computer for all would be cheaper

Fit a second network in your office, or use wireless, and buy everyone a small quiet mini-ITX PC (they’re fast enough for surfing) and a KVM switch.
Make it clear that frivolous use of the work systems instead of the ITX will be a sacking offence. For personal mail, get a webmail account and access it from the ITX box. Similarly, placing commercial data on the ITX boxes is a serious offence.
Now you can lock down your original business network and apply some aggressive filtering.
If anyone wants to check out a filtered site for business purposes, they do it from the ITX box then put in an unblocking request. Requests could be granted automatically for established employees and domains that aren’t explicitly blacklisted.
Now, when a PC goes down to a security hole, 19 times out of 20 it’ll be one of the ITX boxes, with no data to rescue, and all the IT guys have to do is boot it from a recovery disc. And if you use Linux, you can leave the discs with the employees.
If this sounds too expensive, well so long as Moore’s Law keeps going it can only get cheaper.
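The auto-grant unblocking policy this comment describes is simple enough to sketch in code. A minimal, purely illustrative version follows; the tenure threshold, return values, and domain names are all invented for the example, not taken from the comment:

```python
# Sketch of the unblock-request policy described above: requests are granted
# automatically for established employees when the domain is not explicitly
# blacklisted; anything else goes to manual review.
# All names and thresholds here are hypothetical.

BLACKLIST = {"malware.example", "phish.example"}

def decide_unblock(employee_tenure_days: int, domain: str,
                   min_tenure_days: int = 90) -> str:
    """Return 'granted' for an automatic approval, else 'manual-review'."""
    if domain in BLACKLIST:
        return "manual-review"   # explicitly blacklisted: never auto-grant
    if employee_tenure_days < min_tenure_days:
        return "manual-review"   # new hires need a human sign-off
    return "granted"

print(decide_unblock(365, "supplier.example"))  # established employee, clean domain
print(decide_unblock(30, "supplier.example"))   # new hire: manual review
print(decide_unblock(365, "phish.example"))     # blacklisted: manual review
```

The point of keeping the policy this mechanical is that it matches the comment’s goal: routine requests clear automatically, and only the genuinely risky cases cost anyone’s time.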
