Should It Be Illegal To Get Hacked?

from the might-be-a-bit-extreme dept

A few years back, we asked if it should be illegal to get hacked. In that case, we were referring to some fines that the FTC had handed out to companies that had leaked data to hackers. This raised some troubling questions -- as it's often difficult-to-impossible to stop your computer systems from getting hacked, and putting liability on the company could lead to some serious unintended consequences. Yet, at the same time, over the past few years, we've heard about large security breaches on a regular basis (thanks, in large part, to new disclosure laws) -- and often those breaches definitely seem to be due to negligence on the part of a corporate IT team that failed to lock down the data in any significant manner. That seems to be leading more people down the path of saying that companies should be liable for getting hacked.

For example, Slashdot points us to a blog post at InfoWorld, where it's suggested that companies should be criminally liable for leaking such data. I can certainly understand the sentiment, but it may go too far. Again, it's impossible to totally protect a system from getting hacked. Sooner or later there's always going to be some sort of leak. Increasing penalties could make companies take things more seriously -- especially in cases of gross negligence (which do seem all too common). But making the rules too strict can have serious negative unintended consequences as well, even to the point that some companies may stop accepting credit cards altogether, since the liability would just be too great. Would people be willing to give up the convenience of credit cards to protect their safety? From what we've seen, for most users the answer would be no. They know their credit cards are at risk, but they still use them because the convenience still seems to outweigh the danger.

Filed Under: data leaks, hacked, legal, liability


Reader Comments


  chris (profile), 26 Aug 2008 @ 1:59pm

    the difference between leaks and hacks

    the difference between leaks and hacks is the same as the difference between negligence and malice.

    should you be held liable for doing something stupid that harms others? sure. like falling asleep at the wheel and causing a fatal accident, you should be held accountable for being negligent with people's personal information.

    is that the same as deliberately doing something that hurts others? of course not. driving on the sidewalk with the intent of running people over is a much worse crime than falling asleep at the wheel, even if no one is injured in your sidewalk rampage.

    the issue boils down to a question of intent. once you get past that, the issue is still really complicated.

    a company having questionable security practices is where it really gets interesting. i mean, they aren't trying to be unsafe, they just aren't aware of what safe is, or are too cheap/lazy/incompetent to implement security measures.

    is there an enforceable definition of what safe is in terms of information security? can we trust state/federal legislators to come up with a definition that won't land us all in jail?

    it stands to reason that if you conformed to some sort of accepted standard for security measures and your data was compromised by an outsider then you shouldn't face the same penalties as a company that disregards information security. that would be like punishing the victim of an assault for not defending himself.

    there are industry standards, but should those be made enforceable? is there a federal or national standards body for information security? there are standards for the government and the military, should those be applied to corporations as well? you know, something like an FDA or OSHA?

    what happens if you do everything by the book and you get owned by something that isn't in the book? (like a 0day for example) should the company still be held accountable? is that the same as being negligent? are legislators even capable of understanding what a 0day is?

    what happens if a company has an employee that actively subverts those reasonable security measures? you took all the steps and someone is working against you. someone might do this on purpose in the case of corporate espionage, or they may do this without realizing it, in the case of a lost or stolen laptop or USB key. someone may have access to sensitive data and move/copy it to a non-secure medium purely in the interest of convenience. should there be a mandate to lock down that sort of activity?

    also, upon whom does the responsibility fall? on the company leadership? on the company's IT director? on the negligent party?

    it seems like forcing companies to safeguard data is a good idea, but there are a lot of questions that need to be answered.
