Should It Be Illegal To Get Hacked?

from the might-be-a-bit-extreme dept

A few years back, we asked if it should be illegal to get hacked. In that case, we were referring to some fines that the FTC had handed out to companies that had leaked data to hackers. This raised some troubling questions — as it’s often difficult-to-impossible to stop your computer systems from getting hacked, and putting liability on the company could lead to some serious unintended consequences. Yet, at the same time, over the past few years, we’ve heard about large security breaches on a regular basis (thanks, in large part, to new disclosure laws) — and often those breaches definitely seem to be due to negligence on the part of a corporate IT team that failed to lock down the data in any significant manner. That seems to be leading more people down the path of saying that companies should be liable for getting hacked.

For example, Slashdot points us to a blog post at InfoWorld, where it’s suggested that companies should be criminally liable for leaking such data. I can certainly understand the sentiment, but it may go too far. Again, it’s impossible to totally protect a system from getting hacked; sooner or later there’s always going to be some sort of leak. Increasing penalties could make companies take things more seriously, especially in cases of gross negligence (which do seem all too common). But making the rules too strict can have serious negative unintended consequences as well, even to the point that some companies may stop accepting credit cards altogether, since the liability would just be too great. Would people be willing to give up the convenience of credit cards to protect their safety? From what we’ve seen, for most users the answer would be no. They know their credit cards are at risk, but they still use them because the convenience still seems to outweigh the danger.



Comments on “Should It Be Illegal To Get Hacked?”

Anonymous Coward says:

You need to take responsibility . . .

When I was doing more web development consulting (before I sold out and became a cog in the giant corporate machine) I would often do projects for smaller companies that wanted an e-commerce presence of some kind. It was always my practice to offload the credit card information to a bank and never hold or even handle any of that information myself, simply because I knew the organization didn’t have the resources necessary to properly secure it. Companies need to understand that people’s information is important to them, and while getting and keeping that information may be incredibly useful for analysis, it also comes with a hefty burden of responsibility.
Just as an aside, I now work for a Fortune 500 financial institution, and I can assure you that security is paramount in nearly everything we do regarding application development. It is often a giant pain when it comes to getting things done, but I have to respect that my company takes the security of its customers as seriously as it does. Of course, being a financial institution, the idea of information security is old hat and has largely been built into the operation of the business as a whole. This is certainly not true for many smaller organizations that have e-commerce or other information-intensive web presences.
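The "never hold the card data yourself" practice described above is essentially tokenization. A minimal sketch of the idea, in Python (the function name and record layout are hypothetical, invented for illustration): the merchant's database keeps only an opaque token and the last four digits, while the bank or processor holds the real mapping to the full card number.

```python
import secrets

def tokenize_card(pan: str) -> dict:
    """Return the record a merchant would store: no full card number (PAN)."""
    return {
        # Opaque reference; in a real setup the processor issues this
        # and holds the mapping back to the actual card number.
        "token": secrets.token_urlsafe(16),
        # Enough for receipts and customer display, useless to a thief.
        "last4": pan[-4:],
    }

record = tokenize_card("4111111111111111")  # well-known test card number
print(record["last4"])  # prints 1111
```

A breach of the merchant's database then exposes only tokens, which are worthless outside the merchant-processor relationship.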

Anonymous Coward says:

Re: You need to take responsibility . . .

I have personally been involved in several projects where, if I hadn’t stood up, put my foot down, and been ornery enough to say “No, we may not brush this off; we MUST do better to protect our customers’ data,” nobody else would have had the temerity to stand up with me, and we would have become a disaster waiting to happen.

Each of us should redouble our resolve to be bold and fight for what’s right.

Yosi says:

Doctors are liable. So should IT

All people will eventually die. That doesn’t mean medical personnel aren’t responsible for their mistakes. The same logic applies here: IT staff should be responsible, and the whole question is silly. Whether the liability is criminal or civil can be determined on a per-case basis, just as in medical practice.
In some cases a doctor (or nurse) will be fined, while in others jail time is possible.

Anonymous Coward says:

Re: Another Thought

“If this becomes reality, why not pay someone to hack a competitor? The negative publicity coupled with the legal ramifications could result in more business for you.”

why not rob a bank, or better yet, just have your competitor’s leadership murdered (I’m sure you could hire someone to do it for you)? “why not . . .” just hire someone to hack them? Because it’s a crime, and hopefully most people have a slightly stronger moral compass than that anyway.

Jake says:

Another potential complication is how exactly one determines whether inadequate or poorly-worded company policy, individual carelessness, or neither, for that matter, was to blame; a law encompassing that many grey areas is not a law I want to see passed in these times of hair-trigger litigation. The threat of civil or criminal penalties would also make it very difficult to learn any lessons from a breach, if admitting to an oversight or error of judgement could earn you a lawsuit or a criminal record.

Moto says:


For a long while now, there have been strict (though not well enforced) regulations on tech security for companies that deal in medical histories for patients. Medical histories/charts contain just as much sensitive information as most financial institutions hold, if not more. (I can’t remember the name of the regulations, and my Googling skills aren’t up to par today.) These regulations do not set companies up to be sued merely for being hacked, but they do hold companies responsible if information is leaked/stolen/hacked because they failed to adhere to them. I find this to be a sensible way of forcing companies to take responsibility without setting them up for unfair lawsuits.

Sadly, I have worked for a number of such companies, and several did not have ANY interest in even reading the regulations. They also have never been inspected. The regulations involved are, to me, completely acceptable levels of security. If they were to actually be enforced, similar regulations could be created for any industry, or simply tied to the storage/manipulation of certain TYPES of information such as SSNs, account/credit card numbers, addresses, etc.

CM says:

Why can’t all this data be encrypted? We always hear about network security, but that’s an ongoing arms race. Encryption seems a more permanent and secure measure. I want to hear: “Hackers have gained 200,000 CC and SS numbers, but it’s 256-bit AES encrypted. Be sure to cancel your CC in 10-12 years.”

Then, if a key gets leaked, or the data wasn’t encrypted at all, it should be easy to prove specific negligence.

And if you can show blatant negligence, then you are getting real close to criminal liability.
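A related trick, for fields a company only ever needs to *match* rather than read back (checking whether an SSN is already on file, say), is a salted, deliberately slow hash: then a stolen table yields neither the plaintext nor a key that can be leaked. A minimal sketch using Python's standard library (the function names are hypothetical, and this is an illustration of the general idea, not CM's exact proposal):

```python
import hashlib
import hmac
import os

def hash_field(value: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256 with a per-record salt and a slow iteration
    # count, so brute-forcing stolen digests is expensive.
    return hashlib.pbkdf2_hmac("sha256", value.encode(), salt, 200_000)

def matches(value: str, salt: bytes, stored: bytes) -> bool:
    # Constant-time comparison to avoid leaking digest prefixes.
    return hmac.compare_digest(hash_field(value, salt), stored)

salt = os.urandom(16)
stored = hash_field("078-05-1120", salt)  # a well-known example SSN
assert matches("078-05-1120", salt, stored)
assert not matches("000-00-0000", salt, stored)
```

Reversible encryption is still needed when the plaintext must be recovered (e.g., card numbers for recurring charges), which is where the key-management problems the next commenter raises come in.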

Anonymous Coward says:

Re: encryption is the key

not necessarily. many computer experts can crack the common encryption schemes, and as we get more and more processors, brute-forcing an encryption key gets faster and easier. and what do you do once an encryption scheme gets cracked? move on to the next one, and have fun migrating your entire servers over.

additionally, due to the nature of encryption, everything will run about a third slower than it does now and take more space to store. when you are dealing with terabytes (or even petabytes) of data, even that small change will be significant.

Will says:

some liability needed

There needs to be some degree of liability, mandatory fines, and mandatory and unlimited compensation for any resulting damages.

That starts with regulations on what data can be kept without the explicit request of the user, and for how long. None of this “we keep your credit card information on file” crap that you MUST agree to in order to make a purchase. No more keeping data on file months or years beyond a transaction.

That’s followed by a specified set of mandatory minimum security requirements: encryption of certain personal, financial, etc. data is mandatory; said data cannot be shared with partners without explicit permission; said data cannot be carried on portable devices unless protected by an additional layer of security; and so on.

Those requirements are enforced with mandatory fines: X dollars per user, per breach, per violation. Also, any and ALL damages caused by failure to meet these standards are covered by the party at fault. You lose a database of unencrypted credit card data and $10 million in fraudulent charges result? You reimburse that loss, not the credit card companies or the customers. Mandatory.

You don’t have to make everyone liable for every security vulnerability that they don’t patch within minutes or even months to make a dent in fraud. There’s no need to legislate best practices either – just minimum standards of care.
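Will's per-user, per-violation fine structure is easy to make concrete. A hypothetical sketch (the dollar figure is invented for illustration; no statute specifies it):

```python
# Assumed figure, not from any actual regulation.
FINE_PER_USER_PER_VIOLATION = 50  # dollars

def breach_fine(users_affected: int, violations: int) -> int:
    """Total fine under the proposed per-user, per-violation scheme."""
    return users_affected * violations * FINE_PER_USER_PER_VIOLATION

# 200,000 exposed records, two requirements violated (say, no encryption
# and data kept past the retention limit):
print(breach_fine(200_000, 2))  # prints 20000000
```

Scaling with the number of affected users is the point: it makes hoarding data a liability in itself, which reinforces the retention limits proposed above.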

chris (profile) says:

the difference between leaks and hacks

the difference between leaks and hacks is the same as the difference between negligence and malice.

should you be held liable for doing something stupid that harms others? sure. like falling asleep at the wheel and causing a fatal accident, you should be held accountable being negligent with people’s personal information.

is that the same as deliberately doing something that hurts others? of course not. driving on the sidewalk with the intent of running people over is a much worse crime than falling asleep at the wheel, even if no one is injured in your sidewalk rampage.

the issue boils down to a question of intent. once you get past that, the issue is still really complicated.

a company with questionable security practices is where it really gets interesting. i mean, they aren’t trying to be unsafe, they just aren’t aware of what safe is, or are too cheap/lazy/incompetent to implement security measures.

is there an enforceable definition of what safe is in terms of information security? can we trust state/federal legislators to come up with a definition that won’t land us all in jail?

it stands to reason that if you conformed to some sort of accepted standard for security measures and your data was compromised by an outsider then you shouldn’t face the same penalties as a company that disregards information security. that would be like punishing the victim of an assault for not defending himself.

there are industry standards, but should those be made enforceable? is there a federal or national standards body for information security? there are standards for the government and the military; should those be applied to corporations as well? you know, something like an FDA or OSHA?

what happens if you do everything by the book and you get owned by something that isn’t in the book? (like a 0day for example) should the company still be held accountable? is that the same as being negligent? are legislators even capable of understanding what a 0day is?

what happens if a company has an employee that actively subverts those reasonable security measures? you took all the steps and someone is working against you. someone might do this on purpose in the case of corporate espionage, or they may do this without realizing it, in the case of a lost or stolen laptop or USB key. someone may have access to sensitive data and move/copy it to a non-secure medium purely in the interest of convenience. should there be a mandate to lock down that sort of activity?

also, upon whom does the responsibility fall? on the company leadership? on the company’s IT director? on the negligent party?

it seems like forcing companies to safeguard data is a good idea, but there are a lot of questions that need to be answered.

Anonymous Coward says:

Re: Re:

not all hacking is due to a corporation’s negligence. no matter how good your security is there is a hacker out there that is better.

the only way to really keep things secure is to lock the computer in a sealed room with armed guards, require ID badges to approach, make sure the computer isn’t connected to any network, and then search everyone who comes in for a camera, portable drive, cell phone, and just anything electronic at all, then search them on the way out to make sure they aren’t taking anything with them.

anything less than that and you are open to possible attacks. companies shouldn’t be held responsible if they take reasonable precautions to safeguard data.

Benjamin Wright (profile) says:

Federal Trade Commission

Mike: The legal reaction to data break-ins has been too emotional. Many enterprise hacking victims have not been given the credit they deserved for having reasonable security for their circumstances. TJX is a case in point. The Federal Trade Commission treated TJX unfairly under FTC law. The FTC should rethink the law of credit card security, and stop treating merchant victims of organized crime as culprits. –Ben

Paranoid With Reason says:

it's not just one part

It’s not just the IT department’s networks. It’s not just the home-grown apps’ poor coding. We must also deal with the incredibly poor coding, from the ground up, of the most commonly installed OS on desktop computers.

Securing the servers and the network won’t save you from Joe Clueless when the poor sap brought in his laptop from home because it didn’t have the corporate-installed browser filter, figured out how to get it onto the corporate network despite IT’s efforts, browsed to a page with a malicious bit of code (maliciously malformed JPG or SWF banner ad anyone?) and wound up with a bot computer that in turn infected all the other desktops within the corporate firewall via some unreported weakness in their OS.

It’s a far too probable scenario, really, and nothing the corporate IT department does can save them from it happening. That’s why any legislation would have to be written very carefully or we’d run the risk of victimizing the victims.

Not that the (often budget-stingy) corporations who choose cheap over secure deserve a break…but bad legislation is worse than none and typically legislators aren’t so good at the finer points of the technology they’re so eager to legislate.

Jack Sombra says:

Should be something akin to Doctors

Just without the abuse that goes on in the medical arena:

*If a doctor is really negligent, he loses his licence
*If a doctor is just negligent, he gets sued
*If the doctor simply cannot help/save the patient through no real fault of his own, well, that’s life (or death)

Should be the same for companies that put data (and thus people’s finances/identity) at risk, with the alternative to “losing the licence” for companies being getting shut down, or being forced to outsource all IT to an accredited outside supplier who tells the company what they can/cannot do and how much it will cost.

Sneeje says:

Confusing negligence and criminal behavior

People too often confuse another individual placing themselves at risk with the resulting criminal act. I believe this is because people believe that blame must be divided between all parties of an action, which really shouldn’t be true.

The hacker acts of their own volition and knowingly violates a particular law–they are to blame for this act.

A company places themselves at risk–they are to blame for that act.

Two separate acts, two separate allocations of blame.

This can be analogous to someone knowingly walking through a high-crime area. It should not be their fault that they get robbed–it is potentially their fault that they were placed at risk. The mugger would get prosecuted for the crime, but the individual would have weaker civil grounds.

This is why negligence and criminality are two separate concepts in law.

Anonymous Coward says:

I think making it a legal issue is wrong. If a company is negligent, then sure, something needs to be done, but it’s difficult to know where to draw the line.

Say I am negligent and leave my keys in my car. My car gets stolen and then used in a hit and run. Am I liable if the person who is hit dies? In America, I probably will be, because those knobs sue for anything. In any other country, no, I’m not liable.

Eric Foster-Johnson (user link) says:

Dealing with lost laptops

So many times, I see companies lose customer/personal information by storing such info on a laptop and then the laptop gets stolen.

I propose a new law: you can store private info on any computing device, portable or not, so long as that device weighs at least 100 lbs. Failure to do so would be criminal negligence should the info get lost.

This would take care of many of those lost laptop issues.
