To make this work, the President officially declared foreign hacking to be a "national emergency" (no, really) and declared that if the government decides some foreign person is doing a bit too much hacking, it can do all sorts of bad things to them, like seizing anything they have in the US and blocking them from entering the country. Because that won't be abused at all.
Look, everyone agrees that there's a lot of online hacking and computer attacks going on. So much of what we do in the world has moved online, so of course that's going to be a target. But giving a general "ARRRRRGGH! HACKING BAD! WHITE HOUSE MAD!" executive order seems incredibly pointless and counterproductive. It seems like yet another example of politicians feeling the need to do something because there's a problem -- but not having any good ideas on what to actually do that will help solve the problem. So they just do something to say they did something, never mind how toothless it is -- or (more importantly) how the broad and vague definitions set forth in the "something" they do can (and will) be used in the future against perfectly reasonable actions and actors.
It's stories like these that make actual computer security folks shake their heads in confusion at politicians. You don't solve cybersecurity issues with vague executive orders. You do it with better security practices (and not undermining those practices with backdoors and stockpiling zero days).
Last week, the Senate Intelligence Committee voted (in secret, of course) to approve a new cybersecurity bill, dubbed CISA (as it was in the last Congress), though it kept the content of the actual bill secret until this week. The only Senator who voted against it was... Senator Wyden, of course, who rightly pointed out that this bill is "not a cybersecurity bill – it’s a surveillance bill by another name."
Aside from its redundancy, the Senate Intelligence bill grants two new authorities to companies. First, the bill authorizes companies to launch countermeasures (now called "defensive measures" in the bill) for a "cybersecurity purpose" against a "cybersecurity threat." "Cybersecurity purpose" is so broadly defined that it means almost anything related to protecting (including physically protecting) an information system, which can be a computer or software. The same goes for a "cybersecurity threat," which includes anything that "may result" in an unauthorized effort to impact the availability of the information system.
Even with the changed language, it's still unclear what restrictions exist on "defensive measures." Since the definition of "information system" includes files and software, can a company that has a file stolen from it launch "defensive measures" against the thief's computer? What's worse, the bill may allow such actions as long as they don't cause "substantial" harm -- a term the bill leaves undefined. If so, the "defensive measures" clause could increasingly encourage computer exfiltration attacks on the Internet -- a prospect that may appeal to some "active defense" (aka offensive) cybersecurity companies, but does not favor the everyday user.
Second, the bill adds a new authority for companies to monitor information systems to protect an entity's hardware or software. Here again, the broad definitions could be used in conjunction with the monitoring clause to spy on users engaged in potentially innocuous activity. Once collected, companies can then share the information -- also called "cyber threat indicators" -- freely with government agencies like the NSA.
Also, the bill departs from previous cybersecurity bills that put Homeland Security in charge (which, by itself, isn't great, but DHS is the best of the options if you're choosing among DHS, the NSA and the FBI). While the information still goes to DHS under this bill, DHS doesn't then get to parse through it and figure out where it goes. Instead, the info must be shared "in real time" with the NSA. All of which just reinforces the fact that this is a surveillance bill, not a bill to protect against "cybersecurity attacks."
But if you want to know the single biggest reason why this bill is bogus: ask those supporting it what cybersecurity attack this bill would have stopped. And you'll notice they don't have an answer. That's because it's not a cybersecurity bill at all. It's just a bill to try to give the government more access to your user info.
For years, legislators have been attempting to grant themselves permission to strong-arm tech companies into handing over all sorts of information to the government under the guise of cybersecurity. CISPA, CISA, etc. The acronyms come and go, but the focus is the same: information sharing.
Of course, the promise of equitable sharing remains pure bullshit. Tech companies know this and have been understandably resistant to the government's advances. There are few, if any, positives to these proposed "agreements." The government gets what it wants -- lots and lots of data -- and the companies get little more than red tape, additional restrictions and fleeing customers.
U.S. government officials say privately they are frustrated that Silicon Valley technology firms are not obtaining U.S. security clearances for enough of their top executives, according to interviews with officials and executives in Washington and California. Those clearances would allow the government to talk freely with executives in a timely manner about intelligence they receive, hopefully helping to thwart the spread of a hack, or other security issues.
The lack of cooperation from Silicon Valley, Washington officials complain, injects friction into a process that everyone agrees is central to the fight to protect critical U.S. cyberinfrastructure: Real-time threat information sharing between government and the private sector.
Before dealing with the questionable promise of "real-time threat information sharing," let's deal with the supposedly minor requirement of security clearances. This will impose undue burdens on tech company leaders, who already have a pretty good idea the stipulation will be a major hassle followed by continued opacity from a government that's 90% lip service and 10% outright lying. Tech execs are being asked to make all the effort and hope against hope there will actually be some benefits.
"I believe that this is more about the overclassification of information and the relatively low value that government cyberintel has for tech firms," said one Silicon Valley executive. "Clearances are a pain to get, despite what government people think. Filling out the paperwork … is a nightmare, and the investigation takes a ridiculous amount of time."
"I think tech companies are doing a return-on-investment analysis and don't think the government intel is worth the cost or effort," said the Silicon Valley executive. "This is why government threat signature sharing initiatives are such a nothing-burger: The signatures are of limited value and only a few select companies with clearances can actually use them."
The clearance process can easily take over a year. The application runs 127 pages and asks a mixture of questions ranging from the highly intrusive to the facially ridiculous.
[This question seems to disqualify nearly every law enforcement officer in the United States.]
And that's just the start of the process. The rest of the vetting process takes several months, and there's no guarantee the executives the government wants to obtain clearance will actually be cleared to discuss classified information.
And even if these clearances are obtained, the benefits are unproven and suspected to be minimal. On the other hand, the downsides are enormous. As Marcy Wheeler points out, clearances may open up discussion channels with law enforcement and intelligence agencies, but they also create additional restrictions for those carrying these privileges -- the breach of which can result in severe consequences. In light of the inequitable "sharing" envisioned by many tech companies, the hassle just isn't worth it.
Because it’s not just that the security clearance application is unwieldy. It’s that clearance comes with a gag order about certain issues, backed by the threat of prison...
Why would anyone sign up for that if the tech companies have more that the government wants than the government has that the tech companies need?
On top of this, there's the bottom line to consider. The information that may or may not flow back to tech companies won't do much to offset the perception that company executives are willingly buddying up with the US intelligence community. In the post-Snowden world, this could mean the loss of customers, future contracts and sensitive foreign markets.
The government has yet to offer anything Silicon Valley wants in exchange for additional burdens, greater secrecy and increased demands for customer data. The government is better at taking than it is at giving, and no amount of cyberterrorism hand-wringing is going to change that reality.
The NSA is specifically concerned that Iran's cyberweapons will become increasingly potent and sophisticated by virtue of learning from the attacks that have been launched against that country. "Iran’s destructive cyber attack against Saudi Aramco in August 2012, during which data was destroyed on tens of thousands of computers, was the first such attack NSA has observed from this adversary," the NSA document states. "Iran, having been a victim of a similar cyber attack against its own oil industry in April 2012, has demonstrated a clear ability to learn from the capabilities and actions of others."
That's because, unlike traditional physical weapons used against enemy infrastructure, digital versions are not generally destroyed during an attack. One of their big advantages is that once they have infiltrated and infected a target system, they can continue to carry out surveillance or attacks over a long time period. But that also means they may eventually be discovered -- especially if they leak out -- allowing them to be studied and improved in a way generally not possible with traditional weapons. Those new versions can then be directed elsewhere, including against the original attacker.
So intelligence agencies find themselves in a difficult position. The more they carry out attacks using digital weapons, and the more sophisticated those tools, the greater the likelihood that adversaries will detect them, adapt them and then turn them back against the country that deployed them. It's probably too much to hope that this may cause such weapons to be used more sparingly....
On Friday morning, we noted that the CEOs of Google, Facebook and Yahoo had declined to appear at the President's cybersecurity summit at Stanford, but that Apple CEO Tim Cook was going. However, we pointed out that all signs suggested Cook was going to send a message that he wasn't going to give in and allow the government a backdoor to iOS encryption. Cook had recently noted that the government "would have to cart us out in a box" before Apple would add a backdoor. And, indeed, speaking right before President Obama's speech, Cook delivered a strong defense of encryption and privacy:
“We believe deeply that everyone has a right to privacy and security,” said Cook. “So much of our information now is digital: photos, medical information, financial transactions, our most private conversations. It comes with great benefits; it makes our lives better, easier and healthier. But at Apple, we have always known this also comes with a great responsibility. Hackers are doing everything they can to steal your data, so we’re using every tool at our disposal to build the most secure devices that we can.”
“People have trusted us with their most personal and private information and we must give them the best technology we can to secure it,” said Cook. “Sacrificing our right to privacy can have dire consequences. We live in a world where people are not treated equally. There are people who don’t feel free to practice their religion, express their opinion or love who they choose. Technology can mean the difference between life and death.”
“If we don’t do everything we can to protect privacy, we risk more than money,” said Cook. “We risk our way of life.”
It's great to see tech companies taking a stronger and stronger stand in protecting the privacy of their users and customers. Once again, thank Snowden for actually making this an issue that companies actually need to care about.
Last Friday, at the White House's Cybersecurity Summit at Stanford, reporter Kara Swisher sat down for a half-hour interview with President Obama (and she even dragged her famous red chairs along). It's a better, more in-depth interview than you're ever likely to see from the established mainstream press, and touches on a variety of issues regarding technology and security. While I don't agree with some of the answers, I will say that the President appears to be extremely well-briefed on these issues, and didn't make any totally ridiculous or glaringly misleading remarks. You can see the whole interview here:
In it, he admits that the "Snowden disclosures" (as he calls them) hurt "trust" between DC and the tech industry, and admits that the government has been "a little slow" in updating the laws for how the NSA operates online. However, he does say that surveillance on US persons is very carefully controlled and that he can say "with almost complete certainty that there haven't been abuses on US soil." He admits that's not entirely the case overseas, where there are basically no limits on the NSA's surveillance, and he recognizes that needs to change. Of course, if that's the case, he can do that right now -- because the NSA's authority for all of that is an executive order, 12333, and he could revoke it and write a new one. But he hasn't.
Then he gets to the area I found most interesting and want to focus on, the question of encryption. After discussing how he's looking to update the rules for surveillance and his relationship with tech, the interview proceeds like this:
Obama: There's still some issues like encryption...
Swisher: Let's talk about encryption.
Obama: ... that are challenging, and that's something that's been brought up...
Swisher: What's wrong with what Google and Apple are doing? You have encrypted email.
Swisher: Shouldn't everybody have encrypted email and have their protections?
Obama: Everybody should. And I'm a strong believer in strong encryption. Where the tension has come up, is historically what's happened is that... let's say you knew a particular person was involved in a terrorist plot, and the FBI is trying to figure out who else are they trying to communicate with to prevent the plot. Traditionally, what's been able to happen is they get a court order, the FBI goes to the company, they request those records, the same way they'd go get a court order to request a wiretap. The company technically can comply.
The issue here is, partly in response to consumer demand, partly in response to legitimate concerns about consumer privacy, the technologies may be built to a point where, when the government goes...
Swisher: They can't get the information.
Obama: ... the company says "sorry, we just can't pull it. It's so sealed and tight that even though the government has a legitimate request, technologically we cannot do it."
Swisher: Is what they're doing wrong?
Obama: No. I think they are properly responding to a market demand. All of us are really concerned about making sure our...
Swisher: So what are you going to do?
Obama: Well, what we're going to try to do is see if there's a way for us to narrow this gap. Ultimately, everybody -- and certainly this is true for me and my family -- we all want to know if we're using a smartphone for transactions, sending messages, having private conversations, we don't have a bunch of people compromising that process. There's no scenario in which we don't want really strong encryption.
The narrow question is going to be: if there is a proper request for -- this isn't bulk collection, this isn't fishing expeditions by government -- where there's a situation in which we're trying to get a specific case of a possible national security threat, is there a way of accessing it? If it turns out there's not, then we're really going to have to have a public debate. And, I think some in Silicon Valley would make the argument -- which is a fair argument, and I get -- that the harms done by having any kind of compromised encryption are far greater than...
Swisher: That's an argument you used to make, you would have made. Has something changed?
Obama: No, I still make it. It's just that I'm sympathetic to law enforcement...
Swisher: Why? What happened? Because you were much stronger on...
Obama: No, I'm as strong as I have been. I think the only concern is... our law enforcement is expected to stop every plot. Every attack. Any bomb on a plane. The first time that attack takes place, where it turns out we had a lead and couldn't follow up on it, the public's going to demand answers. This is a public conversation that we should be having. I lean probably further in the direction of strong encryption than some do inside law enforcement. But I am sympathetic to law enforcement, because I know the kind of pressure they're under to keep us safe. And it's not as black and white as it's sometimes portrayed. Now, in fairness, I think those in favor of airtight encryption also want to be protected from terrorists.
Obama: One of the interesting things about being in this job, is that it does give you a bird's eye view. You are smack dab in the middle of these tensions that exist. But, there are times where folks who see this through a civil liberties or privacy lens reject that there's any tradeoffs involved. And, in fact, there are. And you've got to own the fact that it may be that we want to value privacy and civil liberties far more than we do the safety issues. But we can't pretend that there are no tradeoffs whatsoever.
I actually think this is a very good, nuanced answer to this issue. It doesn't descend into hyperbole about child predators and ticking time bombs like law enforcement officials have done. He admits that there are tradeoffs and, at least publicly, seems to be willing to admit that stronger encryption without compromise might be the best solution.
Of course, where we're left with questions is about his requested "public debate." Where and how is that happening? Because, to date, the only noise on this issue coming out of his administration has been on the other side, pushing for new legislation that would require backdoors and compromise encryption. We haven't seen anyone in the administration presenting the other side at all. And, for those of us who strongly believe that a basic cost/benefit analysis of weakening encryption vs. letting law enforcement do their job through traditional detective work would show that the "costs" of weakened encryption vastly outweigh the "threats" of criminals getting away with stuff, it would be nice to see the government at least recognizing that as well.
President Obama chides civil liberties and privacy folks for not getting that there are tradeoffs here, and I don't think that's accurate. Most do recognize the tradeoffs. It's just that they believe the true benefit in terms of "stopping criminals" to weakening encryption is not very great, while the cost to everyone in risking their own privacy is massive. What we have not seen is any indication that law enforcement recognizes that there are tradeoffs, or that they care. Yes, as the President admits, they're weighing some of this against "not getting blamed" when an inevitable "bad event" happens -- but they don't seem to be willing to recognize, at all, the risks to everyone's privacy. That's why they keep talking about golden keys and magic wizards who can make special encryption that only good guys can use.
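The technical root of the standoff is worth spelling out: when the encryption key is derived on the device from the user's passcode, the company simply holds nothing it could disclose. Here's a minimal sketch in Python of that arrangement -- the XOR "cipher" is a toy stand-in for AES, and all the names are hypothetical; real systems use hardware key stores and authenticated ciphers:

```python
import hashlib

def derive_key(passcode: str, salt: bytes) -> bytes:
    # The key is derived on the device from the user's passcode;
    # the vendor never sees it and never stores it.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

def toy_cipher(data: bytes, key: bytes) -> bytes:
    # XOR stream as a stand-in for a real cipher -- illustration only, not secure.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

salt = b"per-device-salt"
ciphertext = toy_cipher(b"private message", derive_key("user-passcode", salt))

# The vendor stores only the salt and the ciphertext. With no key on file,
# there is nothing to hand over in response to a court order.
assert toy_cipher(ciphertext, derive_key("user-passcode", salt)) == b"private message"
assert toy_cipher(ciphertext, derive_key("wrong-guess", salt)) != b"private message"
```

Any "golden key" proposal amounts to adding a second copy of that key somewhere the government can reach -- which is also somewhere an attacker can reach.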
So I'm glad that the President at least seems to recognize this is a nuanced issue with tradeoffs, but I wish that others in his administration, especially from the law enforcement side, were willing to recognize that as well.
We already wrote about the information sharing efforts coming out of the White House cybersecurity summit at Stanford today. That's supposedly the focus of the event. However, there's a much bigger issue happening as well: the growing distrust between the tech industry and the intelligence community. As Bloomberg notes, the CEOs of Google, Yahoo and Facebook were all invited to join President Obama at the summit and all three declined. Apple's CEO Tim Cook will be there, but he appears to be delivering a message to any in the intelligence and law enforcement communities who think they're going to get him to drop the plan to encrypt iOS devices by default:
In an interview last month, Timothy D. Cook, Apple’s chief executive, said the N.S.A. “would have to cart us out in a box” before the company would provide the government a back door to its products. Apple recently began encrypting phones and tablets using a scheme that would force the government to go directly to the user for their information. And intelligence agencies are bracing for another wave of encryption.
In fact, it seems noteworthy that this whole issue of increasing encryption by the tech companies to keep everyone out has been left off the official summit schedule. As the NY Times notes (in the link above), Silicon Valley seems to be pretty much completely fed up with the intelligence community after multiple Snowden revelations revealed just how far the NSA had gone in trying to "collect it all" -- including hacking into the foreign data centers of Google and Yahoo. And, on top of that, the NSA's efforts to buy up zero-day vulnerabilities before companies can find out and patch them:
“What has struck me is the enormous degree of hostility between Silicon Valley and the government,” said Herb Lin, who spent 20 years working on cyberissues at the National Academy of Sciences before moving to Stanford several months ago. “The relationship has been poisoned, and it’s not going to recover anytime soon.”
That Times article quotes White House cybersecurity boss Michael Daniel (the man who is proud of his own lack of cybersecurity skills) trying to play down the "tensions" between Silicon Valley and Washington, followed by this anonymous quote from a Silicon Valley exec:
“A stupid approach,” is the assessment of one technology executive who will be seeing Mr. Obama on Friday, and who asked to speak anonymously.
Further, the article discusses how companies are trying to fight back against the NSA's abuse of zero days (another thing that Daniel has championed) by getting to them before the government does:
And while Silicon Valley executives have made a very public argument over encryption, they have been fuming quietly over the government’s use of zero-day flaws. Intelligence agencies are intent on finding or buying information about those flaws in widely used hardware and software, and information about the flaws often sells for hundreds of thousands of dollars on the black market. N.S.A. keeps a potent stockpile, without revealing the flaws to manufacturers.
Companies like Google, Facebook, Microsoft and Twitter are fighting back by paying “bug bounties” to friendly hackers who alert them to serious bugs in their systems so they can be fixed. And last July, Google took the effort to another level. That month, Mr. Grosse began recruiting some of the world’s best bug hunters to track down and neuter the very bugs that intelligence agencies and military contractors have been paying top dollar for to add to their arsenals.
They called the effort “Project Zero,” Mr. Grosse says, because the ultimate goal is to bring the number of bugs down to zero. He said that “Project Zero” would never get the number of bugs down to zero “but we’re going to get close.”
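Bug hunting of the sort Project Zero does often starts with fuzzing: feeding a parser streams of malformed input until it misbehaves. A minimal sketch of the idea -- the parser and its planted bug are hypothetical, and real fuzzers are vastly more sophisticated:

```python
import random

def fragile_parser(data: bytes) -> int:
    # Hypothetical parser with a planted bug: it trusts the length byte blindly.
    length = data[0]
    if length > len(data) - 1:
        raise IndexError("read past end of buffer")  # the crash a fuzzer hunts for
    return sum(data[1:1 + length])

def fuzz(trials: int = 1000) -> list:
    # Throw random blobs at the parser and record every input that crashes it.
    random.seed(0)  # deterministic, for the example
    crashes = []
    for _ in range(trials):
        blob = bytes(random.randrange(256) for _ in range(4))
        try:
            fragile_parser(blob)
        except IndexError:
            crashes.append(blob)
    return crashes

assert len(fuzz()) > 0  # random inputs quickly surface the bug
```

Each crashing input is a lead: a bug hunter then works out whether the flaw is exploitable, and either reports it for a bounty or, in the market the article describes, sells it as a zero-day.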
There's a lot more in the two stories above, but the anger is real. In the past year, it's amazing how many conversations I've had with people around Silicon Valley who aren't just upset or disgusted by the intelligence community's actions -- they're angry. And while the tech industry was never as buddy-buddy with the government as some have tried to imply, some circles had undoubtedly grown complacent, with little effort made to ensure that information wasn't being misused or abused. That's no longer the case. There are, of course, legal limits on what companies can do, but just as the NSA once explained how it plays right up to the very edge of the limits Congress puts around it (some of us believe it goes beyond them...), the tech industry is rapidly learning that it, too, needs to push right up to the line the law allows.
And, of course, none of that would likely have happened without Ed Snowden revealing to journalists the nature of the NSA's overreach.
from the don't-destroy-privacy-in-the-name-of-cybersecurity dept
There's a big "White House Cybersecurity Summit" down the road at Stanford today, where the President will release the details of a new executive order promoting "a framework for sharing information about cyber threats" which the administration hopes will lead organizations to better protect their data from malicious hacks.
The new executive order encourages businesses to form "information sharing and analysis organizations," or ISAOs, which would gather data about hacking attacks and share it with companies and the government.
A number of companies will announce Friday that they are adopting the administration's cybersecurity framework, which was created after a 2013 executive order. The framework helps businesses decide how to direct cybersecurity investments, how to implement cybersecurity at new companies, and how to measure their programs against others'. Intel, Apple and Bank of America use the framework and will announce that they will require all their vendors to use it. QVC and Walgreens will say they will employ the framework in their risk management practices, while Kaiser Permanente will commit to using it as well.
Of course, if you've been following the big fights over the past few years on cybersecurity legislation, you'll know that such "information sharing" has been a key component in most of the proposed bills, none of which have become law. Most of the bills have focused on one key thing: giving companies liability protection, so that they can't be sued over the information they share. From the beginning, however, we've asked a pretty simple question that no one has answered: what is currently preventing companies from sharing such threat information?
The answer, as reinforced by this move today by the White House, is absolutely nothing. Companies can (and in some cases already do) share "threat" information, and having them do so in a more organized fashion to prevent malicious attacks is, in fact, a good idea. What's not needed is a law that basically gives blanket immunity for companies to share almost any information to any government agency. That's been the problem with CISPA, CISA and similar bills: they're not about truly making information sharing about threats easier, since that can be done already. They're about giving blanket cover for companies to share even more information with government agencies such as the NSA.
With this new executive order and companies adopting the suggested framework, many of the "benefits" backers of cybersecurity legislation talk about will happen without the need for any new legislation. True threat information can be shared and companies can get wiser about protecting their information. But it doesn't give them blanket immunity if they start handing over other information to the government for other purposes, such as surveillance. That's important.
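What narrowly scoped sharing can look like in practice: strip an internal incident record down to its technical indicators before anything leaves the building. A rough sketch -- the field names here are hypothetical, not any official ISAO or threat-exchange schema:

```python
# Fields that describe the attack, not the victim.
# Hypothetical names, not a real ISAO schema.
TECHNICAL_FIELDS = {"malware_sha256", "attacker_ip", "exploit_signature", "first_seen"}

def to_shareable_indicator(incident: dict) -> dict:
    # Keep only the technical fields; customer data never leaves the company.
    return {k: v for k, v in incident.items() if k in TECHNICAL_FIELDS}

incident = {
    "malware_sha256": "e3b0c44298fc1c149afbf4c8996fb924...",  # truncated for display
    "attacker_ip": "203.0.113.7",
    "first_seen": "2015-02-13T09:30:00Z",
    "victim_email": "user@example.com",  # stays in-house
    "account_id": 48213,                 # stays in-house
}

shared = to_shareable_indicator(incident)
assert "victim_email" not in shared and "account_id" not in shared
```

Nothing in current law prevents a company from sharing records like the filtered one above -- which is exactly the point: the blanket-immunity bills are needed only if you want to share the unfiltered version.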
Yes, working together to prevent the growing number of online attacks is important. But that should never be used as a backdoor process to enable greater surveillance. Doing it this way, rather than by passing a questionable law, seems like a much more reasonable first step.
from the and-why-is-it-focused-on-information-sharing? dept
Cybersecurity has become a big buzzword in Washington, and there have been plenty of calls for legislation, usually focused on "information sharing" setups that allow companies and the government to compare notes on threats without fear of any legal liability. But the actual issues of cybersecurity are never clearly defined, nor is the need for various legislative changes fully explained. Is the problem really as big as it's made out to be? Or is the whole thing just a bureaucratic turf war?
Most legislation that includes the word "cyber" is nothing more than an excuse to give the government a larger piece of the action -- generally by redefining the term "information sharing" to mean a one-way street of data collection running from private companies (and their customers) to various law enforcement and security agencies.
Schneiderman's proposal seems to be more skewed towards actually increasing protections of companies and customers, rather than simply codifying additional government access. But before we start passing around high fives and popping champagne corks, it must be noted that not a single word of this has been put to paper yet (excluding the press release). At this point, it's just a proposal for legislation. There's no first draft to read and no indication what its interplay (amendments, etc.) with existing laws will entail.
That being said, most of what's in Schneiderman's statement is reasonable. Much of what's being asked for should already have been in place (including additional restrictions on the sharing of medical data). Many companies (coughSONYcough) seem to treat their customers' personal data as an afterthought -- something that only deserves attention after it's been Pastebinned for the world to see.
Expand Definition of Private Information- New York legislators should expand the definition of “private information” to include both the combination of an email address and password, and an email address in combination with a security question and answer, as California already has done. Additionally, the definition of private information should include medical information, including biometric information, and health insurance information.
Legislate Reasonable Data Security Requirement- All entities that collect and/or store private information should be required to have reasonable security measures to protect said information. These measures should include:
Administrative safeguards to assess risks, train employees and maintain safeguards.
Technical safeguards to (i) identify risks in their respective network, software, and information processing, (ii) detect, prevent and respond to attacks and (iii) regularly test and monitor systems controls and procedures.
Physical safeguards to have special disposal procedures, detection and response to intrusions, and protect the physical areas where information is stored.
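As one concrete instance of the "detect, prevent and respond" safeguard above, file-integrity monitoring is a common baseline: hash everything at rest, then flag anything whose hash changes. A minimal sketch, using an in-memory dict as a stand-in for a real file store:

```python
import hashlib

def snapshot(files: dict) -> dict:
    # Baseline: SHA-256 of each stored file's contents.
    return {name: hashlib.sha256(data).hexdigest() for name, data in files.items()}

def detect_tampering(baseline: dict, files: dict) -> list:
    # Flag any file whose current hash differs from the baseline
    # (or that wasn't present when the baseline was taken).
    return [name for name, data in files.items()
            if baseline.get(name) != hashlib.sha256(data).hexdigest()]

store = {"customers.db": b"alice,bob", "config.ini": b"debug=false"}
baseline = snapshot(store)
store["customers.db"] = b"alice,bob,mallory"  # simulated unauthorized change
assert detect_tampering(baseline, store) == ["customers.db"]
```

A "regularly test and monitor" requirement in practice means running checks like this on a schedule and wiring the output to an alerting and response process, not just collecting the logs.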
Certification- Entities that obtain independent third-party audits and certifications annually showing compliance with New York’s reasonable data security requirements should receive for use in litigation a rebuttable presumption of having reasonable data security.
Legislate a Safe Harbor to Provide an Incentive for a Heightened Level of Data Security– New York needs to incentivize businesses to implement the most robust data security. To do so, New York should offer a safe harbor if a company adopts a heightened form of security. To comply, entities would be required to categorize their information systems based on the risk a data breach imposes on the information stored. Once information systems are categorized, a data security plan based on a multitude of factors would be implemented and followed. Once this standard is met, the entity would be required to attain a certification and, upon doing so, would be granted the benefit of a safe harbor that could include an elimination of liability altogether.
Overall, not terrible, with a couple of caveats. One: the government's ability to protect itself from cyberattacks and other hacking ranges from less-than-adequate to abysmal. Given that track record, it seems presumptuous for it to put itself in the position of setting standards for data security. Sure, it could bring in actual experts in the field to craft the standards, but once legislators have had their say, what's actually implemented may bear only the faintest resemblance to what was recommended.
Two: while the proposal helpfully expands the definition of "private information," it fails to provide specifics about who can or can't access this information. Any company could route around these restrictions with some fine print in its Terms of Service. And there's nothing forbidding the acquisition of medical, biometric and insurance data by the state itself. In fact -- and here's where we head into the "fairly decent BUT" section -- the proposal lays the groundwork for one-way information sharing in the final paragraph.
Protection for Sharing Forensic Data- Finally, in the event of a data breach, New York should incentivize companies to share forensic reports with law enforcement officials. One way to accomplish this would be to make sure that the disclosure of a forensic report to a relevant law enforcement agency for the purposes of investigating those responsible for a data breach does not affect any privilege or protection. This would allow companies to feel comfortable with the free sharing of information while giving authorities a better chance at catching those responsible.
This is more sensible than other proposals as it looks to limit sharing of data to forensic data only. Then again, this is a proposal and, while all intentions are pure, it's a long way from a finished product. When the bill finally hits the legislative floor, it's very likely that this restrictive sharing will be loosened. Considering the panic that surrounds all things cyber-related -- especially once some enterprising do-gooder tosses the word "cyberterrorism" into the mix -- it's going to take a very dedicated and obstinate person to shepherd this through with most of these protections still intact.
And someone's still going to need to sell this additional layer of regulation to the companies it will affect -- many of whom have some pull in the upper reaches of the government. They're not exactly going to welcome the additional expense of implementing solid data security, even if they should have been on top of this since day one. The litigation safe harbor should make the pitch a bit more appealing, but again, it will take someone dedicated and tenacious to ensure the requirements aren't watered down into uselessness on its way to the governor's desk.