Insanity Rules: NSA Apologists Actually Think Apple Protecting You & Your Data Could Be 'Material Support' For ISIS

from the this-is-wrong dept

A few weeks ago, we pointed out that Senator Sheldon Whitehouse led the way with perhaps the most ridiculous statement of any Senator (and there were a lot of crazy statements) in the debate over encryption and the FBI's exaggerated fear of "going dark." He argued that if the police couldn't find a missing girl (in a hypothetical that not only made no sense, but was also entirely unlikely ever to happen), then perhaps Apple could face some civil liability for not letting the government spy on your data. Here's what he said:
It strikes me that one of the balances that we have in these circumstances, where a company may wish to privatize value -- by saying "gosh, we're secure now, we got a really good product, you're gonna love it" -- that's to their benefit. But for the family of the girl that disappeared in the van, that's a pretty big cost. And, when we see corporations privatizing value and socializing costs, so that other people have to bear the cost, one of the ways that we get back to that and try to put some balance into it, is through the civil courts. Through the liability system. If you're a polluter and you're dumping poisonous waste into the water rather than treating it properly somebody downstream can bring an action and can get damages for the harm they sustained, can get an order telling you to knock it off.
You can read our longer analysis of how wrong this is, but in short: encryption is not pollution. Pollution is a negative externality. Encryption is the opposite of that. It's a tool that better protects the public in the vast majority of cases. That's why Apple is making it so standard.

The suggestion was so ridiculous and so wrong that we were surprised that famed NSA apologist Ben Wittes of the Brookings Institution found Whitehouse's nonsensical rant "interesting" and worthy of consideration. While we disagree with Wittes on nearly everything, we thought that common sense would eventually reach him, leading him to recognize that absolutely nothing Whitehouse said made any sense (then again, this is the same Wittes who seems to have joined the magic unicorn/golden key brigade -- so I'm beginning to doubt my initial assessment that Wittes is well-informed but merely comes to bad conclusions).

However, even with Wittes finding Whitehouse's insane suggestion "interesting," it's still rather surprising to see him find it worthy of a multi-part detailed legal analysis for which he brought in a Harvard Law student, Zoe Bedell, to help. In the first analysis, they take a modified form of Whitehouse's hypothetical (since even they admit his version doesn't actually make sense), but still come to the conclusion that the company "could" face civil liability. Though, at least they admit plaintiffs would "not have an easy case."
The first challenge for plaintiffs will be to establish that Apple even had a duty, or an obligation, to take steps to prevent their products from being used in an attack in the first place. Plaintiffs might first argue that Apple actually already has a statutory duty to provide communications to government under a variety of laws. While Apple has no express statutory obligation to maintain the ability to provide decrypted information to the FBI, plaintiffs could argue that legal obligations it clearly does have would be meaningless if the communications remained encrypted.
To make this argument work, Bedell and Wittes try to read into various wiretapping and surveillance laws a duty to decrypt information on your mobile phone -- a duty that does not exist. If it did, we wouldn't be having this debate in the first place, and FBI Director James Comey wouldn't be talking to Congress about changing the law to require exactly that. Still, they hope that maybe, just maybe, a court would create such a duty out of thin air based on things like "the foreseeability of the harm." Except that argument falls flat on its face, because the likelihood of harm runs the other way: not encrypting your information leads to a much, much greater probability of harm than encrypting your data and not allowing law enforcement to see it.

Going to even more ridiculous levels than the "pollution" argument, this article compares Apple encrypting your data to the potential liability of the guy who taught the Columbine shooters how to use their guns:
For example, after the Columbine shooting, the parents of a victim sued the retailer who sold the shooters one of their shotguns and even taught the shooters how to saw down the gun’s barrel. In refusing to dismiss the case, the court stated that “[t]he intervening or superseding act of a third party, . . . including a third-party's intentionally tortious or criminal conduct[,] does not absolve a defendant from responsibility if the third-party's conduct is reasonably and generally foreseeable.” The facts were different here in some respects—the Columbine shooters were under-age, and notably, they bought their supplies in person, rather than online. Notably, two federal district courts in Colorado ended up selecting and applying two different standards for evaluating the defendants' duty.
But it's even more different than that. Even under this standard -- which many disagree with -- there still needs to be "conduct" that is "reasonably and generally foreseeable." And it is simply not "reasonably and generally foreseeable" that encrypting data puts people at greater risk. In all these years, the FBI still can't come up with a single example where such encryption was a real problem. It would be basically impossible to argue that this is a foreseeable "problem," especially when weighed against the very real and very present problem of people trying to hack into your device and get your data.

In the second post in the series, Bedell and Wittes go even further, looking at whether or not Apple could be found to have provided material support to terrorists thanks to encryption. If this sounds vaguely familiar, remember the similarly ridiculous claim not too long ago from a music industry lawyer and a DOJ official that YouTube and Twitter could be charged with material support for terrorism because ISIS used both platforms.

Bedell and Wittes concoct a scenario in which a court might find that providing a phone that can encrypt a terrorist's data opens the company up to liability:
In our scenario, a plaintiff might argue that the material support was either the provision of the cell phone itself, or the provision of the encrypted messaging services that are native on it. Thus, if a jury could find that providing terrorists with encrypted communications services is just asking for trouble, then plaintiffs would have satisfied the first element of the definition of international terrorism in § 2331, a necessary step for making a case for liability under § 2333.
Of course, this is wiped out pretty quickly because that law requires intent. The authors note that this would "pose a challenge" to any plaintiff "as it would appear to be difficult, if not impossible, to prove that Apple intended to intimidate civilians or threaten governments by selling someone an iPhone..."

You think?

But, our intrepid NSA apologists still dig deeper to see if they can come up with a legal theory that will actually work:

But again, courts have handled this question in ways that make it feasible for a plaintiff to succeed on this point against Apple. For example, when the judge presiding over the Arab Bank case considered and denied the bank’s motion to dismiss, he shifted the analysis of intimidation and coercion (as well as the question of the violent act and the broken criminal law) from the defendant in the case to the group receiving the assistance. The question for the jury was thus whether the bank was secondarily, rather than primarily, liable for the injuries. The issue was not whether Arab Bank was trying to intimidate civilians or threaten governments. It was whether Hamas was trying to do this, and whether Arab Bank was knowingly helping Hamas.

Judge Posner’s opinion in Boim takes a different route to the same result. Instead of requiring a demonstration of actual intent to coerce or intimidate civilians or a government, Judge Posner essentially permits the inference that when terrorist attacks are a “foreseeable consequence” of providing support, an organization or individual knowingly providing that support can be understood to have intended those consequences. Because Judge Posner concludes that Congress created an intentional tort, § 2333 in his reading requires the plaintiff to prove that the defendant knew it was supporting a terrorist or terrorist organization, or at least that it was deliberately indifferent to that fact. In other words, the terrorist attack must be a foreseeable consequence of the specific act of support, rather than just a general risk of providing a good or service.

But even under those standards, it's hard to see how Apple could possibly be liable for material support. It's just selling an iPhone and doing so in a way that -- for the vast majority of its customers -- is better protecting their privacy and data. It would take an extremely twisted mind and argument to turn that into somehow "knowingly" helping terrorists or creating a "foreseeable consequence." At least the authors admit that much.

But why stop there? They then say that Apple could still be liable after the government asks it to decrypt messages. If Apple doesn't magically stop the user in particular from encrypting messages, then, they claim, Apple could be shown to be "knowingly" supporting terrorism.
The trouble for Apple is that our story does not end with the sale of the phone to the person who turns out later to be an ISIS recruit. There is an intermediate step in the story, a step at which Apple’s knowledge dramatically increases, and its conduct arguably comes to look much more like that of someone who—as Posner explains—is recklessly indifferent to the consequences of his actions and thus carries liability for the foreseeable consequences of the aid he gives a bad guy.

That is the point at which the government serves Apple with a warrant—either a Title III warrant or a FISA warrant. In either case, the warrant is issued by a judge and puts Apple on notice that there is probable cause to believe the individual under investigation is engaged in criminal activity or activity of interest for national security reasons and is using Apple’s services and products to help further his aims. Apple, quite reasonably given its technical architecture, informs the FBI at this point that it cannot comply in any useful way with the warrant as to communications content. It can only provide the metadata associated with the communications. But it continues to provide service to the individual in question.
But all of this, once again, assumes an impossibility: that once the phone is out of its hands, Apple can somehow stop the end user from using the encryption on it.

This is the mother of all stretches in terms of legal theories. And, throughout it all, neither Bedell nor Wittes even seems to recognize that stronger encryption protects the end user. It's like it doesn't even enter their minds that there's a reason why Apple is providing encryption that isn't "to help people hide from the government." It's not about government snooping. It's about anyone snooping. The other cases they cite are not like that at all. These arguments, even as thin as they are, only make sense if Apple's move to encryption doesn't really have widespread value for basically the entire population. You don't sue Toyota for "material support for terrorism" just because a terrorist uses a Toyota to make a car bomb. Yet, Wittes and Bedell are somehow trying to make the argument that Apple is liable for better protecting you, just because in some instances it might also help "bad" people. That's a ridiculous legal theory that barely deserves to be laughed at, let alone a multi-part analysis of how it "might work."

Filed Under: ben wittes, encryption, isis, liability, material support, mobile encryption, pollution, sheldon whitehouse, terrorism, zoe bedell
Companies: apple


Reader Comments



  • Anonymous Coward, 6 Aug 2015 @ 8:47am

    This is how you know the NSA is clearly out of options with respect to encryption - they're trying to manipulate the law in whatever way possible to twist it into what they think they need.

    They WANT access to everything, after proving time and time again that if they have it, they will STILL fail to connect whatever dots they're talking about until after the fact.

  • Groaker (profile), 6 Aug 2015 @ 9:01am

    Data control

    Our government has absolutely no control over the data it holds, and thus feels that no one else should be able to hold on to their own data either.

  • Anonymous Coward, 6 Aug 2015 @ 9:08am

    Learned from the experts

    This reads exactly like what the MPAA planned against Google.

    https://www.techdirt.com/articles/20150724/15501631756/smoking-gun-mpaa-emails-reveal-plan-to-run-anti-google-smear-campaign-via-today-show-wsj.shtml

    "Following the media blitz, you want Bill Guidera and Rick Smotkin to work with the PR firm to identify a lawyer specializing in SEC matters to work with a stockholder. This lawyer should be able to the [sic] identify the appropriate regulatory filing to be made against Google."

  • SteveMB (profile), 6 Aug 2015 @ 9:13am

    it's still rather surprising to see him find it worthy of a multi-part detailed legal analysis for which he brought in a Harvard Law student, Zoe Bedell, to help

    If Wittes wants to turn his own name into point-and-laugh fodder, that's his business, but dragging in a student who is presumably trying to build a respectable intellectual reputation for herself is just plain evil.

    • James Burkhardt (profile), 6 Aug 2015 @ 10:11am

      Re:

      What really gets me is that he couldn't get an actual lawyer to sign off on this legal theory. He had to find a half-trained student to help him.

  • Anonymous Coward, 6 Aug 2015 @ 9:14am

    So I can sue the manufacturer of the door locks in my house if I accidentally lock myself out, because it's foreseeable that I might do so, and the locks actually work as intended without regard for who is on the other side of the door without the key?

    • Anonymous Coward, 6 Aug 2015 @ 9:56am

      Re:

      If "terrorists" were somehow involved in you getting locked out of your house, then yes, because anything terrorizing must be stopped at all costs.

  • me@me.net, 6 Aug 2015 @ 9:21am

    if you ever wondered who were the real enemies of the american way of life

    this showcases their posterchildren

  • Anonymous Coward, 6 Aug 2015 @ 9:21am

    As I understand it, both Apple and Android are turning existing encryption options on by default. Under the legal theories discussed here, wouldn't an individual be liable for turning on device encryption -- for instance, a parent enabling encryption on their child's phone? And how about turning off the appalling range of data collection that's on by default in Win10?

  • Wickedsmack (profile), 6 Aug 2015 @ 9:27am

    Maybe I'm just paying more attention

    It would seem to me that the government is full of representatives who are largely ignorant of modern technology. Sure, they may know how to work a Blackberry or check their email, but on the larger, more complex issues that are popping up, they seem woefully undereducated. The Government has proven time and again that it can't manage itself. Federal agencies are no better. If I buy a product from a company, that product and any data in it is mine. Not the police's, not the Government's, not the NSA's. Encryption is pretty much a requirement these days (though I am blown away by the companies that come out and say OMG it wasn't encrypted... and oh yeah, it was your financial and personal info... so basically your ID on a silver platter for theft). WHY CAN'T THEY SEE THAT?!? If they are our most tech-savvy agencies, why on Earth would they not understand the basics of why encryption is needed and why it should never, ever have any kind of back door? Oh, I'm sorry, are you actually going to have to do your homework and get warrants to access particular information based on evidence and credible suspicion? Sorry you have to do your job better...

  • Chris Brand, 6 Aug 2015 @ 9:30am

    "Reasonable and foreseeable risk"

    So we know that identity theft and the like happen all the time. By this argument, if my iPhone gets hacked and I suffer a loss, I can go after Apple for *not* encrypting the data, because they made it easier for the hackers.

    • Richard (profile), 6 Aug 2015 @ 12:39pm

      Re: "Reasonable and foreseeable risk"

      This is like suing a builder for fitting locks to a house because it stops the police from breaking down your door vs suing a builder for NOT fitting locks when you get burgled.

  • DannyB (profile), 6 Aug 2015 @ 9:46am

    Material Support

    Suppose someone invented a new kind of office safe. One that was indestructible and impenetrable.

    It would keep your stuff really safe! It would sell well. Banks and all kinds of other users would love it. It would be a true benefit to society.

    According to the government...

    Everyone using this new safe is now providing Material Support to ISIS and should be prosecuted accordingly.

    The inventor of this new safe should be hanged for treason.

    Safe makers should be smart enough to create a magical Golden Key embedded with pure ground unicorn horn particles. This golden key would open all safes -- because the government would mandate that safes use special locks that open when presented with the Golden Key.

    But this Golden Key does not compromise the security of everyone everywhere, because, Trust Us, we'll make sure nobody misuses it. And of course ISIS or the Chinese won't be able to create their own copy of the Golden Key that works just as well as the original at opening all safes everywhere.

  • Anonymous Coward, 6 Aug 2015 @ 9:51am

    So, don't buy apple?

  • Anonymous Coward, 6 Aug 2015 @ 9:55am

    culpability

    The Wikipedia article on culpability is instructive.

    The nursing home case from Hurricane Katrina is a good example of the culpability issue. The nursing home owners called the Governor as a witness and asked her what caused the flood; the Governor said it was because the levee broke. The State was trying to blame the Manganos for the deaths of the residents by blaming them for the levee breaking.

    "Proximate cause" is another factor.

  • Anonymous Coward, 6 Aug 2015 @ 9:56am

    So basically, our parents should not allow us to breathe because there's a possibility that we might use our bodies to support groups our government has labeled as terrorists?

    Whatever it is that they're smoking, I gotta get me some of that.

    • LOL, 6 Aug 2015 @ 10:28pm

      Response to: Anonymous Coward on Aug 6th, 2015 @ 9:56am

      Nope. You don't want what they're smoking. It appears to turn people into idiots. I think the damage is irreversible too.

  • Sheogorath (profile), 6 Aug 2015 @ 10:11am

    But for the family of the girl that disappeared in the van, it's pretty galling to learn that the crime could easily have been prevented simply by placing a salt cellar with a bug in it on the diner table between the conspirators instead of chasing the boogeyman of encryption. Because that's how kidnappers tend to plan their crimes, Senator Whitehouse, FACE TO FACE. *facepalms*

  • Ambrellite, 6 Aug 2015 @ 10:15am

    I'm tired of this stupidity

    A pillow could be used to asphyxiate somebody. Shoes could help a criminal run away. Underwear could conceal explosives. Sunglasses can help a suspect hide from police. Even something as innocuous as yogurt can be used to deliver deadly poison, or to hide narcotics.

    Can these fools get it through their heads that things are often useful to criminals simply because they're USEFUL??

  • sporkie (profile), 6 Aug 2015 @ 10:58am

    "In all these years, the FBI still can't come up with a single example where such encryption was a real problem."

    But can they think of 21 million+ recent examples of where the lack of encryption was an issue?

  • Anonymous Coward, 6 Aug 2015 @ 11:07am

    Toyota analogy

    A better car analogy: because Toyota makes cars with air bags and other features that protect a driver in a crash, Toyota is providing material support to terrorists by allowing them to survive accidents and thus perpetrate future terrorist acts.

    Using that analogy, it becomes clear that assigning liability to Apple is outrageously wrong.

    • Anonymous Coward, 6 Aug 2015 @ 11:18am

      Re: Toyota analogy

      Exactly, just like how David Coleman Headley used GPS signals to find his landing spot to launch his Mumbai attack. Since GPS satellites are run by the US government, that would be material support as well.

      • Vikarti Anatra (profile), 7 Aug 2015 @ 3:00am

        Re: Re: Toyota analogy

        >Exactly, just like how David Coleman Headley used GPS signals to find his landing spot to launch his Mumbai attack. Since GPS satellites are run by the US government, that would be material support as well.
        Are you really sure he used only GPS NavStar (the system run by the US government) signals and not GLONASS (the system run by the Russian government)?

  • Capt ICE Enforcer, 6 Aug 2015 @ 11:10am

    TOS

    Well, crap. According to Apple's terms of service agreement, individuals don't own their phones but instead license them. So shouldn't Apple be the primary culprit for all the bad things in this world, instead of the consumer? People, after all, shouldn't be held accountable for their actions.

  • Anonymous Coward, 6 Aug 2015 @ 11:20am

    Why does communicating via packets change my expectation and right to privacy? If Masnick and I want a private conversation, we can do as we always do: meet in an Arby's parking lot at 2am and talk (and talk and talk...). Why then, if we decide to use a device to communicate, should every government on the planet suddenly have a right to access to that conversation?

  • Anonymous Coward, 6 Aug 2015 @ 11:22am

    > In all these years, the FBI still can't come up with a single example where such encryption was a real problem

    Ugh... I hate this argument. What if they were able to give 10 good examples where encryption was a real problem with terrible consequences? Would you suddenly be willing to accept compromised encryption?

    • Sheogorath (profile), 6 Aug 2015 @ 11:56am

      Re:

      Absolutely not. Let 'em get warrants and go after the individual devices. You know, the same way they have to if they want to look through someone's snail mail.

      • Anonymous Coward, 6 Aug 2015 @ 12:05pm

        Re: Re:

        Encryption isn't like mail. A warrant won't help you get access to it.

        • afn29129 (profile), 6 Aug 2015 @ 12:50pm

          Snail-mail vs Email

          Snail mail can be just as encrypted as email. I could either email a message or put it in an envelope and have the post office deliver it.

          W9zAzuTpaI1Nipu2Edm4sbZUt1C8NGjat4QvWdXB
          2bHVnEUttu0pPELbjkKpXn35RVNuU01XeOJ2PKxx
          qnkzTCtbs60b rJdxpi1O64qsRuFLYdCkDEsBoMrC
          68ykVFkPjuSlhuhFsjHmcMP2q3RtHjxxopv1vZY5
          W4z4ZGJbcplbyg8lcBi0CZG7K3NToEuw AwxKmJ6p
          BSvixYCODfjtYRI99eRR6kXAs7HF4Lh0s6E9IXZ1
          MU3Xl9tDPhNYh1OthrD03Lqv8MqXOHuttDvL98d4
          StrZxskbQr9q vbkCbAZtLh58mQWnM4fi32iljFIg
          feztj6vjN9POJKIPaLstOHW3IZ2kupb8pa6saRC3
          0owY0msP6PWhqkWlKRhnYyEVeQ65XNg1 IkBAODyn

          In each case, all that would show is the metadata (to, from, etc.).

          • Sheogorath (profile), 6 Aug 2015 @ 2:49pm

            Re: Snail-mail vs Email

            Well, if you think about encryption as being the envelope an email is sent in, and then realise that the Feds (or whoever) have to present a properly drawn up warrant to open that envelope when it's physical...

        • Sheogorath (profile), 6 Aug 2015 @ 2:45pm

          Re: Re: Re:

          Subpoena, then. Either way, a legal mechanism already exists for getting the key off either the sender or the recipient, so let the authorities make use of that.

  • Anonymous Coward, 6 Aug 2015 @ 11:39am

    Through the liability system. If you're a polluter and you're dumping poisonous waste into the water rather than treating it properly somebody downstream can bring an action and can get damages for the harm they sustained, can get an order telling you to knock it off.



    Unless you live downstream from a fracking operation and the group that's supposed to be looking out for your well-being, and that of the environment, is being paid off or ordered to turn a blind eye or lie.

  • Seegras (profile), 6 Aug 2015 @ 11:43am

    Poisoning the water supply

    Imagine that there is this outfit putting tracers into the water supply. So they can find people wasting water or something.

    Now they're arguing that, since water filters filter out not only poisonous and disease-causing substances but also their tracers, the water filter companies "may be liable"?

  • Anonymous Coward, 6 Aug 2015 @ 11:59am

    ...when weighed against the very real and very present problem of people trying to hack into your device and get your data.


    This is the thing never taken into consideration during all these witch hunts against privacy apps. The government itself recommends data encryption, and if it had been following its own advice, breaches like the hack of the Obamacare medical databank and the theft of security-clearance questionnaires would never have had such an impact. Instead, the Obamacare raid has opened people up to identity theft, and the clearance breach has handed China a roadmap for compromising those holding security classifications, all through the lack of protection.

    The aim of the spying is contrary to the needs of the public at large. In the process of preventing what they like to call "going dark," they have opened the door for hackers at large to come in and take what they wish through compromised programs, databases, and computers.

    Not a good trade off.

  • Anonymous Coward, 6 Aug 2015 @ 12:05pm

    NSA apologists? You're looking at the article wrong.

    ... This is the mother of all stretches in terms of legal theories.

    And that, good sirs, was (in my view) the point of the article.

    In other words, you're complaining of these legal researchers being NSA apologists, when they are conducting a thought exercise on the question "Could the company be held liable, given the current state of laws and precedents?"

    You don't complain about white hat hackers. In fact, you laud them, and call out the unfairness of prosecuting them. And yet you don't extend the analogy to white hat legal researchers, thinking about what arguments you would have to defend against in a court of law. Why not?

  • Anonymous Coward, 6 Aug 2015 @ 12:19pm

    "...the material support was either the provision of the cell phone itself..."

    Illegal for 'Merikun companies to make phones.

  • Personanongrata, 6 Aug 2015 @ 12:26pm

    Banana Republic

    If Apple doesn't magically stop the user in particular from encrypting messages, then, they claim, Apple could be shown to be "knowingly" supporting terrorism.

    Would it be better for American citizens if Apple (et al.) were forced to collaborate with a clearly criminal government that at every opportunity seeks to function in secrecy while circumventing or outright ignoring the US Constitution, the supposed supreme law of the land, and replace it with an arbitrary and politically expedient manner of governing called tyranny?

    • Anonymous Coward, 6 Aug 2015 @ 12:43pm

      Re: Banana Republic

      It's Obamana Republic.

      • Groaker (profile), 6 Aug 2015 @ 1:11pm

        Re: Re: Banana Republic

        It would make me ecstatic to believe that Obama was the responsible individual. That this went back only as far as his administration.

        But as an example, the NY Times secretly knew, at least a year before Bush 43's second election, that the Bush administration was collecting information in bulk without warrants. And there is good reason to believe it went back at least to Clinton's time, and likely before.

  • Stan (profile), 6 Aug 2015 @ 1:31pm

    "..prevent their products..."

    "The first challenge for plaintiffs will be to establish that Apple even had a duty, or an obligation, to take steps to prevent their products from being used in an attack in the first place. "

    Hey Smith & Wesson, Glock, Winchester, et al.

    Are you paying attention?

  • Matthew Cline (profile), 6 Aug 2015 @ 1:54pm

    Why Apple?

    Why concentrate on a company like Apple rather than, say, software like PGP? Is it just because everyone will recognize Apple, or is it something else?

    • John Fenderson (profile), 6 Aug 2015 @ 5:09pm

      Re: Why Apple?

      It's because Apple and Google are making the encryption automatic.

      • Groaker (profile), 6 Aug 2015 @ 5:40pm

        Re: Re: Why Apple?

        There are many automated encryption systems that are on or available 24/7 for data and voice. Apple and Google just make convenient targets because they are well known, both to the population and to the tyrants who would sway it.

      • Anonymous Coward, 7 Aug 2015 @ 1:13am

        Re: Re: Why Apple?

        Because these companies could control a large part of the encryption people use, and, unlike Microsoft, have not arranged to make the keys available to the government on demand.
        /conspiracy theory

  • Steve, 6 Aug 2015 @ 3:39pm

    The legal constructs for endless, baseless war and the war on terror are equally tenuous. The government will just keep pushing on encryption law until it manages to fabricate some legal fiction, which it will then keep secret.
    The whole surveillance state is being built to put in place the system of control over the general population that the (real) government fears will be necessary very shortly, when the shit hits the fan.

  • Anonymous Coward, 6 Aug 2015 @ 7:01pm

    I would be more willing to swallow that if they were not openly supporting ISIS on one hand while saying anyone who supports them is a terrorist on the other.

    Or: "do as I say, not as I do."

  • Justme, 6 Aug 2015 @ 10:19pm

    One Simple Question.

    One question for advocates of a government backdoor.

    Would you also be comfortable having any encryption the government uses contain the same backdoor mechanism?

    You can't separate national security from the security of its citizens! A weakness in one is a weakness in both.

  • Coyne Tibbets (profile), 6 Aug 2015 @ 11:14pm

    The short, short version

    You are with the NSA or you are a terrorist.

  • Pretzel Logic, 6 Aug 2015 @ 11:17pm

    If child molesters go to prison...

    then why not encryption providers?

  • Marc, 7 Aug 2015 @ 2:27am

    If this should hold, then

    All firearm manufacturers are liable for illegal shootings (they have not actively adapted their products to stop illegal use).

    All baseball bat manufacturers are liable as well for misuse.

    Hmm, banks are liable for aiding fraud.

    Government is liable for not preventing crime? Can I sue them?

    Why does no one address such idiotic ideas with fitting idiotic questions?

  • Anonymous Coward, 18 Aug 2015 @ 9:09am

    The student should drop out and start writing comedy.


