Autonomous Bot Seized For Illegal Purchases: Who's Liable When A Bot Breaks The Law?

from the get-those-lawyers-ready dept

If you program a bot to autonomously buy things online, and some of those things turn out to be illegal, who's liable? We may be about to have the first such test case in Switzerland, after an autonomous buying bot was "seized" by law enforcement.

Two years ago, we wrote about the coming legal questions concerning liability and autonomous vehicles. Those vehicles are going to have some accidents (though likely fewer than human-driven cars), and then there are all sorts of questions about who is liable. Or what if they speed? Who gets the ticket? There are a lot of legal questions raised by autonomous vehicles. But, of course, it's not just autonomous vehicles raising these questions. With high-frequency trading taking over Wall Street, who is responsible if an algorithm goes haywire?

This question was raised in a slightly different context last month when some London-based Swiss artists, !Mediengruppe Bitnik, presented an exhibition in Zurich of The Darknet: From Memes to Onionland. Specifically, they had programmed a bot with some Bitcoin to randomly buy $100 worth of things each week via a darknet market, like Silk Road (in this case, it was actually Agora). The artists' focus was more about the nature of dark markets, and whether or not it makes sense to make them illegal:
The pair see parallels between copyright law and drug laws: “You can enforce laws, but what does that mean for society? Trading is something people have always done without regulation, but today it is regulated,” says Weiskopff.

“There have always been darkmarkets in cities, online or offline. These questions need to be explored. But what systems do we have to explore them in? Post Snowden, space for free-thinking online has become limited, and offline is not a lot better.”
But the effort also had some interesting findings, including that the dark markets were fairly reliable:
“The markets copied procedures from Amazon and eBay – their rating and feedback system is so interesting,” adds Smojlo. “With such simple tools you can gain trust. The service level was impressive – we had 12 items and everything arrived.”

“There has been no scam, no rip-off, nothing,” says Weiskopff. “One guy could not deliver a handbag the bot ordered, but he then returned the bitcoins to us.”
But, still, the much more interesting question is about liability in this situation. The Guardian reporter who wrote about this in December spoke to law enforcement in the UK, whose National Crime Agency called the situation "very unusual":
A spokesman for the National Crime Agency, which incorporates the National Cyber Crime Unit, was less philosophical, acknowledging that the question of criminal culpability in the case of a randomised software agent making a purchase of an illegal drug was “very unusual”.

“If the purchase is made in Switzerland, then it’s of course potentially subject to Swiss law, on which we couldn’t comment,” said the NCA. “In the UK, it’s obviously illegal to purchase a prohibited drug (such as ecstasy), but any criminal liability would need to assessed on a case-by-case basis.”
Apparently, that assessment has concluded in this case, because right after the exhibit closed in Switzerland, law enforcement showed up to seize stuff:
On the morning of January 12, the day after the three-month exhibition was closed, the public prosecutor's office of St. Gallen seized and sealed our work. It seems, the purpose of the confiscation is to impede an endangerment of third parties through the drugs exhibited by destroying them. This is what we know at present. We believe that the confiscation is an unjustified intervention into freedom of art. We'd also like to thank Kunst Halle St. Gallen for their ongoing support and the wonderful collaboration. Furthermore, we are convinced, that it is an objective of art to shed light on the fringes of society and to pose fundamental contemporary questions.
It appears possible that, in this case, law enforcement was just looking to seize and destroy the contraband products that were purchased by the bot, and may not then seek further prosecution, but it still does raise some interesting questions. I'm not sure I buy the "unjustified intervention in the freedom of art" argument (though that reminds me of another, unrelated story, of former MIT lecturer Joseph Gibbons, who was recently arrested for robbing banks, but who is arguing that it was all part of an "art project").

Still, these legal questions are not going away and are only going to become more and more pressing as more and more autonomous systems start popping up in different areas of our lives. The number of different court battles, jurisdictional arguments and fights over who's really liable are likely to be very, very messy -- but absolutely fascinating.

Reader Comments



  • Ninja (profile), 23 Jan 2015 @ 8:13am

    One way would be to allow a whole host of things that are widely traded in these underground markets, with proper restrictions and regulations. Think marijuana: everybody uses it, but it's still illegal, when it could be made legal with limitations and actually bring money to the government for investment in areas in need.

    As for the bot, clearly nobody is at fault. The acquisitions were clearly random, even if the bot ended up getting something illegal, AND it was an experiment (nobody used the illegal stuff). So I'd say the seizure was disproportionate, even if I agree that law enforcement should have checked what was going on.


    • Beech, 23 Jan 2015 @ 8:23am

      Response to: Ninja on Jan 23rd, 2015 @ 8:13am

      > clearly nobody is at fault.

      You'd like to think that, wouldn't you! Clearly this can only be solved by the "War on Bots". Uncle Sam will require billions of dollars worth of military hardware bought from crony capitalists. Every computer will need to be fitted with a keylogger and uniquely identifiable microchip. Encryption will need to be banned. Warrantless no-knock raids on any place suspected to harbor a bot. Millions will need to be jailed in for-profit prisons. You may be ready to surrender to the autonomous horror, Mr. Ninja, but some of us are willing to do what it takes to protect our profits... peace. I meant peace.


    • Anonymous Coward, 23 Jan 2015 @ 8:31am

      Re:

      > The acquisitions were clearly random even if it ended up getting something illegal

      One could argue that the *sellers* were the ones at fault, for putting the stuff up where the robot would stumble upon it.


      • Ninja (profile), 23 Jan 2015 @ 9:27am

        Re: Re:

        Indeed. But I still maintain that if there's a wide availability of something that is supposedly illegal it means there is wide demand. So is it really the answer to make it illegal?


        • Pragmatic, 26 Jan 2015 @ 5:24am

          Re: Re: Re:

          What Ninja said. Taking an authoritarian approach to a demand-side issue is like bailing out a sinking ship with a teaspoon.


  • Anonymous Coward, 23 Jan 2015 @ 8:30am

    They waited until the end of the exhibition

    The best part is: they waited until the three-month exhibition had ended before seizing everything. They didn't want to interrupt the art piece.

    There's also a matter of context: until the exhibition ended, it was a work of art. After it ended, it's just a bunch of drugs and other stuff stored somewhere.


  • Anonymous Coward, 23 Jan 2015 @ 8:39am

    Some of This Happens All the Time

    The owners aren't responsible. After all, corporations break the law (or write them to their advantage) all the time. They are machines, not people, but nobody holds their owners criminally liable.


  • Ragnarredbeard, 23 Jan 2015 @ 8:44am

    How was the bot "randomly" buying stuff? There is no such thing as a random number generator as far as I know.


    • Anonymous Coward, 23 Jan 2015 @ 8:49am

      Re:

      > There is no such thing as a random number generator as far as I know.

      Yes, there is. Quantum decay, avalanche diode noise, etc. All modern operating systems mix together several sources of random numbers, and make the result available to applications.
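      The mixing described here is exposed to applications through interfaces like /dev/urandom; in Python, a minimal sketch of drawing from the OS entropy pool (nothing here is from the actual bot's code) looks like this:

```python
import secrets

# Read 16 bytes from the operating system's entropy pool, which mixes
# sources such as interrupt timings and hardware RNGs where available.
token = secrets.token_bytes(16)
print(len(token))  # 16

# A cryptographically strong random selection from a list is one line:
item = secrets.choice(["alpha", "beta", "gamma"])
print(item in {"alpha", "beta", "gamma"})  # True
```

      The `secrets` module exists precisely because the default `random` module is a deterministic generator, which is the distinction the rest of this thread argues about.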


      • John Fenderson (profile), 23 Jan 2015 @ 8:51am

        Re: Re:

        It is possible to generate random numbers in the broad sense, yes. But unless you have installed special equipment, your computer cannot actually generate them.


        • Anonymous Coward, 23 Jan 2015 @ 9:01am

          Re: Re: Re:

          > But unless you have installed special equipment, your computer cannot actually generate them.

          A lot of recent Intel processors have that special equipment directly on the CPU (it's used by the RDRAND and RDSEED instructions). If you have a TPM (and it's enabled), it also has that special equipment. So, not that uncommon.


          • John Fenderson (profile), 23 Jan 2015 @ 9:33am

            Re: Re: Re: Re:

            Those Intel processors don't count -- that "special equipment" does not produce actual random numbers. What it really does is produce very high quality pseudo-random numbers. High enough quality to be suitable for most cryptographic purposes, to be sure, but still not actually random.


            • Anonymous Coward, 23 Jan 2015 @ 9:46am

              Re: Re: Re: Re: Re:

              Hamburg, Kocher and Marson (2012), p.7:
              2.2 Entropy source

               . . .  The circuit then resolves to one of two possible states, determined randomly by thermal noise in the system. . . .


              • John Fenderson (profile), 23 Jan 2015 @ 10:00am

                Re: Re: Re: Re: Re: Re:

                Yes, I'm familiar with the circuitry. Thermal noise is a very good entropy source. Better than the usual sources. But it is not entirely random.


                • John Fenderson (profile), 23 Jan 2015 @ 10:10am

                  Re: Re: Re: Re: Re: Re: Re:

                  I can see that I'll have to expand further.

                  First, using thermal noise as the entropy source is extremely good. However, the thermal noise present in the system is affected by whatever operations the computer is performing, as well as the physical environment it is operating in. A true source of entropy is not affected by any environmental or operating conditions. For most purposes, it's certainly close enough -- but not for all, particularly when it comes to cryptography. There are a number of instances where mathematically strong crypto (or even unbreakable crypto such as OTPs) has been compromised because the random numbers were only nearly random.

                  Second, the entropy source is not directly the source of the random numbers used. It is used to seed a random number generator -- which, again, does not actually generate random numbers. Even if the entropy source were perfectly random, using it to seed a computational random number generator means that you're not actually getting random numbers in the end.
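                  The entropy-source-versus-generator distinction drawn here is visible in Python's standard library, where `random.Random` is a deterministic Mersenne Twister and `random.SystemRandom` reads the OS entropy pool on every call. A small sketch:

```python
import os
import random

# Seed a deterministic PRNG (a Mersenne Twister) from the OS entropy
# source. The seed is unpredictable, but every subsequent output is a
# pure function of that seed -- give two generators the same seed and
# they produce identical streams.
seed = os.urandom(8)
a = random.Random(seed)
b = random.Random(seed)
print([a.random() for _ in range(5)] ==
      [b.random() for _ in range(5)])  # True

# random.SystemRandom skips the deterministic generator and reads the
# OS entropy pool directly for each call.
sr = random.SystemRandom()
print(0.0 <= sr.random() < 1.0)  # True
```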


                  • Anonymous Coward, 23 Jan 2015 @ 10:30am

                    Re: Re: Re: Re: Re: Re: Re: Re:

                    > Even if the entropy source were perfectly random, using it to seed a computational random number generator means that you're not actually getting random numbers in the end.

                    Consider a circuit which takes the output of a Bernoulli process and chains it into an inverter.

                    Is the output of that circuit, from the inverter, a Bernoulli process?
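                    The rhetorical question above can be checked empirically: deterministic post-processing (the inverter) of a Bernoulli source leaves the distribution intact. A quick simulation, with a fixed seed so the demo is repeatable:

```python
import random

def bernoulli_stream(p, n, rng):
    """n independent Bernoulli(p) bits drawn from rng."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def invert(bits):
    """The 'inverter': flips every bit -- a fully deterministic step."""
    return [1 - b for b in bits]

rng = random.Random(42)  # fixed seed for repeatability
bits = bernoulli_stream(0.5, 10_000, rng)
flipped = invert(bits)

# The inverted stream is still Bernoulli(0.5): same distribution,
# just bit-for-bit complemented.
print(abs(sum(bits) / len(bits) - 0.5) < 0.02)        # True
print(abs(sum(flipped) / len(flipped) - 0.5) < 0.02)  # True
```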


                  • Anonymous Coward, 26 Jan 2015 @ 11:47am

                    Re: Re: Re: Re: Re: Re: Re: Re:

                    > A true source of entropy is not affected by any environmental or operating conditions.

                    Herr Professor Doktor Fenderson,

                    Do there exist stochastic processes which are non-stationary in nature?


    • John Fenderson (profile), 23 Jan 2015 @ 8:50am

      Re:

      Call it pseudo-random, then. It's close enough to random for this sort of application.


    • Anonymous Coward, 23 Jan 2015 @ 8:53am

      Re:

      There are pseudo-random generators in various programming languages and platforms. They're close enough to actual random generation for practical purposes. Saying the purchases were random is to say that the criteria the bot used to determine what to purchase were not easily predictable by a human being without extensive study and knowledge of what items would be available, in what quantities, and at what purchase price. So to a human observer it would seem random enough to match that description.
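      As a concrete (and entirely hypothetical) sketch of what such a selection step might look like -- the catalog, prices, and function name here are invented for illustration, not taken from the actual Random Darknet Shopper code:

```python
import random

# Hypothetical listing data: (name, price in dollars).
CATALOG = [("sneakers", 60), ("novel", 12), ("cigarettes", 25),
           ("scanned passport", 90), ("baseball cap", 18)]
BUDGET = 100  # the bot's reported weekly spend

def pick_purchase(catalog, budget, rng=random):
    """Pick one affordable listing at (pseudo-)random, or None."""
    affordable = [item for item in catalog if item[1] <= budget]
    return rng.choice(affordable) if affordable else None

name, price = pick_purchase(CATALOG, BUDGET)
print(price <= BUDGET)  # True
```

      To the observer the result is unpredictable, even though every step of the selection is ordinary deterministic code driven by a pseudo-random choice.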


    • Anonymous Coward, 23 Jan 2015 @ 8:58am

      TRNGs [was Re: ]

      > There is no such thing as a random number generator as far as I know.

      HotBits: Genuine random numbers, generated by radioactive decay
       . . . HotBits is an Internet resource that brings genuine random numbers, generated by a process fundamentally governed by the inherent uncertainty in the quantum mechanical laws of nature, directly to your computer in a variety of forms. HotBits are generated by timing successive pairs of radioactive decays detected by a Geiger-Müller tube interfaced to a computer.  . . .


      Random.org: True Random Number Service
       . . . RANDOM.ORG offers true random numbers to anyone on the Internet. The randomness comes from atmospheric noise . . .


  • Anonymous Coward, 23 Jan 2015 @ 8:51am

    Crap, it's the fuzz! Autobots, rollout!


  • Anonymous Coward, 23 Jan 2015 @ 8:53am

    Who's liable and responsible for the actions of autonomous systems? I would say either the manufacturer or the owner of such a system is liable, depending on the circumstances.

    Let's take firearms as a good example. Non-autonomous, traditional firearms require a human to pull the trigger. That's why gun manufacturers can't be held responsible for what the owner of a firearm does with it.

    Now let's say the firearm is fully autonomous, seeking out and finding targets on its own. I would still argue the owner of the firearm, not the manufacturer, is responsible for what that autonomous system does. The owner is the one who sets up the autonomous system in an area and programs the targeting parameters for the system to seek out.

    The only way a manufacturer could be held responsible for an autonomous system's use, is if there's a verifiable software bug in the systems they're selling. Take the Chevy ignition switch defect as an example. Cars were randomly shutting down on drivers due to a design defect in Chevy's ignition system. Hence, in this case the manufacturer is responsible.

    So coming back to the autonomous black-market bot: in my opinion, the artists are responsible for the autonomous system because they programmed in the parameters for the autonomous system to execute. They could, for example, have programmed the bot to avoid keywords such as 'drug' and 'ecstasy', but they chose not to, as both the owner and manufacturer of the autonomous system.
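    A filter of the kind described here is only a few lines. The blocklist terms below are hypothetical, and such a list would inevitably miss euphemisms and misspellings:

```python
# Hypothetical keyword blocklist the bot's operators could have used to
# screen listings before purchasing. Illustrative only -- the real bot
# had no such filter.
BLOCKLIST = {"drug", "ecstasy", "weapon"}

def is_allowed(listing_title):
    """True if no blocklisted word appears in the listing title."""
    words = listing_title.lower().split()
    return not any(term in words for term in BLOCKLIST)

print(is_allowed("vintage baseball cap"))  # True
print(is_allowed("200mg ecstasy pills"))   # False
```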


    • B, 23 Jan 2015 @ 8:56am

      Re:


      • Baron von Robber, 23 Jan 2015 @ 8:58am

        Re: Re:

        Gah, dang keyboard...

        But even if the programmers did put in an exclude list, they could have missed something that was illegal someplace and the bot bought it.

        That's why intent is so important in the US legal system... Sorry, WAS so important. Doesn't seem that way anymore.


        • Anonymous Coward, 23 Jan 2015 @ 9:31am

          Re: Re: Re:

          But as far as intent goes, they purposely went to a dark market and not, say, eBay. They had to know there was a high likelihood of getting contraband eventually.

          And once they received the contraband, they intentionally kept it instead of destroying it or turning it over to police - so they'd still be guilty of possession even if they weren't guilty of purchasing.


    • Anonymous Coward, 23 Jan 2015 @ 9:41am

      Re:

      I don't think it matters much to liability whether it's a "smart" automated system or a "dumb" one.

      If you set up a shotgun booby trap on the door of your business and it kills a fireman trying to do his job, negligent homicide is probably the least you'd be charged with. Making it a smarter system changes nothing.


      • Anonymous Coward, 23 Jan 2015 @ 10:09am

        Re: Re:

        Of course, your example was intended to be a booby trap. If instead you made the system mix two random chemicals which may or may not explode depending on what was selected, I don't think you'd be innocent, though.


    • Anonymous Coward, 23 Jan 2015 @ 10:46am

      Re:

      Analogy fail

      Owner of autonomous killing machine knowingly set loose this device on unsuspecting people. The owner is responsible for causing murder.

      Programming a bot to anonymously purchase random things is not harming anyone. The people doing wrong are the ones offering to sell (Intent to commit illegal act) and then actually selling and shipping the item. (Committed illegal act)


      • Anonymous Coward, 23 Jan 2015 @ 11:00am

        Re: Re:

        > Programming a bot to anonymously purchase random things is not harming anyone

        Funding a criminal enterprise could be considered harm.


      • Anonymous Coward, 23 Jan 2015 @ 11:05am

        Re: Re:

        > The people doing wrong are the ones offering to sell (Intent to commit illegal act) and then actually selling and shipping the item. (Committed illegal act)

        But how do you know the seller doesn't have their own bot to purchase random items and resell them for a profit?


  • David, 23 Jan 2015 @ 8:57am

    One missing detail

    Regarding "freedom of art": the Swiss official responsible for the seizure stated that if the artists wanted to exhibit this piece of performance art again, they should simply replace the now-unavailable case of drugs with the seizure order itself, since it too was a proper fruit of their bot's purchases.

    I think he had an excellent point there.


    • Anonymous Coward, 23 Jan 2015 @ 10:49am

      Re: One missing detail

      This also assists the police since they now have physical evidence they can use to potentially prosecute the senders.

      Taxpayers win too: no government monies were used to purchase the evidence to crack down on these sorts of illegal sales.


  • MedicalQuack, 23 Jan 2015 @ 9:06am

    Same Question with Markets Coming About too

    This is a good topic, as there's a case coming up this year on an algorithm that went rogue in the markets too. So yeah, who's to blame when an algorithm goes out of control? It's a reality that we have to face, as it does happen, and yes, where does the responsibility lie? Very good questions indeed.

    I'm sure you may have seen the video, but Kevin Slavin gave a presentation on this topic covering the book on Amazon with a rogue algorithm that, with no human interaction at all, kept pricing itself higher and higher, until a boring little book on flies cost over a million dollars to purchase... oh, those rogues.

    http://ducknetweb.blogspot.com/2011/07/how-algorithms-shape-our-worldted-talks.html
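    The runaway "fly book" pricing Slavin describes came from two sellers' repricing algorithms feeding on each other: the widely reported multipliers were roughly 0.9983x (one seller undercutting the rival) and 1.270589x (the other marking up over the rival). Their product exceeds 1, so the loop diverges. A sketch of that feedback loop, using those reported figures:

```python
# Two repricers reacting to each other. Neither rule is crazy on its
# own; together the combined multiplier per round is
# 1.270589 * 0.9983 ~= 1.268, so prices grow exponentially.
price_a, price_b = 35.00, 35.00  # plausible starting book prices
for day in range(45):
    price_a = 1.270589 * price_b  # seller A marks up over B
    price_b = 0.9983 * price_a    # seller B undercuts A

print(price_b > 1_000_000)  # True: past a million dollars
```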

    Why do you think we have circuit breakers on the stock exchanges? It's for the same reason: to stop all trading when the algos and their little bot brains get scrambled and nothing is making sense, or trades are being done with rogue activity and numbers that are not true. This is a big deal, as we are talking big money now.


  • Anonymous Coward, 23 Jan 2015 @ 9:42am

    Classic

    Turing police in Switzerland. Now all we need is some black ICE and coffin hotels.


  • DannyB (profile), 23 Jan 2015 @ 9:51am

    It should be illegal !

    It should be illegal for lawbreakers to break the law!

    (Only the rich and powerful should be allowed to break the law.)


  • jilocasin (profile), 23 Jan 2015 @ 10:15am

    So, just how do you go about seizing a 'bot'?

    According to the title, the _bot_ was seized. According to the text of the article, the things the bot _bought_ were seized.

    Since this _bot_ wasn't actually a physical _ro_bot (like say R2D2), but a bit of software, what _can_ the cops actually _seize_?

    The computer(s) that it's currently running in?
    An SD card (or USB stick) that holds a copy of the software?
    A hard copy print out of the source code?

    What if it's running on many computers? Perhaps there are many copies, or maybe it's some kind of hive mind.

    If you _turn_off_ the computer (or all of them) that it's running on, wouldn't that be the equivalent of capital punishment? Sounds kind of extreme for making a few drug purchases.

    Perhaps you turn off all the computers except one and then take _that_ one into 'custody'?

    Do we take into account the intellectual sophistication of the _bot_? We don't prosecute a mentally challenged criminal (or a small child) as harshly as a competent adult criminal do we? Will we now need 'competency hearings' when we arrest a _bot_?

    If your adult child commits a crime, we don't arrest their parents. So why would it be O.K. to arrest the programmers of a criminal _bot_?

    There's a long way to go from there to HAL (or SkyNet).


    • Anonymous Coward, 23 Jan 2015 @ 10:57am

      Re: So, just how do you go about seizing a 'bot'?

      > If your adult child commits a crime, we don't arrest their parents. So why would it be O.K. to arrest the programmers of a criminal _bot_?

      But if you release your dog into your neighbor's henhouse, you would likely be criminally responsible if he kills the chickens.


  • John Fenderson (profile), 23 Jan 2015 @ 10:17am

    My point of view

    The artist is liable for the actions of the bot in this case. I think this because the bot was designed to purchase random things from a black market. The artist was certainly aware that in doing so, the odds are very, very high that prohibited items would be purchased. Since the effects of the bot were obvious prior to deployment, the one who deployed it is responsible.

    It would be a different situation if the bot were limited to legal marketplaces. In that case, if the bot ended up buying something illegal, the person who deployed it should not be liable, as that outcome would not be one that would be so easily predicted.


  • GeorgeQGreg (profile), 23 Jan 2015 @ 10:51am

    Did this make anyone else think of this XKCD?

    http://xkcd.com/576/


  • Anonymous Coward, 23 Jan 2015 @ 11:37am

    Bots are still in their infancy

    Since bots are still in their infancy I don't think they can be charged. Courts don't usually charge minors. :)


  • Sniper_X, 23 Jan 2015 @ 1:36pm

    WHY IS THIS A QUESTION?!

    In all cases, it is the person who was responsible for putting the automated device/bot into service.

    Why is this a question?!!!

    If we replace people with automation, the person who decided to automate that process or task is still responsible.

    Are we so ignorant that we think that just because this new thing exists, that all logic about responsibility VANISHES?

    To be clear, there are NO CASES where we should question who is responsible.

    NO CASES where a person shouldn't be held accountable for their actions.

    The owner of the self-driving car is responsible for its actions!

    The owner of a bot is responsible for its actions.

    This question has been answered many times before.

    This is not an "interesting" question...

    It's SHOCKING that we need to ASK!


    • John Fenderson (profile), 23 Jan 2015 @ 2:52pm

      Re: WHY IS THIS A QUESTION?!

      It's a question because there are a ton of nuances that don't make the answer quite so straightforward.

      For example, let's say you have a robot that malfunctions and commits a crime without the knowledge, consent, or positive action on the part of the owner. It doesn't seem obvious to me that the owner of the robot should be convicted of the crime that the robot did. The owner did not commit, cause, or condone the commission of the crime, after all.

      Perhaps you could argue that the owner committed some other offense, such as inadequately supervising the actions of the robot, but not the crime the robot itself committed.

      "This question has been answered many times before."

      No, it really hasn't.


      • Anonymous Coward, 23 Jan 2015 @ 4:01pm

        Re: Re: WHY IS THIS A QUESTION?!

        The answers to such questions might be easier to determine in countries like Saudi Arabia, whose legal system is based on an entirely different form of logic.

        For instance, if you hire a taxi there, and the driver kills a pedestrian, it is YOU who will go to jail and face charges. In their way of thinking, the accident would never have happened if you had not hired and given instructions to the taxi driver (which is certainly true), and so therefore the driver is basically "just following orders."


        • GeorgeQGreg (profile), 23 Jan 2015 @ 4:18pm

          Re: Re: Re: WHY IS THIS A QUESTION?!

          Remind me to never hire a taxi next time I'm in Saudi Arabia.


        • John Fenderson (profile), 26 Jan 2015 @ 8:26am

          Re: Re: Re: WHY IS THIS A QUESTION?!

          "the accident would never have happened if you had not hired and given instructions to the taxi driver"

          Just out of curiosity, what if your instructions included "obey all traffic laws and don't hit anybody"?


  • Anonymous Coward, 23 Jan 2015 @ 1:48pm

    Another question is who is responsible when computerized, mechanized warfare ends up perpetrating war crimes? Infantry soldiers are personally held responsible for who they kill, but who is to blame when a robot/drone commits a war crime?


    • John Fenderson (profile), 23 Jan 2015 @ 3:17pm

      Re:

      "Infantry soldiers are personally held responsible for who they kill"

      No they're not, as long as they're operating within the established rules of engagement.


  • Lurker Keith, 24 Jan 2015 @ 7:29am

    Ghost in the Shell

    Seems Ghost in the Shell's dystopian future is more or less on schedule, though the architecture hasn't been keeping up.


