Autonomous Bot Seized For Illegal Purchases: Who's Liable When A Bot Breaks The Law?

from the get-those-lawyers-ready dept

If you program a bot to autonomously buy things online, and some of those things turn out to be illegal, who’s liable? We may be about to have the first such test case in Switzerland, after an autonomous buying bot was “seized” by law enforcement.

Two years ago, we wrote about the coming legal questions concerning liability and autonomous vehicles. Those vehicles are going to have some accidents (though likely fewer than human-driven cars), and then there are all sorts of questions about who is liable. Or what if they speed? Who gets the ticket? There are a lot of legal questions raised by autonomous vehicles. But, of course, it’s not just autonomous vehicles raising these questions. With high-frequency trading taking over Wall Street, who is responsible if an algorithm goes haywire?

This question was raised in a slightly different context last month when some London-based Swiss artists, !Mediengruppe Bitnik, presented an exhibition in Zurich of The Darknet: From Memes to Onionland. Specifically, they had programmed a bot with some Bitcoin to randomly buy $100 worth of things each week via a darknet market, like Silk Road (in this case, it was actually Agora). The artists’ focus was more about the nature of dark markets, and whether or not it makes sense to make them illegal:

The pair see parallels between copyright law and drug laws: “You can enforce laws, but what does that mean for society? Trading is something people have always done without regulation, but today it is regulated,” says Weiskopff.

“There have always been darkmarkets in cities, online or offline. These questions need to be explored. But what systems do we have to explore them in? Post-Snowden, space for free-thinking online has become limited, and offline is not a lot better.”

But the effort also had some interesting findings, including that the dark markets were fairly reliable:

“The markets copied procedures from Amazon and eBay – their rating and feedback system is so interesting,” adds Smojlo. “With such simple tools you can gain trust. The service level was impressive – we had 12 items and everything arrived.”

“There has been no scam, no rip-off, nothing,” says Weiskopff. “One guy could not deliver a handbag the bot ordered, but he then returned the bitcoins to us.”

But, still, the much more interesting question is about liability in this situation. The Guardian reporter who wrote about this in December spoke to UK law enforcement, who noted that the situation was “very unusual”:

A spokesman for the National Crime Agency, which incorporates the National Cyber Crime Unit, was less philosophical, acknowledging that the question of criminal culpability in the case of a randomised software agent making a purchase of an illegal drug was “very unusual”.

“If the purchase is made in Switzerland, then it’s of course potentially subject to Swiss law, on which we couldn’t comment,” said the NCA. “In the UK, it’s obviously illegal to purchase a prohibited drug (such as ecstasy), but any criminal liability would need to be assessed on a case-by-case basis.”

Apparently, that assessment has concluded in this case, because right after the exhibit closed in Switzerland, law enforcement showed up to seize stuff:

On the morning of January 12, the day after the three-month exhibition closed, the public prosecutor’s office of St. Gallen seized and sealed our work. It seems the purpose of the confiscation is to impede an endangerment of third parties through the drugs exhibited by destroying them. This is what we know at present. We believe that the confiscation is an unjustified intervention into freedom of art. We’d also like to thank Kunst Halle St. Gallen for their ongoing support and the wonderful collaboration. Furthermore, we are convinced that it is an objective of art to shed light on the fringes of society and to pose fundamental contemporary questions.

It appears possible that, in this case, law enforcement was just looking to seize and destroy the contraband products that were purchased by the bot, and may not then seek further prosecution, but it still does raise some interesting questions. I’m not sure I buy the “unjustified intervention in the freedom of art” argument (though that reminds me of another, unrelated story, of former MIT lecturer Joseph Gibbons, who was recently arrested for robbing banks, but who is arguing that it was all part of an “art project”).

Still, these legal questions are not going away and are only going to become more and more pressing as more and more autonomous systems start popping up in different areas of our lives. The number of different court battles, jurisdictional arguments and fights over who’s really liable are likely to be very, very messy — but absolutely fascinating.



Comments on “Autonomous Bot Seized For Illegal Purchases: Who's Liable When A Bot Breaks The Law?”

Ninja (profile) says:

One way would be to allow a whole host of things that are widely traded in these underground markets, with proper restrictions and regulations. Think marijuana: everybody uses it, but it’s still illegal when it could be made legal with limitations and actually bring money to the government to invest in areas in need.

As for the bot, clearly nobody is at fault. The acquisitions were clearly random even if it ended up getting something illegal AND it was an experiment (nobody used the illegal stuff). So I’d say the seizure was disproportionate even if I agree that law enforcement should have checked what was going on.

Beech says:

Re: Response to: Ninja on Jan 23rd, 2015 @ 8:13am

“clearly nobody is at fault. “

You’d like to think that, wouldn’t you! Clearly this can only be solved by the “War on Bots”. Uncle Sam will require billions of dollars worth of military hardware bought from crony capitalists. Every computer will need to be fitted with a keylogger and a uniquely identifiable microchip. Encryption will need to be banned. Warrantless no-knock raids on any place suspected to harbor a bot. Millions will need to be jailed in for-profit prisons. You may be ready to surrender to the autonomous horror, Mr. Ninja, but some of us are willing to do what it takes to protect our profits... peace. I meant peace.

Anonymous Coward says:

They waited until the end of the exhibition

The best part is: they waited until the three-month exhibition had ended before seizing everything. They didn’t want to interrupt the art piece.

There’s also a matter of context: until the exhibition ended, it was a work of art. After it ended, it was just a bunch of drugs and other stuff stored somewhere.

Anonymous Coward says:

Re: Re: Re: Re:

But unless you have installed special equipment, your computer cannot actually generate them.

A lot of recent Intel processors have that special equipment directly on the CPU (it’s used by the RDRAND and RDSEED instructions). If you have a TPM (and it’s enabled), it also has that special equipment. So, not that uncommon.
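For illustration, here’s a minimal sketch (not from the thread) of reading randomness that the operating system gathers from exactly these kinds of hardware sources when they’re present:

```python
import os

# os.urandom reads from the kernel's entropy pool, which modern
# kernels feed from hardware sources such as RDRAND/RDSEED or a
# TPM when available, mixed with other noise sources.
raw = os.urandom(16)   # 16 bytes of OS-provided randomness
print(len(raw))        # 16
print(raw.hex())       # 32 hex characters, different on each run
```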

John Fenderson (profile) says:

Re: Re: Re:5 Re:

I can see that I’ll have to expand further.

First, using thermal noise as the entropy source is extremely good. However, the thermal noise present in the system is affected by whatever operations the computer is performing, as well as the physical environment it is operating in. A true source of entropy is not affected by any environmental or operating conditions. For most purposes, it’s certainly close enough — but not for all, particularly when it comes to cryptography. There are a number of instances where mathematically strong crypto (or even unbreakable crypto such as OTPs) has been compromised because the random numbers were only nearly random.

Second, the entropy source is not directly the source of the random numbers used. It is used to seed a random number generator — which, again, does not actually generate random numbers. Even if the entropy source were perfectly random, using it to seed a computational random number generator means that you’re not actually getting random numbers in the end.
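The second point is easy to see in a few lines of Python (a sketch using the standard library’s PRNG, not any particular crypto generator): once the seed is fixed, the “random” stream is fully determined.

```python
import random

# Two PRNGs seeded with the same value -- say, a value drawn once
# from an entropy source -- emit identical streams. The entropy
# only chooses the starting point; the sequence is deterministic.
seed = 0xDEADBEEF
a = random.Random(seed)
b = random.Random(seed)

stream_a = [a.randint(0, 99) for _ in range(5)]
stream_b = [b.randint(0, 99) for _ in range(5)]
print(stream_a == stream_b)   # True
```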

Anonymous Coward says:

Re: Re: Re:6 Re:

Even if the entropy source were perfectly random, using it to seed a computational random number generator means that you’re not actually getting random numbers in the end.

Consider a circuit which takes the output of a Bernoulli process and chains it into an inverter.

Is the output of that circuit, from the inverter, a Bernoulli process?
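The point behind the question: a deterministic transform of a truly random bit is still truly random. A quick simulation (illustrative only) makes this concrete:

```python
import random

# Simulate a fair Bernoulli source and pass each bit through an
# inverter. The inverted stream has the same distribution: a
# deterministic one-to-one transform destroys no randomness.
random.seed(42)   # seeded only so the simulation is reproducible
n = 100_000
source = [random.randint(0, 1) for _ in range(n)]
inverted = [1 - bit for bit in source]

print(abs(sum(source) / n - 0.5) < 0.01)     # True: mean near 0.5
print(abs(sum(inverted) / n - 0.5) < 0.01)   # True as well
```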

Anonymous Coward says:

Re: Re:

There are pseudo-random generators in various programming languages and platforms. They’re close enough to actually random generation for practical purposes. Saying the purchases were random is to say that the criteria the bot used to determine what to purchase were not easily predictable by a human being without extensive study and knowledge of what items would be available, in what quantities, and at the arbitrary purchase price. So to a human observer it would seem random enough to match that description.
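In that spirit, the bot’s selection step might look something like this hypothetical sketch (`pick_item` and the listings are invented for illustration, not the artists’ actual code):

```python
import random

def pick_item(listings, budget=100):
    """Pick one affordable listing uniformly at random."""
    affordable = [item for item in listings if item["price"] <= budget]
    return random.choice(affordable) if affordable else None

listings = [
    {"name": "sneakers", "price": 60},
    {"name": "e-book",   "price": 15},
    {"name": "handbag",  "price": 90},
]
print(pick_item(listings))   # any of the three; unpredictable in advance
```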

Anonymous Coward says:

Re: TRNGs [was ]

There is no such thing as a random number generator as far as I know.

HotBits: Genuine random numbers, generated by radioactive decay

. . . HotBits is an Internet resource that brings genuine random numbers, generated by a process fundamentally governed by the inherent uncertainty in the quantum mechanical laws of nature, directly to your computer in a variety of forms. HotBits are generated by timing successive pairs of radioactive decays detected by a Geiger-Müller tube interfaced to a computer. . . .

True Random Number Service

. . . RANDOM.ORG offers true random numbers to anyone on the Internet. The randomness comes from atmospheric noise . . .

Anonymous Coward says:

Who’s liable and responsible for the actions of autonomous systems? I would say either the manufacturer or the owner of such a system is liable, depending on the circumstances.

Let’s take firearms as a good example. Non-autonomous, traditional firearms require a human to pull the trigger. That’s why gun manufacturers can’t be held responsible for what the owner of a firearm does with it.

Now let’s say the firearm is fully autonomous, seeking out and finding targets on its own. I would still argue the owner of the firearm, not the manufacturer, is responsible for what that autonomous system does. The owner is the one who sets up the autonomous system in an area and programs the targeting parameters for the system to seek out.

The only way a manufacturer could be held responsible for an autonomous system’s use is if there’s a verifiable software bug in the systems they’re selling. Take the Chevy ignition switch defect as an example: cars were randomly shutting down on drivers due to a design defect in Chevy’s ignition system. Hence, in this case, the manufacturer is responsible.

So, coming back to the autonomous black-market bot: in my opinion, the artists are responsible for the autonomous system because they programmed in the parameters for it to execute. They could, for example, have programmed the bot to avoid keywords such as ‘drug’ and ‘ecstasy’, but they chose not to, as both the owner and manufacturer of the autonomous system.
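A keyword filter of the sort the commenter describes is only a few lines; this is a hypothetical sketch (blocklist and function names invented here), and its obvious weakness is that sellers can evade it with misspellings:

```python
# Illustrative blocklist; real darknet listings would need far more.
BLOCKLIST = {"drug", "drugs", "ecstasy", "mdma"}

def is_allowed(title: str) -> bool:
    """Reject any listing whose title contains a blocked word."""
    words = title.lower().split()
    return not any(word in BLOCKLIST for word in words)

print(is_allowed("vintage handbag"))     # True
print(is_allowed("10x ecstasy pills"))   # False
```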

Anonymous Coward says:

Re: Re: Re: Re:

But as far as intent goes, they purposely went to a dark market and not, say, eBay. They had to know there was a high likelihood of getting contraband eventually.

And once they received the contraband, they intentionally kept it instead of destroying it or turning it over to police – so they’d still be guilty of possession even if they weren’t guilty of purchasing.

Anonymous Coward says:

Re: Re:

I don’t think it matters much to liability whether it’s a “smart” automated system or a “dumb” one.

If you set up a shotgun booby trap on the door of your business and it kills a fireman trying to do his job, negligent homicide is probably the least you’d be charged with. Making it a smarter system changes nothing.

Anonymous Coward says:

Re: Re:

Analogy fail

The owner of the autonomous killing machine knowingly set this device loose on unsuspecting people. The owner is responsible for the murders it commits.

Programming a bot to anonymously purchase random things is not harming anyone. The people doing wrong are the ones offering to sell (Intent to commit illegal act) and then actually selling and shipping the item. (Committed illegal act)

David says:

One missing detail

Regarding “freedom of art”: the Swiss official responsible for the seizure stated that if the artists wanted to exhibit this piece of performance art again, they should just replace the no-longer-available case of drugs with the seizure order itself, since it was a proper fruit of their bot’s purchases.

I think he had an excellent point there.

MedicalQuack (user link) says:

Same Question with Markets Coming About too

This is a good topic, as there’s a case coming up this year about an algorithm that went rogue in the markets too. So yeah, who’s to blame when an algorithm goes out of control? It’s a reality that we have to face, as it does happen, and where does the responsibility lie? Very good questions indeed.

I’m sure you may have seen the video, but Kevin Slavin gave a presentation on this topic about a book on Amazon with a rogue algorithm that, with no interaction at all, kept pricing itself higher and higher until a boring little book on flies cost over a million dollars to purchase... oh, those rogues.
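The incident Slavin described is easy to reproduce in miniature. In this toy model (the multipliers are illustrative, not the actual sellers’ values), two pricing bots each reprice as a fixed multiple of the other; since the product of the multipliers exceeds 1, the price runs away with nobody touching it:

```python
# Bot A always prices a bit above bot B; bot B slightly undercuts
# bot A. Since 1.27 * 0.998 > 1, each round inflates both prices --
# no human in the loop.
price_a = price_b = 30.0
for _ in range(60):          # sixty repricing rounds
    price_a = 1.27 * price_b
    price_b = 0.998 * price_a

print(price_a > 1_000_000)   # True: the price has exploded
```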

Why do you think we have circuit breakers on the stock exchanges? It’s for the same reason: to stop all trading when the algos and their little bot brains get scrambled, nothing makes sense, and trades are executed on rogue activity and numbers that aren’t real. This is a big deal, as we are talking big money now.

jilocasin (profile) says:

So, just how do you go about seizing a 'bot'?

According to the title, the _bot_ was seized. According to the text of the article, the things the bot _bought_ were seized.

Since this _bot_ wasn’t actually a physical _ro_bot (like say R2D2), but a bit of software, what _can_ the cops actually _seize_?

The computer(s) that it’s currently running in?
An SD card (or USB stick) that holds a copy of the software?
A hard copy print out of the source code?

What if it’s running on many computers? Perhaps there are many copies, or maybe it’s some kind of hive mind.

If you _turn_off_ the computer (or all of them) that it’s running on, wouldn’t that be the equivalent of capital punishment? Sounds kind of extreme for making a few drug purchases.

Perhaps you turn off all the computers except one and then take _that_ one into ‘custody’?

Do we take into account the intellectual sophistication of the _bot_? We don’t prosecute a mentally challenged criminal (or a small child) as harshly as a competent adult criminal, do we? Will we now need ‘competency hearings’ when we arrest a _bot_?

If your adult child commits a crime, we don’t arrest their parents. So why would it be O.K. to arrest the programmers of a criminal _bot_?

There’s a long way to go from there to HAL (or SkyNet).

John Fenderson (profile) says:

My point of view

The artist is liable for the actions of the bot in this case. I think this because the bot was designed to purchase random things from a black market. The artist was certainly aware that in doing so, the odds are very, very high that prohibited items would be purchased. Since the effects of the bot were obvious prior to deployment, the one who deployed it is responsible.

It would be a different situation if the bot were limited to legal marketplaces. In that case, if the bot ended up buying something illegal, the person who deployed it should not be liable, as that outcome would not be one that would be so easily predicted.

Sniper_X says:


In all cases it is the person who was responsible for putting the automated device/bot into service.

Why is this a question?!!!

If we replace people with automation, the person who decided to automate that process or task is still responsible.

Are we so ignorant that we think that just because this new thing exists, that all logic about responsibility VANISHES?

To be clear, there are NO CASES where we should question who is responsible.

NO CASES where a person shouldn’t be held accountable for their actions.

The owner of the self-driving car is responsible for its actions!

The owner of a bot is responsible for its actions.

This question has been answered many times before.

This is not an “interesting” question…

It’s SHOCKING that we need to ASK!

John Fenderson (profile) says:


It’s a question because there are a ton of nuances that don’t make the answer quite so straightforward.

For example, let’s say you have a robot that malfunctions and commits a crime without the knowledge, consent, or positive action on the part of the owner. It doesn’t seem obvious to me that the owner of the robot should be convicted of the crime that the robot did. The owner did not commit, cause, or condone the commission of the crime, after all.

Perhaps you could argue that the owner committed some other offense, such as inadequately supervising the actions of the robot, but not the crime the robot itself committed.

“This question has been answered many times before.”

No, it really hasn’t.

Anonymous Coward says:


The answers to such questions might be easier to determine in countries like Saudi Arabia, whose legal system is based on an entirely different form of logic.

For instance, if you hire a taxi there, and the driver kills a pedestrian, it is YOU who will go to jail and face charges. In their way of thinking, the accident would never have happened if you had not hired and given instructions to the taxi driver (which is certainly true), and so therefore the driver is basically “just following orders.”
