Who’s responsible when a bot breaks the law?
A collective of Swiss artists faced that very question when they coded the Random Darknet Shopper, an online shopping bot, to purchase random items from a marketplace on the Deep Web, the portion of the World Wide Web not indexed by search engines. While many of the 16,000 items for sale on that marketplace are legal, quite a few are not. When the bot purchased a handful of illegal pills and a fake Hungarian passport, the artists found themselves in a conundrum unique to the 21st century: Is one liable when semi-autonomous code goes off and does something bad? (Hat tip to Fusion for the link to the original story.)
According to Forbes legal contributor Ryan Calo, it’s questionable whether the artists would be liable in this instance. “If, for instance, the law says a person may not knowingly purchase pirated merchandise or drugs, there is an argument that the artists did not violate the law,” he wrote. “Whereas if the law says the person may not engage in this behavior recklessly, then the artists may well be found guilty.” In other words, if the artists released the bot knowing that an unlawful purchase could occur, even though they never programmed it to buy any specific illegal item, a court could find that they acted recklessly and hold them liable.
In addition to the drugs and passport, the bot ordered a box set of The Lord of the Rings, a Louis Vuitton handbag, a couple of cartons of Chesterfield Blue cigarettes, sneakers, knockoff jeans, and much more. The artists gave their software $100 in Bitcoins per week to spend, and the items are on display in a Swiss art gallery.
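The bot's core behavior, a fixed weekly budget spent on randomly chosen listings, is simple to picture in code. Here is a minimal, hypothetical Python sketch of that loop; the catalog, item names, prices, and function name are all invented for illustration, and the real bot of course transacted on a live marketplace with Bitcoin rather than against an in-memory list:

```python
import random

# Hypothetical stand-in for marketplace listings; names and prices
# are invented for illustration, not taken from the real marketplace.
CATALOG = [
    {"name": "boxed film trilogy", "price_btc": 0.05},
    {"name": "designer handbag", "price_btc": 0.08},
    {"name": "carton of cigarettes", "price_btc": 0.03},
    {"name": "sneakers", "price_btc": 0.06},
]

def weekly_shop(budget_btc, catalog, rng=random):
    """Randomly buy items until the weekly budget can't cover anything else.

    Returns the basket of purchased items and the leftover budget.
    """
    remaining = budget_btc
    basket = []
    affordable = [item for item in catalog if item["price_btc"] <= remaining]
    while affordable:
        item = rng.choice(affordable)   # random selection, no human in the loop
        basket.append(item)
        remaining -= item["price_btc"]
        affordable = [i for i in catalog if i["price_btc"] <= remaining]
    return basket, remaining
```

The point the sketch makes is the legal one: nothing in the selection step inspects what the item *is*, so whether the basket ends up lawful or not is left entirely to chance.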
In a short piece in The Guardian, the artists seemed prepared to face the legal consequences of their software’s actions, but at the time of writing no legal action had been taken, even though the gallery is reportedly next door to a police station.
Art projects aside, the use of semi-autonomous software for online shopping isn’t a new phenomenon, nor is the reliance on bots to autonomously engage in all sorts of activity. But who’s responsible when one of those bots does something the law doesn’t like?