Laws limiting where, when, by whom, and for what robots can be deployed are beginning to take shape in the United States.
There are plenty of legal liability issues surrounding self-driving cars and robotic auto-assist systems, for example. Laws making them legal (under some circumstances) are pending or already on the books in 13 states, including California, Florida and Nevada.
The Federal Aviation Administration strictly regulates the use of airborne robots within the U.S., so most U.S.-owned airborne robots presumably operate overseas, under the rules governing U.S. military and intelligence operations abroad.
But since 2004, the FBI has spent more than $3.7 million on a fleet of surveillance drones and awarded more than $1.2 million in grants for other law-enforcement agencies to experiment with them as well. Even drones used by public-safety agencies are subject to FAA regulations, which once limited them to 4.4 pounds and a range within the operator's line of sight, according to a report released Sept. 26 by the Justice Department.
FAA regulations focus primarily on keeping drones from crashing into other aircraft or injuring people on the ground, not on defining when, where and on whom they can be used for surveillance. “No agency, including the FBI, should deploy domestic surveillance drones without first having strong privacy guidelines in place,” according to a Sept. 26 statement from the ACLU in response to the DoJ report.
By not addressing whether warrants should be required in order to use a drone for surveillance, and what rules should apply to video or other data collected by drones, the Justice Dept., Dept. of Homeland Security and other agencies “risk further delaying the integration of UAS into the national airspace system,” according to a September 2012 report from the Government Accountability Office.
The technology of unmanned aerial vehicles continues to advance, however, most recently with the “flawless” take-off, supersonic flight and landing of a pilotless U.S. Air Force F-16 fighter plane specially modified to be flown by pilots on the ground.
The newly designated QF-16 is a $70 million project to convert six early-generation F-16 fighters into flying targets and “enemy” aircraft for U.S. pilots to train against, according to Aviation Today.
Though the QF-16 is unimaginably more complicated, its role is similar to the one played by remote-controlled crawlers called PackBots from iRobot – which also developed the Roomba and Scooba home floor-cleaning robots. More than 4,000 PackBots have been deployed to Iraq, Afghanistan and other U.S. war zones to help in the search for improvised explosive devices (IEDs) and other hazards. Early in their deployment, PackBots did that primarily by poking at suspicious packages or freshly dug spots in the road, and being blown up for their trouble.
California, already in the lead on regulating robots by approving their use in autonomous vehicles, turns out also to be leading the drive to limit the use of robots online, for the convenience of humans, if not their physical safety. On Sept. 23, Calif. Gov. Jerry Brown signed into law a rule banning the ticket-buying software that scalpers and ticket brokers use to buy up thousands of seats to concerts and sporting events in the seconds after online ticket windows open.
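The behavior the law targets is easy to sketch in code: scalper 'bots distinguish themselves by completing purchases at a pace no human can match. The snippet below is a hypothetical illustration of how a ticket seller might flag that pattern with a sliding-window rate check; the class name and thresholds are invented for this sketch and are not drawn from the law or any real ticketing system.

```python
from collections import defaultdict, deque

class PurchaseRateMonitor:
    """Hypothetical sliding-window check for 'bot-like purchase rates.

    Thresholds are illustrative assumptions: more than `max_purchases`
    completed orders inside `window_seconds` gets an account flagged.
    """

    def __init__(self, max_purchases=4, window_seconds=10.0):
        self.max_purchases = max_purchases
        self.window_seconds = window_seconds
        self.history = defaultdict(deque)  # account -> recent timestamps

    def record(self, account, timestamp):
        """Record one purchase; return True if the account looks automated."""
        recent = self.history[account]
        recent.append(timestamp)
        # Discard purchases that have aged out of the sliding window.
        while recent and timestamp - recent[0] > self.window_seconds:
            recent.popleft()
        return len(recent) > self.max_purchases


monitor = PurchaseRateMonitor()
# A human buying a pair of tickets stays under the threshold...
human_flagged = monitor.record("human", 0.0) or monitor.record("human", 3.0)
# ...while a script firing ten orders in one second trips it.
bot_flagged = any(monitor.record("bot", i * 0.1) for i in range(10))
print(human_flagged, bot_flagged)
```

Real defenses are more elaborate (device fingerprinting, CAPTCHA challenges, proof-of-work), but a rate check of this kind is the simplest version of the distinction the law draws between human buyers and software.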
Those automation systems represent a trivial risk compared to malware networks, spam generators or the prospect of having Twitter conversations hijacked by partisans that are AIs rather than fanatics.
And even those risks are trivial compared to the macroeconomic peril from automated trading systems that move so quickly they make even Wall Street motion-sick. Still, the California law may be a first step toward regulating the use of ‘bots to skew opinion and review sites, online votes and surveys, social-media postings and almost every other form of online activity.
Half of all web traffic already comes from non-humans. And ‘bot traffic is expanding, threatening to hijack feedback sections, reviews and recommendations and to “dominate online sales of all stripes,” according to New Statesman.
California may have put its foot down on online ticket sales, but that’s just one small market in just one state. It’s not clear whether there’s enough outrage over ‘bots pretending to be humans to extend that resistance to reviews, recommendations or comments, or whether the sophistication of ‘bots is increasing faster than the ability to identify and deter them.
It is clear that without some defense more effective than CAPTCHAs and other current bot-prevention mechanisms, fake humans will begin to dominate opinion on the Internet as completely as they dominate the markets for bomb disposal, aerial insurgent tracking-and-disposal or online ticket buying.
The question is whether there’s anything to be done about it, or whether any commercial or government organization is interested enough to try.
Image: U.S. Air Force