Amazon's Fight to Keep A.I. Private: Important for All
[Image: Amazon Echo]

Amazon is fighting a court order compelling it to turn over data potentially gathered by Echo, its voice-controlled smart home hub. The order stems from the death of Victor Collins of Bentonville, Arkansas, whose body was found in a hot tub. The owner of that hot tub, James Bates, is suspected of killing Collins, though there's little evidence to back that up. Bates owned a few connected devices, including an Echo. Police served Amazon with a warrant for data from Bates' Echo, which is where things get slippery.

When users speak to Alexa, Amazon stores many of the queries offsite. Police in the case want Amazon to turn over any transcripts and recordings the Echo may have captured on the night they believe Collins was killed. Amazon, in turn, argues that Alexa's output is protected under free-speech law. The company lays out a few key arguments in its defense, most notably that Alexa's recordings and responses "may include expressive content protected by the First Amendment." It has asked the court to make sure the state of Arkansas has a "compelling need for the information sought, including that it is not available from other sources," and wants the state to prove "a sufficient nexus between the information and the subject of the criminal investigation." In a nutshell, Amazon wants prosecutors to show that the recordings are directly relevant to the case and can't be obtained from any other source. So far, the company has only turned over Bates' account information and purchase history.

Amazon, Apple, Google and Privacy

Amazon is waging a quiet war on behalf of both users and the technology itself. The first hangup: someone has to activate an Echo using the trigger word "Alexa." It's unlikely anyone would say "Alexa" before, during, or after committing a crime in their home. A victim who knew they were in danger might think to scream "Alexa" to initiate a recording or scare off an assailant, but activation is not a sure thing (a toy sketch at the end of this section illustrates the idea). Alexa also learns your cadence and tone the more you use it, so it becomes increasingly personalized over time.

Unlike Google Home, Google's voice-activated digital assistant, the Echo doesn't let you disable recordings, but you can delete stored audio clips online. Apple likewise doesn't allow users to disable Siri's recording functionality (short of turning Siri off completely), but it keeps recordings locked behind a key-value storage scheme; the company has no access to your data, warrant or not. A Bloomberg report from late last year suggests Google and Amazon hold onto recordings to improve their voice-activated products. In addition to basic speech recognition, these devices must understand a dizzying array of languages, dialects, and accents to be considered a success. That sort of technology is especially useful in vehicles, where drivers need to keep their focus on the road.

At the moment, tech companies are standing at the intersection of good taste and good faith. Apple faced a similar problem when the FBI demanded it crack into the iPhone of a mass shooter. Rather than roll over, Apple stood its ground and insisted that, while the crime in question was heinous, giving up the data would set a bad precedent. The FBI bit back, but eventually admitted it didn't need Apple's help to get into the device. We have to assume Amazon will lean on the same playbook if the Arkansas police press the matter. In its filing, Amazon notes that a police demand for "smart speaker" information is a first.
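To make that wake-word hangup concrete, here's a toy Python sketch of the gating idea: nothing is retained unless the trigger word shows up, and even then only a short window of what follows. This is an illustration under loose assumptions, not Amazon's implementation; a real device gates on raw audio with an on-device wake-word model, whereas the sketch below stands in transcribed text chunks and an arbitrary capture window.

```python
# Toy illustration of wake-word gating: the device discards everything it
# hears until the trigger word appears, then keeps only a short window.
# This is NOT Amazon's pipeline; real devices gate on raw audio via an
# on-device wake-word model. Text chunks are just a readable stand-in.

WAKE_WORD = "alexa"

def gate_stream(chunks, window=2):
    """Yield the chunk containing the wake word plus the next `window`
    chunks; everything else is dropped without being stored or sent."""
    remaining = 0
    for chunk in chunks:
        if WAKE_WORD in chunk.lower():
            remaining = window
            yield chunk
        elif remaining > 0:
            remaining -= 1
            yield chunk

if __name__ == "__main__":
    ambient = [
        "background conversation",        # dropped
        "more unrelated talk",            # dropped
        "Alexa, what's the weather?",     # wake word: capture starts here
        "tomorrow, before my commute",    # inside the capture window
        "ordinary chatter resumes",       # still inside the window (window=2)
        "and keeps going",                # dropped again
    ]
    print(list(gate_stream(ambient, window=2)))
    # ["Alexa, what's the weather?", "tomorrow, before my commute",
    #  "ordinary chatter resumes"]
```

The point of the toy: unless someone in the room actually says the trigger word, there's nothing for Amazon to store offsite in the first place, which is exactly why prosecutors may struggle to show the Echo heard anything useful.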

Data Privacy is Your Job, Too

Big companies obviously have a responsibility here as the providers of products and services, but who else bears responsibility? Several third-party options exist for weaving voice controls into an app, and whether queries get stored, and where, is ultimately up to the developer or company producing the product. That adds a layer of potential legal complexity. Even if your company is steadfast about protecting users' voice queries, a hosting provider may not be so willing to turn authorities away. If you're going to build or use speech recognition and voice controls in your service, how to protect user data is a question you'll want to answer early (unless you host the data on-site, in which case it's all under your control).

Another option is to remove the company from the equation altogether. Anonymous vlogging app Dusk doesn't ask for any identifiable information such as email or location, and it uses end-to-end encryption for data transmission. The app pixelates the image and voice on the device before transmission, so its development team can't decrypt your uploads. As with Apple, Dusk never has access to your data, and it doesn't know who you are in the first place, which is the safest of bets. (A minimal sketch of that encrypt-before-upload approach appears at the end of this article.)

Artificial intelligence (A.I.) is a "key growth area" for smartphones, suggests IHS. "We see AI making smart devices even smarter with improved user experiences," said Ian Fogg, director of mobile and telecom analysis at IHS Markit. "Existing AI agents like Apple's Siri and Google Assistant will expand across the industry, complemented by embedded AI in all parts of mobile devices from cameras, to audio, to machine."

Whether to record, and whether to "turn over" recorded data in your app or service, may not be your problem today, but it will be soon enough. Users are increasingly excited about voice-activated services, which are finding their way onto more devices every day. As with Apple before it, Amazon's battle is worth keeping an eye on, if for no other reason than it frames how we approach data and security for the next wave of technology.
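For developers who want to see what "never having access to your data" can look like in practice, here's a minimal Python sketch of on-device encryption before upload. It assumes the third-party `cryptography` package (Fernet), and the `upload()` function and key handling are hypothetical placeholders rather than Apple's or Dusk's actual implementation; the shape that matters is that the key stays on the user's device, so your servers, and your hosting provider's, only ever hold ciphertext.

```python
# Minimal sketch: encrypt a voice clip on-device before it ever leaves,
# so the service (and its hosting provider) stores only opaque bytes.
# Assumes the third-party `cryptography` package (pip install cryptography);
# `upload()` is a hypothetical stand-in, not any real vendor's API.

from cryptography.fernet import Fernet

def encrypt_clip(raw_audio: bytes, device_key: bytes) -> bytes:
    """Encrypt a recorded clip with a key that never leaves the device."""
    return Fernet(device_key).encrypt(raw_audio)

def decrypt_clip(ciphertext: bytes, device_key: bytes) -> bytes:
    """Only the key holder (the user's own device) can recover the audio."""
    return Fernet(device_key).decrypt(ciphertext)

def upload(blob: bytes) -> None:
    """Hypothetical placeholder for shipping ciphertext to a backend."""
    print(f"uploading {len(blob)} opaque bytes")

if __name__ == "__main__":
    device_key = Fernet.generate_key()       # generated and stored on-device
    clip = b"\x00\x01fake-pcm-audio-bytes"   # stand-in for a real recording
    ciphertext = encrypt_clip(clip, device_key)
    upload(ciphertext)                        # the server never sees raw audio
    assert decrypt_clip(ciphertext, device_key) == clip
```

The design decision that matters is where the key lives: keep it on the device, and a warrant served on your company, or on whoever hosts your data, can only produce bytes nobody else can read.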