Siri's Supposed Privacy Glitch: It's A Feature, Not A Bug

Every time you ask Siri a question, the data remains on Apple's servers for two years, Apple told Wired earlier this week. It's a revelation that raises concerns about privacy, which isn't exactly Apple's strong suit to begin with. But is this really something to flip out about? Nope.

For six months, Siri's servers retain a record of the things you ask and associate that data with you, the user. For the remaining 18 months, it's anonymized. That way, Apple can use the data to improve its service over time without knowing that it was in fact you who asked what that rash in your nether regions is all about.

(See also: Siri Jokes Aside, Voice Control Will Make Computing Better)

The ACLU rightly faults Apple for not making its Siri data retention policies clearer or easier to find. The worry here is that the often private information we utter to Siri could wind up in the hands of marketers, the authorities or lawyers in civil suits.

These are valid concerns, and Apple should clarify whether — or how — this information is used for marketing purposes, for example. But in the process of reining in Cupertino, we should be careful not to handicap the evolution of such a promising technology. 

Artificial Intelligence Needs Data To Learn

Here's the thing: Siri is artificial intelligence. Like the human mind it attempts to emulate, AI improves as it learns. To teach machines, we need to feed them data. Every time we ask Siri where the nearest Italian restaurant or strip club is, we're also teaching her, not just about our own tastes and curiosities, but about human language, sentiment and intent.

Some of those lessons she can apply to us individually. Much of what she learns, crucially, is used to improve the service for everybody. Without this progress, Siri will never shed the "beta" label that makes it so easy to ridicule.

For most of its lifespan on Apple's servers, this data is anonymous. That means there's no way to tie your filthy inquiries back to you, should anybody ever ask. You could argue, of course, that Apple should anonymize the data from the moment it's created, as Google proclaims it does with Voice Search. The comparison may not be entirely fair, given how much Google learns about us through other channels, but it's a cue Apple could stand to take.

But if temporarily tying my questions to my voice helps Siri fine-tune my experience using the service, I'm fine with that. That's the bottom line here: Apple should hang onto data like this only as long as technically necessary. If it stops being useful to the product's evolution, the data should disappear. 

For The Privacy Conscious, Alternatives Abound

I'd be more concerned about what Siri does with my queries if it offered unique access to information. It doesn't. You don't have to use Siri; it's just a more convenient tool in some contexts. For truly private inquiries, people can (and likely do) stick with traditional methods like a Chrome incognito tab or any other browser's private mode.

Now, if Apple wants us to turn to Siri more often, it's going to have to add better privacy controls. Much like Web browsers and Google Web History offer us toggles to keep certain (or all) activity private, the voice-controlled personal assistants of the future will need to do the same. If they don't, people will continue to use alternative, more privacy-friendly tools, whether Web browsers or competing voice assistants.

Apple has a responsibility to be transparent about this type of thing. And it really ought to scrap this data as soon as it's no longer technically beneficial to keep it. But insofar as it fuels the core functionality of an evolving technology, if Siri needs to remember my questions for a while, go for it, Siri. Just give me a heads up.