Update (09/28/2016 2:54 PM PDT): Apple has confirmed that when a user opts in to sending diagnostics & usage data, they are also automatically opting in to sending data that is then run through its differential privacy techniques.

When Apple originally announced the use of differential privacy in iOS 10, it was not without controversy. Skeptics from all corners wondered how private differential privacy could really be when deployed at the scale iOS 10 intended.

Apple clarified that the use of differential privacy to collect user data would be opt-in, meaning that a user who didn’t want to feed data into the system didn’t have to. What Apple never indicated was where this opt-in setting would live and what would happen if you decided against it…

When iOS 10 was officially released to the public a few weeks back, I decided to start from scratch rather than restore from a backup. Not because of any lingering worry that something from past iOS backups would interfere with iOS 10, but simply to see what a new user on iOS 10 would see. The one thing I was specifically looking for was where I could opt in to Apple’s use of differential privacy, and that’s where it all got hazy.

As an amateur iOS developer, I’ve grown accustomed to configuring devices and always opting in to sharing diagnostic and usage information. I understand the importance of receiving crash reports from users so that a developer can work quickly to resolve issues. What I neglected to spot was where iOS 10’s newly introduced differential privacy options came into play. The situation was made far clearer to me by Aleksandra Korolova, an assistant professor working on privacy at the University of Southern California’s Viterbi School of Engineering.

Korolova and her student Jun Tang discovered that Apple had folded the mention of differential privacy into two different diagnostics sections in iOS 10. With iOS 10, opting in to having diagnostic and usage data sent automatically to app developers means that users are also automatically subjected to data collection using differential privacy. It seems that if a user wants to submit diagnostic data to developers, but not be subject to the collection of this new data, they’re out of luck.

2/ Opt-in text for differentially private data collection missing info needed for meaningful choice: privacy parameter values used by iOS 10

— Aleksandra Korolova (@korolova) September 13, 2016

This is where we have to separate the conversation a bit. App developers won’t be receiving your differentially private data when you send them diagnostics and usage data. The data being collected is for and by Apple at this time. Apple had stated in the past that the use of differential privacy is a way to build “crowdsourced learning while keeping that data of the individual users completely private”. Apple then went on to state that the use of differential privacy would be limited to four specific use cases:

- New words that users add to their local dictionaries
- Emoji typed by users, so Apple can suggest emoji replacements
- Deep links used inside apps
- Lookup Hints within Notes

Of course, data collection isn’t new on iOS. Users could submit their diagnostics and usage information on iOS 9 and earlier if they chose to opt in. According to iOS 10’s Diagnostics and Privacy terms, “none of the collected information identifies you personally”. The personal data collected (the four specific use cases listed above) “is either not logged at all” or is “subject to privacy preserving techniques such as differential privacy”. The question then comes down to: how private is this use of data? I enjoy the idea of Apple suggesting emoji replacements, but not if it risks exposing data about how I currently use my device.

3/ Diagnostics so far: algorithms CountMedianSketch, OneBitHistogram, SequenceFragmentPuzzle+CountMedianSketch; epsilon 1, 1, 4 (ht JunTang)

Will you opt out of having further data collected if it means app developers won’t be getting your crash reports?

Differential privacy is supposed to create a safeguard by taking my data and making it private, but how can I be sure? It appears that Apple uses various algorithms to anonymize the data (see Korolova’s tweets with images of diagnostic logs above), but there is no clarity about how private that data actually is. For users to make an informed choice about whether to allow collection of their data under differential privacy, they would need to know how private their data remains after any of these algorithms is applied, which comes down to the privacy parameter (epsilon) each one uses.
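To give a rough sense of what those epsilon values mean, here is a minimal sketch of randomized response, the textbook local differential privacy mechanism, written in Swift. To be clear, this is not Apple’s implementation (the diagnostic logs point to sketch-based algorithms such as CountMedianSketch), and the function below is purely illustrative; the point is only how epsilon trades privacy for accuracy.

```swift
import Foundation

// A toy randomized-response mechanism: NOT Apple's algorithm, only an
// illustration of what the privacy parameter epsilon controls.
// The true bit is reported with probability e^epsilon / (e^epsilon + 1)
// and flipped otherwise, so a larger epsilon means less noise and
// therefore weaker privacy for the individual.
func randomizedResponse(truth: Bool, epsilon: Double) -> Bool {
    let pTruth = exp(epsilon) / (exp(epsilon) + 1)
    return Double.random(in: 0..<1) < pTruth ? truth : !truth
}

// Compare the two epsilon values visible in the diagnostic logs (1 and 4).
for epsilon in [1.0, 4.0] {
    let trials = 100_000
    let truthful = (0..<trials).filter { _ in randomizedResponse(truth: true, epsilon: epsilon) }.count
    print("epsilon \(epsilon): bit reported truthfully about \(100 * truthful / trials)% of the time")
}
```

In this toy example, epsilon 1 flips a submitted bit roughly a quarter of the time, while epsilon 4 reports it truthfully about 98% of the time. That gap is exactly why Korolova argues the opt-in text should disclose the parameter values: without them, “privacy preserving” tells a user very little.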

To opt out of automatically sending developers your diagnostics and usage data on iOS 10, and thus stop this data collection altogether:

Step 1. Open the Settings app and navigate to Privacy > Diagnostics & Usage.

Step 2. Select Don’t Send. (You may also be notified “This will also turn off data collection on your Apple Watch.”)

We’ve reached out to Apple requesting further clarification, and will update the post if we receive a response. Apple’s response has been added to the top of the post.