Apple has added a new post to its Machine Learning Journal that explains how it’s using differential privacy to protect users, even when collecting very sensitive data such as keystrokes and the sites users visit.

This type of data collection occurs when users opt in to share usage analytics from macOS or iOS, allowing Apple to collect “privatized records”.

Apple introduced differential privacy in iOS 10 in support of new data collection aimed at improving QuickType, emoji suggestions, Spotlight suggestions, and media playback features in Safari.

The system works by adding statistical noise to data on the device before it's shared with Apple, so aggregate patterns survive while any individual's contribution stays masked.
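Apple's post describes its production mechanisms, but the core idea can be illustrated with randomized response, the textbook local differential privacy technique. The sketch below is an assumption-laden illustration only: the epsilon value, the type names, and the mechanism itself are chosen for demonstration and are not Apple's actual implementation.

```swift
import Foundation

/// Illustrative sketch of local differential privacy via randomized
/// response. NOT Apple's mechanism; Apple's system uses more
/// sophisticated sketch-based algorithms.
struct RandomizedResponse {
    let epsilon: Double  // privacy budget: smaller = noisier = more private

    /// Probability of reporting the true bit.
    var truthProbability: Double {
        exp(epsilon) / (exp(epsilon) + 1)
    }

    /// Privatize a single boolean on-device, before it leaves the device.
    func privatize(_ trueValue: Bool) -> Bool {
        Double.random(in: 0..<1) < truthProbability ? trueValue : !trueValue
    }

    /// Server-side: de-bias the noisy reports to estimate the true
    /// fraction of `true` answers. Only the aggregate is recovered;
    /// any single report may have been flipped.
    func estimateTrueFraction(reports: [Bool]) -> Double {
        let p = truthProbability
        let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
        return (observed - (1 - p)) / (2 * p - 1)
    }
}

let mechanism = RandomizedResponse(epsilon: 2.0)
let noisyReport = mechanism.privatize(true)  // what a device would send
```

Because any individual report may have been flipped before transmission, no single answer can be trusted on its own, yet the server can still correct for the known noise rate and recover accurate population-level statistics.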

The post, Learning with Privacy at Scale, is the seventh issue in the first volume of the journal, where Apple details its machine-learning projects and how they shape its products. This one offers a deeper dive into its differential privacy framework and serves to reassure users that it's not slurping up extremely private information.

It says its approach to differential privacy on the device allows data to be “randomized before being sent from the device, so the server never sees or receives raw data”.

The results of Apple’s massive data collection allow it to see, for example, differences across keyboard locales.


[Image: Apple]

The records arrive at a restricted-access server where IP addresses are dropped. Apple says at that point it can't tell if an emoji record and a Safari web domain record came from the same user.
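As a rough illustration of that claim, here is a hypothetical sketch of the ingest step the article describes: the IP address exists only as transport metadata and is discarded before anything is stored, so nothing ties two stored records back to the same device. All type and function names below are invented for this example, not Apple's API.

```swift
import Foundation

/// Hypothetical ingest step: the restricted-access server drops the IP
/// address immediately, so a privatized record can no longer be tied
/// to the device that sent it.
struct IncomingReport {
    let sourceIP: String          // transport metadata, never stored
    let useCase: String           // e.g. "emoji" or "safari-domains"
    let privatizedPayload: Data   // already noised on-device
}

struct StoredRecord {
    let useCase: String
    let privatizedPayload: Data
    // No IP and no device ID: two records from the same user are
    // indistinguishable from two records from strangers.
}

func ingest(_ report: IncomingReport) -> StoredRecord {
    StoredRecord(useCase: report.useCase,
                 privatizedPayload: report.privatizedPayload)
}
```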
