[Apple](https://www.apple.com/privacy/docs/Differential_Privacy_Overview.pdf) uses local differential privacy for many of its services, similar to what Google does. Noise is added before any data is sent off the device, letting Apple collect aggregate statistics without harming the privacy of any individual user.
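Apple's production pipeline is more elaborate than this (their paper describes count-mean sketches and Hadamard transforms), but the on-device step can be sketched with classic randomized response. The following Python is a minimal illustration under that assumption, not Apple's actual code, and every name in it is made up:

```python
import math
import random

def randomized_response(truth: bool, epsilon: float) -> bool:
    """Privatize one yes/no answer on-device, before it is ever sent.

    The true answer is kept with probability e^eps / (e^eps + 1) and
    flipped otherwise, which satisfies epsilon-local differential
    privacy: the server can never be sure any single report is honest.
    """
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return truth if random.random() < p_truth else not truth

def estimate_rate(reports: list[bool], epsilon: float) -> float:
    """Server side: undo the known bias to recover the population rate."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)
```

Any individual report is deniable, yet aggregated across millions of devices the bias correction recovers an accurate population-level estimate.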
They limit the number of contributions any one user can make via a *privacy budget* (confusingly, also represented by epsilon), so you won't have to worry about the noise in your contributions being averaged out over time and revealing your personal trends.
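To see why the budget matters, here is a small simulation (illustrative numbers, not Apple's implementation) of what would happen without one: if a device kept reporting the same underlying value, the noise would eventually average away.

```python
import random
import statistics

def laplace_noise(scale: float) -> float:
    # The difference of two exponential samples is Laplace-distributed.
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

TRUE_VALUE = 42.0  # some hypothetical per-user statistic
EPSILON = 1.0      # budget spent on each report

# One noisy report hides the value reasonably well...
single = TRUE_VALUE + laplace_noise(1 / EPSILON)

# ...but 10,000 reports of the same value let the noise cancel out.
many = [TRUE_VALUE + laplace_noise(1 / EPSILON) for _ in range(10_000)]
print(single, statistics.mean(many))  # the mean lands very close to 42.0
```

Capping the total epsilon any one user can spend in a given period shuts down exactly this averaging attack.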
Differential privacy allows Apple to find new words people type that aren't in the default dictionary, or to see which emoji are the most popular (sketched in code after the list below).

Some of the things they use differential privacy for include:
- QuickType suggestions
- Emoji suggestions
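As a toy version of the emoji use case (again a sketch, not Apple's actual count-mean-sketch pipeline), generalized randomized response lets each device report a single emoji from a fixed set while the server still recovers approximate popularity rankings:

```python
import math
import random
from collections import Counter

EMOJI = ["😂", "❤️", "👍", "😭"]  # toy domain; the real one is far larger

def k_rr(true_item: str, epsilon: float) -> str:
    """k-ary randomized response: keep the true emoji with probability
    e^eps / (e^eps + k - 1), otherwise report a random different one."""
    k = len(EMOJI)
    keep = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < keep:
        return true_item
    return random.choice([e for e in EMOJI if e != true_item])

def estimate_popularity(reports: list[str], epsilon: float) -> dict[str, float]:
    """Invert the known randomization for unbiased frequency estimates."""
    k, n = len(EMOJI), len(reports)
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    q = (1 - p) / (k - 1)
    raw = Counter(reports)
    return {e: (raw[e] - n * q) / (p - q) for e in EMOJI}
```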
>Nationwide, roughly 150 million individuals—almost one-half of the population—have a unique combination of sex and single year of age at the block level.
They could keep adding noise until these attacks are impossible, but that would make the data nigh unusable. Instead, differential privacy offers a mathematically rigorous method to protect the data from future reidentification attacks without ruining it with excessive noise, and its formal guarantees hold no matter what auxiliary data an attacker obtains later.
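The Census Bureau's actual system (the TopDown algorithm, with discrete noise distributions) is far more involved, but the calibration argument at its heart fits in a few lines. A minimal sketch using the textbook Laplace mechanism: a population count has sensitivity 1, because adding or removing one person changes it by at most 1, so noise with scale 1/epsilon is enough for an epsilon-DP guarantee.

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two exponential samples is Laplace-distributed.
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def dp_count(true_count: int, epsilon: float) -> float:
    """Laplace mechanism for a counting query: sensitivity is 1, so
    noise of scale 1/epsilon yields an epsilon-DP release."""
    return true_count + laplace_noise(1 / epsilon)

# A hypothetical block-level count: the error is only a few people,
# yet the guarantee holds against any future reidentification attack.
print(dp_count(1432, epsilon=0.5))
```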