From e79526612fde0d42bd9ac9602a2d3a954b14068a Mon Sep 17 00:00:00 2001
From: fria <138676274+friadev@users.noreply.github.com>
Date: Sat, 12 Jul 2025 07:37:18 -0500
Subject: [PATCH] move local differential privacy

---
 blog/posts/differential-privacy.md | 7 +++----
 1 file changed, 3 insertions(+), 4 deletions(-)

diff --git a/blog/posts/differential-privacy.md b/blog/posts/differential-privacy.md
index 96839902..6b312a90 100644
--- a/blog/posts/differential-privacy.md
+++ b/blog/posts/differential-privacy.md
@@ -113,13 +113,12 @@ The paper introduces the idea of adding noise to data to achieve privacy. Of cou
 
 This early form of differential privacy relied on adding noise to the data *after* it was already collected, meaning you still have to trust a central authority with the raw data.
 
-#### Local Differential Privacy
-
-In many later implementations of differential privacy, noise is added to data on-device before it's sent off to any server. This removes the need to trust the central authority to handle your raw data.
-
 ### Google RAPPOR
 
 In 2014, Google introduced [Randomized Aggregatable Privacy-Preserving Ordinal Response](https://arxiv.org/pdf/1407.6981) (RAPPOR), their [open source](https://github.com/google/rappor) implementation of differential privacy, with a few improvements.
 
+#### Local Differential Privacy
+
+In Google's implementation, noise is added to data on-device before it's sent off to any server. This removes the need to trust the central authority to handle your raw data, an important step in achieving truly anonymous data collection.
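
To make the moved paragraph concrete, here is a minimal sketch of local differential privacy using classic randomized response, the technique RAPPOR builds on. This is a simplification rather than RAPPOR's actual Bloom-filter-based mechanism, and the parameter `p = 0.75`, the function names, and the simulated 30% attribute rate are all illustrative choices, not taken from the post or Google's implementation. Each device perturbs its own bit before reporting, so the server only ever sees noisy values yet can still recover the aggregate rate.

```python
import random

def randomize_bit(true_bit: int, p: float = 0.75) -> int:
    """On-device step: report the true bit with probability p,
    otherwise report a uniformly random bit. The raw value never
    leaves the device."""
    if random.random() < p:
        return true_bit
    return random.randint(0, 1)

def estimate_true_rate(reports: list[int], p: float = 0.75) -> float:
    """Server-side step: debias the aggregate.
    E[reported rate] = p * true_rate + (1 - p) * 0.5."""
    reported_rate = sum(reports) / len(reports)
    return (reported_rate - (1 - p) * 0.5) / p

# Simulate 100,000 users, 30% of whom truly have the attribute.
users = [1 if random.random() < 0.3 else 0 for _ in range(100_000)]
reports = [randomize_bit(u) for u in users]
print(f"Estimated rate: {estimate_true_rate(reports):.3f}")  # close to 0.30
```

Because the server only receives already-randomized bits, no individual report reveals a user's true value with certainty, while the debiased aggregate still converges on the population statistic as the number of reports grows.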