From a584e43bf9622548a31f9bf8ffc2ad57aea50aa0 Mon Sep 17 00:00:00 2001
From: fria <138676274+friadev@users.noreply.github.com>
Date: Mon, 4 Aug 2025 09:43:49 -0500
Subject: [PATCH] wording

Co-authored-by: redoomed1
Signed-off-by: fria <138676274+friadev@users.noreply.github.com>
---
 blog/posts/homomorphic-encryption.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/blog/posts/homomorphic-encryption.md b/blog/posts/homomorphic-encryption.md
index c89decfa2..325cb06b6 100644
--- a/blog/posts/homomorphic-encryption.md
+++ b/blog/posts/homomorphic-encryption.md
@@ -68,7 +68,7 @@ An absolutely devastating breach of user privacy by any metric.
 
 ### OpenAI
 
-When services process our data in the clear, we not only run the risk of the service themselves abusing their access to that data, but also court orders legally requiring them to retain data.
+When services process our data in the clear, we run the risk of not only the service themselves abusing their access to that data, but also court orders legally requiring them to retain data.
 
 OpenAI was required to [retain](https://arstechnica.com/tech-policy/2025/06/openai-says-court-forcing-it-to-save-all-chatgpt-logs-is-a-privacy-nightmare) all ChatGPT user logs, even deleted ones. This is devastating for user privacy when you consider that ChatGPT handles over [1 billion](https://x.com/OpenAINewsroom/status/1864373399218475440) queries per day.