this post was submitted on 06 Jul 2024
327 points (92.7% liked)

Technology

all 29 comments
[–] [email protected] 105 points 1 month ago (1 children)

So what's the issue here? Oh no, my private documents are in plain text on my private computer. Fucking morons.

[–] [email protected] 3 points 1 month ago (2 children)

Linux nerd water cooler chat.

[–] [email protected] 19 points 1 month ago

"One of the great things about Linux is that everything can be treated as a text file... Hey wait a minute, ChatGPT is using fucking plaintext files??"

[–] [email protected] 3 points 1 month ago (1 children)
[–] [email protected] 90 points 1 month ago

There really wasn’t an expectation of privacy with this. This is not a surprise.

[–] [email protected] 87 points 1 month ago (1 children)

Microsoft's much-heralded Notepad.exe was storing files as plain text

Same level of security concern. Quit putting your sensitive data into apps that aren't meant for it.

[–] [email protected] 10 points 1 month ago

Yup. Especially apps that are pushing the wonders of cloud services to share that data everywhere.

[–] [email protected] 56 points 1 month ago (1 children)

Microsoft's much-heralded Word app was storing documents as unencrypted DOCX files, leaving them viewable by any malware.

[–] [email protected] 1 points 1 month ago* (last edited 1 month ago)

We mustn't enter any private info into a large language model (LLM) in the first place. The conversations are probably used to train AI models.

There should be two disclaimers on any LLM:

  1. The LLM's responses aren't always based on facts. It can sometimes give wrong information.

  2. Users mustn't enter any private info into the LLM.

[–] [email protected] 51 points 1 month ago

So many apps use SQLite or JSON files for storage without encryption; this doesn't seem like much of a discovery.

In any case, don't share PII or any of your deepest, darkest secrets with it.
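For what it's worth, the point above can be sketched in a few lines: an unencrypted SQLite database written with Python's standard library is readable verbatim by any process that can open the file, no key required. The file path, table name, and contents here are invented for illustration.

```python
import os
import sqlite3
import tempfile

# Write a "chat log" the way many apps do: an unencrypted SQLite file.
path = os.path.join(tempfile.mkdtemp(), "chats.db")
db = sqlite3.connect(path)
db.execute("CREATE TABLE messages (role TEXT, content TEXT)")
db.execute("INSERT INTO messages VALUES ('user', 'pretend this is sensitive')")
db.commit()
db.close()

# Any other process with filesystem access can read it back, in the clear.
snoop = sqlite3.connect(path)
rows = snoop.execute("SELECT content FROM messages").fetchall()
print(rows[0][0])  # prints: pretend this is sensitive
```

Nothing about SQLite (or JSON) is insecure per se; the data is simply only as protected as the file itself.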

[–] rottingleaf 41 points 1 month ago (1 children)

I store almost everyfuck in plain text, so what?

Oh, somebody wants to use techbro stuff and expect security.

[–] [email protected] 6 points 1 month ago (2 children)

Many people now use ChatGPT like they might use Google: to ask important questions, sort through issues, and so on. Often, sensitive personal data could be shared in those conversations.

[–] [email protected] 11 points 1 month ago* (last edited 1 month ago)

Don't a lot of people also keep their tax information as plain text on their PC? If someone's really worried about that stuff being leaked, I think it's on them to download VeraCrypt or something, and also not to use ChatGPT for sensitive stuff, knowing that OpenAI and Apple will obviously use it as training data.
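As a rough sketch of what encryption at rest looks like for a single file: VeraCrypt encrypts whole volumes, so the `openssl` one-liner below is just a stand-in to illustrate the idea. The filenames, contents, and passphrase are all made up.

```shell
# Create a "sensitive" file, encrypt it with a passphrase, then decrypt it.
printf 'AGI: 52000\n' > taxes.txt
openssl enc -aes-256-cbc -pbkdf2 -salt -pass pass:hunter2 -in taxes.txt -out taxes.enc
openssl enc -d -aes-256-cbc -pbkdf2 -pass pass:hunter2 -in taxes.enc
```

Without the passphrase, `taxes.enc` is just opaque bytes to any other process or piece of malware reading the disk.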

[–] rottingleaf 3 points 1 month ago

Well, there's a good side to this - at least the recipe of that totally not poisonous green cocktail will be available from logs.

[–] [email protected] 39 points 1 month ago
[–] [email protected] 10 points 1 month ago

This is the best summary I could come up with:


OpenAI announced its Mac desktop app for ChatGPT with a lot of fanfare a few weeks ago, but it turns out it had a rather serious security issue: user chats were stored in plain text, where any bad actor could find them if they gained access to your machine.

As Threads user Pedro José Pereira Vieito noted earlier this week, "the OpenAI ChatGPT app on macOS is not sandboxed and stores all the conversations in plain-text in a non-protected location," meaning "any other running app / process / malware can read all your ChatGPT conversations without any permission prompt."

OpenAI chose to opt out of the sandbox and store the conversations in plain text in a non-protected location, disabling all of these built-in defenses.

OpenAI has now updated the app, and the local chats are now encrypted, though they are still not sandboxed.

It's not a great look for OpenAI, which recently entered into a partnership with Apple to offer chatbot services built into Siri queries in Apple operating systems.

Apple detailed some of the security around those queries at WWDC last month, though, and they're more stringent than what OpenAI did (or to be more precise, didn't do) with its Mac app, which is a separate initiative from the partnership.


The original article contains 291 words, the summary contains 211 words. Saved 27%. I'm a bot and I'm open source!