(I am not an expert, just a hobby self-hoster)
Think of how police obtain information about people. They usually conduct an investigation involving questioning and warrants to obtain records and build a case. To search records, they need either someone's consent or a warrant from a judge.
Or, they could just buy info from a data broker and obtain a massive amount of information about someone.
Imagine if every company has this info and can tie it into your daily life. Google probably has your location history and can see exactly what routes you've taken lately. They can use that information, with timestamps, to estimate your speed. What if they sold it to your car insurance company, who then used it to raise your rates because you've been labeled a speeder?
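As a back-of-the-envelope illustration (the coordinates, timestamps, and function names below are made up for the example), two timestamped points from a location history are all it takes to estimate speed:

```python
# Minimal sketch: estimating speed from two hypothetical timestamped GPS points.
from math import radians, sin, cos, asin, sqrt
from datetime import datetime

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Two made-up samples, one minute apart
p1 = (40.7128, -74.0060, datetime(2024, 10, 18, 12, 0, 0))
p2 = (40.7260, -74.0010, datetime(2024, 10, 18, 12, 1, 0))

dist_km = haversine_km(p1[0], p1[1], p2[0], p2[1])
hours = (p2[2] - p1[2]).total_seconds() / 3600
print(f"~{dist_km / hours:.0f} km/h")  # average speed between the two samples
```

One sample per minute is plenty to flag someone averaging well over the limit on a stretch of road.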
What if your purchase history is sold to your health insurance provider and they raise your deductible because most of your food purchases are at unhealthy fast food joints?
Now, with AI being shoved into every nook and cranny of the tech we use, AI can quickly build a profile of you if it is fed your chat history. Even your own voice is not safe if it can be accessed by AI. This can be used to emulate you: your interests, chats, knowledge, and voice. People could use this to steal your identity or access your accounts.
Actually, police (and governments) don't need to purchase your data. They can gather anything and everything from what people share publicly and constantly on social media. Countless people have been arrested because of what they shared publicly and the metadata included with that share.
If they need criminal records, they already have immediate access to them.
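As a concrete illustration of how much rides along with an ordinary share, here's a rough Python/Pillow sketch (the file name is a placeholder, and it assumes the photo's EXIF data hasn't been stripped, which some platforms do and others don't) that dumps the GPS metadata embedded in a picture:

```python
# Rough sketch: reading location metadata out of a photo with Pillow.
from PIL import Image, ExifTags

exif = Image.open("photo.jpg").getexif()
gps = exif.get_ifd(0x8825)  # 0x8825 is the standard GPSInfo IFD pointer

# Map numeric GPS tags (latitude, longitude, timestamp, ...) to readable names
for tag_id, value in gps.items():
    print(ExifTags.GPSTAGS.get(tag_id, tag_id), value)
```

On a typical phone photo that still has its EXIF intact, that loop prints the latitude, longitude, altitude, and timestamp of where and when the picture was taken.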
The concern isn't that you're doing something wrong; it's that the data you put out there can be used against you in countless ways. Marketing, sales, and so on are the least of your worries. If anyone wants to threaten you or your loved ones, or even trick you into thinking a loved one is in danger, most people don't realize how easy that is with the data they give away daily.
That's why I said this:
They don't need warrants for location data if it's bought from a company that sells that data.
Whether or not it's admissible in court is another question, though.