this post was submitted on 22 Jul 2024

World News

Apple is failing to effectively monitor its platforms or scan for images and videos of child sexual abuse, child safety experts allege, raising concerns about how the company will handle the growing volume of such material associated with artificial intelligence.

The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) accuses Apple of vastly undercounting how often child sexual abuse material (CSAM) appears in its products. In a single year, child predators used Apple’s iCloud, iMessage and FaceTime to store and exchange CSAM in more cases in England and Wales alone than the company reported across all other countries combined, according to police data obtained by the NSPCC.

Through data gathered via freedom of information requests and shared exclusively with the Guardian, the children’s charity found Apple was implicated in 337 recorded offenses of child abuse images between April 2022 and March 2023 in England and Wales. In 2023, Apple made just 267 reports of suspected CSAM on its platforms worldwide to the National Center for Missing & Exploited Children (NCMEC), a stark contrast to its big tech peers: Google reported more than 1.47m and Meta more than 30.6m, per NCMEC’s annual report.

[–] [email protected] 59 points 1 month ago (3 children)

There is no ‘good’ way of monitoring these platforms without a massive intrusion of privacy, just as there is no good way of monitoring what people store on their hard disks or memory sticks, or burn onto DVDs/CDs and send through the mail.

[–] [email protected] 21 points 1 month ago

The trouble is that Apple (after a lot of protest against their proposed "solution") didn't implement the scanning measures that other platforms did (see the sketch after this comment). For once I think they did the right thing, but it's a difficult position to defend against the NSPCC.

We need to value the privacy of people more as a society IMHO.
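For context on the scanning measures the comment above refers to: other large platforms typically compare uploaded images against databases of hashes of already-known CSAM (PhotoDNA-style perceptual hashing) and report matches to NCMEC. The snippet below is a minimal, hypothetical sketch of that hash-list matching idea, not any platform's actual implementation; it uses an ordinary SHA-256 digest as a stand-in for a perceptual hash, and the hash value and function names are invented for illustration.

```python
import hashlib

# Hypothetical, simplified illustration of server-side hash-list matching.
# Real systems use perceptual hashes that tolerate resizing and re-encoding;
# a plain SHA-256 only matches byte-identical files.

# Placeholder digest standing in for the hash of a known abusive image,
# as would be supplied by a clearinghouse such as NCMEC.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def should_flag(upload: bytes) -> bool:
    """Flag an upload for human review if its digest is on the known list."""
    return digest(upload) in KNOWN_HASHES

if __name__ == "__main__":
    sample = b"example image bytes"
    print(should_flag(sample))  # False: not in the known-hash list
```

The privacy objection in the thread follows directly from this design: to compute the hash at all, the scanner has to read every image a user uploads, whether on the server or, as Apple's abandoned 2021 proposal would have done, on the device itself.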