Pika@sh.itjust.works · English · 7 hours
In the US, companies (or wherever the company is located, last I knew) are legally mandated to report specific things, such as CSAM, if they come across it.
The issue shouldn’t be that they’re reporting it; the issue should be that they have the capability to see it in the first place. This isn’t me defending CSAM or anything like that, but in a decent storage system Google shouldn’t be able to see what you have at all, let alone what the images actually are.
- Blue_Morpho@lemmy.world · English · 6 hours
They not only look at your files but will decrypt any encrypted zip files to see what you have.
- 🍉 Albert 🍉@lemmy.world · English · 8 hours
Severely mixed feelings.
Glad they caught him, but corporations casually snooping through your data and reporting whatever they want is definitely not a good thing.
Leon@pawb.social · English · 6 hours
There was a gay guy here in Sweden who got assaulted and kidnapped by masked police because some American company had found CSAM on his account while crawling through his Yahoo email.
Only it wasn’t CSAM; the photos depicted the man’s 30-year-old twinky boyfriend.
No restitution. No police were punished for assaulting a suspect proven innocent. The man and his boyfriend were both humiliated.
I’ve no mixed feelings about it. Spying through private data is entirely unforgivable. There are plenty of pedos out there who get caught and nothing happens anyway. They don’t need to violate innocent people’s privacy to do their job.
Like if the ends justify the means you can end all suffering in the world by just nuking everything. All problems solved.
Edit: pesos → pedos
- tidderuuf@lemmy.world · English · 8 hours
Microsoft has been doing this for years. It started with OneDrive, but now that they’ve enabled “analytics” in every product that might connect to the internet, they can have it all searched.
Supposedly it is first filtered by algorithms but that shit is still being uploaded somewhere other than your hard drive.
wizardbeard@lemmy.dbzer0.com · English · 7 hours
I believe it was in preview build versions of Win 7 or 10 where researchers found it was sending the generated thumbnails of images on your PC to Redmond (MS HQ). Can’t remember if they said it was for CSAM detection or just a debugging feature in the preview builds.
- 0x0@lemmy.zip · English · 3 hours
the generated thumbnails of images on your PC
So the precursor to Recall?
- obvs@lemmy.world · English · 7 hours
Unfortunately, the negative effects from companies like Google turning in completely ethical people for doing things that should be completely legal and uncontroversial will do drastically more damage than the positive effects from said companies turning in the poorest of the pedophiles.
- 7 hours
They’re suggesting it was automated hash based recognition.
I don’t have a problem with CSAM hash matching.
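Hash matching of this kind is simple to sketch: compute a digest of each uploaded file and compare it against a vendor-maintained list of known-bad digests, so the scanner never has to interpret the image itself. A minimal, hypothetical illustration using a plain cryptographic hash (real services such as Microsoft’s PhotoDNA or Google’s CSAI Match use perceptual hashes instead, so that resized or re-encoded copies still match — this sketch only shows the exact-match idea):

```python
import hashlib

# Hypothetical stand-in for a vendor-supplied list of known-bad digests.
# This particular digest is just the SHA-256 of the bytes b"foo".
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_hash(data: bytes) -> bool:
    """Flag a file only if its digest is already on the known list;
    the content itself is never inspected or interpreted."""
    return sha256_of(data) in KNOWN_HASHES

print(matches_known_hash(b"foo"))  # True: digest is on the list
print(matches_known_hash(b"bar"))  # False: unknown content is never flagged
```

The point of the design is the asymmetry: an exact-hash scanner can only recognize files it has already been told about, which is why it is considered far less invasive than content-interpreting classifiers.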
- 7 hours
Sure, until it starts flagging normal pictures with its janky AI and you get your door kicked in based on a warrant signed by Google.
Leon@pawb.social · English · 6 hours
This literally already happened here in Sweden. A guy got assaulted by masked police in the middle of the night because an American company had gone through the photos in his Yahoo mail and flagged pictures of his 30-year-old boyfriend as possible CSAM.
People like to think that Sweden is progressive, etc., and I’d rebut that with this. If it can happen here, it can happen anywhere.
- 🍉 Albert 🍉@lemmy.world · English · 7 hours
My issue is that we have a framework for corporations to scan all your data and inform the state. It’s used to stop CSAM, but it’s a matter of state policy whether said structure will be used to fight dissent.
- 7 hours
I agree. We’ve seen this happening in the USA “yes technically they can do that but they would never”. Now we know better.
- org@lemmy.org · English · 7 hours
“The first image in the ‘Nudity’ collection … depicted who I believe to be David Edward-Ooi Poon without a shirt, taking a selfie of himself while sticking out his tongue over an unconscious adult female,” the search-warrant application states. The document goes on to describe the woman in the photo as naked below the waist and wearing a dark-coloured eye mask over her eyes.
The detective alleges that that photograph and others she examined appeared to be stored in a folder on the iPhone titled “Girls I Drugged And Raped.”
Doesn’t sound like hashes to me.
- 7 hours
That is the result of the search warrant, not the trigger.
- 9 hours
The detective alleges that that photograph and others she examined appeared to be stored in a folder on the iPhone titled “Girls I Drugged And Raped.”
- Zetta@mander.xyz · English · 6 hours
If you read the article, it looks like only 9 images were originally reported by Google. The images in the folder titled “Girls I Drugged And Raped” were on his iPhone, which they broke into with Cellebrite.
- 7 hours
The images I am referring to are likely distinct from the ones in the title, as those are from his iPhone and Google is the one who reported him. Regardless, the article says the detective looked at one of the Google-reported images. Whether they just referenced a known hash I don’t know for sure, but I think it’s pretty well known that FAANG companies scan basically all images for CSAM nowadays.
- 5 hours
Kind of fine with this. It gives me the ick that they can do that, but so does CSAM, and I don’t see a middle ground.
- 4 hours
I don’t know why anyone would expect Google not to be sifting through what you upload to its cloud.



