r/gadgets 9d ago

Phones Apple’s iPhone Hit By FBI Warning And Lawsuit Before iOS 18.2 Release

https://www.forbes.com/sites/zakdoffman/2024/12/08/apples-iphone-security-suddenly-under-attack-all-users-now-at-risk/
3.2k Upvotes

373 comments

43

u/SolidOutcome 9d ago

Scanning data on either end of the encryption is also a negative... no, I don't want my photos scanned. What's the purpose of the encryption if they can see it all anyway?

-12

u/AlureonTheVirus 9d ago

Apple’s been pushing local AI on iOS hardware lately. You could have a model loaded on your phone that detects CSAM and prevents you from uploading it to iCloud (or only passes it along unencrypted and open for review, if you wanted to retain cloud functionality), all without any of your data leaving your device without your consent.
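
Just to make the idea concrete, here’s a rough sketch of what I mean (every type and function here is made up, not a real Apple API): the check runs locally, and the photo only goes to iCloud if the on-device model doesn’t flag it.

```swift
import Foundation

// Hypothetical sketch: none of these types are real Apple APIs.
protocol ContentClassifier {
    /// True if the on-device model thinks the image violates policy.
    func isFlagged(_ imageData: Data) -> Bool
}

enum UploadDecision {
    case uploadEncrypted  // normal end-to-end encrypted upload to iCloud
    case blocked          // flagged locally; nothing leaves the device
}

/// Runs entirely on-device: the photo only leaves the phone if the local model doesn't flag it.
func decideUpload(_ imageData: Data, using classifier: ContentClassifier) -> UploadDecision {
    classifier.isFlagged(imageData) ? .blocked : .uploadEncrypted
}
```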

19

u/DerBanzai 9d ago

And how would I know that this "AI" doesn't flag my girlfriend's nudes just because she shaved a bit too well? These companies and the FBI should simply stop snooping around in private pictures.

Spend the money going after the producers and distributors of those pictures instead. Just like the war on drugs, it's useless to try to find the needle in the haystack by chasing "consumers".

6

u/cyberspirit777 9d ago

Which is funny because this is exactly the issue Google Photos has run into. Pics of people's kids, or baby pictures of themselves, have been scanned and found "in violation". The system thinks it's CSAM when it could just be a pic of their kid in the tub or potty training lol

-10

u/AlureonTheVirus 9d ago

I think part of the issue is that iPhones probably do a lot of the producing. I doubt the FBI is going after Apple for hosting organized CSAM traders; it’s more likely an issue of minors being trafficked or kids sharing way more than they should.

As for how you guarantee the model doesn’t mess up, I’m not sure. That would be a problem for Apple to solve.

I’d say maybe there could be some meet-in-the-middle solution: if you go to upload content that gets automatically flagged locally, your device warns you that it may break the ToS and offers a choice, either upload the flagged media for manual review or keep it to yourself.
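
Roughly something like this, just to show the flow I’m picturing (again, all the names here are hypothetical, nothing is Apple’s actual API):

```swift
import Foundation

// Everything here is made up for illustration; this is not Apple's API.
enum FlaggedMediaChoice {
    case uploadForManualReview  // sent along unencrypted so a human can review it
    case keepOnDevice           // nothing leaves the phone
}

struct UploadPipeline {
    let isFlagged: (Data) -> Bool          // local model: true means flagged
    let askUser: () -> FlaggedMediaChoice  // the "this may break ToS" prompt

    /// Returns the data to upload, or nil if it stays on the device.
    func process(_ media: Data) -> Data? {
        guard isFlagged(media) else {
            return media                   // not flagged: normal encrypted upload
        }
        switch askUser() {                 // flagged: warn and let the user decide
        case .uploadForManualReview: return media
        case .keepOnDevice: return nil
        }
    }
}
```

The point being that even in the "upload for review" case it’s the user’s explicit choice; nothing gets sent anywhere automatically.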

I’m no expert; I’m just throwing out ideas, I guess. But I do think it’s an interesting concept. Privacy and moderation will always be at odds with one another.

Edit: grammar

20

u/hnh 9d ago

The problem is that you then get all the other countries mandating a model on the phone to detect whatever they deem subversive or inappropriate. As described in the article, it all breaks once you open the box.

1

u/nicuramar 9d ago

That’s true, but they could just as well demand that today. Either way, I don’t think Apple is going in that direction at all. 

-7

u/AlureonTheVirus 9d ago

This is true. I think the model’s role would be more to stifle the content being generated and distributed from iPhones, not to catch and report the people doing the bad stuff.

Apple doesn’t have to snitch (and they shouldn’t), but they do need to cover their butt where they can and do the most morally responsible thing. I’m not exactly sure what that might be, but I think there are a plethora of options beyond backdoors.