Ben Halpern

What is Apple thinking?

It's been a few days. You've probably read about this.

Apple plans to scan US iPhones for child abuse imagery

Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.

Apple detailed its proposed system — known as “neuralMatch” — to some US academics earlier this week, according to two security researchers briefed on the virtual meeting. The plans could be publicised more widely as soon as this week, they said.

The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified. The scheme will initially roll out only in the US.

Apple declined to comment.

Has Apple said anything more about this yet? I have yet to see anyone in our industry offer a rationale for why the company that fought so hard to deny the FBI access to a terrorist's iPhone is now proactively building surveillance functionality into the device in this way. I think we can all agree that stopping child predators is a good thing, but not like this.

With a few days to digest this and gather info/opinions, what are your thoughts?

Oldest comments (44)

Ben Halpern

Apple had been positioning itself so distinctly against where Google stood on all of this stuff. This was your option if you weren't in favor of all the Google stuff.

There's even a universe where Apple outlines specifically how and why this needs to work in this specific way and offers up an explicit answer to dissuade any slippery slope concerns, but it just doesn't seem that way.

Chris Dodds

John Gruber has some good analysis on this: daringfireball.net/2021/08/apple_c...

bonespiked

It's possible that since you back up your photos to Apple's servers - and sign a EULA saying they own the images (in order for them to possess them) - they have a vested interest in NOT owning images depicting the horrific subject in question. That said, they could have done this scanning as the images are uploaded to iCloud - maybe it's just a PR thing where they don't want their brand associated with the topic at all.

Karan Gandhi

I liked Dr. Neal Krawetz's take on this issue.
hackerfactor.com/blog/index.php?/a...

Not sure why Apple is doing this, or rather why Apple chose to disclose it now. It's very odd.

Andrew Brown 🇨🇦

The next 20 years are going to be really interesting.

Ali Sherief

Apple backing down on an announced new encryption system under pressure from the FBI should've raised warning signs.

Forest Hoffman

It sounds like Apple isn't comfortable with other parties spying on their customers. Only they get to spy on their customers.

Kasey Speakman • Edited

Based on the headlines I was ready to be upset with Apple. But everything is clickbait nowadays. After reading more of the details I don't have as much of a problem with it.

  • On-device content-based scanning is opt-in
    • only for child accounts (12 and under)
    • only parent accounts are notified
  • iCloud scanning is opt-out (by turning off iCloud Photo Library sync)
    • scanning matches fingerprints of known child-abuse images against the CSAM database (rough sketch of the matching idea below)
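
To make the "fingerprints" part concrete, here's a toy sketch of what matching a photo library against a database of known-image hashes could look like. It's only an illustration under simplifying assumptions: Apple's system reportedly uses a perceptual hash (NeuralHash) plus cryptographic threshold techniques, whereas this sketch uses plain SHA-256, a made-up database entry, a hypothetical `./photos` folder, and an assumed threshold of 30 matches - none of which are Apple's actual implementation.

```python
import hashlib
from pathlib import Path

# Toy stand-in for the CSAM hash database: fingerprints of known illegal
# images. The entry below is a placeholder, not a real hash. Real systems
# use perceptual hashes (e.g. NeuralHash) so altered copies still match;
# exact SHA-256 here is a simplification.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Only escalate after several distinct matches, roughly mirroring the
# reported "threshold before human review" design. 30 is an assumption
# for illustration, not Apple's actual threshold.
MATCH_THRESHOLD = 30


def fingerprint(path: Path) -> str:
    """Return a hex fingerprint of the file's bytes (exact hash, for the toy)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def scan_library(photo_dir: Path) -> list[Path]:
    """Return photos whose fingerprints appear in the known-bad database."""
    matches = []
    for photo in photo_dir.glob("*.jpg"):
        if fingerprint(photo) in KNOWN_BAD_HASHES:
            matches.append(photo)
    return matches


if __name__ == "__main__":
    matches = scan_library(Path("./photos"))
    if len(matches) >= MATCH_THRESHOLD:
        # In the reported design, only at this point would flagged material
        # go to human review; nothing is reported below the threshold.
        print(f"{len(matches)} matches - would escalate to human review")
    else:
        print(f"{len(matches)} matches - below threshold, nothing reported")
```

The relevant property of this kind of matching is that photos are only compared against fingerprints of images that are already known, so a novel photo (a family picture, say) can't produce a match on its own.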

I am concerned about what doors this opens in the future for privacy invasion. However, I think the only comprehensive way to address this concern is with laws which guard digital privacy. Otherwise policy is up to each company's leadership. And even if I believed they were doing things the "right" way for privacy now, leadership eventually changes.

leob • Edited

I tend to agree. I've read Apple's FAQ, and their approach does look focused and targeted; it's not a broad-sweep, big-brother kind of privacy invasion (there's also no automatic reporting to law enforcement, which would arguably be a bridge too far).

I'd even go further than this: I'd be fine with them filtering/flagging other horrible stuff (domestic violence, animal abuse, whatever) with this hash technology if they've got reliable databases of those - but their response should be a warning to the user trying to upload it, telling them to stop or risk termination of their iCloud service.

And of course state all of this clearly in their user agreements.

More than happy with ways for them to stop horrible stuff being stored on their cloud (well yeah, it's their cloud alright).

Jason Glass

I'm sure Getty Images would be turned in based on a similar scan. Context is everything in all things.

leob

There would clearly be a difference between Google or Apple scanning stuff that's being stored in their clouds, and them installing "spyware" ON your personal device.

leob • Edited

Ben, is the statement that "Apple intends to install software on American iPhones" factually correct? Is Apple planning to install software on US iPhones, or will they put systems in place to scan images stored in iCloud by American users? Big difference if you ask me (although even then it's still a form of surveillance and "big brother").

What irks me most is that they would then go on to scan for stuff like child porn but not other vile imagery like domestic abuse, animal abuse, and I could go on. Why take measures against one form of abuse but not the others - is there some sort of agreed-upon hierarchy of evilness? It reeks of hypocrisy and it's a slippery slope, which is why companies should refrain from this.

Jay Jeckel

Yes, they are going to install software on the phones. The FAQ Apple put out as well as the rest of the released documentation is very clear that all scanning will be done on-device. See the FAQ PDF for details.

As for your second paragraph: on one side, Apple has hashes for CP images that already exist, and both the acts depicted and the images themselves are illegal. On the other side, while acts of domestic and animal abuse are illegal, images of those acts generally aren't illegal to possess. The supposed purpose of Apple's program is to combat possession of illegal CP images, not to stop the perpetration of illegal acts, so there's nothing hypocritical about that aspect of it.

I do agree it's a slippery slope that no company should go down.

leob • Edited

Thanks for clarifying, sound reasoning ... possessing those images is indeed illegal, so I think they have a pretty strong case in saying: we just don't want this stuff in our cloud, ergo we need to block it ... because, well, Apple could even be held liable for storing it on their servers and being complicit in a crime.

And with the hashing technology they arguably have the least questionable approach you could think of. So yeah, slippery slope, still, but there is something to be said for this.

(if they'd not just block it but also report perpetrators to law enforcement then I'd say "bridge too far", but that's not the case, apparently)