
Apple Will Now Scan Your iPhone For Sexual Abuse Photos


Doc Reeves

Recommended Posts


I don’t promote/condone sexual abuse or pedophilia at all, but I’m not cool with this.  

https://www.theguardian.com/commentisfree/2021/aug/14/sexual-abuse-images-apple-tech-giant-iphones-us-surveillance

Quote

Last week, Apple announced two backdoors in the US into the encryption that protects its devices. One will monitor iMessages: if any photos sent by or to under-13s seem to contain nudity, the user may be challenged and their parents may be informed. The second will see Apple scan all the images on a phone’s camera roll and if they’re similar to known sex-abuse images flag them as suspect. If enough suspect images are backed up to an iCloud account, they’ll be decrypted and inspected. If Apple thinks they’re illegal, the user will be reported to the relevant authorities.

Action on the circulation of child sexual abuse imagery is long overdue. Effective mechanisms to prevent the sharing of images and the robust prosecution of perpetrators should both receive the political priority they deserve. But Apple’s proposed measures fail to tackle the problem – and provide the architecture for massive expansion of state surveillance.

Historically, the idea of scanning customers’ devices for evidence of crime comes from China. It was introduced in 2008 when a system called Green Dam was installed on all PCs sold in the country. It was described as a porn filter, but its main purpose was to search for phrases such as “Falun Gong” and “Dalai Lama”. It also made its users’ computers vulnerable to remote takeover. Thirteen years later, tech firms in China are completely subservient to the state – including Apple, which keeps all the iCloud data of its Chinese customers in data centres run by a state-owned company.

Scanning photos is tricky to do at scale. First, if a program blocks only exact copies of a known illegal image, people can just edit it slightly. Less skilled people might go out and make fresh images, which, in the case of sexual abuse imagery, means fresh crimes. So a censor wants software that flags up images similar to those on the block list. But there are false alarms, and a small system of the kind that will run in a phone might have an error rate as high as 5%. Applied to the 10bn iPhone photos taken every day, a 5% false alarm rate could mean 500m images sent for secondary screening.

In order to prevent this, Apple will only act if the primary screening on a phone detects a certain threshold of suspect images, probably 10 of them. Each photo added to a camera roll will be inspected and, when it’s backed up to iCloud, it will be accompanied by an encrypted “safety voucher” saying whether it’s suspect or not. The cryptography is designed so that once 10 or more vouchers are marked as unsafe, Apple can decrypt the images. If they look illegal, the user will be reported, and their account will be locked.

A well-known weakness of machine-learning systems such as the one Apple proposes is that it’s easy to tweak a photo so that the system categorises it incorrectly. Pranksters may tweak photos of cats so phones mark them as abuse, for example, while the gangs who sell real abuse images work out how to sneak them past the censor. But images are only part of the problem. Curiously, Apple proposes to do nothing about live streaming, which has been the dominant medium for online abuse since at least 2018. And the company has said nothing about how it will track where illegal images come from.

But if the technical questions are difficult, the policy questions are far harder. Until now, democracies have allowed government surveillance in two sets of circumstances: first, if it is limited to a specific purpose; second, if it is targeted at specific people. Examples of special-purpose surveillance include speed cameras, and the software in photocopiers that stops you copying banknotes. Targeting specific people usually requires paperwork such as a warrant. Apple’s system looks like the first type of these – but once it is built into phones, Macs and even watches, in a way that circumvents their security and privacy mechanisms, it could scan for whatever else – or whoever else – a government demands.

These concerns are not abstract. Nor are they limited only to countries considered authoritarian. In Australia, the government threatened to prosecute a journalist over photos of Australian troops killing civilians in Afghanistan, arguing that the war-crime images were covered by national security laws. In addition, Australian law empowers ministers to compel firms to retrain an existing surveillance system on different images, vastly expanding the scope of Apple’s proposed snooping.

Closer to home, the European Union just updated the law allowing tech firms to scan communications for illegal images and announced that a new child-protection initiative will extend to “grooming”, requiring firms to scan text, too. In Britain, the Investigatory Powers Act will also enable ministers to order a firm to adapt its systems where possible to assist in interception. Your iPhone may be quietly looking for missing children, but it may also be searching for the police’s “most wanted”.

Legally, the first big fight is likely to be in the US, where the constitution forbids general warrants. But, in a case about drug sniffer dogs, a court found that a search that finds only contraband is legal. Expect the supreme court to hear privacy advocates claiming that your iPhone is now a bug in your pocket, while Apple and the FBI argue that it’s just a sniffer dog.

Politically, the tech industry has often resisted pressure to increase surveillance. But now that Apple has broken ranks, it will be harder for other firms to resist demands by governments. Child protection online is an urgent problem, but this proposal will do little to prevent these appalling crimes, while opening the floodgates to a significant expansion of the surveillance state.

Ross Anderson is professor of security engineering at Cambridge University and at Edinburgh University
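The arithmetic in the quoted piece is easy to sanity-check, and it also shows why the 10-image threshold matters. A rough back-of-the-envelope sketch (the 10bn photos/day, the 5% worst-case error rate, and the threshold of 10 are the article's figures, not Apple's claims; the per-user photo count and the assumption of independent errors are mine):

```python
# Back-of-the-envelope check of the article's false-alarm arithmetic.
from math import comb

photos_per_day = 10_000_000_000
false_positive_rate = 0.05  # the article's worst-case figure, not Apple's claim
print(f"{photos_per_day * false_positive_rate:,.0f} false alarms per day")  # 500,000,000

# Chance an innocent account trips the 10-match threshold in a month,
# assuming ~100 new photos per user and independent errors (both assumptions).
photos_per_user_per_month = 100
threshold = 10  # "probably 10 of them", per the article
p_over_threshold = sum(
    comb(photos_per_user_per_month, k)
    * false_positive_rate**k
    * (1 - false_positive_rate) ** (photos_per_user_per_month - k)
    for k in range(threshold, photos_per_user_per_month + 1)
)
print(f"P(innocent account crosses the threshold in a month) ≈ {p_over_threshold:.2%}")
```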

 

Edited by Doc Reeves


My kid is years away from having any sort of phone, but if he was of phone age, I'd appreciate the anti-sexting feature. Kids do stupid stuff, but I'd rather the more destructive stupidity be annoying to accomplish so that the lazy/less-destructive stupidity wins out. But maybe I'm underestimating the intrinsic motivation of sending dick pics.

  • Like 1

5 minutes ago, Mole said:

My kid is years away from having any sort of phone, but if he was of phone age, I'd appreciate the anti-sexting feature. Kids do stupid stuff, but I'd rather the more destructive stupidity be annoying to accomplish so that the lazy/less-destructive stupidity wins out. But maybe I'm underestimating the intrinsic motivation of sending dick pics.

I appreciate the idea behind this too; however, I just feel like there needs to be some civil protection. To me this is almost like advertising that your Apple products are all infected with spyware similar to what the Saudis used on Bezos, but it’s cool because we are Apple. 

  • Hook 'Em 2

This is a horrible idea for any number of reasons.

 

1. Is pedophilia such a huge problem that it requires us to sacrifice all rights to privacy?

2. Disclaimer: I am not a fucking pedophile. 

3. So we allow the phone companies and law enforcement to share just on this one thing, or what?

4. I've been banned from Facebook for sharing the cover of that Zeppelin album with those naked kids crawling up a mountain. That was child porn according to Facebook. Imagine having to clear your name if you were arrested for having something like that on your phone.

5. This really freaks me out -- my son, 24 years old, points out how easy it is for others to put kiddy porn on your phone without you even knowing it. If the government wants to destroy you, they could just have some hacker upload a couple of gigs of filth onto your phone, and the burden would be on you to prove otherwise. 

 

  • Hook 'Em 6
  • Like 3

I’ve seen the programs Homeland Security uses to monitor child porn searches and downloads. This sounds like Apple is using similar software (quite a bit better than “not hot dog”). I think TwiceHorn is right that Apple is heading off legislation in this area. Child porn trafficking on the dark web is now constantly monitored, and LE adapts when traffickers move on to each new file sharing service. This must be the biggest blind spot LE is complaining about, and Apple thinks there’s a big risk that Congress will act and hopes to head that off. 


1 hour ago, softlynow said:

quite a bit better than “not hot dog”)

Holy shit this made me laugh more than is acceptable given the content of this thread. Jin Yaaaaaangggg!!!!!

I've always wondered just how different Apple's software is from Samsung's and others'. Obviously I'm no expert on tech and think iPhones are an abomination, but I have heard many tout Apple as this great firewall that even the government can't hack into. 


2 minutes ago, GW Hayduke said:

Why do you see the scanning of your iCloud pics for evidence of pedophilia as sacrificing all rights to privacy?  

Probably because it's being conducted in the same dragnet fashion as domestic wiretapping via the Patriot Act? It's not that you have anything to hide. It's that you don't want just anyone being able to access any personal shit. 

  • Hook 'Em 1
  • Like 2

1 hour ago, softlynow said:

I’ve seen the programs Homeland Security uses to monitor child porn searches and downloads. This sounds like Apple is using similar software (quite a bit better than “not hot dog”). I think TwiceHorn is right that Apple is heading off legislation in this area. Child porn trafficking on the dark web is now constantly monitored, and LE adapts when traffickers move on to each new file sharing service. This must be the biggest blind spot LE is complaining about, and Apple thinks there’s a big risk that Congress will act and hopes to head that off. 

Softly is correct!  This has been going on for years with files sent over the internet.  If it makes you feel better, they are not looking at your actual pictures; they are just comparing hash values or some other digital signature attached to files.  

I slept at a Holiday Inn Express last night so that makes me a qualified expert.
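For anyone wondering what "comparing hash values" means here: systems like this typically use perceptual hashes rather than exact checksums, so lightly edited copies of a known image still match. A toy sketch of that idea (the 64-bit hash values and the bit-distance threshold below are invented for illustration; this is not Apple's NeuralHash or Microsoft's PhotoDNA):

```python
# Toy illustration of perceptual-hash matching: images are reduced to short
# fingerprints, and two fingerprints "match" if they differ in only a few bits
# (Hamming distance), so minor edits to a known image still hit.
# The hashes and threshold here are made up for illustration.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit fingerprints."""
    return bin(a ^ b).count("1")

KNOWN_BAD_HASHES = {0x9F3A6C01D2E45B77, 0x0123456789ABCDEF}  # placeholder values
MATCH_THRESHOLD = 4  # max differing bits still counted as a match (assumed)

def is_flagged(photo_hash: int) -> bool:
    return any(hamming_distance(photo_hash, h) <= MATCH_THRESHOLD
               for h in KNOWN_BAD_HASHES)

# A hash one bit away from a known entry still matches; an unrelated hash does not.
print(is_flagged(0x9F3A6C01D2E45B76))  # True
print(is_flagged(0x7777777777777777))  # False
```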

Edited by bobcat1995
  • Hook 'Em 1

8 hours ago, TwiceHorn said:

I wonder if this is Apple trying to throw the gubmint a bone so they avoid something that makes them cough up all kind of stuff.

It 100% sounds like it.

Also, laughing at people not realizing that the data they upload is being scanned.  Dropbox has been doing it for years.


8 minutes ago, BigHorn'13 said:

Probably because it's being conducted in the same dragnet fashion as domestic wiretapping via the Patriot Act? It's not that you have anything to hide. It's that you don't want just anyone being able to access any personal shit. 

Are folks under the impression this is going to allow me to look at your personal files? Is that why some feel this is giving up all of their personal privacy?

This isn’t the pasting of personal files onto the internet for anyone to access. This won’t allow just anyone to access your personal shit.  This is an algorithm scanning data for evidence of illegal images, then Apple reviewing those images, and giving suspect images to authorities.  


Just now, atomheartbevo said:

It 100% sounds like it.

Also, laughing at people not realizing that the data they upload is being scanned.  Dropbox has been doing it for years.

Preemptively scanning every image on your camera roll is not "data you upload". Unless you mean upload to personal cloud storage which should be private. 


4 minutes ago, ztejas said:

Preemptively scanning every image on your camera roll is not "data you upload". Unless you mean upload to personal cloud storage which should be private. 

They don't scan when iCloud Photos is turned off.

In theory, the keys to your personal cloud storage are stored by you.  

Edit: Even though I have my own personal server for storing backups, etc., I assume that any/all of it can be compromised.

Edited by atomheartbevo

1 minute ago, atomheartbevo said:

They don't scan when iCloud Photos is turned off.

In theory, the keys to your personal cloud storage are stored by you.

They shouldn't scan either way unless there is sufficient reason to do so. Just like a cop doesn't get to search my car because he feels like it. And a smartphone is arguably more essential than a car in today's society.

But I'm a crazed patriot with mental health issues so don't listen to me. Not going to further the discussion here - even if it is predictably moved to CR. 

  • Hook 'Em 1

2 minutes ago, ztejas said:

Yes. It's basically federal policing being conducted by a technology company. Would you be cool if the mechanic/dealership you took your car to thoroughly searched your car as a part of working on it and reported any illegal or suspicious items to the authorities? 

Why this doesn't alarm some of you amazes me but I know what site I'm on. 

If there was a child bound in the trunk of my car, then I would expect the mechanic to notify the authorities.

I don't have illegal or suspicious items in my car, just like I don't have child porn stored on my phone.

So again, why do folks feel that the scanning of iCloud pics for pedophilia is the sacrifice of all privacy?


9 minutes ago, GW Hayduke said:

Are folks under the impression this is going to allow me to look at your personal files? Is that why some feel this is giving up all of their personal privacy?

This isn’t the pasting of personal files onto the internet for anyone to access. This won’t allow just anyone to access your personal shit.  This is an algorithm scanning data for evidence of illegal images, then Apple reviewing those images, and giving suspect images to authorities.  

Pretty much what @ztejas said. And color me suspect that it's based on an algorithmic scan. When has that sort of stuff ever gone wrong?

4 minutes ago, atomheartbevo said:

They don't scan when iCloud Photos is turned off.

In theory, the keys to your personal cloud storage are stored by you.

 

10 hours ago, Doc Reeves said:

second will see Apple scan all the images on a phone’s camera roll and if they’re similar to known sex-abuse images flag them as suspect. If enough suspect images are backed up to an iCloud account, they’ll be decrypted and inspected. If Apple thinks they’re illegal, the user will be reported to the relevant authorities.

That doesn't sound like it's exclusive to cloud storage...


13 minutes ago, GW Hayduke said:

If there was a child bound in the trunk of my car, then I would expect the mechanic to notify the authorities.

I don't have illegal or suspicious items in my car, just like I don't have child porn stored on my phone.

So again, why do folks feel that the scanning of iCloud pics for pedophilia is the sacrifice of all privacy?


The mechanic wouldn’t need a warrant to see your crime, while this practice by a mega-company seems to be skirting our civil rights. 

“Noble causes” and “safety” are always the excuses given when the government wants to limit citizens’ rights. Additionally, giving up your rights rarely ever actually helps the problem. China still has a ton of crime, and even with their facial recognition software, monitoring, etc., they have an incredible (and hidden) problem with insane serial killers. 

 

  • Hook 'Em 2

22 minutes ago, GW Hayduke said:

If there was a child bound in the trunk of my car, then I would expect the mechanic to notify the authorities.

I don't have illegal or suspicious items in my car, just like I don't have child porn stored on my phone.

So again, why do folks feel that the scanning of iCloud pics for pedophilia is the sacrifice of all privacy?

I don’t have any illegal items in my house, so I should be cool if the cops come in and look around while I’m not home, right?

  • Hook 'Em 8

Problem 1 here is that this is breaking/weakening encryption. It's possible a nefarious actor will exploit this.

Problem 2 is that the mission will expand beyond child pornography into additional subjects, so the net cast will get bigger and bigger. It's already moving to terrorism, according to the EFF; next could be drug crimes or gun control (3D-printed guns?). Another easy jump is domestic terrorism, which could encroach on political speech and lead to domestic privacy violations.

Quote

We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of “terrorist” content that companies can contribute to and access for the purpose of banning such content. The database, managed by the Global Internet Forum to Counter Terrorism (GIFCT), is troublingly without external oversight, despite calls from civil society. While it’s therefore impossible to know whether the database has overreached, we do know that platforms regularly flag critical content as “terrorism,” including documentation of violence and repression, counterspeech, art, and satire.

 

  • Hook 'Em 5

1 minute ago, NeverMarryAStripper said:

I don’t have any illegal items in my house, so I should be cool if the cops come in and look around while I’m not home, right?

That would be the sacrifice of all privacy.  We have the 4th Amendment, which protects against unreasonable search and seizure by the government.  The subject of this thread is the scanning of iCloud pics for pedophilia.  How is that the sacrifice of all privacy? That is a question I posed to MaybeACoordinator. Clearly it isn't.  There is clearly use of hyperbole and exaggeration in expressing the consequences of this topic.  I am pointing out the hyperbole through rhetorical questions.  

8 minutes ago, Doc Reeves said:


The mechanic wouldn’t need a warrant to see your crime, while this practice by a mega-company seems to be skirting our civil rights. 

“Noble causes” and “safety” are always the excuses given when the government wants to limit citizens’ rights. Additionally, giving up your rights rarely ever actually helps the problem. China still has a ton of crime, and even with their facial recognition software, monitoring, etc., they have an incredible (and hidden) problem with insane serial killers. 

 

The mechanic doesn't need a warrant, just like Apple doesn't need a warrant.  My point is that this is not the "complete loss of all privacy" as presented by some of the smooth brain folks like MaybeACoordinator. 


10 minutes ago, GW Hayduke said:

That would be the sacrifice of all privacy.  We have the 4th Amendment, which protects against unreasonable search and seizure by the government.  The subject of this thread is the scanning of iCloud pics for pedophilia.  How is that the sacrifice of all privacy? That is a question I posed to MaybeACoordinator. Clearly it isn't.  There is clearly use of hyperbole and exaggeration in expressing the consequences of this topic.  I am pointing out the hyperbole through rhetorical questions.  

The mechanic doesn't need a warrant, just like Apple doesn't need a warrant.  My point is that this is not the "complete loss of all privacy" as presented by some of the smooth brain folks like MaybeACoordinator. 

Sometimes people use hyperbole to make a point. It's subtle in its extremism, ironic almost, but a useful literary tool. You seem to be hammering at a point that no one took literally. 

  • Hook 'Em 1

If you put stuff in a self-storage unit, it's your shit, but it's not simply "your business" if you're storing fucktons of ammonium nitrate or briefcases of cocaine. LifeStorage probably reviews security footage, maybe uses dogs? If they don't and they could, they certainly would. Obviously kiddie porn pictures aren't going to cause us-east-1 to explode like fucking West, TX, but AWS, Google, Microsoft, and in this case Apple have an interest in keeping people from using their infrastructure to traffic in exploitation and crime.

It is an interesting issue, but to me less from the perspective of "they're scanning MY images for kiddie porn" and more from "we are moving towards having a choice of three landlords for the internet".

I wonder if Amazon and Google are scanning images for their photo backup services.

  • Hook 'Em 1

Does it search for known illicit images or does it look for certain features in any picture?

For example, are the pictures of my kids in the bathtub now a problem? Are people at Apple going to be viewing pictures of my naked kids because they meet certain criteria?

Edited by Enchubben

Just now, El Diablo said:

Sometimes people use hyperbole to make a point. It's subtle in its extremism, ironic almost, but a useful literary tool. You seem to be hammering at a point that no one took literally. 

Sometimes people use hyperbole, extremism, and exaggeration to make shitty points, such as folks on here equating the topic to home searches by cops, vehicle searches by cops, Chinese facial recognition monitoring, or auto mechanic searches.  I'm actually most shocked that folks have been able to freely keep their child porn stored on iCloud and have seemingly used iCloud to transmit it between parties.  Like, how the fuck have we gone this far without addressing it? 


I'm not an expert in AI/machine learning/etc., but "no".

They'll "train" the algorithm on a massive set of images with humans involved (Silicon Valley, hot dog/not hot dog). The algorithm will scan images uploaded to iCloud storage, and presumably there is some kind of review process for positive hits before action is taken. The algorithm improves with use. Common stuff like naked baby photos is likely to get trained out; I imagine the weirdness would come from photos of people who are young but not underage, or who are older but just look very young. But Apple is fairly competent and is presumably not going to roll out something that is going to be a massive headache and cause huge problems in a way that is extremely obvious from the outset.
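Going by the quoted article, the camera-roll side isn't an open-ended classifier scoring arbitrary photos: each image is matched on-device against known abuse imagery, and nothing reaches a human reviewer until roughly 10 matches accumulate, so one-off bathtub photos shouldn't reach anyone at Apple. A rough sketch of that threshold-then-review flow, with every name below hypothetical rather than Apple's actual code:

```python
# Rough sketch of the threshold-then-review flow the quoted article describes:
# each photo gets a "safety voucher" from on-device matching against known-image
# hashes, and an account only becomes reviewable once enough vouchers are
# marked suspect. Names and structure here are hypothetical.
from dataclasses import dataclass

REVIEW_THRESHOLD = 10  # "probably 10 of them", per the article

@dataclass
class SafetyVoucher:
    photo_id: str
    suspect: bool  # did on-device matching hit a known-image hash?

def photos_eligible_for_review(vouchers: list[SafetyVoucher]) -> list[str]:
    """Return photo ids a human reviewer could see, or nothing if the
    account is still below the threshold."""
    suspect_ids = [v.photo_id for v in vouchers if v.suspect]
    return suspect_ids if len(suspect_ids) >= REVIEW_THRESHOLD else []

# A couple of false positives on an innocent account stay locked away:
innocent = [SafetyVoucher(f"img{i}", suspect=(i < 2)) for i in range(50)]
print(photos_eligible_for_review(innocent))  # []
```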

