
Apple Will Now Scan Your iPhone For Sexual Abuse Photos


Doc Reeves

Recommended Posts

16 minutes ago, Celery Man said:

if you put stuff in a self storage unit, it's your shit but it's not simply "your business" if you're storing fucktons of ammonium nitrate or briefcases of cocaine. LifeStorage probably reviews security footage, maybe uses dogs? If they don't and they could, they certainly would. Obviously kiddie porn pictures aren't going to cause us-east-1 to explode like fucking West, TX, but AWS, Google, Microsoft, and in this case Apple have an interest in keeping people from using their infrastructure to traffic in exploitation and crime.

It is an interesting issue, but to me less from the perspective of "they're scanning MY images for kiddie porn" and more from "we are moving towards having a choice of three landlords for the internet".

I wonder if Amazon and Google are scanning images for their photo backup services.

Put this post at the top of the thread and lock it up

Link to comment
Share on other sites

10 minutes ago, Enchubben said:

Does it search for known illicit images or does it look for certain features in any picture?

For example, are the pictures of my kids in the bathtub now a problem? Are people at Apple going to be viewing pictures of my naked kids because they meet certain criteria?

No one is “viewing” your pictures, in my understanding, but I do think your question about pics of your kids is an interesting one. I don’t know enough about how this works to answer it, unfortunately.

Link to comment
Share on other sites

54 minutes ago, GW Hayduke said:

That would be the sacrifice of all privacy. We have the 4th Amendment, which protects against unreasonable search and seizure by the government. The subject of this thread is the scanning of iCloud pics for pedophilia. How is that the sacrifice of all privacy? That is a question I posed to MaybeACoordinator. Clearly it isn't. There is clearly use of hyperbole and exaggeration in expressing the consequences of this topic. I am pointing out the hyperbole through rhetorical questions.

The mechanic doesn't need a warrant, just like Apple doesn't need a warrant.  My point is that this is not the "complete loss of all privacy" as presented by some of the smooth brain folks like MaybeACoordinator. 

It’s not like the mechanic searching your car when you take it to him.  It’s more like the manufacturer of your car coming into your garage and searching it every night.  Of course the manufacturer says he’s only looking for one specific item, which he will report to the authorities if it’s found.  What’s the control to keep the manufacturer from looking for other things?  What’s the control to keep the manufacturer from using anything found for whatever purpose he wants to?

 

Should your bank be able to search your safety deposit box every night just to make sure you don’t have any kiddie porn in it?

Edited by NeverMarryAStripper
  • Hook 'Em 2
Link to comment
Share on other sites

4 minutes ago, NeverMarryAStripper said:

It’s not like the mechanic searching your car when you take it to him.  It’s more like the manufacturer of your car coming into your garage and searching it every night.  Of course the manufacturer says he’s only looking for one specific item, which he will report to the authorities if it’s found.  What’s the control to keep the manufacturer from looking for other things?  What’s the control to keep the manufacturer from using anything found for whatever purpose he wants to?

it is not.  it is like you are leasing a car and storing it at the lessor's garage and they are sniffing your car for drugs

 

seriously - you can turn icloud off. You can get down to best buy and buy any number of hard drives. You can use Time Machine to back up your mac locally. If you actually are storing illegal images, you don't even have to plug that device into the internet or have it connected to a data plan. If you want to store your shit in your garage without apple sniffing it, you still have that option.

Edited by Celery Man
Link to comment
Share on other sites

So if I get "hit" for having a few bathtub pics of my sister's kids she sent me, and the algorithm needs to learn those are not child porn, I'm guessing there is some sort of manual process which results in some Apple employee going through all of my pictures to ensure it is, in fact, not child pornography. So, even though I've done absolutely nothing wrong, a stranger gets the opportunity to go through hundreds or thousands of my private photographs? What if I have completely legal pictures of my wife naked? I should be fine with letting them snoop through my personal shit?

Seems like that "review" job might be a ripe place to start looking for actual perverts and pedos.

What about documents? Will they have to go through all my documents to determine I haven't copied pictures into a word file so they aren't jpgs? I often have some sensitive, confidential work info that I have accessed on my phone. Fair game to look through that?

Link to comment
Share on other sites

2 hours ago, GW Hayduke said:

If there was a child bound in the trunk of my car, then I would expect the mechanic to notify the authorities.

I don't have illegal or suspicious items in my car, just like I don't have child porn stored on my phone.

So again, why do folks feel like the scanning of iCloud pics for pedophilia is the sacrifice of all privacy?

Yes, dumbass, I used hyperbole to emphasize the following point: is such a dangerous cession of privacy justified by the threat to society posed by the existence of child porn?

How many unknown devotees of child porn are out there right now? It would seem to me not that many -- I imagine most if not all over 35 years of age have been busted already, because those of that age group have been too ignorant of how the Internet works and of the various legal online snares designed to catch them. At least in Texas, once you are caught with kiddy porn, you are on the sex offender registry, and are thereby under supervision or perhaps even banned from the Internet. I am sure there are dodges around this, and I suspect there always will be, but short of putting these people behind bars for life on their first offense, I don't know if we should hand over that much access to Apple for a threat like this. It's too ripe for mission creep and abuse by hackers.

Like, say your ex got her little hacker nephew to fill your cloud with child porn. Or maybe some just ends up in there by mistake...Through a glitch, some of my photos are geotagged exactly 180 degrees around the world from where they were taken, because the cloud simply fucked up and put pics taken in Colorado in Tibet or Uttar Pradesh or some shit.  

Try getting your life back if you get hauled in on this charge. No matter how abundantly plain you make your innocence, some of that stank will attach for the rest of your life. 

Edited by MaybeACoordinator
Link to comment
Share on other sites

13 minutes ago, MaybeACoordinator said:
2 hours ago, GW Hayduke said:

 

Yes, dumbass, I used hyperbole to emphasize the following point: is such a dangerous cession of privacy justified by the threat to society posed by the existence of child porn?

It is only a “dangerous cession of privacy” when using hyperbole, extremism, exaggeration and slippery-slope arguments. 
 

16 minutes ago, MaybeACoordinator said:

 

Like, say your ex got her little hacker nephew to fill your cloud with child porn. Or maybe some just ends up in there by mistake..

I get it. Tell us again that you aren’t a pedophile.  You aren’t a pedophile, but you are concerned that there may be some mistakenly on your iCloud account?

Link to comment
Share on other sites

1 hour ago, Enchubben said:

Does it search for known illicit images or does it look for certain features in any picture?

For example, are the pictures of my kids in the bathtub now a problem? Are people at Apple going to be viewing pictures of my naked kids because they meet certain criteria?

Yeah this is the key point. 

Apple is scanning for the digital signatures of known child porn; it is looking for a match on those. Only if you have over some set number (they said around 30) of KNOWN matches in your iCloud does the alert even happen. 

Apple is not looking at your pictures or trying to decide what they look like. 
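In code terms, the idea is just a membership test against a supplied list of fingerprints plus a counter. Here's a rough Python sketch, with the hash function and names as illustrative stand-ins (the real matching reportedly uses a perceptual hash, so re-saved copies of a known image still count):

```python
import hashlib
from pathlib import Path

# A supplied set of fingerprints of already-known abuse images (empty here).
KNOWN_FINGERPRINTS: set = set()
MATCH_THRESHOLD = 30  # roughly the figure cited above

def fingerprint(path: Path) -> str:
    # Illustrative exact hash of the file bytes; a perceptual hash would also
    # match lightly edited or re-encoded copies of a known image.
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_in_library(photo_paths) -> int:
    return sum(1 for p in photo_paths if fingerprint(p) in KNOWN_FINGERPRINTS)

def should_alert(photo_paths) -> bool:
    # Nothing is "looked at"; the only signal is whether the match count clears the bar.
    return matches_in_library(photo_paths) >= MATCH_THRESHOLD
```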

 

If you want to understand it here is a good interview. 

 

Edited by hornbri
  • Hook 'Em 3
  • Like 1
Link to comment
Share on other sites

9 minutes ago, Celery Man said:

ah interesting - so it's not even that they are training an algorithm to analyze images to recognize child porn (which can then be used to analyze your photos), it is scanning for known images of child pornography

Which can and will be disguised:

From the OP:

Quote

A well-known weakness of machine-learning systems such as the one Apple proposes is that it’s easy to tweak a photo so it categorises it incorrectly. Pranksters may tweak photos of cats so phones mark them as abuse, for example, while the gangs who sell real abuse images work out how to sneak them past the censor. But images are only part of the problem. Curiously, Apple proposes to do nothing about live streaming, which has been the dominant medium for online abuse since at least 2018. And the company has said nothing about how it will track where illegal images come from.
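That weakness is easier to see with a toy perceptual hash. The sketch below is the classic "average hash," not Apple's algorithm: images reduce to a short bit string, near-duplicates land a few bits apart, and deliberate edits (a crop, a border, crafted noise) can push the distance past any cutoff or pull an unrelated image under it.

```python
from PIL import Image  # assumes Pillow is installed

def average_hash(path: str, hash_size: int = 8) -> str:
    """Toy perceptual hash: shrink to 8x8 grayscale, threshold pixels against the mean."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

def hamming(a: str, b: str) -> int:
    """Bits that differ; a small distance means 'probably the same picture.'"""
    return sum(x != y for x, y in zip(a, b))

# A re-saved or resized copy of the same photo usually lands within a few bits,
# which is what lets a matcher catch near-duplicates. Adversarial edits exploit
# the same property in reverse: move enough bits to evade the cutoff, or craft
# a harmless-looking image that collides with a blocked one.
```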

And governments love these policies:

Quote

Historically, the idea of scanning customers’ devices for evidence of crime comes from China. It was introduced in 2008 when a system called Green Dam was installed on all PCs sold in the country. It was described as a porn filter, but its main purpose was to search for phrases such as “Falun Gong” and “Dalai Lama”. It also made its users’ computers vulnerable to remote takeover. Thirteen years later, tech firms in China are completely subservient to the state – including Apple, which keeps all the iCloud data of its Chinese customers in data centres run by a state-owned company.

Quote

 

But if the technical questions are difficult, the policy questions are far harder. Until now, democracies have allowed government surveillance in two sets of circumstances: first, if it is limited to a specific purpose; second, if it is targeted at specific people. Examples of special-purpose surveillance include speed cameras, and the software in photocopiers that stops you copying banknotes. Targeting specific people usually requires paperwork such as a warrant. Apple’s system looks like the first type of these – but once it is built into phones, Macs and even watches, in a way that circumvents their security and privacy mechanisms, it could scan for whatever else – or whoever else – a government demands.

These concerns are not abstract. Nor are they limited only to countries considered authoritarian. In Australia, the government threatened to prosecute a journalist over photos of Australian troops killing civilians in Afghanistan, arguing that the war-crime images were covered by national security laws. In addition, Australian law empowers ministers to compel firms to retrain an existing surveillance system on different images, vastly expanding the scope of Apple’s proposed snooping.

Closer to home, the European Union just updated the law allowing tech firms to scan communications for illegal images and announced that a new child-protection initiative will extend to “grooming”, requiring firms to scan text, too. In Britain, the Investigatory Powers Act will also enable ministers to order a firm to adapt its systems where possible to assist in interception. Your iPhone may be quietly looking for missing children, but it may also be searching for the police’s “most wanted”.

 

 

  • Hook 'Em 1
Link to comment
Share on other sites

10 minutes ago, MaybeACoordinator said:

Which can and will be disguised:

From the OP:

And governments love these policies:

 

But that's not at all what is happening; it's not analyzing images at all. 

Step 1) Apple is installing a list of 1's and 0's on your phone that match the "fingerprint" of known CSAM (Child Sexual Abuse Material). This list is included within iOS.

Step 2) When you upload a photo to iCloud, your iPhone (on device) checks whether the 1's and 0's of the pictures you are about to upload match the set of known porn 1's and 0's that are stored on your phone. If they do, when your phone sends the picture to the cloud it also sends a "flag".

Step 3) If you have more than 30 flags in your iCloud account, then some sort of notification can occur. Apple could adjust this number to 1, I suppose, for argument's sake, but it is still an upload of a picture of known CSAM. Apple doesn't even know which 30 images you have, just that you have more than 30.

In short, Apple still sees NOTHING stored on your phone. 
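A minimal sketch of that device/server split, purely to make the flow concrete. The names and the plain boolean flag here are simplifications; by most accounts the real system uses a perceptual hash (reportedly called NeuralHash) and wraps the result in an encrypted "safety voucher" rather than a readable flag:

```python
# Device side: the match decision is made locally; only an opaque flag travels
# with the upload, and the photo content is never interpreted.
def package_for_icloud(photo_bytes: bytes, matches_known_list: bool) -> dict:
    return {"payload": photo_bytes, "voucher_flagged": matches_known_list}

# Server side: Apple's end never inspects payloads, it only counts flags per account.
def review_needed(account_uploads: list, threshold: int = 30) -> bool:
    return sum(1 for u in account_uploads if u["voucher_flagged"]) > threshold
```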

Edited by hornbri
Link to comment
Share on other sites

1 hour ago, GW Hayduke said:

 

The mechanic doesn't need a warrant, just like Apple doesn't need a warrant.  My point is that this is not the "complete loss of all privacy" as presented by some of the smooth brain folks like MaybeACoordinator. 


Yes, Apple is a company. They are now claiming that because you bought a device from them they have the right to search your info to make sure you're not doing anything naughty. What does that mean? Well, that naughty people will just use Samsung or burner phones like they already do, and the vast majority of us will have to submit to monitoring by a private company, which is actually worse in ways than if it were by the gov.

It’s a question of balance. Sure, we could solve a lot of murders if we were all required to submit our DNA from birth to the authorities, but the cost would be so great to all of us it would outweigh the good we are trying to achieve.

  • Hook 'Em 1
  • Like 1
Link to comment
Share on other sites

1 hour ago, Celery Man said:

I'm not the expert in AI/machine learning/etc but "no".

They'll "train" the algorithm on a massive set of images with humans involved (Silicon Valley, hot dog no hot dog). The algorithm will scan images uploaded to istorage and presumably there is some kind of review process for positive hits before action is taken. The algorithm improves with use. Common stuff like naked baby photos are likely to get trained out, I imagine the weirdness would come from photos of young but not underage or if someone is older but just looks very young. But Apple is fairly competent and is presumably not going to roll something out that is going to be a massive headache and cause huge problems in a way that is extremely obvious from the outset.

When the AI inevitably pulls false positives for review, who reviews them? Who looks at the 13 yr old who took a nude selfie on her phone?  Or someone’s naked wife?  Who’s on this Apple pedo team?  
 

The fappening wasn’t that long ago. This is a terrible idea. Just glad they have such a good record of keeping their user’s data secure. 

Edited by ChickenSandwich
  • Hook 'Em 1
Link to comment
Share on other sites

1 minute ago, ChickenSandwich said:

When the AI inevitably pulls false positives for review, who reviews them? Who looks at the 13 yr old who took a nude selfie on her phone?  Or someone’s naked wife?  Who’s on this Apple pedo team?  
 

The fappening wasn’t that long ago. This is a terrible idea. Just glad they have such a good record of keeping their user’s data secure. 

I was wrong, this isn't what they are doing at all. There is no review - it should be basically impossible for them to pull false positives, and it sounds like law enforcement would be the people to follow up on people who have >30 known child pornographic images.

the fappening happened from shit like people who have wikipedia pages (and people who have nude photos of those people) using "what elementary school did you go to" as their security question

  • Hook 'Em 1
Link to comment
Share on other sites

1 minute ago, hornbri said:

neither - your phone is going to scan itself for known porn. Not using machine learning or AI

if his explanation is accurate, your device does one part of the scan while uploading images to iCloud and the iCloud server does the other half.

You are still able to not upload your pictures onto Apple servers and circumvent this whole process.

Link to comment
Share on other sites

3 hours ago, ztejas said:

They shouldn't scan either way unless there is sufficient reason to do so. Just like a cop doesn't get to search my car because he feels like it. And a smartphone is arguably more essential than a car in today's society.

But I'm a crazed patriot with mental health issues so don't listen to me. Not going to further the discussion here - even if it is predictably moved to CR. 

Oh I think it's absolute bullshit they are doing this, but I am not surprised at all.  The feds were getting very close to forcing them to put backdoors in.  Apple is throwing them a bone to get them to back down, trying to do it on their own, rather than giving the feds access.

None of my friends who work at Apple are happy at all about this, and this is not a project that the higher-ups wanted.

But we've been watching the fight with the feds play out, and something was going to give.

  • Hook 'Em 1
Link to comment
Share on other sites

3 hours ago, BigHorn'13 said:

That doesn't sound like it's exclusive to cloud storage...

I'm just going off of their statements: basically, if it's not going to touch iCloud, they won't mess with it.

And for those worried about cock pics, the algorithms are only looking at pics that have been reported/verified with the national whatever missing child center as child abuse pics. It's not going to send your sex videos anywhere.

But it's still really shitty.  I get why Apple is doing it, but I think it's bullshit, and Apple needs to keep publicly standing up to the government, rather than tossing them this bone.

Link to comment
Share on other sites

So Apple is either going to load a known list of child porn fingerprints onto your phone, taking up space that you could use, or it will be sending data about your photos someplace for comparison. Either way, your phone's CPU cycles are going to be used to do this.

So, all of you saying it's no big deal: I'm sure you're OK with your phone dropping a call with your boss or an important customer in the name of making sure your images are not child porn?
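For a rough sense of scale (every number here is a guess, nothing Apple has published), the on-device list itself is probably small, and the per-photo cost is one fingerprint computation plus a set lookup at upload time:

```python
# Back-of-envelope only; list size and hash width are hypothetical.
num_known_hashes = 1_000_000   # assumed size of the fingerprint list
bytes_per_hash = 32            # a 256-bit fingerprint
print(f"On-device list: ~{num_known_hashes * bytes_per_hash / 1e6:.0f} MB")  # ~32 MB

# Matching a photo is one fingerprint computation (a single pass over the image
# at upload time) plus a constant-time hash-set lookup, so the work scales with
# the photos uploaded, not with the size of the known list.
```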

  • Hook 'Em 4
Link to comment
Share on other sites

5 minutes ago, blacklab said:

So Apple is either going to load a known list of child porn fingerprints onto your phone, taking up space that you could use, or it will be sending data about your photos someplace for comparison. Either way, your phone's CPU cycles are going to be used to do this.

So, all of you saying it's no big deal: I'm sure you're OK with your phone dropping a call with your boss or an important customer in the name of making sure your images are not child porn?

I don't think anyone is saying it is not a big deal. We are clarifying what is actually happening. 

It could be abused, for sure, but the title of the thread is "Apple will now scan your iPhone..." and that is not accurate. 

Link to comment
Share on other sites

18 minutes ago, hornbri said:

I don't think anyone is saying it is not a big deal. We are clarifying what is actually happening. 

It could be abused, for sure, but the title of the thread is "Apple will now scan your iPhone..." and that is not accurate. 

From the article:
The second will see Apple scan all the images on a phone’s camera roll and if they’re similar to known sex-abuse images flag them as suspect.

 

  • Hook 'Em 1
  • Like 1
Link to comment
Share on other sites

11 minutes ago, blacklab said:

From the article:
The second will see Apple scan all the images on a phone’s camera roll and if they’re similar to known sex-abuse images flag them as suspect.

 

It may be a technical point, but in this case the phone is scanning itself. That interview is Apple trying to clear up two different things. 
 

I think the issue of privacy is super important and everyone should be informed. But it is an issue that requires us to understand exactly how the technology is working. In this case Apple has explained how it is working. If that is a lie or untruth, I expect the internet community will quickly expose it as such. 

Link to comment
Share on other sites

For all the shit that Microsoft got into in the '90s for incorporating IE so tightly with Windows (or whatever the technicalities around the browser monopoly stuff was, I was a kid) I wonder about Apple and iCloud. But also I find it easy enough to turn photo storage off for iCloud but very irritating to get iCloud to shut off completely and never bother me again.

I'm not contesting the point about whether the phone is spending a bunch of cycles scanning itself to check for kiddy porn in your pictures, but the stability of the device is precisely why I use iPhone versus Android. Though it's been since the... HTC One era that I've used an Android device as a primary phone.

  • Like 1
Link to comment
Share on other sites

I'm more curious about the sexual abuse angle. Who defines what abuse is? A girl I'm seeing is into S&M and likes to be restrained, spanked, and many other things; we take pictures and video. Will those be flagged as abuse, and will someone come and investigate? I know a few OnlyFans girls who use their iPhones for work, and I'm sure several thousand more do as well. What about them?

 

I agree with whoever said this is just the beginning: child porn, then political opinions, guns, drugs. Our government is constantly trying to move the goalposts. This is bad precedent.

  • Like 1
Link to comment
Share on other sites

40 minutes ago, Smax said:

A girl I'm seeing is into S&M and likes to be restrained, spanked, and many other things; we take pictures and video. Will those be flagged as abuse, and will someone come and investigate?

The best strategy is to get ahead of the accusations. Make everything public to prove you have nothing to hide. I would go ahead and post the pics and videos here. Also, screenshot any texts where this young lady describes how she wants to be punished. 

  • Haha 1
Link to comment
Share on other sites

51 minutes ago, HornOnTheBayou said:

The best strategy is to get ahead of the accusations. Make everything public to prove you have nothing to hide. I would go ahead and post the pics and videos here. Also, screenshot any texts where this young lady describes how she wants to be punished. 

 

You've obviously not been reading the right threads..

Link to comment
Share on other sites

9 hours ago, NeverMarryAStripper said:

It’s not like the mechanic searching your car when you take it to him.  It’s more like the manufacturer if your car coming into your garage and searching it every night.  Of course the manufacturer says he’s only looking for one specific item which he will report to the authorities if it’s found.  What’s the control to keep the manufacturer from looking for other things?  Whats the control to keep the manufacturer from using anything found for whatever purpose he wants to?

 

Should your bank be able to search your safety deposit box every night just to make sure you don’t have any kiddie porn in it?

You'd be amazed at what banks are already doing.

  • Like 1
Link to comment
Share on other sites

5 hours ago, NeverMarryAStripper said:

How about auto manufacturers start sorting through your car’s GPS data and notifying the local authorities every time you exceed the speed limit so they can send you a ticket.

Already being done by some car rental companies from accounts I've read.

Edited by Onboard 2.0
Link to comment
Share on other sites

5 hours ago, NeverMarryAStripper said:

How about auto manufacturers start sorting through your car’s GPS data and notifying the local authorities every time you exceed the speed limit so they can send you a ticket.

Or how about all cars having a federally mandated breathalyzer   
 

The U.S. might see new drunk and impaired driving prevention technologies imposed if the $78 billion surface transportation bill becomes law.

 

 

 

Link to comment
Share on other sites

On 8/14/2021 at 10:37 PM, Doc Reeves said:

Scanning photos is tricky to do at scale. First, if a program blocks only exact copies of a known illegal image, people can just edit it slightly. Less skilled people might go out and make fresh images, which in the case of sexual abuse imagery, means fresh crimes. So a censor wants software that flags up images similar to those on the block list. But there are false alarms, and a small system of the kind that will run in a phone might have an error rate as high as 5%. Applying that error rate to the 10bn iPhone photos taken every day, a 5% false alarm rate could mean 500m images sent for secondary screening.

In order to prevent this, Apple will only act if the primary screening on a phone detects a certain threshold of suspect images, probably 10 of them. Each photo added to a camera roll will be inspected and, when it’s backed up to iCloud, it will be accompanied by an encrypted “safety voucher” saying whether it’s suspect or not. The cryptography is designed so that once 10 or more vouchers are marked as unsafe, Apple can decrypt the images. If they look illegal, the user will be reported, and their account will be locked.
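The arithmetic in that passage is easy to sanity-check, and it also shows why the threshold alone can't rescue a sloppy matcher: the per-photo error rate has to sit far below 5% for a threshold of 10 to mean anything. A quick sketch (the 2,000-photo library is an assumed example, not a figure from the article):

```python
from math import comb

# Figures from the quoted passage: ~10bn iPhone photos a day, a 5% false alarm rate.
photos_per_day = 10_000_000_000
false_alarm_rate = 0.05
print(f"{photos_per_day * false_alarm_rate:,.0f} photos/day sent for secondary screening")

# Chance that an innocent library of n photos racks up at least `threshold` false flags.
def p_at_least(n: int, p: float, threshold: int) -> float:
    below = sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(threshold))
    return max(0.0, 1.0 - below)  # clamp tiny negative float error

print(p_at_least(2_000, 0.05, 10))   # ~1.0: a 5% matcher would flag nearly everyone
print(p_at_least(2_000, 1e-6, 10))   # effectively zero when matching is near-exact
```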

Are we positive "nobody will be looking at our photos"?

If we are going to have to deal with this, I would rather a human be in there making a judgment call, and the bolded section from the OP suggests that would be the case. I would pity that person for having to look at that stuff day after day, week after week, year after year, but the alternative -- just having algorithms serve as cops and potentially ruining the lives of innocents -- is worse. 

Note this does nothing to stop the creation of new child porn, and stopping that really should be the goal of any such anti-child-porn action. Because it's only the old, already-known images that raise red flags with this scheme, it would create another incentive to film children being exploited.

Which would be exactly the sort of "great job, well done" outcome this idiocy deserves. 

I am certain that such a combing over would seem great at first, as it would no doubt ensnare quite a few olds among the kiddy porn fan set, but after that, it wouldn't do shit, and we'd be stuck with our files being scanned forever and ever, possibly as mission creep enters the picture to justify the diminishing returns on the porn angle. 

 

Edited by MaybeACoordinator
  • Hook 'Em 2
Link to comment
Share on other sites

13 hours ago, MaybeACoordinator said:

Try getting your life back if you get hauled in on this charge. No matter how abundantly plain you make your innocence, some of that stank will attach for the rest of your life. 

I used to work in forensics and did work for a number of police departments and DHS. I did one where it was clearly obvious that the person being charged had unknowingly downloaded files to his PC. Despite my explanation the DA proceeded. That really messed me up and I never worked for the state again. Did some defense work after that and just walked away from the speciality after awhile due to the mind fuckery.

 

  • Hook 'Em 3
  • Like 1
Link to comment
Share on other sites

3 minutes ago, F250 said:

I used to work in forensics and did work for a number of police departments and DHS. I did one where it was clearly obvious that the person being charged had unknowingly downloaded files to his PC. Despite my explanation the DA proceeded. That really messed me up and I never worked for the state again. Did some defense work after that and just walked away from the speciality after awhile due to the mind fuckery.

 

I am a journalist and I've had colleagues like that -- they will bend and warp the truth to suit some story they've written in their head before they start collecting facts, especially if it's some issue they feel passionate about. One of these guys was a reporter I used to respect and whose stories I envied, but then I kept getting complaints from the subjects of his stories about his distortions... He was very, very good at it, and never crossed the line to where he was proven wrong, exactly, but his stories wound up being lies. And I think he knew it -- he lost his job a few years ago (through no fault of his own) and last I heard he was a maitre d' at the Hustler Club in New Orleans. I know some of us on this site would claim that as a dream job, but this guy has become a train wreck while working there, as I think most of us would. It's a very dark world and one he is not cut out for. 

Link to comment
Share on other sites
