
Zuckerberg Inc downloaded and seeded porn FOR YEARS to help feed AI - lawsuit



Posted

Strike 3 is that company that files lawsuits against anyone whose IP was involved in porn torrents. Remember when grandmas were getting sued because of their grandkids downloading MP3s? This is like that, except it's porn.

  • Hook 'Em 2
Posted (edited)
10 minutes ago, huge said:

Can someone explain?

Meta needed a shitton of data, ANY kind of data, to feed into their LLMs. For LLMs that generate video, that data tends to be video data.

As part of collecting training datasets, Meta used torrenting to download and reseed a BUNCH of copyrighted works, as well as some porn, and even CSAM. Which means they were distributing child porn via torrent seeding, and propagating that child abuse, for the sake of building a more capable video generation large language model.

 

Edit: allegedly 

Edited by Captainant
  • Hook 'Em 3
  • Rage+1 3
Posted (edited)
16 minutes ago, Parliament said:

So they used porn to teach their AI?  Or did they post porn on FB?

When seeding a lot of material on BitTorrent, users are apparently rewarded by being able to download more, and faster. So the porn is alleged to have been a means to a HUGE LOAD of pirated content.

Quote

The porn site operator explained to the court that BitTorrent's protocol establishes a "tit-for-tat" mechanism that "rewards users who distribute the most desired content." It alleged that Meta took advantage of this system by "often" pirating adult videos that are "often within the most infringed files on BitTorrent websites" on "the very same day the motion pictures are released."

These tactics allegedly gave Meta several advantages, making it harder for Strike 3 Holdings' sites to compete, including potentially distributing the videos to minors for free without age checks in states that now require them.

"Meta specifically targeted Plaintiffs’ content for distribution in order to accelerate its downloads of vast amounts of other content," the lawsuit said. And while Meta claimed that it "wrote a script to intentionally limit distributing popular books on BitTorrent," Strike 3 Holdings believes "discovery will likely show" Meta "continuously" distributed its adult videos specifically as a strategy to get around the BitTorrent protocol.

So far, Strike 3 Holdings says it has documented at least five episodes in which Meta "hand-picked" adult videos from a specific site for "intense periods of distribution" to avoid seeding other content it was sourcing through BitTorrent.
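
For anyone curious how the "tit-for-tat" part works mechanically, here's a minimal sketch assuming a simplified model of BitTorrent's choking algorithm (real clients add optimistic unchoking, rolling time windows, etc., and the peer names and byte counts below are made up for illustration). Each peer gives its upload slots to whichever neighbors have recently uploaded the most to it, so a client seeding in-demand content gets reciprocated with more upload slots across the swarm, which translates into faster downloads of whatever else it wants.

```python
# Toy model of BitTorrent's tit-for-tat choking: a peer gives its upload
# slots to the neighbors that have recently uploaded the most to it.
# Peer names and byte counts are hypothetical, for illustration only.

from typing import Dict, List


def pick_unchoked_peers(recent_upload_to_us: Dict[str, int], slots: int = 4) -> List[str]:
    """Return the peers we will upload to next round: the top recent contributors."""
    ranked = sorted(recent_upload_to_us.items(), key=lambda kv: kv[1], reverse=True)
    return [peer for peer, _ in ranked[:slots]]


# A client that seeds heavily (e.g. in-demand new releases) shows up as a top
# contributor in many other peers' rankings, so more of them unchoke it in
# return, and its own downloads of unrelated, low-demand content speed up.
if __name__ == "__main__":
    neighbors = {
        "heavy_seeder": 50_000_000,  # uploaded ~50 MB to us recently
        "casual_peer": 2_000_000,
        "leecher_a": 0,
        "leecher_b": 100_000,
        "average_peer": 5_000_000,
    }
    print(pick_unchoked_peers(neighbors))
    # ['heavy_seeder', 'average_peer', 'casual_peer', 'leecher_b']
```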

 

 

Edited by Chopper
wording
  • Hook 'Em 1
Posted (edited)

Wait . . . regular porn AND kiddie porn (as the thread title says)?  Aside from whatever IP might have been violated, kiddie porn should take that to a new level.  I'm not seeing that stated in the article.

Edited by jimmyjazz
  • Hook 'Em 1
  • Rage+1 1
Posted
34 minutes ago, huge said:

Can someone explain?

AI is a grift.  Those who perpetuate this grift will do anything, including putting children in harm's way, to make more money.  Zuckerberg is a soulless cunt, and like many others, has too much money and zero regulation.   That's about it. 

  • Hook 'Em 4
  • Like 1
  • Rage+1 1
Posted
29 minutes ago, Captainant said:

Pron is/was part of their training datasets for their large language models, aka "AI" models

which isn't unique to Meta, correct? OpenAI, I believe, is documented to follow a similar training protocol.

Posted
6 minutes ago, jimmyjazz said:

Wait . . . regular porn AND kiddie porn (as the thread title says)?  Aside from whatever IP might have been violated, kiddie porn should take that to a new level.  I'm not seeing that stated in the article.

There was a big hullabaloo last year about companies taking measures to remove CSAM from their training datasets:

https://www.theverge.com/2024/4/23/24138356/ai-companies-csam-thorn-training-data

Just now, pacman said:

which isn't unique to Meta, correct? OpenAI, I believe, is documented to follow a similar training protocol.

Correct, that's pretty standard operating procedure for these foundational model maintainers.

  • Hook 'Em 2
Posted
25 minutes ago, Captainant said:

Meta needed a shitton of data, ANY kind of data, to feed into their LLMs. For LLMs that generate video, that data tends to be video data.

As part of collecting training datasets, Meta used torrenting to download and reseed a BUNCH of copyrighted works, as well as some porn, and even CSAM. Which means they were distributing child porn via torrent seeding, and propagating that child abuse, for the sake of building a more capable video generation large language model.

 

Edit: allegedly 

I presume that their torrenting/seeding efforts extended far beyond porn and included movies, TV shows, music videos, etc. If that's the case, we can be sure Meta will be held accountable, correct?

Quote

Statutory Damages:

Copyright owners can sue for statutory damages, which can range from $750 to $30,000 per infringed work. 

Increased Damages for Willful Infringement:

If the infringement is deemed willful, meaning done intentionally or with reckless disregard for the law, the damages can be increased up to $150,000 per work. 
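
As a rough, purely illustrative calculation of how those per-work figures stack up (the work count below is invented and is not a number from the lawsuit):

```python
# Hypothetical statutory-damages math under the ranges quoted above.
# The number of works is made up for illustration only.
STATUTORY_MIN = 750        # per infringed work
STATUTORY_MAX = 30_000     # per infringed work
WILLFUL_MAX = 150_000      # per work if the infringement is found willful

works = 1_000              # hypothetical count of registered works at issue

print(f"Ordinary range: ${works * STATUTORY_MIN:,} to ${works * STATUTORY_MAX:,}")
print(f"Willful ceiling: ${works * WILLFUL_MAX:,}")
# Ordinary range: $750,000 to $30,000,000
# Willful ceiling: $150,000,000
```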

Or maybe the government will do Mark a solid, and just ask nicely that they stop doing illegal shit.

  • Hook 'Em 1
  • Chopper changed the title to Zuckerberg Inc downloaded and seeded porn FOR YEARS to help feed AI - lawsuit
Posted
10 minutes ago, jimmyjazz said:

Wait . . . regular porn AND kiddie porn (as the thread title says)?  Aside from whatever IP might have been violated, kiddie porn should take that to a new level.  I'm not seeing that stated in the article.

Thanks. I fixed the title. I mixed up the subhead (which mentions seeding "to minors" due to no age checks) with previous history.

Posted

Is there any proof that Facebook used the porn to train the AI models? My interpretation is that the porn site only knows the IPs. 

It seems just as likely that employees were torrenting the porn for personal use. 

Posted
3 minutes ago, zman13 said:

Is there any proof that Facebook used the porn to train the AI models? My interpretation is that the porn site only knows the IPs. 

It seems just as likely that employees were torrenting the porn for personal use. 

Are You Stupid GIF

  • Haha 2
Posted
19 minutes ago, zman13 said:

Is there any proof that Facebook used the porn to train the AI models? My interpretation is that the porn site only knows the IPs. 

It seems just as likely that employees were torrenting the porn for personal use. 

According to the article, the lawsuit addresses this. Meta's IP addresses were downloading a huge amount of data (ebooks, music, software, etc.) that doesn't match the use pattern of an individual.

Quote

 

Meta also allegedly attempted to "conceal its BitTorrent activities" through "six Virtual Private Clouds" that formed a "stealth network" of "hidden IP addresses," the lawsuit alleged, which seemingly implicated a "major third-party data center provider" as a partner in Meta's piracy.

An analysis of these IP addresses allegedly found "data patterns that matched infringement patterns seen on Meta’s corporate IP Addresses" and included "evidence of other activity on the BitTorrent network including ebooks, movies, television shows, music, and software." The seemingly non-human patterns documented on both sets of IP addresses suggest the data was for AI training and not for personal use, Strike 3 Holdings alleged.

 

 

Posted
2 hours ago, huge said:

Can someone explain?

As far as my wife knows, all of the porn I’ve downloaded over the years was not for my personal consumption, but to train my AI.  The article is proof of that.

/surly

  • Haha 2
  • Drool 2
Posted
1 hour ago, Blotto said:

I presume that their torrenting/seeding efforts extended far beyond porn and included movies, TV shows, music videos, etc. If that's the case, we can be sure Meta will be held accountable, correct?

Or maybe the government will do Mark a solid, and just ask nicely that they stop doing illegal shit.

The problem with statutory damages is that the copyright must be registered prior to the commencement of infringement, which is often not the case.

Meta seems to have been exceptionally brazen in their infringement for training purposes.

Posted
1 hour ago, zman13 said:

Is there any proof that Facebook used the porn to train the AI models? My interpretation is that the porn site only knows the IPs. 

It seems just as likely that employees were torrenting the porn for personal use. 

 

 

 

  • Hook 'Em 2
Posted
15 minutes ago, Vegas64 said:

So it sounds like by casting a wide net, they caught some really bad stuff. 

Sounds like more than that. There's greater demand for torrenting high-quality porn, esp. new porn, which they (allegedly) downloaded and continually seeded in order to get preferential treatment in the swarm, letting them download other material with far weaker demand that they also wanted. I haven't torrented for years, but the more you "share," the more you're able to download, and at faster speeds. They wanted the porn for their AI, allegedly, but also to increase what else they could download from the web.

  • Hook 'Em 1
Posted
1 minute ago, Chopper said:

Sounds like more than that. There's greater demand for torrenting high-quality porn, esp. new porn, which they (allegedly) downloaded and continually seeded in order to get preferential treatment in the swarm, letting them download other material with far weaker demand that they also wanted. I haven't torrented for years, but the more you "share," the more you're able to download, and at faster speeds. They wanted the porn for their AI, allegedly, but also to increase what else they could download from the web.

Oh wow. That is gross and egregious behavior.

Posted
2 hours ago, huge said:

Can someone explain?

Let me explain . . . No, there is too much. Let me sum up. Buttercup is marry Humperdinck in little less than half an hour. So all we have to do is get in, break up the wedding, steal the princess, make our escape . . . after I kill Count Rugen.

  • Hook 'Em 2
  • Like 5
Posted
14 minutes ago, Chopper said:

Sounds like more than that. There's greater demand for torrenting high-quality porn, esp. new porn, which they (allegedly) downloaded and continually seeded in order to get preferential treatment in the swarm, letting them download other material with far weaker demand that they also wanted. I haven't torrented for years, but the more you "share," the more you're able to download, and at faster speeds. They wanted the porn for their AI, allegedly, but also to increase what else they could download from the web.

That's an element of this. On top of that, many of the core datasets collected back in the early 2020s had really, REALLY shit governance controls and were absurdly wide, to the point of containing CSAM. Organizations like Thorn have made tooling to make it easier to maintain your training datasets and remove odious content, but Meta DID still download and reseed CSAM. A regular joe would go to FPMITA prison for that.
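
On the governance-tooling point, one common approach is hash-based filtering: drop anything from a candidate dataset whose hash matches a blocklist of known abusive material before it ever reaches training. A minimal sketch of that idea follows; it assumes a generic blocklist of file hashes and is not Thorn's actual API or any specific vendor's pipeline (real systems typically use perceptual hashes supplied by clearinghouses rather than plain SHA-256).

```python
# Simplified sketch of hash-based dataset filtering: keep only files whose
# hash does NOT appear in a blocklist of known-bad content. The blocklist
# source and the use of SHA-256 here are illustrative assumptions.
import hashlib
from pathlib import Path
from typing import Iterable, List, Set


def sha256_of(path: Path) -> str:
    """Stream the file in 1 MB chunks and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def filter_dataset(files: Iterable[Path], blocklist: Set[str]) -> List[Path]:
    """Return only the files whose hash is not on the blocklist."""
    return [p for p in files if sha256_of(p) not in blocklist]
```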

  • Hook 'Em 1
Posted
4 minutes ago, Parliament said:

Sry I'm still hung up on how AI can be made better by letting it watch a buncha porn.

Remove the pornographic context from it and it's still video shot with real humans moving in real human motions, shot in photorealistic lighting. It's still valuable data for what a "real" video should look like. And when your directive is simply to "get more data," it doesn't really matter where it comes from, just that you're meeting the objective.

Generative AI (and literally all machine learning) is just reproducing or identifying patterns already seen in training data, or ID'ing aberrations from the training data set it's already seen. But once it's produced, it's basically impossible to know what data was used in which model, or to create which portion of an image or video or text. That virtually impossible unknowability is what Meta and OpenAI and Amazon are banking on to shield them from copyright claims or other allegations of horrible things like (inadvertently) training on CSAM.

  • Hook 'Em 2
Posted

There's one legitimate use in training the AI to recognize CSAM for purposes of moderating the platform without subjecting human moderators to CSAM. But if I were training an AI for that purpose, I'd want special dispensation from the government, buy-in from victims' rights groups, and to be training it on stuff from evidence lockers so as to not generate any apparent demand.

  • Hook 'Em 1
  • Like 1
  • Haha 1
Posted
1 hour ago, Captainant said:

Remove the pornographic context from it and it's still video shot with real humans moving in real human motions, shot in photorealistic lighting. It's still valuable data for what a "real" video should look like.

The Nanny 90S GIF
 

kathryn hahn humping GIF

Posted
1 hour ago, Parliament said:

Sry I'm still hung up on how AI can be made better by letting it watch a buncha porn.

Alexa, my stepmother is stuck in the dishwasher. How do I get her out?

Well, before you get her out ...

  • Drool 1
Posted (edited)
4 hours ago, Chopper said:

According to the article, the lawsuit addresses this. Meta's IP addresses were downloading a huge amount of data (ebooks, music, software, etc.) that doesn't match the use pattern of an individual.

 

I guess I'm still skeptical that we can assume it was a Facebook-sanctioned download of data for AI vs. a bunch of Facebook employees. They had 30,000 to 50,000 employees in the timeframe of the downloads. Even when employees know they are being monitored, people still do dumb things.

This feels like the legal equivalent of "Trust me bro"

Edited by zman13
Posted
16 minutes ago, zman13 said:

I guess I'm still skeptical that we can assume it was a Facebook-sanctioned download of data for AI vs. a bunch of Facebook employees. They had 30,000 to 50,000 employees in the timeframe of the downloads. Even when employees know they are being monitored, people still do dumb things.

This feels like the legal equivalent of "Trust me bro"

It had already been reported that when Meta went fishing for training material, it used torrents and bootleg websites to download pirated/infringing copies in a knowing fashion.

That has not been reported in connection with any of the other AI training infringement cases.

Posted
5 hours ago, Celery Man said:

giphy-downsized.gif

My second job - about a week on the job.  Town hall meeting where the boss says "We got too many people looking at porn here!" 

It's a good thing it was my second job and I was a little older - maybe 25.  Just a year or so earlier, I'm sure my hand would have shot up and asked "If there's 'too many', then there is an acceptable amount?  What is that number, sir?"

  • Haha 1
