Bottom Right sees freedom as not being told by other rulers what to do. Your own rulers, obviously, get to tell you what to do.
PirateJesus
Philip answered him, 2 books is not sufficient for them. And Jesus took the books; and when he had given thanks, he distributed to the disciples, and the disciples to them that were set down. Therefore they gathered them together, and filled twelve baskets with the new copies, which remained over.
- 38 Posts
- 72 Comments
A fundamental misunderstanding of capitalism.
It’s actually just hungry hungry hippos. And hippos can eat smaller hippos.
PirateJesus@lemmy.today to THE POLICE PROBLEM@lemmy.world • The police need this for...? (English)
4 · 2 years ago
Bulldozing the whole school to make sure they get the shooter.
Stuck on decrappified Windows for the immediate future.
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English)
1 · 2 years ago
Everybody is American. They just don’t know it yet.
Gospel of the Jesus
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English)
1 · 2 years ago
It seems to me to be a lesser charge: a net that catches a larger population, which they can then use to go fishing for bigger fish and make the prosecutor look good. Or, as I’ve heard from others, it is used to simplify prosecution. PedoAnon can’t argue “it’s a deepfake, not a real kid” to the SWAT team.
There is a massive disconnect between what we should be seeing and what we are seeing. I assume it’s because the authorities who moderate this shit almost exclusively go after real CSAM, on account of it being a literal offense, as opposed to drawn CSAM, which is a proxy offense. This can be attributed to the lack of proper funding for CSAM enforcement. Pedos get picked up if they become an active embarrassment, like the article dude. Otherwise all the money is just spent on the database getting bigger and keeping the lights on. Which works for Congress: a public pedo gets nailed to the wall because of the database, the spooky spectre of the pedo out for your kids remains, vote for me please…
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English)
2 · 2 years ago
And also it’s an AI.
13k images before AI involved a human with Photoshop or a child doing fucked up shit.
13k images after AI is just forgetting to turn off the CSAM auto-generate button.
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English)
2 · 2 years ago
Stable Diffusion has been distancing themselves from this. The model that allows for this was leaked from a different company.
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English)
15 · 2 years ago
It’s not ok to do this. https://www.thefederalcriminalattorneys.com/possession-of-lolicon
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English)
22 · 2 years ago
Making the CSAM is illegal by itself. https://www.thefederalcriminalattorneys.com/possession-of-lolicon
Title is pretty accurate.
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English)
36 · 2 years ago
Creating the pics is a crime by itself. https://www.thefederalcriminalattorneys.com/possession-of-lolicon
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English)
114 · 2 years ago
“so many people still think it should be illegal”
It is illegal. https://www.thefederalcriminalattorneys.com/possession-of-lolicon
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English)
18 · 2 years ago
The generated stuff is as illegal as the real stuff. https://www.thefederalcriminalattorneys.com/possession-of-lolicon https://en.wikipedia.org/wiki/PROTECT_Act_of_2003
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English)
15 · 2 years ago
It would be illegal in the United States. Artistic depictions of CSAM are illegal under the PROTECT Act of 2003.
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English)
23 · 2 years ago
Asked whether more funding will be provided for the anti-paint enforcement divisions: it’s such a big backlog, we’d rather just wait for somebody to piss off a politician before focusing our resources.
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English)
311 · 2 years ago
“Simulated crimes aren’t crimes.”
Artistic CSAM is definitely a crime in the United States. PROTECT Act of 2003.
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English)
28 · 2 years ago
“The major concern to me is that there isn’t really any guidance from the FBI on what you can and can’t do, which may lead to some big issues.”
The PROTECT Act of 2003 means that any artistic depiction of CSAM is illegal. The guidance is pretty clear: the FBI is gonna raid your house… eventually. We still haven’t properly funded the anti-CSAM departments.
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English)
477 · 2 years ago
OMG. Every other post is saying they’re disgusted about the images but that it’s a grey area, and that he’s definitely in trouble for contacting a minor.
Cartoon CSAM is illegal in the United States. AI images of CSAM fall into that category. It was illegal for him to make the images in the first place BEFORE he started sending them to a minor.
https://www.thefederalcriminalattorneys.com/possession-of-lolicon
PirateJesus@lemmy.today to Technology@lemmy.world • FBI Arrests Man For Generating AI Child Sexual Abuse Imagery (English)
712 · 2 years ago
“Currently, we do not outlaw written depictions nor drawings of child sexual abuse”
Cartoon CSAM is illegal in the United States.
https://www.thefederalcriminalattorneys.com/possession-of-lolicon
It depends on how situationally aware the cops are about all the cameras and witnesses. In an ideal case, they’d realize there are a lot of voting community members around them who took the time to attend a ceremony. Not a situation where people will just turn up the volume on their television sets.