this post was submitted on 23 Sep 2023
312 points (92.9% liked)

A very NSFW website called Pornpen.ai is churning out an endless stream of graphic, AI-generated porn. We have mixed feelings.

[–] [email protected] 8 points 1 year ago (22 children)

I am a little surprised that no one has created a site like this for child pornography.

I am not a legal expert, but my layman's understanding of Ashcroft v. Free Speech Coalition (https://en.wikipedia.org/wiki/Ashcroft_v._Free_Speech_Coalition) is that CSAM is legal as long as no real person is harmed in producing it.

Maybe later rulings have changed this. One can hope.

[–] [email protected] 6 points 1 year ago (2 children)

CivitAI is a pretty perverted site at the best of times, but there's a disturbing number of age-adjustment plugins for making images of children on the same site that hosts plugins for generating sex acts. It's clear some people definitely are using them together.

[–] [email protected] 4 points 1 year ago (1 children)

Some models also skew toward children for some reason, and then you have to put "mature"/"adult" in the positive prompt and "child" in the negative prompt.

[–] [email protected] 3 points 1 year ago

I think part of the problem is that there is a lot of anime in the models, and when you don't filter that out with negative prompts it can distort the proportions of realistic images. In general, models are always heavily biased toward what they were trained on, and when you use a prompt or LoRA that worked well on one model with another, you can get weird results. There is always a lot of nudging involved, with keywords and weights, to get the images to where you want them.
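The keyword-and-weight nudging described above can be sketched as plain prompt-string composition. This is a minimal sketch assuming the common `(keyword:weight)` attention syntax used by Stable Diffusion front-ends such as AUTOMATIC1111; the helper names here are my own, not part of any library.

```python
# Sketch: composing weighted positive/negative prompts in the
# "(keyword:weight)" attention syntax used by many Stable Diffusion
# front-ends (e.g. AUTOMATIC1111). Helper names are hypothetical.

def weighted(keyword: str, weight: float = 1.0) -> str:
    """Wrap a keyword in attention syntax; a weight of 1.0 means no emphasis."""
    return keyword if weight == 1.0 else f"({keyword}:{weight})"

def build_prompt(terms: list[tuple[str, float]]) -> str:
    """Join (keyword, weight) pairs into a comma-separated prompt string."""
    return ", ".join(weighted(k, w) for k, w in terms)

# Nudge the model toward realistic adult subjects...
positive = build_prompt([("photorealistic", 1.0), ("mature adult", 1.3)])
# ...and away from anime styling and its distorted proportions.
negative = build_prompt([("anime", 1.2), ("child", 1.5)])

print(positive)  # photorealistic, (mature adult:1.3)
print(negative)  # (anime:1.2), (child:1.5)
```

The same pairs usually need re-tuning per model, since each checkpoint responds differently to the same weights.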
