
Question for the admin: How do you want to handle AI images made with prompts of banned material?

allyfitz

Esteemed Citizen of ZV
There's a lot of AI art being shared on this site, and most people are unaware that in many cases the prompt used to create the image is embedded within the image itself. It's not always embedded; it depends on the program used to create the image.

I downloaded 12 images today just to check prompts from various AI threads, and yeah, it was kind of a mess of rule violations.

While the final image may not reflect every prompt given, those prompts were still used to render that image... and all anyone has to do is view the embedded metadata to find them. Here is the output from one as an example...

Steps: 39, Sampler: DPM++ 2M Karras, CFG scale: 7, Seed: 1141726763, Size: 1152x768, Model hash: 09159c2301, Model: 27DProny_v10, VAE hash: 551eac7037, VAE: sdxl_vae.safetensors, Denoising strength: 0.34, ADetailer model: mediapipe_face_full, ADetailer prompt: "highly detailed face, detailed eyes, naughty face,(drool:1.3), (fucked silly), (saliva:1.1), (orgasm:1.2), (female orgasm:1.1), (torogao:1.2), (4 y.o little girl), (child),", ADetailer confidence: 0.3, ADetailer dilate erode: 4, ADetailer mask blur: 4, ADetailer denoising strength: 0.4, ADetailer inpaint only masked: True, ADetailer inpaint padding: 32, ADetailer version: 24.1.2, Hires upscale: 1.6, Hires steps: 30, Hires upscaler: 4x-UltraSharp, Version: f0.0.15v1.8.0rc-latest-215-g19473b1a
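(For anyone who wants to check their own downloads: with PNGs from the common Stable Diffusion front-ends, a couple of lines of Python will usually show the same block. This is only a rough sketch, assuming Pillow is installed and the file actually kept its metadata; the filename here is made up.)

    from PIL import Image

    img = Image.open("downloaded_image.png")  # hypothetical saved attachment
    # A1111-style front-ends write the generation text into a "parameters" PNG text chunk
    print(img.info.get("parameters", "nothing embedded"))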

That means the creator of the image (whoever it was; it may not have been the poster) specifically decided to add (4 y.o little girl), (child), to their prompt to get the image they wanted. Regardless of what the final image looks like, they still chose to use that as their prompt. And it's in at least 4 of the images I found today: 4 out of the 12 I checked. I have no idea how many of the rest have it too.

And it's not just pedo shit... there are also people trying to make images based on real people. For example, another image...
score_9, score_8_up, score_7_up, score_6_up, score_5_up, score_4_up, 1girl, solo, crying and drooling Emma Watson, an extremely delicate and beautiful face, detailed face, realistic skin, (pale skin), blush, cute,orgasm, looking at viewer, 1animal ((huge dog)), (bestiality), orgasm, BDSM, (sex from behind, bent over, huge dog fucking Emma Watson from behind, huge dog on top of Emma Watson:1.2), BDSM,(torn dress), (at a picnic with family and friends, people watching, people smiling and cheering, day, romantic:1.2), (naughty face, orgasm, ahegao:1.2), sex, nsfw,
They listed Emma Watson three times in their prompt, clearly hoping to get something that looked like her.

Another...
score_9, score_8_up, score_7_up, score_6_up, score_5_up, score_4_up, 1girl, solo, crying and drooling Ariana Grande, an extremely delicate and beautiful face, brown hair, ponytail, detailed face, realistic skin, (pale skin), blush, cute, orgasm, looking at viewer, 1animal ((huge husky dog)), (bestiality), orgasm, BDSM, (sex from behind, bent over, huge husky dog fucking Ariana Grande from behind, huge husky dog on top of Ariana Grande:1.2), BDSM,(torn bdsm outfit, bdsm collar with a chain leash hooked to a eye hook on stage), (on a concert stage, people cheering, people, crowd:1.2), (open mouth, naughty face, orgasm, ahegao:1.2), sex, nsfw,
Listing Ariana Grande three times.

And some are using models that were intentionally trained to produce images of specific real people. In this example, Erica Campbell...
nsfw, <lora:EricaCampbell-Shurik_RealPersonTraining_4LV2:1>, Professional Full Body color Photo of standing (EricaCampbell, 20 years old nude, brown_hair, hyper realistic blue eyes looking_at_viewer, smile, perfect teeth, perfect hands, small ass, perfect head), she is kneeling, (((1dog, ultra dettailed adult_german_shepherd sitting beside the girl inside backyard of a dog shelter))), day, depth_of_field, outdoors, dog shelter background, ultra photorealistic, realistic, smile, teeth, hdr, uhd, (((512K))), high resolution textures, cinematic lighting, subsurface scattering, rich colors hyper realistic lifelike texture natural lighting trending on artstation cinestill 800, (100mm lens),

I've reported things like this a few times in the past, but with how easy these are to create, and with more and more people posting AI images every day, I definitely don't have time to check image prompts. It was a rare problem before, but it's expanded quickly, and I can only assume it's going to get worse as more and more people hop on the AI Art Hype Train.
 
When it comes to AI art that is made to look like real people, I treat it the same way as other images of real people.
I implicitly assume someone is using the face of an actress or well-known person for fetishistic reasons and delete it.
I do not examine metadata, so I am not going to know which images use underage prompts.
 
I'm assuming it is (or at least should be) treated the same way as on other art sites... if it looks underage, it will be treated as such. I.e., no one gives a shit if your pedo art of a little girl is actually a 2,000-year-old dragon in disguise.

I hope you reported those?

As for celebs being used: I just hope for a day when only white-listed models, and models that don't react to "celeb prompts", exist. Deepfakes and the people who like them are the bane of AI. Check any non-specialized AI art site and you'll see something like 70% of the gens being celebs like Scarlett Johansson or Emma Watson covered in cum. It's a mess.
 
.....Uhhhhhh, wtf??? 4 YEARS OLD? Why the heck is this so damn common? I swear to god, guys these days are creeps.
 
Currently, the law in the United States and some European countries treats AI-generated CSAM as CSAM. Not to mention that to generate that stuff, the model probably had to be trained on it. Sadly, a lot of illegal material exists on the regular internet, so I'm not surprised if AI was trained on it.

In my opinion, and in accordance with common-sense ethics and the law, we should also treat it as CSAM.

(CSAM stands for Child Sexual Abuse Material, for anyone who is unaware.)
 
When it comes to AI art that is made to look like real people, I treat it the same way as other images of real people.
I implicitly assume someone is using the face of an actress or well-known person for fetishistic reasons and delete it.
OK, but what about AI images that use LoRAs made to generate specific people, or prompts meant to do so, when the final image doesn't end up looking exactly like them?
I do not examine metadata, so I am not going to know which images use underage prompts.
So, report them or... just let it be? I'm not sure what I should do when I find it.
Are we judging the image only as it looks on screen and ignoring what prompts were used to make it? I.e., if someone uses utterly heinous prompts but the posted image looks "normal"... what do you want us regular members to do?
 
They'll be handled just like a real image. A judge in court isn't going to care that Suszy was actually 20 years old when the image is AI of her looking 10...
I'll find those threads and see if I can find the same metadata.
 
I'll find those threads and see if I can find the same metadata.
I sent you a PM with some links.

To be clear, I'm not trying to be a troublemaker or create drama. Let me know how you want me and others to behave when we find this, and I'll follow that rule. I just stumbled into it and didn't know how you all wanted me and others to proceed if we found something like that again.
 
I'll find those threads and see if I can find the same metadata.
If it were easier, I would delete things based on metadata as well, but at present you need to individually download each image and go through it with exiftool by hand, on top of the forum periodically stripping metadata from some formats.
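A batch dump is about as far as scripting gets you. A rough sketch, not anything we actually run here: it assumes Pillow is installed, a local folder of already-saved attachments, and PNGs whose generator wrote the prompt into a "parameters" or "prompt" text chunk (JPEG/WebP tend to carry it in EXIF instead, and anything re-encoded by the forum will just come back empty).

    from pathlib import Path
    from PIL import Image

    folder = Path("downloads")  # hypothetical folder of saved attachments
    for path in sorted(folder.glob("*.png")):
        with Image.open(path) as img:
            # common Stable Diffusion front-ends store the prompt in one of these text chunks
            text = img.info.get("parameters") or img.info.get("prompt") or ""
        if text.strip():
            print(f"=== {path.name} ===\n{text}\n")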
 
I sent you a PM with some links.

To be clear, I'm not trying to be a troublemaker or create drama. Let me know how you want me and others to behave when we find this, and I'll follow that rule. I just stumbled into it and didn't know how you all wanted me and others to proceed if we found something like that again.
You're not a troublemaker. Your point is an excellent one, here and elsewhere, and not enough people have considered it.
Thanks for bringing it up.
 
What do you mean by "Oh boy, here we go"? Do you really think anyone could argue against the point Ally is making? No argument can be made. Does the problem she's bringing forth not concern you?
And what in the WORLD is that mess in that image?
People screeching about how it's AI and it's not real, but it's art, and whatever other stupid shit they can come up with. I hate AI.

People can and will argue any point, for a number of reasons.

No, it doesn't really concern me because I hate AI and would get rid of it altogether if I could. I just want to watch the fire burn between people here arguing about it.

Michael Jackson eating popcorn.
 
The ability, or lack thereof, to ascertain whether an image is AI has nothing to do with the original post. It concerns how to handle AI images made with prompts of banned and/or illegal material. It was not the average pro/con AI post.
 
I know that it has nothing to do with it. But that won't stop people arguing for its preservation from using that as a point.

Again, I'm here to watch the fire, roast marshmallows over it, and marvel at the mental gymnastics.
 
Keeping the site clean is not drama or trouble-making...
You're not a troublemaker. Your point is an excellent one, here and elsewhere, and not enough people have considered it.
Thanks for bringing it up.
I know, I just kind of felt like someone running to the teacher to ask if what the other kids were doing is OK. And I don't want to be an annoyance to the site staff by continually asking questions until I'm clear on what they mean.

I wasn't expecting to find what I found. I was in a thread and a user asked for the prompts for a specific image. Trying to be helpful, I decided to reply with the prompts once I pulled them from the metadata. I had my reply all typed up (I just copied and pasted the metadata and didn't look at it) and was getting ready to post it... then I decided to check my formatting with the little "Preview" thing, and that's when I saw the "4yo" prompt and panicked. Had I not taken the extra few seconds to give it another look, I'd have been the one to publicly post that prompt. My attempt to be helpful to someone almost blew up in my face in a very, very bad way.
And of course then I got worried about whether to respond at all in that thread, or in any other thread where I hadn't checked every image for what prompts were used... and then worried about my past posts in some AI threads... and then my overthinking went into feedback-loop mode and I decided I needed to just get guidance from Mods/Staff/Admin.
 