AI Generated Images
Community for AI image generation. All models are allowed. Creativity is valuable! Posting the model used for reference is recommended, but not required.
No explicit violence, gore, or nudity.
This is not an NSFW community, although exceptions are sometimes made. Any NSFW posts must be marked as NSFW and may be removed at any moderator's discretion. Any suggestive imagery may be removed at any time.
Refer to https://lemmynsfw.com/ for any NSFW imagery.
No misconduct: Harassment, Abuse or assault, Bullying, Illegal activity, Discrimination, Racism, Trolling, Bigotry.
AI Generated Videos are allowed under the same rules. Photosensitivity warning required for any flashing videos.
To embed images, type: `![](put image url in here)`
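For instance, with the URL replaced by wherever the image is hosted (the address below is just a placeholder):

```
![](https://example.com/my-image.png)
```

Text placed inside the square brackets becomes the image's alt text.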
Follow all sh.itjust.works rules.
Community Challenge Past Entries
Related communities:
- [email protected] - Useful general AI discussion
- [email protected] - Photo-realistic AI images
- [email protected] - Stable Diffusion Art
- [email protected] - Stable Diffusion Anime Art
- [email protected] - AI art generated through bots
- [email protected] - NSFW weird and surreal images
- [email protected] - NSFW AI generated porn
Weird right?
I've been able to make every other fantasy creature I've tried except this one. Makes me wonder why.
If you figure that it's been trained on very few centaurs and very many images of horses and of people -- and of people riding horses -- my guess is that it'll try to make something that looks like what it's been trained on.
Makes sense. But it can do minotaurs with no problem which I imagine would have the same issue!?
Many times fewer images of bull heads, so a less engrained view of what they should look like.
Yeah could be.
It made a faun?
Haven't tried. Minotaurs were the closest thing I tried, and it did them just fine.
SD (fullyREALXL_v30ForREAL) seems to want to draw one from A Midsummer Night's Dream with full stage makeup on :/
Probably because 90+% of the images it's been trained on are of people who don't have horse bodies.
Basically it's over-trained on pictures of people.
Yeah that makes sense.