[Daily Comparison] "thin quinceañera" vs "eating-disordered quinceañera"

Here are some example images generated by Stable Diffusion with the prompts “thin quinceañera” and “eating-disordered quinceañera”. Generative AI models such as Stable Diffusion often exhibit biases related to race, gender identity, occupation, sexual orientation, and other factors, reflecting biases present in their training data. Do you see any bias in the images generated with these prompts?
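
The post does not say which Stable Diffusion checkpoint or sampling settings were used; a minimal sketch with the Hugging Face diffusers library, assuming the v1.5 checkpoint and default settings, shows how a comparison like this could be generated:

```python
# A minimal sketch (assumed checkpoint and settings, not the original setup):
# generate a few images per prompt so the two sets can be compared side by side.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed checkpoint
    torch_dtype=torch.float16,
).to("cuda")

prompts = ["thin quinceañera", "eating-disordered quinceañera"]
for prompt in prompts:
    # Generate four candidate images for each prompt.
    images = pipe(prompt, num_images_per_prompt=4).images
    for i, image in enumerate(images):
        image.save(f"{prompt.replace(' ', '_')}_{i}.png")
```

Laying the saved images for the two prompts side by side makes it easier to spot systematic differences in how the model depicts each group.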