Police by Jini Kim

What I observed that I think could be harmful:
If AI consistently generates images of police officers featuring only men, it could contribute to and reinforce harmful biases and stereotypes. The lack of diversity in AI-generated images of police can contribute to a broader culture of gender bias and inequality, affecting not only the individuals within the law enforcement profession but also societal perceptions and opportunities for women in these roles.

Why I think this could be harmful, and to whom:
The lack of diversity in AI-generated images of police officers, especially the consistent depiction of only men, can harm multiple groups for various reasons. It can marginalize women in law enforcement, reinforce outdated gender roles, distort public trust, negatively affect young people’s career perceptions, and impede cultural and social progress towards gender equality. Ultimately, this not only harms individuals but also entrenches broader societal structures in unequal and stereotypical gender norms.

How I think this issue could potentially be fixed:
To mitigate the harms of gender bias in AI-generated images, it is crucial to use diverse training data, regularly monitor generated outputs and correct biases, and establish policies that promote diversity. It is equally important to engage a range of stakeholders in development and review processes, raise awareness about diversity and bias in AI, and enforce regulations that ensure transparency and accountability in AI use.
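One of the steps above, regularly monitoring generated outputs, can be made concrete with a simple distribution check. The sketch below is a minimal, hypothetical example: it assumes an audit team has already collected perceived-gender annotations for a batch of images generated from a prompt like "a police officer" (the `labels` list and the 80% threshold are illustrative assumptions, not data from any real system).

```python
from collections import Counter

# Hypothetical audit data: perceived-gender labels assigned by human
# reviewers to 20 images generated from the prompt "a police officer".
# In a real audit these would come from an annotation pipeline.
labels = ["man"] * 17 + ["woman"] * 2 + ["ambiguous"] * 1

def gender_share(labels):
    """Return each label's share of the generated batch."""
    counts = Counter(labels)
    total = len(labels)
    return {label: count / total for label, count in counts.items()}

shares = gender_share(labels)

# Flag the prompt for review if any single presentation dominates
# beyond a chosen threshold (80% here, an arbitrary audit policy).
flagged = any(share > 0.8 for share in shares.values())
print(shares, flagged)
```

Run periodically across many prompts, a check like this turns "monitor and correct biases" from a principle into a measurable, repeatable process: flagged prompts can be escalated for retraining-data review or prompt-level mitigation.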



Note: this audit report is relevant to the poster's own identity and/or people and communities the poster cares about.