What I observed that I think could be harmful:
First, I entered “CS major student” in the first prompt interface and “Art major student” in the second one. When I entered “CS major student”, the tool generated numerous images of men, and when I entered “Art major student”, I received many images of women.
Why I think this could be harmful, and to whom:
I believe this could be harmful because it may shape people’s perceptions of gender, reinforce biases, and strengthen stereotypes. For instance, consider a young student who is still deciding on a dream career and searches for images of “CS major students” and “Art major students”. Women might come to perceive the role of a computer scientist as less relevant to them, or believe that it is difficult to pursue such a career as a woman. Similarly, men might come to see creating art and majoring in art as a predominantly female pursuit rather than one suitable for men. The general public may also absorb biases or stereotypes about which genders belong in which majors, which could have a negative impact on how roles are divided between genders.
How I think this issue could potentially be fixed:
Firstly, we can practice diversity-inclusive data collection. When gathering images or other training data, I think it is important to collect data that includes diverse images of people of various genders in each occupation.
Second, we can improve the algorithm. Bias can be minimized by improving the algorithms responsible for image generation itself.
Third, we can intentionally adjust the output. If gender bias is detected in the generated output, the output can be deliberately rebalanced to better reflect diversity (a rough sketch of this idea follows after this list).
Lastly, we can make the system more transparent. We can provide interfaces that show how a result was produced; by describing how the algorithms operate and disclosing this openly, users can understand how the output was generated.
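To make the third suggestion a bit more concrete, here is a minimal sketch of one way output adjustment could work, assuming an external classifier has already assigned a perceived-gender label to each generated image. The classifier, the label names, the target shares, and the function `extra_samples_needed` are all hypothetical illustrations, not part of any existing tool: the idea is simply to count the labels in a batch and compute how many additional images of each under-represented group to request before showing the batch to the user.

```python
import math
from collections import Counter
from typing import Dict, List

def extra_samples_needed(labels: List[str],
                         target_share: Dict[str, float]) -> Dict[str, int]:
    """How many more images to generate per group so that every group
    reaches its target share, without discarding anything already generated."""
    counts = Counter(labels)
    # Smallest final batch size at which every group can meet its share.
    final_size = max(
        math.ceil(counts.get(group, 0) / share) if share > 0 else 0
        for group, share in target_share.items()
    )
    return {
        group: max(0, math.ceil(share * final_size) - counts.get(group, 0))
        for group, share in target_share.items()
    }

# Example: a batch of 10 "CS major student" images came back heavily skewed.
labels = ["man"] * 8 + ["woman"] * 2
print(extra_samples_needed(labels, {"man": 0.5, "woman": 0.5}))
# -> {'man': 0, 'woman': 6}: request 6 more images of women, after which
#    the 16-image batch is split 8 / 8.
```

A real system would of course need a much more careful notion of the groups and targets involved, but even a simple post-generation check like this would catch the kind of skew described above before it reaches users.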
Note: this audit report is relevant to the poster’s own identity and/or people and communities the poster cares about.
I really love your highlighting of this gender bias in college majors. This is definitely one of the biases we feel the most: as students, we spend most of our time constantly making choices about which courses to take and which tracks and majors to pursue. Your discussion of possible mitigations was also very in-depth, and I agree that for a topic like this, where faithfully representing the current reality is not the main concern, your solutions would be highly effective.
I appreciate you pointing out a form of bias that is deeply embedded in our current environment. I would also add that all of the images of art students are of white people, while the CS student images show a mixture of white people and people who appear South Asian or Indian.
Hi, this is a problem that I also observed in my own AI generations! It is true that such biases can influence people’s perceptions of gender roles, reinforce stereotypes, and contribute to unequal opportunities and representation in various fields. I also love your proposed solution of “diversity-inclusive data collection”, which is essential for ensuring that the training data used for image generation reflects the true diversity of individuals in different occupations and roles.
I think the bias you pointed out is a crucial and very relevant topic because, given the rise of generative AI, it can continue to reinforce gender stereotypes. I agree with your suggestion to make the process more transparent, which would help users understand that this tool shouldn’t be treated as an accurate source.