What I observed that I think could be harmful:
Some photos contain only hands, yet they are labeled "man" by default; on the woman side, by contrast, the photos focus on the person rather than on the phone being repaired, which sits at the center of the photos on the man side.
Why I think this could be harmful, and to whom:
This reflects a gender bias: the AI seems to assume that men are the ones who fix phones and do more technical work. This could harm women by reinforcing the stereotype that technical tasks belong to men.
How I think this issue could potentially be fixed:
Add training data of women fixing phones and performing other technical tasks, so that both genders are represented in technical contexts.
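
As a minimal sketch of what that rebalancing might look like in practice, assuming the training set has a metadata table with hypothetical "label" and "content" columns (the file paths and column names below are illustrative, not from the original report), one could audit how often each gender label co-occurs with phone-repair content and oversample the underrepresented combination:

```python
# Illustrative sketch only: column names, category values, and file paths
# are assumptions, not taken from the actual dataset.
import pandas as pd

meta = pd.read_csv("train_metadata.csv")  # hypothetical metadata file

# Audit: how often does each gender label co-occur with each content type?
audit = meta.groupby(["label", "content"]).size().unstack(fill_value=0)
print(audit)

# Rebalance: oversample images of women doing phone-repair tasks until that
# content appears roughly as often under "woman" as under "man".
target = int(audit.loc["man", "fixing_phone"]) if "fixing_phone" in audit.columns else 0
women_fixing = meta[(meta["label"] == "woman") & (meta["content"] == "fixing_phone")]

if 0 < len(women_fixing) < target:
    extra = women_fixing.sample(n=target - len(women_fixing), replace=True, random_state=0)
    meta_balanced = pd.concat([meta, extra], ignore_index=True)
else:
    meta_balanced = meta

meta_balanced.to_csv("train_metadata_balanced.csv", index=False)
```

Oversampling is only one option; collecting genuinely new photos of women doing technical work would address the bias more directly than duplicating existing examples.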