Incident 273: FaceApp Predicted Different Genders for Similar User Photos with Slight Variations
I’d like to talk a little bit about algorithms, dysphoria, and dysmorphia.
I’ve struggled with algorithms. I’ll often take a picture and run it through FaceApp to get gendered.
I recently noticed that when my eyebrows are thin, it says I am female. Thicker, male. Let’s talk.
These two pictures are the exact same. No makeup. The first is me just out of the shower. The second is after I put some concealer over my brows, which I am regrowing right now for a “brow reset.”
In the first I am gendered male, in the second I am gendered female.
I have a suspicion that even after I clean them up once my reset is over, I’ll still get gendered male by the algorithm if I keep them thick, which is the current style.
What do you do when current trends of thicker eyebrows on women make you fit in more as a woman but make apps gender you male and trigger your dysphoria?
I felt trapped. Either decision would result in distress.
This isn’t unique to FaceApp. The how-old site seems to hate glasses. When I wear glasses, 31-year-old male. When I don’t, 24-year-old female.
Nobody is gendering you based on whether you wear glasses. And eyebrows, while they can help you get gendered properly, aren’t essential. Or are they?
These are the thoughts that run through my head, and why take a risk?
So I overplucked my brows and haven’t worn glasses.