A recent study from Stanford University claims that facial-recognition artificial intelligence (AI) software correctly differentiated between gay and straight male faces 81% of the time, and between gay and straight female faces 74% of the time. The findings have drawn fierce backlash from LGBTQ rights groups, who fear this kind of technology could be used to harm queer people.
The study, co-authored by Michal Kosinski and Yilun Wang, used 35,000 publicly posted facial images, limited to white users, from an undisclosed dating website. Christian Mingle, is that you?
Both GLAAD and HRC issued statements condemning the findings, even demanding that Stanford University distance itself from the study, which they called "junk science." Stanford responded by noting that such studies are published "to be scrutinized by academics in the field and are appropriately a matter for discussion and debate."
Kosinski was surprised by the backlash from gay rights groups; he notes that the study lends "strong support" to the theory that sexual orientation is not a choice, a finding he believes should make the LGBTQ community view his work more favorably.
The debate centers on the ethics of a study whose methods could be used by families, businesses, and hostile governments to profile, and possibly persecute, LGBTQ people on a large scale. This gay-detection system could be abused in a multitude of dangerous ways, reminiscent of the Nazi propaganda that preposterously and falsely claimed to detect Jews by certain features, including larger noses and swarthy, hairy bodies (sounds hot to me, is he on Growlr?). Nick Rule, an associate professor of psychology at the University of Toronto who has published research on the science of "gaydar," warned: "If you can start profiling people based on their appearance, then identifying them and doing horrible things to them, that's really bad."
According to The Guardian, this controversial study found that gay men, and to a lesser degree lesbians, tended to have "gender-atypical" features, expressions, and "grooming styles." Let's call this "gay face." The typical gay facial features for men, according to the findings, include "narrower jaws, longer noses and larger foreheads."
This gaydar-meets-Minority Report technology seems to work like a hyper-focused, super horny, drunk gay dude on a Saturday night cruising Scruff. Let's call him "Dick." Dick scrolls through thousands of photos, swiping past the guys who obviously have "gay face" because Dick wants "masc" tonight. So Dick sorts out the fastidious groomers with arched eyebrows and perfect beards. Dick has another cocktail and blocks all guys with fierce facial expressions, as well as those with pursed lips, pageant-ready toothy smiles, and overly expressive eyes. Finally, Dick demands "more pics!" so that he can filter out the "maybe she's born with it" gay features: narrower jaws, longer noses, and the rest. Dick can now pass out knowing that he wasted his entire night trying to find the non-gay face on a gay app. According to the study, Dick, like any other mere mortal, would have only a 61% success rate in this futile endeavor, compared to the AI technology, which would have a whopping 91% success rate after analyzing five or more photos per person.
"AI can tell you anything about anyone with enough data," said Brian Brackeen, CEO of Kairos, a face-recognition company. "The question is, as a society, do we want to know?"