Should we fear Gaydar & BiFi?
There’s a popular notion these days of ‘gaydar’: the ability of lesbian and gay people to spot each other when straight people wouldn’t have a clue as to the sexual orientation of the person in question. The bi equivalent would be bidar but that’s not quite such fun wordplay, so the term that has started to catch on in its place is “bi-fi”.
Many of us would like to have it – but if it’s real, then surely whatever cues we pick up on could be trained into computers. And then those of us with dodgy bi-fi signals could just download the app and find out in no time whether the cutie down the street is bi.
In a paper titled “Deep neural networks are more accurate than humans at detecting sexual orientation from facial images”, the Journal of Personality and Social Psychology presented a Stanford University (USA) study looking at whether computers can identify gays and lesbians based on a couple of photos.
They say they can. Better than humans can, by quite some margin, concluding “Across seven studies, we show that a computer algorithm can accurately detect sexual orientation from people’s faces. When presented with a pair of participants, one gay and one straight, the algorithm could correctly distinguish between them 91% of the time for men and 83% of the time for women.”
Researchers Michal Kosinski and Yilun Wang explain they’ve got it sussed, and rapidly realise the challenges this might pose.
“We did not build a privacy-invading tool. We studied existing facial recognition technologies, already widely used by companies and governments, to see whether they can detect sexual orientation more accurately than humans. We were terrified to find that they do. This presents serious risks to the privacy of LGBTQ people.”
Well quite. For day-to-day life in Britain that might mainly be about Facebook working out which adverts to run along the side of your timeline – but consider its usefulness in countries where homosexuality is still illegal, or in deciding who gets through customs quickly, and who slowly, at the Russian border.
All through the power of computers, because “faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain”.
The sourcing on the images is interesting. The data scraping picked out pages gay men like on Facebook, but one of those cited is “Gay Times Magazine”, which will perhaps have a UK skew.
“Unfortunately, we were not able to reliably identify heterosexual Facebook users” is a line to make me chuckle.
“We obtained facial images from public profiles posted on a U.S. dating website. We recorded 130,741 images of 36,630 men and 170,360 images of 38,593 women between the ages of 18 and 40, who reported their location as the U.S. Gay and heterosexual people were represented in equal numbers. Their sexual orientation was established based on the gender of the partners that they were looking for (according to their profiles).”
You might want to pause and wonder about the morality of that data-scraping exercise, as I doubt those 75,000 people opted in to being digitally analysed to find out whether they looked gay. There’s also a question about how reliable photos and ages on dating sites are, but overall people who are looking for a romantic partner are maybe more likely to be honest than those hoping to tick a box with a researcher so they get a $20 Amazon gift card for their participation.
Regular BCN readers with their love of breaking down simple gay-straight style binaries will be wondering where the bisexuals are and how the machine copes with the gender diversity we’ve all got used to in the 21st century.
The researchers admit they kept that simple, and further they only processed white people’s faces for the analysis.
They picked random pairs and asked the AI which one of two people “looked gay”.
From a single photo of each person it could accurately guess “which one’s gay” 70% of the time for pairs of women and 80% for men; with five photos of each subject, that rose to nearly 85% for women and above 90% for men.
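For the curious, the pairwise test described above works something like this: the software gives every face a score, and for each gay/straight pair it simply guesses that the higher-scoring face is the gay one. Here’s a minimal sketch of that idea – the scores are made-up numbers for illustration, not anything from the study’s actual model.

```python
import random

# Hypothetical "gayness" scores for 1,000 gay and 1,000 straight faces.
# These distributions are invented for illustration only.
random.seed(0)
gay_scores = [random.gauss(0.6, 0.2) for _ in range(1000)]
straight_scores = [random.gauss(0.4, 0.2) for _ in range(1000)]

# Pair one gay face with one straight face, and guess that the
# higher-scoring face in each pair is the gay one.
pairs = list(zip(gay_scores, straight_scores))
correct = sum(1 for g, s in pairs if g > s)
accuracy = correct / len(pairs)
print(f"pairwise accuracy: {accuracy:.0%}")
```

The more the two score distributions overlap, the closer that accuracy falls to the 50% you’d get by flipping a coin – which is why the 91%-for-men figure is so striking.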
Success rates are a bit lower if the software is only allowed to look at your basic face shape rather than taking into account finer details of how you present yourself.
The analysis is interesting: they made the software explore which pixels in a face were giving the most information. That gay and straight people have differing jawlines might be news. However, “lesbians tend to use less eye makeup… wore less revealing clothes… smiled less than their heterosexual counterparts” will surprise few readers. Perhaps likewise “heterosexual men… tended to wear baseball caps”.
So gaydar seems to be for real, even if yours is dodgy, and when they build a bigger computer they’ll have bi-fi sorted. And now we all know how to “pass” if we need to – a bit of eyeshadow or a baseball cap and you’ll be instantly straight; take them off if you want to be officially gay.
But it may be more trouble than it’s worth.