Advancing technology has always been a controversial topic of discussion and today it’s about to get even more heated.
A new study from Stanford University has found that artificial intelligence can now guess whether a person is gay or straight from photos of their faces with up to 91% accuracy, a claim that also puts the human ‘gaydar’ under serious threat.
Researchers from the university found that a computer algorithm could successfully distinguish between gay and straight men 81% of the time, whilst it did the same for women with a 74% success rate. This kind of technology is potentially dangerous when it comes to ethics and privacy: facial recognition could be used to identify someone’s sexual orientation without their consent, making anti-LGBT abuse much easier.
The artificial intelligence itself was tested against a sample of over 35,000 facial images publicly uploaded to a US dating site. From there the technology extracted key features from the images by using “deep neural networks” (a complex mathematical model) to analyse the visuals across an extensive dataset.
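The pipeline described above, where a deep network reduces each photo to a feature vector and a simple classifier then separates the two groups, can be sketched in miniature. The sketch below is purely illustrative and is not the study's actual code: it stands in the deep network's image features with synthetic random vectors, and uses a small logistic-regression classifier written with NumPy (the class separation, dimensions, and learning rate are all assumptions for the demo).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for face embeddings: in the real pipeline a deep
# neural network would produce a feature vector per photo. Here two
# overlapping Gaussian clouds play the role of the two classes.
n, d = 200, 16
X_a = rng.normal(loc=0.5, scale=1.0, size=(n, d))   # class A features
X_b = rng.normal(loc=-0.5, scale=1.0, size=(n, d))  # class B features
X = np.vstack([X_a, X_b])
y = np.array([1] * n + [0] * n)

# Minimal logistic-regression classifier trained by gradient descent.
w = np.zeros(d)
b = 0.0
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
    w -= lr * (X.T @ (p - y)) / len(y)      # gradient step on weights
    b -= lr * np.mean(p - y)                # gradient step on bias

preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
accuracy = np.mean(preds == y)
print(f"training accuracy: {accuracy:.2f}")
```

The same structure also explains the multi-image result reported later: averaging the classifier's probabilities over several photos of one person reduces noise, which is why accuracy climbs when five images are available instead of one.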
The results? The artificial intelligence found that gay men and women tended to possess “gender-atypical” features, expressions and “grooming styles”. In other words, gay men appeared more feminine whilst gay women appeared more masculine. On a more physical level, it determined that gay men tended to have narrower jaws, longer noses and larger foreheads than straight men. Gay women showed the opposite pattern, with larger jaws and smaller foreheads than straight women.
These results were then tested against human judges, who fared significantly worse than the algorithm, with a 61% success rate for men and 54% for women. The machines weren’t done there, though: when the algorithm was allowed to analyse five images per person, its success rate rose to 91% for men and 83% for women.
What the authors concluded from this was that “faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain”. Delving even further, the paper suggested the results strongly supported the theory that sexual orientation originates from exposure to specific hormones before birth, meaning people are born gay and being queer is not a choice. The lower score for women also suggested that female sexual orientation tends to be more fluid.
Among those speaking about the underlying issues this new technology brings was Nick Rule, an associate professor of psychology at the University of Toronto who has also published research on the science of gaydar.
“It’s certainly unsettling. Like any new tool, if it gets into the wrong hands, it can be used for ill purposes. If you can start profiling people based on their appearance, then identifying them and doing horrible things to them, that’s really bad.”
“What the authors have done here is to make a very bold statement about how powerful this can be. Now we know that we need protections.”
Authors of the Stanford Study also noted the potential for this new artificial intelligence to be used to link facial features to a host of other human traits and behaviours including political views, psychological conditions or personality.
Sound eerily familiar? You’ve probably watched Tom Cruise in Minority Report, where people could be arrested for crimes they had yet to commit.
“AI can tell you anything about anyone with enough data,” said Brian Brackeen, CEO of Kairos, a face recognition company.
“The question is as a society, do we want to know?”
[via The Guardian]