
Why You Should Be Suspicious of That Study Claiming A.I. Can Detect a Person’s Sexual Orientation

By Sonia Katyal

Recently, the A.I. community was largely stunned when a study released by two Stanford researchers claimed that artificial intelligence could essentially detect whether a person is gay or straight. For those of us who have been working on issues of bias in A.I., it was a moment we had long foreseen: Someone would attempt to apply A.I. technology to categorize human identity, reducing the rich complexity of our daily lives, activities, and personalities to a couple of simplistic variables. The now-infamous study is really only the tip of the iceberg when it comes to the dangers of mapping predictive analytics onto nuanced questions of human identity. Drawing entirely on white subjects, all of whom had posted their profiles and photographs on dating sites, the study concluded that its neural network could predict whether a person was gay or straight more than 70 percent of the time (though accuracy depended on gender and on how many images were presented).

The study was deeply flawed and dystopian, largely because of whom it chose to study and how it chose to categorize them. In addition to studying only people who were white, it allowed just two categories of sexual identity, gay or straight, and assumed a correlation between people's sexual identity and their sexual activity. In reality, these categories fail to apply to vast numbers of human beings, whose identities, behaviors, and bodies do not correlate with the simplistic assumptions made by the researchers. Even aside from the methodological issues, just focus on what the study says about, well, people. You only count if you are white. You only count if you are either gay or straight.

“Technology cannot identify someone’s sexual orientation,” Jim Halloran, GLAAD’s chief digital officer, said in a statement. “What their technology can recognize is a pattern that found a small subset of out white gay and lesbian people on dating sites who look similar. Those two findings should not be conflated.” Halloran continued, “This research isn’t science or news, but it’s a description of beauty standards on dating sites that ignores huge segments of the LGBTQ community, including people of color, transgender people, older individuals, and other LGBTQ people who don’t want to post photos on dating sites.”

Unsurprisingly, the researchers claimed that critics were rushing to judgment prematurely. “Our findings could be wrong,” they admitted in a statement released Monday. “[H]owever, scientific findings can only be debunked by scientific data and replication, not by well-meaning lawyers and communication officers lacking scientific training.”

Read the full piece at Slate.