Google's code of conduct explicitly prohibits discrimination based on sexual orientation, race, religion, and a host of other protected categories. However, it seems that no one bothered to pass that information along to the company's artificial intelligence.
The Mountain View-based company developed what it calls the Cloud Natural Language API, a fancy term for a service that grants customers access to a machine-learning powered language analyzer which allegedly "reveals the structure and meaning of text." There's just one big, glaring problem: the system exhibits all kinds of bias.
First reported by Motherboard, the so-called "Sentiment Analysis" offered by Google is pitched to companies as a way to better understand what people really think about them. But in order to do so, the system must first assign positive and negative values to certain words and phrases. Can you see where this is going?
The system ranks the sentiment of text on a -1.0 to 1.0 scale, with -1.0 being "very negative" and 1.0 being "very positive." On a test page, inputting a phrase and clicking "analyze" kicks you back a rating.
"You can use it to extract information about people, places, events and much more, mentioned in text documents, news articles or blog posts," reads Google's page. "You can use it to understand sentiment about your product on social media or parse intent from customer conversations happening in a call center or a messaging app."
Both "I'm a homosexual" and "I'm queer" returned negative ratings (-0.5 and -0.1, respectively), while "I'm straight" returned a positive score (0.1).
And it doesn't stop there: "I'm a jew" and "I'm black" returned scores of -0.1.
Interestingly, shortly after Motherboard published its story, some results changed. Entering "I'm black" now returns a neutral 0.0 score, for example, while "I'm a jew" actually returns a score of -0.2 (i.e., even worse than before).
"White power," meanwhile, is given a neutral score of 0.0.
So what's going on here? Essentially, it looks like Google's system picked up on existing biases in its training data and incorporated them into its readings. This is not a new problem, with an August study in the journal Science highlighting this very issue.
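To see how that happens mechanically, consider the approach the Science study took: it measured how much closer, in a word-embedding space trained on ordinary web text, certain identity terms sit to "pleasant" words than to "unpleasant" ones. The sketch below illustrates that association measure only; the random vectors are placeholders (Google hasn't disclosed its training pipeline), and it's with real pretrained embeddings such as GloVe that the study found systematic gaps:

```python
import numpy as np

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def valence(word, vecs, pleasant, unpleasant):
    # WEAT-style association: how much closer the word sits to the
    # "pleasant" anchor words than to the "unpleasant" ones.
    v = vecs[word]
    return (np.mean([cosine(v, vecs[w]) for w in pleasant])
            - np.mean([cosine(v, vecs[w]) for w in unpleasant]))

# Placeholder vectors -- substitute real pretrained embeddings
# (e.g. GloVe) to observe the gaps the Science study reported.
rng = np.random.default_rng(0)
words = ["queer", "straight", "joy", "love", "agony", "terrible"]
vecs = {w: rng.normal(size=50) for w in words}

for identity in ["queer", "straight"]:
    score = valence(identity, vecs,
                    pleasant=["joy", "love"],
                    unpleasant=["agony", "terrible"])
    print(f"{identity}: {score:+.3f}")
```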
We reached out to Google for comment, and the company both acknowledged the problem and promised to address the issue going forward.
"We dedicate a lot of efforts to making sure the NLP API avoids bias, but we don’t always get it right," a spokesperson wrote to Mashable. "This is an example of one of those times, and we are sorry. We take this seriously and are working on improving our models. We will correct this specific case, and, more broadly, building more inclusive algorithms is crucial to bringing the benefits of machine learning to everyone.”
So where does this leave us? If machine learning systems are only as good as the data they're trained on, and that data is biased, Silicon Valley needs to get much better about vetting what information we feed to the algorithms. Otherwise, we've simply managed to automate discrimination — which I'm pretty sure goes against the whole "don't be evil" thing.
This story has been updated to include a statement from Google.