When AI sees a man, it thinks “official.” A woman? “Smile”



Sam Whitney (illustration), Getty Images

Men often judge women by their appearance. It turns out that computers do that, too.

When US and European researchers fed photos of members of Congress to Google's cloud image recognition service, the service applied three times as many annotations related to physical appearance to photos of women as it did to photos of men. The top labels applied to men were “official” and “businessperson”; for women they were “smile” and “chin.”

“It results in women receiving a lower-status stereotype: that women are there to look pretty and men are business leaders,” says Carsten Schwemmer, a postdoctoral researcher at the GESIS Leibniz Institute for the Social Sciences in Cologne, Germany. He worked on the study, posted last week, with researchers from New York University, American University, University College Dublin, the University of Michigan, and the nonprofit California YIMBY.

The researchers ran their machine vision test on Google's AI image service as well as those of its competitors Amazon and Microsoft. Paid crowdworkers reviewed the annotations those services applied to official photos of legislators and to photos those lawmakers had tweeted.
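To make the setup concrete, here is a minimal sketch of how label annotations like these can be requested from Google's Cloud Vision API using its Python client. The filename and the print format are illustrative assumptions; the article does not describe the researchers' actual pipeline.

```python
# Minimal sketch: request label annotations for one photo from
# Google Cloud Vision (assumes the google-cloud-vision package is
# installed and application credentials are configured).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# "legislator.jpg" is a placeholder filename for an official portrait.
with open("legislator.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Each annotation has a free-text description (e.g. "official",
# "smile") and a confidence score between 0 and 1.
for label in response.label_annotations:
    print(f"{label.description}: {label.score:.2f}")
```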

Google’s AI image recognition service tended to see men such as Senator Steve Daines (R-Mont.) as businesspeople, but it described women legislators such as Lucille Roybal-Allard in terms of their appearance.

Carsten Schwemmer

The AI services generally saw things that human reviewers could also see in the photos, but they tended to notice different things about women and men, with women far more likely to be characterized by their appearance. Women legislators were often tagged with “girl” and “beauty.” The services also tended not to see women at all, failing to detect them more often than they failed to detect men.

The study adds to evidence that algorithms do not see the world with mathematical detachment but instead tend to replicate or even amplify historical cultural biases. It was partly inspired by the 2018 project Gender Shades, which showed that Microsoft’s and IBM’s cloud services were very accurate at identifying the gender of white men but very inaccurate at identifying the gender of Black women.

The new study was posted last week, but the researchers gathered data from the AI services in 2018. Experiments conducted by WIRED using official photos of 10 men and 10 women from the California State Senate suggest the study’s findings still hold.

Amazon’s image-processing service Rekognition tagged photos of some women senators in California, including Ling Ling Chang, a Republican, as “girl” or “child,” but it did not apply similar labels to male lawmakers.

Wired Staff via Amazon

All 20 lawmakers are smiling in their official photos. Google’s top suggested labels noted a smile for only one of the men, but for seven of the women. The company’s AI vision service labeled all 10 men as “businessperson,” often also “official” or “white-collar worker.” Only five of the women senators received one or more of those labels. Women also received appearance-related tags, such as “skin,” “hairstyle,” and “neck,” that were not applied to men.

Amazon’s and Microsoft’s services appeared to show less pronounced bias, although Amazon reported being more than 99 percent certain that two of the 10 women senators were either a “girl” or a “child.” It did not suggest that any of the 10 men were minors. Microsoft’s service identified the gender of all the men but only eight of the women; it labeled one a man and did not assign a gender to another.
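For comparison, a similar label request to Amazon Rekognition might look like the sketch below, using the boto3 client. The filename, region, and label count are assumptions for illustration, not parameters taken from the study or from WIRED’s test.

```python
# Minimal sketch: request labels for one photo from Amazon Rekognition
# (assumes boto3 is installed and AWS credentials are configured).
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

# "senator.jpg" is a placeholder filename for an official portrait.
with open("senator.jpg", "rb") as f:
    response = client.detect_labels(Image={"Bytes": f.read()}, MaxLabels=20)

# Each label has a name (e.g. "Girl", "Suit") and a confidence score
# between 0 and 100.
for label in response["Labels"]:
    print(f"{label['Name']}: {label['Confidence']:.1f}%")
```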

Google disabled gender detection in its AI vision service earlier this year, saying that gender cannot be inferred from a person’s appearance. Tracy Frey, managing director of responsible AI at Google’s cloud division, says the company continues to work to reduce bias and welcomes outside input. “We always strive to be better and continue to collaborate with outside stakeholders, such as academic researchers, to advance our work in this area,” she says. Amazon and Microsoft declined to comment; both companies’ services recognize gender only as binary.

‘A false picture of reality’

The US-European study was partly inspired by what happened when researchers fed Google’s vision service an award-winning photo from Texas showing a Honduran toddler crying as a US Border Patrol officer detained her mother. Google’s AI suggested labels including “fun,” with a score of 77 percent, higher than the 52 percent score it assigned to “child.” WIRED got the same suggestion after uploading the image to Google’s service on Wednesday.

Schwemmer and his colleagues began playing with Google’s service in hopes that it would help them measure patterns in how people use images to talk about politics online. The gender bias he subsequently helped uncover in the image services convinced him that the technology is not ready for researchers to use in that way, and that companies relying on such services could suffer unwelcome consequences. “You could get a completely false picture of reality,” he says. A company that uses a skewed AI service to organize a large photo collection might inadvertently end up obscuring businesswomen, indexing them by their smiles instead.

When this photo won the 2019 World Press Photo award, one judge noted that it showed “psychological violence.” Google’s image algorithms detected “fun.”

WIRED staff via Google

Previous research has found that prominent datasets of labeled images used to train vision algorithms showed significant gender biases, for example depicting women cooking and men shooting. The skew appeared to come in part from the way researchers collected their images online, where the available photos reflect societal biases, for example by providing many more examples of businessmen than businesswomen. Machine learning software trained on those datasets was found to amplify the bias of the underlying image collections.

Schwemmer thinks biased training data may explain the bias the new study found in the tech giants’ AI services, but it’s impossible to know without full access to their systems.

Diagnosing and repairing the shortcomings and biases of artificial intelligence systems has become a hot research topic in recent years. The way humans can instantly grasp the full context of an image, while AI software focuses narrowly on patterns of pixels, creates plenty of potential for misunderstanding. The problem has become more pressing as algorithms get better at processing images. “They’re now everywhere,” says Olga Russakovsky, a professor at Princeton University. “So it’s best to make sure they’re doing the right things in the world and that there are no unintended downstream consequences.”

One way to address the problem is to work on improving the training data that can be the root cause of biased machine learning systems. Russakovsky is part of a Princeton project working on a tool called REVISE that can automatically flag some biases baked into a collection of images, including along geographic and gender lines.

When the researchers applied the tool to Open Images, a collection of 9 million photos maintained by Google, they found that men appeared more often than women in outdoor scenes and on sports fields. Men tagged with sports uniforms were mostly outdoors playing sports such as baseball, while women were indoors playing basketball or wearing swimsuits. The Princeton team suggested adding more images showing women outdoors, including playing sports.

Google and its AI competitors are themselves major contributors to research on fairness and bias in AI. That includes work on the idea of creating standardized ways to communicate the limitations and contents of AI software and datasets to developers, something like a nutrition label for AI.

Google has developed a format called “model cards” and has published cards for the face detection and object detection components of its cloud vision service. One claims that Google’s face detector works more or less the same for different races, but it does not mention other possible forms that AI gender bias might take.

This story originally appeared on wired.com.

