Microsoft’s Bing search engine is serving up blatantly racist and anti-Semitic content.
When users look up terms like ‘Jews,’ ‘Muslims’ and ‘black people,’ they are met with myriad offensive search suggestions, according to HowToGeek’s Chris Hoffman, who first spotted the problem.
Since the issue was made public, Microsoft says it has ‘taken action’ to resolve it, but as of Friday, some of the offensive content was still appearing on the search engine.
The offensive themes primarily appear in Bing’s smart suggestion bubbles, which are located above the image search results, often providing users with related search terms.
In one example, HowToGeek found that searching ‘Jews’ on Bing Images produced smart suggestion bubbles like ‘dirty Jews’ and ‘evil Jews.’
What’s worse, one of the top images that came up after searching ‘Jews’ was a meme that celebrates dead Jewish people.
The offensive images appeared even when SafeSearch was enabled. The feature helps ‘keep adult content out of your search results,’ Microsoft’s website notes.
In other cases, users would search ‘black people are,’ only to be given search suggestions like ‘ugly, stupid, racist and even savages,’ according to HowToGeek.
Clicking on the racist search suggestions would serve up even more offensive content.
In another disturbing example, searching ‘Michelle Obama’ produced conspiracy theories in the search suggestions, such as ‘Michelle Obama Transgender Proof’ and ‘Michelle Obama is a man.’
These results appeared in Bing’s video search engine.
And when users search with the typo ‘gril,’ they are shockingly shown sexually exploitative images of young girls.
The search also serves up suggestions that say ‘Cute Girl Young 16’ and ‘Boy and Girl sex.’
‘If you click that, it suggests “Cute Girl Young 12”, “Cute Girl Young 10,” and “Little Girl Modelling Provocatively,”‘ HowToGeek explained.
‘The results are filled with pornography of young-looking models. We hope they’re all 18 years of age or older, but who can say?’
It’s likely that users who had disabled SafeSearch on Bing would be shown even more offensive and disturbing search results.
However, these suggestions don’t appear in Bing’s standard auto-complete, meaning the problem is specific to the search suggestions on Bing Image Search and Video Search.
The problem appears to extend to Yahoo, which prioritizes Yahoo Answers posts in its search results, according to The Verge.
Searching ‘black people ar’ on Yahoo gave users a page called ‘Are Black People Born Stupid.’
Microsoft, which owns Bing, has since pledged to ‘take action’ on the issue.
‘We take matters of offensive content very seriously and continue to enhance our systems to identify and prevent such content from appearing as a suggested search,’ Jeff Jones, senior director at Microsoft, told HowToGeek.
‘As soon as we become aware of an issue, we take action to address it.’
It remains unclear if the firm is just scrubbing the suggestion boxes for the searches outlined by HowToGeek, or if it’s overhauling its entire algorithm.
Bing isn’t the first search engine to be called out for serving up racist and offensive search suggestions.
Google was ensnared in a similar issue in 2016, when the Guardian discovered that searches about the Holocaust gave users pages denying it took place.
IS GOOGLE AUTOCOMPLETE RACIST?
A 2013 study claimed that internet giant Google’s search facility ‘perpetuates prejudices’.
The investigation, by Lancaster University, found that results from Google’s auto-complete internet search tool produce suggested terms which could be viewed as racist, sexist or homophobic.
The study by a team at Lancaster University’s Faculty of Arts and Social Sciences comes as a German federal court has told Google to clean up the results its search engine suggests.
The court said Google must ensure that terms generated by auto-complete, which reflect the questions people are asking, are not offensive or defamatory.
The FASS study found some shocking results in its UK study, which drew out more than 2,600 questions on the Google search tool and categorized the answers.
It warns that ‘humans may have already shaped the internet in their image, having taught stereotypes to search engines.’
The research revealed high proportions of negative evaluative questions for black people, gay people and males.
For black people, these questions involved constructions of them as lazy, criminal, cheating, under-achieving and suffering from various conditions such as dry skin or fibroids.
Gay people were negatively constructed as contracting AIDS, going to hell, not deserving equal rights, having high voices or talking like girls.
The negative questions for males positioned them as catching thrush, under-achieving and treating females poorly.
A Google spokesperson said the system was entirely automated.