
Microsoft’s search engine Bing has had a rough time lately. In January, the search engine served up child pornography in response to a simple search. Nearly a year after the New York Times pointed the problem out to Microsoft, it appears that nothing was done to address it. Bing also suggested further terms linked to the initial search, which surfaced additional illegal images. Microsoft responded that it was doing its best to keep those images and terms in check, and that while its search engine wasn’t perfect, the company was committed to getting better all the time.
Not Using Their Own Innovations
Ten years ago, Microsoft was instrumental in creating software called PhotoDNA, which converts a photo into a digital signature that a computer can recognize even if the image has been resized or otherwise altered. Those signatures can then be compared against a database of known illegal images, and any match can be flagged. However, the unlawful photos that showed up in Bing’s search results make it clear that the company isn’t using this technology effectively to keep child pornography away from the site’s users.
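PhotoDNA itself is proprietary, so its actual signature algorithm isn’t public. The sketch below shows the general shape of this kind of matching using a simple perceptual “average hash” in Python; the function names, the stand-in hash database, and the distance threshold are all illustrative assumptions, not Microsoft’s implementation.

```python
from PIL import Image  # Pillow


def average_hash(path, size=8):
    """Reduce an image to a 64-bit perceptual fingerprint.

    Converting to grayscale and downscaling discards fine detail,
    so resized or re-encoded copies of the same photo still hash
    to nearly the same value.
    """
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > avg else 0)
    return bits


def hamming(a, b):
    """Count the bits that differ between two fingerprints."""
    return bin(a ^ b).count("1")


def matches_known(candidate_hash, known_hashes, max_distance=5):
    """Flag a candidate if it is near any hash in the database.

    `known_hashes` stands in for a database of fingerprints of
    previously identified illegal images; `max_distance` is an
    illustrative tolerance for small alterations.
    """
    return any(hamming(candidate_hash, h) <= max_distance
               for h in known_hashes)


# Hypothetical usage:
#   db = {average_hash(p) for p in known_bad_files}
#   if matches_known(average_hash("upload.jpg"), db):
#       ...  # flag the image for review
```

The point of a perceptual hash, as opposed to a cryptographic one, is that near-duplicate images produce near-identical fingerprints, so a small Hamming distance still counts as a match.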
In an experiment, the Times used more than three dozen search terms to perform queries on the search engine. When the images were returned (and blocked, since viewing images like these is illegal), the program the Times used recorded where on the internet those images came from. The Times then checked the photos against PhotoDNA, which confirmed that the pictures were indeed illegal. The results of the experiment raise questions as to whether Microsoft is serious about clamping down on this material.
Questions about Privacy
As with any discussion of censorship, privacy is a serious consideration, including the question of what search engines should have access to when it comes to user data. Many tech companies devote substantial resources to reviewing photos and videos on their sites for malware or copyright infringement. Searching for content that depicts abuse may cut into users’ privacy. It’s a moral dilemma that has tech companies weighing whether their users’ trust is more important than their users’ safety.