
Google and Bing Put AI Deepfake Porn at Top of Some Search Results

According to a new report, Google and other popular search engines are consistently putting nonconsensual AI-generated deepfake porn at the top of some search results.

According to a report published by NBC News on Thursday, Google and Microsoft’s Bing include nonconsensual deepfake porn in top image search results, alongside tools that advertise the ability to create such material.

Deepfake porn involves realistically replacing an adult performer’s face with that of a celebrity or other person in nude or sexually explicit content.

NBC News found that deepfake pornographic images featuring the likenesses of female celebrities were the first images Google and other top search engines surfaced in searches for many women’s names and the word “deepfakes,” as well as general terms like “deepfake porn” or “fake nudes.”

The news outlet notes that the searches were conducted on these popular search engines with safe-search tools turned off.

According to the report, NBC News searched the names of 36 popular female celebrities combined with the word “deepfakes” on Google and Bing.

A review of the results found nonconsensual deepfake images and links to deepfake videos in the top Google results for 34 of those searches and the top Bing results for 35 of them.

More than half of the top results were links to a popular deepfake website or one of its competitors; the site has fostered a market for nonconsensual deepfake porn of celebrities and private figures.

The content featured in Bing’s top image search results includes fake nude photos of former teen Disney Channel actresses, and some of the images use pictures of their faces that appear to have been taken before they turned 18, according to a reverse image search conducted by NBC News.

An Online Black Market for Deepfake Porn

Meanwhile, NBC News reports that searching “fake nudes” on Google returned links to multiple apps and programs to create and view nonconsensual deepfake porn in the first six results. Similarly, on Bing, searching “fake nudes” returned dozens of results of nonconsensual deepfake tools and websites before surfacing an article about the harms of the phenomenon.

The report reveals a growing online black market for AI-generated deepfake porn, one that is easily discoverable through a Google or Bing search.

“We understand how distressing this content can be for people affected by it, and we’re actively working to bring more protections to Search,” a Google spokesperson says in a statement to NBC News.

“Like any search engine, Google indexes content that exists on the web, but we actively design our ranking systems to avoid shocking people with unexpected harmful or explicit content that they aren’t looking for.

“As this space evolves, we’re in the process of building more expansive safeguards, with a particular focus on removing the need for known victims to request content removals one-by-one,” the statement continues.

Microsoft reportedly did not respond to a request for comment, and NBC News could not locate Bing rules addressing generative AI content.


Image credits: Header photo licensed via Depositphotos.

