The rise of generative artificial intelligence has brought incredible advances, but it has also brought darker consequences online. One of these problems is the synthetic replication of faces and bodies, commonly known as deepfakes, which can be used to easily create fake nude images that cause serious harm to individuals. Victims whose intimate photos are shared without their consent have found it nearly impossible to remove these images from the internet. Microsoft has now stepped in with a solution that may offer much-needed support.
Microsoft has made an important advance by introducing a new tool that enables victims of deepfakes to have these images removed from Bing search results. This initiative aims to address the proliferation of fake pornographic content online that appears to feature real people, a problem made worse by advances in artificial intelligence.
Microsoft moves to stop the spread of deepfake adult content
Microsoft’s partnership with StopNCII (Stop Non-Consensual Intimate Images) represents a vital collaboration in the fight against “revenge porn” and deepfake nude images. StopNCII enables victims to generate a digital fingerprint, or “hash,” of explicit images, whether real or AI-generated, directly on their own devices. This fingerprint is then used to detect and remove matching images across participating platforms, without the images themselves ever leaving the victim’s device.
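The hash-and-match idea can be sketched in a few lines of code. This is a minimal illustration, not StopNCII’s actual implementation: the function names and shared blocklist are hypothetical, and SHA-256 is used here as a simple stand-in even though a byte-level hash only catches exact copies, whereas production systems would need perceptual hashes that also match resized or re-encoded versions of an image.

```python
import hashlib

# Hypothetical blocklist of fingerprints shared with participating
# platforms. Crucially, only the fingerprint is submitted -- the image
# itself never leaves the victim's device.
blocklist: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest acting as the image's fingerprint.

    Simplified: SHA-256 of the raw bytes matches only exact copies.
    Real systems would use a perceptual hash to survive re-encoding.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def report_image(image_bytes: bytes) -> None:
    """Victim's device hashes the image locally and submits the hash."""
    blocklist.add(fingerprint(image_bytes))

def should_block(candidate_bytes: bytes) -> bool:
    """A platform checks content it encounters against the blocklist."""
    return fingerprint(candidate_bytes) in blocklist

# Example flow:
victim_photo = b"...raw image bytes..."
report_image(victim_photo)
print(should_block(victim_photo))      # True: an exact copy is flagged
print(should_block(b"other content"))  # False: unrelated content passes
```

The design choice worth noting is privacy: because matching happens on opaque hashes, platforms can block known images without ever receiving or storing the intimate content itself.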
Microsoft’s Bing is now part of this effort, joining the ranks of other major tech platforms such as Facebook, Instagram, TikTok, and Reddit. This is a critical addition to Microsoft’s existing safety measures, which previously relied on a direct reporting tool. While the direct reporting option allowed users to flag harmful content, it could not keep pace with the volume of offensive material. By integrating StopNCII’s digital fingerprinting technology, Microsoft aims to prevent these images from appearing in search results more effectively and proactively, and to help victims regain control over their personal images.
The ongoing struggle against deepfakes
AI-generated nude images have become a growing concern with widespread implications, especially in countries like the United States, where no federal law specifically targets AI deepfake pornography. Only 23 states have enacted laws addressing non-consensual deepfakes, leaving victims in the remaining states without adequate legal recourse. Microsoft’s partnership with StopNCII and the new tool on Bing may provide some relief in an environment where legal protections are inconsistent.
However, the problem is not limited to adults. High school students are also being targeted by “undressing” sites that create fake nude images, underscoring the need for stronger laws and more comprehensive protections. StopNCII’s tools currently cover only individuals over the age of 18. Yet the wider impact of these harmful technologies is clear, and the need for stronger measures to protect young users is undeniable.
Microsoft’s commitment to digital safety
Microsoft has long been aware of the harmful effects of sharing intimate images without consent and has taken steps to address the problem. In 2015, Microsoft launched a centralized reporting tool to help users report and request removal of non-consensual intimate images. As the technology behind synthetic images has evolved, so has Microsoft’s approach.
The company has also advocated for modernized laws to protect victims of AI-generated content and released policy proposals aimed at protecting women and children from online abuse. In addition, Microsoft continues to improve its internal policies to prevent the creation and distribution of such content on its platforms, including Bing, Xbox, and other consumer services.
Microsoft’s new tool is a significant step forward, but the fight against deepfakes is far from over. New technology brings new challenges that demand a rapid response, and while Microsoft is working with StopNCII and other organizations to combat deepfakes, broader participation is needed.
Microsoft encourages anyone worried about their intimate images being shared to use StopNCII’s tools and to report violations to Microsoft. Doing so helps stop the spread of these images and gives victims back some measure of control. While technology has enabled new forms of abuse, it is also providing new ways to fight back, and Microsoft’s latest move reflects that ongoing struggle.
Featured image credit: Johannes Krupinski / Unsplash