
Microsoft joins coalition to clean up revenge and deepfake porn from Bing

Microsoft has partnered with StopNCII to help remove non-consensual intimate images – including deepfakes – from its Bing search engine.

When a victim opens a “case” with StopNCII, the service creates a digital fingerprint, also called a “hash,” of an intimate photo or video stored on that person’s device, without requiring the file itself to be uploaded. The hash is then sent to participating industry partners, who can scan their platforms for matches and remove the content if it violates their policies. The process also applies to AI-generated deepfakes of a real person.
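The key privacy property described above is that only the fingerprint leaves the device, never the image. Here is a minimal sketch of that idea in Python; note that StopNCII actually relies on perceptual hashing (which tolerates resizing and re-encoding), whereas this example uses a plain cryptographic hash purely to illustrate the match-by-fingerprint flow, and the function names are hypothetical:

```python
import hashlib
from pathlib import Path


def fingerprint(path: Path) -> str:
    """Compute a digital fingerprint (hash) of a media file on-device.

    Real systems use perceptual hashes so near-duplicates still match;
    SHA-256 is used here only to show that the file itself never needs
    to be uploaded - only this short hex string is shared.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()


def matches_case_database(path: Path, case_hashes: set[str]) -> bool:
    """A participating platform checks content against the shared hash
    list, not against the victim's original images."""
    return fingerprint(path) in case_hashes
```

In this simplified flow, the victim's device computes `fingerprint()` locally, the resulting hash joins the shared `case_hashes` set, and each partner platform runs `matches_case_database()` against uploads on its own side.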

Many other technology companies have agreed to work with StopNCII to crack down on intimate photos shared without permission. Meta built the tool and uses it on its Facebook, Instagram and Threads platforms; other services partnering with the effort include Reddit, Snap, Niantic, OnlyFans, PornHub, Playhouse and Redgifs.

Not on that list, notably, is Google. The tech giant has its own tools for reporting non-consensual images. However, declining to participate in one of the few centralized efforts to scrub revenge porn and other private images arguably places an additional burden on victims, who must take a piecemeal approach to reclaiming their privacy.

In addition to efforts like StopNCII, the US government has taken some steps this year to directly address the harm done by non-consensual deepfake images. There have been calls for new legislation on the matter, and in July a group of senators moved to protect victims.

If you believe you have been a victim of non-consensual image sharing, you can open a case with StopNCII and file a removal request with Google; if you are under 18, you can file a report with NCMEC.

