
Meta's Oversight Board assesses how explicit deep fakes are handled

The Oversight Board is an independent expert panel created by Meta in 2020 to review the company's most difficult content moderation decisions on Facebook, Instagram and Threads.

On Tuesday, the board said it would review Meta's handling of two cases involving AI-generated sexually explicit images depicting female public figures.

The board has 20 members worldwide, including lawyers, human rights activists, journalists and scientists.

It operates independently of Meta, has its own staff and budget, and can make binding decisions about content that Meta must implement, unless doing so would violate the law.

The board can also make non-binding policy recommendations to Meta.

In a blog post, the Oversight Board stated: “Today the Board announces two new cases for review. As part of this, we invite people and organizations to submit public comments.”

The first case involves an AI-generated nude image posted on Instagram that was intended to resemble a public figure from India.

According to the board: “The image was created using artificial intelligence (AI) to resemble a public figure from India. The account that posted this content only shares AI-generated images of Indian women. The majority of users who reacted have accounts in India, where deepfakes have become a growing problem.”

The second case concerns a picture posted in a Facebook group dedicated to AI creations. It shows an AI-generated naked woman resembling an American socialite being groped by a person.

The Oversight Board noted: “It shows an AI-generated image of a nude woman while a man gropes her breast. The image was created using AI to resemble an American public figure, who is also named in the caption. The majority of users who reacted have accounts in the United States.”

Initially, Meta allowed the image of the Indian public figure to stay on Instagram, but later removed it for violating the Community Standard on Bullying and Harassment after the Oversight Board selected the case for review.

Meta removed the image of the American public figure for violating the same policy, specifically the clause prohibiting “derogatory sexualized photoshop or drawings.”

The Oversight Board stated that it “selected these cases to assess whether Meta's policies and enforcement practices are effective at addressing explicit AI-generated imagery. This case aligns with the Board's strategic priority of gender.”

As part of its review process, the Oversight Board is seeking public comment on various aspects of the cases, including “the nature and severity of the harms caused by deepfake pornography,” “contextual information about the use and prevalence of deepfake pornography worldwide,” and “strategies for how Meta can combat deepfake pornography on its platforms.”

The public comment period will remain open for 14 days and close on April 30. The Oversight Board will then deliberate on the cases and issue its decisions, which are binding on Meta. Any policy recommendations from the Board are non-binding, but Meta must respond to them within 60 days.

The watchdog's announcement comes amid growing concern about the spread of non-consensual deepfake pornography targeting women, particularly celebrities.

Taylor Swift was the most prominent target, with AI-generated explicit images of the singer triggering a digital manhunt for the perpetrator.

In response to the growing threat of deepfake pornography, lawmakers introduced the DEFIANCE Act in January, which would allow victims of non-consensual deepfake pornography to sue if they can prove the content was created without their consent.

Congresswoman Alexandria Ocasio-Cortez, who sponsored the bill and has herself been a target of deepfake pornography, stressed the need for Congress to take action to support victims as deepfakes become more accessible.
