San Francisco, Calif., Mar 16, 2019 / 3:50 pm
Facebook has announced that it will begin using artificial intelligence software to prevent and restrict the distribution of non-consensual intimate images, commonly known as "revenge porn."
In a March 15 statement, Antigone Davis, Facebook's global head of safety, said the new technology will detect nude videos or pictures distributed without permission on Instagram and Facebook.
"This means we can find this content before anyone reports it, which is important for two reasons: often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared," she wrote.
Davis said machine learning and artificial intelligence technology will flag the problematic material. The company's Community Operations team will then review whether the content violates Facebook's policies and, in most cases, disable the offending account if it does.