Facebook says it made an A.I. tool that can detect revenge porn before it's reported
Facebook said Friday it's launching a new AI tool to detect revenge porn before it's reported. The company has recently come under fire for the working conditions of the contracted content reviewers who moderate posts on the site, so a successful AI detection tool could be a step in the right direction.

"Finding these images goes beyond detecting nudity on our platforms. By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram," Facebook's Global Head of Safety Antigone Davis said in a blog post.

The company is also launching a support hub for victims of revenge porn, called "Not Without My Consent," developed with experts and victims' organizations.