Last year, the New York Times published a story about Pornhub's lax policies, which allowed users to upload videos depicting abuse.
The report also explains how Pornhub handles child sexual abuse material (CSAM). Pornhub detects CSAM on its website through its own moderation and through reports from the National Center for Missing and Exploited Children. Last year, the center submitted more than 13,000 reports of potential CSAM, of which 4,171 were unique and the rest were duplicates.
For moderation, Pornhub uses several detection technologies. In 2020, it scanned all previously uploaded videos with YouTube's CSAI Match, the video platform's proprietary technology for detecting child sexual abuse imagery. It also scanned all previously submitted photos with Microsoft PhotoDNA, which was developed for the same purpose. Pornhub will continue to use both technologies to scan all videos uploaded to its platform. In addition, the website uses Google's Content Safety API, MediaWise cyber-fingerprint software (to scan all new user uploads against previously identified offensive content) and Safeguard, a proprietary image recognition technology designed to combat both CSAM and videos taken without consent.
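The common thread in these tools is fingerprint matching: each upload is reduced to a compact fingerprint and checked against a database of fingerprints from previously identified material. The sketch below illustrates only that lookup workflow, with hypothetical function names; real systems such as PhotoDNA and CSAI Match use perceptual hashes that survive re-encoding, resizing and cropping, whereas the cryptographic hash here matches byte-identical files only.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Illustrative stand-in for a perceptual hash: a SHA-256 digest.
    # Unlike PhotoDNA-style hashes, this changes completely if the
    # file is re-encoded or edited in any way.
    return hashlib.sha256(data).hexdigest()

def is_known(upload: bytes, known: set[str]) -> bool:
    # Flag the upload if its fingerprint is already in the database
    # of previously identified content.
    return fingerprint(upload) in known

# Hypothetical database seeded with one previously identified file.
seen = {fingerprint(b"previously identified file")}

print(is_known(b"previously identified file", seen))  # True
print(is_known(b"new upload", seen))                  # False
```

Matching against known fingerprints is cheap and automatic, which is why it is applied to every upload; genuinely new material still has to be caught by human moderation and user reports.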