Our sources report that YouTube videos with thumbnails depicting women engaging in various sexual acts with horses and dogs populate top search results on the video platform. Some of these videos surface easily through YouTube's algorithmic recommendation engine, have racked up large view counts, and have been on the platform for months.
YouTube videos that actually display such acts are readily caught by the company's algorithmic filters, its user-reporting system, and its human content moderators. Harder to detect are videos that use graphic, offensive images only as thumbnails, paired with clickbait titles, to drive views and ad revenue.
This problem shows how bad actors can exploit YouTube, whether to generate ad revenue or for other illicit ends. The company has struggled with how little control it has over user-generated content. Although YouTube employs both content-flagging algorithms and human moderators, a new issue seems to surface every week, suggesting the company's moderation system remains ill-equipped to deal with illegal content.
This moderation problem has now expanded to include terrorism recruitment and propaganda videos, child exploitation content, and porn and other explicit material, among millions upon millions of other videos that are not advertiser-friendly. YouTube has made changes in a bid to appease advertisers, quell criticism, and improve the safety and legal standing of its product. Those changes include pledging to hire more human moderators, mass demonetizations and account bans, and updates to its terms of service and site policies.
YouTube began scrubbing the videos and accounts responsible for the bestiality thumbnail content from its platform as soon as it was notified of the issue. Our sources report that YouTube said it took down a total of 8.28 million videos in 2017, about 80 percent of which were first flagged by its artificial intelligence-powered content moderation system. Yet as long as YouTube relies mostly on software to address problem videos, it may have to keep manually scrubbing content like bestiality thumbnails from its platform.