YouTube removed over 8 million videos from its site in Q4

In December Google said it was hiring 10,000 people in 2018 to address policy violations across its platforms

Such videos also take up considerable space on YouTube's servers, so a purge appeared necessary.

The announcement comes alongside the launch of the Reporting History dashboard, which will allow YouTube users to see the status of videos that they have flagged.

The report states that out of the videos taken down, 75% were never actually seen by the public.

Google announced in its Community Guidelines enforcement report that it had removed 8.3 million videos from YouTube. The company claims its automated systems are performing well at moderating videos that show violent extremism and spam.

YouTube took down the 8.3 million videos in the last three months of 2017, responding to criticism that it has been slow to address inappropriate content on its site.

At 30%, sexual content was the leading reason videos received a flag, followed by spam (26%), hateful or abusive content (15.6%), violent or repulsive content (13.5%) and harmful or unsafe acts (7.6%). Of the 6.7 million videos flagged by automated systems, 76% were removed before anyone had watched them even once.

Corralling problematic videos, whether through humans or machines, could help YouTube, a major driver of Google's revenue, stave off regulation and a sales hit.

YouTube has contended that the volume of videos uploaded to the site is too great a challenge to rely on human monitors alone, so it uses machine-learning technology to spot objectionable content automatically. From October to December 2017, the company deleted a total of 8.3 million videos, which it notes represented "a fraction of a percent of YouTube's total views during this time period".

The firm notes that flags come from its automated flagging system, users and members of the Trusted Flagger program. In total, 1.5 million videos were removed after first being flagged by users.

The update comes almost a year after social media platforms like YouTube were accused of hosting extremist content, fake news and hate speech. The company said it had recruited "full-time specialists with expertise in violent extremism, counter-terrorism, and human rights", and it introduced machine learning for flagging content in June 2017.

Additionally, the video-sharing platform revealed the top 10 countries from which it receives the most human-flagged video reports.
