Headlines Of The Day
India accounts for over 25% of videos removed by YouTube in June quarter
India accounted for the highest number of videos removed by YouTube for violating community guidelines in the quarter ending June, shows YouTube’s Community Guidelines Enforcement report for Q2 2022, released September 8. YouTube removed 1.3 million videos in India during the quarter, followed by 445,148 in the US, 427,748 in Indonesia, 222,826 in Brazil, and 192,382 in Russia.
In total, YouTube removed more than 4.4 million videos and 3.9 million channels globally during the quarter.
According to the report, 93% of the videos blocked globally were first flagged by automated tools; of these, 34% were removed before they could get a single view, while 38% had one to ten views before they were taken down. A further 256,109 videos were removed after being flagged by a user or a member of YouTube’s Trusted Flagger program.
The report shows that YouTube received appeals against the removal of 223,000 videos, out of which only 26,000 were reinstated.
Most of the removed channels were caught violating the platform’s spam policies. Around 89% were removed for spam, misleading content, and scams, while 4% were removed for nudity and sexual content and 2.5% for compromising child safety.
A channel can be deemed spam for excessive posting, repetitive content, false promises, and directing audiences to harmful websites, among other reasons, according to YouTube’s community guidelines. During the June quarter, 122,660 videos were removed for violating the platform’s misinformation policy, of which 35,000 were spreading misinformation about Covid-19.
The report also shows that YouTube cracked down on spam comments, removing over 754 million comments during the quarter. More than 98.8% of the removed comments were detected by automated tools.
Social media users are increasingly finding themselves on the wrong side of various platforms’ rules and community guidelines. Last week, Meta Platforms said in its monthly transparency report that it took action against 27 million posts on Facebook and Instagram. Of these, 17.3 million posts were found to be spam, while 2.7 million involved nudity and sexual activity. Live Mint