Islamic channel falls foul of YouTube’s ‘ham-fisted’ censorship
A popular Islamic video channel, Merciful Servant, which has more than two million subscribers, was briefly shut down by YouTube in December without any stated reason, as the Google-owned platform continued to remove content in line with its Community Guidelines.
Google had brought in new moderators at the start of 2018 to spot and remove fake, misleading and extreme videos, and shared its findings with law enforcement and other relevant government entities in the U.S. and elsewhere. Google’s Community Guidelines Enforcement Report follows a U.S. Congressional hearing about how YouTube monitors and deletes such content from the platform, including videos depicting violent extremism and hateful, graphic content.
In the third quarter of 2018, 1.67 million channels were removed, Google said. Owners of some of these channels, including ones with millions of subscribers, have complained about their accounts being removed and then reinstated as a result of what they call a ham-fisted approach.
One of these was Merciful Servant, which on December 12 posted a video saying its channel was shut down without any notice, and that its followers received messages saying the channel’s YouTube account had been terminated.
“After logging in, we got a notification from YouTube telling us that our channel was suspended – and all our videos, eight years of work, were completely gone. This happened without any warning and zero copyright issues,” Merciful Servant said in a video that has been posted on its YouTube channel, which was subsequently restored.
A day after Merciful Servant released its ‘Our channel got shutdown’ video, YouTube wrote in a post on its official blog: “We are committed to tackling the challenge of quickly removing content that violates our Community Guidelines and reporting on our progress.”
The statement from YouTube says that as a move towards greater transparency, it is “expanding the report to include additional data like channel removals, the number of comments removed, and the policy reason why a video or channel was removed.”
However, despite this promise of transparency, Merciful Servant says that while the channel was restored, it was given no explanation. “We contacted YouTube support immediately, explaining what our channel is about and asking for the reason why it was shut down. A few hours later, it was restored. But unfortunately, YouTube did not explain the reason why it was shut down.”
According to YouTube’s policies, channels are banned after three strikes within 90 days, or a single egregious violation.
“The fact that this could happen to the biggest Muslim channel is a very worrying thing,” Merciful Servant said. It has now stopped relying solely on what it calls “a third-party company or organisation” (read: YouTube) and has created a new membership system.
LOSS OF REVENUE
YouTube’s new policy of removing entire channels, instead of focusing disciplinary action on specific videos, has affected users globally. Some of these users rely on YouTube as a source of revenue. Their loss of revenue has been dubbed the “Adpocalypse”, a reference to YouTube’s stricter guidelines for the monetisation of content.
YouTube’s authoritarian approach to content, removing first and providing no answers, has its users scrambling for alternatives. Rivals such as Twitch have already emerged, chipping away at its dominance. Others, such as Merciful Servant, are seeking to create their own platforms.
“We will continue uploading here on YouTube for as long as we are able… But if our channel is terminated again we will have no way to deliver content to [our viewers] unless you become a member,” Merciful Servant said. Members will receive access to Merciful Servant content on a platform it owns.
Many content creators complain of what they call harassment. YouTube’s ever-evolving algorithm automatically flags keywords, headlines and even sounds to decide the fate of each channel, and creators say many of its decisions are unfair.
Dodgy copyright claims and other issues mean that even original content creators can take a hit to their income if someone alleges their content violates guidelines. In one case, a blogger said a nature video he shot himself received a copyright claim from a relaxation music channel because it contained the sound of waves.
YouTube is also responding to advertiser pressure after companies including Coca-Cola and Amazon pulled ads from the platform on discovering they were running alongside hate speech and violent extremist content.
REMOVED IN ERROR
YouTube acknowledges that its moderators, both human and machine, are prone to errors.
In an emailed statement reported by Bloomberg in March, YouTube said: “As we work to hire rapidly and ramp up our policy enforcement teams throughout 2018, newer members may misapply some of our policies resulting in mistaken removals.” It pledged to “reinstate any videos that were removed in error.”
Of the 7,845,400 videos removed during the third quarter of 2018, 81 per cent were originally detected by machines. Of these, 74.5 per cent never received a single view, YouTube said. More than 90 per cent of channels and over 80 per cent of videos removed in September were for violating policies on spam or adult content.
The update also reported on the combination of machine learning and human reviewers that YouTube uses to flag, review and remove spam, hate speech and other abuse in comments on its platform. During the period, more than 224 million comments were removed for violating YouTube’s Community Guidelines, 99.5 per cent of them through automated flagging.
(Reporting by White Paper Media; Editing by Emmy Abdul Alim emmy.alim@refinitiv.com)
© SalaamGateway.com 2019 All Rights Reserved