TECH NEWS

 Hello, ladies and gents! This is the Viking, and today we're talking about how

Facebook has doubled bullying and harassment takedowns since last year

It’s ramping up moderation after a COVID-19 dip last year

On Thursday, Facebook released a new moderation transparency report showing a marked uptick in bullying and harassment enforcement, which peaked at 6.3 million total takedowns in the last quarter of 2020. That's up from 3.5 million pieces the previous quarter and 2.8 million in the fourth quarter of 2019. The company said much of the change is due to improvements in the automated systems that analyze Facebook and Instagram comments.

Facebook’s latest transparency report covers October to December 2020, a period that includes the US presidential election. During that time, the main Facebook network removed more harassment, organized hate and hate speech, and suicide and self-harm content. Instagram saw significant jumps in bullying and self-harm removals. The company says its numbers were shaped by two factors: more human review capacity and improvements in artificial intelligence, especially for non-English posts.

The company also indicates it will lean on automation to address a growing amount of video and audio on its platforms, including a rumored Clubhouse competitor. “We’re investing in technology across all the different sorts of ways that people share,” said CTO Mike Schroepfer on a call with reporters. “We understand audio, video, we understand the content around those things, who shared it, and build a broader picture of what’s happening there.”

Facebook hasn’t confirmed the existence of a Clubhouse-like audio platform, but “I think there’s a lot we’re doing here that can apply to these different formats, and we obviously look at how the products are changing and invest ahead of those changes to make sure we have the technological tools we need,” he said.

Facebook pushed some moderation teams back into offices in early October. Although it said in November that most moderators were working remotely, it has also said that some sensitive content can't be reviewed from home. Now, the company says the increased moderation capacity has helped Facebook and Instagram remove more suicide and self-injury posts.

Facebook removed 2.5 million pieces of violating content, compared to 1.3 million pieces the preceding quarter, and Instagram removed 3.4 million pieces, up from 1.3 million. That’s comparable to pre-pandemic levels for Facebook, and it’s a significant absolute increase for Instagram.

And as always, have a chilled day, from the Viking.