Why content moderation costs billions and is so tricky for Facebook, Twitter, YouTube and others

After the riots at the Capitol on January 6, debate is swirling over how platforms moderate content and what is protected as free speech.

It’s a messy and expensive process, with Facebook spending billions to review millions of pieces of content every day. While TikTok directly employs content moderators, Facebook, Twitter and YouTube outsource most of the grueling work to thousands of workers at third-party companies.

Many moderators in the U.S. and overseas say they need higher pay, better working conditions and better mental health support because of the terrible things they see while sifting through hundreds or thousands of posts every day.

In response, some companies are relying more on algorithms they hope can take over most of the dirty work. But experts say machines can’t detect everything, such as the nuances of hate speech and misinformation. There’s also a host of alternative social networks like Parler and Gab that rose to popularity primarily because they promised minimal content moderation. That approach led to Parler’s temporary suspension from Apple’s and Google’s app stores and from Amazon Web Services hosting.

Other platforms, like Nextdoor and Reddit, rely almost exclusively on large numbers of volunteers for moderation.

Watch the video to find out just how big the business of content moderation has become, and the real-world implications of the decisions social networks make about what content we can and cannot see.
