Moderating content on social media is incredibly difficult. People love to complain about it, and it's indeed imperfect, but it's a tough problem. I shared the story of Chloe before, but I recently came across a great article from Techdirt that gives some amazing statistics.
In a recent talk, Evelyn Douek, a Lecturer on Law & Doctoral Candidate at Harvard Law School, helped convey the scale of what social media companies are up against. During the course of her 30-minute talk:
- Facebook would take down 615,417 pieces of content
- YouTube would take down 271,440 videos, channels, and comments
- TikTok would take down 18,870 videos
- The Facebook Oversight Board would receive 48 petitions to review a takedown decision.
Those numbers are big enough, but you need to remember two more things:
- Those are just for a 30-minute chunk. We’re talking about 29 million pieces of content that Facebook removes every day.
- This doesn’t get into the amount of content that is reviewed and allowed to stay. Moderators and algorithms undoubtedly review and allow at least that much more content.
The answer?
That said, I still don’t have a good answer. The simple thought (and one often shouted on Facebook) is to just “stop moderating”, but that certainly wouldn’t work. As I shared with the story of Chloe before, moderators on social platforms have to remove some truly awful content, and we’re thankful that they do. As I’ve asked before, do you see much porn on Facebook? There’s a good reason for that, and just allowing all content to stay public isn’t a good answer. Even sites like Parler, which try their best to limit moderation, are struggling mightily with it.
Algorithms will improve and things may get better at some point, but for now I urge you to show some grace to those who try to keep sites clean, and to own your content if you want to speak freely without worrying about false-positive flags on your content.
For more, I encourage you to watch Evelyn’s presentation, which can be seen here: