Yes, exactly! Everyone always says they want to use a website that doesn't "censor" content. ("Censor" in scare quotes there because people treat it like it's a magical word even when they're not treating it like the greatest possible offense -- I don't quite remember whether I started screening all comments by default before or after this happened, but if it was after, my sincere apologies to everyone who saw any part of the "censorship is a worse crime than child sex abuse" argument; that's when I realized I both needed to step away from the keyboard for a while and also leave comments screened-by-default unless I was directly in front of the keyboard and actively monitoring.)

If you actually look at how people behave, though, the vast majority avoid unmoderated or undermoderated spaces, and the people who don't avoid them are the ones who've been kicked off every other service -- usually for cause -- and are shitty fucking neighbors. People do not actually want to hang out on a website that is nothing but spam and neo-Nazis! They just have no idea how much careful effort it takes to keep a website from becoming wall-to-wall spam and neo-Nazis, because good content moderation is invisible: they never see the amount of stuff that gets removed, or the sheer volume of shit-tier spam and garbage that would cover 95% of any even mildly active site within 24 unmoderated hours. I mean, Dreamwidth gets a plethora of page-rank SEO spammers, and we are approximately nobody in terms of being able to drive traffic to a spammy site.
And inconsistency aside -- and inconsistent moderation is a problem, don't get me wrong -- there's also the problem that in every case except things like blatant spam, any incident that results in one (or more) people reporting something to the Trust & Safety team (God, I hate so much that that's what we eventually settled on calling ourselves, but that's a rant for another post's comments) automatically and inherently, just by the nature of the dispute, leaves half the people involved mad about the result. T&S takes the reported item down: the person who posted it is mad. T&S leaves the reported item up: the person who reported it is mad. And mad people post about their mad, so depending on people's networks, they're often left with the impression that a site's T&S team is biased or incompetent. I can't count the number of times I've seen someone complaining on Twitter, for instance, about a particular account being suspended or a tweet removed from view, and their replies fill up with "but they leave THIS stuff up!" -- when the reality is that "THIS stuff" either hasn't been reported, has been reported but the T&S queue is incredibly backed up again, or has some small nuance in phrasing or what-have-you that makes it not the same as the tweet that got taken down.
And yeah, what cloudsinvenice said about T&S teams being overworked, undersupported, psychologically traumatized (because yes, T&S work is trauma work), and expected to work outside their areas of cultural competency. And also outside their language competency! Like, we're a tiny company; we can't afford professional translation services. If a ToS report comes in about content in a language none of us are at all familiar with, like Klingon (specifically using a fictional language to avoid derailing, heh), we're just as dependent on Google Translate as anyone else would be. And Google Translate is terrible at nuance, and ToS decisions are all nuance.
I've said for a while that I'd love to make a video game that could give people a sense of what it's like doing T&S work, in the style of "Papers, Please" or the like -- here, have 30 seconds to decide whether this post is a violation of the Terms of Service! If you drop below a 95% accuracy rate or go over time too often, you get fired! -- but the simple fact that players could always be 100% confident the game would never show them a report about (or containing) CSAM with no warning, because it would be illegal to include that material, would be a level of psychological security that would lessen the impact about a thousandfold. It's impossible to convey the reality of what the work involves: the kind of content that gets reported, and the kind of awful, psyche-destroying things T&S removes from a site before anyone even knows it's there.
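(For the curious, the core loop of that hypothetical game is simple enough to sketch in a few lines of Python. The 30-second timer, the 95% accuracy floor, and getting fired for going over time too often come from the pitch above; everything else -- the Report class, the sample queue, the limit of three slow calls -- is invented for illustration, not any real tool of ours.)

    import time
    from dataclasses import dataclass

    @dataclass
    class Report:
        text: str           # the reported content
        is_violation: bool  # ground truth, for scoring the player

    TIME_LIMIT = 30.0       # seconds allowed per decision
    ACCURACY_FLOOR = 0.95   # drop below this and you're fired
    MAX_OVERTIMES = 3       # assumption: three slow calls and you're out

    def shift(queue):
        correct = total = overtimes = 0
        for report in queue:
            print("\nREPORT:", report.text)
            start = time.monotonic()
            # A plain terminal can't interrupt input() mid-keystroke,
            # so this sketch just measures how long you took and counts
            # the overtime after the fact.
            answer = input("Violation? [y/n] ").strip().lower()
            elapsed = time.monotonic() - start

            total += 1
            if elapsed > TIME_LIMIT:
                overtimes += 1
                print(f"Too slow: {elapsed:.0f}s.")
            if (answer == "y") == report.is_violation:
                correct += 1

            accuracy = correct / total
            # A real game would grade on a rolling window so one early
            # mistake isn't instantly fatal; this sketch is strict.
            if accuracy < ACCURACY_FLOOR or overtimes > MAX_OVERTIMES:
                print(f"FIRED. Accuracy {accuracy:.0%}, {overtimes} overtimes.")
                return
        print(f"Shift survived. Accuracy {correct / total:.0%}.")

    if __name__ == "__main__":
        shift([
            Report("B U Y  C H E A P  P A G E R A N K", is_violation=True),
            Report("I respectfully disagree with this post.", is_violation=False),
        ])

(And that toy version is exactly the point: it can be made "fair," with clean ground truth and content that can't hurt you, which is precisely what the real job never is.)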
The questions of what to allow and what to remove from a site are all hard problems with zero unambiguously correct answers, and when you're dealing with the scale of even a moderately active site, even a completely understandable rate of human error can turn into hundreds of thousands of pissed-off people. Nobody has the "correct" answers, because there are no correct answers. We just prefer not to patronize a service that abdicates all responsibility for even asking the hard questions, because as someone whose company wrestles with those hard and unanswerable questions regularly, quite frankly I kind of resent that they feel they can just handwave the problem and make all those questions magically go away. When you're committed to providing service at the margins, you need to be more conscientious about what you remove and what you keep, not less, and Cloudflare isn't even trying.
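(To make the scale point concrete, here's the back-of-the-envelope version. Every number here is assumed for illustration -- these are not Dreamwidth's figures or anyone else's real ones:)

    # Back-of-the-envelope only: a small, very human error rate still
    # turns into an enormous absolute number of wrong calls at scale.

    daily_decisions = 500_000   # assumed: moderation decisions per day
    error_rate = 0.01           # assumed: 1% human error, which is *good*

    wrong_per_day = daily_decisions * error_rate
    wrong_per_year = wrong_per_day * 365

    print(f"{wrong_per_day:,.0f} wrong calls a day")     # 5,000
    print(f"{wrong_per_year:,.0f} wrong calls a year")   # 1,825,000

(And per the earlier point, every single one of those wrong calls leaves at least one person mad, and posting about their mad.)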