This is an incredibly good and insightful comment; I love it!
The problem of inconsistent moderation is one of the hardest to solve -- too many companies view it as a loss leader and don't invest in the resources they need to do it well, and absolutely the whole issue is related to the engagement machine and its handmaiden, the Holy Metrics. It's axiomatic among T&S people that "content moderation is impossible to do well at scale", because of the sheer fucking volume involved. (Twitter algorithmically suspends over a million accounts a day as spam, for instance -- imagine a .01% error rate and you've still got a lot of unhappy people.) And there's always been the problem that it's impossible to gauge the effectiveness of content moderation from the outside, because people don't see the big picture, and they don't see the times a site gets the answer correct, they just see the well-publicized fuckups.
We certainly aren't immune to the fucking up, of course -- humans make mistakes, news at 11 -- but our lower volume and slower pace, and the fact we don't need to report completely unrealistic metrics to advertisers or VCs, means we can be slower, more thoughtful, and more careful with our content moderation decisions. We argue out anything we aren't 100% confident in as a group, and if we find we're having a lot of uncertainty around a given scenario, we regularly ask ourselves "okay, do we need to change the policy around X, or should we just clarify that X also applies to Y?" It's one of the reasons I love this place so much, because we can make decisions by balancing the various pressures when they wind up conflicting.
But you don't have to be us to recognize that a content policy (and indeed all T&S work) does need slightly more nuance than Cloudflare is apparently willing to put into it. Like I said in a few replies here, I can absolutely envision a team going through that thoughtful, nuanced, principled discussion process, weighing all of the various factors, and ending up with a policy that says "KiwiFarms isn't a violation of our AUP"; I likely wouldn't agree with a policy that has that result, but I can envision it happening, and I use sites whose policy nuance I disagree with all the time. Cloudflare's blog post from yesterday left me with zero confidence that they're even aware of the balancing factors that go into policy-crafting, and from a company that's been in business for over a decade doing what they're doing, that is just flat-out irresponsible. I would be the first in line to defend Cloudflare if the government tried to dictate their policy to them, but as a business owner, yeah, we're gonna spend our money with the people who aren't the beer distributor for every Nazi bar in town.