Denise (denise) wrote in dw_maintenance, 2023-06-27 08:48 pm
backend errors
Someone reminded me we hadn't made a top-level post about the "backend fetch error" that people have been getting in the evenings -- this is a problem related to site load. (It's also related to the increase in abusive traffic we've been seeing over the last several months or so, and we're not the only ones who've been noticing it; there's been a general internet-wide upswing in garbage lately.)
We're doing our very best to keep legitimate users from noticing the problem. When you get any kind of site-capacity-related error, wait a few minutes and try again: our systems have a number of ways to automatically fix the problem, but it takes a few minutes for them to bring more capacity online. Sometimes it can also take a few rounds of "add more capacity, see if that fixes the problem" before the issue fully resolves. We've been looking into long-term fixes that reduce the amount of garbage traffic we're getting and the chance that it will overwhelm our capacity, but we're already getting feedback from legitimate users that the traffic-filtering steps we're taking are causing problems accessing the site, and we're trying to avoid making that worse.
We're really sorry about the hassle! Things may be a little rocky in the evenings US time for the next week or two while we work out the best long-term solutions, because evening in the US is both a period of higher legitimate activity for us (because of our high number of US users) and a period of higher abusive traffic (because it's the beginning of the day in the countries most of our abusive traffic comes from). If you do get an error, waiting 3-5 minutes and trying again should resolve it.
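For anyone accessing the site from a script or API client rather than a browser, that advice translates directly into a retry loop with a generous wait between attempts. A minimal sketch, with placeholder URL, attempt count, and timings rather than anything official:

```python
import time
import urllib.error
import urllib.request

def fetch_with_retry(url, attempts=4, wait_seconds=240):
    """Fetch a URL, retrying on capacity errors with a long pause.

    Illustrative only: the 4-minute wait mirrors the "wait 3-5 minutes"
    guidance above, and the attempt count is an arbitrary placeholder.
    """
    for attempt in range(1, attempts + 1):
        try:
            with urllib.request.urlopen(url, timeout=30) as response:
                return response.read()
        except urllib.error.HTTPError as err:
            # 5xx responses (including the 503 behind "backend fetch" errors)
            # are the capacity errors described above; anything else is a
            # genuine failure we shouldn't mask by retrying.
            if err.code < 500 or attempt == attempts:
                raise
        time.sleep(wait_seconds)  # give autoscaling time to bring capacity online
```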
"there's been a general internet-wide upswing in garbage lately"
Is there a known reason why? My uneducated guess would be AI.
Thanks for the hard work! ♥
Identified SEO backlink domain names get replaced with "SPAM" on posting, so it doesn't matter if the Google bots scan them before you can remove them; the spammer's already failed.
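That kind of filter is simple to sketch. Something in this spirit, purely illustrative (the blocklist, names, and regex here are made up, not the site's actual code), would do it:

```python
import re

# Hypothetical blocklist -- the real list would be maintained by the spam team.
BLOCKED_DOMAINS = {"example-seo-farm.com", "cheap-backlinks.example.net"}

def scrub_backlinks(post_text: str) -> str:
    """Replace links to known backlink-spam domains with the literal "SPAM".

    Sketch of the approach described above, not actual site code: the
    spammer's link is destroyed at posting time, so it no longer matters
    whether a crawler sees the entry before a human cleans it up.
    """
    pattern = re.compile(
        r"https?://(?:www\.)?(?:"
        + "|".join(re.escape(domain) for domain in BLOCKED_DOMAINS)
        + r")\S*",
        re.IGNORECASE,
    )
    return pattern.sub("SPAM", post_text)
```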
There's also the problem that backlink spammers don't actually check whether a particular website is giving them any real Google juice -- we actually had the Google crawler banned for a few months as splash damage from some of our bot detection systems, before we could figure out the full list of networks and user-agents they use, so no DW accounts were getting crawled at all for a while. The spam rate increased even though they were demonstrably getting zero benefit out of it. :/

Like 5% of the people who do this kind of thing are actually competent and clever and really good at working around restrictions you place on backlink spam, and the other 95% are warm bodies following a tutorial one of the 5% made and posted to one of the malicious-SEO-tactics forums. The 5% change up their tactics quickly, but the 95% never bother to check the dates on the tutorials.

I've found some of the tutorials people are following to spam us, and we've changed some things that make following them harder, but a lot of those tutorials live behind subscription services, and I'm not giving money to malicious SEO forum operators just to figure out how they're telling people to abuse our site, heh.
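For what it's worth, the standard way to avoid banning the real Google crawler as splash damage is the reverse-then-forward DNS check Google publishes for verifying Googlebot. A rough sketch of that check, purely illustrative and not the detection code described above:

```python
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Reverse-then-forward DNS check for Googlebot, per Google's published
    crawler-verification advice. Sketch only: real bot handling would also
    cache results and consult Google's published IP-range lists.
    """
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward-confirm: the claimed hostname must resolve back to the IP.
        return ip in socket.gethostbyname_ex(hostname)[2]
    except (socket.herror, socket.gaierror):
        return False
```

Requests that pass the check can be exempted from blanket network bans, so the rest of the bot-detection pipeline never has to guess at crawler ranges.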
I fear eventually we'll get to a point where much (maybe most?) Internet content is created by AIs and only consumed by AIs...
I don't blame the individual people who are doing the content generation and spamvertising, because it's a very good living for them, but even if you took them out of the equation and outsourced the whole thing to generative AI models, it wouldn't make much of a difference in the volume, because the volume is already overwhelming.

The problem is the advertising-based model of revenue generation and its perverse incentives: the costs get shifted to the platforms while the revenue goes to the people abusing them. I don't know what the solution is, honestly.

We're small enough that manual detection and removal still works, but there's only so far we can extend that -- especially since manual detection and removal requires the person doing it to have access to sensitive user data on the real accounts that are mixed into the garbage ones, which means we're limited in how many people we can let help us; we can't just hand out those permissions to anyone who volunteers. It's a giant fucking mess, and the spam problem has been around for decades by now without a solution, because it's a really fucking hard problem.
Ah. So it's not AI spambots, "just" spam resulting from late stage capitalism. Yay...? 🙃😑
Thank you so much for all that you do. We can be patient (at least I can).
So in other words, you won't stop trying to make Backend Fetch happen?
My wife earnestly assures me that my backend is fetching enough already.
- Erulisse (one L)