Bots account for a ridiculous share of web traffic, tying up resources and bandwidth, and the go-to response to this is... basically nothing. You can maybe throw Cloudflare in front of your site, and that's it.
Within the last year it's gotten much worse, with AI bots vacuuming up my content on my dime.
Why is there no community-driven attempt to block them? Is this problem just not solvable?
I’ve done a fair bit of scraping for various things. Companies that attempt to defeat scraping often ruin the user experience along the way.
As one example I'm aware of, there's an airline that keeps rolling out all sorts of measures to defeat scrapers.
Problem is, the site now constantly throws errors for regular users doing searches, and people regularly get banned for running too many of them.
And for all that, the airline still hasn't accomplished much beyond making scrapers bump their retry counts.
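To make that concrete, here's a minimal sketch of the kind of retry loop a scraper adds in response. The URL, status codes, and limits are my own illustrative assumptions, not the airline's actual behavior:

    import random
    import time

    import requests

    # Hypothetical target and limits -- purely illustrative.
    URL = "https://airline.example/search?from=SFO&to=JFK"
    MAX_RETRIES = 8

    def fetch(url: str) -> requests.Response:
        """Retry with exponential backoff until the anti-bot errors stop."""
        for attempt in range(MAX_RETRIES):
            resp = requests.get(url, timeout=30)
            # Treat rate limits and flaky "defenses" as retryable.
            if resp.status_code in (403, 429, 500, 503):
                # Backoff with jitter: ~1s, 2s, 4s, ... plus noise.
                time.sleep(2 ** attempt + random.random())
                continue
            resp.raise_for_status()
            return resp
        raise RuntimeError(f"gave up after {MAX_RETRIES} attempts")

A few seconds of sleep on the scraper's side; broken searches and bans on the users' side.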