LLM and AI companies all seem to be in a race to breathe the last breath of air in every room they stumble into. This practice started with larger websites, ones that already had protection against malicious usage, like denial-of-service attacks and abuse, through services like Cloudflare or Fastly.
But the list of targets has been getting longer. At this point we're seeing LLM and AI scrapers targeting small project forges like the GNOME GitLab server.
How long until scrapers start hammering Mastodon servers? Individual websites? Are we going to have to require authentication or JavaScript challenges on every web page from here on out?
All this for what, shitty chatbots? What an awful thing these companies are doing to the web.
I suggest that everyone using cloud infrastructure for hosting set up a billing limit to avoid an unexpected bill in case they're caught in the crosshairs of a negligent company. The abusers all anonymize their usage at this point, so good luck trying to get compensated for damages.
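As a sketch of what that looks like on AWS (assuming the AWS CLI; the account ID, dollar amount, and email address below are placeholders you'd swap for your own), AWS Budgets can at least email you before the bill snowballs:

```shell
# A sketch, not gospel: creates a $25/month cost budget that emails you
# once actual spend crosses 80% of the limit. The account ID and email
# address are placeholders.
aws budgets create-budget \
  --account-id 123456789012 \
  --budget '{
    "BudgetName": "hosting-bill-alarm",
    "BudgetLimit": {"Amount": "25", "Unit": "USD"},
    "BudgetType": "COST",
    "TimeUnit": "MONTHLY"
  }' \
  --notifications-with-subscribers '[{
    "Notification": {
      "NotificationType": "ACTUAL",
      "ComparisonOperator": "GREATER_THAN",
      "Threshold": 80,
      "ThresholdType": "PERCENTAGE"
    },
    "Subscribers": [
      {"SubscriptionType": "EMAIL", "Address": "[email protected]"}
    ]
  }]'
```

Note that on AWS this alerts you rather than hard-capping spend, so the email is your cue to pull the plug yourself; other providers have their own equivalents, like GCP budgets and Azure cost alerts.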
Thanks for reading! ♡ This work is licensed under CC BY-SA 4.0.