cross-posted from: https://discuss.online/post/32165111
I realize my options are limited, but what about any robots.txt-style steps? Thanks for any suggestions.
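If robots.txt is where you want to start, a minimal file that opts out of the better-known AI crawlers looks something like the sketch below. These user-agent tokens are the ones the vendors document publicly; compliance is entirely voluntary, so badly behaved scrapers will simply ignore it.

```
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: Bytespider
Disallow: /
```

Treat it as a polite notice rather than protection; the suggestions below are for the crawlers that don't ask.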
0. Take it off the public internet.
You can always go the tarpit route as well: https://zadzmo.org/code/nepenthes/
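The usual way to wire a tarpit like Nepenthes into an existing site is to run it on a local port and have your reverse proxy send a hidden path (one only crawlers will follow) to it. A rough nginx sketch, assuming Nepenthes listens on 127.0.0.1:8893; the port and the /nursery/ path are placeholders here, not Nepenthes defaults:

```
# Link to /nursery/ somewhere humans won't click (e.g. a CSS-hidden link),
# so the traffic that lands here is almost entirely crawlers.
location /nursery/ {
    proxy_pass http://127.0.0.1:8893;   # local Nepenthes instance (assumed address)
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
}
```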
Currently Anubis seems to be the standard for slowing down scrapers:
https://github.com/TecharoHQ/anubis
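For context, Anubis sits in front of the site as a reverse proxy and makes each client solve a small proof-of-work challenge before requests are passed through to the real backend. A docker-compose sketch of that layout; the image path and variable names here are from memory, so check the Anubis README before relying on them:

```
services:
  anubis:
    image: ghcr.io/techarohq/anubis:latest
    environment:
      BIND: ":8080"                  # where Anubis listens (assumed variable name)
      TARGET: "http://myapp:3000"    # backend it protects; "myapp" is a placeholder
      DIFFICULTY: "4"                # proof-of-work difficulty (assumed variable name)
    ports:
      - "8080:8080"
  myapp:
    image: myorg/myapp:latest        # stand-in for your actual site
```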
There are also various poison and tarpit systems which will serve scrapers infinite garbage text or data designed to aggressively corrupt the models they’re training. Basically you can be as aggressive as you want. Your site will get scraped and incorporated into someone’s model at the end of the day, but you can slow them down and make it hurt.
@potatopotato @selfhosted Black Ice exists. Software is hand-to-hand combat. The most #cyberpunk sentence I’ve read today:
“There are also various poison and tarpit systems which will serve scrapers infinite garbage text or data designed to aggressively corrupt the models they’re training.”
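To make the “infinite garbage” idea concrete: these systems answer every URL with an endless, slowly streamed wall of plausible-looking nonsense, so the crawler never runs out of pages and every byte it stores is junk. A toy Python/Flask sketch of the concept (not any particular project, and the word list is obviously made up):

```python
import random
import time

from flask import Flask, Response

app = Flask(__name__)

WORDS = ["quantum", "ledger", "artisanal", "protocol", "weasel",
         "synergy", "nebula", "varnish", "tuber", "oscillate"]

def endless_garbage():
    """Yield plausible-looking sentences forever, slowly."""
    while True:
        yield " ".join(random.choices(WORDS, k=12)).capitalize() + ". "
        time.sleep(2)  # drip-feed so the scraper's connection stays tied up

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def tarpit(path):
    # Every path resolves, so a crawler can never exhaust the "site".
    return Response(endless_garbage(), mimetype="text/plain")

if __name__ == "__main__":
    app.run(host="127.0.0.1", port=8080)
```

Real tarpits like Nepenthes do this far more thoroughly (Markov-generated text, fake link graphs), but the principle is the same.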
You could put your website behind a Cloudflare anti-bot check. But realistically, your website is public-facing and these bots are scraping the public web. They will eventually get the data from your website.
I’m wondering if you could run CrowdSec on the server and manually block the offenders if they are not already in the community blocklists.
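If CrowdSec is already running on the box, a manual block is just a decision added by hand, along these lines (the IP and duration are examples):

```
sudo cscli decisions add --ip 203.0.113.42 --duration 24h --reason "AI scraper"
```

Anything the community blocklists already cover gets banned without that step.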
Isn’t fail2ban a possibility too? I created a filter for ChatGPT and some others, and it feels like it’s working. My Radicale server is my only freely accessible service, but it comes with a small web GUI, so the bots showed up. I have no clue how much of your site a bot gets each time it shows up, but if I remember correctly the ban happens within about 300 ms, so it wouldn’t be that much information.
With maxretry set to 1 it bans on first sight.
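For anyone wanting to copy that approach, a fail2ban setup along those lines is roughly: a filter that matches known AI-crawler user agents in the web server access log, plus a jail with maxretry = 1 so the first hit is enough. A sketch only; the bot list, log path, and log format will differ per setup:

```
# /etc/fail2ban/filter.d/ai-bots.conf
[Definition]
failregex = ^<HOST> .*(?:GPTBot|ClaudeBot|CCBot|Bytespider|Amazonbot)
ignoreregex =

# /etc/fail2ban/jail.d/ai-bots.local
[ai-bots]
enabled  = true
port     = http,https
filter   = ai-bots
logpath  = /var/log/nginx/access.log
maxretry = 1
bantime  = 86400
```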
Anubis is your friend