• potatopotato@sh.itjust.works · 1 day ago

    Currently Anubis seems to be the standard for slowing down scrapers

    https://github.com/TecharoHQ/anubis
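
    Anubis's approach (per its README) is a SHA-256 proof-of-work challenge the visitor's browser has to solve in JavaScript before the page loads: cheap for one human, expensive at scraper volume. A minimal sketch of that idea in Python (my own illustration of the mechanism, not Anubis's actual code; the difficulty value is made up):

    ```python
    import hashlib
    import os

    DIFFICULTY_BITS = 16  # hypothetical; tune up to burn more client CPU per visit

    def leading_zero_bits(digest: bytes) -> int:
        # count the leading zero bits of the hash
        value = int.from_bytes(digest, "big")
        return len(digest) * 8 - value.bit_length()

    def solve(challenge: bytes) -> int:
        # what the visitor's browser does (in JS): brute-force a nonce
        nonce = 0
        while True:
            h = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
            if leading_zero_bits(h) >= DIFFICULTY_BITS:
                return nonce
            nonce += 1

    def verify(challenge: bytes, nonce: int) -> bool:
        # what the server does: one cheap hash to check the expensive answer
        h = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        return leading_zero_bits(h) >= DIFFICULTY_BITS

    challenge = os.urandom(16)  # server issues a fresh random challenge
    print(verify(challenge, solve(challenge)))  # True, after ~2**16 hashes of work
    ```

    The asymmetry is the whole point: verification is one hash, solving is tens of thousands, and a crawler hitting millions of pages pays that cost on every one.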

    There are also various poison and tarpit systems which will serve scrapers infinite garbage text or data designed to aggressively corrupt the models they’re training. Basically you can be as aggressive as you want. Your site will get scraped and incorporated into someone’s model at the end of the day, but you can slow them down and make it hurt.
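
    To show how cheap the tarpit side is, here is a minimal sketch (a hypothetical illustration, not any particular project's code; the word list and port are made up): an endpoint that drips an endless stream of generated junk to whatever ignores robots.txt, tying up the scraper's connection while feeding garbage into its training data.

    ```python
    import random
    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    WORDS = ["synergy", "quantum", "paradigm", "blockchain", "vertical",
             "stochastic", "entropy", "pipeline", "holistic", "garbage"]

    class TarpitHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "text/html; charset=utf-8")
            self.end_headers()  # no Content-Length: the stream never ends
            try:
                while True:
                    sentence = " ".join(random.choices(WORDS, k=12)).capitalize()
                    self.wfile.write(f"<p>{sentence}.</p>\n".encode())
                    self.wfile.flush()
                    time.sleep(2)  # drip slowly: waste their time, not your CPU
            except (BrokenPipeError, ConnectionResetError):
                pass  # the scraper finally gave up

    HTTPServer(("", 8080), TarpitHandler).serve_forever()
    ```

    In practice you'd only route suspected bots here (e.g. via paths disallowed in robots.txt) so real visitors never see it.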

    • David J. Atkinson@c.im · 1 day ago

      @potatopotato @selfhosted Black Ice exists. Software is hand-to-hand combat. The most #cyberpunk sentence I’ve read today:

      “There are also various poison and tarpit systems which will serve scrapers infinite garbage text or data designed to aggressively corrupt the models they’re training.”