Just a thought - if you design a system to prevent AI crawlers, instead of booting them off, serve crypto-mining JavaScript instead. It would be very funny.

  • Flax@feddit.ukOP · 6 days ago

    Isn’t that what Anubis was doing? Making the crawler run code so crawling wasn’t worthwhile, until people adjusted AI crawlers to run the code anyway?

    • plz1@lemmy.world · 6 days ago

      “Proof of work”. The AI crawlers don’t run JavaScript (yet, I don’t think), so to them it’s basically a firewall.
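      The proof-of-work idea above can be sketched in a few lines. This is a minimal hashcash-style illustration, not Anubis’s actual protocol; the function names and the difficulty value are made up for the example. The point is the asymmetry: the client burns many hashes finding a solution, the server verifies with one.

```python
import hashlib
import os

def make_challenge() -> tuple[str, int]:
    # Server issues a random nonce plus a difficulty (required leading zero bits).
    return os.urandom(8).hex(), 16

def solve(challenge: str, difficulty: int) -> int:
    # Client (normally in-browser JS) brute-forces a counter until the
    # SHA-256 of challenge:counter falls below the difficulty target.
    target = 1 << (256 - difficulty)
    counter = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{counter}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return counter
        counter += 1

def verify(challenge: str, difficulty: int, counter: int) -> bool:
    # Server re-checks with a single hash: cheap to verify, costly to solve.
    digest = hashlib.sha256(f"{challenge}:{counter}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty))
```

      A crawler that never runs the client-side solver never produces a valid counter, so it never gets past the challenge page, which is why it acts like a firewall.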

      • Little8Lost@lemmy.world · 6 days ago

        Some can, from what I understood.
        And not only JS but other code too, like SQL.
        I remember the somewhat recent case where someone vibe-coded something and the AI wiped the database.

    • NaibofTabr@infosec.pub · edited · 5 days ago

      There’s a functional difference between forcing a crawler to interact with code on your server that wastes its time, and getting it to download your code and run it on its own server. The issue is where the actual CPU/GPU/APU cycles happen. If they happen on your server, then it isn’t benefiting you at all; it costs you the same as just running the cryptominer yourself would.

      Any halfway intelligent administrator would never allow an automated routine to download and run arbitrary code on their own system; it would be a massive security risk.

      My understanding of Anubis is that it just leads the crawler into a never-ending cycle of URLs that just lead to more URLs while containing no information of any value. The code that does this is still installed and running on your server, and is just serving bogus links to the crawler.

      • lagoon8622@sh.itjust.works · 5 days ago

        “My understanding of Anubis is that it just leads the crawler into a never-ending cycle of URLs”

        That’s not how Anubis works. You’re likely thinking of Nepenthes.
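        The Nepenthes-style tarpit described above (an endless maze of bogus links) can be sketched like this. This is only an illustration of the idea, not Nepenthes’s actual code; the `/maze/` path and function name are invented for the example. Each page’s child links are derived deterministically from its own path, so every link the crawler follows resolves to another valid page of more links, forever, with no real content anywhere.

```python
import hashlib

def tarpit_page(path: str, links_per_page: int = 5) -> str:
    # Hash the requested path so child URLs are deterministic: revisiting
    # a page yields the same links, but the maze as a whole never ends.
    seed = hashlib.sha256(path.encode()).hexdigest()
    links = [f"/maze/{seed[i * 8:(i + 1) * 8]}" for i in range(links_per_page)]
    body = "\n".join(f'<a href="{href}">{href}</a>' for href in links)
    return f"<html><body>{body}</body></html>"
```

        Because the pages are generated on the fly from a hash, the server stores nothing and does almost no work per request, while the crawler wastes its crawl budget walking an infinite graph.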

      • fruitycoder@sh.itjust.works · 5 days ago

        “would never allow an automated routine to download arbitrary code”: JavaScript and wasm are the leading tech for doing exactly this. Make those essential for loading content, and bypassing them would require bespoke solutions depending on the framework and implementation.