Just a thought - if you design a system to prevent AI crawlers, instead of booting them off, serve crypto-mining JavaScript instead. It would be very funny.

  • NaibofTabr@infosec.pub · 5 days ago (edited)

    There’s a functional difference between forcing a crawler to interact with code on your server that wastes its time, and getting it to download your code and run it on its own hardware. The issue is where the actual CPU/GPU/APU cycles happen: if they happen on your server, the miner isn’t benefiting you at all; it costs you the same as just running the cryptominer directly would.

    Any halfway intelligent administrator would never allow an automated routine to download and run arbitrary code on their own system; it would be a massive security risk.

    My understanding of Anubis is that it leads the crawler into a never-ending cycle of URLs that only lead to more URLs, containing no information of any value. The code that does this still runs on your server and just serves bogus links to the crawler.

    • lagoon8622@sh.itjust.works · 5 days ago

      My understanding of Anubis is that it just leads the crawler into a never-ending cycle of URLs

      That’s not how Anubis works; Anubis gates requests behind a browser-side proof-of-work challenge. You’re likely thinking of Nepenthes, which is the endless-link tarpit.
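
      The never-ending-URL behavior described above is the Nepenthes approach: every page a crawler fetches deterministically yields more bogus pages. A minimal sketch of the idea (hypothetical, not Nepenthes’ actual implementation):

      ```python
      import hashlib

      def maze_links(path, count=5):
          """Derive 'count' bogus child URLs from any path, so each page
          a crawler fetches only links to more worthless pages.
          Deterministic, so the maze is stable across requests without
          the server storing any state."""
          links = []
          for i in range(count):
              token = hashlib.sha256(f"{path}/{i}".encode()).hexdigest()[:12]
              links.append(f"{path.rstrip('/')}/{token}")
          return links
      ```

      A request handler would render these as anchor tags; real visitors never see them because the entry links are hidden or excluded via robots.txt.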

    • fruitycoder@sh.itjust.works · 5 days ago

      “would never allow an automated routine to download arbitrary code”: JavaScript and WASM are the leading technologies for doing exactly that. Make them essential for loading content, and bypassing them requires bespoke solutions depending on the framework and implementation.
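
      This is essentially what Anubis does: the server hands the browser a JavaScript proof-of-work challenge and only serves content once a valid answer comes back. A hedged Python sketch of the server side (function names and difficulty are assumptions, not Anubis’ real API):

      ```python
      import hashlib
      import secrets

      DIFFICULTY = 4  # required leading hex zeros; illustrative, not Anubis' setting

      def issue_challenge():
          # Random nonce the client must extend until the hash meets the target.
          return secrets.token_hex(16)

      def verify(challenge, answer):
          # Cheap for the server to check; expensive for the client to find.
          digest = hashlib.sha256(f"{challenge}{answer}".encode()).hexdigest()
          return digest.startswith("0" * DIFFICULTY)

      def solve(challenge):
          # What the browser-side JS/WASM would do: brute-force an answer.
          n = 0
          while not verify(challenge, str(n)):
              n += 1
          return str(n)
      ```

      The cost asymmetry is the point: each page view costs a scraper real CPU time, which is negligible for one human visitor but adds up fast for a crawler hitting millions of URLs.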