• einlander@lemmy.world · 2 days ago

    The problem I see with poisoning the data is AIs being trained for law enforcement hallucinating false “facts” that are then used to arrest and convict people.

    • patatahooligan@lemmy.world · 2 days ago

      Law enforcement AI is a terrible idea, and it doesn’t matter whether you feed it “false facts” or not. There’s enough bias in law enforcement that the data is essentially always poisoned.

    • melpomenesclevage@lemmy.dbzer0.com · 2 days ago

      that’s the entire point of laws, though, and it was already being used for that.

      giving law enforcement better tools will not improve the laws. the law is malevolent. you cannot fix it by offering to help.

    • limonfiesta@lemmy.world · 2 days ago

      They aren’t poisoning the data with disinformation.

      They’re poisoning it with accurate, but irrelevant information.

      For example, if a bot is crawling sites related to computer programming or weather, this tool might lure the crawler into pages about animal facts or human biology.

    • sugar_in_your_tea@sh.itjust.works · 2 days ago

      Law enforcement doesn’t convict anyone; that’s a judge’s job. If a LEO falsely arrests you, you can sue them, and it should be pretty open-and-shut if it’s due to an AI hallucination. Enough of that and LEOs will stop it.

      • Jarix@lemmy.world · 20 hours ago

        More likely they will remove your ability to sue them, if you’re talking about the USA and many other countries.