• nimble@lemmy.blahaj.zone
      3 months ago

      LLMs hallucinate and are generally willing to go down rabbit holes. So if you have some crazy theory, you’re more likely to get a false positive from ChatGPT.

      So I think it just exacerbates things more than the alternatives do.

    • Alphane Moon@lemmy.world
      3 months ago

      I have no professional skills in this area, but I would speculate that the fellow was already predisposed to schizophrenia and the LLM just triggered it (this can happen with other things too, like psychedelic drugs).

      • zzx@lemmy.world
        3 months ago

        Yup. LLMs aren’t making people crazy, but they are making crazy people worse.