• BossDj@lemm.ee · 8 hours ago

    I meant that "fabrication" doesn't imply intent the way "lies" would.

    It seems like you're using the term "hallucination" correctly: for when the output has no relation to the input.

    In this case, as in many others, the AI took the input "cite a source" and output a citation as requested, but invented the content of the source. It fabricated, which means to make up, to create.

    "Fabricate" does not imply intent to deceive, whereas "lie" does.

    I will agree that if the output is purely unrelated to the input, "hallucination" is still fine, but it's absolutely a romanticized term when we're referring to this computer-generated output. It's literally personification.

    • jungle@lemmy.world · 5 hours ago

      Everything an LLM outputs is hallucinated. That’s how it works. Sometimes the hallucination matches reality, sometimes it doesn’t.
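
      To put that mechanically: at each step the model samples the next token from a probability distribution over its vocabulary, and nothing in that step checks the result against reality. A toy sketch in Python of that sampling step (made-up vocabulary and logits for illustration, not any real model's code):

      ```python
      import numpy as np

      # Toy next-token step: the model produces scores (logits) over a
      # vocabulary and a token is sampled from the resulting distribution.
      # Nothing here verifies whether the continuation is factually true.
      rng = np.random.default_rng(0)
      vocab = ["Paris", "Lyon", "Berlin", "Tokyo"]  # hypothetical vocabulary
      logits = np.array([3.2, 1.1, 0.7, 0.2])       # made-up scores

      probs = np.exp(logits - logits.max())
      probs /= probs.sum()                          # softmax -> probabilities

      token = rng.choice(vocab, p=probs)
      print(token)  # usually "Paris", but any of the four can come out
      ```

      A "true" completion and a "false" one come out of exactly the same mechanism; the only difference is whether the sampled token happens to match reality.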