• 8oow3291d@feddit.dk · 9 hours ago

    LLMs don’t have any intentions.

    Eh. The output from LLMs is usually pretty goal-oriented, so it arguably has intentions.

    The LLM is not designed to deceive, though, so in that sense it is correct that its outputs are not lies.

    • supamanc@lemmy.world · 5 hours ago

      An LLM is a statistical modeling tool. It doesn’t have goals. It can’t have intentions. It just outputs according to an algorithm.
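The "statistical modeling tool" point above can be illustrated with a toy sketch (this is an assumption-laden caricature, not how a real LLM is built): if generation is just repeated sampling from a next-token probability distribution, then no goal or intention appears anywhere in the loop.

```python
import random

# Toy "model": a hand-written lookup table of next-token probabilities.
# A real LLM learns these distributions from data, but the generation
# loop is structurally similar: sample, append, repeat.
model = {
    "the": {"cat": 0.6, "dog": 0.4},
    "cat": {"sat": 0.7, "ran": 0.3},
    "dog": {"ran": 0.8, "sat": 0.2},
    "sat": {"<end>": 1.0},
    "ran": {"<end>": 1.0},
}

def generate(model, token, rng, max_len=10):
    """Sample tokens one at a time until <end> or max_len is reached."""
    out = [token]
    for _ in range(max_len):
        dist = model.get(token)
        if dist is None:
            break
        tokens = list(dist)
        weights = [dist[t] for t in tokens]
        token = rng.choices(tokens, weights=weights, k=1)[0]
        if token == "<end>":
            break
        out.append(token)
    return out

print(generate(model, "the", random.Random(0)))
```

The output looks "goal-oriented" (it forms a grammatical sentence), but nothing in the loop represents a goal; that tension is essentially what the two comments above are arguing about.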