Meta “programmed it to simply not answer questions,” but it did anyway.

  • conciselyverbose@sh.itjust.works · 1 year ago

    The problem is that mathematical proofs rely on the basic premise that the underlying assumptions are rock solid and that the rules of the math are rock solid. It’s rigorous rules of logic, applied mathematically.

    The real world is Bayesian. Even our hard sciences like physics are only “mostly” true, which is how something like relativity could throw a wrench into them. There’s inherent uncertainty in everything, because it’s all measurement based, with errors, and more importantly, the relationships themselves have uncertainty.

    There is no “we know a^2 and b^2, so c^2 must be this”. It’s “we think this news source is generally reliable, and we think the sentiment of the article is that this crime was committed, so our logical assumption is that the crime was probably committed”. But no link in that chain is 100% (the toy calculation at the end of this comment shows how quickly that compounds). “Rock solid” sources get corrupted, generally with a time lag before the corruption is recognizable. Your interpretation of a simple article may be damn near 100%, but someone is still going to misread it, and a computer definitely can.

    Uncertainty is central to reality, down to the fact that even quantum phenomena have to be talked about probabilistically because uncertainty is built in all the way down.
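
    To make the compounding concrete, here’s a toy sketch (the numbers and the helper are made up, purely an illustration of how per-link confidence multiplies through a chain):

    ```python
    # Toy illustration: the confidence in a conclusion drawn through a chain
    # of "mostly reliable" steps is (assuming independence) the product of
    # the per-step reliabilities, so it ends up lower than any single step.

    def chain_confidence(link_probabilities):
        """Probability that every link in the chain holds, assuming independence."""
        confidence = 1.0
        for p in link_probabilities:
            confidence *= p
        return confidence

    # Hypothetical numbers: the source is reliable, the sentiment was read
    # correctly, and "the crime was committed" was inferred correctly.
    links = [0.95, 0.90, 0.97]
    print(chain_confidence(links))  # ~0.83 -- below every individual link
    ```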

    • rottingleaf@lemmy.world · 1 year ago

      You are describing LLMs, yes. But not what I’m describing.

      I’m talking about a machine that finds syllogisms and checks their correctness. That can’t be fully rock solid, because interpreting a statement in natural language with its fuzzy semantics never is, but everything after that step can be made rock solid (see the sketch at the end of this comment). In an LLM, not even that part is.

      That’s what I’m talking about.

      Humans make mistakes, but not the kind of mistakes that LLM-generated texts contain.

      I mean that one can build a reasoning machine, which an LLM is not.
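
      A toy sketch of what I mean (the encoding is made up; getting from natural language to these symbols is the fuzzy part, and everything after it is mechanical):

      ```python
      # Minimal forward-chaining checker: facts plus Horn-style rules
      # (premises -> conclusion). A conclusion "checks out" exactly when it
      # is derivable; nothing probabilistic happens after the encoding step.

      def forward_chain(facts, rules):
          """Return every fact derivable from the initial facts via the rules."""
          derived = set(facts)
          changed = True
          while changed:
              changed = False
              for premises, conclusion in rules:
                  if conclusion not in derived and premises <= derived:
                      derived.add(conclusion)
                      changed = True
          return derived

      # Hypothetical encoding of the classic syllogism:
      # "All men are mortal", "Socrates is a man" => "Socrates is mortal".
      facts = {"man(socrates)"}
      rules = [({"man(socrates)"}, "mortal(socrates)")]
      print("mortal(socrates)" in forward_chain(facts, rules))  # True
      ```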