• Kichae@lemmy.ca · 7 months ago

    It “knows” as in it has access to the information and the ability to provide the right info for the right context.

    It doesn’t, though, any more than you have access to the information in a pile of 10 million shredded documents.

    • givesomefucks@lemmy.world · 7 months ago

      Right, in the case we’re talking about…

      Do you not understand how “answer unavailable” is a better answer than taking a small percentage of the strips of paper at random and filling in the rest with words that sound relevant?

      It’s like Mad Libs.

      • Ech@lemm.ee · 7 months ago

        taking a small percentage of the strips of paper at random and filling in the rest with words that sound relevant?

        It’s like Mad Libs.

        Right. They’re text generators. That’s the technology. It can’t do what you’re demanding because that’s not how it works. LLMs aren’t magic answer machines. They don’t know when to say “answer not available”. They don’t know what they’re being asked. They don’t know anything.
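        To make the “text generator” point concrete, here is a toy sketch in Python (purely illustrative; real LLMs use neural networks over token probabilities, not a hand-written word-frequency table like this). The point it shows: the mechanism only ever picks a plausible next word, so it always produces something fluent-looking, and nothing in it can decide to say “answer unavailable”.

```python
import random

# Toy "text generator": for each word, a table of which words followed it
# in some training text, with counts. (Hypothetical data, for illustration.)
FOLLOWERS = {
    "the": {"cat": 3, "dog": 2, "answer": 1},
    "cat": {"sat": 4, "ran": 1},
    "dog": {"sat": 1, "ran": 3},
    "sat": {"down": 5},
    "ran": {"away": 5},
}

def generate(word, steps=3):
    """Repeatedly pick a likely next word. There is no notion of a
    'question' being answered, and no code path for 'I don't know'."""
    out = [word]
    for _ in range(steps):
        options = FOLLOWERS.get(out[-1])
        if not options:
            break  # ran out of continuations, not "declined to answer"
        words = list(options)
        weights = [options[w] for w in words]
        out.append(random.choices(words, weights=weights)[0])
    return " ".join(out)

print(generate("the"))  # e.g. "the cat sat down": fluent, never "answer unavailable"
```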

      • wahming@monyet.cc · 7 months ago

        That is what LLMs do in EVERY conversation. Most of the time you don’t notice it, because it fits your expectations.

      • then_three_more@lemmy.world · 7 months ago

        You know that “answer unavailable” is better because you have real intelligence; an LLM is just some mathematical functions, so it can’t do that. If it could, it would be getting much closer to actually being AI.