• 0 Posts
  • 137 Comments
Joined 1 year ago
Cake day: July 9, 2023






  • That doesn’t seem worth it when you can fit that amount of storage in about 20 L of lithium-ion cells (think a small PC case), or roughly 40 L of sodium-ion cells, which are shaping up as a new alternative (rough math in the sketch below).

    Concrete production is already a big contributor to greenhouse gases (cement calcination releases a lot of CO2), so I can’t imagine this battery version improves things there. You’d probably have to wire your whole basement with electrodes just to access the stored energy.
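    To sanity-check those sizes, here’s a rough back-of-the-envelope sketch. The figures are my own assumptions, not from any article: a 10 kWh home battery, ~500 Wh/L volumetric density for lithium-ion cells, and ~250 Wh/L for sodium-ion cells.

    ```python
    # Rough volume check for the storage comparison above.
    # Assumed (hypothetical) figures: 10 kWh target capacity,
    # ~500 Wh/L for lithium-ion, ~250 Wh/L for sodium-ion.
    STORAGE_WH = 10_000  # 10 kWh

    DENSITY_WH_PER_L = {
        "lithium-ion": 500,
        "sodium-ion": 250,  # roughly half of lithium-ion
    }

    for chemistry, density in DENSITY_WH_PER_L.items():
        print(f"{chemistry}: {STORAGE_WH / density:.0f} L")
    # lithium-ion: 20 L
    # sodium-ion: 40 L
    ```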







  • Let me try putting this a different way: the machine picks the next best word / action / chess move to output based on its past experience of the world (i.e., its training data). It’s not just statistics: it’s making millions of learned connections between words, and through those associations the words start to carry meaning (see the toy sketch at the end of this comment).

    Is this not exactly what the human brain does itself? Humans just have the advantage of multiple senses and a physical agent (a body) with which to interact with the world.

    The problem AI has is that it has no grounding in reality. It’s like a human talking about fantasy things like unicorns: we’ve only ever experienced them as descriptions, and as art created from those descriptions, without any basis in reality.
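    As a toy sketch of that “pick the next word from past experience” idea, here’s a deliberately tiny bigram counter. This is not how a real transformer works; real models learn dense vector associations rather than raw counts, which is where those millions of learned connections come from.

    ```python
    from collections import Counter, defaultdict

    # Count which word follows which in a tiny "training set",
    # then pick the most likely continuation.
    corpus = "the cat sat on the mat the cat ate the fish".split()

    following = defaultdict(Counter)
    for word, nxt in zip(corpus, corpus[1:]):
        following[word][nxt] += 1

    def next_word(word):
        """Most frequently observed continuation of `word`."""
        return following[word].most_common(1)[0][0]

    print(next_word("the"))  # 'cat' (seen twice vs. 'mat'/'fish' once)
    ```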


  • I have adopted the philosophy that human brains might not be as special as we’ve thought, and that the emergent behavior of LLMs and image generators is so similar to human behavior that I can’t help but think of it as an underdeveloped and handicapped mind.

    I hypothesize that a human brain whose only perception of the world was the training data force-fed to it by a computer would have all the same problems LLMs do right now.

    To put it another way: the line between what is sentient and what is not is getting blurrier and blurrier. LLMs surpassed the Turing test a few years ago, and today we’re simulating roughly the intelligence of a small animal.


  • Because “hallucination” describes pretty much exactly what’s happening? All of your suggested terms are less descriptive of the actual issue.

    The definition of hallucination:

    A hallucination is a perception in the absence of an external stimulus.

    In the case of generative AI, it’s generating output that isn’t supported by its training-data “stimulus”: in other words, false statements, or “facts” that don’t exist in reality. A toy sketch of how that can happen follows.
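    This is my own illustration, not a claim about any specific model: once a model smooths its probabilities so that every continuation is possible, it can emit fluent sequences its training data never supported.

    ```python
    import random
    from collections import Counter, defaultdict

    # Train a tiny bigram model, then sample with add-k smoothing,
    # which gives unseen word pairs nonzero probability.
    corpus = "paris is in france . berlin is in germany .".split()
    vocab = sorted(set(corpus))

    counts = defaultdict(Counter)
    for word, nxt in zip(corpus, corpus[1:]):
        counts[word][nxt] += 1

    def sample_next(word, k=0.5):
        # Every vocab word gets weight k even if the pair was
        # never observed -- the opening for "hallucinations".
        weights = [counts[word][w] + k for w in vocab]
        return random.choices(vocab, weights=weights)[0]

    print("paris is in", sample_next("in"))
    # Usually "france" or "germany", but sometimes e.g.
    # "paris is in berlin": fluent, confident, and false.
    ```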