Beep@lemmus.org to Technology@lemmy.world · English · 2 months ago
An AI Agent Published a Hit Piece on Me (theshamblog.com)
cross-posted to: technology@lemmy.zip, fuck_ai@lemmy.world, technology@beehaw.org, programming@programming.dev, hackernews@lemmy.bestiver.se
SkyezOpen@lemmy.world · 2 months ago
I'm hoping it's an attempt to poison the model and not someone encouraging a fake person to actually take a digital hit. Hell, maybe it's both by accident.

Glytch@lemmy.world · 2 months ago
Chatbots aren't even close to the level of "fake person," so it's an attempt to poison the model.

ToTheGraveMyLove@sh.itjust.works · 2 months ago
Lmao, LLMs aren't fake people, they're glorified auto-suggestions.