Sundray@lemmus.org to Fuck AI@lemmy.world · English · 15 days ago
In 2025, I've seen 12 people hospitalized after losing touch with reality because of AI. Online, I'm seeing the same pattern. (xcancel.com)
SGforce@lemmy.ca · 15 days ago
On second thought, it wouldn't work. LLMs don't have intent.
loutr@sh.itjust.works · 15 days ago (edited)
Yeah, they're always "just guessing"
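("Just guessing" is literally how decoding works: the model assigns every candidate token a probability and draws one at random from that distribution. A minimal sketch with toy logits and plain Python, no real model or library API implied:)

```python
import math
import random

def sample_next_token(logits: dict[str, float], temperature: float = 1.0) -> str:
    """Sample one token from a softmax distribution over toy logits.

    An LLM never 'knows' the next word; it scores every candidate
    token and draws one at random -- a weighted guess, every time.
    """
    scaled = {tok: l / temperature for tok, l in logits.items()}
    max_l = max(scaled.values())  # subtract max for numerical stability
    exps = {tok: math.exp(l - max_l) for tok, l in scaled.items()}
    total = sum(exps.values())
    probs = {tok: e / total for tok, e in exps.items()}
    return random.choices(list(probs), weights=list(probs.values()), k=1)[0]

# Even the most likely continuation is only the most probable guess.
print(sample_next_token({"yes": 2.0, "no": 1.5, "maybe": 0.5}))
```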
brucethemoose@lemmy.world · 15 days ago (edited)
It should basically always be on for multi-turn chats, heh. It doesn't need to determine anything, other than whether it's generating code or structured output, maybe.
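(A minimal sketch of that policy, assuming a hypothetical reasoning/thinking-mode flag and a hypothetical caller-supplied `task` label; no real inference API is implied:)

```python
def should_enable_reasoning(turn_count: int, task: str) -> bool:
    """Heuristic from the comment above: keep the reasoning mode on
    for multi-turn chats; the only case worth turning it off is code
    or structured-output generation.

    `task` is a label the caller supplies, not something detected
    automatically -- nothing here needs to "determine" anything.
    """
    if task in ("code", "structured_output"):
        return False  # the one exception, maybe
    return turn_count > 1  # "basically always on" for multi-turn chats

# Usage: a third-turn conversational request keeps reasoning on.
print(should_enable_reasoning(turn_count=3, task="chat"))  # True
print(should_enable_reasoning(turn_count=3, task="code"))  # False
```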