cm0002@lemmy.world to Technology@lemmy.zip · English · 4 days ago
ChatGPT's hallucination problem is getting worse according to OpenAI's own tests and nobody understands why (www.pcgamer.com)
Optional@lemmy.world · 4 days ago
*raises hand* Because it never “understood” what any “word” ever “meant” anyway?

geekwithsoul@lemm.ee · 4 days ago
Yeah, it’s all hallucinations - it’s just that sometimes the hallucinations manage to approximate correctness, and it can’t tell one from the other.