Pro@programming.dev to Technology@lemmy.world (English) · 8 months ago
Anthropic apologizes after one of its expert witnesses cited a fake article hallucinated by Claude in the company's legal battle with music publishers (chatgptiseatingtheworld.com)
dohpaz42@lemmy.world · 8 months ago
Can we normalize not calling them hallucinations? They're not hallucinations. They are fabrications; lies. We should not be romanticizing a robot lying to us.
Imgonnatrythis@sh.itjust.works · 8 months ago
Pretty ingrained vocabulary at this point. "Lies" implies intent. I would have preferred "errors."
Also, for the record, this is the most dystopian headline I've come across to date.