Lord Wiggle@lemmy.world to Not The Onion@lemmy.world · English · 26 days ago
Therapy Chatbot Tells Recovering Addict to Have a Little Meth as a Treat (futurism.com)
ExtremeDullard@lemmy.sdf.org · English · 26 days ago
Remember: AI chatbots are designed to maximize engagement, not speak the truth. Telling a methhead to do more meth is called customer capture.