mozz@mbin.grits.dev to Technology@beehaw.org · 6 months ago
Someone got Gab's AI chatbot to show its instructions (mbin.grits.dev)
Pup Biru@aussie.zone · edited, 6 months ago
anyone who enables a company whose “values” lead to prompts like this doesn’t get to use the (invalid) “just following orders” defence
Icalasari@fedia.io · edited, 6 months ago
Oh, I wasn’t saying that. I was saying the person may not be stupid, and may figure their boss is a moron (the prompts don’t work, as LLM chatbots don’t handle negative instructions in their prompts very well)