ugjka@lemmy.world to Technology@lemmy.worldEnglish · 8 months agoSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeexternal-linkmessage-square291fedilinkarrow-up11.01Karrow-down113 cross-posted to: [email protected]
arrow-up11Karrow-down1external-linkSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeugjka@lemmy.world to Technology@lemmy.worldEnglish · 8 months agomessage-square291fedilink cross-posted to: [email protected]
minus-squareBakedCatboy@lemmy.mllinkfedilinkEnglisharrow-up56·edit-28 months agoApparently it’s not very hard to negate the system prompt…
Apparently it’s not very hard to negate the system prompt…
deleted by creator