shish_mish@lemmy.world to Technology@lemmy.world · English · 10 months ago
Researchers jailbreak AI chatbots with ASCII art -- ArtPrompt bypasses safety measures to unlock malicious queries
www.tomshardware.com
cross-posted to: [email protected]
Pope-King Joe@lemmy.world · English · 10 months ago
Drugs. Mostly. Probably.