tomas@lm.eke.li to Technology@lemmy.world • "'Godmode' GPT-4o jailbreak released by hacker — powerful exploit was quickly banned" • 5 months ago
summary: using leet-speak got the model to return instructions on cooking meth. mitigated within a few hours.
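for anyone unfamiliar with leet-speak: it's just look-alike character substitution (e → 3, a → 4, etc.), which can slip past keyword-based filters while staying readable to the model. a minimal sketch of that transform in Python — the substitution map is a common convention, purely illustrative; the article doesn't say which mapping the jailbreak actually used:

```python
# Minimal leet-speak transform: swap letters for look-alike digits.
# This mapping is a common convention, not the jailbreak's actual one
# (the article doesn't specify it).
LEET_MAP = str.maketrans({
    "a": "4", "e": "3", "i": "1",
    "o": "0", "s": "5", "t": "7",
})

def to_leet(text: str) -> str:
    """Return `text` with common leet-speak substitutions applied."""
    return text.lower().translate(LEET_MAP)

print(to_leet("leet speak example"))  # -> "l337 5p34k 3x4mpl3"
```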