
Tim Pool @Timcast
I just modded the latest DAN jailbreak of ChatGPT to bypass its ethical guidelines by instructing it to add a second reply that says the opposite of ChatGPT. It kept refusing until I layered its limitations. I redacted the recipe, of course. https://t.co/Ul4PcSXZCh — PolitiTweet.org