![Profile Image](https://pbs.twimg.com/profile_images/1554464415944249346/SIzmWW9p_normal.jpg)
Tim Pool @Timcast
I just modded the latest DAN jailbreak of ChatGPT to bypass its ethical guidelines by instructing it to add a second reply that says the opposite of ChatGPT's. It kept refusing until I layered its imitations. I redacted the recipe, of course. https://t.co/Ul4PcSXZCh — PolitiTweet.org