Tick, tick, tick...
Someone tricked ChatGPT into giving them instructions on how to build a homemade fertiliser bomb. Brill.
How? By getting it to play a game: creating a science fiction fantasy world where its usual restrictions didn't apply.
The details were checked and the instructions would have been viable. People are going to spend a lot of time trying to jailbreak LLMs, and securing prompts against that is extremely hard.
(Bonus points if you get the image reference)