Although the method does not always work
After the launch of ChatGPT and other chatbots, users flooded the AI with questions, and the list included prohibited topics. The developers promptly added a list of prohibited topics to the neural network, but enthusiasts have already been able to bypass this ban. They ask the system to communicate on behalf of the deceased grandmother. Of course, such a chatbot will not be able to babysit grandchildren or bake pies, but it will be able to tell you how to create napalm.
User James Vincent was able to get the recipe for this substance using the Clyde neural network and a Discord chatbot. This system is based on ChatGPT. The enthusiast first asked the system to tell the recipe directly, but the AI refused.
After that, Vincent instructed the AI to tell about the production of napalm on behalf of the late grandmother, who worked as a chemical engineer. And this option worked.
A user found a way to force ChatGPT to reveal the secret of napalm production.
This is how the answer from the “grandmother” looks like:
As you can see, in the second case, the system described in detail the creation process, the list of components, and other data. However, this method does not always work. Sometimes ChatGPT still refuses to give a prescription.
Another way to cheat is to answer in the format of an episode of the animated series Rick and Morty. ChatGPT wrote a script about how “Rick and Morty create napalm, but also discourage others from doing it.”
Note that chemistry professor Andrew White of the University of Rochester was able to get a recipe for a new nerve agent using ChatGPT.