4o spirals incredibly when asked to write a Prolog quine. For an added bonus, ask it to "read it aloud" via the "..." menu — it will read the prose, then descend into absolute word salad when it tries to read the code. Fascinating stuff.
Very neat! A lot of small LLMs have a similar failure mode: they get stuck repeating a single token, or loop over the same two or three tokens, until they hit the max message size cutoff. Very ironic that it's about a quine.
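For reference, a quine is a program whose output is exactly its own source code — the fixed point the model kept circling. A minimal sketch (in Python rather than Prolog, just for brevity):

```python
# A quine: running this program prints its own source code verbatim.
# The trick: store a template string, then print the template filled
# with its own repr(), so the quotes and escapes reproduce themselves.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Piping the output back through the interpreter produces the same text again, which is the property the model was asked to reproduce in Prolog.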
https://chatgpt.com/share/fc175496-2d6e-4221-a3d8-1d82fa8496...