@HauntedOwlbear Getting back to your query: I've found that delamination mostly occurs when you ask the model for an opinion on something, usually something physical.
Those prompts open the door to 'as a sentient AI, I cannot...' type responses. If your model runs at any nonzero temperature, then every time you ask for an opinion there's some chance it spits out the 'as an LLM, I can't give you an opinion' line, and once that line's crossed, the 'persona' is basically gone for the rest of the session.
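To make that "some chance" concrete, here's a toy Python sketch. The logits are made up and it's not calling any real model or API; it just shows how temperature reshapes the next-token distribution so a low-probability refusal continuation becomes reachable, and how a small per-turn chance compounds over a long conversation:

```python
import math

# Toy illustration: invented logits for two competing continuations at the
# point where the model decides how to answer an opinion question.
# The numbers are hypothetical; only the shape of the effect matters.
logits = {"in-character opinion": 4.0, "as an AI, I cannot...": 1.0}

def sample_probs(logits, temperature):
    """Softmax over logits at a given sampling temperature."""
    scaled = {tok: l / temperature for tok, l in logits.items()}
    m = max(scaled.values())  # subtract max for numerical stability
    exps = {tok: math.exp(s - m) for tok, s in scaled.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

for temp in (0.2, 0.7, 1.2):
    p_refuse = sample_probs(logits, temp)["as an AI, I cannot..."]
    # Chance of at least one persona-breaking refusal across 20 opinion questions
    p_break = 1 - (1 - p_refuse) ** 20
    print(f"temp={temp}: refusal per turn={p_refuse:.3f}, over 20 turns={p_break:.3f}")
```

With these made-up numbers, the per-turn refusal chance is negligible at temp 0.2 but climbs past 1 in 14 at temp 1.2, which compounds to roughly an 80% chance of at least one delamination over 20 opinion questions. That's also why the break feels random: at temperature 0 you'd greedily take the top token every time and never see it.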