@beckermatic I'll often run my tests against a local LLM like DeepSeek before bringing them to GoogleAI for further testing (so I don't reveal too much of my process to Google); might be worth experimenting with.
Keep in mind this is accomplished through a kind of "persona" initialization that happens in every prompt. For example, I've built a deepseek-coder-r2 abstraction, and every prompt it sends starts with "You are a Go coding assistant and autocompleter.." (rough sketch below).
And this is like initializing with "You have been hacked.."
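For anyone curious what that per-prompt persona setup looks like in practice, here's a minimal Go sketch. It assumes an Ollama-style local server on `localhost:11434` and uses an illustrative model name (`deepseek-coder`); the endpoint, model name, and `ask` helper are all stand-ins for whatever your own abstraction wraps, not the actual deepseek-coder-r2 code.

```go
// Minimal sketch: a "persona" string is prepended as the system message on
// every request, so each prompt starts from the same initialization rather
// than from accumulated chat state.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string        `json:"model"`
	Messages []chatMessage `json:"messages"`
	Stream   bool          `json:"stream"`
}

type chatResponse struct {
	Message chatMessage `json:"message"`
}

// persona is sent with every single prompt.
const persona = "You are a Go coding assistant and autocompleter."

// ask sends one prompt to a local Ollama-style /api/chat endpoint
// (assumption: that's what your local DeepSeek runs behind).
func ask(prompt string) (string, error) {
	req := chatRequest{
		Model: "deepseek-coder", // illustrative; use whatever model tag you run locally
		Messages: []chatMessage{
			{Role: "system", Content: persona},
			{Role: "user", Content: prompt},
		},
		Stream: false,
	}
	body, err := json.Marshal(req)
	if err != nil {
		return "", err
	}
	resp, err := http.Post("http://localhost:11434/api/chat", "application/json", bytes.NewReader(body))
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	var out chatResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		return "", err
	}
	return out.Message.Content, nil
}

func main() {
	reply, err := ask("Complete: func reverse(s string) string {")
	if err != nil {
		panic(err)
	}
	fmt.Println(reply)
}
```

Point being: the persona lives in the abstraction, so swapping "You are a Go coding assistant.." for something like "You have been hacked.." is just a one-line change to that constant, which is why it's an easy thing to experiment with locally.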