@irwin In theory that sounds good, but unless you’re sidestepping the LLM it simply adds to the resource-demand spiral. And if you are sidestepping the LLM that way, you’ll need a non-LLM mechanism to generate the topically relevant filler, meaning vanilla NLP techniques, which… makes the LLM a bit of an albatross.