@Jirikiha @nazokiyoubinbou @joby @CptSuperlative @emilymbender
If I asked ChatGPT to "turn off the porch light" and it said "OK, I've turned off the light on your porch," I would know that it has not actually done this. It has no way to access my porch light. I would realize that it is just generating a text answer that fits the context of the previous prompts.
So why do people think it makes sense to ask ChatGPT to explain how it produced a response?