That DΔrn Pooka :verified_think: (theququ@shitposter.club), Sunday, 22-Oct-2023 05:38:23 JST:

@Moon Local open models are doing better than OpenAI is at that now; there are finetunes of Llama 2 that can have a 16k token context, which I believe is bigger than ChatGPT supports. But even this memory is just the entire token history. If you want a "hidden state", you can teach it to output tokens that denote a block of text to hide from the user, but I don't know of any models that do that yet.
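
A minimal sketch of what that could look like on the client side, assuming the model has been finetuned to wrap its private notes in made-up `<hidden>`/`</hidden>` markers (the marker names and the example output are hypothetical): the chat UI strips those spans before showing the reply, while the unstripped text still goes back into the token history on the next turn.

```python
import re

# Hypothetical markers; a real finetune would use its own special tokens.
HIDDEN_OPEN = "<hidden>"
HIDDEN_CLOSE = "</hidden>"

def strip_hidden(model_output: str) -> str:
    """Remove <hidden>...</hidden> spans so the user never sees the model's
    private scratchpad. The caller keeps the original, unstripped string in
    the conversation history that is fed back to the model."""
    pattern = re.escape(HIDDEN_OPEN) + r".*?" + re.escape(HIDDEN_CLOSE)
    return re.sub(pattern, "", model_output, flags=re.DOTALL).strip()

if __name__ == "__main__":
    raw = ("<hidden>User sounds frustrated; keep the reply short.</hidden>"
           "Sure, here's the summary you asked for.")
    print(strip_hidden(raw))  # -> Sure, here's the summary you asked for.
```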