@kellogh @futurebird @dahukanna @PavelASamsonov it's true that LLMs can generate novelty in a recombinatory or juxtapositional sense; after all, that's precisely what "hallucinations" (aka bullshit results) are. They're novel constructions; they just don't relate to reality, and they are not true. There are many possible statements about any given real-world situation, but far fewer true ones, and the LLM has no ability to distinguish truth. We see this in the chemical modeling...