The answers (or rather the lack thereof) in the ongoing MDN issue (https://github.com/mdn/yari/issues/9230) continue to be a source of headaches. It seems transparently obvious that the push to introduce LLMs is not for the users' benefit but for some other incentive they remain unwilling to disclose...
On top of that, the sentence "incorrect information can still be useful" was actually uttered, and I have yet to recover from it.