@scottjenson
What I mean is that they cannot be trusted to carry out instructions, because they are unreliable and have no understanding of context, or indeed of anything.
To an LLM, wiping a hard drive requires no more consideration than drafting some text. A human, if I said "wipe my hard drive", would want to be sure that's what I actually said and wanted before doing it.
Combine that with their high error rate and there's no way I'd give one control of my laptop.
GNU social JP is a social network, courtesy of GNU social JP管理人 (the GNU social JP administrator). It runs on GNU social, version 2.0.2-dev, available under the GNU Affero General Public License.
All GNU social JP content and data are available under the Creative Commons Attribution 3.0 license.