@phiofx @smallcircles @jryans@merveilles.town a silver lining of the silly hype is that it's led to some genuinely impressive breakthroughs in desktop AI, which is great for projects like this! llama.cpp lets us run state-of-the-art LLMs on moderately specced PCs, and smaller models (by today's standards) like BERT can run on a potato. Of course, for many ML use cases, simple models like logistic regression and random forest classifiers will get you there, and they'll definitely run on a potato!
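To illustrate just how potato-friendly the simple end of the spectrum is, here's a minimal sketch of logistic regression in plain Python, no frameworks or GPU required (the toy dataset and function names are made up for the example):

```python
import math

def train_logreg(X, y, lr=0.1, epochs=1000):
    """Fit logistic regression with plain stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # sigmoid of the linear score gives predicted probability
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            # gradient step on weights and bias
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 if z > 0 else 0

# Toy linearly separable data: class 1 roughly when x0 + x1 > 1
X = [[0.1, 0.2], [0.3, 0.1], [0.9, 0.8], [0.7, 0.9], [0.2, 0.3], [0.8, 0.7]]
y = [0, 0, 1, 1, 0, 1]
w, b = train_logreg(X, y)
preds = [predict(w, b, xi) for xi in X]
print(preds)
```

A couple of hundred lines of arithmetic like this will happily train on decade-old hardware, which is the whole point.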