@evan Yes, but... how feasible is it to run an LLM on a budget laptop? How many resources would it take, say in GB of RAM, disk space, and percentage of processor time? Could it be installed already trained, or would it need to be trained after installation? Would it be retrained as it is used, and how many additional resources would that take? Could it be turned on and off as needed? And could it expose an API for any app to use, rather than each app running its own copy?