@evan Yes, but... how feasible is to have an LLM running on an economy laptop? Would it take a lot of resources to run? Say number of gb of ram, hard drive and percentage of processor time. Could it be installed already trained or would it need to be trained after installation? Would it be retrained as it is being used and how many additional resources would that take? Could it be turned on and off as required? Is it possible to give it an API for any app to use rather than each running its own?
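On the RAM question, a back-of-envelope estimate helps: for local inference the weights dominate memory use, at roughly (parameters × bits per weight ÷ 8) bytes, plus extra for the KV cache and runtime. A minimal sketch (the function name is illustrative, not from any library):

```python
# Rough RAM estimate for running a local LLM. Assumes weight storage
# dominates; real usage adds overhead for the KV cache and runtime.

def model_ram_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate weight memory in GB for a given parameter count
    and quantization level."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 7B-parameter model at common quantization levels:
for bits in (16, 8, 4):
    print(f"7B @ {bits}-bit: ~{model_ram_gb(7, bits):.1f} GB")
# 16-bit: ~14 GB, 8-bit: ~7 GB, 4-bit: ~3.5 GB
```

This is why quantized 4-bit models are the usual choice on an economy laptop: a 7B model fits in under 4 GB of RAM, whereas the unquantized version would not.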
Jesus Margar (jesusmargar@mastodon.social)'s status on Friday, 18-Apr-2025 23:09:40 JST
Evan Prodromou (evan@cosocial.ca)'s status on Friday, 18-Apr-2025 23:09:39 JST
@jesusmargar llama and other "Open Source" models can run on personal computers. You should try it!
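On the shared-API point from the original question: tools like Ollama already work this way. One server process (`ollama serve`, listening on localhost:11434 by default) holds the model, and any app on the machine talks to it over HTTP. A minimal stdlib sketch against Ollama's `/api/generate` endpoint; the model name is an example and the call at the end assumes a running server with that model pulled:

```python
# Sketch: querying a locally running model through Ollama's HTTP API,
# so every app shares one model server instead of loading its own copy.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for an Ollama server."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL, data=payload,
        headers={"Content-Type": "application/json"})

def ask(model: str, prompt: str) -> str:
    """Send the prompt and return the model's reply text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]

# Example (requires `ollama serve` running and the model pulled):
#   print(ask("llama3.2", "Why is the sky blue?"))
```

This also answers the on/off question: the server is an ordinary process you start and stop as needed, and nothing is retrained during use; inference only reads the downloaded weights.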
Jesus Margar (jesusmargar@mastodon.social)'s status on Saturday, 19-Apr-2025 04:19:24 JST
@evan Jeez, one GB???
Jesus Margar (jesusmargar@mastodon.social)'s status on Saturday, 19-Apr-2025 04:19:24 JST
@evan Tried it (not very hard, but tried it). Downloaded it (1 GB), installed it (5 GB) in admin mode on Windows. Other users can't use it (weird). No documentation, not even for the admin user; didn't manage to make a single query. Uninstalled it. Perhaps it's great, but the UI has to improve.
Evan Prodromou (evan@cosocial.ca)'s status on Saturday, 19-Apr-2025 04:19:24 JST
@jesusmargar probably!