Conversation
Notices
kaia (kaia@brotka.st)'s status on Friday, 25-Oct-2024 15:31:19 JST kaia @arcana @sun @genmaicha
with an average Nvidia (!) 30xx/40xx gaming GPU, you can already run some things: Stable Diffusion, text generation, etc. with smaller models. however, the restriction is always VRAM. my 4090 can run a lot, but in retrospect two or four cheaper cards would have been better VRAM-wise.
if you just want to experiment, hourly cloud offers that rent enterprise-grade Tesla GPUs and similar are maybe best. it's quite cheap.
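the VRAM point above can be sketched with some back-of-the-envelope arithmetic (the specific numbers and function names here are my own illustration, not from the posts): a model's raw weights take roughly params × bytes-per-param, so a 7B model in fp16 (~14 GB) fits on a 24 GB 4090, while a 30B model (~60 GB) would need multiple cards.

```python
def weights_gb(n_params: float, bytes_per_param: int) -> float:
    """Rough size of a model's raw weights in GB.

    Ignores activations, KV cache, and framework overhead, so
    treat the result as a lower bound on required VRAM.
    """
    return n_params * bytes_per_param / 1e9

def fits_in_vram(n_params: float, bytes_per_param: int, vram_gb: float) -> bool:
    """Check whether the raw weights alone fit in a card's VRAM."""
    return weights_gb(n_params, bytes_per_param) <= vram_gb

# fp16 = 2 bytes per parameter; a 4090 has 24 GB of VRAM
print(fits_in_vram(7e9, 2, 24.0))   # 7B model, ~14 GB -> True
print(fits_in_vram(30e9, 2, 24.0))  # 30B model, ~60 GB -> False
```

this is also why quantization (e.g. 4-bit, ~0.5 bytes per parameter) lets much larger models run on a single consumer card.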
kaia (kaia@brotka.st)'s status on Friday, 25-Oct-2024 15:32:31 JST kaia @arcana @genmaicha @sun
oh and modern Apple silicon is very good at AI and ML stuff. in that case you can forgo a GPU
RedTechEngineer (redtechengineer@fedi.lowpassfilter.link)'s status on Friday, 25-Oct-2024 15:40:41 JST RedTechEngineer @kaia @arcana @sun @genmaicha GPUs are used because they are the least bad option for older/current computers. But Apple has designed their hardware to properly run AI, which is very neat.