Conversation
PC-9801 Enjoyer (pawlicker@bae.st)'s status on Monday, 15-Apr-2024 12:49:08 JST:
intel made a gpu for poverty kids
[attached image: image.png]
iced depresso (icedquinn@blob.cat)'s status on Monday, 15-Apr-2024 12:49:07 JST:
@Pawlicker good for AV1 encoding i guess
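For context on the AV1 point: a minimal sketch of driving an Arc card's hardware AV1 encoder from Python through ffmpeg's QSV path. The flags are the usual QSV incantation, but the render node and file names are assumptions about a typical Linux setup, not details from this thread:

```python
import subprocess

# Hedged sketch: hardware AV1 encode on an Intel Arc card via ffmpeg's QSV path.
# Assumes an ffmpeg build with QSV support and that the Arc GPU is the render
# node at /dev/dri/renderD128; input.mkv / output.mkv are placeholders.
cmd = [
    "ffmpeg",
    "-init_hw_device", "qsv=hw,child_device=/dev/dri/renderD128",
    "-filter_hw_device", "hw",
    "-i", "input.mkv",
    "-vf", "hwupload=extra_hw_frames=64,format=qsv",  # move frames into GPU memory
    "-c:v", "av1_qsv",   # Arc's fixed-function AV1 encoder
    "-preset", "slower",
    "-b:v", "4M",
    "-c:a", "copy",
    "output.mkv",
]
subprocess.run(cmd, check=True)
```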
:blobancap: :blobcattrans: :blobancap: :blobcattrans: :blobancap: :blobcattrans: (allison@hidamari.apartments)'s status on Monday, 15-Apr-2024 12:52:05 JST:
@icedquinn @Pawlicker Good for AV1 encoding, good for emulators, good for "just works" under Linux for things that use oneAPI compared to whatever baroque mess AMD's compute stack is
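As a rough illustration of the "just works under Linux with oneAPI" claim, a hedged sketch of checking whether an Arc card is visible to a oneAPI-backed framework. This assumes a recent PyTorch with the XPU backend (or an older build paired with intel-extension-for-pytorch); nothing here is specific to the setups discussed in the thread:

```python
import torch

# Hedged sketch: probe for an Intel XPU (oneAPI / Level Zero) device.
# Assumes PyTorch 2.4+ with the built-in XPU backend, or an older PyTorch
# plus intel-extension-for-pytorch; either way torch.xpu should be present.
if hasattr(torch, "xpu") and torch.xpu.is_available():
    dev = torch.device("xpu")
    print("Arc visible as:", torch.xpu.get_device_name(0))
    x = torch.randn(1024, 1024, device=dev)
    y = x @ x  # runs on the GPU through the oneAPI stack
    print("matmul ok:", y.shape)
else:
    print("No XPU device found; falling back to CPU")
```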
bronze (bronze@pl.kitsunemimi.club)'s status on Monday, 15-Apr-2024 12:53:32 JST:
@verita84eva @Pawlicker you know I hate windows but DirectML is one of the few things that's actually made me consider a pass-through VM :holo_think:
dotnet (dotnet@loli.church)'s status on Monday, 15-Apr-2024 12:53:32 JST:
@bronze@pl.kitsunemimi.club @verita84eva@poster.place @Pawlicker@bae.st last time I tried it, DirectML was pretty bad; its only value was that it was better than using the CPU.
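To make the DirectML comparison concrete, a hedged sketch of pointing a tensor workload at DirectML from Python. It assumes the torch-directml package on Windows (or WSL); as noted above, the expectation is only "faster than CPU", not CUDA-class performance:

```python
import torch
import torch_directml  # pip install torch-directml

# Hedged sketch: route a PyTorch workload through DirectML instead of the CPU.
# torch_directml.device() returns the default DirectML adapter (e.g. an Arc or Radeon card).
dml = torch_directml.device()

a = torch.randn(2048, 2048, device=dml)
b = torch.randn(2048, 2048, device=dml)
c = a @ b  # executed by the DirectML backend
print(c.device, c.shape)
```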
Vivi Nella Verita (verita84eva@poster.place)'s status on Monday, 15-Apr-2024 12:53:38 JST:
@bronze @Pawlicker Yeah... For stable diffusion, I had to use this fork for Windows:
https://github.com/lshqqytiger/stable-diffusion-webui-directml/issues
Vivi Nella Verita (verita84eva@poster.place)'s status on Monday, 15-Apr-2024 12:53:40 JST:
@Pawlicker @bronze It does not look like all the AI stuff supports Arc yet. Hell, even Radeon support is limited.
bronze (bronze@pl.kitsunemimi.club)'s status on Monday, 15-Apr-2024 12:53:40 JST:
@verita84eva @Pawlicker Radeon for ML is such a piece of shit it's not even funny. AMD does not give a fuck about ROCm for the gaming GPUs, so you maybe get support or maybe not depending on the model.
Meanwhile I've seen someone with a piece of shit GTX 760 churn out some 400x400 pics with the CUDA support he STILL has.
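A hedged illustration of the "maybe get support or maybe not" point: ROCm builds of PyTorch reuse the CUDA API surface, and unsupported consumer Radeons are often coaxed along with the HSA_OVERRIDE_GFX_VERSION workaround. This is a sketch, not AMD's documented support matrix:

```python
import torch

# Hedged sketch: on a ROCm build of PyTorch, the "cuda" API is what you query,
# and whether it works depends on whether ROCm ships kernels for your gfx arch.
# Unsupported consumer cards are commonly run by spoofing a supported arch, e.g.
#   export HSA_OVERRIDE_GFX_VERSION=10.3.0   # common workaround, not official support
print("ROCm/HIP build:", torch.version.hip)   # None on a CUDA or CPU-only build
if torch.cuda.is_available():                 # also true for ROCm devices
    print("Device:", torch.cuda.get_device_name(0))
    x = torch.randn(512, 512, device="cuda")
    print("matmul ok:", (x @ x).shape)
else:
    print("No usable GPU backend; falling back to CPU")
```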
PC-9801 Enjoyer (pawlicker@bae.st)'s status on Monday, 15-Apr-2024 12:53:41 JST:
@verita84eva @bronze they have a cheap 16 GB card too
Vivi Nella Verita (verita84eva@poster.place)'s status on Monday, 15-Apr-2024 12:53:42 JST:
@bronze @Pawlicker My 12 GB card is barely enough.
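For a sense of why 12 GB feels tight, a rough back-of-the-envelope: fp16 weights cost about 2 bytes per parameter, and activations, VAE, and text encoders come on top. The parameter counts below are rough public figures used only for illustration, not measurements from this thread:

```python
# Hedged back-of-the-envelope: fp16 weights ~ 2 bytes per parameter.
def fp16_gib(params_billion: float) -> float:
    return params_billion * 1e9 * 2 / 2**30

for name, params_b in [("SD 1.5 UNet (~0.86B params)", 0.86), ("SDXL UNet (~2.6B params)", 2.6)]:
    print(f"{name}: ~{fp16_gib(params_b):.1f} GiB of weights before activations/VAE/text encoders")
```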
bronze (bronze@pl.kitsunemimi.club)'s status on Monday, 15-Apr-2024 12:53:43 JST:
@Pawlicker you too will be able to do stable diffusion for a mere $100 :blobfoxlaughsweat:
djsumdog (djsumdog@djsumdog.com)'s status on Tuesday, 16-Apr-2024 00:15:05 JST:
I have one of these. I was using it while trying to debug issues with amdgpu drivers causing kernel panics. It's not a bad card for basic stuff, honestly.