Intel Arc ASIC team working on next-gen discrete GPUs

In an interview with PC Gamer, Intel Fellow Tom Petersen talks about the current state of Arc GPU development. While the majority of the driver team is now working on Alchemist, the ASIC and design team is looking at next-gen. The company…

[Image: Intel Arc GPU Series, Source: Intel]
@ink8 @PhenomX6 given Intel's current Linux support, I'd be surprised if it were any different. Just install Mesa and you're good to go. A binary blob is required for video encoding, IIRC.
@sjw@bae.st @PhenomX6@fedi.pawlicker.com well, apparently Intel uses their stuff internally on Linux? It makes sense that they're done dealing with the competition's mediocre driver support.
what better way to make your employees work hard than telling them to work on their own tools by using those very same tools... actually, that sounds fucking painful. I almost wanna donate to the cause, but they have all the money they could ever need, so w/e, I'll just hope for the best.
@PhenomX6 @ink8 Honestly, Intel's Linux support is one of the reasons I'm so hyped. I doubt they're gonna put limits on video encoding like NVIDIA does (i.e., "Oh! Consumer-grade cards are limited to 3 encode sessions at once. If you want more, you'll have to buy a Quadro. Oh, you bought a Quadro? Yeah, but you bought one of the cheap ones under $1k. Get fucked. Oh, now you bought a Quadro over $1,000? Encode as many video streams as your VRAM allows!").
Meanwhile, it seems like Intel is gonna be like, "lmao, fuck it! Encode 15 1080p streams at once on a $140 card! We don't care!"
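For context, NVIDIA's consumer NVENC session cap is easy to hit in practice. A rough sketch (file names and the count of 5 are made up; the historical GeForce cap was 3 concurrent sessions, and this needs an NVIDIA card plus an ffmpeg build with NVENC support):

```shell
# Launch more parallel NVENC encodes than the driver allows on a
# GeForce card; the extra jobs fail instead of queueing.
for i in $(seq 1 5); do
    ffmpeg -i "cam_$i.mp4" -c:v h264_nvenc "enc_$i.mp4" &
done
wait
# The jobs past the session cap typically abort with an NVENC
# "OpenEncodeSessionEx failed" error from the driver.
```

This is a driver-enforced policy, not a hardware limit, which is exactly the complaint above.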
As far as I and everyone else in the tech industry are concerned, the new Intel GPUs are encoding cards that just so happen to be able to render video games.
Literally, just check like 90 seconds of that video (from the linked timestamp)! Being able to encode 60 1080p@60 streams into AV1/VP9/HEVC/AVC in real time at 75 watts with no external power! This is literally a wet dream!
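If you want to try this yourself on Linux, something like the following should exercise the Arc encoder through VAAPI. This is a hedged sketch: the device path, file names, bitrate, and the stream count of 15 are all assumptions, and the `av1_vaapi` encoder needs a recent ffmpeg build with VAAPI support plus the intel-media-driver stack:

```shell
# One real-time 1080p AV1 hardware encode on an Intel Arc card via VAAPI.
ffmpeg -vaapi_device /dev/dri/renderD128 \
       -i input_1080p.mp4 \
       -vf 'format=nv12,hwupload' \
       -c:v av1_vaapi -b:v 4M \
       output_av1.mkv

# To approximate the "many streams at once" claim, launch several encodes
# in parallel and watch utilisation/power with a tool like intel_gpu_top.
for i in $(seq 1 15); do
    ffmpeg -vaapi_device /dev/dri/renderD128 -i "input_$i.mp4" \
           -vf 'format=nv12,hwupload' -c:v av1_vaapi -b:v 4M \
           "out_$i.mkv" &
done
wait
```

The `format=nv12,hwupload` filter converts frames to a GPU-friendly pixel format and uploads them to the card, so the encode itself runs entirely in the fixed-function media engine.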
For the longest time I've said NVIDIA should release an enterprise card with NVENC/NVDEC and nothing else: just a chip optimised for video decoding/encoding. But they never did it.
Now Intel has done it and it just so happens to be able to render video games.
Even if you have a 3090 Ti, it's worth buying this $140 card so you aren't sacrificing 20-40+ FPS to video encoding. Plus, this card can not only decode but also encode AV1!
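Checking the decode side is even simpler. A hedged sketch (file name and device path are assumptions; needs an ffmpeg build with VAAPI support):

```shell
# Verify AV1 hardware decode via VAAPI. "-f null -" discards the output,
# so the reported speed reflects decode throughput alone.
ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 \
       -i some_av1_clip.mkv -f null -
```

If the hardware path is working, ffmpeg reports a speed well above 1x with near-zero CPU load.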
My only concern is if I buy first gen will I be greatly disappointed by second gen's video encode performance in like 8 months?
Probably, especially considering Intel has moved most of their ASIC design team to their GPU division.
@sjw@bae.st @PhenomX6@fedi.pawlicker.com yeah, I know that one. Apparently it's focused only on generic packages and maximally optimized implementations of them, while ease of use is considered superficial.
I mean, for devs it is, and sometimes ease of use can even be cumbersome once the software starts making decisions for you.