Your New Marijuana Injecting Waifu :weed: (sjw@bae.st)'s status on Saturday, 03-Sep-2022 13:54:49 JST
@PhenomX6 @ink8 Honestly, Intel's Linux support is one of the reasons I'm so hyped. I doubt they're gonna put limits on video encoding like NVIDIA does (i.e. "Oh! Consumer-grade cards are limited to 3 encode tasks at once. If you want more you'll have to buy a Quadro. Oh, you bought a Quadro? Yeah, but you bought one of the cheap ones under $1k. Get fucked. Oh, now you bought a Quadro over $1,000? Encode as many video streams as your VRAM allows!").
Meanwhile, it seems like Intel is gonna be like, "lmao, fuck it! Encode 15 1080p streams at once on a $140 card! We don't care!"
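If you want to see NVIDIA's session cap for yourself, here's a rough Python sketch (assuming your ffmpeg build has NVENC support; the encoder name is stock ffmpeg) that keeps spawning parallel encodes until the driver says no:

import subprocess

# Rough probe of the per-card NVENC session cap. Each ffmpeg process
# opens one h264_nvenc session against a synthetic 1080p feed; once
# the driver hits its limit, the newest process errors out fast.
CMD = [
    "ffmpeg", "-hide_banner", "-loglevel", "error",
    "-f", "lavfi", "-i", "testsrc=size=1920x1080:rate=30",
    "-t", "30", "-c:v", "h264_nvenc", "-f", "null", "-",
]

procs = []
for n in range(1, 11):
    procs.append(subprocess.Popen(CMD))
    try:
        # A rejected session exits almost immediately with an error;
        # a healthy one is still encoding after 2 seconds.
        procs[-1].wait(timeout=2)
        print(f"session {n} was rejected -> cap is {n - 1}")
        break
    except subprocess.TimeoutExpired:
        print(f"session {n} is running")

for p in procs:
    p.kill()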
As far as me and everyone else in the tech industry are concerned, the new Intel GPUs are encoding cards that just so happen to be able to render video games.
https://youtu.be/RlPesrNpPNQ?t=3m5s
Literally, just check like 90 seconds of that video (from the linked timestamp)! Encoding 60 1080p@60 streams into AV1/VP9/HEVC/AVC in real time at 75 watts with no external power connector! This is literally a wet dream!
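Want a ballpark on your own card? Here's a rough single-stream sketch (assuming an ffmpeg new enough to ship av1_qsv, i.e. 5.1+ built against oneVPL; the 6M bitrate is just a placeholder). Swap av1_qsv for hevc_qsv or h264_qsv to compare codecs; a session running at roughly 15x real time is roughly what lets you run 15 live streams at once:

import subprocess, time

# Encode 60 seconds of synthetic 1080p60 into AV1 on the Arc card
# and compare wall-clock time to media time.
cmd = [
    "ffmpeg", "-hide_banner", "-loglevel", "error",
    "-f", "lavfi", "-i", "testsrc=size=1920x1080:rate=60",
    "-t", "60", "-c:v", "av1_qsv", "-b:v", "6M",
    "-f", "null", "-",
]

start = time.monotonic()
subprocess.run(cmd, check=True)
wall = time.monotonic() - start
print(f"60s of 1080p60 -> {wall:.1f}s wall ({60 / wall:.1f}x real time)")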
For the longest time I've said NVIDIA should release an enterprise card with NVENC/NVDEC and nothing else, just a chip optimised for video encoding/decoding, but they never did it.
Now Intel has done it, and it just so happens to be able to render video games.
Even if you have a 3090 Ti, it's worth buying this $140 card so you aren't sacrificing 20-40+ FPS to video encoding. Plus, this card can not only decode but also encode AV1!
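And on Linux you can pin the encode to the Arc card so the GeForce never touches it. Rough sketch below; which /dev/dri node is the Arc is machine-specific (renderD129 is just my assumption, check ls /dev/dri), and gameplay.mkv is a stand-in input:

import subprocess

# Pin the encode to the Arc card while the GeForce renders the game.
# ffmpeg's -qsv_device option picks which DRM render node QSV uses;
# renderD129 and gameplay.mkv are placeholders for your own setup.
ARC_NODE = "/dev/dri/renderD129"

subprocess.run([
    "ffmpeg", "-hide_banner",
    "-qsv_device", ARC_NODE,
    "-i", "gameplay.mkv",
    "-c:v", "hevc_qsv", "-b:v", "8M",
    "-c:a", "copy",
    "out.mkv",
], check=True)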
My only concern: if I buy first gen, will I be greatly disappointed by second gen's video encode performance in like 8 months?
Probably, especially considering Intel has moved most of their ASIC design team to their GPU division.
https://bae.st/notice/ANAPsNqhCuGWRZDIoq