Rich Felker (dalias@hachyderm.io)'s status on Sunday, 17-Aug-2025 00:02:55 JST
@Nazani @futurebird It can't save bandwidth. It might save storage. But the cost in compute would be way larger than the cost of the storage.
Rich Felker (dalias@hachyderm.io)'s status on Sunday, 17-Aug-2025 00:09:23 JST
@Nazani @futurebird The only way "AI compression" could be profitable in storage & bandwidth to the video provider platform is if they could offload all the compute onto the user's client device. This isn't happening because it'd reduce battery life to a few minutes, and the web won't allow it anyway (having that much site-specific data for the model, access to "AI"-scale compute, etc.).
Raven667 (raven667@hachyderm.io)'s status on Sunday, 17-Aug-2025 00:15:19 JST
@dalias @Nazani @futurebird I'm not sure that's right. Isn't DLSS "AI" upscaling for games run on the user's GPU? It actually saves processing vs rendering at a higher resolution, because it replaces the precise calculations of rendering with quicker estimates for upscaling. I could see a way to process video that uses similar techniques to upscale and reduce decompression artifacts, which could give a video a too-sharp, uncanny quality. I don't know that YT engineers have built such a thing, or if this person is misattributing normal compression artifact differences from traditional re-encoding techniques.
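(Purely as illustration of the kind of client-side "AI upscaling" being described here: a minimal PyTorch sketch of an ESPCN-style network applied to a decoded low-resolution frame. The TinyUpscaler class, its layer sizes, and the idea that anything like this runs in a YouTube client are assumptions for the sketch, not something the thread establishes, and the network is untrained.)

```python
# Illustrative only: an untrained ESPCN-style 2x upscaler, the general shape of
# "AI upscaling" run on a decoded low-resolution frame. Layer sizes are arbitrary.
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    def __init__(self, scale=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 5, padding=2), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),  # rearranges channels into a higher-resolution image
        )

    def forward(self, x):
        return self.body(x)

# A decoded 360p frame (batch, channels, height, width), values in [0, 1].
frame = torch.rand(1, 3, 360, 640)
with torch.no_grad():
    upscaled = TinyUpscaler()(frame)  # -> (1, 3, 720, 1280)
print(upscaled.shape)
```

The relevant point for the DLSS comparison is that the convolutions run at the low input resolution and only the final PixelShuffle step produces the larger image, which is why learned upscaling can be cheaper than producing the full-resolution picture directly.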
Rich Felker (dalias@hachyderm.io)'s status on Sunday, 17-Aug-2025 00:20:53 JST
@raven667 @Nazani @futurebird That isn't "AI", just use of the "AI" marketing label for a large compression dictionary.
Adam Katz (adamhotep@infosec.exchange)'s status on Sunday, 17-Aug-2025 00:43:23 JST
@dalias @raven667 @Nazani @futurebird this kind of "AI" involves lots of training to generate a lightweight model that would then be employed for each video, so the hefty compute cost is one-time.
I believe it's attention-based. Something like "use more bits on faces" and "smooth this part since nobody's looking at it." A musician will be looking at things differently than your general viewer, and since this is a generalized algorithm, the artifacts are more obvious to them.
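(A hedged sketch of that "more bits on faces" idea, in NumPy only: the saliency map is a made-up Gaussian blob standing in for whatever attention model a real encoder would use, and the quantization step varies per pixel where a real codec would work per block.)

```python
# Illustrative sketch: quantize an image more coarsely where a (fake) saliency
# map says viewers are unlikely to look, finer where they are.
import numpy as np

h, w = 90, 160
frame = np.random.rand(h, w)  # stand-in for one luma plane

# Made-up "attention" map: a Gaussian blob where a face might be.
ys, xs = np.mgrid[0:h, 0:w]
saliency = np.exp(-(((ys - 30) / 15) ** 2 + ((xs - 60) / 20) ** 2))

# Map saliency to a quantization step: fine (small step) where salient,
# coarse (large step) elsewhere.
q_step = 0.02 + (1.0 - saliency) * 0.15
quantized = np.round(frame / q_step) * q_step

err = np.abs(quantized - frame)
print("mean error in salient region:", err[saliency > 0.5].mean())
print("mean error elsewhere:        ", err[saliency <= 0.5].mean())
```

The error ends up smaller in the "watched" region and larger everywhere else, which is exactly the trade a perceptual bit-allocation scheme is making.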
Rich Felker (dalias@hachyderm.io)'s status on Sunday, 17-Aug-2025 00:43:23 JST
@adamhotep @raven667 @Nazani @futurebird Yeah, that's a mix of really exhaustive search for optimal compression dictionaries with perceptual models for where to allocate bits, which really isn't anything fundamentally fancier than the mp3 psychoacoustic model.
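(To make the mp3 analogy concrete, a toy sketch of psychoacoustic-style bit allocation: frequency bands well above an estimated masking threshold get bits, bands at or below it get few or none. The threshold and the ~6 dB-per-bit rule of thumb here are crude stand-ins, nothing like the real ISO psychoacoustic model.)

```python
# Toy illustration of psychoacoustic-style bit allocation: spend bits where the
# signal sits well above a (crudely estimated) masking threshold, few where it
# would be inaudible anyway.
import numpy as np

fs = 44100
t = np.arange(1024) / fs
frame = np.sin(2 * np.pi * 440 * t) + 0.01 * np.random.randn(1024)

spectrum = np.abs(np.fft.rfft(frame * np.hanning(1024)))
bands = np.array_split(spectrum, 16)            # 16 crude frequency bands
energy = np.array([np.sum(b ** 2) for b in bands])

# Crude global masking threshold: a small fraction of the loudest band's energy.
masking = 1e-3 * energy.max()

# Signal-to-mask ratio (in dB) decides how many bits each band gets.
smr_db = 10 * np.log10(np.maximum(energy / masking, 1e-12))
bits = np.clip(np.round(smr_db / 6), 0, 16).astype(int)   # roughly 6 dB per bit
print(bits)
```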
Rich Felker (dalias@hachyderm.io)'s status on Sunday, 17-Aug-2025 00:56:58 JST
@futurebird @adamhotep @raven667 @Nazani I dunno. We really need a technical evaluation of what's going on, as well as clarification on whether the problem is only in YouTube apps or also on the web. Outside of apps that might do something weird, there's a fixed video codec with publicly known specs that all the video is conveyed thru, so there can't be weird YT-specific behaviors. If there are artifacts, they're in whatever is stored server side.
myrmepropagandist (futurebird@sauropods.win)'s status on Sunday, 17-Aug-2025 00:56:59 JST
@dalias @adamhotep @raven667 @Nazani
Why is the "look" of the compression making people think of AI, though?
Martin Owens :inkscape: (doctormo@floss.social)'s status on Sunday, 17-Aug-2025 02:32:39 JST
@futurebird @dalias @adamhotep @raven667 @Nazani
A month ago I had to deal with a moderation flag. One contributor to our project had flagged another user as using AI to answer questions in the user chat room.
I still had to go through the process of detailing the complaint and talking to both sides privately, despite knowing that the person they were complaining about was well known and was just from India, so their language use was different.
AI Hyper Sensitivity sure looks like a real thing
myrmepropagandist (futurebird@sauropods.win)'s status on Sunday, 17-Aug-2025 02:32:40 JST
@dalias @adamhotep @raven667 @Nazani
I'm interested in the technical discussion, but this post was also about a "bad feeling": a corrosive feeling that comes from not knowing if what you are looking at is real or not.
I think the problem there isn't so much technical as human.
No one has said "YouTube would never do such a thing without telling the creators and users!" Because no one has that kind of confidence in them. And no one should.
Hrefna (DHC) (hrefna@hachyderm.io)'s status on Sunday, 17-Aug-2025 02:55:23 JST
I've worked with "AI" before it meant "LLMs." I've worked with Machine Learning and MCMC far longer than the Attention is All You Need paper existed.
This use of "AI" is far, far older than LLMs, and has been used as a marketing term in this general direction since at _least_ the 1980s. Ceding everything and criticizing everything under the label just because of LLMs strikes me as not a productive use of time.
Rich Felker (dalias@hachyderm.io)'s status on Sunday, 17-Aug-2025 03:25:26 JST
@hrefna @adamhotep @raven667 @Nazani @futurebird In my book it's "AI" if it's marketed as performing some process mimicking human/animal intelligence/reasoning. I feel like this agrees with the historical usage going back half a century. It's not "just because of LLMs" but because this is a repeated pattern of misleading marketing with very long precedent.
Hrefna (DHC) (hrefna@hachyderm.io)'s status on Sunday, 17-Aug-2025 03:35:54 JST
Okay, and the old joke was:
If I'm pitching to academia or marketing, it's AI.
If I'm pitching to a company exec it's ML.
If I'm actually writing it then it is logistic regression.
AI has gone through iterations on how "cool" it is, but a lot of things start out as "AI" and then get classified as something else. A lot of it is rooted in academia. I have an article I saved somewhere that goes into how "AI" can replace our jobs.
They are talking about expert systems, which most people would barely even consider "AI" today, but _were_ solidly considered AI at the time.
Like, roll your eyes at it, sure. But it _isn't the same criticism_ as using generative AI, or using stolen data, or whatever else. Conflating the criticisms does no one any favors.
Rich Felker (dalias@hachyderm.io)'s status on Sunday, 17-Aug-2025 03:45:27 JST
@hrefna @adamhotep @raven667 @Nazani @futurebird Yes, what's deemed "AI" changes over time because the defining aspect enabling the marketing is that it's something the public would be surprised/shocked/awed that computers can (supposedly) now do (spoiler: they can't).
I'm critical of *both* the recycled "AI" narrative of exaggerating and anthropomorphizing computational capabilities to exploit normie ignorance, *and* of the particulars of the latest "AI" fad that's burning the planet, using stolen data, and reproducing all sorts of harms present in that data.