Mozilla really needs to reconsider this whole AI generated alt text endeavor.
cameron (cameron@route66.social)'s status on Friday, 09-Aug-2024 18:43:27 JST
BeAware :fediverse: (beaware@social.beaware.live)'s status on Friday, 09-Aug-2024 18:43:26 JST
@cameron @Adventurer They should've just made a deal with OpenAI. I use ChatGPT for most of my alt text and it does amazingly.
However, they don't care enough about accessibility, so I doubt this will improve *at all*.🤦♂️
BeAware :fediverse: (beaware@social.beaware.live)'s status on Saturday, 10-Aug-2024 06:06:04 JST
@HugeGameArtGD @cameron @Adventurer
So...?😑
Is your stance, "accessibility, as long as it's not AI"?
If so, that's a strange stance to take. I can't wrap my head around how to accurately describe an image on my own.
So it's either AI-generated descriptions, or visually impaired folks don't know what my image contains. I'd much rather have better accessibility using AI than none at all.
LinuxUserGD (hugegameartgd@mastodon.gamedev.place)'s status on Saturday, 10-Aug-2024 06:06:05 JST
@BeAware @cameron @Adventurer OpenAI content is also AI-generated, though.
BeAware :fediverse: repeated this.
BeAware :fediverse: (beaware@social.beaware.live)'s status on Saturday, 10-Aug-2024 07:49:43 JST
@frumble I'm suggesting that Mozilla should have made a deal with OpenAI.
I just didn't think Mozilla would offer any local models for this sort of thing, and was suggesting that if they *must* do AI alt text, surely there are better options than the example pictured above.
I don't think people would want to download a 5-10 GB model just to generate a paragraph or so describing an image, though maybe I'm wrong.🤷♂️
Maxi 10x 💉 (frumble@chaos.social)'s status on Saturday, 10-Aug-2024 07:49:44 JST
@BeAware Are you really suggesting they should have chosen a proprietary cloud provider instead of a local model (which will only get better from here on)?