Conversation
Notices
-
publictorsten (publictorsten@mastodon.social)'s status on Thursday, 17-Oct-2024 03:30:21 JST publictorsten
-
Florian Idelberger (fl0_id@mastodon.social)'s status on Thursday, 17-Oct-2024 06:22:06 JST Florian Idelberger @publictorsten You don't have to guess that; that's exactly what it says there.
-
publictorsten (publictorsten@mastodon.social)'s status on Thursday, 17-Oct-2024 06:22:07 JST publictorsten A side aspect of the story: the original photo actually had the desired framing. I suspect that at one point the image was cropped, and at another point a different aspect ratio was suddenly required.
That is absurd, and thus probably the predominant future use of AI: creating unnecessary tasks and hoping that nothing goes too badly wrong.
-
Konrad Rudolph (klmr@mastodon.social)'s status on Thursday, 17-Oct-2024 06:22:09 JST Konrad Rudolph @tomstoneham @publictorsten I am noticing this a lot: GenAI being used as an excuse/accelerator for a massive erosion of professionalism, under the banner of “increased productivity” (“getting things done”).
-
Tom Stoneham (tomstoneham@dair-community.social)'s status on Thursday, 17-Oct-2024 06:22:10 JST Tom Stoneham @publictorsten What is so shocking about this is *not* that the genAI did exactly what it is designed to do, namely create the most likely-looking image, but that the person who used it switched off their brain at the sight of the words 'AI'.
Before AI, if a social media person had had to do a major edit of a photo which left them deciding 'bra or no bra', they would have asked the sitter. Now that they use genAI to be 'more productive', they stop doing their own job properly.
-
Jon Het-CIS. (jon_kramer@mastodon.social)'s status on Thursday, 17-Oct-2024 06:22:24 JST Jon Het-CIS. @KathleenC @petrosyan @publictorsten It was done by A.I. There is no one to fire. I guess you could fire a random person who ordered the A.I. to modify the photo to fit the space, or the team that programmed the A.I. I am not sure that would help anything, though.
-
Susan Kaye Quinn 🌱(she/her) (susankayequinn@wandering.shop)'s status on Thursday, 17-Oct-2024 06:22:24 JST Susan Kaye Quinn 🌱(she/her) @Jon_Kramer if you start holding people accountable for using AI and causing harms with it, I guarantee you that will "help"
-
Kathleen (kathleenc@sanjuans.life)'s status on Thursday, 17-Oct-2024 06:22:25 JST Kathleen @petrosyan @publictorsten Sexualizing her photo was so special. Honestly it's essential to fire the individual who did this.
-
petrosyan (petrosyan@mstdn.io)'s status on Thursday, 17-Oct-2024 06:22:26 JST petrosyan @publictorsten Not only did she not consent to this edit; additionally, her photo was uploaded to an AI tool without her consent. I really don't know which is worse.
-
Susan Kaye Quinn 🌱(she/her) (susankayequinn@wandering.shop)'s status on Thursday, 17-Oct-2024 06:22:31 JST Susan Kaye Quinn 🌱(she/her) @Jon_Kramer the woman who decided that it was okay to upload this person's image and edit it using AI; the organization that approved using AI (or at least didn't explicitly ban it)
The tech company that hired the 5000 people can be held accountable too.
The great gift that AI gives people/companies is obscuring harms such that "no one can be blamed" — it's a fig leaf for harms as a service and it's one of the reasons WHY people use it.
-
Jon Het-CIS. (jon_kramer@mastodon.social)'s status on Thursday, 17-Oct-2024 06:22:32 JST Jon Het-CIS. @susankayequinn @KathleenC @petrosyan @publictorsten Maybe. But at some point, you are holding random people accountable, and that becomes counterproductive. There is no identifiable person to hold accountable here. 5000 people worked on that A.I. system. Maybe more than that.
-
Count Hideout 🏳️🌈🦇 (darth_hideout@mastodon.social)'s status on Thursday, 17-Oct-2024 06:22:36 JST Count Hideout 🏳️🌈🦇 @susankayequinn @Jon_Kramer @KathleenC @petrosyan @publictorsten
There are lots of underlying assumptions, too, like "everyone's using it," "it's inevitable," "it's necessary," "we'll be left behind if we don't adopt early" et al. If you accept them, then many things become excusable.
I'm surprised how quick people are to turn to AI & automated solutions in general, rather than their own brains & abilities. But that's a different topic.
-
clacke (clacke@libranet.de)'s status on Thursday, 17-Oct-2024 06:22:37 JST clacke @Hunko @publictorsten But then you'd have to pay for a good graphic designer.
-
HTZ🏳️🌈 (hunko@woof.group)'s status on Thursday, 17-Oct-2024 06:22:39 JST HTZ🏳️🌈 @publictorsten
A good graphic designer or photographer knows how to add length without AI. One of my tasks working for conference clients was working magic with terrible headshots, all before AI could help.
-
Andrew Bartlett (abartlet@mastodon.nzoss.nz)'s status on Thursday, 17-Oct-2024 16:33:09 JST Andrew Bartlett Also happened in Australia to an MP:
https://www.abc.net.au/news/2024-02-01/georgie-purcell-ai-image-nine-news-apology-digital-ethics/103408440