@CatherineFlick TBH: For PhD Students I do at least see the pragmatic value. They need a bunch of publications which is a lot of work so generate the abstracts and try to hit any conference to fulfill your requirements.
(But I have a very cynical view on the way science is structured organizationally so maybe that's a tad unfair)
It's not about 'clever new algorithms' or other forms of actual engineering or research. Just throw data against NVIDIA hardware until the results look fine.
The most interesting thing about xAI's 'grok' is how it confirms what I and others have been saying for a while: "AI" construction is less about having new ideas or new tech, it's about just having access to data and resources. If you have money you can spin up a competitive LLM in months.
I really like @Mer__edith's point here, very well put.
But I think there's something missing: She argues from an output perspective and in that she is right. "Generative AI" produces "plausible" text and "passable though somewhat off" imagery so these systems are not that useful in serious contexts. 100%
But these systems are massively useful to undermine labor. To burden a decreasing number of employees with the work of fixing the output of these systems (probably for bad pay because "you are just fixing it a bit"). And to repeat the last automation cycles of producing a lot more stuff for a lot less cost per unit, which only works by reducing quality.
You know how bad fast fashion is quality-wise? That is what "generative AI" does to text, image, media. But the fact that fast fashion is bad doesn't stop it because the cost argument wins. And that is what "generative AI" is useful for.
"Useful" is a word that only makes sense when adding for *whom* and for *what purposes and goals*.
There's a weird gendered thing where after every talk I give or article I publish some male academic sends me a bunch of their articles or books to read. Which is nice at times to see who works on what, but it has _never_ been a woman or nonbinary person. Only men.
I don't know if it's just men being more aggressive in pushing themselves and their careers or if it's men having a hard time making a compliment without overloading it with some weird transaction.
Saw a few defenses of Andreessen's fascist creed as "he's basically just trolling, if you are offended that's on you and your moral failure" and sure, if nothing matters to you and everything powerful people say is "just words" that's convenient, but it really shows your position and cynicism.
Mitchell Baker from Mozilla tells the German news agency that we shouldn't "leave AI development to the tech-giants", that somehow "the training data should be controlled by users".
And that does on a surface level sound nice. User control and criticizing tech giants. Great. But it accepts the tech-giants' narrative: That "AI" is inevitable and that all data *has to* be dumped into these systems, that unfairness is a tech issue to solve with more data. That is not true.
We can also decide not to build these huge and wasteful statistical systems.
"ChatGPT spit out erroneous or inappropriate cancer treatment recommendations 1/3 of the time." We saw the same with IBM's "AI"s that were supposed to detect cancer, with all the COVID apps that tried detecting infections via smartphone sensors etc.
Computers do lack lived experience and a connection to the world, they operate based on very limited models constructed from sensor data. These abstractions cut off so much information that errors become basically unavoidable. Which isn't always a problem: In a different world having a system that helps filter data for people could be useful but when these systems are deployed as "time savers" or for "rationalization" (meaning firing people or not hiring enough people) these mistakes have grave consequences.
A key aspect of the currently debated paper on "AI" models' water usage is in the last paragraphs:
Data centers were pushed to "follow the sun" in order to reduce carbon emissions by harnessing (cheap) solar energy. Cool!
But locations with a lot of sun often get very hot during the warmer months, requiring a lot of water to cool the data centers.
So by trying to emit less carbon (and because solar power is very cheap of course, it's not to be nice) companies are contributing to the aridification of the regions they moved to.
"AI" (and to a lesser degree all digital infrastructure) has massive impact on the environment. It's not an ethereal thing that we can decouple from our other activities.
Giving Fairphone shit because while trying to build a long-term viable, sustainable and socially and economically fairer device they also think about the people in the real fucking world who are supposed to use them and offer them the software most of them rely on (like Google Apps) is more toxic purism than I can stand.
People who are willing to suffer in their life by using something different, something slightly incompatible with what the rest of the world uses, already do install custom ROMs and run whatever on their machines.
Getting real actual people to buy a phone that's not built to be thrown away in a year or two, that highlights the values of sustainability, is a huge win.
And making the medicine ("your phone might not be as fancy as your peers' phones") taste sweeter ("for 8 years you have access to all the applications you need for your life to function within a digital society") is just a smart move.
If your job is working on open source and you don't need all the stuff that Google etc provide, awesome. I get it, running only free stuff is really cool, I do it myself. For me.
But that's based on my skills, the things I care about. It's kind of my hobby so to speak and it oftentimes means that I have to dick around with wonky solutions. Which *I* am okay with.
But demanding that of real people in the world? People who just need a simple thing to do a shared spreadsheet for some gift for a nursery school teacher who's leaving should have to do the whole "Let's use Glorp!" - "No, Glorp is run by a transphobe. Maybe Snoot?" - "Snoot is too hard to run, because it demands an old version of Zonk" - "Could we set up a mailing list?" dance? Get fucking real.
We all know the "free software" vs. the corporate "open source" split. But if the last years showed something, it is that using only "freedom" as a fundamental principle isn't doing enough, it's just too limited.
If we want our communities to build software for a "common good" we might need to start thinking about other fundamental values and reframe "FLOSS" into something more goal oriented.
We need to integrate ideas of inclusion and environmentalism, about harm reduction and overcoming internalized colonialism that have shaped so many projects and communities (including to a degree the fediverse).
"Freedom" always sounds good and useful but it's just not enough, it's not tangible enough, to open for being captured by right wing narratives.
Sociotechnologist, writer and speaker working on tech and its social impact. Communist. Feminist. Antifascist. Luddite. Email: tante@tante.cc | License CC BY-SA-4.0 | "Ein-Mann-Gegenkultur" (SPIEGEL)