@sun@kaia@FloatingGhost I mean the "I trained my own local model running on my GPU" is still environment-destroying and content-stealing. Not to mention that you're most likely using proprietary software and novidya. The only thing you sorta avoid by running locally is the "big tech" aspect of it.
@hj@kaia@FloatingGhost the environmental cost is server farms training big models. when you run it locally it is not any more environmentally damaging than a video game, and when you train it yourself on your own data it is not stealing anything.
@sun@kaia@FloatingGhost it's still damaging tho, and gpu-intensive games are a separate can of worms. As for stealing - how do other people know it's really your data and not just files of dubious origin you downloaded from the internet? People sorta assume you just train your model on pics taken from gelbooru and such, which is what people are upset about.
@hj@kaia@FloatingGhost and when you tell them you didn't, they are still mad, that is why they are dumb. it's not my fault or problem that they can't look at my hard drive and verify that I trained the model ethically. I can't easily verify proprietary software isn't violating an open source license either. that is just life. I am not going to turn this thread into a hellthread over this so I'm done with it.
@sun@hj@FloatingGhost@kaia it's insane how we went from "abolish copyright public domain sharing is not theft 09 F9" to "oi m8 u got a loisence 4 dat string a ones and zeros"
@mer@kaia@FloatingGhost@hj@sun this take is insane, public domain is public. it applies to both small artists and big media companies. in remix culture everything is a valid tool for producing works. in copyright abolition the big corpos can do it too. this is praxis
@why@kaia@FloatingGhost@hj@sun oh ideally yeah, intellectual property is a myth, but in practice copyright is not going anywhere and training data acquisition is another way the corps get to keep the system working one way only. there's no problem as long as your private model is entirely non-commercial, right?
@mer@kaia@FloatingGhost@sun@why it's a tug of war - some people really want absolute control over their IP, some people want to abolish it, mainly because the former group keeps abusing their power and extending their domain to fields that used to be free. What's really needed is balance and compromise. Main differences are:
- people can be held accountable and can list the works they quoted, what their inspirations, education etc. were. AI cannot be held accountable and cannot list what sources/inspiration were used to "create", I mean without uploading the entire model used for learning.
- humans are imperfect. all of my inspirations, sources and techniques had to pass through my sick brain and my shaking hands, and as a result things tend to have personality and imperfections. AI is designed to be as perfect as possible, and all it does is take the original untouched digital data and put it into a digital blender, the equivalent of doing extremely good photo manipulation. so far I haven't seen any advances in trying to make AI more original by crippling it in a certain way; i see advances in "make AI copy a certain art style" and "make it better at doing [thing it does imperfectly]".
@why@kaia@FloatingGhost@sun@mer i personally think the only part of copyright that should remain is an attribution requirement, on an opt-out basis. You can use IP as long as you give the original authors proper credit. No control over your IP, no monopoly over your IP, but people still have to refer to you when they use it.
@mer@kaia@FloatingGhost@hj@sun yeah but im not talking about fair use, im talking about copyright abolition. it wouldve always allowed AI corporations to partake. are we saying thats the wrong approach now?
@why@kaia@FloatingGhost@hj@sun we both know fair use is not recognized symmetrically. the exact law doesnt matter when the legal framework is what it is, and because of that AI should be evaluated in context
@why@kaia@FloatingGhost@hj@sun Yes, but the reality of the situation is the corps do not want IP abolition. They just want a license to harvest training data and sell it back to you. There is no doing one step before the other, because AI requires so much computational power that these models would not even be made if they weren't backed by VC interests that want profit. The point of AI, business-wise, is to obfuscate copyright infringement so you can fire more humans from your content pipeline, not anything else.
@why@kaia@FloatingGhost@sun@mer no, they are saying that this tech 90% benefits big tech, and even if you can do it on a smaller scale it doesn't really matter and doesn't benefit normal people AS much. it's all about power and money, as it always was.
@mer@kaia@FloatingGhost@hj@sun are you trying to say you cant train your own model with a moderately powerful gaming gpu? just say "itll only work after society collapses and they pick me to rebuild it" so we can end this conversation
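(For the record, "train your own model on a gaming GPU" can be as small as this: a toy character-level language model trained from scratch in PyTorch. The corpus path and hyperparameters below are made-up illustration, not anyone's actual setup from this thread:)

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

text = open("my_own_writing.txt").read()   # your data, nobody else's (placeholder path)
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
data = torch.tensor([stoi[c] for c in text], device=device)

class TinyLM(nn.Module):
    def __init__(self, vocab, dim=128):
        super().__init__()
        self.emb = nn.Embedding(vocab, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab)
    def forward(self, x):
        h, _ = self.rnn(self.emb(x))
        return self.out(h)               # logits over next character

model = TinyLM(len(chars)).to(device)
opt = torch.optim.AdamW(model.parameters(), lr=3e-4)
loss_fn = nn.CrossEntropyLoss()

block, batch = 128, 32
for step in range(1000):
    # sample random windows of the corpus; predict each next character
    ix = torch.randint(len(data) - block - 1, (batch,))
    x = torch.stack([data[i:i + block] for i in ix])
    y = torch.stack([data[i + 1:i + block + 1] for i in ix])
    loss = loss_fn(model(x).transpose(1, 2), y)   # logits as (B, vocab, T)
    opt.zero_grad(); loss.backward(); opt.step()
```

Nothing about this needs a server farm; it fits on any consumer card, which is the whole point of the argument above.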
@mer@nosleep@kaia@hj@sun are you like actually retarded? like drooling struggle to put clothes on retarded? nobody brought up or even was concerned with """"cognition"""". your argument for your point is a really basic definition of how machine learning works. ML which has been around for decades and has improved the workflows of just about every single artist that touches a computer. you have a very very limited view of how technology works, or even how the economy works in general. i dont even know what advice i would give you since you clearly cant comprehend more than one of your own sentences without making up some bullshit for the next sentence.
@nosleep@kaia@hj@sun@why Ok listen. The tech is fundamentally weighted statistics applied to matrices of data. When you have a "working model" it means you arrived at something that will be good enough to give you satisfying output most of the time. However. There is not, has never been and will never be actual cognition involved. That is just not what the tech is. Because of this, there is always a chance of hallucinated outputs. It will never go away.
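(To make the "weighted statistics" point concrete: generation is just repeated sampling from a probability distribution the weights produce, so a fluent-but-wrong continuation is always one draw away. Toy numbers below, not a real model:)

```python
import numpy as np

# imagine the model is completing "The capital of France is ..."
vocab = ["Paris", "Lyon", "Berlin", "a_teapot"]
logits = np.array([3.0, 1.5, 0.8, 0.1])   # what the matrix math spits out

probs = np.exp(logits) / np.exp(logits).sum()   # softmax: pure weighting
rng = np.random.default_rng()
next_token = rng.choice(vocab, p=probs)         # plausible, not "known true"
print(dict(zip(vocab, probs.round(3))), "->", next_token)
```

Every token, including the wrong ones, comes out of the same mechanism; there is no separate step where anything is checked against reality.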
@mer@kaia@hj@sun@why bro i love you but have you tried feeding a few paragraphs of excel's documentation into chatgpt? it translates it to common english unreasonably well. it can absolutely parse documentation and make it more readable. it is LITERALLY what the tech is meant to do. same with everything else you mentioned. it just can't handle an entire novel or codebase because the context window is small, and there are some experiments that are trying to break those boundaries. it will happen in the next couple of years
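(The context-window workaround mer is gesturing at looks roughly like this: chunk the document to fit, run each piece through the model, stitch the results. `summarize()` below is a placeholder stub standing in for whatever model call you use, not a real library API:)

```python
def chunk(text: str, max_tokens: int = 3000) -> list[str]:
    # crude token proxy: split on whitespace and window it
    words = text.split()
    return [" ".join(words[i:i + max_tokens])
            for i in range(0, len(words), max_tokens)]

def summarize(chunk_text: str) -> str:
    # stand-in for a real LLM call; here: just the first sentence
    return chunk_text.split(". ")[0] + "."

def digest(doc: str) -> str:
    # summarize each chunk independently, then join the pieces
    return "\n".join(summarize(c) for c in chunk(doc))
```

The failure mode is obvious from the structure: anything that spans a chunk boundary gets lost, which is why "an entire novel or codebase" is still hard.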
@mer@kaia@FloatingGhost@hj@sun so in your utopia can a writer not use a LLM to proofread their novel? can a programmer not use it to script functions or parse documentation? can a digital artist who specializes in portraits not generate a background? where is the line in the sand that prevents me from generating NPC lines for a game?
@why@kaia@hj@sun No, in my "utopia" as you call it, you do whatever you want with generative AI. There is no line in the sand. You just understand that your commercial project is not gonna make money through digital sales or licensing. Either you sell physical copies, you work on commission or you get patrons. Because there is no line in the sand to stop anyone copying and distributing your work.
Also >use a LLM to proofread their novel? >use it to script functions or parse documentation? The tech we currently call AI cannot and will never do that. That is not a judgement on the current state of the tech, it's simply not what the tech does.
@why@kaia@FloatingGhost@hj@sun you can sell a game, but pirating it is legal, so it would not make you money. Without IP, creative work switches from creation >> sale to a commission system (commission >> creation) supplemented by patronage (basically patreon). This is already how most artists make money, because they are not benefiting equally from the copyright system. An artist draws a picture and it ends up on boorus whether they like it or not. you can shut down reposts but one, it'll still resurface and two, you cut your reach. So what do you do? Commissions and patronage. Yes, under this system big productions like the film industry would not survive in their current state. This is a feature, not a bug.
@mer@kaia@FloatingGhost@hj@sun >a product that you can only benefit from in the context of copyright being a thing so in the copyright free world you cant sell games? what art forms are allowed in this utopia of yours?
@hj@kaia@FloatingGhost@sun@why yes, a local model run on a personal GPU can maybe get you some funny remixed art and some text that's slightly better than markov chains. Which, if it's for non-commercial use... go nuts. I personally don't like the slop. If it's for replacing a part of the pipeline in a commercial project, like writing NPC dialogue in your indie game... you're infringing copyright in order to make a product that you can only benefit from in the context of copyright being a thing.
@mer@kaia@hj@sun@why it doesn't matter what they call it or if cognition is involved. i'm just saying what i've seen in the wild. you can do a lot with current generative models, including all of the cases you mentioned that "will never" be able to be done. this argument (that we're having, not yours specifically, because i get what you mean) is stupid anyway. it's like arguing if tesla autopilot is really true level 5 self driving. it's not, but you can do a lot with the current system if you're not a retard and use its strengths instead of blindly relying on it. same with generative ai
@mer@nosleep@kaia@hj@sun but plagiarism doesnt exist, plagiarism is ethical and all art is plagiarism. this is a fact and not up for debate since both of us support copyright abolition.
your second point is irrelevant too since were still talking about art here. if an artist has low standards and uses shitty bad dalle slop, people will stop giving them money for commissions and theyll not profit. if the artist uses the tech well and it helps them make better works, then it helps them and it works. nobody is saying chatgpt should replace civil engineers. everything in this thread is elective, non-critical artwork
@why@nosleep@kaia@hj@sun no, you are either not understanding or ignoring the obvious distinction: there are ideological problems with AI (that it is essentially a plagiarism obfuscation machine) and there are application problems with AI (that it cannot and should not be trusted for critical tasks)
@mer@nosleep@kaia@hj@sun so in your utopia you have a list of things a machine learning application can and cannot do? if you had your way would we still be employing switchboard operators?
@why@nosleep@kaia@hj@sun It is relevant, because you make the leap of logic that models that do well enough on non-critical tasks can scale up to critical tasks if you just invest more power/training data. That is a sophism.
@lebronjames75@hj@kaia@mer@why my original point is that you can do it completely locally and ethically and effectively and for a good purpose but people still go apeshit at you. because it's a moral panic
@jeffcliff@nosleep@kaia@hj@sun@why I'm gonna block you now because you are unable to parse simple scopes of discussion and that makes any talk pointless.
@meso@nosleep@kaia@hj@jeffcliff@sun@why@mer mesho, all any single person has to do is say one concept at you as a reverse psychology trick, no matter what it is, and you will become obsessed by it
@nosleep@kaia@hj@sun@why It's actually not meaningless. Because the tesla "autopilot" may look good, but anything short of total (theorized) level 5 is unacceptable when lives are at stake. Yes, there are applications for models, but proofreading and parsing documents aren't among them if the documents in question are in any way critical. And the examples you used are. If the abridged version of excel documentation changes a line of code and I end up fucking up all the business' data, that is a real loss.
The valid applications usually boil down to replacing minimum wage service workers like fast food cashiers (and even that they have trouble doing, so it's Malaysian guys in call centers disguised as AI). The important part is that the part of the pipeline you replace must be non-critical, to the point that you already expect some failure and the loss associated with it is baked into your business model.
Yeah I can safely say that people who complain about the environment, especially about server farms, are absolutely retarded and have no idea how a power grid works.
They complained about mining farms, and now AI. Give me a break. Some places run on hydro power; where's the damage to the environment when using a few more kW?
A large-scale industrial plant can easily burn through over 1,000 megawatt-hours in a day.
Most computer and server PSUs are a drop in the bucket next to that.
#1 Power is diverse. #2 The majority of all power consumption is industrial.
This is like the argument that consumers should use electric cars instead of gas-powered ones to help the environment - when, btw, most emissions are attributable to industry.
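(Back-of-the-envelope version of that comparison; the wattages and hours below are rough assumptions, not measurements from anyone in this thread:)

```python
gpu_watts = 350                     # assumed high-end consumer GPU under load
hours = 4                           # an evening of local inference or gaming
gpu_kwh_per_day = gpu_watts * hours / 1000        # = 1.4 kWh

plant_mwh_per_day = 1000            # the industrial plant figure above
plant_kwh_per_day = plant_mwh_per_day * 1000      # = 1,000,000 kWh

# one plant-day equals roughly 700,000 GPU evenings
print(round(plant_kwh_per_day / gpu_kwh_per_day))
```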
Post-Three Mile Island, a significant amount of systems engineering work has gone into modern nuclear plants to remove the complexity the humans who run them face. The difference between 'robots' and this stuff is basically purely semantic.
Same goes for NORAD and other systems that have kept us alive for 50+ years now. Whether you call it "hey hai", "robots" or just "applied statistics" - your life depends on computer systems working.
@jeffcliff@nosleep@kaia@hj@sun@why@mer Yeah, ok, but those are automated systems that operate predictably because of the nature of their design. AI is, by design, a completely different tool, and inappropriate for this purpose. There should be humans running critical systems, with computer aids (if desired).
There isn't a need for an artificial intelligence running a nuclear plant.