@graf @SuperDicq As long as it's local and you can easily turn it off, I don't have a problem with it. In a lot of these uses it's just being used as local search, which is something people want.
It's exactly this all over again. The hype will die out, and 95% of "AI" features will get removed, except the few where it makes sense and the technology is actually useful.
@sun @SuperDicq i see a huge problem with it when they're all running the same copy-paste AI code, since they're Android/iOS apps that all now feed off the same cache in the same location.
you aren't stupid. think as a Fiverr developer for even a minute before giving this a pass. AI doesn't belong in phones like this and you know it
@sun@shitposter.world It's just a more efficient way to make things that we could already make before we had the tool. There are no new ethical questions that can be asked that we couldn't already ask and apply to any other image, video or text editing tool or whatever.
@graf @SuperDicq Oh, I am talking about the local one for your personal data using an onboard AI accelerator. Apple is also integrating OpenAI though, which is potentially bad.
My opinions aren't set in stone about this yet, I can be convinced.
So, this is a strong argument for software freedom. If you accept this on your phone, you are trusting what the corporation that made your phone OS says NOW, but a year from now they could be like "it's always-on and we're moving your data to the cloud."
ai should not be storing anything about you -- apple does, microsoft does, google does, chatgpt does, anything with AI does. you know why? because you're the product. they make money selling you. you can't be this dense to think apple isn't balls deep in doing the same, truly?
@sun@shitposter.world @graf@poa.st I'm not even gonna read the context of all the previous posts. You're saying "Apple is doing it right" and I'm just gonna say no.
@sun @SuperDicq i'm sorry moon if you believe somebody who you let install an app on your phone is "only running AI locally" and "nothing leaves your phone"
@poastoak @sun @SuperDicq I got rid of every Apple device I had early this year except my MacBook, and that I set to a random throwaway email (I couldn't even tell you what it is), and they still text, email, whatever after I asked them to remove my contact details
@graf @SuperDicq @poastoak I was thinking about it anyway; I am just trying to think through the pros and cons. The real reason I got an Apple laptop in the first place was that my Linux laptop died and I needed a new laptop right away, and I chose build quality as my top priority (because the previous laptop failed).
@mer@shrine.moe @sun@shitposter.world I think I understand what you're getting at, but your point is kind of confusing because you talk about things that really shouldn't matter, like whether it's done by a corpo, commercial or non-commercial, and the mess that is "IP laws".
When it comes to the ethics of data usage in training datasets, I think personal data should not be used by anyone, period. No matter if corpo or not. This should be illegal.
When it comes to information that is publicly accessible, I think everything should be usable in a training data set. It shouldn't matter who makes this training data set, and it doesn't matter what the original author of this information thinks.
@sun @SuperDicq here's the ethics thing: no one fucking owns their training data
>but I paid for it
plenty of data brokers sell shit they don't actually own
>but all the posts on said platform are free game because of the EULA
EULAs are famously non-binding and full of unenforceable clauses that blatantly contradict legislation
>but I'm using it only for non-commercial use
some of this data may still be illegal, as it's been redistributed commercially before ending up in your tagged dataset, and wide personal data legislation like the GDPR is often infringed upon
>but I'm an IP abolitionist and IP laws are just a tool the corpos use to oppress us
precisely, it's not morally bad to run your local model on illegal datasets, but it *is* illegal, and we shouldn't let corpos that already use DMCAs like their personal attack dog have another free pass at being completely above IP laws
@SuperDicq @poastoak @sun I don't believe that either, especially not with supply chain attacks over stupid shit like "muh ukraine" in the last two years. absolutely not
@graf@poa.st @poastoak@poa.st @sun@shitposter.world If I wanted to generate something based on a chat I had, I of course would give the program access to the chat messages. How else would it be able to do its thing?
That is assuming this "AI chat" program is fully free software and doesn't send my data somewhere outside of my control. Otherwise I would not do it.
@SuperDicq @poastoak @sun okay, and when the "software handling" the AI chat, in order to ask how to respond, requires the tokens for the conversation to unlock and read it, which it will, because it's Matrix. you're okay with that?
@SuperDicq yes, I am talking about multiple scopes. I may be getting ahead of myself, but I'm anticipating the responses about illegal datasets being plainly illegal under copyright law, saying that it's okay as long as it's non-commercial use ( @sun has argued so before)
>I think personal data should not be used by anyone period
>[I think information that is publicly accessible should be usable in a training data set]
with the state of data gathering, the venn diagram of these two is almost a circle. this is why personal data legislation aims to protect certain data items regardless; also, some data items may be protected by older laws, like your image and publicity rights
I'm not a legalist and I want drastically fewer IP laws. But before even talking about the ethics of AI in a vacuum, we must think about the ethics of AI in context. And in context, AI is an IP law obfuscation machine.
>corpo takes what it wants
>puts it in a blackbox
>the output cannot be meaningfully used as evidence of IP law infringement
>the dataset has been hashed or deleted or is classed as industry sensitive
@SuperDicq @poastoak @sun you might be the most retarded person i've ever met, like so much so that you might be a soothsayer for other retards. have you ever considered a job in soothsaying?
@fkq1q2r2@noauthority.social
>You can't legislate morality
You can and we do that all the time. A lot of morality is universal. Pretty much the entire world agrees that "murder is bad", etc.
>it has to be engrained in you as a child.
I don't think that is true at all. I don't feel like people just stop learning at a certain age, you're always learning. Having a lack of proper education and a stable environment as a child does mean you will start off very behind everyone else when you reach adulthood.
>That is why the protection and continuation of the nuclear family is vital to a healthy society.
I don't think the nuclear family is the only valid family formation. I actually think the exact formation of the family does not matter whatsoever as long as the environment for the child is stable and supportive.
@SuperDicq You can't legislate morality, it has to be engrained in you as a child. That is why the protection and continuation of the nuclear family is vital to a healthy society.