GPLv3 was designed to close the loophole of shipping products that run locked-down Linux: releasing the source code to satisfy the license, but not letting you run modified versions on the real hardware.
In reality it just accelerated the rise of BSD-style licenses and kept critical projects like BusyBox and Linux on v2. There were also the controversial BusyBox lawsuits, which led to "no GPL in userspace" policies and the 0BSD-licensed replacement Google uses.
@djsumdog@PhenomX6@thenewoil it depends on the version, because technically, if it was GPLv2, Apple was in breach of the license (if my memory is right, GPLv2 code cannot be used in a proprietary system).
I did not know about that nvidia leak/presentation. It's crazy that one of the major reasons Clang/LLVM exists is licensing. Apple has also been systematically removing as much GPL code as possible from their base operating system for the past decade. That's why they shipped an ancient version of Bash with backported security patches for so long before finally switching to zsh.
I still believe in the GPL. Everything I write is AGPLv3 these days.
oh I agree with you, and I think that blog post does too. The author was going on what they thought Microsoft’s argument would be, which is total BS. … I kinda wonder what happens to stuff that’s already been written with Copilot. I wonder how many companies have banned it internally just due to these legal reasons. I know at my current company, we’re not allowed to use any libraries that have GPL code or pull in GPL dependencies (we have a CI task to check).
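A CI gate like the one mentioned above can be sketched very simply. This is a hypothetical minimal version, assuming you already have a flat list of (package, license) pairs from some dependency-scanning tool; the `find_blocked` helper and the blocked-prefix list are illustrative, not any specific company's policy:

```python
# Minimal sketch of a CI-style license gate. Assumes a flat list of
# (package, license-identifier) pairs produced by a dependency scanner;
# the prefixes and example deps below are purely illustrative.
BLOCKED_PREFIXES = ("GPL", "AGPL", "LGPL")

def find_blocked(deps):
    """Return the (name, license) pairs whose license matches a blocked prefix."""
    return [(name, lic) for name, lic in deps
            if lic.upper().startswith(BLOCKED_PREFIXES)]

deps = [
    ("requests", "Apache-2.0"),
    ("readline-clone", "GPL-3.0"),
    ("leftpad", "MIT"),
]

violations = find_blocked(deps)
if violations:
    # In a real CI task you would exit nonzero here to fail the build.
    print("Blocked licenses found:", violations)
```

A real pipeline would generate `deps` with a tool rather than hardcoding it, and would also need to handle transitive dependencies and dual-licensed packages, which a prefix check like this glosses over.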
@djsumdog@PhenomX6@thenewoil yeah, but "all rights reserved" means ALL rights reserved, so that argument can go right in the trash. Because by that logic, even printing the code on toilet paper would be a breach of the license.
woah .. I was gonna say I don’t put a lot of stock in DeVault (he’s an ass), but he’s right about the licenses, and the video is … pretty telling
Microsoft’s argument relies heavily on the claim that the model becomes a general-purpose programmer, having meaningfully learned from its inputs and applying this knowledge to produce original work. Should a human programmer take the same approach, studying free software and applying those lessons, but not the code itself, to original projects, I would agree that their applied knowledge is not creating derivative works. However, that is not how machine learning works. Machine learning is essentially a glorified pattern recognition and reproduction engine, and does not represent a genuine generalization of the learning process.
Same. I haven't even tried Copilot, but I hate the idea that they may have trained it on everything people have contributed.
You'd hope they limited it to repos with permissive licenses or Stack Overflow posts, but we'll have to see if this case goes forward and what comes out in discovery. I'm going to guess they did not.