@feld On a similar note, it still sort of holds to this day.
ZFS, originally made by Sun and later released under a non-GPL-compatible license, is still used a lot for archival storage. Its GPL replacement, Btrfs, originally made by an Oracle employee, is mostly in maintenance mode after Oracle largely abandoned the project. SUSE still maintains it and uses it as the default filesystem in SLES, but it's still half-finished after all these years, with many pitfalls.
XFS - originally made by SGI, now developed by Red Hat
ext* - possibly the only GPL filesystem that is used a lot outside of specialized needs (inspired by the MINIX filesystem and UFS)
JFFS2 - updated version of a filesystem made by Axis Communications
F2FS - developed by Motorola, Samsung, Huawei, Google
All of the examples except one, or maybe two, wouldn't exist without a company first developing them for its own commercial needs.
LVM is the same. If it weren't for Red Hat's use of it everywhere, it definitely wouldn't be as usable as it is today.
BSD? Do you want a storage appliance with support? You go to iXsystems or Oracle. One primarily uses FreeBSD, the other bought the company that built the FS in the first place. Other users: Netflix, and literally every enterprise firewall that doesn't use purpose-built hardware for high-speed throughput.
Illumos: nobody, same as all the other operating systems made to replace discontinued proprietary ones (ArcaOS and the like).
@sendpaws @phnt Lots, it's just never talked about publicly. A few off the top of my head for BSD:
Cisco, Juniper, Dell/EMC, NetApp, Netflix, Limelight, Nintendo
Illumos is something I am less familiar with. It probably lives on in some industrial controls type companies.
I wouldn't call Sony's usage "license dodging". They have taken FreeBSD and completely transformed it into their console's Orbis OS. It still has jails, which are used to launch games; they ripped out a bunch of syscalls from the kernel and added new ones, plumbed in changes to optimize it for gaming, etc. Even if they released the source it would be pretty much unusable by anyone else.
One thing people don't realize is that videogame consoles don't behave like a PC -- they don't have graphics drivers the way a PC does, and it's that driver architecture that actually makes GPU usage on a PC slower / less efficient. Although my knowledge about this is aging and it's possible modern PCs have done things to close this gap...
@feld @phnt > One thing people don't realize is that videogame consoles don't behave like a PC -- they don't have graphics drivers the way a PC does, and it's that driver architecture that actually makes GPU usage on a PC slower / less efficient. Although my knowledge about this is aging and it's possible modern PCs have done things to close this gap...
It hasn't been that way for a while, especially not after the whole "SDK on the console" approach that Sega and Microsoft used with the DC and Xbox. You just never have to touch the GPU driver part unless you're writing an SDK for it. That's why the DC has its own open SDK: https://github.com/KallistiOS/KallistiOS
The PS4/Xbox One definitely run OSes. The Xbox runs an NT kernel, the PS4 runs BSD.
As for license dodging, it's so they never have to release a single line of code back.
@sendpaws @phnt You're misunderstanding. I don't mean that they don't have an OS; it's that they don't have the same type of "driver" that you would install on Windows or Linux (or FreeBSD) to expose an API (DirectX, Vulkan, etc.) for talking to the GPUs.
I'm trying to think of who I know that can speak with some authority on this. I suspect @TTimo can clarify and correct me on the major differences that go beyond hardware
I know @thendrix has done plenty with consoles and so has @icculus but dunno if they have time to chime in on this rabbit hole we've gone down (how you access the GPU in a console vs PC at a low level)
@sendpaws @phnt @TTimo When you use DX you're not talking directly to the GPU, you're talking to a library that then figures out how to do that for you (see the sketch at the end of this post).
Pulling a random comment from an old HN thread because the forum links with more details are dead, sadly:
> But you are also wrong in the larger picture. Direct3D and OpenGL are first and foremost abstraction layers to access the GPU. Since in a console the hardware is immutable, you can gain a lot of performance by skipping (or trimming the fat of) these abstractions.
> The XBox 360 version of DirectX is very different from the PC version: it's much much closer to the metal and exposes pretty much all the GPU functionalities.
also:
> "Sony is building its CPU on what it's calling an extended DirectX 11.1+ feature set, including extra debugging support that is not available on PC platforms. This system will also give developers more direct access to the shader pipeline than they had on the PS3 or through DirectX itself. "This is access you're not used to getting on the PC, and as a result you can do a lot more cool things and have a lot more access to the power of the system," Norden said. A low-level API will also let coders talk directly with the hardware in a way that's "much lower-level than DirectX and OpenGL," but still not quite at the driver level."
On the PS4 there's an API called GNM; the PS3 had one called GCM.
> "At the lowest level there's an API called GNM. That gives you nearly full control of the GPU. It gives you a lot of potential power and flexibility on how you program things. Driving the GPU at that level means more work."
@feld @TTimo @sendpaws Regarding DirectX and Windows/Xbox, there are multiple abstractions at play. There's the DirectX API with its library, there's the userspace driver that handles the DirectX calls made from the Windows runtime and translates them into something the NT GPU driver can understand, and then there's the kernel driver, which finally talks to the GPU. That's how this works on PCs, and I doubt Microsoft reengineered this whole behemoth specifically for the Xbox, since they've shared the same kernel since Windows 10.
Incidentally, that's how Intel was able to emulate Dx9 on the first-gen Arc GPUs. The userspace driver advertised Dx9 support and emulated it completely in software. The kernel driver had no idea what Dx9 was.
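As a toy illustration of that userspace-translation idea (the names Dx9Shim, KernelCmd, and the opcode are all made up here, not Intel's actual code): the user-mode driver can advertise a legacy entry point and rewrite every call into the command format the kernel-mode driver actually speaks, so the kernel side stays Dx9-agnostic.

```cpp
// Hypothetical sketch of a userspace Dx9 translation shim.
#include <cstdio>
#include <vector>

// Made-up command format that the kernel-mode driver accepts.
struct KernelCmd { int opcode; int args[3]; };
static std::vector<KernelCmd> submissionQueue;  // stand-in for the real ring buffer

// A Dx9-style call the application believes it is making.
void Dx9Shim_DrawPrimitive(int primType, int startVertex, int primCount) {
    // Translate the legacy call into the modern command stream;
    // the kernel driver never sees anything "Dx9" at all.
    submissionQueue.push_back({/*opcode=*/0x42, {primType, startVertex, primCount}});
}

int main() {
    Dx9Shim_DrawPrimitive(/*D3DPT_TRIANGLELIST=*/4, /*startVertex=*/0, /*primCount=*/1);
    std::printf("queued %zu translated commands\n", submissionQueue.size());
    return 0;
}
```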
For PlayStation 4, I remember reading a post, or maybe watching a video, about how the games have more-or-less direct access to GPU memory, but I'm also likely misremembering that since that was years ago, when the first PS4 jailbreaks popped up.
> That's how this works on PCs, and I doubt Microsoft reengineered this whole behemoth specifically for the Xbox, since they've shared the same kernel since Windows 10.
I suspect they were able to change the driver model in the Xbox's forked NT kernel to remove some barriers that affect performance on PC, since you don't need the same security features there. Untrusted code is not executed on the Xbox, only games.
@feld @phnt @sendpaws There was a second company selling Illumos VMs too, but I checked it out one time and it was nonfunctional; nobody writing articles about it ever actually tried it.
@feld @TTimo @sendpaws Maybe, it's possible, but the whole OS and the games also run in VMs (two VMs, actually), which probably affects performance more than removing security barriers in the render pipeline would gain back.
@mischievoustomato @phnt they also destroy a lot of good stuff by taking it to the grave with them. It's sad, but this is the cost of business.
If you want investment in groundbreaking technologies, you have to accept that they're going to patent them and squeeze as much as they can out of them. If nobody wants to pay, they just disappear forever.
@mischievoustomato @phnt capitalism is an incredible human motivator to solve problems. It can also completely destroy people and cause massive inequality, but it doesn't *have* to.
We can have a healthy society AND still reap the benefits of capitalism.
@feld @phnt @mischievoustomato there aren't really any benefits to capitalism, because what capitalism actually means is that rich guys can own capital, which basically means they control the world
@dick @phnt @mischievoustomato the advanced processors, the electronic components, pretty much all consumer electronics, the entire network infrastructure, etc. that we are using would not have been built in a fantasy world of communists, because duplication of effort and the risk of potential waste from failed experiments is not exactly something that fits the communist worldview.
But his public "have you ever even kissed a girl?" comment will live forever in my brain as one of the funniest fuckups a famous tech guy could make. He just needs to wear it as a badge of honor and admit youth comes with consequences
@feld @phnt @sendpaws he's apologized for it repeatedly but he's said actually-detestable stuff since then that I won't go into because I don't have proof on hand
@sun @phnt @sendpaws demanding a pronoun MUST be gendered is pretty psychotic
rejecting a pronoun change because it provides zero value to the codebase (e.g., relevant code wasn't refactored so there is no need to update the code comments) is fine IMO.
@feld @phnt @sendpaws I actually personally prefer to use "they" even when I know someone's gender, so I agree. But you might be surprised how many people consider using "they" instead of their preferred "she" to be "transphobic". No, you're just not special, and a lack of gendered pronouns is not "misgendering" you.
@dick @phnt @mischievoustomato I know that material, and I also know this material: there's a purity test, and the other communism wasn't pure enough. A pure communism would surely succeed :mmmhmm:
@feld @phnt @mischievoustomato what I also say to you is that it easily could happen under Communism, but as you should know, if you are actually educated, all "communist" countries have not been communist at all but only socialist. There is a difference if you actually know the material
> If you want investment in groundbreaking technologies, you have to accept that they're going to patent them and squeeze as much as they can out of them. If nobody wants to pay, they just disappear forever.
>> If nobody wants to pay, they just disappear forever.
> Optane 😩
Yeah, Optane was a product misunderstood by many. I still cherish the few functioning drives at work for meaningful usage.