@lunareclipse And if your problem motivating better compression is something like "environmental impact of downloads from kernel.org", your problem isn't insufficient compression. It's the mentality of putting CI shit everywhere (and now AI scrapers everywhere) downloading the same thing a million times. If you make the data smaller with better compression, they'll just download it more times.
@dalias @lunareclipse One of the reasons CI does that is that caching things is so inconvenient nowadays. CI boxes are stateless by design, and using a network cache requires either an MITM certificate or changing the URLs one uses. Better tooling for this would really help (rough sketch of the idea below).
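Something like this, say: a download helper that keys a local cache on the URL, so repeated runs on the same box (or with a mounted cache volume) fetch a tarball once instead of hitting the origin every time. This is just a minimal sketch of the idea, not any existing tool; the cache location is an arbitrary choice, and a real version would verify checksums/signatures before trusting a cached copy.

```python
import hashlib
import shutil
import urllib.request
from pathlib import Path

# Hypothetical cache location; a CI setup would mount this as a
# persistent volume so it survives across otherwise-stateless runs.
CACHE_DIR = Path.home() / ".cache" / "ci-downloads"

def fetch_cached(url: str) -> Path:
    """Return a local path for `url`, downloading only on a cache miss."""
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    # Key the cache entry on the URL itself; a real tool would also
    # check a known checksum or signature before reusing the file.
    key = hashlib.sha256(url.encode()).hexdigest()
    cached = CACHE_DIR / key
    if not cached.exists():
        tmp = cached.with_suffix(".part")
        with urllib.request.urlopen(url) as resp, open(tmp, "wb") as out:
            shutil.copyfileobj(resp, out)
        # Rename only after the download completes, so a partially
        # fetched file is never mistaken for a valid cache entry.
        tmp.rename(cached)
    return cached

# Point the build at the cached copy instead of re-downloading:
tarball = fetch_cached("https://cdn.kernel.org/pub/linux/kernel/v6.x/linux-6.1.tar.xz")
```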
@alwayscurious @lunareclipse It's the responsibility of folks wanting to use CI not to externalize costs onto volunteer projects/maintainers, or onto the world at large, through bad behavior. If fixing this is too much of a burden, then don't use CI.