So in another thread there was a discussion of how the law came about, and it raised a bunch of questions about history and anthropology for me. As a matter of anthropology and history, which came first: codified laws or states? #anarchy #anthropology
My assumption, based on light historical reading, is that societies functioned well without codified laws for millennia, and that even millennia after the invention of states much of law wasn't codified; it was essentially negotiated ad hoc. I mean, the Magna Carta was 1215, right?
The difference between an #Anarchist and a #Liberal is that a Liberal looks at what's happening in the US and thinks "oh my God, it's all falling apart", and an Anarchist looks at what's happening and says "oh my God, it's all coming together".
The crisis in scientific publishing is at its core another area of tension between authoritarian institutionalism and #anarchy. The people calling for "rigorous peer review" and for upholding "high editorial standards" and such are basically trying to uphold a hierarchical view of society: there are "the people who are authorities" and "the little people who must bow to authority". It's trash, in the same way that capitalism is trash and a state run by elected elites is trash.
LOL I woke up to my mentions all blown up with a bunch of messages and wondered what controversial thing I'd said right before bed, but it was just tea nerds geeking out on the physics of power distribution and electric teakettles. Keep up the good work, fedi.
@GhostOnTheHalfShell The problems of enshittification are already there even with unicast updates. At the moment the technical problem of updates is solved with vast networks of data centers. Updating a billion Android phones is not a cheap process for Samsung or Motorola; they're doing it at the cost of megatons of carbon release per year. We are talking about potentially reducing that to a marginal cost of about zero. A Raspberry Pi could do it. @dentangle @librecast @onepict
Suddenly streaming things to "the world" is completely democratized. That is, any person who can afford a cell phone can become a provider of information to a practically unlimited number of viewers, without a massive trillion-dollar corporation intermediating that access.
@GhostOnTheHalfShell Multicast is a really neat tech that not only allows for greater efficiency in some circumstances but also allows greater control over how and what the endpoint consumes. There are a bunch of human and political implications, and certainly pushback to be expected from commercial interests. @dentangle @librecast @onepict
A graphical comparison of the distributions of time-to-completion for naive multicast vs. TCP unicast, across multiple experiments under three different start schedules. (y-axis is the logarithm of time in hours)
TCP clearly benefits from a staggered start. For multicast, the schedule doesn't actually seem to matter.
This is the wall-clock duration per client, I believe measured from the end of its initial wait time until it finished collecting the file. It would also be interesting to see the duration of the entire experiment in each condition.
@GhostOnTheHalfShell Oh for sure. For example, an interactive control panel to enable or disable certain streams is a great idea. No one wants billionaires to push harmful updates on them. Of course existing unicast updates have the exact same issue, so it's not a multicast problem per se. @dentangle @librecast @onepict
@GhostOnTheHalfShell But the idea of doing software updates via multicast is a good foot in the door because it has clear benefits for everyone. Imagine you've got a stadium full of vulnerable IP cameras. No one wants to click "update" on each of 2000 web control panels. And no one wants a botnet of IP cameras. Also no one wants traffic choking their network, or to buy a beefy server just to handle the thundering herd of cameras during updates. @dentangle @librecast @onepict
@GhostOnTheHalfShell For example, instead of streaming the 4-5 parallel tracks at JuliaCon to YouTube via Zoom (two proprietary services), suppose the conference organizers could stream 20-30 camera angles and associated audio streams to the global internet with no intermediary. And archive.org could pick them up, but so could your phone or desktop. And each participant gets their own custom subset of info. @dentangle @librecast @onepict
@GhostOnTheHalfShell Nothing about multicast makes that difficult. For example each camera could listen to one of 10 randomly selected multicast groups. Then you update one group at a time.
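A minimal Python sketch of that scheme from the camera's side (the 239.0.0.x group addresses, the port, and joining at boot are all assumptions for illustration, not anything a real vendor ships):

```python
import random
import socket
import struct

# Hypothetical: ten administratively scoped groups the vendor announces.
UPDATE_GROUPS = [f"239.0.0.{i}" for i in range(10)]
UPDATE_PORT = 5000  # assumed port for the firmware stream

def join_random_update_group():
    """Camera side: pick one of the 10 groups at boot and listen on it.
    The updater then pushes firmware to one group at a time, so only
    ~10% of cameras are mid-update at any moment."""
    group = random.choice(UPDATE_GROUPS)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", UPDATE_PORT))
    # Ask the kernel to join the chosen group (0.0.0.0 = default interface).
    mreq = struct.pack("4s4s", socket.inet_aton(group),
                       socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return group, sock
```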
Back to the question of the internet... There could be 20 million sites with cameras. The manufacturer would clearly prefer to multicast its firmware updates to those 20M clients rather than run a beefy network of hundreds of servers to deal with 20M clients all wanting updates. @dentangle @librecast @onepict
@GhostOnTheHalfShell Existing installs of large numbers of IP cameras are indeed likely to be managed by a big firm and such. The future doesn't demand that stay the same. There are plenty of use cases for software updates by multicast: global Windows OS updates, consumer smart-home devices, ISP router firmware, heterogeneous devices on a university campus (printers, cameras, student-supplied equipment in dorms, etc.), smart TVs, Android phones... @dentangle @librecast @onepict
A single self-contained IP camera sending its stream to a multicast destination could be received by, say, 10,000 viewers. Let's call it a watt of power consumption. When 10,000 people watch it on YouTube, a server must send a separate copy to every viewer. Suppose it's a 3 Mbps stream: one camera at one watt, versus a cluster of servers capable of originating a net 30 Gbps of stream to those 10,000 viewers. Maybe that's 30 servers.
My understanding is that OTA updates for Android phones can take 6 months to roll out, because each device checks for updates but isn't allowed to download them until it wins a lottery or something (a random number generator on the server?).
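If it really is a lottery, a stateless version is easy to imagine. This is pure speculation about the mechanism; the hash bucketing and fraction are made up for illustration:

```python
import hashlib

def in_rollout(device_id: str, rollout_fraction: float) -> bool:
    """Speculative sketch: hash the device id into one of 10,000 buckets
    and admit the device once the server's published rollout fraction
    passes its bucket. Raising the fraction over weeks staggers the
    herd with no per-device state on the server."""
    bucket = int(hashlib.sha256(device_id.encode()).hexdigest(), 16) % 10_000
    return bucket / 10_000 < rollout_fraction
```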
RTP would be a likely protocol to carry the video and audio, the only difference being that packets would be sent to a multicast address instead of to your individual desktop.
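A rough sketch of the sender side in Python (the group address, port, and payload type are placeholders; a real sender would packetize frames per the codec's RTP payload format):

```python
import socket
import struct
import time

MCAST_GROUP = "239.1.2.3"  # placeholder administratively scoped group
RTP_PORT = 5004            # conventional RTP port

def send_rtp_packet(sock, payload: bytes, seq: int, ssrc: int = 0x1234):
    """Wrap a payload in the minimal 12-byte RTP header and send it to
    the multicast group; receivers just join the group instead of
    connecting to a server."""
    ts = int(time.time() * 90_000) & 0xFFFFFFFF  # 90 kHz clock, typical for video
    # First byte 0x80: version 2, no padding/extension/CSRC.
    # Second byte 96: a dynamic payload type.
    header = struct.pack("!BBHII", 0x80, 96, seq & 0xFFFF, ts, ssrc)
    sock.sendto(header + payload, (MCAST_GROUP, RTP_PORT))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# TTL > 1 so the stream can cross router hops (the kernel default is 1).
sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 16)
```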
And at 300 watts each, let's call it 9 kilowatts, or 9,000 times the energy consumption. And the camera costs $50 versus maybe $50,000 for the 30 servers, so about a thousand times less infrastructure investment.
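The back-of-envelope math from the last two posts, spelled out (every input is one of the rough guesses above, not a measurement):

```python
viewers      = 10_000
stream_mbps  = 3
camera_watts = 1        # guess for one self-contained IP camera
servers      = 30       # guess: enough to originate the aggregate stream
server_watts = 300      # guess per server

aggregate_gbps = viewers * stream_mbps / 1000   # 30.0 Gbps of unicast egress
unicast_watts  = servers * server_watts         # 9,000 W, i.e. 9 kW
power_ratio    = unicast_watts / camera_watts   # ~9,000x the energy
cost_ratio     = 50_000 / 50                    # ~1,000x the hardware cost
```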
The Librecast project is some smart and socially motivated people who started working on these questions about 10 years ago. So you're not wrong. The motivation isn't just about reducing carbon, though; it's also about enabling freedom. @dentangle @librecast @onepict
Applied Mathematician, Julia programmer, father of two amazing boys, official coonhound-mix mutt-walker. PhD in Civil Engineering. Debian Linux user since ca. 1994. Bayesian data analysis iconoclast.