@Evan Prodromou Which was exactly my point. Without a way to verify that the new key actually belongs to the same actor, just refetching from the same URL isn't a solution.
There must be a way to verify rotated keys that doesn't involve trusting the implementation of the remote endpoint.
@Evan Prodromou Let's assume another scenario: https://social.example/actor/a has deleted their identity for some reason. Somebody else registers on the same instance, using the same handle as actor/a. They now have the same handle, but a different key. The URL to the actor is the same.
How could a recipient (actor/b) determine which of these scenarios actually occurred?
@Evan Prodromou "the best practice is to refetch the sender's key, because it's probably been rotated"
Refetching is probably fine as long as you know you can trust the source. So from what I understand, this brings it back to the origin: how can you verify that a key has actually been rotated, and that the message isn't from some impersonator?
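To make the question a bit more concrete, here's a rough sketch of the refetch-and-verify step in Python (assuming the requests and cryptography libraries, RSA keys, and the usual publicKey/publicKeyPem layout in the actor document; the function names are just illustrative, not from any particular implementation):

    import base64
    import requests
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    ACCEPT = 'application/ld+json; profile="https://www.w3.org/ns/activitystreams"'

    def fetch_public_key_pem(key_id: str) -> str:
        # Refetch the document behind the signature's keyId and pull out
        # the PEM-encoded public key it advertises *right now*.
        doc = requests.get(key_id, headers={"Accept": ACCEPT}, timeout=10).json()
        key = doc.get("publicKey", doc)  # the key may be embedded in the actor document
        return key["publicKeyPem"]

    def signature_matches(key_id: str, signed_string: bytes, signature_b64: str) -> bool:
        # Verify the HTTP signature against the freshly fetched key.
        pem = fetch_public_key_pem(key_id)
        public_key = serialization.load_pem_public_key(pem.encode())
        try:
            public_key.verify(
                base64.b64decode(signature_b64),
                signed_string,
                padding.PKCS1v15(),
                hashes.SHA256(),
            )
            return True
        except Exception:
            return False

Note that this only proves the signature matches whatever key the origin serves at this moment, which is exactly the problem: a rotated key and a re-registered actor at the same URL both pass this check.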
==============================
Test summary
==============================
   TEST                                     TOTAL  PASS  FAIL ERROR  SKIP
>> jtreg:test/hotspot/jtreg:tier1            3033  2720     9     0   304 <<
==============================

When I started on this project, we had about 100 failing Hotspot tests on FreeBSD, in addition to around 40 in the rest of the JDK. Getting below 10 in total (on x86_64) feels like a significant milestone, and worthy of a bit of celebration! 🎉
It's been an interesting, and very educational ride. Some of those tests were pretty easy wins, but some required delving deep into the internals of both OpenJDK and FreeBSD, as well as getting acquainted with the basics of the ARM architecture and instruction set. (Remembering how fascinated I was when the Acorn Archimedes was launched, I'd say this was long overdue!)
I finally feel that the OpenJDK BSD port is nearing a state where it makes sense to try to upstream it and get it fully integrated into the OpenJDK infrastructure and build/test/CI frameworks. There's still a lot of work remaining; it has to be done in portions and in cooperation with the upstream project, but I hope to be able to spend the next six months or so getting there.
Thanks a lot to the welcoming and supportive OpenJDK developer community, as well as the @FreeBSD Foundation and the people there for sponsoring and supporting the project, and for providing help and insights about the FreeBSD internals when I got stuck.
@Børge @Jon Be aware that Aurora has no way to remove the spyware that's baked into practically everything you get through Google's app repo.
Dennis Schubert wrote the following post on Fri, 27 Dec 2024 01:20:02 +0100:

Excerpt from a message I just posted in a #diaspora team internal forum category. The context here is that I recently got pinged about slowness/load spikes on the diaspora* project web infrastructure (Discourse, Wiki, the project website, ...), and looking at the traffic logs makes me impressively angry.

In the last 60 days, the diaspora* web assets received 11.3 million requests. That equals 2.19 req/s - which honestly isn't that much. I mean, it's more than your average personal blog, but nothing that my infrastructure shouldn't be able to handle.
However, here's what's grinding my fucking gears. Looking at the top user agent statistics, here are the leaders:
2.78 million requests - or 24.6% of all traffic - are coming from Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; GPTBot/1.2; +https://openai.com/gptbot).
1.69 million requests - 14.9% - Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/600.2.5 (KHTML, like Gecko) Version/8.0.2 Safari/600.2.5 (Amazonbot/0.1; +https://developer.amazon.com/support/amazonbot)
and the list goes on like this. Summing up the top UA groups, it looks like my server is doing 70% of all its work for these fucking LLM training bots that don't do anything except crawl the fucking internet over and over again.
Oh, and of course, they don't just crawl a page once and then move on. Oh, no, they come back every 6 hours because lol why not. They also don't give a single flying fuck about robots.txt, because why should they. And the best thing of all: they crawl the stupidest pages possible. Recently, both ChatGPT and Amazon were - at the same time - crawling the entire edit history of the wiki. And I mean that - they indexed every single diff on every page for every change ever made. Frequently with spikes of more than 10 req/s. Of course, this made MediaWiki and my database server very unhappy, causing load spikes, and effective downtime/slowness for the human users.
If you try to rate-limit them, they'll just switch to other IPs all the time. If you try to block them by User Agent string, they'll just switch to a non-bot UA string (no, really). This is literally a DDoS on the entire internet.
Just for context, here's how sane bots behave - or, in this case, classic search engine bots:
@Erik This should have been illegal even before these changes. The GDPR is clear that explicit consent is required, which is definitely not the case with "By continuing to use this website, you consent to our use of cookies."
These are, as you can see, for the C++ version, but the visual builder parts were similar. It was just more elegant with Smalltalk, as you didn't need a compile cycle, and the language lends itself better to this type of dynamic coding.
@alcinnz Have you ever looked at the VisualAge Smalltalk system by IBM? I think that was a pretty interesting approach at the time. It was very centered around UI design, and attaching actions to the various UI elements.
They tried applying the same approach to other languages later, like Rexx and C++, but especially the latter didn't work as well conceptually.
There are GUI design tools today that look somewhat similar to this, but they don't really reach the same level of elegance that the original VisualAge Smalltalk system did. At least that's how I remember it. It's been a few years...
Have spent a significant part of the past week in the dark bowels of WordPress doing a much-needed visual upgrade of my band's website: https://imbalance.no/.
The result isn't all bad, I hope. After wasting way too much time on the useless "full site editing" feature, I had to go back to the good(?) old ways of classic templates to get the result I wanted. It's not entirely finished yet, and more stuff will come, but for now I felt it was time to get something done, as we're preparing for a few live shows in the first half of next year.
@lfa Really interesting doc! Lots of stuff I didn't know.
The MSX machines were pretty cool too, I remember. Didn't have one myself, but an older uncle had one that I could play around with a bit. My first computer was a Dragon 32 (similar to the Tandy Color Computer in the US), but later I got a used Intertec Superbrain running a weird "multiuser" CP/M variant. Computers were so much more fun in those days!
@lfa Oo, that looks interesting. I remember when the Lisa came out, and while it looked fascinating at the time, there was no way a kid like me could afford one.
Anyways, thanks for the tip! This might be tonight's entertainment :)
@Aral Balkan You know where to find me, if the need arises again :) If I can support your work by providing some resources I have available anyway, that's the least I can do!
@alcinnz It wasn't all that difficult tbh. Probably easier than today. We drew pixels into off-screen memory areas, and when it was time to display them they were copied to the proper area in display memory. With the low resolutions and color depths at the time, these weren't huge operations.
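Roughly the same idea in a little Python sketch, just to illustrate (the flat 8-bit 320x200 framebuffer and byte-per-pixel layout are assumptions for the example, not how any specific machine back then laid out its video memory):

    WIDTH, HEIGHT = 320, 200

    back_buffer = bytearray(WIDTH * HEIGHT)     # off-screen area we draw into
    display_memory = bytearray(WIDTH * HEIGHT)  # stand-in for the visible screen

    def put_pixel(buf: bytearray, x: int, y: int, color: int) -> None:
        # Plot a single pixel into an off-screen buffer.
        buf[y * WIDTH + x] = color

    def blit(src: bytearray, dst: bytearray) -> None:
        # Copy the finished frame into display memory in one go.
        dst[:] = src

    # Draw a diagonal line off-screen, then flip it onto the "screen".
    for i in range(HEIGHT):
        put_pixel(back_buffer, i, i, 15)
    blit(back_buffer, display_memory)

Back then this would of course have been assembly or C poking at real video memory, but the draw-off-screen-then-copy pattern is the same.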
Allow me the impertinence to begin by summarising your otherwise Nobel-worthy piece of literature: In a nutshell, you say we should talk like you, fund like you, and disregard human rights like you. [...]
Excuse me if this does not appeal to me.
Has it even occurred to you that perhaps — just perhaps — we are not failing at being like Silicon Valley but that we do not want to be like Silicon Valley?
Have you ever wondered if maybe we can do better than to adopt an excessively greedy system of venture capital and exits that funds spyware and treats people as capital to be bought and sold?
This piece from @Aral Balkan is about ten years old, but sadly still relevant. (Although the context may have changed slightly.) Well worth a read!
@Perpetual Beta 🇺🇦 I'd find it a bit weird to list the shortcuts of various desktop environments as Linux-specific too. Most of these DEs run fine on other Unix-like systems as well, so they're not limited to running on top of one particular kernel.
Metalhead, programmer and pagan. Initiator and main force of Norsk Urskog, vocalist and bass player in the thrash metal band Imbalance, bass player in Blastered, supporter of free software, Certified Information Systems Security Professional (CISSP) and keen privacy advocate.