Ok, YouTube killed a tool that is essential for watching videos on Invidious (inv_sig_helper). Because of this, my server practically exploded: Invidious kept trying to reconnect to the tool, which can no longer start at all *due to YouTube changes*. There is nothing I can do (for now). I will try to find a way to fix it, but since that tool is written in Rust, I'm 100% clueless about how to fix it, and the fix is certainly not an easy task. So yeah, Invidious is dead until a new update.
@lxo @nemobis @fijxu It's not a matter of running software from strangers.
It's a matter of having to run software without being able to even check it - browsers are designed to execute JavaScript without giving you the opportunity to check it first.
thanks to librejs and noscript, I could check it first. but it would still be running under someone else's control, so it's hardly distinguishable from any nonfree software. even if it's nominally free software, it's not free for me in that setting. it's like tivoized software.
the key point is that it's coming from, and is thus under control of, someone else's (strangers') servers, rather than from my own.
if it were running under control of my server, then it would be under my control, i.e., it would be free for me.
running under control of someone else's server, it may be free for the server operator, but not for me, so it's not welcome to my computers
@lxo @nemobis @fijxu
>thanks to librejs and noscript, I could check it first.
Unfortunately, even with those you could not check it first, due to the way the JavaScript engine is designed.
There is no guarantee that the extensions will run before the JavaScript starts running, and JavaScript constructs that bypass them keep being found.
The only way to be safe against site JavaScript in Firefox is to go to about:config and set javascript.enabled=false.
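(For what it's worth, the same preference can be pinned persistently with a `user.js` file in the Firefox profile directory, using standard Firefox preference syntax, so it survives restarts:)

```js
// in <profile directory>/user.js - reapplied at every startup
user_pref("javascript.enabled", false);
```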
>but it would still be running under someone else's control
Browsers could give you the opportunity to check all JavaScript before it is executed, to save a version you prefer and use only that version, or to run a different program with that website instead if you want - but such functionality has not been implemented, and it has intentionally been made needlessly difficult to implement.
yes, browser support for user-chosen replacements would be ideal
I hope the concern about javascript potentially running before noscript and librejs kick in is merely theoretical. do you have any evidence you could share that it actually hits in practice?
https://blog.torproject.org/new-release-tor-browser-907/
>Open about:config. Search for: javascript.enabled. The "Value" column should show "false". Either right-click and select "Toggle" such that it is now disabled, or double-click on the row and it will be disabled.
>We are taking this precaution until we are confident recent NoScript versions successfully block Javascript execution, by default, by working around a Firefox ESR vulnerability.
Against remote scripts, I am reasonably confident that NoScript is adequate, as they don't seem to ever be downloaded unless you manually allow them, but I'm not sure about specifically crafted JavaScript in <script></script> tags - it wouldn't surprise me if more ways kept being found that allow malicious JavaScript execution in script tags while javascript.enabled=true.
Someone mentioned that they had found LibreJS executing unlicensed JavaScript that was encoded via a certain method, but they didn't share further details (it seems they were telling the truth).
To have proper security guarantees with this sort of thing, you would need Firefox with the JavaScript engine completely disabled (too bad extensions and a bunch of other things rely on it - you can only really get that sort of thing in NetSurf).
presumably tagging the javascript as free software, as recognized by LibreJS, would make it work for those who only use LibreJS for javascript blocking.
but really, this entire approach is backwards. I mean, I know scraping bots are making server operators miserable, but the solution to that can't be to make everyone else miserable, slowing things down, wasting computing resources and rendering old computers unusable.
the premise of blocking bots is questionable. sure, it makes things more costly for the evil scrapers, but also for everyone else, and guess who has more computing resources to waste? surely not the people on old computers, which are now being rendered artificially obsolete by a misguided reaction to scrapers. surely not the people who are trying to automate things on the client side to make up for the asymmetry between servers, which are automated, and clients, which are denied the convenience of automation by stupid captchas that increase the asymmetry and thus the injustice.
playing the video is supposed to be useful to the user, whereas the proof of work is one of the issues that has given cryptocurrencies a poor rep, and for good reason. why are we even considering that, and, of all reasons, to implement DRM on web sites?!? (yes, it is DRM, an attempt to keep some users from doing things they otherwise could, through technological means)
there has to be a better way to tell welcome users from unwanted ones than miseducating users into blindly running programs that web sites push onto their computers, and fighting wasted computing power with wasted computing power by making the overhead permanent and pushing it onto everyone
@lxo It is sad, but we have had captchas forever. The skills to solve captchas are unevenly distributed. A PoW captcha only asks you to spend some electricity, which is a commodity more evenly distributed (among those who already have a browser and are using it for compute-intensive purposes like video). I have not measured how much additional electricity is consumed by visiting this sort of captcha, but I expect it is negligible compared to playing the video.
PoW captchas are essentially cryptocurrency miners for the server. when the server throws the result away, instead of using it to pay for the server or whatever, it's PoWaste
it wouldn't be quite as bad as if the server actually took the PoW/mined currency as payment, ideally also offering alternate means of anonymous proof of payment for access (GNU Taler, browserless mining, whatever) to bypass the on-browser PoW
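to make the waste concrete, here's a minimal sketch (python, with all names and parameters made up for illustration) of what a hashcash-style PoW challenge amounts to: the client grinds through nonces, the server verifies with a single hash - and then discards the result:

```python
import hashlib
from itertools import count

def solve(challenge: bytes, bits: int) -> int:
    # client side: burn CPU until sha256(challenge || nonce) has `bits`
    # leading zero bits. all of this effort is thrown away by the server.
    target = 1 << (256 - bits)
    for nonce in count():
        h = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(h, "big") < target:
            return nonce

def verify(challenge: bytes, nonce: int, bits: int) -> bool:
    # server side: a single hash, essentially free
    h = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    return int.from_bytes(h, "big") < (1 << (256 - bits))

nonce = solve(b"made-up-challenge", 16)
assert verify(b"made-up-challenge", nonce, 16)
```

at 16 difficulty bits the client computes around 65536 hashes on average while the server computes exactly one; raising the difficulty to slow bots down multiplies the client-side waste for everyone.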
@lxo "playing the video is supposed to be useful to the user, whereas the proof of work [...]"
I don't really see the qualitative difference here. When I play the video or audio, there are parts of the decoding that are redundant for me: for example, the video may be too high-resolution, or contain a padding intro/outro I'm not interested in, or audio frequencies I can't hear.
"there has to be a better way"
Maybe. I don't run an Invidious instance so I don't know. On wikis, QueryCaptcha works.
PoW is by definition an expensive-ish computation.
Sometimes the result is used for something. Then it's not wasted.
What we're seeing is a result that's not used for anything; the computation exists only to slow down access indiscriminately, to burden LLM bots so they don't burden the sites so much. It's entirely wasted computing. How can there be any doubt about that?
This kind of waste multiplies and accumulates, whether or not one keeps track of it.
@lxo It's not like currency, precisely because it can't be accumulated. Timeless accumulation is the problem with currency; without it, a number of problems vanish.
Anyway, we've still not established that any significant waste exists. That's not a philosophical question but something that a power meter can determine.
@lxo Captchas are always waste (compared to the user's task at hand). The question is whether this kind of captcha is more wasteful than the unfortunately more common ones. (The hypothetical benefits produced for datasets of proprietary captcha-makers are not verifiable and would only accrue to shareholders; they need not be counted.)
Yes, I loathe captchas, precisely because they come across to me as the server (owner) saying "I get to automate, you don't, loser". Plus, they run under control of a remote server, so they're nonfree. I'd rather get rid of them all.
But I'm also sympathetic to the needs of server operators who're getting overwhelmed by LLM bots. PoW as micropayment, offering something tangible to the server, besides granting access to the client, with alternate means for anonymous payment available, seems a lot more sensible to me than PoWaste: it solves two problems, rather than barely solving one while inconveniencing everyone.
that wouldn't be a bad approach, but it's not what's going on: lots of sites that didn't use captchas are adopting them now, because others are doing so. that's not harm reduction, that's epidemic contagion.
now, cryptocurrency miners aren't exactly a great model to follow, not only because they're so wasteful, but because they're time-sensitive: whatever goes into computing a hash with slowish computers at high latency is likely to be PoWaste as well. protein folding, signal detection, and other massive computations that can be broken into smallish verifiable pieces would be more reasonable. and we don't even need a source of funding to "micropay" for such computing: websites that adopt such PoW access controls could refer users (transparently or not) to distributed computing platforms that issue tokens that can be used to micropay for access
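for illustration, a rough sketch of such a token flow (python; every name is made up, and the shared HMAC key is a simplification - a real anonymous scheme would want blind signatures, taler-style, plus a spent-token list against double spending):

```python
import hashlib
import hmac
import secrets

# hypothetical key shared between the compute platform and the website;
# a real anonymous scheme would use blind signatures instead of HMAC
PLATFORM_KEY = b"made-up-platform-key"

def issue_token(units_of_work: int) -> str:
    # platform side: after verifying a completed work unit (protein
    # folding, signal detection, ...), mint a bearer token worth
    # `units_of_work` accesses
    token_id = secrets.token_hex(8)
    payload = f"{token_id}:{units_of_work}"
    tag = hmac.new(PLATFORM_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{tag}"

def redeem_token(token: str) -> int:
    # website side: check the tag before granting access; a real server
    # would also record token_id to prevent reuse
    token_id, units, tag = token.rsplit(":", 2)
    payload = f"{token_id}:{units}"
    expected = hmac.new(PLATFORM_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("forged token")
    return int(units)

t = issue_token(3)
assert redeem_token(t) == 3
```

the point of the sketch is only the shape of the exchange: the work produces something the platform wanted computed, and the site gets a cheap-to-verify proof of payment instead of re-imposing the work on every visitor.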