they're shutting Skype down.
so long and thanks for all the ~ BLOOP-BLUOP doot-dit BLUOP-BLOOP dit-doot ~
world's worst cocktail right there
you can probably hire better devs by going "pspspspsps" than you can with most of the job ads I see
@ryanc if I saw that in a code review I'd be worried it's a backdoor lol
there are lies, damn lies, and the S in SNMP.
game where you have to try to avoid all the AI shit companies keep throwing at you, call it Escape From Markov
@ryanc nice :D
> why did Printables not check with the copyright holder before doing the takedown? isn't that a fairly fundamental part of copyright enforcement?
stupid laws and the resulting risk avoidance.
if someone makes a copyright claim to a platform and the platform does not action it in line with the schedule prescribed by the DMCA, the platform operator is open to being sued by the claimant. now, in *theory* the process is "claim comes in, is vetted by a legal professional, and is actioned". but...
@millihertz ... the legal vetting is prohibitively expensive and onerous for any public platform that accepts user submitted content.
so, as a risk-avoidance measure, they treat all claims as valid and process them automatically as long as they contain the minimum necessary information defined by law, *regardless* of whether the claim is obviously fraudulent or perjurious, because rejecting even a single valid claim opens them up to being sued. so they accept everything.
@millihertz in the case of a fraudulent claim, the onus is then put on the copyright holder to file a counter claim, which then shifts the burden of legal risk away from the platform and onto the counter claimant, since the counter claim is filed under penalty of perjury.
unfortunately, this leads to an imbalance of consequences that is prone to abuse by harassers and other miscreants. I wrote about it here previously: https://codeinsecurity.wordpress.com/2023/08/16/the-dmca-enables-harm/
@ryanc there's an implementation here:
https://www.hackerfactor.com/src/jpegquality.c
imagemagick also has one in its `identify` tool but their code is utterly inscrutable so I've got no clue where it's implemented or how.
@ryanc ah, looks like it's all Node? that's unfortunate. I don't really want to add that to my toolchain for this blog. I'm trying to keep it ADHD friendly so I can just write and go, no maintenance or services. the entirety of the stack for this right now is one C# console app calling out to pandoc and then SCP'ing the files to the server. I write the blog posts in Markdown (using whatever editor I feel like), run the tool, and the post goes up. super low friction.
@ryanc part of my requirement here is to keep disk usage small since I'm running a fairly small VPS and would like my blog to stay there and not cost me a lot of money.
re: that first service, nope, do they offer it in a standalone form that I can invoke on Windows?
TIL you can closely estimate the quality factor a JPEG was encoded with by looking at its quantisation tables and doing some fairly simple maths.
as part of my blog generator I'm optimising images before publishing, so if I look at a JPEG and can see that it was already compressed below Q=85, I probably won't see enough of a size saving by re-encoding it down to Q=80 to justify the extra perceptual losses of a repeat encoding.
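the estimation trick above can be sketched by inverting the IJG (libjpeg) table-scaling formula. this is a sketch, not the tool's actual code: it assumes the encoder derived its tables from the JPEG Annex K base luminance table the libjpeg way, which holds for libjpeg-style encoders but not for every encoder out there.

```python
# JPEG Annex K base luminance quantisation table, row-major order.
BASE_LUMA = [
    16, 11, 10, 16, 24, 40, 51, 61,
    12, 12, 14, 19, 26, 58, 60, 55,
    14, 13, 16, 24, 40, 57, 69, 56,
    14, 17, 22, 29, 51, 87, 80, 62,
    18, 22, 37, 56, 68, 109, 103, 77,
    24, 35, 55, 64, 81, 104, 113, 92,
    49, 64, 78, 87, 103, 121, 120, 101,
    72, 92, 95, 98, 112, 100, 103, 99,
]

def scale_table(base, quality):
    """Forward direction: how libjpeg derives a table from a quality factor."""
    scale = 5000 // quality if quality < 50 else 200 - quality * 2
    return [max(1, min(255, (b * scale + 50) // 100)) for b in base]

def estimate_quality(table, base=BASE_LUMA):
    """Inverse direction: recover the approximate quality factor from a
    quantisation table read out of a JPEG's DQT segment."""
    # Average the per-coefficient scale implied by each table entry.
    scale = sum(100.0 * t / b for t, b in zip(table, base)) / len(base)
    if scale <= 100:
        return round((200 - scale) / 2)
    return round(5000 / scale)
```

so for the check described above: parse the DQT segment out of the file, run the luminance table through `estimate_quality`, and skip re-encoding if the result is already below 85.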
for PNG images I attempt an indexed colour encoding, subtractively combine it with the original image data, threshold the luma and chroma deltas to get a per-pixel "different/same" value, sum those, and if less than 1% of the image is affected then I keep it. if not I fall back to max compression RGB888. (I don't use transparency in blog images)
on average this is saving me about 50-60% over whatever random PNGs and JPEGs I'm passing in. most of the savings are coming from PNG optimisation on screenshots.
anyone got a copy of that "I am the keyboard I have an important message... E" processor interrupt meme with the pissed off looking bird laying around? I just remembered it and I can't find a copy.
UPDATE: Success, thanks to @SnoopJ
and this is a 100% known problem. and it's solved using active current balancing on the GPU side. you put shunt resistors in series with the lines, measure the current, and actively shift the current draw in realtime to keep everything balanced. (you can do this with an ideal diode-OR controller, or separating the high-side feeds to separate sets of VRM phases)
nVidia *has* done this on some prior cards. but they reduced the shunt count here, running stuff in parallel, and this is the result.
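the measurement side of the per-wire balancing described above is just Ohm's law: each 12V wire gets a series shunt, and the current is recovered from the voltage drop across it (I = V / R). a sketch with assumed values; the 5 mΩ shunt and 9.5 A per-pin limit are illustrative, not taken from any specific card.

```python
SHUNT_OHMS = 0.005  # assumed shunt resistance per wire
PIN_LIMIT_A = 9.5   # illustrative per-pin continuous current limit

def wire_currents(shunt_drops_v, r_shunt=SHUNT_OHMS):
    """Convert measured shunt voltage drops (volts) to per-wire currents (amps)."""
    return [v / r_shunt for v in shunt_drops_v]

def imbalance(currents):
    """Ratio of the hottest wire to the mean; 1.0 means perfectly balanced.
    A balancing controller would steer current away from wires pushing
    this ratio up, e.g. by shifting load between VRM phase groups."""
    mean = sum(currents) / len(currents)
    return max(currents) / mean

# Example: six 12V wires, one carrying far more than its share.
drops = [0.020, 0.021, 0.019, 0.020, 0.105, 0.015]  # volts across each shunt
amps = wire_currents(drops)
overloaded = any(a > PIN_LIMIT_A for a in amps)  # the 21 A wire trips this
```

without per-wire shunts (i.e. with the lines paralleled behind one measurement point) the total looks fine while one wire quietly cooks, which is exactly the failure mode here.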
source of the photo and lots more details here:
Graham Sutherland / Polynomial