This is how I learned someone had tried to shoot Trump. It's also a great example of why #Mastodon desperately needs the ability to turn off replies on posts: if I were someone with much of a following, this is the kind of post that would invite discussion of the wisdom of shooting a president (FTR, I think it's stupid), and that kind of talk is one of the many free speech nullifiers under US law. https://mastodon.social/@dangillmor/112783642933338644
Oh good grief, could tech "journalism" please outgrow its juvenile obsession with tiny signifiers? "Apple used orange first, you can't use it now!" It's the tech equivalent of claiming that every story deploying the same very general plot arc is a copy.
Next up: If anyone but Mercedes-Benz makes a 4-wheeled automobile, they're copycats, because Mercedes did 4-wheeled automobiles before anyone else.
@goatsarah have they always been that way, & just had better PR in the past?
(My token experience is that at least British rail service has gone downhill in recent years. We were in England/Scotland for our honeymoon c2007, then again for a conference in 2019, & took a train north from London both times (Edinburgh in '07, York in '19, so same line). In just those 12 years, the experience degraded really strikingly.)
@goatsarah this makes me sad, as one of the things I've had on my retirement list is riding some of the extensive European rail networks I've been reading about my whole life.
@ChrisWere @atomicpoet there's this brother duo that does strength training vids; they're also competitive bodybuilders, & they did a vid following them both through a "cut" cycle getting ready for a competition, subsisting on broccoli & chicken for two weeks. They looked like shit, exhausted, washed out, & totally owned it. 'This sucks' was a continual refrain.
@nyrath [facepalming intensifies /] guess we're either gonna bring 2 astronauts home in a leaky capsule or find out how much slack there is in ISS' resource budget.
Pisses me off sometimes that the good things Machine Learning CAN do (like radically improving hearing aid tech) are getting swamped by the bullshit flowing from LLMs. (Srsly, this could improve the lives of a lot of people. THIS is the kind of stuff we should talk about, not robotic clones attending meetings.) https://front-end.social/@stephaniewalter/112568501168054468
Implicit claim: training off captured data is done locally. Likely bullshit, in several ways. To my knowledge only Apple has baked in the hardware required to support this (their secret, previously unused chips). Without GPU support it would use lots of CPU time; either way, locally-training machines would run hot & have shortened lifecycles. (Faster turnover! Drives subscription-based licensing!) ... #Microsoft #Recall
Then we get to the whole MS-OpenAI relationship. It's been widely pointed out that given observable financial realities, MS could end up stealth-owning OpenAI via compute credits (which it will probably report as revenue); but is anything in the GPT family portable? How hyper-local can you get? I think the bottom line ends up being that #Microsoft #Recall is largely #bullshit, in the sense of not really being a product that will ever exist in anything like its described form. ...
I think the whole 'everything is local' spin will drive a lot of breathless speculation about MS using open-source models, or local-only models, etc., & I think that's just MS enabling people in bullshitting themselves. Truth is we know nothing about this system because it doesn't exist yet; & the velocity of dev suggests that when it does exist, it'll be a cobbled-together mess with massive vulnerabilities that's passing huge amounts of tokenized data to a cloud-based LLM. #Microsoft #Recall
Implicit: local data does not contribute to the cloud model's training. Again, what's the utility of the product if it's not available through the cloud? Microsoft gave up on mobile. Fantasy-assistant #Samantha isn't very useful if she can't be with you everywhere. The bullshit element here may be that Recall as a useful product is vaporware that's primarily intended to drive share price, not an actual product. But M$ usually still wants revenue. #Microsoft #Recall
... & would kill battery life on the as-yet-vaporware #CoPilot Notebooks. So I don't think it'll really (at least not fully) "train locally". The scheme must somehow involve training in the cloud. Which means some kind of data has to pass to the cloud. Where (again) it can be exploited. #Microsoft #Recall
Final point, because I look back & realize I didn't explicitly make it: #Microsoft #Copilot #Recall is basically, in no small part, a way to mine the personal files & business work product of users for #GenAI training data. Any claims to the contrary don't pass the smell test. In the process they will be incentivizing more-rapid hardware update cadences, which means either new licensing revenue or more users shifting to a subscription model. All with added energy & resource use from running harder.
But does that really matter? Yes: because what we know they CAN do (take screenshots every 5 seconds, encrypt them badly, do some kind of localized tokenization that sends exploitable data to the cloud using systems with known weaknesses, etc.) is pretty bad. #Microsoft #Recall
@weirdwriter Their assumption that algorithmic serendipity is somehow inherently un-tainted by algorithmic bias would be charming if it weren't so insidiously dangerous.
@powersoffour @futurebird @cyberlyra Thinking this could violate data privacy laws in the EU & in a number of US states. So, not surprised (& in fact conditionally pleased) that the capability is no longer there.
@inthehands I'm having some trouble with this. My experience, along with counsel I've been given over the years, is that one MUST attend to the clearly stated job requirements. Unless someone's flagged your application to ensure it passes, the requirements must be addressed if one's to make it through the initial screen. & in an annoying number of cases, the requirements literally CAN'T be ignored, because the screen is automated.