FeralRobots (feralrobots@mastodon.social) · 22-May-2024 22:05:56 JST

Implicit claim: training off captured data is done locally.
Likely bullshit, in several ways.
To my knowledge, only Apple has baked in the hardware required to support this (their secret, previously unused chips).
Without GPU support, it would use lots of CPU time; either way, locally-training machines would run hot & have shortened lifecycles. (Faster turnover! Drives subscription-based licensing!) ...
#Microsoft #Recall

FeralRobots (feralrobots@mastodon.social) · 22-May-2024 22:05:54 JST

But does that really matter? Yes: because what we know they CAN do (take screenshots every 5 seconds, encrypt them badly, do some kind of localized tokenization that sends exploitable data to the cloud using systems with known weaknesses, etc.) is pretty bad.
#Microsoft #Recall

FeralRobots (feralrobots@mastodon.social) · 22-May-2024 22:05:54 JST

Final point, because I look back & realize I didn't explicitly make it: #Microsoft #Copilot #Recall is in no small part a way to mine the personal files & business work product of users for #GenAI training data. Any claims to the contrary don't pass the smell test. In the process they will be incentivizing a more rapid hardware-update cadence, which means either new licensing revenue or more users shifting to a subscription model.
All with added energy & resource use from running harder.

FeralRobots (feralrobots@mastodon.social) · 22-May-2024 22:05:55 JST

... & would kill battery life on the as-yet-vaporware #CoPilot Notebooks.
So I don't think it'll really (at least not fully) "train locally". The scheme must somehow involve training in the cloud. Which means some kind of data has to pass to the cloud. Where (again) it can be exploited.
#Microsoft #Recall

FeralRobots (feralrobots@mastodon.social) · 22-May-2024 22:05:55 JST

Implicit: local data does not contribute to the cloud model's training.
Again, what's the utility of the product if it's not available through the cloud? Microsoft gave up on mobile. Fantasy-assistant #Samantha isn't very useful if she can't be with you everywhere.
The bullshit element here may be that Recall is vaporware primarily intended to drive share price rather than an actual, useful product. But M$ usually still wants revenue.
#Microsoft #Recall

FeralRobots (feralrobots@mastodon.social) · 22-May-2024 22:05:55 JST

I think the whole 'everything is local' spin will drive a lot of breathless speculation about MS using open-source models, or only-local models, etc., & I think that's just MS enabling people in bullshitting themselves. The truth is we know nothing about this system because it doesn't exist yet; & the velocity of dev suggests that when it does exist, it'll be a cobbled-together mess with massive vulnerabilities, passing huge amounts of tokenized data to a cloud-based LLM.
#Microsoft #Recall

FeralRobots (feralrobots@mastodon.social) · 22-May-2024 22:05:55 JST

Then we get to the whole MS-OpenAI relationship. It's been widely pointed out that given observable financial realities, MS could end up stealth-owning OpenAI via compute credits (which it will probably report as revenue); but is anything in the GPT family portable? How hyper-local can you get?
I think the bottom line ends up being that #Microsoft #Recall is largely #bullshit, in the sense of not really being a product that will ever exist in anything like its described form.
...