While this may be the best course of action the company can take to effectively demonstrate the UK government's resolute willingness to build a backdoor into some of the world's largest user bases, it also sets a major precedent. Apple has now put in place a process that can be reapplied in the future whenever a government requires it by law.
Step 1: Have your algorithm train every creator on your platform to use clickbait thumbnails and titles, forcing users to watch the videos to find out what they're actually about.
Step 2: Add AI-generated summaries to let users know what a video is actually about before they have to click on it.
Step 3: Gaslight users into thinking you're helping them, even though you're barely solving a problem that you invented.
If you ever send me a screenshot or a photo where your computer screen is visible, there is a non-zero chance that I will spend the next 10 minutes zooming in and overanalyzing everything visible on your desktop.
Apple's line of rhetoric here, which stokes fears about the supposed safety of children and blankets the entirety of porn and sexual content with the term "hardcore porn," is exactly the kind of language you'd expect to hear from the mouths of conservative politicians and far-right extremists when they try to justify the eradication and persecution of marginalized groups online.
This encompasses a wide range of marginalized communities, from LGBT people to sex workers themselves, who seldom live in anything other than deeply precarious conditions throughout their lives.
@kirb "Crying about it" is a pretty condescending way of framing people's understandable outrage at the fact that Bluesky is repeatedly presented as a decentralized alternative to the Fediverse when in reality it is not, or at the very least, is not about to be.
🏳️‍⚧️ Activist @toutesdesfemmes
Contributor at @macstories, where I co-host @comfortzone every week.
Turning Mastodon tangerine since 2023 @TangerineUI
@zelda