This article has a pretty blatant admission that "russian bots spreading misinfo" was always a lie. But it was an effective lie to normalize the escalation of online censorship.

In the second study published Thursday, a multi-university group reached the rather shocking conclusion that 2,107 registered U.S. voters accounted for spreading 80% of the "fake news" (which term they adopt) during the 2020 election. Yet these were no state-sponsored plants or bot farms. "Supersharers' massive volume did not seem automated but was rather generated through manual and persistent retweeting," the researchers wrote. (Co-author Nir Grinberg clarified to me that "we cannot be 100% sure that supersharers are not sock puppets, but from using state-of-the-art bot detection tools, analyzing temporal patterns and app use they do not seem automated." A rough sketch of what that kind of temporal-pattern check looks like is below.)
The way it argues that free speech shouldn't exist at all is staggering, but it gets even better.

They compared the supersharers to two other sets of users: a random sampling and the heaviest sharers of non-fake political news. They found that these fake newsmongers tend to fit a particular demographic: older, women, white and overwhelmingly Republican. Supersharers were only 60% female compared with the panel's even split, and significantly but not wildly more likely to be white compared with the already largely white group at large. But they skewed way older (58 on average versus 41 all-inclusive), and some 65% Republican, compared with about 28% in the Twitter population then. ... As Baribi-Bartov et al. darkly conclude, "These findings highlight a vulnerability of social media for democracy, where a small group of people distort the political reality for many."

So they're basically admitting that the full weight of state and corporate propaganda can't compete with middle-aged women exercising their free speech. The author is clearly not opposed to a small group of people having influence; he's just mad it's not his paymasters.
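For the curious, here's a minimal sketch of the kind of "temporal patterns" check Grinberg alludes to. The heuristics and cutoffs are my own illustrative assumptions, not the study's actual pipeline (real tools combine many more signals): a scheduled bot posts at near-constant intervals around the clock, while a human retweeting by hand is bursty and sleeps.

```python
from datetime import datetime, timedelta
from statistics import mean, stdev
import random

def interval_cv(timestamps):
    """Coefficient of variation of the gaps between consecutive posts.
    A cron-scheduled bot posts at near-constant intervals (CV near 0);
    human activity is bursty (CV around 1 or higher is typical)."""
    gaps = [(b - a).total_seconds() for a, b in zip(timestamps, timestamps[1:])]
    return stdev(gaps) / mean(gaps)

def active_hours(timestamps):
    """Number of distinct UTC hours-of-day with any activity.
    Humans usually show a sleep gap; round-the-clock coverage is a red flag."""
    return len({t.hour for t in timestamps})

def looks_automated(timestamps, cv_cutoff=0.1, hour_cutoff=22):
    # Both cutoffs are made up for illustration, not calibrated values.
    ts = sorted(timestamps)
    return interval_cv(ts) < cv_cutoff and active_hours(ts) >= hour_cutoff

start = datetime(2020, 10, 1)

# A bot posting every 15 minutes, around the clock.
bot = [start + timedelta(minutes=15 * i) for i in range(400)]

# A human-ish account: bursty exponential gaps, active only 09:00-22:00 UTC.
random.seed(1)
human = [start.replace(hour=12)]
while len(human) < 400:
    t = human[-1] + timedelta(minutes=random.expovariate(1 / 40))
    while not (9 <= t.hour <= 22):  # jump past the overnight gap
        t += timedelta(hours=1)
    human.append(t)

print(looks_automated(bot))    # True
print(looks_automated(human))  # False
```

Crude as this is, it's roughly why "manual and persistent retweeting" can still read as human: the volume is there, but the rhythm isn't machine-regular.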
@gabriel I have seen bot detection tools; it's all snake oil. These things are developed for government agencies by whoever bids the lowest, not the best company for the job.
@sun@shitposter.world I can't wait till they start reporting on the fediverse and assume that everyone with a bot tag is a bot and anyone without one can't be a bot.
@m0xee @gabriel I agree that Russia creates its own alternative news frame for current events, and people in America who are sick of our propaganda sometimes fall hard for the foreign propaganda. On the other hand, a big component of Russian propaganda is simply airing our dirty laundry, and you can't really call that misinformation.
@sun @gabriel Russian investigative journos have always claimed those were real people, not automated tools; in fact, here in Russia, no one uses terms such as "kremlin bots" for genuine robots. But they also claimed these people still weren't acting of their own volition: they might've been getting paid or something like that, though that information is of course unverifiable 🤷
@m0xee @gabriel I am of the opinion that Russia's airing of dirty laundry or pressing of pain points is a drop in the bucket compared to domestic agents doing the same thing with their own motivations, like getting ad clicks or building an academic career out of critique and deconstruction.
@sun @gabriel Yes, it's not misinformation per se, but airing said dirty laundry selectively might've achieved some interesting effects. I think Russia might have contributed greatly to getting your society this polarized, but at present the real scale of that contribution seems impossible to assess.
@sun @gabriel Yes, that could very well be the case. I'm not one to overstate the factor of foreign influence. Don't get me wrong, the malicious intent is there and it's tremendous, but the level of incompetence with which said influence is usually projected is also astounding 😂