>“We know that China, Iran, Russia, Turkey, and North Korea are using bot networks to amplify narratives all over the world,”
Add India and Israel to the list. I saw a pretty sharp uptick in supportive content for each immediately after violence broke out.
There is also a staggering amount of content about the US military, though there it's a bit harder to tell whether it's organic. Content about, say, the F-22 could very well be: it's a cool aircraft and presumably gets views.
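To make "sharp uptick" concrete: one minimal way to flag it is to compare a topic's post volume after an event against its own trailing baseline. This is a toy sketch with invented counts, not anyone's actual detection pipeline, and a burst alone doesn't prove coordination; organic news spikes look the same.

```python
from statistics import mean, stdev

def uptick_zscore(daily_counts, window=14):
    """Compare the latest day's post volume against a trailing baseline.

    daily_counts: per-day post counts for one topic, oldest first.
    Returns a z-score; large positive values suggest an unusual burst.
    (A burst alone doesn't prove coordination -- it only tells you
    where to start looking.)
    """
    baseline = daily_counts[-(window + 1):-1]  # trailing window, excluding today
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return float("inf") if daily_counts[-1] > mu else 0.0
    return (daily_counts[-1] - mu) / sigma

# Hypothetical counts: a quiet baseline, then a spike after an event.
counts = [12, 9, 14, 11, 10, 13, 12, 9, 11, 10, 12, 14, 11, 10, 87]
print(uptick_zscore(counts))  # a large z-score flags the spike for review
```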
One of the underappreciated aspects of information, publishing, and communications systems is that they can be used against you not only by their owners and operators, but also by other opportunists.
Some years back a friend coined what I call "Woozle's Epistemic Paradox". Paraphrasing slightly:
"As epistemic systems become used more widely or by influential groups, there is substantial power to be had by influencing the discussions that take place."
The original formulation is more verbose:
> Our present epistemic systems are undergoing kind of the same shock that the online community underwent when transitioning from BBSs and Usenet to the commercial web to social media.
> We were used to a very high content-to-BS ratio because it took a certain amount of intelligence and intense domain-interest for people to be there in the first place -- and we've now transitioned to a situation where many people are there more or less accidentally and (the worst part), because of a high percentage of the population being present, there is now substantial power to be had by influencing the discussions that take place.
> Science is much the same. For a long time, it was this small thing operating off to the side; only elites could afford to indulge in it, and their discoveries affected very few -- so the truth value could remain high because there was relatively little to be gained by distortion. People's lives were largely governed by things that had been around long enough that the culture had evolved to deal with them more or less reasonably, so they didn't need advice from domain experts to provide accurate information -- and where expertise was needed, it flowed from parent to child and from master to apprentice as part of a cultural process that everyone understood.
<https://web.archive.org/web/20210904005401/https://old.reddi...>
(Originally posted to Google+ ~2017.)
Which means that online forum moderation has to be performed with this in mind. Spammers and trolls are the easy stuff to root out (and they're hard enough). It's the spinners, manipulators, and propagandists who are truly insidious.
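One weak but at least automatable signal for catching the manipulators is near-duplicate phrasing across supposedly unrelated accounts. A minimal sketch using word-shingle fingerprints; the comments below are invented, and real platforms obviously weigh far more signals than this:

```python
def shingles(text, k=3):
    """Set of k-word shingles: a crude fingerprint of a comment's phrasing."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Overlap between two shingle sets; 1.0 means identical phrasing."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical comments from different accounts; near-identical phrasing
# across unrelated users is one classic coordination signal.
c1 = "this policy is a disaster and everyone knows the media is hiding it"
c2 = "this policy is a total disaster and everyone knows the media is hiding it"
c3 = "i had a lovely walk in the park this morning"

print(jaccard(shingles(c1), shingles(c2)))  # high similarity, worth a look
print(jaccard(shingles(c1), shingles(c3)))  # low similarity, likely unrelated
```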
As a related question, one might ask what the merits of the "marketplace of ideas" are, and what the ultimate value and goals of free speech itself might be.
(Both have been recent topics of David Runciman's truly excellent Past Present Future podcast: the first in a series on bad ideas, the second on revolutionary ones. They're more closely wedded than Runciman initially lets on (though he does at least recognise the connection), and are likewise related to free market advocacy itself.)
The History of Bad Ideas: The Marketplace of Ideas: <https://www.ppfideas.com/episodes/the-history-of-bad-ideas:-...>
The History of Revolutionary Ideas: Free Speech: <https://www.ppfideas.com/episodes/the-history-of-revolutiona...>
(Both links are to audio, no transcript.)
For an excellent take on the Marketplace of Ideas trope and its relationship to John Stuart Mill, I strongly recommend Jill Gordon, "John Stuart Mill and the 'Marketplace of Ideas'", Social Theory and Practice 23(2), Summer 1997, pp. 235-249. <https://doi.org/10.5840/soctheorpract199723210>
Reddit is truly terrible with this now. Unusable.
I largely stopped using Reddit a few years ago. YouTube is horrible as well. Even crazier: YT will remove legitimate comments and censor legitimate videos, yet leaves up obvious scam comments, obvious bot comments, and inappropriate videos.
It’s interesting to me that individuals would rarely identify as susceptible to misinformation, and yet en masse it seems pretty clear that millions of us are vulnerable.