> On one hand, Parler was a hate site, filled with conspiracy theories, radicalization and racism. So it's no loss that it's gone, and it took far too long to deal with it. And just like I wouldn't bat an eye for AWS taking down an ISIS recruitment site, I don't really see any loss here.
(I'm going to leave aside the unfounded assertion that Parler was a "hate site." If it is, then every popular website is a "hate site," from Facebook and Google all the way down. Scores of prominent people who aren't hateful but for bizarre Left-wing constructions of the word "hate" use Parler to communicate with millions of people who are themselves not hateful conspiracy theorists.)
I always feel there's an unjustified logical leap behind authoritarian sentiments like this. The argument seems to be, "X hosts hateful content that it refuses to remove. If we remove X, then we will have reduced hate."
This doesn't make much sense. Has anyone's mind actually changed because of Parler? It makes even less sense for conspiracy theories, where censorship makes conspiracy theorists feel like they're on the right track.
I remember looking up some moon landing hoaxer content on YouTube probably five to eight years ago. There was a lot of it on YouTube, but YouTube also recommended some debunking videos (a few of which had been made specifically in response to the conspiracy theory videos themselves). The debunking videos were frankly just more persuasive. (The only issue with my little experiment was that the "Algorithm" kept recommending conspiracy nonsense to me for a few weeks afterward.) There were no passive-aggressive, condescending propaganda boxes, no appeals to the authority of the media or legal system, no "fact checks." Just arguments for and against.
This is not to say that people can't do bad things with speech. We have stories like a lynch mob forming in India over a viral series of videos shared via WhatsApp.[1] There are other stories like this, and all are appalling. Technology has removed frictions that once kept these things from happening.
But let's not forget that the world in which speech is restricted is much, much scarier. The Rwandan genocide of 1994 was perpetrated by the most powerful members of Rwandan society, who used their monstrous power to slaughter Tutsis as well as moderate Hutus who spoke against the killings.
In the South under Jim Crow, speech was also violently suppressed with the aid of the states, which turned a blind eye to terrorist groups like the Klan lynching black Southerners, or even white "race traitors."
"Censorship, but only for the bad stuff" seems to be an unworkable system. People get riled up, and the consequences can be horrific, but they seem worse in a regime with heavy censorship that doesn't allow a safety valve for the bad ideas.
[1] https://www.buzzfeednews.com/article/pranavdixit/whatsapp-de...