Substack says it won't ban Nazis or extremist speech
Under pressure from critics who say Substack is profiting from newsletters that promote hate speech and racism, the company's founders said Thursday they will not ban Nazi symbols and extremist rhetoric from the platform.
Hamish McKenzie, a co-founder of Substack, said in a statement: “I just want to make it clear that we don't like Nazis either – we wish no one held those views. But some people do hold those and other extreme views. Given that, we don't think that censorship (including through demonetizing publications) makes the problem go away – in fact, it makes it worse.”
The response came weeks after The Atlantic found that at least 16 Substack newsletters featured “overt Nazi symbols” in their logos or graphics, and that the platform allowed white supremacists to publish on, and profit from, the platform. Hundreds of newsletter writers signed a letter protesting Substack's stance and threatening to leave. Nearly 100 others signed a letter supporting the company's position.
In the statement, Mr. McKenzie said he and the company's other founders, Chris Best and Jairaj Sethi, had come to the conclusion that censoring or demonetizing publications would not solve the problem of hateful rhetoric.
He said, “We believe that supporting individual rights and civil liberties while subjecting ideas to open discourse is the best way to strip bad ideas of their power.”
The stance sparked waves of outrage and criticism, including from popular Substack writers who said they did not feel comfortable working with a platform that allows hateful rhetoric to flourish.
The debate has renewed questions that have long plagued technology companies and social media platforms about how content should be moderated.
Substack, which takes a 10 percent cut of revenue from writers who charge for newsletter subscriptions, has faced similar criticism in the past, notably when it allowed transphobic and anti-vaccine content from some writers.
Nikki Usher, a professor of communications at the University of San Diego, said many platforms face what is known as the “Nazi problem,” which holds that if an online forum is around long enough, extremists will show up there at some point.
Substack is positioning itself as a neutral provider of content, Professor Usher said, but it is also sending a message: “We're not going to try to police this problem because it's complicated, so it's easier not to.”
More than 200 writers who publish newsletters on Substack signed a letter protesting the company's passive stance.
“Why do you allow sites that traffic in white nationalism to be promoted and monetized?” the letter asked.
The writers also asked whether the company's vision for success included giving a platform to hateful people like the prominent white nationalist Richard Spencer.
“Let us know,” the letter said. “From there we can each decide if this is still where we want to be.”
Some popular writers on the platform have already promised to leave. Rudy Foster, who has more than 40,000 subscribers, wrote on Dec. 14 that readers often tell her they “can't stand Substack anymore” and that she feels the same way.
“So here’s to 2024 where none of us will do that!” she wrote.
Other writers have defended the company. A letter signed by nearly 100 Substack writers argued that it is better to let writers and readers moderate content, not social media companies.
Elle Griffin, who has more than 13,000 subscribers on Substack, wrote in the letter that while “there is a lot of hateful content on the internet,” Substack “has come up with the best solution yet: giving writers and readers the freedom to speak without exposing that speech to the wider public.”
She argued that subscribers receive only the newsletters they sign up for, so they are unlikely to encounter hateful content unless they seek it out. That is not the case on X and Facebook, Ms. Griffin said.
She and others who signed the letter supporting the company emphasized that Substack is not really one platform, but thousands of individual forums with their own distinct, curated cultures.
Alexander Helen, who writes science fiction and fantasy stories, signed Ms. Griffin's letter. In a post on Substack, he said a better approach to content moderation is to “take things into your own hands.”
“Be an adult,” he wrote. “Block people.”
In his statement, Substack co-founder Mr. McKenzie also defended his decision to host Richard Hanania, president of the Center for the Study of Partisanship and Ideology, on the Substack podcast “The Active Voice.” The Atlantic reported that Mr. Hanania had previously described black people on social media as “animals” who should be subject to “more policing, incarceration and surveillance.”
Mr. McKenzie wrote that “Hanania is an influential voice for some in American politics,” adding that “there is value in knowing his arguments.” He said he had not been aware of Mr. Hanania's writings at the time.
Mr. McKenzie also argued in his statement that censoring ideas considered hateful only helps them spread.
But that is not borne out by research, said Kurt Braddock, a communications professor at American University who studies violent extremist groups. “Deplatforming has a positive impact on reducing the spread of far-right propaganda and Nazi material,” he said.
Professor Braddock said that when extremists are removed from one platform, they often move to another, but most of their audience does not follow them, and their income ultimately shrinks.
“I can appreciate someone's dedication to free speech rights, but free speech rights are dictated by the government,” he said, adding that businesses can choose what kinds of content they want to host or ban.
While Substack says it does not allow calls for violence, even that line can be blurry, Professor Braddock said, because racists and extremists can walk right up to it without openly crossing it. Their rhetoric can still inspire others to violence, he said.
Allowing Nazi rhetoric on a platform also normalizes it, he said.
“The more they use this kind of rhetoric that dehumanizes or demonizes a certain population,” Professor Braddock said, “the more it becomes OK for the general population to follow it.”