On Wednesday, The New York Times streamed a subscriber-only event, “How to Save Ourselves from Disinformation”. It was the latest in a steady stream of misinformation reporting and columns invariably presented as righteous attempts to help Americans understand the truth about what’s happening in their country. The stated goal of this event in particular was to explore the roots of false information online, ways to stop some of it, and ways to root out its causes. In reality, these types of events and reports are mainly about convincing Americans that we face a false information threat that is unique and existential and that there’s not much we can do about it. They lay the groundwork for a larger initiative by the liberal media establishment and its allies in the U.S. government to censor or flat-out shut down political opponents. This one was no exception.
A quick aside: to avoid confusion, I want to define a few terms that will be used frequently in this piece. “Misinformation” refers to information that is false or inaccurate. “Disinformation” is a form of misinformation created intentionally to deceive. “Censorship” is any attempt to substantially obstruct speech.
The event was organized into three separate segments: a discussion among Times technology columnist Kevin Roose, former far-right videographer Caolan Robertson, and former far-right “YouTube radical” Caleb Cain; a conversation between Times technology reporter Sheera Frenkel and comedian Sarah Silverman; and a discussion of survey results featuring Frenkel, fellow Times technology reporter Davey Alba, and MIT management science and cognitive sciences professor David Rand.
Frenkel kicked off the event:
Four or five years ago, we would be talking about foreign actors and the ways in which they sowed disinformation into our online ecosystems. Today, the problem has become more peer-to-peer homegrown sources.
It’s not hard to understand who Frenkel was talking about here. Just about the entirety of the 45-minute-long event discussed disinformation and misinformation from and by right-wing folks. There were a few lazy, virtue-signaling mentions of false information being a bipartisan issue, but the participants didn’t leave any room for confusion about whom we should consider the problem.
If the Times were serious about tackling the over-exaggerated “serious problems” of false information, we would be having an earnest discussion about the most significant sources of disinformation. You want to talk about “homegrown sources”? Let’s talk about the major ones: the American intelligence community and mainstream outlets including the Times.
Remember the Iraq War? That little blunder was made possible by disinformation originating from operatives in the US government and then peddled by just about every reporter at every major US media outlet. For a more recent example, how about the SALT Caucus? That example involves a bipartisan group of 32 homegrown members of Congress united in pushing a fantasy that tax breaks for the wealthy somehow benefit working-class homeowners. How about the short story concocted by intelligence officials claiming Russia put bounties on U.S. troops in Afghanistan? How about Donald Trump being a Russian asset? All of these originated from homegrown sources. When disinformation originates from Times-approved sources like the intelligence community, everything is fine and dandy. But if it comes from anywhere else, it is a full-scale catastrophe. So reporters from the Times should be honest about who they are really referring to when they use the term “homegrown sources”: political opponents.
In Kevin Roose’s discussion with former far-right radicals Robertson and Cain, he asked if steps he has personally urged YouTube to take are making a difference:
YouTube has done a lot in the couple of years that we started talking to try to address the popularity and spread of extremist ideas on its platform. It’s changed its recommendation algorithm. Some of the people you, Caolan, have made content for have been de-platformed off YouTube. Do you think that this stuff that YouTube’s been doing has actually made a difference, has actually put a dent in the movements that drew you in to the far right?
Both men said those steps have made something of a difference, but not much. Cain said that deplatforming people has helped only slightly, and that no matter who is deplatformed or censored, it won’t get rid of the problem of radicalization and disinformation. Which is, of course, correct. Disinformation has existed forever because there will always be bad actors in human life. Robertson believes we shouldn’t be deplatforming people at all; instead, the “quick fix” is for YouTube’s algorithm to stop recommending extremists or extremist-adjacent figures. Roose didn’t explain how he thinks the problem should be handled.
Thankfully, he wrote a piece back in February which nicely outlines proposed solutions for what he refers to as a “reality crisis”. One of the solutions Roose floated was appointing what he calls a “reality czar”. The czar would oversee a multi-agency task force that coordinates with tech companies to tackle problems of extremism and misinformation. To be clear, Roose is advocating that two of the most powerful entities on the planet collaborate to decide what constitutes false information. What could possibly go wrong?
These are the types of solutions that events like this one aim to facilitate, because they serve the interests of establishment power. The process is relatively straightforward: paint the picture that the problem is very serious and complex, that steps taken thus far haven’t solved it, and that more drastic measures must be taken. Then, with that justification, take those drastic measures — Congress hauling tech CEOs before it to answer for misinformation, and tech companies shutting down social media apps, to name a couple.
The next part of the event featured Sheera Frenkel in conversation with Sarah Silverman. In this segment, Silverman likened the Times to a beacon of truth in a grimy, lie-filled world, and commented that finding the truth on the Internet is hard because it “seems like everything is the New York Times”. Her obvious implication was that viewers should accept as truth everything the Times produces. But how do viewers determine which other sources they can trust?
A good way to decide who to listen to is people who are obligated to report the truth, who will suffer consequences if they report something that isn’t true, who are beholden to have sources that are true, might be a more trustable news source than maybe a news channel that must label themselves entertainment. Stuff like Fox News is keeping people we love company all day long. It’s their babysitter.
What happens if, as is the case here in the real world, the exact opposite is true? What if reporters at the New York Times and mainstream liberal media outlets face no consequences when they report something that isn’t true, as in the case of Taylor Lorenz? What happens if they aren’t beholden to have sources that are true, like every national security reporter at the Times? What happens when reporters are regularly rewarded for reporting lies in service of the correct narrative — for instance, that misinformation is an oh-so-serious, existential threat? What does the public do then? Should it still unquestioningly believe everything the Times puts out? I’m no fan of Fox News, and Silverman is correct in saying that it is entertainment. But let’s remember that with each passing day, the New York Times is transforming into the exact same thing.
Silverman’s answer to Frenkel’s question about what simultaneously devastates her and gives her hope is illuminating:
The exchange of ideas. The hope that somehow the truth can get through. But I do know that history is written by the winners. And that can be scary. And that’s a really kind-of bone chilling thing to realize. But then Mr. Rogers says, “Look for the helpers”, and there’s a lot of helpers.
This explains a lot about what is behind the liberal establishment’s fixation on disinformation and the spread of misinformation. What is actually devastating to Silverman, the folks at The New York Times, and the Democratic establishment as a whole isn’t the exchange of ideas. It is that the wrong people are exchanging the wrong ideas. If one believes that ideas are wrong or bad, one should defeat them with better ideas. The Democratic establishment, as seen over the past several decades with their failed agenda, doesn’t have any better ideas. So now they must censor.
The event ended with a discussion among Alba, Frenkel, and MIT professor Rand regarding the results of a survey designed to determine whether Americans’ propensity to share content online is correlated with whether they believe that content is true. A representative sample of the American public and a group of Times subscribers were given the same survey. Six news headlines, three true and three false, were shared with participants, who were asked to determine which articles were true and how likely they were to share each one.
Rand explained that the general public did fairly well at determining which stories were true or false, but that a large number still shared the false stories. This, according to him, reflects the wider misinformation problem: not only do people struggle to know whether something is true or false, but even when they know the truth, they will still share inaccurate information. In fact, Rand explained that one-sixth of the survey participants from the American public shared information they knew was false.
How did the New York Times subscribers do?
You guys (Times subscribers) were doing even better than the American public, which is maybe not so surprising…whereas the rep sample, even though they thought the false headlines were false, they were still willing to share them. For you guys, it looks quite different, and there is basically no sharing of inaccurate content.
Why did the Times subscribers do better? In sum, Rand said that it could be that Times subscribers are simply better, smarter, and more discerning consumers of news. The problem lies with the one-sixth of survey takers who intentionally shared inaccurate information, not with the more discerning Times readers. What a nice, neat conclusion for the Times. Sarah Silverman tells us that we shouldn’t worry about the Times when it comes to disinformation. And Rand tells us that we don’t have to worry about New York Times subscribers when it comes to the sharing of misinformation. No matter that the Times, as detailed earlier, is one of the biggest purveyors of disinformation.
At the end of the event, Frenkel left viewers with a few recommendations. She told them to think about accuracy before sharing, not to retweet or like something just because it shows up on their feed (unless it’s a New York Times article, one can only assume), and to explore the sources of information before sharing it. That’s all reasonable advice. But how does the public solve the problem? In this event, we were given no real solutions. There are two reasons why. First, it’s an unsolvable problem: there will always be people intentionally creating and spreading false information. Second, the Times doesn’t actually care about solving it.
Reporters from the New York Times can’t stop disinformation, and they know it. That won’t stop them from continuing to churn out reporting and opinion pieces claiming that we’re living in a unique age of false information from which the American public must save itself. They’ll keep pretending that they are the righteous ones ensuring that the truth prevails. They’ll continue to disseminate their own disinformation fed to them by the intelligence community and turn a blind eye to other disinformation, so long as it comes from allies. And all the while, they and their friends in the mainstream liberal media, the halls of Congress, and the White House will continue to agitate for the censorship and shutdown of their political opponents.
Image “Misinformation” by 3dpete used under a Creative Commons Attribution 2.0 Generic (CC BY 2.0) license