When Disinformation Becomes Information
As a New York Times subscriber, I have access to certain staff writers' newsletters. Tressie McMillan Cottom (whose book Thick was one of my top 10 books last year) has a weekly newsletter, as does the opinion columnist Jamelle Bouie. I've also signed up for Jay Caspian Kang's twice-a-week newsletter. Kang is an interesting dude, and I'm still trying to figure out what I think about him. He's an Asian-American writer whose first nonfiction book, The Loneliest Americans, was released late last year. I was excited to read it, but then I saw a few rather scathing reviews that Cathy Park Hong retweeted, and that gave me pause. Not only about Kang's book, but also my own. It was paralyzing to read such negative reviews from people within our community about a book from someone else in our community.
The main argument against The Loneliest Americans had to do with Kang's very narrow definition of what Asian-American means or includes. I haven't read the book, and it's been several months since I read the critical reviews, but the gist seemed to be that Kang oversimplifies the meaning of Asian-American by focusing almost exclusively on high-income East Asians. There was also some mention of how tiring it is to read about a self-loathing Asian-American, which, by the sound of it, Kang was/is.
Anyway, I bring all of this up to illustrate that Jay Caspian Kang is not a universally revered AAPI author. That said, I rather enjoy his twice-weekly newsletters. The writing is crisp, clear, direct. Sometimes he can sound a bit too bombastic for my taste, but that might stand out only because it's a tone of voice we don't often hear from Asian-American writers. My main gripe with his newsletter is that the San Francisco-based Kang talks a little too much about California. (As a lifelong East Coaster, I take this personally.)
Kang’s newsletter this past Monday focused on disinformation. He began by referencing a Harper’s article from last year by Joe Bernstein about what Bernstein calls “Big Disinfo,” the group of thinkers and organizations which have sprung up in the past five or so years to examine and (supposedly) fix our society’s problem with disinformation. Kang — along with Bernstein — is skeptical of Big Disinfo. He believes that “Big Disinfo has oversold the problem a bit, not so much in how harmful specific disinformation campaigns can be but in how much everything we see can change our opinions. Social media has certainly changed nearly every facet of our lives, but it’s difficult to see any streamlined narrative in the daily chaos of the information that’s presented to us every time we pick up our phones.”
I wanted to take a moment to talk about my own experience with social media, because contrary to what Kang believes, I think what we see and read every day does change our opinions. Or maybe not our opinions, but our dispositions. I don’t think disinformation can take someone who believes all other vaccines are safe and effective and turn them into an anti-COVID vaxxer. However, I do think disinformation helps to solidify an anti-vaxxer’s belief that vaccines are unsafe, an affront to personal liberty, etc. This, by extension, leads to the polarization that makes any kind of agreement or compromise nearly impossible.
Back in high school one of my favorite classes was US Government. Early in the school year our teacher had us create a political spectrum along the back wall of the classroom. The most liberal person in the class would stand all the way to the left, the most conservative person in the class would stand all the way to the right, and the rest of us had to figure out as best we could where we landed in between. When all was said and done, I wasn’t the most liberal person in class, but I was the second or third most. This shows that I’ve always had strong progressive leanings; disinformation/social media hasn’t made me any more radical than I already was.
But social media — particularly Twitter — is far from innocent here. While it hasn't made me skew further left, it has changed my opinion of those on the right. I have considerably less tolerance for those across the political aisle. Twitter makes it all too easy to get annoyed at something specific, whether big (like voting rights issues in Texas) or small (Marjorie Taylor Greene's gaffe calling the Gestapo the "gazpacho" police), and to trigger that "ugh, here we go again" reaction that only solidifies my sense of how dumb today's Republican Party is. That's not something I would have thought back in high school.
Kang continues: "It may also be true that typical people see more disinformation than they did 30 years ago, but it's difficult to quantify the difference between reading, say, fantastical tabloid headlines about U.F.O. sightings every time you walk down the grocery aisle and the falsehoods that come across our feeds." I agree that quantifying that difference is near impossible, but Kang seems to be glossing over the bigger point here, which is that "falsehoods that come across our feeds" (such a gentle way of putting it!) are objectively worse than a tabloid story about UFOs. The kind of disinformation that surrounds us these days is an order of magnitude more harmful than anything that came before. I don't say that to be alarmist; I say it because it's far more dangerous when people read disinformation telling them to avoid getting vaccinated than when they read disinformation claiming aliens are among us.
The thing is, the kind of dangerous disinformation I'm talking about has pretty much always been around. Fox News did not invent the game, even if they played it so well they basically rewrote the rulebook. The difference is the platform for the disinformation — particularly the ease of accessing it. In the pre-internet days, it was a lot harder for dangerous disinformation to spread, partly because of the limitations of print media and radio, and partly because people weren't able to close themselves off in echo chambers. But now it's all too easy to visit only the corners of the internet that tell you what you want to hear. This is why today's disinformation is particularly sinister: the internet can make it so that there's no information for disinformation to base itself on. If you think of disinformation and information as two sides of the same coin, then certain corners of the internet have created a trick coin — both sides are disinformation. And that's how people come to believe so firmly in what they believe: the disinformation IS their information.
Republicans have learned that if you repeat a lie enough times it becomes the truth. It doesn't matter that the rest of us know what they're saying is bullshit so long as their followers believe them. That's why it's hilariously ironic that Trump's social media platform (which looks to debut tomorrow) is called Truth Social. For the people who see only disinformation, that lie becomes their information. As Heather Cox Richardson (a historian who writes a really great daily newsletter recapping an important aspect of the day's news) noted earlier this week: "Spreading false stories depends on making sure the truth is inaccessible."
Which leads me to one area where Kang and I agree: “The path toward solving the disinformation problem should go toward broadening access to education and fixing income inequality instead of trying to persuade tech companies to remove a few notorious accounts.” I think he’s exactly right. Disinformation is no longer only a tech issue. It has spread far beyond the reach and scope of technology. Education is the best solution, which is why it’s no surprise to me that conservatives are now targeting education — particularly what we can teach our kids. Just like with voting, Republicans are trying to rig the system in their favor. Or at least that’s what it looks like on my Twitter feed.