Three days ago, the editor of The Atlantic, a major American news magazine I occasionally write for, revealed that he was inadvertently added to a group chat on the encrypted messaging app Signal, in which plans to bomb Houthi rebels in Yemen were discussed. Two days ago, the government officials who participated in that chat lined up in front of Congress to say “nothing to see here”, arguing that these obviously sensitive conversations were not classified and that the Atlantic was just trying to stir up trouble.
So yesterday, the Atlantic released an even less redacted version of the conversations, still protecting the identity of a CIA officer in the group chat, and we are now seeing that Defense Secretary Pete Hegseth listed the timeline and targets for the attack to a list that included a journalist, and might as well have included any foreign intelligence service that has managed to hack any one of the civilian-issued phones that received this data. (If you think that’s unlikely, consider this piece of reporting from Der Spiegel, where they quickly retrieved usernames, phone numbers and likely passwords of senior American officials implicated in the leak.)
One of the problems of writing about the Trump administration is the “flood the zone” strategy. If you write about your despair at academic funding cuts, you don’t take the time to write about the chilling abduction of an international student for writing an innocuous op-ed. The Signal scandal forces you to navigate “flood the zone” within a single incident – if you write about one absurdity, you run the risk of ignoring the others. So, to begin, let me note that:
1) Including a national security reporter in an exchange of pending war plans is an inconceivably huge fuck-up.
2) Holding a meeting about war plans using Signal, rather than special-purpose secure comms, is unwise, against all protocols and possibly a violation of the Espionage Act.
3) Using Signal disappearing messages to discuss government business – particularly presidential-level business – is a violation of government records law.
But what I’m focused on today is the ways in which this story may be an unfolding case study about the forking of reality.
Early polling indicates that many Americans, including a majority of Republicans polled, considered the Signal “leak” to be a serious or very serious problem. Those figures may be significant, because Americans saw the Signal leak as more serious than other information scandals when they first broke: the Clinton email server, the Trump secret documents or the Biden secret documents.
The Trump administration is not backing down from its preferred strategy: deny everything and demand Republicans get in line. Rather than acknowledging the severity of the screw-up, Republican officials, led by the President, are smearing Atlantic editor Jeffrey Goldberg, denying that the documents were a “war plan” and terming them an “attack plan”, insisting that no classified information was shared, and speculating that Goldberg had somehow hacked his way into the group and that Signal’s security was somehow broken.
On the surface, this seems absurd: acknowledging that this was a stupid mistake and demanding consequences for officials who breached security protocols seems easier than demanding that Americans accept this sloppy train of denials. But the practice of denying reality and demanding that we come along has worked well for Trump thus far, and SignalGate may offer a case study in how reality splits in two.
* * *
In 2004, Ron Suskind reported in the New York Times Magazine that a George W. Bush administration official had accused him of being part of the “reality-based community”, and that now that the US was an empire, “…when we act, we create our own reality.” This was a deeply prescient statement, and arguably opened the era of “post-truth politics”.
Politicians have always lied and exaggerated, and especially after Watergate, journalists have worked to confront them with the falsity of their statements. Projects like PolitiFact, launched at the Tampa Bay Times in 2007, turned factchecking into an attention-grabbing feature, labeling egregious claims as “Pants on Fire” with accompanying graphics. PolitiFact won the Pulitzer for reporting in 2009, and has been praised by scholars for its accuracy, though journalism scholar Dan Kennedy points out that there’s only a limited number of claims that can simply be judged true or false.
There are many complaints that PolitiFact cites Republicans for lying far more often than Democrats. Recent research suggests that populist right politicians in Europe are more likely to lie than left politicians, or non-populist right politicians – the authors suggest that lying may be a successful rhetorical strategy and needs to be examined as a form of political speech, not just as an aberration.
The rise of fact-checking in the late 2000s set the stage for the rise of misinformation research in 2016. In that remarkable year, political pundits were stunned by Britain’s exit from the European Union and the US election of Donald Trump, two developments that had seemed extremely unlikely up until the moment that they came to pass. Journalists and scholars took an interest in “fake news” – stories invented by North Macedonian teenagers in order to make money from ad views – pivoting to the terms “misinformation” and “disinformation” when President Trump began using “fake news” to mean “news he didn’t like”.
Smart and talented misinformation researchers began tracking rumors as they spread online, offering factchecks, documenting patterns of misinformation spread and providing social media platforms with tools they would need to combat the spread of information disorder. In return, they got hauled in front of Congress, and doxxed and harassed by Trump supporters and vaccine skeptics who believed their work was biased and political. Platforms took their recommendations to heart during the pandemic, then reversed course, apologizing for exerting control over content and shifting to methods of content moderation that hand sensemaking over to community volunteers and bridging algorithms.
Work on mis/disinformation has helped expose the involvement of foreign adversaries in election interference, and may well have saved lives during the COVID-19 pandemic. (If someone has a good academic citation for a study that looked at a relationship between information controls and COVID-connected deaths, I’d love to see it – my cursory look found a lot of papers asserting the importance of fighting misinfo, but little evaluative data that looked at morbidity.)
But it has emphatically not helped unite all Americans in a coherent, consistent information universe. Instead, PRRI’s post-2024 poll of election voters found that 63% of Republicans believed that the 2020 election had been stolen from Donald Trump, while only 4% of Democrats agreed. (31% of all voters and 26% of independent voters believe the election was stolen.)
Given waves of court cases, investigations and oceans of physical and digital ink written on the topic, we might have expected opinions about the 2020 election to have changed significantly. They haven’t. If anything, they’re moving in the opposite direction. A Washington Post/University of Maryland poll in 2021 found that 69% of voters thought the 2020 election was legitimate. In 2023, the same team used a similar method and found that 62% thought the election was legitimate – in other words, years of investigations and reporting didn’t increase confidence in the election, but accompanied a seven-point decrease in confidence.
Results like this are not just discouraging for factcheckers and mis/disinfo researchers. They’ve led some commentators to talk about a “post-truth” moment in politics. A brilliant essay by the philosopher Michael Hannon – The Politics of Post-Truth – begins with an array of references to “an epistemological crisis” offered by thinkers from David Brooks to Barack Obama. For a world unable to agree on what is true, philosopher Julian Baggini offers consolations and strategies for how we might navigate our “post-truth” condition.
Hannon is rightly suspicious of the term “post-truth” – the term is invariably used as an insult. Someone who believes the earth is flat does not believe that there is no truth and that truth is irrelevant: they believe something different than I do because they choose to believe different sources of authority than I do. When we move beyond the insult of post-truth, we can get to the interesting questions of why someone chooses different sources of authority and, as a result, a different set of beliefs.
Some very clever friends have offered clues to solving this problem. Renee DiResta, veteran of the mis/disinfo mines, offers the idea of “bespoke realities”, composed from the floods of information available online, driven by algorithms designed to maximize attention and engagement towards commercial ends. Rather than “manufacturing consent”, as Lippmann both celebrated and warned about in 1922, through assembling an authoritative consensus reality, we end up in mutually incompatible realities that make it challenging to acknowledge the possibility or accuracy of other points of view and the actions they might dictate.
Political scientist Henry Farrell adds the key insight that information disorder is not an individual but a collective problem: “The fundamental problem, as I see it, is not that social media misinforms individuals about what is true or untrue but that it creates publics with malformed collective understandings.” Rather than ending up in purely individual bubbles, our bubbles cluster into publics with worldviews sufficiently compatible to interpret events and debate actions. Belief in the worldviews held within these publics may not be as simple as the belief that water is wet, to use Henry’s example – some of these beliefs are “reflective beliefs”, things you’re supposed to believe because you are a Republican or a Democrat.
The big problem for Farrell is not thinking as a collective – that’s inevitable in a field as complex and information-rich as politics – but the fact that the technologies we use to connect as publics have powerful systemic biases. Twitter/X doesn’t develop a consensus from pro-Trump conservatives so much as it amplifies the ideas and obsessions of Elon Musk. (In a long and brilliant analogy to how online pornography sites have preferences shaped by people who pay for online porn, not necessarily those who consume online porn, Farrell observes, “…X/Twitter is a Pornhub where everything is twisted around the particular kinks of a specific, and visibly disturbed individual.”)
A key piece of the puzzle fell into place for me when I blogged Jay Rosen’s conversation with Taylor Owen at the “Freedom, Interrupted” conference in Montreal two weeks back. Jay noted that the “Big Lie” – the belief that Biden somehow stole the 2020 election – has become a litmus test for service in the Trump administration and, arguably, for support of the contemporary Republican party. Jay suggests that this shared belief creates a sort of camaraderie between participants, much as those engaged in a criminal enterprise might feel linked together by their shared culpability and vulnerability to arrest.
Building on the thoughts of these four thinkers, I think this “post-post-truth” moment comes about when two conditions are true:
1) We have a vast array of information offering a variety of different perspectives and interpretations and
2) We’ve lost confidence in some or all institutions and the information systems associated with them.
The rise of the consumer internet and participatory media has brought about the first condition. My 2021 book Mistrust argued that mistrust in institutions of all sorts has been on the rise in the US since the 1970s, and that this mistrust reflects the genuine failure of institutions in society. Mistrust of institutions now represents a default stance for many young people, who are more likely to trust individuals, particularly anti-institutionalist individuals, than institutions.
I think there may be a third part of the puzzle: a trigger. When an important part of your belief system comes into conflict with “consensus reality”, you’re likely to look for information that supports your beliefs. This might be a parent struggling to understand their child’s autism diagnosis and turning to “research” that leads them to vaccine skepticism, or a romantic rejection that sends someone towards the “men’s rights” movement.
As you explore beliefs that align with your experiences or your understanding, this new belief system tends to chafe against aspects of consensus reality. What results is cognitive dissonance: the mental discomfort that comes from holding conflicting beliefs. The belief that vaccination is a corrupt conspiracy to enrich the pharmaceutical industry starts rubbing up against an existing belief that government regulation is generally well intentioned and benign. That conflict is uncomfortable, and one way to alleviate it is to research and find information that claims government regulation is generally overbearing and designed to serve corporate interests, not the people. You’ll find this information easily, and enough of it will come from left-leaning points of view that if you found your way into anti-vaccine beliefs from the political left, it could be smooth sailing into an RFK Jr.-like belief system.
Not every belief that conflicts with consensus reality will cause reality to fork. I manage to read the New York Times, wincing only every fourth article, despite my haunting suspicion that Jeffrey Epstein’s death was not a suicide. Certain beliefs, however, seem designed to make reality fork and the Signal incident is likely to be one of these. Experts from previous governments, the defense establishment and the intelligence community are lining up to shout “this is not normal or safe behavior!” Accepting the Trump administration’s assurances that The Atlantic story is a hoax involves disregarding the perspectives of many people with solid conservative credentials, the sorts of military and intelligence backgrounds that generally command respect in US policy circles. But a quick read of right-wing media – I strongly recommend following The Righting, which rounds up right-wing media for left-leaning audiences every day – offers a view from the other side of this particular fork, where the so-called scandal is a damp squib.
Jay Rosen suggests that there’s a solidarity among those who encounter the Big Lie – or another moment where reality forks – and choose to switch the path they’re on. I’d note that creating your own parallel reality can be more fun and rewarding than participating in one where your ability to have input is tied closely to expertise, social position and existing influence. In 2019, I wrote about QAnon as a radically participatory conspiracy, one in which you are encouraged to do your own research, create your own theories and participate in the construction of the new collective narrative. Solving the mysteries of the deep state is both deadly serious and a fun, rewarding community effort for those who participate. Mis/disinformation researcher Kate Starbird makes a similar point, comparing patterns of right-wing information disorder to improv theater and comedy.
The sense of agency and the need for your participation for this reality to thrive are powerful incentives for people who feel ignored and disregarded. And the far-right under Trump not only appreciates participation in its bespoke reality: it rewards its best performers with recognition and real power. Dan Bongino went from an undistinguished career as a Secret Service agent to local radio talk show host to internet provocateur to deputy director of the FBI. Many of the figures in the Trump administration can trace their elevation to power to their efficacy in creating aspects of a reality that avoids cognitive dissonance with Trump’s belief that he was divinely chosen to disassemble the administrative state.
Not only does consensus reality’s reliance on certain forms of authority (professional or academic expertise, access to certain positions of power) mean that it’s hard to participate in the process of shaping reality, but the journalistic process of discerning truth creates its own forms of cognitive dissonance.
Consider the lab leak theory, the idea that COVID-19 emerged from a lapse in lab safety rather than from cross-contamination in a wet-market in Wuhan, China. As Sheryl Gay Stolberg and Benjamin Mueller explain in the New York Times in 2023, “Some Republicans grew fixated on the idea of a lab leak after former President Donald J. Trump raised it in the early months of the pandemic despite scant evidence supporting it. That turned the theory toxic for many Democrats, who viewed it as an effort by Mr. Trump to distract from his administration’s failings in containing the spread of the virus.” The lab leak theory sometimes served as a jumping off point for theories that blamed China for engineering a virus to destroy the world economy. Seeking to blunt the political impact of those theories, a group of scientists wrote in the Lancet in early 2020 that lab leak theories were conspiracy theories intended to demean and target Chinese scientists who had worked alongside counterparts around the world to identify and combat the spread of the disease.
Over time, evidence has swung from the wet market theory to the lab leak theory, with a new analysis from the CIA now favoring a lab leak as an explanation of existing data. I admit feeling a sense of cognitive dissonance reading an article by my friend Zeynep Tufekci. Tufekci has applied her formidable skills developed analyzing social media and politics to understanding the science around COVID, writing “We Were Badly Misled About the Event That Changed Our Lives”. She argues that the Lancet article I cited in the previous paragraph was drafted by a close collaborator of the Wuhan lab, trying to shield the lab from blame and hide his tracks. Not only did we get the narrative wrong, Zeynep argues, but the media and the public were systematically misled about risky research conducted within the Wuhan lab.
Reading Zeynep’s recent piece I discovered that, without really thinking about it, I still had the wet-market explanation in my head as the most likely explanation for COVID-19’s origins. It felt uncomfortable to realize that something I had considered to be true might be untrue? Less true? Likely untrue?
It’s possible to look at this sort of cognitive dissonance – “People in authority told me the lab leak was a conspiracy theory, and now they say it’s the most likely outcome” – and see it as a trigger for forking off a bespoke universe in which the media is unreliable, following political power rather than a public service mission of discovering underlying truth. How else could we get something as important as the origin of COVID-19 wrong for so long?
But the process of uncovering truth is often a messy and long one. Reporters do their best to triangulate between rival accounts of reality put forth by eyewitnesses, government figures, academics, researchers and other actors. Narratives change over time, sometimes smoothly, sometimes in an abrupt leap, like the paradigm shifts described by Thomas Kuhn.
Searching for “wuhan lab leak” on the New York Times, I found articles that gave the lab leak theory a hearing as early as May 2021, when David Leonhardt’s newsletter noted that the hypothesis may have been prematurely dismissed: “It appears to be a classic example of groupthink, exacerbated by partisan polarization. Global health officials seemed unwilling to confront Chinese officials, who insist the virus jumped from an animal to a person.” Another 2021 story interviews a Chinese scientist, but is skeptical of her claims about the safety of her lab experiments in Wuhan. In early 2023, the Times reported that the US Energy Department had determined that a lab leak was the most likely explanation for the pandemic. In June 2024, the Times ran another Leonhardt newsletter weighing the evidence for the lab leak theory versus the wet market theory, and an op-ed advocating the lab leak theory.
In other words, I perceived a sharp shift in narrative about COVID-19’s origin because I wasn’t watching the slow accumulation of evidence and the ongoing journalistic process of analysis and repositioning. My initial reaction might have been to mistrust media for getting a story badly wrong and then abruptly changing stories – looking at the accumulation of stories over the last few years, it looks like the Times, at least, has been closely tracking scientific and intelligence community assessments that one narrative is more likely than another.
Whether you read the meta-story of the lab leak as a condemnation of journalism’s errors or as a confirmation of the slow process of revealing complex truths probably has to do with its consonance or dissonance with other core narratives in our own bespoke realities. Like many people on the left, I have reasonably strong confidence in the New York Times and the processes behind “mainstream” media, so a reading in which journalism slowly finds truths is more comfortable than one in which journalism follows government diktats – I might read this story differently if my worldview centered on the unreliability of mainstream journalism.
What’s useful about this particular conception of “post-post-truth”, where we:
– understand the term “post-truth” as dismissive and recognize that truth systems emerge from the acceptance of one set of authorities over another
– understand that countering misinfo for one individual is unlikely to change political dialogs, as we live less in individual bespoke realities and more in separate publics, with different presumptions and information sources
– see the role of cognitive dissonance in pushing people from one set of incompatible beliefs toward a different, less dissonant belief system
– accept that the process of determining truth through journalistic means is slow, imperfect and can lead to shifts in narrative
– understand that processes of determining truth can feel exclusionary and that other processes can feel welcoming and participatory?
My main hope is that this lets us see problems of forking realities as more understandable, if not necessarily more solvable. (My friend Erin Kissane has a wonderful quote from Ursula Franklin on her website: “Not all problems can be solved, but all problems can be illuminated.”)
When we find ourselves in disagreement with a friend over apparently different views of the world, it may be helpful to look at the sources or processes we each consider to be valid and see if we can find mutually acceptable paths towards agreeing on a truth. We may think through the beliefs that have forced a friend’s reality to diverge from our own, and may be able to muster some compassion in understanding that those triggers are often responses to traumas. We can remind ourselves that journalistic methods towards truth lead to conclusions that can change over time as information emerges, and question our own certainty about truths we hold dear.
For me, the challenge that’s hardest is the one Farrell puts forward: thinking of the dynamics of truth through the lens of publics, rather than the lens of people. The Trump administration is aggressively promoting interpretations of events that force a rejection of mainstream journalistic approaches to truth and invite participants into co-construction of a collective narrative. Events like SignalGate might point to truths that are simply so apparent that they cause cognitive dissonance for people who’ve accepted other parts of the Trump narrative – they may be an exit ramp, rather than a fork. Or they might be a compelling invitation to construct a narrative consonant with these uncomfortable facts, a narrative that pulls two rival publics even farther apart.
I have been working through issues about mis/disinformation in preparation for a talk I am giving at Amherst Cinema, the introduction to a screening of “Don’t Look Up” on April 1 – if you’re local to the area, please come by and support independent cinema. Many thanks to Erin Kissane, Nate Kurz and Jean-Philippe Cointet, all of whom pointed me to resources and helped me think through these issues.
Header photo by Lola Audu, CC BY-2.0