A little over two years ago, amidst the pandemic, Malaysia banned chicken exports due to a domestic supply crunch. Many in land-scarce Singapore were concerned. Over drinks, an acquaintance remarked that he believed the ban was a ploy to later justify price hikes on locally sold chicken. A few of us laughed. Perhaps it was funny that chicken lay at the heart of such an elaborate conspiracy. But I realised this may be less about poultry than about distrust of authority.
Singapore is rife with conspiracy theories. A popular one suggests the 1961 Bukit Ho Swee fires were manufactured to quell resistance to the government’s relocation efforts. Another describes beliefs among older Singaporeans that Internal Security Department agents spied on every company and Housing and Development Board building to monitor discussions of opposition politics. On Reddit forums, users document others they’ve heard in Singapore over the years. Common themes include covert acts by political leaders, top-secret military operations, the state’s surveillance capabilities, the integrity of our elections, and more. Before the 2023 Presidential Election, for example, a rumour suggested that the eventually disqualified presidential hopeful George Goh was a government plant, meant to ensure credible (though not too credible) “opposition” to then-frontrunner and current president, Tharman Shanmugaratnam. Reading these reminded me that power invites suspicion rather naturally.
This was around the time I began to consider studying the political drivers of conspiracy theory beliefs, in a field dominated by apolitical perspectives on the phenomenon. I study political psychology. Broadly speaking, this includes when and why people trust “experts”, and more recently, the structural underpinnings of conspiratorial thought. In casual settings, most people I’ve spoken to hold a rather poor impression of conspiracy theory believers: “lunatics”, “Trump supporters”, “anti-establishmentarians”, “crazy” are among the terms I hear bandied about. Though understandable, this impression, or more precisely its universality, is a little unfair and should be challenged.
Recent research may be catching on. Scholars are increasingly looking at conspiracy theory beliefs as more than a cognitive tendency afflicting individuals per se, instead contemplating the structural conditions that may promote their spread. Singapore is a great place to interrogate the assumptions such research rests on. A study published in 2022 (more on it shortly) found that higher GDP per capita correlates with lower levels of conspiratorial thinking. Yet Singapore is an outlier, with higher levels of such thinking than its high income might suggest. Why?
In recent years, talk of disinformation and misinformation has gained traction. Disinformation refers to false information spread deliberately to deceive, whether by states or by other actors with political agendas seeking to mislead or otherwise influence public opinion. Misinformation, by contrast, is false information spread without this intent, often shared by individuals who believe it to be true. (How falsehoods are defined and labelled, including, say, when satirical or parodic content uses one to make a point, is an important question, though beyond the scope of this essay.)
Both disinformation and misinformation pose challenges to political and social trust, as well as to the stability of democratic institutions. But it is important to distinguish them from conspiracy beliefs. The latter are specific narratives that usually attempt to explain key societal events with reference to secret or malevolent plots involving elites: political leaders, scientists, billionaires.
Disinformation and misinformation do not necessarily involve such elaborate or intentional deception. Conspiracy beliefs may thrive in an environment rife with misinformation and disinformation, but they are a distinct phenomenon.
Scholars have been aggressively developing psychological interventions to reduce people’s susceptibility to misinformation. A famous psychologist, for example, proposes “prebunking”: exposing people to fake news or typical features of misinformation before they encounter it in the real world, so as to “inoculate” individuals, à la intellectual vaccination. Other well-known means of combating fake news rely on similar techniques. Such efforts are rarer in dealing with conspiracy theories (though they do exist: researchers recently designed a generative-AI chatbot that offers persuasive counter-arguments to conspiracy theories). This is partly because conspiracy theory beliefs tend to be more resilient and more socially embedded, often intertwined with one’s identity or distrust of authority.
Psychologists say conspiracy theories share the following features. First, they claim to address issues that ought to be of public interest. They also often describe harmful acts, and are “oppositional”, meaning the explanations they propose challenge official narratives. Importantly, they ascribe agency, attributing intentionality to certain individuals at the expense of more structural or systemic explanations.
Given these features, the epistemic foundations of conspiracy theories are shakier; taken together, the features make such theories less likely to be true. This distinction matters. Often, there is no inherent reason a conspiracy theory is wrong (that is, simply by virtue of its describing a conspiracy), though the simplistic nature of these narratives renders them, in their totality, less plausible. (For our purposes, let’s set aside the more radical conspiracy beliefs, such as the idea that the world is secretly controlled by lizard people, which reside on the fringes of the public sphere. Sizeable minorities, and in some cases majorities, believe conspiracy theories that target key political elites.) Last but not least, conspiracy theory beliefs are sticky; they are notoriously resilient. As social constructs, they may become the basis for shared understandings and worldviews.