The term “misinformation” is not what it pretends to be. It is not a scientific term, nor an objective measure of truth and falsehood. Rather, it is a verbal cudgel used to discredit the opposition and force people to conform to a certain narrative. Recent efforts to “inoculate” people against misinformation, most recently a study posted as a preprint on PsyArXiv, are just the latest in a series of attempts to control minds under the guise of protecting them.
The study’s introduction frames the problem this way:

“There are concerns about the detrimental impact of misinformation on democracy [1]. Misinformation has also been shown to affect people’s perceptions of health behaviors in experimental [2] and real-life settings [3,4]. In response, researchers have tested various interventions to counter misinformation on social media [5] (e.g., accuracy prompts [6], digital literacy tips [7], inoculation [8], debunking [9]). These interventions work through different psychological mechanisms, but all share a common goal: to improve recipients’ ability to discriminate between true and false information and/or the quality of the news they share on social media. Although this toolkit of interventions is useful, it is currently difficult to compare interventions because they have been tested in different settings, with different populations, and with different stimuli (e.g., headlines vs. tweets). These differences make it hard to know how interventions would perform in an equivalent testing environment.”

https://osf.io/preprints/psyarxiv/uyjha
The study, by lead author Stephan Lewandowsky, epitomizes how “misinformation” can be exploited not as a genuine concern with the truth but as a rhetorical weapon aimed directly at dissent. By labeling contrarian views as dangerous misinformation, the author effectively closes the door to debate and solidifies his position as the arbiter of what can and cannot be discussed.
“Inoculating” the narrative
Let’s stop pretending that misinformation is a real phenomenon. The term implies an objective standard of truth, but in practice it is applied selectively and politically. What counts as misinformation is determined not by careful investigation of the evidence, but by whether a claim conforms to the prevailing orthodoxy. If it fits the narrative, it is protected speech; if it challenges the narrative, it is misinformation.
This technique is on full display in the article. The authors describe misinformation as a disease that spreads like a virus, a framing that conveniently casts dissenters as a public health threat. The proposed solution is to “inoculate” the public by preemptively teaching them how to recognize and reject misinformation. But this raises an obvious question: who gets to decide what is true and what is false?
The answer is clear from the study’s methodology. Participants were tested on supposed items of misinformation, including the debunked claim that “97% of scientists agree” on man-made climate change. This figure, derived from John Cook’s heavily criticized research, is itself a prime example of narrative-driven data manipulation. The 97% statistic was produced through selective coding and arbitrary exclusions, yielding a number that serves political purposes rather than reflecting scientific reality.
In this study, however, participants were “inoculated” with that number as fact, and dissent from it was treated as misinformation. This is not education; it is indoctrination. By teaching people to uncritically accept the dominant narrative, the authors are not protecting them from lies but training them to parrot the party line.
Weaponization of misinformation
The real purpose of the term “misinformation” is control. By labeling an idea misinformation, its opponents can dismiss it without substantive debate. This strategy is particularly effective in fields such as climate science, where the complexity and uncertainty of the subject are reduced to simple slogans.
Consider how unevenly the misinformation label is applied. Alarmist claims about impending climate catastrophe are rarely scrutinized within this framework, even when they lack scientific support. Predictions of catastrophic sea level rise, or the claim that every hurricane is caused by climate change, are accepted without question. Yet any doubt about the effectiveness of net zero policies or the accuracy of climate models is immediately branded misinformation.
This double standard exposes the word for what it is: a rhetorical weapon used to enforce compliance. It allows proponents of dominant narratives to delegitimize opposing viewpoints without discussing their substance. Worse, it has a chilling effect on free thought, as individuals and institutions self-censor to avoid being labeled as purveyors of misinformation.
Lewandowsky: The high priest of misinformation policing
Stephan Lewandowsky’s involvement in this study is no surprise. His career has been marked by a relentless campaign to delegitimize dissent, particularly in climate science. Steve McIntyre has documented his record extensively at Climate Audit, revealing a consistent pattern: painting opponents as irrational or conspiratorial rather than engaging with their arguments.
Lewandowsky’s infamous “Recursive Fury” paper, for example, was a clumsy attempt to paint his critics as conspiracy theorists. The paper was so riddled with ethical and methodological problems that it was retracted, yet Lewandowsky has continued to employ the same strategy. Whether selectively sampling data, relying on non-representative surveys, or outright misrepresenting his opponents, his work has consistently prioritized narrative enforcement over scholarly rigor.
In this latest study, Lewandowsky doubles down on his preferred tactic: pathologizing dissent. By treating misinformation as a virus, he casts skeptics as vectors of social harm rather than individuals with legitimate concerns. This is not science; it is an exercise in narrative control.
For further reading on Lewandowsky’s problematic history, McIntyre’s analyses at Climate Audit provide thorough documentation, including his posts on the “Recursive Fury” affair and on Lewandowsky’s spurious correlations.
Misinformation as a political tool
The study arrives amid a broader societal push to stifle dissent under the guise of combating misinformation. We are seeing a growing reliance on fact-checking, social media censorship, and calls for governments to regulate “misinformation.” These efforts are rarely about protecting the public from lies; they are about consolidating power by silencing opposition.
This is particularly evident in the climate debate. Skeptics who question the efficacy of renewable energy, the reliability of climate models or the unintended consequences of policies such as “net zero” are often labeled as peddlers of misinformation. This tactic sidesteps the need for debate by treating skeptics as morally or intellectually deficient.
The dangers of this approach cannot be overstated. It erodes the foundations of scientific inquiry, which relies on an open exchange of ideas and a willingness to question orthodoxy. By framing dissent as a social ill, proponents of misinformation policing risk turning science into a dogma that allows only sanctioned viewpoints to exist.
Conclusion: Reject the Myths of Misinformation
Misinformation is not a real phenomenon; it is a rhetorical tool used to discredit the opposition and force compliance. The recent study by Lewandowsky and colleagues illustrates how the term has been weaponized to silence dissent and promote narrative control. By teaching people to uncritically accept dominant narratives, this research does not combat misinformation but rather perpetuates it.
Real intellectual progress comes from questioning assumptions, debating ideas, and acknowledging uncertainty. The concept of misinformation inverts these principles, replacing them with a system of censorship and thought policing. If we value freedom of thought and the integrity of science, we must reject the fiction of misinformation and resist efforts to use it as a tool of control.