
Staying Sane Is Hard Work

Sliding down into delusion is seductive, easy, and fun. Modern information technology is making it ever harder to resist. Staying sane, on the other hand, is hard work—and it is getting harder every day.

The internet has made it possible for infectious ideas to spread faster than any physical disease. For a virus to circle the globe, you need mutations and air travel. To become infected by fake news and dangerous ideas, you need only a Wi‑Fi connection. Modern technology exposes us to vastly more information than ever before, much of it unhealthy, and every time our neural networks are exposed to bad information, it feels a bit more sensible to us—even if we know it is fake. Mere repeated exposure wears ever‑deepening grooves of familiarity into our brains. The more we see, hear, and click on a claim, the more reasonable it feels. Eventually, insidiously, it becomes self‑evident—common sense that seems inescapable.

In the past, news was filtered through human editors and gatekeepers. They certainly had their biases and blind spots, but at least someone was nominally responsible for quality. Today, sources like Facebook, Fox News, YouTube, podcasts, X/Twitter, and even our government have largely abandoned any obligation to fact‑check before amplifying. They create the illusion of informed reporting but are often almost completely untethered to reality. Their algorithms and personalities have one overriding job: keep you engaged. They notice what you watch or click and then say, in effect, “If you believe that, then check this out!” They do not care whether they are feeding you solid science or the latest conspiracy theory; they only care whether you will stay tuned in and click some more. The responsibility to sort out well‑supported information from unsupported claims, sound logic from specious arguments, is pushed entirely onto you.

That would be a tall order even if our brains were perfectly rational. They aren’t. Imagine you are curious about a fringe idea like Bigfoot. You type “proof of Bigfoot” into a search engine or social platform, intending to investigate skeptically. You will quickly find articles, videos, posts, and even reality shows arguing that Bigfoot is at least plausible or even real. Because you clicked, the algorithms learn that Bigfoot content “works” on you and begin to serve you more of it: more sightings, grainy photos, confident testimony. Before long, your feed is heavily populated by Bigfoot believers. From your perspective, it starts to look as if there is an enormous body of evidence out there. Everywhere you look, people treat the idea seriously. If so many people think there is something to it, there must be something to it.
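The feedback loop described above can be caricatured in a few lines of code. This is purely a toy simulation, not any real platform's algorithm: the topics, click rates, and multiplicative boost are all invented for illustration.

```python
import random

def run_feed(steps=1000, seed=0):
    """Toy engagement-driven feed: every click on a topic makes the
    feed show more of that topic, compounding over time."""
    rng = random.Random(seed)
    topics = ["bigfoot", "science", "news"]
    weights = {t: 1.0 for t in topics}   # equal exposure to start
    shown = {t: 0 for t in topics}
    # Hypothetical click rates: a mild preference for Bigfoot content.
    click_rate = {"bigfoot": 0.6, "science": 0.3, "news": 0.3}
    for _ in range(steps):
        # Pick a topic in proportion to its current weight.
        r = rng.uniform(0, sum(weights.values()))
        for t in topics:
            r -= weights[t]
            if r <= 0:
                break
        shown[t] += 1
        # If the user clicks, the feed "learns" and boosts that topic.
        if rng.random() < click_rate[t]:
            weights[t] *= 1.05
    return shown

shown = run_feed()
print(shown)  # Bigfoot impressions end up dominating the feed
```

Even a mild click preference snowballs under multiplicative boosting: by the end of the run the feed is mostly Bigfoot, which is exactly the "everyone seems to take this seriously" illusion described above.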

In reality, you are being drawn out onto ever thinner and more dangerous limbs. The algorithm nudges you along in little steps, each of which seems perfectly solid and reasonable. This process does not just happen with Bigfoot. It happens with vaccine myths, climate denial, election lies, cultish political beliefs, and every other infectious or click‑inducing idea. The result is that many people come to feel they have made a careful, “objective” study of an issue when in fact they have been drawn, step by step, down a rabbit hole into an Alice in Wonderland alternate reality.

We cannot redesign the global information system by ourselves, but we can develop habits that make us harder to capture. One simple practice is to explicitly search for the reverse of whatever you are investigating. If you search for “proof of Bigfoot,” deliberately also search for “debunking Bigfoot claims,” and click on those results often enough that the search engines learn you will reliably choose that kind of content too. This at least gives you some exposure to different perspectives. Both sides might still be exaggerated, but you are less likely to be left with the illusion that everyone agrees with one side only.

Another, related technique is to always look back to first principles. If you only consider that next little step out along the branch, it will seem safe and sensible. But if you stop and look back at how far you have wandered from the solid trunk, you quickly realize that you are dangerously far out on a limb. Having acknowledged that we do occasionally discover new species, must we really therefore admit that a hitherto undiscovered tribe of Bigfoot might actually exist?

It also matters where you spend your time. Just as like‑minded people congregate in person, different online communities attract and cultivate different kinds of thinkers. Choose to frequent healthy online environments. That is not to say you should avoid diverse ideas; but if rumor, outrage, and unvetted claims infect the community or the platform itself, you will become infected. Seek out vibrant but serious gathering sites where people demand citations, scrutinize sources, and correct obvious nonsense. If you stick to them, your own brain will become better at recognizing sound evidence and logic, as well as specious arguments. If the level of discourse on a trusted site degrades, you should leave and stop exposing your brain to it.

Given all the infectious information we are unavoidably exposed to, it is no surprise that people sometimes slip from belief into delusion. Beliefs, at least in principle, are subject to change. We might hold them strongly, but new evidence can persuade us to reconsider. When a belief becomes impervious to change—when no amount of contrary evidence, no matter how strong or consistent, is allowed to matter—it has crossed over into delusion. Using that word makes many professionals uneasy. In a clinical setting, “delusional” has a specific meaning and diagnostic criteria. Nevertheless, in the generally accepted lay domain, delusion is the proper word to describe thinking patterns that have become impervious to evidence or reason.

When a person or a movement has fallen prey to delusional ideas, when contrary facts are dismissed out of hand or reinterpreted as attacks, we no longer function in the realm of honest disagreement. We are locked into a self‑reinforcing mental world that will not adjust to reality. In a culture where influencers dominate the discourse, the rest of us are put at risk. Delusions can be comforting, energizing, and politically useful, but facts always assert themselves in the end. Reality does not care if you believe in it.

As a result of so many infectious ideas being disseminated so quickly, we are currently suffering from a global pandemic of delusion. We cannot wipe it out, but we can protect ourselves and try not to contribute to its spread. We can monitor our own information diets, seek out counter‑evidence, choose better communities, learn to better assess claims, and be more precise in our language. We can and must resist being nudged toward delusion. As susceptible as our brains are to misinformation, they can also be trained to better assess the soundness of claims and to detect specious arguments.

The way repetition reshapes our memories and our very perceptions, the way algorithms exploit our pattern‑seeking brains, the way beliefs slide, inch by inch, into full‑blown delusion—all of these dynamics, and many others, are at work in our politics, our media, our religions, and our personal lives. In my book Pandemic of Delusion: Staying Rational in an Increasingly Irrational World (see here), I unpack those mechanics in much greater detail, with concrete examples and practical tools for recognizing when you, or someone you care about, is being nudged away from reality. If this short essay inspires you to want to bolster your defenses, the book will provide you with a practical field guide: offering insight as to why we are so susceptible to misinformation, how to recognize it, and how to immunize yourself against it. It will give you a fighting chance to stay sane when the world around you seems determined to drive you crazy.

The Insidious Effect of Big Lies

In this blog and in my book, Pandemic of Delusion (see here), I have written a lot about how we are all so woefully susceptible to lies and misinformation. We are clearly far more vulnerable than most of us are willing to believe, particularly with regard to our own thinking.

Just as there are lots of ways that vines can wiggle their way into a garden, there are many mechanisms by which lies can infiltrate our neural networks and eventually obscure the windows of our very perceptions.

And as with invasive species of vines, one infiltration mechanism is a simple numbers game. Our neural networks are “trained” through repetition. So regardless of how skeptical we imagine we are, the more lies we hear and the more often we hear them, the more comfortable we become with them.

Another counter-intuitive infiltration mechanism is size and scope. In many cases, a whopper of a lie is easier for us to accept than a more modest one. We conclude that surely no one would make up such a big lie, and surely a lie that big would be exposed if it were not true. Therefore it must be true by virtue of its audacity alone!

Implicit in this is the concept of anchoring, though I have not yet discussed it explicitly. Anchoring is most often used in economics to describe the effect of pricing. If you “anchor” the retail price of a rock at, say, $100 and then mark it down to, say, $10, most consumers conclude that $10 is a great deal, even though the rock is totally worthless. This perception is enhanced if they see lots of “competing” rocks being sold at similarly high prices and purchased by others.
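The arithmetic behind that perception can be made explicit. A toy sketch with hypothetical numbers: shoppers tend to judge the fraction “saved” relative to the anchor, not the item's actual worth.

```python
def perceived_discount(anchor_price, sale_price):
    """Fraction 'saved' relative to the anchor price, which is what
    shoppers tend to judge, rather than the item's actual value."""
    return (anchor_price - sale_price) / anchor_price

# The same $10 rock judged against different anchors:
print(perceived_discount(100, 10))  # 0.9 -> "90% off, what a deal!"
print(perceived_discount(10, 10))   # 0.0 -> no anchor, no illusion
```

The rock is worth nothing in both cases; only the anchor changed.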

As it relates to lies and misinformation, anchoring has a similar effect. When we hear a really, really big lie we sometimes accept or dismiss it outright. But the effect of the big lie is more insidious than that. First, as we have said, if we hear it often enough we will become inexorably more accepting of it. But also, the big lie anchors our skepticism.

Big lies anchor our skepticism in two ways.

First, a big lie causes us to consider that, as with the rock, there must be some value, some truth there. This plays well into our self-image as measured and open-minded thinkers. Our brains compromise. We take intellectual pride in not being fooled outright by the big lie even as we congratulate ourselves for being open-minded enough to consider that some of it might or even must be true.

Second, big lies further anchor our thinking when we are exposed to a lot of them. As with individual lies, we pride ourselves on rejecting most of the big lies, even as we congratulate ourselves for accepting that some of them might or even must be true.

And each lie we accept, or even entertain in whole or in part, makes it easier to accept more and bigger lies.

We humans have always had the same neural networks with the very same strengths and limitations. Our neural networks have always been trained through repeated exposure and have always been susceptible to the same confounding effects such as anchoring. But it is only very recently with the advent of social media that our neural networks have been exposed to so much misinformation so incessantly.

As if that were not enough to drive us to delusion, we now have Artificial Intelligence. AI has yet to show whether its god-like powers of persuasion will nudge us toward facts and reason or plunge us further into delusion and manipulation.

And to make it even worse, our reason has been further attacked by the emergence of the virulent, invasive new species called Trumpism. Trump and his allies, intentionally or instinctively, leverage the power of big lies, repeated over and over, to cause us to believe absolute nonsense. Dangerous nonsense. Even democracy-ending nonsense.

Understanding the effect of big lies on us, particularly when we imagine that we are being moderate and measured in our acceptance of them, is critical. We have to understand this at a gut level, because we cannot trust our brains on this.

One final, and perhaps somewhat gratuitous, comparison: this “partial” acceptance of an anchored big lie is not unlike the imagined “reasonable” position of agnosticism when it comes to the completely, utterly false claim that god exists. It is perhaps not entirely a coincidence that Trump’s most deluded followers are Evangelical Christians.

Pandemic of Delusion

Pandemic of Delusion can be found on Amazon (see here).

You may have heard that March Madness is upon us. But never fear, March Sanity is on the way!

My new book, Pandemic of Delusion, will be released on March 23rd, 2023, and it’s not arriving a moment too soon. The challenges we face, both individually and as a society, in distinguishing fact from fiction and rationality from delusion are more powerful and pervasive than ever, and the need for deeper insight and understanding to navigate those challenges has never been more dire.

Ensuring sane and rational decision making, both as individuals and as a society, requires that we fully understand our cognitive limitations and vulnerabilities. Pandemic of Delusion helps us to appreciate how we perceive and process information so that we can better recognize and correct our thinking when it starts to drift away from a firm foundation of verified facts and sound logic.

Pandemic of Delusion covers a lot of ground. It delves deeply into a wide range of topics related to facts and belief, but it’s as easy to read as falling off a log. It is frank, informal, and sometimes irreverent. Most importantly, while it starts by helping us understand the challenges we face, it goes on to offer practical insights and methods to keep our brains healthy. Finally, it ends on an inspirational note that will leave you with an almost spiritual appreciation of a worldview based upon science, facts, and reason.

If only to prove that you can still consume more than 200 characters at a time, preorder Pandemic of Delusion from the publisher, Interlink Publishing, or from your favorite bookseller like Amazon. And after you read it two or three times, you can promote fact-based thinking by placing it ever so casually on the bookshelf behind your video desk. It has a really stand-out binding. And don’t just order one. Do your part to make the world a more rational place by sending copies to all your friends, family, and associates.

Seriously, I hope you enjoy reading Pandemic of Delusion half as much as I enjoyed writing it.

I Say Give Them Time

As my readers know I occasionally take exception to comments made by highly respected intellectuals. I hope that when I do so it is not to engage in a gratuitous attack, but to offer an important counterpoint. In that spirit I must take exception to recent comments made by the highly respected thinker and author Malcolm Gladwell (see here).

The comments I refer to were offered by Mr. Gladwell when he appeared on The Beat with Ari Melber last week. The full exchange can be heard on the Ari Melber podcast dated July 3rd, 2021.

Mr. Melber introduced the segment by pointing out that we live in a period in which Republicans are attempting to revise history and promote lies. He asked Mr. Gladwell for his thoughts about all of that and whether there were any solutions. It should be noted that this question was asked in the context of promoting Mr. Gladwell as an expert on human thinking and behavior.

Here is a slightly polished transcription of the response by Mr. Gladwell:

I think about the role of time. I wonder whether we’re in too much of a hurry to pass judgment on the people who continue to lie about what happened on January 6th. There are many forms that denial takes. One of them is “I honestly don’t believe that anything went wrong there.” Another is “I do believe it, but I’m not ready to admit it yet.” A lot of what looks like a kind of malignant denial in the Republican Party right now is probably just people who aren’t ready to come clean and renounce a lot of what they were saying for the previous four years. I say give them time.

While this admonition for patience may sound superficially learned and wise, I find it naïve, wrong both theoretically and factually, and damagingly counterproductive. While I certainly don’t expect Mr. Gladwell to cite all his supporting evidence in a short interview segment like this, I don’t believe he has any. I suspect this is simply a well-meaning but unrealistic platitude, analogous to “the arc of the moral universe is long, but it bends toward justice.” That’s OK, except that he is putting forth an unsupported platitude as the conclusion of a purported expert in human thinking.

But such an expert on human thinking should understand that neural networks simply do not function in a way that would make “give them time” a reasonable strategy. As long as Republicans continue to hear the same old lies repeated over and over, they are not going to eventually recognize and reject them. Repeated exposure does not reveal lies but rather transforms our brains to accept them more deeply.

Our neural networks are influenced mainly by the quantity and repetition of the training “facts” they are exposed to. They have little capacity to judge the quality of those facts. Any training fact, in this case any idea the neural network is exposed to, is judged as valid by our neural network machinery in proportion to how often it is reinforced. And by the way, I know most of us want to believe that we collectively are not so susceptible to this because we want to believe that we personally are not. But we are.
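That dynamic is easy to caricature in code. What follows is a deliberately crude toy, not a model of real neurons: a single number stands in for “felt plausibility,” and each repetition nudges it upward regardless of whether the claim is true.

```python
def felt_plausibility(exposures, rate=0.1):
    """Each exposure closes a fixed fraction of the gap between the
    current feeling and full acceptance; evidence never enters into it."""
    p = 0.0  # start neutral, with no opinion either way
    for _ in range(exposures):
        p += rate * (1.0 - p)  # repetition alone does the work
    return p

print(felt_plausibility(1))   # ~0.10 after a single exposure
print(felt_plausibility(30))  # ~0.96 after thirty -- same claim, no new evidence
```

Note what the toy leaves out entirely: any input for the quality of the claim. That omission is the point.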

So, my objection to Gladwell is that he does not truly understand how our neural networks function because if he did he would understand that “I say give them time” is counterproductive advice at this time. Now, yes, it would be good advice if we were confident that Trump voters are being exposed regularly and primarily to truthful information. If that were the case I would agree, yes, give their neural networks more exposure time. However, I don’t believe that there is any reasonable basis to think that giving them more time will serve any purpose except to further reinforce the lies they are continually exposed to from Trump, the Republican Party, and Fox News. We are simply not ready to just be patient and let the truth seep in and percolate.

The more nuanced advice, in my opinion, to the question posed by Ari Melber is that we must discredit and stem the flow of misinformation from these sources and expose Republicans regularly to truly factual information. Once we do that, then, yes, I say just give them time for their neural networks to become comfortable with it. With enough exposure their neural networks will transform whether they want them to or not. But to accept the status quo right now and “give them time” as Mr. Gladwell suggests would be horribly premature and ill-advised.