President Kennedy in Dealey Plaza, Dallas. November 22, 1963.

Since John F. Kennedy’s assassination in November 1963, people have leveled conspiracy accusations variously at the CIA, the Mafia, the CIA and Mafia working together, the Soviet KGB, pro-Castro Cubans, anti-Castro Cubans, Mossad, the American military establishment, J. Edgar Hoover, and Lyndon B. Johnson – and that is by no means a comprehensive list. Recently declassified government documents regarding the murder of JFK appear unlikely to overturn the Warren Commission’s finding that Lee Harvey Oswald acted alone, and even less likely to end speculation that he didn’t. Whatever these documents say or don’t say, some folks will always suspect that hidden forces were (and are) at play.

Conspiracy theories work by providing alternative interpretations of facts that are awkward for the official story of an event (e.g., the absence of stars in the moon landing photographs). They are also generally impossible to disprove, at least to the conspiracy theorists themselves, since any “evidence” adduced against them can be written off as a product of the conspiracy – that is, exactly what the CIA / New World Order / Vast Right Wing Conspiracy / freemasons / alien lizard people want you to think. The mechanism of conspiracy is perfect for disregarding your own set of awkward facts.

I don’t personally place a lot of stock in conspiracy theories, not least because I don’t think people, even alien lizard ones, are that well-organized. That said, there is probably a literal ton of evidence suggesting that, in one specific sense, I do behave like a conspiracy theorist. “Confirmation bias” is the well-documented phenomenon that human beings tend not to arrive at conclusions after careful consideration of the available evidence, but rather latch on to evidence that supports a conclusion they already hold while questioning, discounting, or completely ignoring evidence to the contrary. No conspiracy required; we just don’t take counter-evidence all that seriously in general.

One example among many: in 1979, researchers at Stanford University conducted a study to observe how people respond to both confirmatory and disconfirmatory information. They recruited a group of undergraduates, half of whom were in favor of capital punishment and half opposed, and presented them with studies either demonstrating or disproving the death penalty’s effectiveness as a deterrent to crime. There was a catch: all the studies, pro and con, were fictitious. The researchers carefully constructed them to counterbalance each other, so that neither side was more rigorous or careful or in any way methodologically superior to the other.

The Stanford experiment found that both groups thought that the studies supporting their point of view were well-constructed, while dismissing the opposing studies as relatively poor pieces of scholarship full of methodological errors and unwarranted conclusions. Simply put, both sides were much better at finding problems with the other guy’s argument than with their own. Moreover, encountering opposition had a polarizing effect. On average, supporters and opponents of capital punishment alike walked away even more convinced of their view (“Really? That’s the best the other team’s got?”).

We seem programmed to discount opposing arguments when we consider them, which is evidently not that frequently. In a 1991 study, scholars at the Harvard Graduate School of Education found that people are pretty bad at coming up with reasons not to hold the positions they do on complicated political issues. This “my-side” bias is resistant to education, showing itself in high-schoolers and graduate students alike.


The existence of confirmation bias has pretty clear implications for our political discourse. It’s hard enough as it is to grasp the complexities of gun control or immigration policy or climate change; it’s much harder when our brains are working to shut out bits of evidence and shut down lines of reasoning that conflict with our preconceived notions. Like it or not, our equipment is faulty. The check engine light is on, so let’s not try to drive the car coast-to-coast. In other words, we need to moderate our political opinions, or at least moderate our rhetoric, or at the very least moderate our feelings of righteous indignation towards the folks on the other side of the aisle.

Alternatively, if you want to be justifiably confident in your political opinions, follow the example of scientists. Scientists are not magically exempt from confirmation bias; they are people with competitive colleagues committed to pointing out the flaws in their research and conclusions. As I’ve written elsewhere, bulletproofing your case means you need people to take shots at it. You’re not very good at seeing the holes in your arguments – no one is – so you should find other people to do it for you. This is both humble and courageous.


Of infinitely greater importance is the question of whether confirmation bias undermines the validity of Christian belief. Nobody’s unbiased about the things that really matter to them, and it seems that people who stake their very identity on the truth of Christianity would be among the worst at carefully sifting through and weighing the evidence. From an objective standpoint, it would appear that religious belief is the most suspect kind of belief there is – and the more deeply invested the believer, the more questionable the belief.

But this is only the case if Christianity is a theory based on evidence, which it is not. If Christianity’s true, then it’s not something people came up with to explain a certain set of facts. If it is something that people came up with to explain a certain set of facts, then it’s not true.

Sometimes people do argue for the truth of Christianity as the best explanation for this or that historical fact (e.g., the resurrection of Jesus Christ is the best explanation for his followers’ claims to have seen him alive again, and, afterwards, for the meteoric rise of the early church). Such efforts are very interesting and can possibly serve as supports for faith, but they’re not a good foundation for Christian belief and, I suspect, hardly anyone really uses them as one.

As Søren Kierkegaard (who hailed from Denmark, one of the very few things I know about Denmark other than that it has one of the highest tax rates in the world and that they probably invented danishes) observed in Concluding Unscientific Postscript, evidential arguments for the truth of Christianity are at their very best only probable and therefore unable to bear the awful weight of faith:

“[N]othing is easier to perceive than this, that with regard to the historical the greatest certainty is only an approximation, and an approximation is too little to build … happiness on and is so unlike an eternal happiness that no result can ensue.”

Confirmation bias is a problem confronting arguments for Christianity on the basis of historical research, archaeology, science and probably philosophy. But people don’t believe in Jesus based on historical research, archaeology, science or philosophy. They believe in Jesus because they need the help, or because they don’t know what else to do with their sin, or because the story of the gospels is irresistibly beautiful to them, or because their moms taught them to believe in Jesus and they never found a particularly good reason not to. And they believe because of God’s grace and the work of the Holy Spirit. If Christianity is true at all, then it has to be this way. Flesh and blood don’t reveal that Jesus of Nazareth is the Christ.

Which should, at the very least, lead us to moderate our spiritual hubris.