I’ve long been a proponent of the notion that the polarized camps of belief and skepticism are often guilty of the same transgression: ironically, that transgression is belief. Perhaps this sounds redundant, or like a circular argument that pokes fun at its own simplistic irony. It isn’t; the biggest problem with both ideological schools (belief and skepticism) is that proponents of each carry a self-contained, inherent belief that they prepend to anything they attempt to explain with science. If observable data reveals something a “believer” thinks reinforces their argument, they call it “evidence.” If a skeptic does the same, they call it debunking. What both parties consistently fail to do is take a closer look at information that, superficially, fails to support their predisposed theory.
In the new issue of Wired, Jonah Lehrer has written a brilliant article called “The Neuroscience of Screwing Up.” In it, he shares a variety of observations that question how the human mind, and the scientific process in general, work. Below is a brief excerpt from this study in scientific failure that pertains specifically to the treatment of “anomalies,” and to how people are predisposed to interpret them:
The reason we’re so resistant to anomalous information–the real reason researchers automatically assume that every unexpected result is a stupid mistake–is rooted in the way the human brain works. Over the past few decades, psychologists have dismantled the myth of objectivity. The fact is, we carefully edit our reality, searching for evidence that confirms what we already believe. Although we pretend we’re empiricists–our views dictated by nothing but facts–we’re actually blinkered, especially when it comes to information that contradicts our theories. The problem with science, then, isn’t that most experiments fail–it’s that most failures are ignored.
This is good food for thought… and it raises the question of where we really draw the line between things like myth, science, objectivity, and belief. Much like the experimental musings of speculative physicists, even when something observed doesn’t fit the given setting, that doesn’t rule out the possibility that the anomaly might exist without conflict in a different, hypothetical setting.
‘What the thinker thinks, the prover proves,’ as RAW put it. This isn’t a particularly new idea, just one that has been (amusingly) ignored. To work it into the framework of memetics: the ideas riding us that hold the most control fear other ideas that might wrench that control away. And, of course, the idea that we ourselves may be subject to confirmation bias is, in and of itself, counter to those biases that wish to be confirmed.
If we do not use our beliefs as tools, our beliefs use us as tools; then, as John Keel once said, ‘belief is the enemy.’ We tend to see what we believe, not the other way around. And although it is easier to see what we believe when it jibes with what everyone around us believes (when it is credible), we can still see what we believe even when it would be impossible within the socially, and perhaps empirically, accepted framework of reality. Beliefs don’t give a damn about reality, just as reality doesn’t give a damn about beliefs.
PKD’s aphorism comes in handy here: reality is that which, when you stop believing in it, doesn’t go away. How do you know whether something is really there? See if it changes form (or disappears entirely) when you wholeheartedly adopt a belief system that denies it. Bigfeet become bears and escaped gorillas, and if we as observers can so easily change their nature by changing our beliefs, then the reality of the situation is unclear. But if we change our belief system and discover that the thing is not a bear by any stretch of the imagination, even in a bigfeet-are-bears reality tunnel, and is likewise not a gorilla, then perhaps we have seen a real one.