Tuesday, April 20, 2010

The fallacy of the "independent thinker"


There must have been a time when the expression “Think outside the box” meant something. I’m not quite sure, because many expressions that have passed into and out of corporate-speak seem to have been crafted to obscure, if not drain, meaning from a statement.

From my experience, the cliché once captured the idea of changing one’s perspective or approach to problem-solving as a means to achieving a goal. No matter how fecund a particular style of management or problem-solving normally is for you, occasionally you find a situation where it doesn’t produce results easily, if at all. Or, it may work for a while but lose its effectiveness over time. Or it can lead to a lot of wasted time and resources chasing avenues of approach into blind alleys. As one writer put it (back in the days of the typewriter), “Some days the words won’t come; other days, they can’t get to the wastebasket fast enough.”

In other words, you were “boxed in” to a method. So you got out of the box by trying something different. A literary example: When Sherlock Holmes was stuck on a problem, and simply smoking two or three pipes of shag tobacco wouldn’t do, he’d perform a chemistry experiment on the theory that “a change of work is the best rest”. Or he’d lie back on his sofa and play the violin. Sometimes something as simple as taking a walk, or facing a different direction, or meeting at a bar rather than a conference room, can stimulate one’s creativity.

Anyway, the whole point of “thinking outside the box”—at least as I understood the term for the last few years—is that, although you’re using a different method of thinking or stimulating creativity than you ordinarily do, you’re still trying to solve the same problem. You change the means of achieving the goal, not the goal itself. If a change in method produces a change in goal, then you’ve gone astray and need to be herded back.

I think it’s because of the cliché’s association with creativity and independence that it has become synonymous with that older bromide “Think for yourself”. This degradation of meaning was recently brought home to me in a discussion on Fr. Dwight Longenecker’s blog, “Standing On My Head”, when a virulent critic of the Catholic episcopate accused defenders of the Pope of parroting a party line back and forth without real thought. In reply to one of my criticisms, he responded that he was merely trying to get us to “think outside the box”. The resulting argument went downhill from there.

Looking back, I see that the interchange between “Joseph” and myself was not one of my better debate moments; I should have taken more time to familiarize myself with the content and context of his previous arguments rather than taking issue with the latest of his posts. It was another one of those cases where “meaning well” simply wasn’t enough.

However, I must admit that the tenor of “Joseph’s” accusation—we weren’t “thinking for ourselves”—got under my sometimes thin skin. It’s a common anti-Catholic accusation that intellectual assent to the magisterium of the Church puts the Catholic brain to sleep, that we are and have always been mindless sheep, mental cripples who give credence to even the most ridiculous postulations of Pope and bishop with nary a flicker of doubt or dismay. Even a modest survey of the last fifty years of Catholic theological wrangling—to say nothing of the last 2,000 years—should be enough to put holes in that assertion large enough to float an aircraft carrier through. Yet our critics don’t even acknowledge our internal bickering, save when it serves their inconsistent purposes.

At one time, I would have said that I’d become disenchanted with “independent thinking” because of its anti-Catholic (even anti-religious) associations. But that wouldn’t be the truth, and I wouldn’t want to use that phrase even as a throwaway line. Instead, I would say I concluded that the phrase “think for yourself”, as it’s currently used, is little more than a synonym for “agree with me” … or, at the very least, “disagree with them”, whoever “they” are in the context in which that cliché is uttered.

We start from the axiom that a true statement is one which conforms as much as possible to the object, quality or action it tries to describe: the congruence of subjective observation with the reality it observes. It’s possible for a true statement to be incomplete, to see only one side or angle of the object, or to miss shades of quality. It’s also possible for something to be true only within a specific context or set of conditions. But besides William James’s criteria of a truth fitting with other truths, producing results and shedding light on other things, a truth must also meet the criterion of being a statement other people can understand and agree with. Not everybody has to agree, of course; however, if the Jamesian criteria are consequences of a statement being true, then its denial must also have consequences: disjunction, dysfunction, darkness and discord.

Let me put it in terms that refer back to my post on subjectivism: Even if statement X is only conditionally true—that is, only true within certain boundaries—then within those boundaries it’s not just “true for me” but true for everyone, whether they can see it or not, whether they can accept it as true or not. If I am the first person to observe that fact, then theoretically I should be able to describe it in a way that other people can also observe and agree to it. But if I’m not the first, then other people should be able to describe it to me in such a way that I can see and agree to it.

So far we’re speaking of the proper sense of knowledge, in which I (that is, the individual person) have observed some fact and added it to the stockpile of “things I know”. I’ve done the experiment or the tests; I’ve created and walked through the reasoning; I’ve directly benefitted or suffered from the consequences of it. But there are a lot of facts in that stockpile that are knowledge in the improper sense: other people have done the testing and theorizing, and have communicated their findings to me, and I have accepted them. I haven’t done the necessary calculations to prove that the earth is a sphere; I’ve never observed the process of photosynthesis; I wasn’t standing on the Haitian shore as Cristoforo Colombo stepped off the Santa Maria in 1492. Nevertheless, I’ve accepted them as facts.

I do this—we all do this—because we simply don’t have the time, the resources or the talent to make all the necessary observations and calculations to make these facts knowledge in the proper sense. In the case of history, it simply isn’t possible to go backwards in time to verify the various dates and events that are pounded into our skulls in the classroom; nor do we all have equal access even to the primary sources. A lot of what we know, we take on trust: we trust that the people we learn from know what they’re talking about, that they don’t have a desire to mislead or manipulate us, and that if we could (and wanted to) actually make the necessary observations and calculations, we would find that the experts were right.

Now, to a certain extent, it does occasionally prove fruitful to question facts, even if all one does is rediscover that the facts espoused by authority were true all along. But the point of questioning such facts is to discover whether they are facts; if they crumble under pressure, then progress may be made. But if they don’t crumble, then denying them is a sign of being not strong-minded but hard-headed. Nor is universal skepticism a sign of wisdom; rather, it betrays a distrust of others that goes beyond reasonable limits.

For this reason, the fallacy of the “independent thinker” has two weaknesses. The first is that it depends on the illusion that all one’s knowledge is derived independently. The second is that it depends on the unnecessary assumption that independent thought must lead to countervailing conclusions.

Sir Isaac Newton once said that, if he saw further than others of his day, then it was because he stood on the shoulders of giants. It was his brilliance that allowed him to take true observations from his intellectual predecessors and put them together into a cohesive theory of motion. But it was his wisdom to recognize that creating the theory did not make those truths original to him. For my part, if recognizing and retailing that which I see must be true leads me to be condemned as “unoriginal” or “stale”, I can live with that. Given an exclusive choice of being either original or right, I’d rather be right.

Better to be in the box than out on a limb.