A godly snooze: if the philosopher Berkeley’s God ever decided to catch forty winks, the consequences for existence itself would be dire. Illustration by Nicholas E. Meyer.

From earliest times, philosophers have laboured under a misapprehension: that if it appeared to them only logical (or soul-satisfying, or ultimately just aesthetically pleasing) for the world to have this or that property—for things in reality to be some given way, and not another—then that was it. Things were actually so.

However, this is a delusion. The world, and all reality, material and nonmaterial, have no need to conform to what any philosopher, under his or her way of thinking, believes should be the case. And the delusion has been extensive enough throughout philosophy to constitute a curse, for it has kept many otherwise supremely brilliant and ever-so-subtle minds from suspecting that their conclusions might just possibly be resting on unwarranted or indeed solipsistic grounds. But it goes deeper than that: the delusion has often obscured the need, in tandem with the work of cogitation, to try, wherever possible, to actually find things out. This obscuring has even occurred in cases in which lip service was paid to the idea—which stretches back to Parmenides the Greek—of checking with reality.

Cases in point of thinkers being sure that things are the way their intellect—and/or their hunches or their personal or social proclivities—has decided they should be are embarrassingly plentiful. Here are three.

One: the Zen position (which is not exclusive in Eastern philosophy) that an immediate subjective apprehension of reality is necessarily superior to reasoning or research.

Two: Gilles Deleuze’s argument that the foundation, ‘the absolute ground’, of philosophy equates with the plane of immanence. (By this, he meant a kind of soup—more precisely, a consommé—in which everything, ideas, things, the lot, coexist but without differentiation or delimitation of any kind. There, they are ‘in themselves’, which means immanence, not ‘beyond themselves’, i.e. in transcendence.)

Three: the Rig Veda’s account of the dismemberment of Purusha—primaeval man, mind, or consciousness. From his mouth came the Brahmins; from his arms, the warriors; from his thighs, the common people; from his feet, the menials; from his head, the sky; from his mind, the moon; from his eye, the sun; from his feet once more, the earth. Even if this is taken symbolically, as a poetic expression of myth, it is hard to deny that it expresses its originators’ view that society and the world ought to be organised hierarchically—and therefore, that that is how the world surely is organised.

If only such statements were phrased more tentatively. A philosopher might write, especially in areas of thinking that scarcely lend themselves to experimental probing, ‘This position I am stating is not one that I can prove to be the case—but it provides a solid, workable interpretation or model of the case. I see it as superior to previous models of how things are; so, unless and until a better one is developed, it should stand.’ Yes, the philosopher might write something along those lines. But the chances are overwhelming that he or she won’t.

In some cases, the reason for this may be that the philosopher is afraid of not having the same impact, not gaining the same level of renown, if he or she seems to sound wishy-washy instead of categorical. (To be consistent: in the present essay, categorical statements are to be understood as meaning the best interpretation of the known facts thus far.) In the majority of cases, though, the reason philosophers don’t write that way is that it doesn’t cross their minds that their conclusions could be anything less than definitive. What goes for philosophers goes, equally or even more so, for theologians.

The above title, ‘The philosopher’s curse(s)’, obviously refers to a curse (or curses) that philosophers have lived under, not one issued by them. The suggested plurality of curses is due to the fact that from the above overarching fallacy—‘I think so, therefore it is so’—others follow. Some are derived from it or comparable to it, yet aren’t identical to it; there are also some that are unrelated to it. This article lists a total of six, including the Big One already mentioned.


Here’s the second—one which, although it can be seen as a particular case of the Big One, is distinctive enough to constitute a category unto itself. It is the belief that the material world that we see, hear, and touch is inferior to, and/or less real than, some other, ungraspable one. This conception is widespread in Eastern philosophy, yet it is not restricted to it. Kant, too, was among those holding that the material world is less real than the spiritual world (however that often woolly concept is defined). The mental mechanism by which mankind arrived at this idea is transparent: the world was found to be mysterious, dangerous, and incomprehensibly complex, not to mention often unfair. Unsurprisingly, this led to a yearning for a superior, even if invisible, world; and from yearning, the next step was utter conviction that such a world indeed exists.

Notice that when philosophers gave themselves the task of apprehending the nature of the alleged ultimate reality, of finding what lay behind the multiplicity of appearances, their respective speculations—or gut feelings—took them to different conclusions, about which, naturally, each was always convinced. The example that springs most immediately to mind is that of the Presocratics, each of whom identified different elements as the underlying substance/principle, or arche, of reality: water, air, or fire. But examples also range as far and near as the Buddhist thinker Nagarjuna, for whom the root of everything was the Void, or Schopenhauer, for whom behind all reality lay the Will, or Heidegger, for whom nothing other than Being fitted the bill.

It needs to be underlined here that the list of six fallacies refers to pernicious basic approaches in philosophy—not to the simple procedural errors or writing vices that specific philosophers might fall into, even if the line between the two may not always be hard and fast. To illustrate: the listing doesn’t refer to unwittingly falling into some hooey or inconsistency that the very same philosophers may be arguing against. Nor does it refer to grating individual idiosyncrasies, like writing in a needlessly obscure way (with never an example to clarify the points being made) just to show off the author’s cleverness.

Nor does it refer to the fallacy of prior assumption, wherein a philosopher fails to notice, much less prove, an assumed point before continuing with his or her argument. The above-mentioned search for the ultimate reality behind the world provides a good example of this fallacy. The prior (unproven) assumption is that there is one such ultimate underlying substance. (Sometimes the philosophers’ brainwork led them to the conclusion that there are not one but two underlying substances, which are opposites, complements, and rivals.)

Incidentally but significantly, why one or two ultimate realities? Why not five? Why not one hundred and one? Why not even none at all? They merely thought it obvious that there had to be one (or the two that are forever fighting it out between themselves) since they found the idea of a fundamentally heterogeneous and messy universe offensive. Many people (possibly most) still do, but this is no more than a preference—in this case, of an essentially aesthetic type. Preferences, and philosophies based solely upon them, do not establish fact.

Reality may on occasion agree with someone’s preferences about the way things ought to be (in which case they won’t agree with the preferences of others who have thought differently). But that will have been no more than coincidental—analogous to the case of someone obsessed with Tuesdays who declares, every day, ‘Today is Tuesday!’ and periodically happens to be right.

John Smibert’s c. 1728–30 portrait of Berkeley. Luckily, he appears to be awake.

Here comes the third of the accursed philosophical delusions: the thought, often conscious but sometimes subconscious, that the way things are in the world depends on human understanding of them. George Berkeley, who took this idea furthest, condensed it in Latin: Esse est percipi—to be is to be perceived. For those who share this conclusion, the arguments are apparently so strong that they obscure the fact that if human understanding colours all facts about the world—or indeed precedes them—this only happens for humans. (If the philosophers fail to say so, it’s because they have failed to connect these particular dots, or because they do not attach any importance to the connection.) As for the rest of the world, it would go about its merry way, or grim way, if there were no humans to perceive it, and even if humans had never existed.

It boils down to this: it could be that, yes, human philosophy truly cannot prove there is a world outside of people’s thoughts and/or their perceptions—however, that’s hardly the fault of the world. The shortcoming belongs to human philosophy.

At the heart of any delusion that things are otherwise is human vanity, even if masked by sleight of brain. What is needed, in this as in so much else, is some humility. Not, in this case, personal humility, but a collective humility based on a true assessment of our standing as tiny creatures on the surface of a minute mote in the universe. Imagine that, one day, humans not only destroy the Earth but manage to create a black hole that swallows up the planet itself and also everything else in its vicinity. Even in that extreme case, the idea that the universe as a whole depends on humans or any of their attributes is an exhibition of hubris on a staggering scale. This, by the way, is quite typical of a lot of human thinking. Here’s a case in point: the idea that mighty planets, stars, and constellations make it their business to determine the characters and fates of humans.

The fallacy extends to science—even, or especially, in its most modern areas. The delusion appears whenever science neglects to say—or to see—that if something remains indeterminable, it may only be so to us. Science will never be able to know precisely, at one and the same time, a particle’s position and momentum. But that doesn’t mean the particle doesn’t have a precise position and a precise momentum at any given time, even as scientists’ measurements disturb them; they are just unknowable to us, and therefore meaningless to us as scientists. The particle isn’t responsible for being knowable or meaningful to us.
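For the record, the limit alluded to here is Heisenberg’s uncertainty principle, which can be stated as a bound on the product of the two uncertainties:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

Strictly speaking, the inequality constrains how sharply position and momentum can be jointly known or prepared; whether sharp values nonetheless exist ‘underneath’, as suggested above, is an interpretive question that the formula itself leaves open.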

ChatGPT-4 / DALL·E 3 portrayal of Schrödinger’s cat.

We may not know whether Schrödinger’s famous cat is alive or dead until the dust has settled. But at any given moment, the cat itself is either alive (even if dying) or dead: an uncollapsed wave function keeps observers in the dark about the cat’s status—but that can mean little to the cat.

Einstein himself, who suggested Schrödinger’s thought experiment in the first place (albeit with a non-feline example), did refer to ‘reality as something independent of what is experimentally established.’ However, this standpoint of his didn’t gain much traction. What is true is this: science genuinely cannot advance except with what is experimentally established (actually, with what is experimentally disprovable). But science, human knowledge of the world, isn’t the same thing as the world—except when human self-importance conflates the two, or faulty thinking fails to distinguish between them.

Some bad philosophical habits that harden into curses aren’t as pervasive as the above ones, although they are still too frequent. (Always read ‘philosophical’ as ‘philosophical/theological’. The medieval Scholastic period was one in which philosophy and theology were particularly hard to tell apart, but there are plenty of other cases in which one has shaded into the other. In some religions the distinction is purposely meaningless.)

One bad habit—the fourth in the list—involves philosophers whose thinking has led them to results that are mutually contradictory or absurd in a way they wouldn’t normally countenance, or who find themselves forced to choose among alternatives when they would prefer to hang on to all options. They could question their original assumptions and start afresh; or they could accept that a few things may just be unsolvable (like finding a complete and consistent foundation for mathematics, a goal Gödel’s incompleteness theorems showed to be unattainable). Instead, philosophers with the bad habit in question simply paper over the problem with a layer of mysticism.

Then, after the mystical attitude has shown the way to reconciling the antithetical or closing any annoying inconsistencies, if there are any remaining doubts about details, they can be declared solved through the invocation of a mystery: the obdurate details are not for human beings, or at least uninitiated human beings, to understand.

And if even that fails, mysticism allows direct appeals to supernatural agencies as a way out of philosophical dilemmas. Take the bitter medieval debates over the relationship between God the Father and God the Son, and then between God the Son’s human and divine aspects: Father and Son could be decided to be mystically at once distinct and one; the Son’s two aspects could be pronounced to be separate yet commingled.

Then there is, for instance, Berkeley’s solution to the dilemma raised by Esse est percipi—namely, that things dematerialise the moment people close their eyes or look away and exist anew when they are perceived again. He fell back on God (he was, after all, a bishop). God, obviously being always awake and seeing everything, keeps everything in existence. Objection overruled.

A fifth fallacy: extrapolating one’s conviction, not to the nature of the world as in the first item, but to the minds of other people. Philosophers who fall for this are merely following a widespread human practice (although perhaps they, of all people, should know better). The practice is exemplified by those who repeat the dictum that ‘Everybody needs to believe in something’, originated by those who themselves need to believe and extrapolate their need to all others. The dictum can be refuted by simply pointing to people who do not believe in anything, in the specific sense of ‘believe’ that is meant here, and who do not miss it. But that would require going out to find if some such people do exist, and it is much easier to generalise in armchair comfort.

Descartes, too, was extrapolating to everyone else when he decided that perceptions are reliable if they are clear and distinct. He was clearly imagining that if they were clear and distinct to him they would be so to others—never conceiving that the person alongside him might be having a clear and distinct perception quite divergent from his own. Different people find different things to be unarguably evident.


And so to the sixth and final curse: a curse lurking in language. Philosophers may build up claims based on language phenomena that only occur in the tongue they happen to work in. German philosophers must guard against their language’s propensity for agglutinating words: putting together a single word for a concept tends to give it added substance (particularly since German nouns get Capitals). Thus, ‘being in the world’ is, in English, an idea; the equivalent German, In-der-Welt-sein, constituting just one (albeit hyphenated) word, is much more. In-der-Welt-sein, Heidegger’s concoction, becomes an actual Thing. (The usual English translation is ‘being-in-the-world’, the hyphenation carrying over to give it a similar standing.)

And consider Jacques Derrida’s key concept différance. The fact that in French it is pronounced identically to the usual spelling, différence, is innocent enough wordplay. But it’s not innocent that Derrida makes something of the coincidence that in French différer can mean both ‘to differ’ and ‘to defer’. A philosopher who thought in English might just as well, when bringing up that ‘God’ is ‘dog’ written backwards, seriously find some significance in that fluke.

Philosophy is a wonderful enterprise. It is just a shame that its practitioners have fallen, again and again, into pitfalls that could have been avoided.

