RUMINATIONS

Part I – Chapter 2

About Induction

1. Critical thought

2. Misappropriation

3. Evidence

4. Detail

5. Seems and Is

6. Adduction

7. Pertinence

8. Trial and Error

9. Field Specific

10. The Human Factor

11. Theorizing

12. Approaching Reality

13. Experiment

14. The Uncertainty Principle

15. Epistemic Ethics

16. Phenomenology

17. Appearance, Reality and Illusion

18. Existence and Non-existence

19. Philosophy and Religion

1. Critical thought

Critical thought, or criticism, is the consideration of the truth or falsehood of an idea – not only its truth, and not only its falsehood. It is not an essentially negative, any more than positive, penchant, but an attitude of rigorous review in judgment, of keeping our standards high.

What makes a theory “scientific”, in the strict sense, is not whether it emanates from some prestigious personage or institution or corporation, but whether a maximum of care has been taken to formulate it and test it in accord with all known criteria of inductive and deductive logic. Science does not primarily mean, as some imagine, lab technicians with white aprons or university professors, or the exact sciences or mathematical equations. The term “science” initially refers to serious study, or to pursuit of knowledge as against mere opinion. It signifies a sustained effort of sound methodology, as currently possible and appropriate to the field of study concerned.

2. Misappropriation

The most common logical fallacy is perhaps the misappropriation of logical expressions – using the language of logic, without having in fact resorted to logical processes. This often suffices to convince some people.

For example, one might say “it is a reasonable assumption that…” when one has made no attempt to logically check the issue out; or “it may be inferred that…” when no deductive or even inductive logical process allows such inference. One gives the impression of logic, but without factual basis. Words like “it must be that”, “a fortiori”, “in conclusion”, “because of”, etc., are freely used as alibis, in lieu of logic, in the way of mimicry, when logic was in fact ignored or opposed.

Of course, such behavior in discourse is not always intentional dishonesty. It is often due to ignorance of logic or lack of logical skill, or even just to inattentive, vague and imprecise thinking. In particular, many people are not aware of the difference between strictly deductive inference and merely inductive inference – these two logical modes being all the same to them. Sometimes, even though their reasoning was sound and its results plausible, they are just not aware of exactly how they did it.

An example of intentional dishonesty is the discourse of Nagarjuna, which, as I show in Buddhist Illogic, is replete with pretended logic.

Another notable example of pseudo-logical discourse is Sigmund Freud’s “Moses and Monotheism”. His method there can be characterized as false advertising and creeping annexation. He says he won’t engage in some form of argument (which would be too obviously logically illicit or unscientific); and then, in the very next breath or gradually thereafter, he goes ahead and inserts that very argument into his discourse (to justify his prejudices). He loudly acknowledges the argument to be invalid (so as to give the impression that his approach is virtuously objective and scientific); then, coolly ignoring the very methodological imperatives he has just admitted, he hammers home his (foregone) ‘conclusions’. It is psychological manipulation. He relies on the prestige acquired in his field to pass over lies concerning another field.[1]

3. Evidence

Every experience (concrete appearance – physical or mental percept, or intuition) is ‘evident’, in the sense that it is manifest before consciousness and that such appearance automatically gives it a minimum of credibility.

Concepts or theses (products of abstraction) are not themselves evident in this sense (though they too ‘appear’ in a sense), but rely for their credibility on their relation to certain experiences. An experience is ‘evidence for’ some concept or thesis, when it serves to confirm it adductively. A concept or thesis is ‘evidently true’ to the degree that such evidence for it is to be found.

A concept or thesis is said to be ‘immediately evident’, when very little effort is required to establish its truth, i.e. when the evidence that suffices to do so is readily available to everyone.

A concept or thesis is ‘self-evident’ (or evident by itself), if it is provable without reference to further experiential evidence (other than the minimum experience underlying its very conception or formulation). Such proof is achieved by noticing or showing the negation of the concept or thesis to involve an inconsistency or a self-contradiction of some sort.

We label ‘obvious’, then, all experiences (as such, i.e. in and for themselves), as well as ‘immediately evident’ and ‘self-evident’ concepts or theses.

4. Detail

An important criterion for the credibility of theories is the degree of detail they propose. For instance, the immediate Creation theory is vague, whereas the gradual Evolution theory offers detailed descriptions of entities and processes. But of course, even the most detailed theory may turn out to be false. The existence of elaborate fictions in the form of novels (or scientific hoaxes presented as fact) shows that detail is not by itself proof.

One should also distinguish between explaining (e.g. fossils are leftovers of creatures that lived on earth in times past) and explaining-away (e.g. fossils are mere artifacts placed on earth by God to test people’s faith). The former is generally preferable to the latter. Though here again, the criterion is not determining.

5. Seems and Is

The following are some of the inductive arguments which help clarify the logical relations between the copulae ‘seems’ and ‘is’:

Uncertain mood:

P seems true and NotP seems equally true;

therefore (for this observer, at this time):

P ‘may be’ true, and equally NotP ‘may be’ true.

Probabilistic mood:

P seems true more than NotP seems true;

therefore (for this observer, at this time):

P ‘is probably’ true, and NotP ‘is probably not’ true.

Decisive mood:

P seems true and NotP does not seem true;

therefore (for this observer, at this time):

P ‘is’ true, and NotP ‘is not’ true.
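The three moods above lend themselves to a schematic rendering. The following is a minimal sketch in Python – my own illustration, with hypothetical names and an arbitrary 0.0–1.0 scale of ‘seeming’, not a formalism given in the text:

```python
# Minimal sketch (hypothetical, illustrative only) of the three moods above.
# 'seems_p' and 'seems_not_p' are rough degrees to which P and not-P seem true
# to a given observer at a given time (say, on a 0.0-1.0 scale).

def seems_to_is(seems_p: float, seems_not_p: float) -> str:
    """Map relative 'seeming' to the corresponding copula, for this observer at this time."""
    if seems_p > 0 and seems_not_p == 0:
        return "P is true, and not-P is not true"                        # decisive mood
    if seems_p > seems_not_p:
        return "P is probably true, and not-P is probably not true"      # probabilistic mood
    if seems_p == seems_not_p:
        return "P may be true, and equally not-P may be true"            # uncertain mood
    return "not-P is probably true, and P is probably not true"          # probabilistic mood, reversed

print(seems_to_is(0.7, 0.0))   # decisive
print(seems_to_is(0.7, 0.3))   # probabilistic
print(seems_to_is(0.5, 0.5))   # uncertain
```

Note that the conclusions remain relative to the stated observer and time, as the arguments themselves indicate.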

6. Adduction

Adductive inference often takes the form of a deductively invalid syllogism, such as:

All Z are Y, and

these X are Y;

therefore, these X are probably Z.

Of course, strictly speaking the conclusion does not follow from the premises; however, the premises do suggest some likelihood for the conclusion.

For example, “all beans in your bag are white, and the beans in your hand are white; therefore, the beans in your hand are probably from your bag.”
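The bean example can be sketched schematically. The following is a hypothetical illustration (the function and names are mine, not a formalization given in the text): a matching observation confirms the hypothesis somewhat, but leaves rival hypotheses standing, which is why the conclusion remains merely probable.

```python
# Adductive inference sketch (hypothetical illustration): an observation that matches
# a hypothesis's prediction confirms it somewhat, but does not prove it, since rival
# hypotheses may predict the same observation.

def adduce(hypothesis: str, prediction: str, observation: str) -> str:
    if observation == prediction:
        return f"'{hypothesis}' is confirmed somewhat (still only probable)"
    return f"'{hypothesis}' is disconfirmed (prediction conflicts with observation)"

# Bean example: "these beans are from your bag" predicts they are white,
# since all beans in the bag are white.
print(adduce("these beans are from your bag", prediction="white", observation="white"))
# A rival hypothesis, e.g. "these beans came from some other source of white beans",
# would predict "white" too - which is why the observation supports, but does not
# entail, the stated conclusion.
```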

7. Pertinence

Pertinence might be explicated as the construction of an appropriate major premise, so that a given minor premise is enabled to yield the proposed conclusion. (I am thinking here of my findings in a-fortiori logic, generalizing the way we comprehend certain Biblical statements as inferences by interposing a presumed tacit major premise.[2])

How is the missing major premise discovered? It is not found by some direct, infallible insight – but as in all our knowledge (although we may not be consciously aware of these mental processes), it is arrived at inductively, by means of trial and error.

There may in fact be several alternative major premises, equally able to fulfill the required task of making the inference possible – equally pertinent. We may be aware of only some of these available possibilities.

We start by proposing a likely candidate for the post of major premise. This may at first glance seem like the most likely hypothesis. Later, we may change our minds, considering that the candidate does not fit in our overall context of knowledge in some respect(s). For instance, the proposed major premise might be more general than necessary, so that although it allows us to draw the desired conclusion in the present narrow context, it causes some havoc in a wider perspective. In such case, we propose a less general major premise or a considerably different one; and so on, till we are satisfied.

A hypothesis proposed is ‘pertinent’, if it can do the job at hand, which is to infer the desired conclusion from the given (minor) premise, even if it turns out to be rejected because it does not fit into the broader context. A proposed major premise incapable of fulfilling this role is ‘impertinent’.

8. Trial and Error

With regard to the trial and error involved in adduction: “trial” means trying an idea out in practice, testing a theory by observation; and “error” means that some of the ideas we test will fail the test and thus be eliminated from further consideration or at least adjusted.

This is a rather broad notion. There are perhaps numerous types of ‘trial and error’ – in different fields of study, in different situations – which we ought to distinguish and list. I do not attempt that here.

It should in any case be stressed that this simple method is pervasive in our pursuit of knowledge. Already at the level of sensation, we are using it all the time. For instance, when we smell food to check out if it is fresh, we are using this method. At the level of concept formation, we again repeatedly appeal to it. E.g. when we try out different definitions for a group of things that seem similar, we are using this method. Similarly, when we formulate individual propositions or compounds of many propositions, we use trial and error.

Trial and error is not just a ‘scientific method’ for high level theoreticians and experimenters – it is the basic way to knowledge by mankind, and indeed by all sentient beings. It is ‘adaptation’ to the environment in the domain of knowledge, a subset of biological adaptation applicable to conscious organisms.
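As a minimal illustration of this propose-test-adjust cycle (a hypothetical toy example, not drawn from the text), the loop below keeps adjusting a candidate rule until it accounts for the observations:

```python
# Minimal trial-and-error loop (hypothetical toy example): propose a candidate rule,
# test it against the observations, and adjust it each time it fails.

observations = [2, 4, 6, 8]                      # the data the idea must account for

def candidate_rule(n: int, step: int) -> int:
    return step * n                              # the "idea" being tried: multiples of `step`

step = 1                                         # initial trial
while [candidate_rule(n, step) for n in range(1, 5)] != observations:
    step += 1                                    # error: the trial failed, so adjust and retry

print(f"surviving candidate: multiples of {step}")   # -> multiples of 2
```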

9. Field Specific

Each field of study has methods and parameters peculiar to it, as well as many that are found in common with other fields. We may thus refer to specialized principles of logic.

For example, the logic of historical research (historiology) would demand that the various forms of evidence – physical remnants (artifacts, drawings, writings, etc.), behavioral indices (traditions handed down), as well as verbal sources (witnesses, second-hand contemporary testimony, historians’ later claims, etc.) – be clearly categorized and distinguished from each other, and their relative weight as evidence be assessed as objectively as possible.

10. The Human Factor

Induction depends greatly on the human factor – on our intelligence (in some cases, genius), on our open-mindedness, on the clarity and rigor of our thinking, and on the detachment and carefulness of our reasoning and experimentation.

When theorizing and setting up tests to confirm or reject our theories, it is important to make a big effort to foresee all conceivable explanations and all their possible implications. If the theories considered are not all the theories conceivable in the present context, or if we do not correctly work out their respective experimental predictions, our inductive conclusions are bound to be faulty and misleading.

The danger could be illustrated with the following example from the history of science[3]. At one time, people thought that tiny living organisms could be ‘spontaneously generated’ – e.g. that maggots could appear out of nowhere in rotting meat. This seemed contrary, for instance, to the thesis that all life was created in the first week. To resolve the issue, a scientist called Francesco Redi (Italy, 1626-97) devised an experiment in 1668, enclosing meat in a container that flies could not penetrate and observing whether flies emerged in it. As it turned out, no flies emerged from within the meat, leading Redi to the conclusion that flies lay eggs and had in this case been prevented from doing so.

Well and good. However, suppose Redi had found flies in the meat – would he have drawn the conclusion that flies are spontaneously generated? He would have been tempted to do so, since (as far as I was told) he did not foresee alternative theses, such as that flies’ eggs might be carried to the meat like pollen or always be present in it like bacteria. If that had been the case, Redi’s inference from the appearance of flies in the meat would have been erroneous. We see from this example the importance of conceiving all possible alternative explanations for a phenomenon before testing one’s theories.

Note in passing that this is an example of what J. S. Mill much later called ‘the method of residues’. The alternative explanations are listed, then tried out and eliminated one by one, leaving one theory we can still rely on. Of course, the reliability of the residual theory depends on the exhaustiveness of the original list of theories. If all theories are eliminated, we know (from the law of the excluded middle) we need to somehow conceive one more. Sometimes we lack the necessary intelligence or information for that.

A current example of this is the debate in the USA between Creationists and Darwinists. The latter support Darwin’s theory of evolution, and point to the plentiful and varied empirical evidence over billions of years for it (though the issue of origin remains unresolved); while the former support the Biblical idea of sudden emergence of life just a few thousand years ago and suggest “intelligent design” as an alternative outlook. Each group considers that the other’s ideas should not be taught in the classroom.

But, it seems to me, the idea of Divine creation (apart from other specifics of the Biblical narrative) is strictly speaking compatible with Darwinism, if we grant that God chose to institute ‘chance’ evolution (i.e. spontaneous genetic mutations and environmental selection) as the way the life He created in nature would proceed thenceforth. A third alternative is thus conceivable, which reconciles the conflicting theses and allows biology to be peacefully taught in the classroom.

11. Theorizing

Theorizing is of course not a one-time, static thing, but an ongoing, changing process.

An old theory may be replaced by a new one, either because the facts currently faced are not covered by the old theory or because some logical or conceptual imperfection or inadequacy has been found in it. The new theory may not be much different from the old, a mere adjustment of it, but it must in any case bring something extra to bear, either a wider capacity to explain facts or some sort of logical improvement or conceptual clarification.

In setting standards for theorizing, we must highlight the fallacy of relying on “somehows” as a way to leap over holes in one’s theories. This may be viewed as one of the ways people “jump to conclusions”.

For example, to defend the idea of theodicy (Divine justice or karma), we posit a thesis of reincarnation (in this world or another). That is, seeing the injustice evident in everyday life, we first think there must be some hidden guilt in the life of the victim, and that unpunished criminals will be dealt with before their life is through. We assume that, in the long run, over the course of a whole life, apparent discrepancies are canceled out and equilibrium is restored. But then, realizing that this too is evidently not empirically true, we assume reincarnation as an explanation. For instance, children are sometimes raped or murdered; and since these are clearly innocent victims within their current life, granting that children are not punished for their parents’ sins, the assumption of justice makes us suppose that they committed a commensurate crime in a past life. Similarly, for an evidently unpunished criminal, it is assumed that Divine justice will punish him in an afterworld, or that karma will do so in a future life.[4]

In cases like this, the big fallacy is to be satisfied with a “somehow” to fill the gaps in our hypothesis. In the case of reincarnation, for instance, the theory should not be accepted unless an exact description of events in the transition from body to body were proposed, combined with a set of testable predictions that would make possible at least some empirical confirmation of the thesis (besides the events it is designed to explain). The apparent support that a vague reincarnation thesis gives to the foregone conclusion that “there is always justice” is not sufficient.

There are almost always hidden obscurities in our theories: the vagueness of some term, the lack of clarity of some proposition, the jumping to conclusions in some argument. Indeed, the sciences cannot claim success in their enterprise so long as philosophy cannot claim its own. So long as consciousness, knowledge, universals, and similar concepts and problems of philosophy are not fully understood and solved, anything the special sciences say ignores such underlying obscurities and uncertainties. This means that the apparent success of science is temporary and delimited. Success can only be claimed at infinity, when all branches of knowledge reach their respective goals.

12. Approaching Reality

What do we mean by a thesis “approaching reality”? We refer to the disjunction of all conceivable (now or ever, i.e. to date or in the future) solutions to a problem. At every elimination of one of these alternative solutions, all other alternatives are brought closer to being “the” solution. It is a bit like a game of musical chairs, where the last, leftover contestant will be declared the winner. As the list of possibilities is shortened, the status of each possible solution is increased. Thus, it is not only through confirmation (of a given thesis), but also through rejection (of alternative theses), that the given thesis advances in our esteem, or in its “degree of truth”. In this way, we do not have to claim every thesis true or false without making nuances, and can view the quantitative aspect of induction as having formal justification.
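A minimal numerical sketch of this ‘musical chairs’ process follows (my illustration; treating the surviving alternatives as equally credible is a simplifying assumption, not a doctrine stated in the text):

```python
# "Approaching reality" by elimination (simplifying assumption: the surviving
# alternatives are, for illustration, treated as equally credible).

hypotheses = {"A", "B", "C", "D"}     # the disjunction of all conceivable solutions

def degree(alternatives: set) -> float:
    """Credibility of each surviving alternative, if the survivors share it equally."""
    return 1.0 / len(alternatives)

print(round(degree(hypotheses), 2))   # 0.25 - four candidates left
hypotheses -= {"D"}                   # one alternative is eliminated by testing
print(round(degree(hypotheses), 2))   # 0.33 - each survivor is 'closer' to being the solution
hypotheses -= {"B", "C"}              # further eliminations
print(round(degree(hypotheses), 2))   # 1.0 - the residual thesis, like the last chair left
```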

13. Experiment

Experiment is a category of observation. It is observation in the midst of active interventions, in contrast to totally passive observation. Even when an observer moves around an object to see it from other angles, without interfering with the object, that is experiment of sorts. Asking people questions on some topic is also experiment of sorts.

Of course, when we think of experiment, we especially think of manipulations of some object – i.e. changing some conditions in or around it, and observing how its properties or behaviors are affected. Scientific experiment may be viewed as a way to speed up observation – making the object go through different phases of its nature, rather than waiting for it to vary by happenstance. Experiment improves on mere observation simply because it expands its scope. Experiment is not some new discovery by modern science[5], but has always existed – since the first man prodded some beast with his finger to see how it would react!

To conclude, the distinctive feature of experimentation is not manipulation of the object, but action by the observer. The essence of experimental research is still observation. It is active, instead of passive, observation. Experiment is not some epistemological category apart from and superior to observation.

Indeed, one might well ask whether any observation is passive. The answer to that is necessarily yes: at the end of any experimental activity, there has to be a moment of passive observation. One might rather say, then, that the essence of observation is passive – patient looking and seeing, receptivity and attention.

Experiment can of course go wrong for a variety of reasons; its results are not always credible. It may be designed on the basis of wrong theoretical or practical assumptions; the physical equipment intended to control or measure the phenomena studied may be badly constructed or set up; the researchers may be insufficiently careful and accurate in their handlings and readings, whether inadvertently or ‘accidentally / on purpose’; the researchers may erroneously record their correct findings; and the results may be misinterpreted, due to weak logic or lack of intelligence or narrow knowledge base, or simply due to conscious or unconscious bias.

Often, experimenters are simply unable to see things differently from the schemas they are used to, and have foregone conclusions in their minds no matter what the experiments they make imply. Sometimes, however, experimental results seem contrary to all expectation, and the incredulity of researchers is eventually legitimated by review of all procedures and further experiment. If an experiment gives inexplicable results in the light of all current knowledge and theory, one should indeed review and redo it very carefully.

Thus, theory and experiment have a dynamic, two-way relation. Experiments are meant to confirm or refute theories, by testing their predictions. But also, theories are used to design and evaluate experiments, as well as to explain their results. The two must repeatedly be adapted to each other.

14. The Uncertainty Principle

The Uncertainty Principle of quantum physics, according to which we cannot precisely measure both the position and the momentum of a particle at a given time, may be interpreted either epistemologically (i.e. as an insurmountable practical difficulty of observation and calculation) or ontologically (i.e. as something out there, a truth about the particle itself, such that it does not have precise position and momentum). Taken in this neutral manner, it is presumably generally accepted as scientific fact; it is the interpretations of it that are debated.
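For reference, the standard textbook statement of the principle (not given in the text, but uncontroversial) bounds the product of the two uncertainties:

Δx · Δp ≥ ħ / 2

where Δx and Δp are the uncertainties in position and momentum, and ħ is the reduced Planck constant.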

Classical physics would opt for the epistemological view. This would say that at the phenomenal levels under consideration, any measuring instrument or technique physically affects the objects to be measured, and therefore cannot provide an accurate result – but we can still hypothesize that there is an underlying reality, i.e. that the particle does indeed have both position and momentum. Note well that this posture is logically compatible with the notion that the assumed “underlying reality” will never be specifically known, i.e. there is no intent to evade the discovery that it is technically unknowable.

Modern positivism would prefer the ontological interpretation. It would say: no, the immeasurability is not an illusion underlain by definite facts – we can hypothesize that the indeterminacy is itself the ultimate reality, the truth of the matter. Note well that this posture is just as hypothetical as the preceding; it cannot claim to know what the “ultimate reality” is any more than the other view, since the common premise is precisely that the reality is technically inaccessible to humans. It is thus just as much a doctrinal stance, however prestigious those who take it are.

Granting the said impossibility of full measurement, it follows that – in this instance at least – each of the two interpretative theses is neither verifiable nor falsifiable. In this context, at least, their logical status is the same – they are equally speculative.

Both postures are admittedly hypothetical, but the former is clearly simpler, the latter philosophically more problematic. One of the principles of scientific method, in any context, is to prefer the simpler thesis unless we have good reasons to seek out a more complex one. That is, the simpler view is considered inductively more likely, because it is less prone to affect previously established knowledge.

We are not forced to rest content with the classical view; but we must have sufficient motive to abandon it in favor of the more complicated positivist view. The latter involves some very revolutionary suppositions about the nature of matter (namely, the possibility of natural spontaneity), which we cannot favor just for the hell of it, merely for the pleasure of challenging the existing order of things. We must first show up some distinctive weakness in the older view or some novel strength in the newer view, to justify such a radical overhaul of all past acquisitions and explanations.

The positivists argue that since we cannot determine these facts precisely, we might as well – for all practical purposes – regard them as non-existent. But the result is not quite the same, because we should consider not only the consequences of such a posture for their particular field of study, but also its consequences for knowledge as a whole. That is, it is not an innocuous stance – it has wide-ranging ontological and epistemological significance, seemingly putting some important fundamental assumptions of reason (viz. that all natural events are caused) in doubt.

Furthermore, there is no justification for forbidding further discussion of the issue henceforth. The positivists make an argument by intimidation, saying effectively “those who disagree with us are not worthy of intellectual consideration”[6]. But surely, the positivists must still remain open-minded – for they may indeed one day be proved wrong, if it should happen that we are able to dig deeper into matter, and eventually find some way to experimentally measure what the uncertainty principle says we cannot.

We cannot empirically prove a “cannot” – a “cannot” is a generalization from experience (though, in some cases, it is a logical insight, as in the preceding sentence). The uncertainty principle is not a purely empirical fact, plucked out directly from experience; it emerges within a certain theoretical context, which shapes our interpretation of events. This context, like many others throughout the history of science, may yet change, as our knowledge grows. There is no final and incontrovertible scientific theory.

Note well that I am not personally defending one or the other posture here[7], but comparing them from a neutral perspective, giving both fair consideration. That is, I am evaluating their discourse as a logician, using a discourse that is pure logic.

15. Epistemic Ethics

Logic is not only about forms of reasoning, but also about intellectual style. It is first and foremost a teaching of epistemic ethics: the attitudes the intellect must adopt to arrive at truth. These include suppression of one’s ego, open-mindedness and truth-orientation, among many others.

Genuine philosophers earnestly search for truth. They have sincere questions and try to answer them honestly. They admit areas of doubt or ignorance. They are open to change, and evolve over time.

Fake philosophers play the role of being philosophers, but are really not philosophers. They have little interest in the substance of issues, but seek to dazzle an audience with their superficial erudition and their style. They sow famous names around in the hope of reaping reflected glory. They follow intellectual fashions in pursuit of wide approval ratings, being pious or subversive as befits the current market of ideas. To gain attention and fame, they may be scrupulously conventional or say shocking things.

They say things they do not personally fully understand; they claim to have knowledge they in fact lack. They are apologists for received doctrines, rather than researchers; and when they seem to propose some new doctrine, it is only by arbitrary opposition to established ideas so as to appear original.

For many people, philosophy is an instrument of social climbing or power over others, rather than a search for truth. Such people may convince many others of this or that absurd or silly doctrine, using the prestige of their position in the education system or in the media, or in some other social role. But in fact, they have only muddled their victims’ minds and incapacitated them.

When philosophizing, it is wise to remain low-key and matter-of-fact, avoiding grandstanding and personal emotional outbursts as much as possible. This is an issue of style, not substance. But if one does not exercise sufficient restraint in such discourse, it is very easy to get lost in misleading hyperboles. The wrong choice of language can end up determining our doctrines, causing us to approximate and exaggerate.

Here, I have in mind the likes of Nietzsche or Kierkegaard (and many others), who pervasively intertwine their emotional responses with their philosophical realizations. They make a big thing of their personal reactions – writing in a narcissistic manner. Thus, in the face of his insight that man is alone in the universe, without apparent supports – Nietzsche indulges in theatrical outbursts, dramatizing his utter shock, role-playing a heroic response. This is all bombast, designed to give his ego a sense of self-importance; it is a kind of mental equivalent of masturbation. Kierkegaard – “same-same, but different”: an equally emotional approach, though a self-pitying one and one with more sincerity.

Such personal reactions were, of course, characteristic of the times and places those philosophers lived in. Their styles seem so “un-modern” – few would indulge in such tonalities today. We are perhaps less flamboyant – but also more careful to avoid confusion between judgments of fact (true–false) and judgments of value (good–bad). Philosophers are human, and may of course be passionate to some extent, and express their personal valuations; but this should not be the centerpiece of their discourse.

16. Phenomenology

‘Phenomenology’ refers to the consideration of experience, in its largest sense, before distinctions are made between ‘real’ experiences and ‘illusory’ ones. The term was coined by Johann Heinrich Lambert (German, 1728-1777) in his New Organon (1764), with this application in mind.

The title of the 1807 work of Georg W. F. Hegel (German, 1770-1831), Phenomenology of the Spirit, would be a misnomer, if we regarded the term as limited to sensory experiences and their mental equivalents, to the exclusion of intuitions. For the spirit (or self or soul) has no perceptible phenomenal qualities, but is self-intuited.

Although the term ‘phenomenon’ nowadays is usually taken (and I so take it) to refer to experiences with features like sights, sounds, etc., whether sensed or fancied, its original meaning in Greek and then Latin is ‘appearance’, a broader term in which we may well include intuited experiences (as Hegel did, and I do too).

Thus, ‘phenomenology’ should be understood to refer to the study of appearances, and not only phenomenal appearances.

Phenomenology is a branch of philosophy designed to overcome the problem posed by ‘naïve realism’. The existence of this problem does not mean that there is no solution to it. Phenomenology neutralizes the issue, showing that not all realism is necessarily naïve, and allowing for philosophical theories favoring realism that are more subtle and capable of truth.

Concerning my barely mentioning Edmund Husserl (German, 1859-1938) in my work Phenomenology, I have this to say. I simply had no pretension of being a historian. My silence was certainly not intended to ignore or belittle this philosopher’s great work, whose scope, depth, intelligence and intellectual maturity are evident. I acknowledge strong influence from it (ideational and terminological). But I also have other influences (such as Indian philosophy), and my own contributions to make.

My intent in the said work was to summarize briefly, in a minimally intellectual manner accessible to the maximum number of people, the value and necessity of a phenomenological approach to knowledge, so as to underscore and bypass the common affliction of naïve realism. Husserl’s discourse, to my mind, perhaps because of its roots in German Idealism and its academic style, gives the impression that the phenomenal is a conceptual construct rather than a raw experience. I tried to avoid giving such a misleading impression, and to give readers a practical tool.

As for use of the term ‘phenomenology’ – it cannot be reserved to Husserl’s work, but may legitimately be applied to any study of the phenomenal per se (i.e. quite apart from its status as reality or illusion, i.e. before such ontological and epistemological status is debated and determined).

Aristotle spoke of a science of being qua being, which he called ‘first philosophy’, and his successors labeled ‘metaphysics’ (because of the editorial position of this book after that on physics), and which became known as ‘ontology’. The idea and name of such a study has remained of universal value, even though there have been over time many views as to its possibility, scope and content.

In time, Western philosophy realized the methodological difficulties of this proposed discipline. In particular, it was not easy to disentangle it from the theory of knowledge, or ‘epistemology’, and from logic. Conflicting schools kept arising; in each generation, in one guise or another, they competed: Idealists vs. Materialists, or Empiricists vs. Rationalists, and so forth.

The idea of a more fundamental field of research – viz. phenomenology – gradually arose in response to the realization of the underlying cause of the difficulties. In order to reconcile traditional philosophical tendencies, all of which evidently contained some truth, philosophy needed to reconsider the issues with renewed innocence, more clearly distinguishing between raw given data and processed information.

This new ‘first philosophy’, or science of appearance qua appearance (as we may also call it, imitating Aristotle), cannot be regarded as necessarily and forevermore frozen with the form and content Husserl first gave it. The term ‘phenomenology’ belongs to all philosophers, as an open and neutral term like the terms ‘philosophy’, ‘epistemology’, ‘ontology’, or ‘logic’. It is no longer the name of a school of thought (like Phenomenalism), but of a branch of philosophy.

17. Appearance, Reality and Illusion

Phenomenology results from a realization that the building blocks of knowledge are appearances. This realization is obtained through a dialectic, comprising thesis, antithesis and synthesis, as follows.

(a) At first, one naturally regards everything one comes across in experience or thought as ‘real’ (this is the ‘naïve realist’ stance).

(b) Then, faced with evident contradictions and gaps in one’s knowledge, one logically realizes that some things that seemed real at first must or at least may eventually be considered unreal – i.e. ‘illusory’ (this constitutes a cognitive crisis).

(c) Finally, one realizes that, whether something is real or illusory (and ultimately remains so or turns out to be the opposite), at least it can immediately (unconditionally and absolutely) be acknowledged as ‘apparent’ (this is the ‘phenomenological’ stance, which resolves the crisis).

Knowledge of reality can then be inductively built up from knowledge of appearances, thanks to the following principle (d): One may credibly assume that something which appears to be real is indeed real, until and unless it is proved illusory or at least put in doubt for some specific reason. This may be characterized as ‘subtle realism’, and proceeds from the realization that the mere fact of appearance is the source of all credibility.

Thus, phenomenology follows the natural flow of knowledge, which is to initially accept individual appearances as real, while remaining ready to reclassify them as illusory if they give rise to specific logical problems that can only be solved in that specific way. The concept of ‘appearance’ is therefore not strictly primary, but a transitional term for use in problematic cases. Since it refers to the common ground between ‘reality’ and ‘illusion’, it is deductively primary. But since the latter are in practice attained before it, it is inductively secondary.

The concepts appearance, reality and illusion are to begin with concerned with experiences; and only thereafter, by analogy, they are applied to abstractions, i.e. conceptual products of experience arrived at through rational considerations, such as comparison and contrast (i.e. affirmation or negation, and measurement).

The term ‘fact’ is usually intended to refer to purely experiential data, i.e. the raw material of knowledge, in which case the opposite term ‘fiction’ refers to other items of knowledge, i.e. those tainted by interpretative hypotheses. (But note that in practice of course we do not always abide by such strict definitions, and may use the terms more broadly or narrowly.)

The concepts of truth, falsehood and uncertainty correspond in scope to those of reality, illusion and appearance. The latter triad is applied to the contents of propositions, while the former concerns the propositions as such. For example, considering “dogs bark”, the fact of dogs barking is ‘a reality’, while the proposition that dogs bark is ‘true’; similarly in other cases.

Once we understand all such concepts as signifying different epistemological and ontological statuses, it becomes clear why they need to be distinguished from each other. They are all used as logical instruments – to clarify and order discourse, and avoid confusions and antinomies.

Note well that phenomenology is not a skeptical philosophy that denies reality to all appearances and claims them all to be illusions. Such a posture (which too many philosophers have stupidly fallen into) is logically self-contradictory, since it claims itself true while rejecting all possibility of truth. The concept of illusion has no meaning if that of reality is denied; some credulity is needed for incredulity. Doubt is always based on some apparent contradiction or gap in knowledge; i.e. it is itself also an item within knowledge.

18. Existence and Non-existence

What is the relation between the concepts of existence and non-existence (or being and non-being), and those just elucidated of appearance, reality and illusion, one might ask?

At first, the term existence may be compared to that of reality, or more broadly to that of appearance (to admit the fact that illusions occur, even if their status is not equal to that of realities). However, upon reflection, an important divergence occurs when factors like time and place are taken into consideration.

We need to be able to verbally express changes in experience over time, space and other circumstances. An appearance, be it real or illusory, ‘exists’ at the time and place of its appearance – but may ‘not exist’ at some earlier or later time, or in another place. The ‘existence’ of appearances is transient, local, conditional and relative.

What appears today may cease to appear tomorrow, although it might (or might not) continue to appear less manifestly, through someone’s memory of it or through the appearance of exclusive effects of it. Something may appear here within my field of vision, but be absent elsewhere. You may see this in some circumstances, and then notice its absence in others.

We thus need to distinguish different ways of appearance. With reference to time: in actuality, or through memory or anticipation; or with reference to spatial positioning. Or again, with regard to modality: in actuality, only through potentiality (i.e. in some circumstances other than those currently operative), or through necessity (i.e. in all circumstances).

Time and place also incite a distinction between ‘existence’ and ‘reality’ (or ‘truth’), in that when something ceases to exist at a given time and place, the reality of its having existed at the previous time and place is not affected.

Furthermore, appearances are apparent to someone, somewhere – they are contents of consciousness, objects of cognition. The concept of existence is differentiated also with reference to this, by conceiving that what may be apparent to one Subject, may not be so to another. Moreover, we wish to eventually acknowledge that something may conceivably exist even without being experienced by anyone (though of course, in defining such a category, we must admit for consistency’s sake that we are thereby at least vaguely and indirectly conceptually cognizing the object concerned).

We thus come to the realization that the concept of appearance is a relatively subjective one, involving two distinct factors: an object of some kind with specific manifestations, on the one hand, and an awareness by someone of that object at a given time and place, on the other. The concept of existence is intended to separate out the objective factor from the factor of consciousness implicit in the concept of appearance.

‘Existence’ is thus needed to objectify ‘appearance’, and allow us to conceive of the object apart from any subject’s consciousness of it. We need to be able to conceive of the objects appearing to us as sometimes ‘continuing on’ even when we cease to be aware of them. Furthermore, we need to be able to consider objects that we have not yet personally experienced, and even may never experience. In this manner, we can project our minds beyond mere appearance, and through conception and adduction hope to grasp existence in a larger sense.

The concept of existence and its negation are thus additional instruments of logic, facilitating rational discourse, without which we would not be able to mentally express many distinctions. Consequently, saying ‘existence exists’ and ‘non-existence does not exist’ is not mere tautology, but an acknowledgement that the words we use have certain useful intentions. These statements constitute one more way for us to express the laws of thought. Existence cannot be denied and non-existence cannot be affirmed.

We do not make the distinction between ‘existents’ and ‘non-existents’ by mentally lining up two kinds of things, like apples and things other than apples. The epistemological scenario applicable to most of our concepts is not applicable to such basic ones, which are of a more broadly pragmatic nature. Discernment rather than distinction is involved.

Whereas the concept ‘existence’ has some ultimate experiential content, ‘non-existence’ has none – because factual denial is not based on the same mental process as affirmation. We never experience non-existence – we only (in certain cases)fail toexperience existence. The concept of existence is not built up by contrast to that of non-existence, since (by definition) the former relates to ‘all things’ and the latter to ‘nothing’, and nothing is not some kind of something. There is no time, place or circumstance containing nothingness. The word ‘non-existence’ is just a dumping place for all the words and sentences that have been identified as meaningless or false.

Terms like ‘existence’ and ‘non-existence’ are not ordinary subjects, copulae or predicates; they are too broad and basic to be treated like any other terms. Those who construct a theory of knowledge, or an ontology, which concludes that ‘existence does not exist’ or that ‘non-existence exists’ have not understood the logic of adduction. When there is a conflict between theory and observed facts, it is the theory (or the ‘reasoning’ that led up to it) that is put in doubt and is to be dismissed, not the facts.

19. Philosophy and Religion

It is important to distinguish between religion (including philosophical discourse based on a particular religion, for apologetic or polemical purposes) and philosophy proper (which makes no direct appeal to premises from a religious tradition, though it may discuss religious issues).

This is a derivative of the distinction between faith and reason, keeping in mind that faith may be reasonable (i.e. without conclusive proof or disproof) or unreasonable (i.e. in spite of conclusive disproof). Note that reasonable faith is necessarily before the fact – for, if some fact is already indubitably established, there is no need of faith in it. Unreasonable faith is contrary to fact.

Some philosophers regard faith in pure speculations, those that are in principle neither provable nor disprovable (e.g. faith in the existence of God or in strict karma), as unreasonable. But I would class the latter as within reason, for it is always – however remotely – conceivable that some proof or disproof might eventually be found, i.e. the ‘principle’ itself is hard to establish with finality. Moreover, the category of pure speculation is even applicable to some scientific theories (for example, Bohr’s interpretation of quantum uncertainty as indeterminacy).

Religion is based on faith, i.e. on the acceptance of theses with insufficient inductive and deductive reasons, or without any reason, or even against reason (i.e. despite serious divergence from scientific conclusions based on common experience and logic) – on the basis of statements by some assumed spiritual authority, or even merely because one feels so emotionally inclined.

Philosophy, on the other hand, is based on personal understanding, on purely empirical and logical considerations; although some or many of its theses might well to some extent be hypothetical, or even speculative, they remain circumscribed by scientific attitudes and theories – that is, a sincere effort is made to integrate them with the whole body of experience and reason.

The difference between religion and philosophy is not always clear-cut, note well. Religion is not throughout contrary to reason, and philosophy is not always free of mere speculation. The difference is whether the credulity, or degree of belief, in speculative propositions is proportional or not to the extent of available adductive evidence and proof. In the case of mere faith, the reliance on a given proposition is disproportionate to its scientific weight; whereas in the case of rational conviction, there is an effort to keep in mind the scientific weight of what is hypothesized – one is ready to admit that “maybe” things are not as one thinks.

The two also differ in content or purpose. Religions are attempts to confront the problems of human finitude and suffering, through essentially supernatural explanations and solutions. The aim of religion is a grand one, that of individual and collective redemption. Philosophies resort to natural explanations and expedients, attempting to understand how human knowledge is obtained and how it is to be validated, and thus (together with the special sciences) gradually identify ways and means for human improvement. There is still an underlying valuation involved in the philosophical pursuit, note well; but the aim is more modest.

To make such a distinction does not (and should not) indicate an antireligious bias. It is not intended as a ‘secularist’ ideology, but merely as a secular one. Religion (or at least those parts of particular religions that are not decisively anti-empirical or anti-rational) remains a legitimate and respectable human activity – it is just recognized as being a different intellectual domain, something to be distinguished from philosophy so as to maintain a balanced perspective in one’s knowledge.

The reason this division was produced historically by philosophers was to protect philosophy (and more broadly, the special sciences) from being reduced to a supporting role, as the “handmaiden” of religion. It was necessary to make philosophy independent of religion to enable philosophers to engage in critical judgment, if need arose, without having to force themselves to be “religiously correct” or risk the ire of politically powerful religious authorities.

The secularization of philosophy was precisely this: a revolt against foregone conclusions imposed by religious authorities (i.e. people collectively self-proclaimed as sole torch-bearers of truth) as undeniable ‘fact’. It is important to understand the logical rationale behind such a revolt, i.e. why it is epistemologically valid and necessary.

Anyone can stand up and claim to have been graced by some Divine revelation/salvation (or holy spirit) or to have attained some Buddhist or Hindu enlightenment/liberation.

Many people throughout history have made such metaphysical claims. Some have gone so far as to claim to be a god or even G-d. Some have not made explicit claims for themselves, but have had such claims made on their behalf by others. Some of the claimants – notably Moses, Jesus, Mohammed, and Buddha – have founded world religions, which have greatly affected the lives of millions of people and changed the course of history. Other claimants – like your local shaman, Egypt’s Pharaoh, or Reverend Moon – have been less influential.

The common denominator of all these claims is some extraordinary mystical experience, such as a prophetic vision or a breakthrough to ‘nirvana’ or ‘moksha’ (enlightenment/liberation). The one making a claim (or claimed for by others) has a special experience not readily available to common mortals, on the basis of which he (or she) becomes a religious authority, whose allegations as to what is true or untrue are to be accepted on faith by people who have not personally had any commensurable experience.

The founding impetus is always some esoteric experience, on the basis of which exoteric philosophy and science are shunted aside somewhat, if not thoroughly overturned. The founding master’s mantle of authority is thereafter passed on to disciples who do not necessarily claim an equal status for themselves, but who are pledged to loyally study and teach the founder’s original discoveries.

Religion is essentially elitist, even in cases where its core experience (of revelation or enlightenment) is considered as in principle ultimately open to all, if only because of the extreme difficulty of reaching this experience.

In some cases, the disciples can hope to duplicate the master’s achievement given sufficient effort and perseverance. In other cases, the master’s disciples cannot hope to ever reach their teacher’s level. But in either case, they are the guardians of the faith concerned, and thence (to varying degrees) acquire institutional ‘authority’ on this basis, over and above the remaining faithful.

Thus, we have essentially two categories of people, in this context.

a) Those who have had (or claim to have had) the religious experience concerned first-hand.

b) Those who, second-hand, rely on the claim of the preceding on the basis of faith, whether or not they have the institutional status of authorities.

Now, this distinction is not intended to be a put-down, a devaluation of either category of person. But it is a necessary distinction, if we are to understand the difference in epistemological perspective in each case.

From the point of view of a first-hand recipient, i.e. someone who has personally had the mystical experience concerned, his discourse is (for his own consumption, at least) pure philosophy, not religion. He is presumably not required to have faith, for all the information and reasoning involved is presented to him on a platter. His task is simple enough; his responsibility is nil, his certainty total.

But a second-hand recipient has a difficult task, epistemologically. He has to decide for himself whether the first-hand teacher is making a true or false claim. He has to decide whether to have faith in him or not. He is required to accept an ad hominem argument.

This objection is not a judgment as to the master’s veracity. Some alleged masters are surely charlatans, who lie to others so as to rule and/or exploit them; some of these remain cynically conscious of their own dishonesty, while some kid themselves as well as others. But it may well be that some alleged masters are not only sincere, but have indeed had the experience claimed and have correctly interpreted it.

But who can tell? Certainly not the ordinary Joe, who (by definition) has never had the experience concerned, and in most cases can never hope to duplicate it – and so is not qualified to judge. Yet, he is called upon to take it on faith – sometimes under the threat of eternal damnation or continuing samsara if he does not comply.

How is the common man to know for sure whether some person (a contemporary – or, more probably, someone in a distant past, who may even be a mere legend) has or has not had a certain mystical experience? It is an impossible task, since such experience is intrinsically private!

To date, we have no scientific means to penetrate other people’s consciousness. And even if we could, we would still need to evaluate the significance of the experience concerned. Such judgments could never be absolute and devoid of doubt, but necessarily inductive and open to debate. Thus, the ‘certainty’ required by faith could not be rationally constructed.

It is no use appealing to witnesses. Sometimes two or more people confirm each other’s claim or some third party’s. Moreover, alleged authorities often disagree, and reject others’ claims. But who will confirm for us innocent bystanders that any of these people are qualified to authenticate or disqualify anyone?

Thus, faith is a leap into the unknown. However, it is often a necessary leap, for philosophy and science are not able to answer all questions (notably, moral questions) convincingly, and we in some cases all need to make decisions urgently. So, religion has to be recognized by philosophy as a legitimate, albeit very private, choice. In this context, note well, secularism is also a religion – an act of faith that there is no truth in any (other) religious faith.

Note: Buddhism is today often painted as “a philosophy rather than a religion”, implying that it does not rely on faith. But this is a patently unfair description: there are plenty of faith loci within Buddhism. Belief in the wheel of reincarnation (samsara), belief in the possibility of leaving it (nirvana), belief that at least one man attained this Buddha state (Siddhartha Gautama), belief in the specific means he proposed (moral and meditative disciplines, notably non-attachment), belief in a multitude of related stories and texts – all these are acts of faith.

These beliefs require just as much faith as belief in the existence of God, and other more specific beliefs (starting with belief in the Torah, or Christian New Testament, or Koran), within the monotheistic religions. The adherent to Buddhism must take on faith the validity of his spiritual goal and pathway,beforehe becomes a Buddha (assuming he ever does). The end and means are not something philosophically evident,tillhe reaches the end through the means. This is the same situation as in the monotheistic religions.

So, Buddhism is not primarily a philosophy, but a religion – and to say otherwise is misleading advertising. The same is true of Hinduism, which shares many doctrines with Buddhism (as well as having some monotheistic tendencies, although these are not exclusive).

It is important to remain both open-minded, granting some of the claims of religions as conceivable, and cool-headed, keeping in mind that some of them are unproved. Intolerance of religion is not a proper philosophical stance, but a prejudice, a dogma. The true philosopher, however, remains sober, and does not allow himself to get carried away by emotional preferences.

Transcendental claims can, nevertheless, be judged and classed to some extent. Sorting them out is, we might say, the realm of theology (a branch of philosophy).

Some claims are, as already pointed out, directly contrary to experience and/or reason; if some harmonization cannot be construed, philosophy must exclude such claims. Some are logically conceivable, but remotely so; these are to be kept on the back burner. And lastly, some are very possible in our present context of knowledge; these can be used as inspirations and motivations for secular research.

Generally speaking, it is easier to eliminate false claims than to definitely prove true claims.

Each specific claim should be considered and evaluated separately. It is not logical to reject a doctrine wholesale, having found fault with only some aspects of it (unless these be essentials, without which nothing else stands). In such research, it is well to keep in mind the difference between a non sequitur and a disproof: disproving premises does not necessarily mean their conclusions are false, for they might be deducible from other premises.

In choosing among religions, we usually refer to the moral recommendations and behavior patterns of their founder and disciples (as well as more sociologically, of course, to traditions handed down in our own family or society) as indices. If the advice given is practiced by those preaching, that is already a plus. If the advice and practice are wise, pure, virtuous, kindly, and loving, etc. – we instinctively have more confidence. Otherwise, if we spot hypocrisy or destructiveness, we are repelled. (Of course, all such evidence is inconclusive: it suggests, but does not prove.)

But, however persuaded we personally might be by a religious teaching, its discourse cannot be dogmatically taken as the starting premise of philosophy. To a first-hand mystic, it may well be; but to the rest of us, it cannot be. Philosophy is another mode of human inquiry, with other goals and means. Spirituality and rationality are neither necessarily bound together, nor necessarily mutually exclusive. They might be mixed somewhat, but never totally confused.

Thus, if someone claims some mystical experience, or refers to authoritative texts based on some such foundation, his philosophizing might well be considered attentively and learned from to some degree, but it is ultimately irrelevant to pure philosophy; or, more precisely, such discourse can become partly or wholly relevant only insofar as it submits to the secular standards of public philosophy.

The latter can only refer to experiences and insights that can readily be duplicated, i.e. that are within everyone’s reach (except a minority with damaged organs), if they but consider certain empirical data and follow a set of inductive and deductive arguments. It aims at developing, using ordinary language, a potentially universal worldview and understanding.

Admittedly, as some would argue, high-level philosophy (as with advanced mathematics or physics) is in practice not comprehensible to most laymen! Just as meditation or other religious techniques are not easily mastered, it takes a lot of effort and intelligence to learn and apply logic in depth. Moreover, the novice who enters the path of philosophy is as hopeful (full of faith in eventual results) as the religious initiate; and all along both disciplines, small successes encourage him to keep going.

So, one might well ask the embarrassing question: what is the difference between the elitism of philosophy and that of religion? Ultimately, perhaps none, or just a difference of degree! This answer would be true at least of reasonable religion. But in the case of unreasonable religion, we ought not to allow ourselves to believe in it – even as a remote possibility – until, if ever, it becomes manifestly reasonable, i.e. until and unless our basic view of reality is indeed overturned by actual personal experiences.

It is unwise to excessively compartmentalize one’s mind and life; at the extreme, one may risk some sort of schizophrenia. One should rather always try to keep one’s rationality and spirituality largely harmonious. Faith in religious ideas need not be an ‘all or nothing’ proposition; one can pick and choose under the guidance of reason. Reason is not in principle opposed to faith; it allows for its essentials.

The challenge for today’s philosophers of religion, who wish to bring God and/or other religious ideas back into the modern mind, is to fully acknowledge and accept the current conclusions of modern science. It is no use trying to tell an educated contemporary that scientific claims – regarding the age and size of the universe, the evolution of matter, the age and history of our planet, the evolution of vegetable and animal life on it, the emergence of the human species – are all wrong! Such discourse is irrelevant to the modern mind, if not absurd.

There is still room, side by side with the worldview of science, for religious ideas – but these must inductively adapt to survive. This is always possible by exploiting (within reason) loopholes in the current scientific narrative, whatever it happens to be at any given time. Instead of emphasizing conflicts, thinkers should seek out the conceptual possibilities for harmonization. Real scientists remain open-minded wherever there are lacunae.

Creationism need not be a fixed dogma. Rather than insist that the world was created in six days some 6,000 years ago, say that God is the creator of the initial matter-energy of the universe, and of the laws of nature and evolution inherent in it, and that He triggered the ‘big bang’ 13.7 billion years ago.

Moreover, in physics, suggest that the indeterminacy apparent in quantum mechanics is perhaps really the opportunity God uses to daily impinge on details of the world process. Or again, in biology, propose that the first conversions of mineral into living and then animate matter (wherever and whenever they occurred) were maybe due to God’s intervention; and rather than combat Darwinism, accept it as part of God’s plan and hypothesize that the apparently spontaneous occasional mutations of genes might well be miracles.



[1] It is my wish to analyze that whole book in detail someday, so as to show up the cunning and variety of his tricks.

[2] See Judaic Logic, chapter 4.2.

[3] I noted this example in the course of a lecture long ago, so I cannot guarantee my present rendition is entirely accurate. But no matter, I only include it here for purposes of illustration.

[4] As I have pointed out elsewhere, such doctrines are unfair to innocent victims, accusing them without justification of past crimes; and they whitewash criminals, making it seem like they merely implement justice!

[5] Although, of course, modern science has been using experiment more consciously, systematically and successfully than ever before.

[6] This is also an argument by authority. To which one can answer: one may be a great physicist and a not-so-great philosopher; merit in one field does not guarantee success in all others. Such attitudes are reminiscent of religious authoritarianism.

[7] My neutrality should be evident from the open-minded position I have taken with respect to the idea of natural spontaneity in The Logic of Causation (see for example chapter 10.1 there).
