Saturday, September 26, 2015

Chesterton and Kierkegaard on the Difference of Christ

What difference does Christ make?

This question has many answers in many different contexts. Two of my favorite writers, G.K. Chesterton and Soren Kierkegaard, focus on the difference Christ makes in terms of human possibility.

Man is different from other animals insofar as he lives self-reflected in a world. Beavers and dogs don't worry about how they relate to the world; they just exist as they are unselfconsciously in the world. They are the world. But man knows himself as who he is in relation to the world. Kierkegaard describes this difference in The Sickness Unto Death in terms of the self as "a relation which relates itself to itself." The fact that man by nature relates himself to the world means his existence, unlike that of non-rational animals, is a dialectic of possibility and necessity. I understand who I am (or think I understand), and I also understand the world and my place in it, and in terms of that relationship life presents a present reality of necessity and a horizon of possibility. I exist as a relationship to the world, but I can know that relationship and (perhaps) change it - I can relate myself to the relationship which constitutes my self in the world.

But I can do that only in terms of the possibilities available to me, and those are constituted by my philosophy. What sort of possibilities are available to the natural but pre-Christian man, that is, the pagan man? Chesterton in Orthodoxy describes the pagan world as a world of pink. The great pagan virtue is moderation; a little of everything but not too much of anything. Red and white mixed together, not too much of each. This is a natural and sensible policy, and in the pagan world it produced great men like Aristotle and Marcus Aurelius. The ideal gentleman is a little bit of a warrior and a bit of a scholar as well. He drinks wine but not too much; he loves others but not too much of that either. For love is a form of madness and madness is unbalanced. Above all he maintains self-control, for he knows that the world contains good things as well as evil things, and that it ends in death. He keeps these facts before him and holds himself well so that he is neither carried away by good fortune, nor destroyed by misfortune, for life inevitably involves both. There is no better wisdom in a world without Christ, especially in a world that cannot imagine Christ. The life of balanced moderation is the best life that the best pagan mind could imagine; it defines the horizon of pagan possibility.

What has changed with Christ? The Gospel of John tells us that His first miracle occurred at Cana, and involved the replenishment of wine at a wedding feast that had run dry. We can assume that the host of the feast had on hand an appropriate amount of wine for the celebrations. It would seem, then, that any additional wine would violate the principle of moderation; we've gone from having a sensible good time to getting drunk in excess. But this is why it is a miracle, for a miracle is more than merely the suspension of ordinary physical expectations; it is a sign and revelation of a new order of existence, an order that breaks through the old pagan compromises and proposes a way of life that answers to the transcendent meaning of Christ. The exhaustion of the wine at Cana symbolizes the exhaustion of pagan virtue and the existential hopes it offered. The party is over; it is expected to be over and the celebrants are prepared to go home; no one can imagine the party continuing, or at least continuing with any propriety. But Christ can imagine it, and through His grace he turns water into wine, that the party may continue, theoretically indefinitely. From that moment forward the horizon of pagan hope has been forever shattered, for the possibility that it is not the final limit, that there is a way of life that is not bound by pagan compromises, has been permanently introduced into the human imagination.

Chesterton describes the difference as a world of pink becoming a world of bold reds and whites; reds for the warriors and whites for the monks. There were warriors in the ancient world, of course, and pacifists as well. But the pure warrior, like the pure pacifist, could not express an ideal human type because he violated the principle of moderation or balance. More significantly, the warrior and the pacifist had nothing to do with each other. Each might despise the other and, if they didn't, by the nature of things they at least expressed different philosophies of life. But in Christendom the martial Knight was as much an expression of the authentic Christian life as was the peaceful Monk. Far from expressing opposite philosophies of life, they both expressed different ways of performing the same mission: Redeeming the world in the name of Christ. Chesterton states the difference this way: In the ancient world the balance of existential possibilities was expressed in the single individual of the moderate, virtuous gentleman. In Christendom, the balance of possibilities occurred in the Church as a whole rather than individuals:
This was the big fact about Christian ethics; the discovery of the new balance. Paganism had been like a pillar of marble, upright because proportioned with symmetry. Christianity was like a huge and ragged and romantic rock, which, though it sways on its pedestal at a touch, yet, because its exaggerated excrescences exactly balance each other, is enthroned there for a thousand years. In a Gothic cathedral the columns were all different, but they were all necessary. Every support seemed an accidental and fantastic support; every buttress was a flying buttress. So in Christendom apparent accidents balanced. Becket wore a hair shirt under his gold and crimson, and there is much to be said for the combination; for Becket got the benefit of the hair shirt while the people in the street got the benefit of the crimson and gold. It is at least better than the manner of the modern millionaire, who has the black and the drab outwardly for others, and the gold next his heart. But the balance was not always in one man's body as in Becket's; the balance was often distributed over the whole body of Christendom. Because a man prayed and fasted on the Northern snows, flowers could be flung at his festival in the Southern cities; and because fanatics drank water on the sands of Syria, men could still drink cider in the orchards of England. This is what makes Christendom at once so much more perplexing and so much more interesting than the Pagan empire; just as Amiens Cathedral is not better but more interesting than the Parthenon. - Orthodoxy, Ch. 6
For both Chesterton and SK, the advent of Christ permanently changed the nature of existence and of the world - and that whether you believe in Christ or not. The key point they share in this regard is that Christ revealed possibilities that were unimagined prior to the Incarnation. After the Incarnation, those possibilities cannot be eradicated from the human spirit, even if Christ Himself is later denied. The price of denying Christ cannot be a simple return to the pre-Christian world, for the possibilities He revealed will remain in the human imagination - it is only their fulfillment that will become impossible, since that fulfillment is only possible with the grace of God. The result is that post-Christian life can never be a simple return to paganism; it will instead be one of melancholy and despair.

Saturday, September 5, 2015

The Linda Problem and the Conjunction Fallacy

Over at his Neurologica blog, Dr. Steven Novella has an interesting post concerning probability and the "conjunction fallacy". The conjunction fallacy arises from not realizing that the conjunction of two propositions can never be more likely than either proposition taken separately, i.e. "A and B is true" can't be more likely to be true than "A is true." I was hoping to comment on his blog about it, but WordPress won't let me register, giving me "internal server error" messages every time I try. So I'll just post my commentary here.

The specific case taken in Novella's blog post involves a study that posed the following problem:

Participants are given information about a hypothetical woman named Linda:
  • (E) Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.
After reading the description of the target E, participants were asked to estimate the probability of a number of statements about E, including the following three:
  • (T) Linda is a bank teller.
  • (F) Linda is active in the feminist movement.
  • (T ∧ F) Linda is a bank teller and is active in the feminist movement.
The interesting result from the study is that many participants rate the last statement as more probable than the first. The scientists immediately conclude (and Novella goes along with them) that the participants must be guilty of the conjunction fallacy: T ∧ F can't be more probable than T. If Linda is a bank teller and is a feminist, then it is certainly true that she is a bank teller.
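The conjunction rule itself is easy to check numerically. Here is a minimal Python sketch over a made-up toy population; the 5% and 30% base rates are invented purely for illustration and have nothing to do with the actual study:

```python
import random

random.seed(0)

# Toy population: each person is a (bank_teller, feminist) pair of traits.
# The 5% and 30% base rates are invented purely for illustration.
population = [(random.random() < 0.05, random.random() < 0.30)
              for _ in range(100_000)]

n = len(population)
p_t = sum(1 for t, f in population if t) / n              # frequency of T
p_t_and_f = sum(1 for t, f in population if t and f) / n  # frequency of T and F

# Anyone who is both a teller and a feminist is, in particular, a teller,
# so the conjunction's frequency can never exceed T's frequency alone.
assert p_t_and_f <= p_t
print(f"P(T) = {p_t:.4f}, P(T and F) = {p_t_and_f:.4f}")
```

Whatever base rates you plug in, the assertion holds: the conjoined population is a subset of the single-property population, so its frequency can only be equal or smaller.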

It's unfortunate that the scientists did not follow up with interviews investigating the thought processes of the participants (at least I couldn't find where they had). They just immediately conclude that the participants are guilty of the conjunction fallacy and blame it on a reliance on "intuition". But I suspect that if the participants were asked straightforwardly about the conjunction fallacy - "Is both A and B being true more probable than A by itself being true?" - most everyone would come up with the right answer. So there is likely more going on here than a simple failure to understand the conjunction fallacy.

What's going on, I think, is that the Linda Problem is actually a poorly formed question in probability. Likelihoods have meaning only in the context of an implied probabilistic experiment. We understand what "there is a 50% chance it will rain tomorrow" means because we supply for ourselves the implied probabilistic context: "Given days with meteorological conditions like today, half the time the next day is rainy and half the time it is not." The question about tomorrow's weather is really a continuation of the experiment and we estimate the probability based on prior outcomes.

But it's the case that once a probabilistic experiment has occurred, the probability of the outcome for that experiment goes to 1 and all other outcomes go to 0. Once the dice are rolled and come up 7, the likelihood that the outcome of that experiment was 7 is 1 and that it was anything else is 0.

Now consider the statement T, "Linda is a bank teller." There is no probabilistic context here. Linda is what she is and is nothing else, like a dice roll that has already happened. So the probability that Linda is a bank teller is either 1 or 0 depending on whether she actually is a bank teller. Same with her being a feminist, and same with the conjunction of her being both a bank teller and a feminist. They are all either 1 or 0.

Of course, if Linda is not a bank teller, then the probability of T is 0 and T ∧ F is 0. But if she is a bank teller but not a feminist, then the probability of T is 1, F is 0, and T ∧ F is 0. So in that sense it is strictly true that the third statement can never be more probable than the first.

But this is a degenerate use of "likelihood." Likelihood adds nothing to the analysis, which is strictly logical. And in fact the participants are not "failed" for estimating probabilities incorrectly, but for the alleged failure to perceive the logical necessity of the conditional "If T ∧ F then T". So there is a bit of a bait and switch going on.

People generally approach test questions in good faith. They assume the questions are well-formed, and when they aren't, they provide their own context in an attempt to interpret the question as well-formed. In this case, being explicitly told that the question is probabilistic - and intuiting that any probabilistic question requires a probabilistic context within which the concept of likelihood makes sense - they supply the probabilistic context that is not provided by the question. Really they should say that the problem is degenerate and the likelihood of each statement is either 1 or 0, but we can't say which.

To come up with any other numbers requires a probabilistic background against which to generate a non-degenerate likelihood. This is where the much maligned "intuition" comes in. Forced to generate their own background, the participants likely tell themselves something like "If I were trying to find Linda, would I be more likely to find her starting with the general population of bank tellers, or focusing on the population that is both bank tellers and feminists?" And they reasonably conclude the latter is the better choice. Formally, what they are doing is saying "Given a random draw from the population of bank tellers, or from the population that is both bank tellers and feminists, I'm more likely to come up with Linda making a draw from the latter." And they are right about that.
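That reading can be made precise with a sketch. All the headcounts below are hypothetical, chosen only to illustrate the point: even though the conjoined population is itself less probable, a single random draw from it is far more likely to turn up a Linda-like person.

```python
# All numbers are hypothetical, chosen only to illustrate the point.
n_tellers = 10_000        # population of bank tellers
n_feminist_tellers = 500  # bank tellers who are also feminists (a subset)
n_linda_like = 50         # people fitting Linda's description; suppose
                          # every one of them is a feminist bank teller

# Chance that one random draw from each population is Linda-like:
p_hit_tellers = n_linda_like / n_tellers                    # 50/10000 = 0.005
p_hit_feminist_tellers = n_linda_like / n_feminist_tellers  # 50/500   = 0.1

# The draw from the smaller, conjoined population is 20x more likely to
# find Linda, even though that population's own frequency is far lower.
assert p_hit_feminist_tellers > p_hit_tellers
```

This is the question the participants plausibly answered: not "which statement is more probable?" but "which population is the better place to look for Linda?" - and for that question their answer is correct.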

An intelligent participant would be understandably irritated if told later that he got the question wrong because he doesn't understand that if Linda is a bank teller and a feminist, then she is a bank teller. What the experimenters did was bait (or force) the participants into treating the question as a probabilistic one (requiring a probabilistic context), then graded them as though they had asked them a logical question about conjunction.

Thursday, September 3, 2015

The Cult of Suffering and Assisted Suicide

Andrew Stuttaford at the Secular Right has a post on what he calls the Cult of Suffering and assisted suicide.

I was struck by Stuttaford's objection to a certain Sister Constance Veit:
That last paragraph is, I have to say, disgusting. Sister Veit's argument that those wrestling with the later stages of a cruel disease are on a "mission" on behalf of the rest of us, a mission that they never asked to be on, is an expression of fanaticism, terrifying in its absence of empathy for her fellow man.
The "a mission that they never asked to be on" reminds me of Chesterton's discussion of this point in the chapter "The Flag of the World" in Orthodoxy:
 A man belongs to this world before he begins to ask if it is nice to belong to it. He has fought for the flag, and often won heroic victories for the flag, long before he has ever enlisted. To put shortly what seems the essential matter, he has a loyalty long before he has any admiration.
In other words, we are born on a mission, and have accepted that mission, long before we ever have the chance to "ask" whether we want to be on it. GKC calls this the "primary loyalty" to life and, like all primary principles, it can be difficult to defend because it is generally what one argues from rather than what one argues to. Historically this primary loyalty was taken for granted as obvious and commonsensical, like patriotism and loyalty to one's country - in this case, "cosmic patriotism."

Life begins in suffering - birth is a traumatic experience - and involves suffering of some sort until death. Until very recently, regular and persistent pain was a fact of life. Imagine having a toothache before novocaine or a kidney stone before modern surgery. My grandfather's generation would pull their own teeth with a pair of pliers. And I remember reading about an instrument people once inserted in themselves all the way up to their kidneys in order to crush kidney stones so they could later be passed in excruciating pain.

And yet, historically, persistent suffering of a physical variety was not what generally drove people to suicide. The reasons were typically emotional - Romeo and Juliet, or stockbrokers jumping off buildings after the 1929 crash - or matters of honor: Roman (or, more recently, Japanese) generals doing themselves in after a defeat, or pederasts caught in the act (King George V: "Good grief! I thought chaps like that shot themselves.") If persistent suffering were something that could only be answered with death, everyone would have killed himself 200 years ago. So much for the human race.

The problem with suffering is that it is a fact of life that doesn't go away whatever your philosophy. (Well, that is not quite true: Death makes it go away.) Mr. Stuttaford speaks of "empathy for your fellow man" but I wonder what his "empathy" actually means in practice. The Little Sisters of the Poor minister to the dying who are beyond hope of recovery. Whatever Stuttaford thinks of their empathy, they at least make sure the dying do not die alone or friendless. And they offer them the hope that their suffering is not meaningless. Does Stuttaford spend any time with the dying, or does his "empathy" extend only so far as the abstract position that they should be offered a lethal syringe? I find such "empathy" far more horrifying than anything Sister Veit says - and in fact is not empathy at all but merely an embrace of the Cult of Death. To that I prefer the Cult of Suffering.

Saturday, August 29, 2015

The Quotable Chesterton

"A great classic means a man whom one can praise without having read." - G.K. Chesterton

It's virtually a cliche to point out that Chesterton is among the most quotable of authors. But it's easy to misunderstand the Chesterton quote taken out of context. For instance, take the quote above, from his essay "Tom Jones and Morality" in All Things Considered. Our first reaction to it may be to think that GKC is being ironic and taking a swipe at people who talk up a classic without having read it. But in context it is clear that GKC means no such thing and intends just what he says.

Chesterton's point is ultimately conservative in the best sense of the word. A great classic becomes so based on the developed opinion of mankind over many decades or centuries. We can praise a classic without having read it based on trust in that common, longstanding opinion. I can, Chesterton says, talk of "great poets" like Pindar without ever having read Pindar because "a man has got as much right to employ in his speech the established and traditional facts of human history as he has to employ any other piece of common human information." And the status of great classics is one of those "established and traditional facts."

While GKC defends the right of men to praise a classic without having read it, he disputes a right to condemn a classic without having read it. The reason should be obvious. Praising a classic is submitting to the historically developed consensus concerning a work; condemning one is contradicting that tradition and, so, going it on your own. If you are going to contradict the received opinion, you've got to have some reasons for doing so, and it is hard to see how you could have good ones without having read the work in question.

GKC never wrote pithy quotes for the sake of being quoted. His wit is always a spur to more considered reflection - a reason for us to be careful of a GKC quote absent context.

Friday, August 21, 2015

Science Discovers Socrates

When I first began to seriously read philosophy, and by that I mean reading Plato, Aristotle and Aquinas directly and not through summaries or interpretations of them, perhaps the most thrilling discovery I made was the extent to which they anticipated just about every important philosophical position that might be taken. What I had thought were modern views that the ancients were too ignorant or naive to conceive had in fact been explored by them, and were often treated more intelligently than they were by their supposed modern betters. Nowhere is this more true than in Plato.

For instance, the objection that philosophy is just a verbal game that never really proves anything seems like a modern objection based on a review of the long history of philosophy. But we find that this is actually an ancient objection, and in fact was at the heart of the charges against Socrates at his trial. Socrates, it was claimed, just played verbal games, making the weaker argument appear the stronger and misleading his young followers. Or take the objection that there is no objective morality, and that "right" and "wrong" are in fact defined by whoever is the strongest and able to impose his views. We like to think that it was the naive ancients who believed in things like ghosts and objective morality, whereas we moderns, wiser through science and cultural experience, no longer fall for such things. But the idea that "right" and "wrong" have no objective foundation is a very ancient opinion and is the subject of the Platonic dialog Gorgias, in which Socrates has a spirited argument with a defender of such a view.

I recently had, once again, the experience of reading an intelligent modern author (and scientist) elaborate what he thought was a novel insight but was one which, naturally, had been explored by Plato thousands of years ago. I refer you to Dr. Steven Novella's Neurologica blog, in which he wrote a post discussing Expertise and the Illusion of Knowledge. The post begins with:
In general people think they know more than they do. This is arguably worse than mere ignorance - having the illusion of knowledge.
Anyone familiar with Plato will immediately see that Dr. Novella is practically quoting Socrates in the Apology. But he does not seem to be familiar with Plato, and he goes on to describe the scientific investigation that backs up the assertion of the illusion of knowledge, as though the possibility of the illusion of knowledge had not already been decisively established for Western culture twenty-five hundred years ago in Athens.

Most of his blog post is concerned with the scientific investigation of the illusion of knowledge, and it is only at the end of the post, and almost in passing, that Dr. Novella approaches but never actually raises the truly decisive question:
As always, I encourage my readers to apply these lessons not only to others but to themselves. The Dunning-Kruger effect and the illusion of knowledge apply to everyone, not just to others.
The horrifying thing about the illusion of knowledge is that when you have it, you don't know you do. That is why it is an illusion. And the question of questions is: How do I know when I truly know something, as distinct from when I only think I know it?

It's not enough to merely mention the Dunning-Kruger effect and move on, as though simple awareness of the effect were sufficient to inoculate one against it. The scientists used made-up terms and fake concepts (like "annualized credit") to measure the extent to which subjects claimed knowledge they could not possibly have (since there was nothing to know), and perhaps it would be a good start to make sure we ourselves are not trading in deliberately bogus concepts. But that's not really the problem that faces us. The problem for us is that, even trading in legitimate concepts, we can end up believing we know things to be true that we don't.

Before discussing Plato's answer to the question of how we know when we truly know, let's consider modern approaches to the question. Descartes could be said to have launched the modern era by proposing universal doubt as the true way to found epistemology (or, the science of how we know what we know). Doubt all that you know, and what can survive that doubt can be confidently embraced as truly known. Famously, Descartes concluded the one thing that survived universal doubt was the fact of his own thinking - cogito ergo sum. From that nugget, Descartes reconstructed the world of common sense, including the existence of God.

Unfortunately, it turned out that Descartes's procedure wasn't the pure doubt he thought it was. Why, for instance, is thinking the crucial existential act? I dance therefore I am, I pray therefore I am, I eat therefore I am all work as well. In fact, as Kierkegaard tells us, the simple I is sufficient to establish existence. I anything therefore I am works because it is really I am that comes first and anything else comes later. What Descartes's approach does is falsely privilege thought over existence, as though existence were held in suspense until thought ratified it. Instead, the truth of our own existence is immediately known to us, and the conclusion we should draw is not that existence is the one thing thought can safely conclude, but that it was foolish for thought to ever doubt existence in the first place.

This may sound like one of those philosophical points that really has no bearing on anything anyone is really interested in but it is far from that. For the Cartesian move can be summed up in the principle that doubt is its own justification or, in other words, that we are justified in doubting something by the simple fact that it can be doubted. This Cartesian attitude has become deeply embedded in the modern consciousness, not just in philosophers, but in the common man as well. And it has terrible effects because it is false to human nature.

Human nature is incarnate - we exist as embodied beings in time and space. Time starts running for us as soon as we are born and does not stop until we die, and at every moment of that time existence makes demands on us, whether we doubt those demands or not. As children, we must be fed, kept warm and educated. A child cannot doubt and, in any event, should not doubt what he is presented with. A baby who somehow was able to doubt the value of the food he was given, and refused to eat until the nature and necessity of food was established for him, would soon die; a child who doubts his parents' admonishments not to wander off with strangers may very likely find himself in an unanticipated but dreadful situation. So by the time a child has grown old enough to learn of Descartes and considers flirting with the process of universal doubt, he has already spent many years not doubting and, in fact, could only have arrived at the position of being able to doubt through that non-doubt (which I will call faith for purposes of brevity). Will he then embrace doubt, including doubt of the very life story that brought him to the place at which he could doubt? This isn't a bold move into sure knowledge, but the deliberate forgetting of that which made us who and what we are; the consequence of which is the tendency of modern man to wander through life not knowing what he is doing.

Really the situation is this: To get through life, we must believe many things, simply to get on with our day. Universal doubt is an existential impossibility and is the arbitrary decision to take one side of the analysis of error Kierkegaard poses at the beginning of Works of Love: One can go wrong by believing that which is false, but one can also go wrong by failing to believe that which is true. The modern man following Descartes assumes the downside is all in falling into the former error. But falling into the latter error is arguably worse, Kierkegaard tells us, because through it we close ourselves off to the best things in life, which can only be had through faith.

One way to think of the Cartesian approach is as an attempt to find an absolute starting point for philosophy; a point which can be embraced by any man, anywhere, as the start of his thought. Descartes, the mathematician, is naturally thinking of things like geometry, which has an absolute starting point in Euclid's postulates. Anyone, anywhere, at any time, who wishes to take up geometry must do it, if he is to do it legitimately at all, with these same postulates. Can the same be said of thought in general? If so, then we could get past the endless dialog of opinion that was characteristic of philosophy and so distressed the founders of modern thought.

The problem, as we've seen, is that we have already embraced many things, things that have made us who we are, by the time we arrive at a place where we could undertake Cartesian doubt. Geometry can start anytime we want, but life has already started and conditioned us by the time we become philosophically aware. And it continues to condition us even as we ponder it. What this means is that, unlike geometry, there can be no absolute starting point to philosophy. The ancient dialog of opinion that characterizes classical philosophy is not a peculiar feature of that philosophy, but is reflective of the substance of philosophy, which is human existence.

When we arrive at the point at which philosophical consciousness is possible, we have already been conditioned by our upbringing and education. We already have a set of beliefs about the world and ourselves, about what the nature of the world is and who we are, about what is good and evil, about what is important and not important. Philosophical awareness, whether of the Socratic or Cartesian variety, can begin to happen when we realize not all that we think we know we do in fact know. The Socratic approach to this realization, unlike the Cartesian approach, is not therefore to throw everything we believe overboard. It is, rather, to understand that human nature is such that we must accept as true many things that have not as yet survived our critical scrutiny. It is to continue to live and commit ourselves in light of those beliefs, and to gradually but methodically subject those beliefs to philosophical scrutiny.

Notice how subjective this process is. By subjective I merely mean that every individual will have had a different experience and be equipped with a different set of opinions by the time he comes to philosophical consciousness. And philosophy for him can only mean working through the set of opinions that are peculiar to him. Thus there is no absolute starting point to philosophy because there is no absolute starting point to life. The only universal starting point was established by Plato and, brilliantly, explored in his writing. By writing his philosophy in dialog form, following the examination of opinion by Socrates, Plato communicates the truth that philosophy can only mean working through the opinions particular to a man - you - and not some abstract set of opinions or truths falsely claimed to be a priori universal.

Some of the conundrums that puzzle us moderns show how far we have strayed from the Socratic viewpoint. For instance, one often hears the assertion: If you had grown up a Jew you would be a Jew now, or if you had grown up a Muslim, you would be a Muslim now. The only reason you are a Christian is because you were born into a Christian family. The implication, Cartesian in spirit, is that we can only really find the truth by abstracting ourselves out of the existential commitments into which we are born. But this is simply false. Socrates did not discard the social and cultural obligations into which he was born as an Athenian. In fact, his last words poignantly show that his obligations were on his mind right to the end: "Crito, we owe a cock to Asclepius." What he did do was philosophically investigate those commitments as he lived them, as shown, for instance, in the dialog Euthyphro. That I might be a Muslim today were I born in a Muslim country simply shows that - hopefully - I would not think human obligation goes away just because I doubt it, even if I were a Muslim. It is true that I might not have experienced the philosophical freedom I do now were I born Muslim, but this does nothing to undermine the philosophical freedom I do have, having been born here. In other words, the fact that I might have been born a Muslim and never really challenged it philosophically does nothing to show that there is anything philosophically suspect in being born a Catholic and staying a Catholic. It might just be - and I think it is - that Catholicism is the one religion that really can withstand philosophical scrutiny.

So what is Plato's answer to the question of how we distinguish what we know from what we only think we know? The answer is that we know something to the extent that we can answer for it - that is, to the extent that it can withstand philosophical scrutiny in the form of Socratic cross-examination. This answer is both subjective and non-absolute. It is subjective because I know something only to the extent that I can provide reasons for it that withstand scrutiny; it is not absolute because cross-examination never has an absolute end. Our views can always be subject to further challenge. Put another way: I know something when I can provide a good answer to the question - How do you know that?

Returning to Dr. Novella and his post, his last sentence admonishing his readers to take account of the Dunning-Kruger effect (known historically as the Socratic insight) in their own thinking constitutes a Socratic moment. If we stop and ponder the implications of the realization of our own ignorance, we may find ourselves open to a truly philosophical adventure - one with no absolute starting point but with an absolute end in the truth. One way to short-circuit this adventure is by positing an absolute starting point to thought - be it Cartesian universal doubt or, as seems to be the case with Dr. Novella, the value of science. But the value of science, and indeed what distinguishes science from the pseudoscience Novella battles on his blog, are not themselves scientific questions but meta-scientific (i.e., philosophical) ones. And as such they can only be resolved through the dialog of opinion.

So embrace the Dunning-Kruger effect, but turn to Plato to discover what it truly means.

Saturday, July 18, 2015

Relating Ourselves to Indirect Knowledge, Pt. 2

In part 1 of this series, I began a discussion of how we can use reason to relate ourselves to indirect knowledge. Indirect knowledge is, briefly, knowledge for which we ourselves do not know the immediate reasons for its truth. Instead, someone else knows the reasons, and we are related to that knowledge through their mediation. Examples include complicated mathematical proofs (like the one recently demonstrated for Fermat's Last Theorem): we might not be able to follow the logic, but the mathematicians can, and we can appreciate what the mathematical geniuses have done. Or scientific claims like global warming, for which we cannot possibly conduct or review the science ourselves, but instead must trust what the relevant experts say about it.

Relating ourselves to indirect knowledge is very different from relating ourselves directly to knowledge. The latter involves considering truth immediately in terms of the fundamental reasons for something's being true or not; there is no mediator. In the former, the crucial question is how we judge the mediator, since we must take his word respecting the fundamental reasons for the truth or falsity of something. In my earlier post, I pointed to Socrates as an example of how to evaluate mediators, and used his example in the Apology: We must test a mediator to discover whether he himself is able to separate his knowledge from his opinions, and so give us only his expert knowledge and not, in addition, his non-expert and perhaps poorly founded opinions masquerading as expert knowledge. I gave Carl Sagan as a classic example of the expert who fails Socratic examination. In such cases, an expert can still be useful, but we must be very careful to separate what he genuinely knows through his expertise (the wheat) from the mass of non-expert opinion he gives along with it (the chaff).

We may also consider that indirect knowledge can never contradict direct knowledge. There is only one truth, and it is the same for us as it is for everyone else. Thus we know 2+2=4 directly, and any purportedly expert theory that ends up contradicting that truth (implicitly or explicitly) must be suspect; for whatever the expert knows, he can't know that 2+2 equals something other than 4. That's an obvious and trivial example, and better examples are not hard to find. Let's look at what Jerry Coyne tells us about truth, fact and knowledge on pages 186 and 195 of Faith vs. Fact:

(Begin quote)
For consistency, I'll again use the Oxford English Dictionary's definitions, which correspond roughly to most people's vernacular use. "Truth" is "conformity with fact; agreement with reality; accuracy, correctness, verity (of statement or thought.)" Because we're discussing facts about the universe, I'll use "fact" as Stephen Jay Gould defined "scientific facts": those "confirmed to such a degree that it would be perverse to withhold provisional assent." Note that these definitions imply the use of independent confirmation - a necessary ingredient for determining what's real - and consensus, that is, the ability of any reasonable person familiar with the method of study to agree on what it confirms... Finally, "knowledge" is simply the public acceptance of facts; as the Dictionary puts it, "The apprehension of fact or truth with the mind; clear and certain perception of fact or truth; the state or condition of knowing fact or truth." What is true may exist without being recognized, but once it is it becomes knowledge. Similarly, knowledge isn't knowledge unless it is factual, so "private knowledge" that comes through revelation or intuition isn't really knowledge, for it's missing the crucial ingredient of verification and consensus...

"I'm hungry," my friend tells me, and that too is seen as extrascientific knowledge. And indeed, any feeling that you have, any notion or revelation, can be seen as subjective truth or knowledge. What that means is that it's true that you feel that way. What that doesn't mean is that the epistemic content of your feeling is true. That requires independent verification by others. Often someone claiming hunger actually eats very little, giving rise to the bromide "Your eyes are bigger than your stomach."
(Emphases in original and end quote).

Socrates once observed that flute-playing implies a flute player. Similarly, knowledge implies a knower. There is no knowledge without someone knowing it; in other words, knowledge is the substance of the act of knowing. What this means is that, contra Coyne, all knowledge is subjective, in the sense that all knowledge is knowledge only because it is known by someone, somewhere, at some time. That all knowledge is subjective is a piece of primary knowledge - something we can know directly for ourselves simply by reflection on the nature of things. Thinkers like Coyne tend to speak of the abstraction "science" as though it were a disembodied process generating results all on its own, but we should remember that science is but the activity of scientists, and to the extent that anything is known by science, it is known by individual scientists here and there.

The "independent confirmation" of which Coyne writes is a useful and wonderful thing, but he fails to realize that it depends on the very "subjective truth or knowledge" that he disparages. "I'm hungry" is certainly one thing we can say; another is "I hear or have read your experience in confirming my scientific experiment." The latter is as subjective as the former. Coyne claims that the former needs independent verification of its epistemic content (that content apparently being "I need food"). Well, what about the latter? The epistemic content of the latter is that "it is a fact that you have confirmed my scientific experiment." This would seem to need independent verification as well. How will I get it? By listening to something else you say or write, or to what someone else has said or written? Then those subjective experiences - which as experiences are also of the form "I am hearing you say that..." - are themselves subject to the same requirement of independent verification. We have an infinite regress here, and for a very good reason. Any contact I have with reality will be subjective, simply because I am me, and science can escape that truth only on pain of indulging in magical thinking. Introducing a radical divide between our subjective experience and its epistemic content destroys not only Coyne's intended target of religious belief, but the very possibility of knowledge.

"I'm hungry" does not always mean that I need food. But in the normal course of events it does; that is why nature gave us the feeling. "I hear you saying that you have confirmed my experiment" doesn't always mean I have heard you say that - I could be dreaming, hallucinating or simply have misheard you - let alone that you have in fact confirmed my experiment. But in the normal course of events it does, and in the normal course of events I might reasonably take for granted that you have in fact confirmed my experiment. Subjective experience is not indubitable; the attempt to make it indubitable (as in the thinking of Descartes) only leads to yet more fundamental and dangerous misunderstandings. But it is literally all we have got.

The only basis from which to critique our subjective experience is yet more subjective experience. Doesn't this just involve us in another infinite regress? No, because it involves us in the philosophical process of dialectic. Subjective experience does not go on to infinity but turns back on itself. We criticize subjective experience A in terms of experience B, and B in terms of A, deciding what makes the most sense based on how comprehensively our theories make sense of experience.

For instance, consider the ancient philosophical question of the difference between sleeping and waking. How do I know I'm not sleeping right now? I notice that in certain cognitive states the question of whether I am sleeping or waking never occurs to me, and seems like it could not occur. These states, of course, are when I am sleeping; in fact, when the question of whether I am sleeping or waking occurs to me, I know I am in the process of waking up. So the difference between sleeping and waking seems to be that waking is aware of both itself and the state of sleeping, while sleeping is aware of neither itself nor the waking state. Now since I am aware of the distinction between the two states, I must be awake. Thus we have the subjective experience of sleeping (experience A) being critiqued from subjective experience B (waking), each experience shedding light on the other (from the perspective of B) and leading to a comprehensive insight into both.

Or consider the process of science itself. Just as flute-playing implies a flute player, and knowledge implies a knower, science implies a scientist. That is, all science occurs in the context of the subjective experience of a scientist. This is a very valuable piece of direct knowledge that is surprisingly often overlooked. Scientists, being people like you and me, can and must take the everyday world of common sense for granted - not just in their everyday lives, but in their scientific endeavors as well. If the microbiologist starts wondering whether he is really looking through his microscope, or the physicist whether he is really discussing his results with other physicists and not merely with a Matrix-like simulation meant to deceive him, then his science will never get started. There is therefore a dialectic between ordinary experience and the specialized experience of the scientist in the lab.

Coyne seems to be in the grip of a mythical belief that the scientific method allows one, in the moment of science, to transcend human nature itself and reach the otherwise unattainable realm of the "objective." Like all true myths, it isn't recognized as such but serves as an unarticulated background assumption.

And the cure for it is philosophical reflection on direct experience.

Sunday, June 28, 2015

Recent Supreme Court Actions

Reading Plato's Laws while researching my recent post, I came across a phrase that seemed appropriate to the recent actions of our Supreme Court:

"No human being is competent to wield an irresponsible control over mankind without becoming swollen with pride and unrighteousness."  - Laws, Book IV