Friday, December 16, 2011

The Pillars of Knowledge: Faith and Trust?

We don't often reflect upon the reality that our "knowledge" is either faith-based or trust-based, which at bottom is nothing more than the essence of "belief".

Since the secular and non-secular worlds are supposed to be utterly different from one another, let's call that belief "trust" in the secular realm, such as science, and "faith" in the non-secular realm, such as religion.

Whatever words we use, the essence of "belief" boils down to a dependence on other people, a belief in what other people write or say they know, as the real basis for what we call "our knowledge".

There is a substantial amount of discourse and debate, in some circles, about the impact that this reality should or should not have on us:
"I find myself believing all sorts of things for which I do not possess evidence: that smoking cigarettes causes lung cancer, that my car keeps stalling because the carburetor needs to be rebuild, that mass media threaten democracy, that slums cause emotional disorders, that my irregular heart beat is premature ventricular contraction, that students' grades are not correlated with success in the nonacademic world, that nuclear power plants are not safe (enough) ...

The list of things I believe, though I have no evidence for the truth of them, is, if not infinite, virtually endless. And I am finite. Though I can readily imagine what I would have to do to obtain the evidence that would support any one of my beliefs, I cannot imagine being able to do this for all of my beliefs. I believe too much; there is too much relevant evidence (much of it available only after extensive, specialized training); intellect is too small and life too short.

What are we as epistemologists to say about all these beliefs? If I, without the available evidence, nevertheless believe a proposition, are my belief and I in that belief necessarily irrational or non-rational? Is my belief then mere belief (Plato's right opinion)? If not, why not? Are there other good reasons for believing propositions, reasons which do not reduce to having evidence for the truth of those propositions? What would these reasons look like?

In this paper I want to consider the idea of intellectual authority, particularly that of experts. I want to explore the 'logic' or epistemic structure of an appeal to intellectual authority and the way in which such an appeal constitutes justification for believing and knowing.
"
(Epistemic Dependence, by John Hardwig, Journal of Philosophy, Vol. 82, No. 7, July 1985). A problem arises when we rely on experts in situations where those experts disagree with one another, even about the same set of facts:
"Alvin Goldman has criticized the idea that, when evaluating the opinions of experts who disagree, a novice should 'go by the numbers'. Although Goldman is right that this is often a bad idea, his argument involves an appeal to a principle, which I call the non-independence principle, which is not in general true. Goldman's formal argument for this principle depends on an illegitimate assumption, and the examples he uses to make it seem
intuitively plausible are not convincing. The failure of this principle has significant implications, not only for the issue Goldman is directly addressing, but also for the epistemology of rumors, and for our understanding of the value of epistemic independence. I conclude by using the economics literature on information cascades to highlight an important truth which Goldman's principle gestures toward, and by mounting a qualified defense of the practice of going by the numbers."
(When Experts Disagree, by David Coady, Episteme: A Journal of Social Epistemology, Vol. 3, No. 1, 2006). This is an ancient problem that has been handled in different ways, yielding divergent solution paths but no absolute conclusions.
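The "information cascades" that Coady points to can be illustrated with a toy simulation. The sketch below is not taken from either paper; it merely assumes the standard sequential-choice story in which each person receives a noisy private signal, sees only the public choices made before them, and follows the crowd once the crowd's lead outweighs their own signal. The function run_cascade and all of its parameters are illustrative.

```python
import random


def run_cascade(n_agents=20, signal_accuracy=0.7, true_state=1, seed=None):
    """Toy sequential information cascade.

    Each agent gets a private binary signal that matches the true state
    with probability `signal_accuracy`, sees only the public choices of
    earlier agents, and follows its own signal unless the public tally
    already leads by 2 or more, at which point it follows the crowd
    (a cascade), so later choices add no new information.
    """
    rng = random.Random(seed)
    choices = []
    for _ in range(n_agents):
        signal = true_state if rng.random() < signal_accuracy else 1 - true_state
        lead = sum(1 if c == 1 else -1 for c in choices)  # net public evidence
        if lead >= 2:
            choice = 1            # "up" cascade: private signal is ignored
        elif lead <= -2:
            choice = 0            # "down" cascade
        else:
            choice = signal       # public evidence is weak; use own signal
        choices.append(choice)
    return choices


if __name__ == "__main__":
    trials = 10_000
    wrong = sum(
        1 for t in range(trials)
        if sum(run_cascade(seed=t)[-5:]) < 3  # majority of last 5 agents wrong
    )
    print(f"Crowds that settled on the wrong answer: {wrong / trials:.1%}")
```

In the runs where the crowd locks onto the wrong answer, the later "experts" are merely echoing the earlier ones, which is one reason counting heads can overstate how much independent evidence actually stands behind a popular opinion.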

This blog has discussed how our jurisprudence deals with conflicting expert testimony in the post Why Trial By Jury?, and how experts can be very wrong in the post What Is Pseudo Science? (cf. The Appendix of Vestigial Textbooks), while Dredd Blog has discussed "scientific faith" in various contexts in the post Heretics Deny The Dark Matter of Faith.

The bottom line is that none of us, whether scientists or religionists, should get arrogant about our beliefs to the point of exalting "our knowledge" over the "knowledge" of others.

Instead, we should always remember that the expert or non-expert material we rely on does not give us actual, direct knowledge of the vast majority of subjects we deal with:
"Knowledge is invariably a matter of degree: you cannot put your finger upon even the simplest datum and say 'this we know.'"
(T.S. Eliot). The danger of reliance on the seats of power for knowledge is the essence of Toxins of Power Blog posts:
"Events in my life caused me to start questioning my goals and the correctness of everything I had learned. In matters of religion, medicine, biology, physics, and other fields, I came to discover that reality differed seriously from what I had been taught. As a result of this questioning process, I was startled to realize how much of my 'knowledge' was indeed questionable."
(Dr. Thomas C. Van Flandern). So, on this blog you will see the words "hypothesis" and "theory" used often, in place of an assertion of absolute knowledge.

It's A Matter of Trust, by Billy Joel



5 comments:

Randy said...

One person's "known" is another person's "unknown" ... in the "sense" that there are "known knowns", "known unknowns", "unknown unknowns", and "unknown knowns".

"Known And Unknown", by The Donald Rumsfeld, Link

Dredd said...

An example of the difficulties ... while trying to determine how many people "believe" in a flat Earth (Yes, Flat-Earthers Really Do Exist).

Lisa Mae said...

This hurts my head, however mind-opening it actually is.

Lisa Mae said...

This hurts my head, however mind-opening this information is.

Randy said...

"Cognitive biases and brain biology help explain why facts don’t change minds" (Link)