The most obvious definition to try is that A assigns a probability of over 50% to P being true. However, this seems potentially both over-extensive and under-extensive in the range of situations it classes as 'belief'. First, suppose A buys a lottery ticket and flips a coin. Without looking at the outcome of either, A can say that the probability of the proposition "the coin has landed on heads, or the lottery ticket has won the jackpot" is ever so slightly above 50%, yet it seems strange to call this a belief when it is at best a guess based upon probabilities. Second, suppose there is a set of mutually exclusive propositions P1, P2, ... whose subjective probabilities (according to A) sum to 1, with P(P1) < 0.5 but P1 by a substantial margin the most probable of them. Would it then be reasonable to say that A believes P1? I'm uncertain, which suggests the answer is likely to depend upon the vagaries of the case.
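To make both cases concrete, here is a quick sketch of the arithmetic; the jackpot odds and the three-proposition probabilities are invented for illustration, not figures from any real lottery:

```python
# Case 1: probability that "the coin landed heads OR the ticket won".
# The events are independent, so by inclusion-exclusion:
#   P(H or W) = P(H) + P(W) - P(H) * P(W)
p_heads = 0.5
p_win = 1e-7  # hypothetical jackpot odds of one in ten million

p_disjunction = p_heads + p_win - p_heads * p_win
print(p_disjunction)  # 0.50000005 -- ever so slightly above 50%

# Case 2: mutually exclusive propositions whose probabilities sum to 1,
# with P1 the clear favourite yet still below 0.5.
probs = {"P1": 0.4, "P2": 0.35, "P3": 0.25}
print(sum(probs.values()))         # 1.0 (up to float rounding)
print(max(probs, key=probs.get))   # 'P1', despite P(P1) < 0.5
```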
With regard to the first issue, perhaps we can tweak our definition to require that P(P) be a posterior probability, based upon actual evidence, as opposed to a prior probability based on... something. (I'm assuming for the moment that A follows a Bayesian epistemology; of course most people, including most philosophers, do not fit this description, but I at least try to, and this whole post is a somewhat roundabout way of tackling an issue regarding my own beliefs.)
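For the flavour of Bayesianism assumed here, the prior/posterior distinction is just Bayes' theorem at work. A minimal sketch, with numbers made up purely for illustration:

```python
# Bayes' theorem: P(P|E) = P(E|P) * P(P) / P(E), where
#   P(E) = P(E|P) * P(P) + P(E|not P) * P(not P)
def posterior(prior, likelihood, likelihood_given_not):
    """Posterior probability of a proposition after observing evidence E."""
    evidence = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical numbers: a 0.3 prior, and evidence three times as likely
# if the proposition is true than if it is false.
print(posterior(prior=0.3, likelihood=0.9, likelihood_given_not=0.3))  # 0.5625
```

On the tweaked definition, the 0.3 would not count towards belief even if it were above 0.5, because it is a prior; only the 0.5625, produced by conditioning on evidence, would qualify.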
This doesn't really seem to make sense, however, as a normative prescription for how we ought to choose our beliefs. If we have a sensibly chosen prior, it's hard to see why having evidence should change the epistemic nature of our view of a proposition. (By 'nature' I mean belief vs. justified belief vs. knowledge vs. whatever else there is, as opposed to status, i.e. strongly believe vs. weakly believe vs. weakly disbelieve and the thousand and one variants thereof.)
What I suppose I'm getting at is that the notion of belief as a binary concept is very unclear, and perhaps incoherent. What would that mean? It shouldn't affect our actions: we are perfectly capable of acting on things we 'disbelieve', or believe to have negligible probability - this is one of the key ideas in some areas of global catastrophic risk (see the sketch below). It ought, however, to affect the way we think about epistemology: if it is impossible to come up with a sensible definition of belief, then this throws all attempts to define knowledge out of the window. For people with sensible (i.e. Bayesian) epistemologies this is no problem, but it might form the basis of an attack upon non-Bayesian epistemologies.
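The point about acting on 'disbelieved' propositions is just expected-value reasoning. A hedged sketch, with all three numbers invented for illustration:

```python
# Even a proposition assigned negligible probability can dominate a
# decision if the stakes are large enough.
p_catastrophe = 1e-6       # hypothetical one-in-a-million risk
cost_of_mitigation = 1e6   # cost of acting now, in arbitrary units
harm_if_ignored = 1e15     # harm if the catastrophe occurs unmitigated

expected_harm_avoided = p_catastrophe * harm_if_ignored  # 1e9
print(expected_harm_avoided > cost_of_mitigation)  # True: act despite 'disbelief'
```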