
Monday, July 14, 2014

Belief and Evidence

I've been thinking about belief and evidence.  I suppose it's natural to think of confidence as varying continuously from 0 to 1, but in reality there are discrete, yes-or-no decisions that have to be made, and so the question is whether the evidence has pushed your confidence level across a threshold.

In practice, though, evidence is often scarce.  And so there is the question of setting thresholds in light of scarce evidence.  And in fact some of the most important decisions will turn on beliefs that are inherently difficult to assess in terms of evidence.  So what is to be done?

I don't know.  It strikes me as being similar to the "bounded rationality" problem.  In short, the problem is that if you have scarce processing power, then you have to allocate it judiciously, but the allocation problem itself requires processing power.  But how much processing power should be allocated to the allocation of processing power?  To answer that question requires processing power.  But how much should be allocated to that problem?

To give a more down-to-earth example, imagine you are buying some kind of ingredient online.  It is available in different amounts (measured in different units) and at different prices (specified in different currencies).  If we assume that you are perfectly rational, then you should pick the best combination of quantity and price.  And we can simplify this by assuming that you only care about minimizing the per-unit price (so you don't care about the overall quantity).  It's a very simple calculation at that point, in that it involves no judgments.  It's arithmetic.
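
To make the arithmetic concrete, here is a minimal sketch in Python.  The vendors, quotes, unit conversions, and exchange rates are all invented for illustration; the point is only that, once everything is put on a common basis, "pick the best quote" reduces to taking a minimum.

UNIT_TO_GRAMS = {"g": 1.0, "kg": 1000.0, "oz": 28.3495, "lb": 453.592}
TO_USD = {"USD": 1.0, "EUR": 1.36, "GBP": 1.71}  # made-up exchange rates

quotes = [
    {"vendor": "A", "amount": 500, "unit": "g",  "price": 12.00, "currency": "USD"},
    {"vendor": "B", "amount": 2,   "unit": "kg", "price": 31.50, "currency": "EUR"},
    {"vendor": "C", "amount": 16,  "unit": "oz", "price": 9.75,  "currency": "GBP"},
]

def usd_per_gram(quote):
    # Convert one quote to a common basis: US dollars per gram.
    grams = quote["amount"] * UNIT_TO_GRAMS[quote["unit"]]
    dollars = quote["price"] * TO_USD[quote["currency"]]
    return dollars / grams

best = min(quotes, key=usd_per_gram)  # the "perfectly rational" choice
print(best["vendor"], round(usd_per_gram(best), 4), "USD per gram")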

But it's still not simple enough.  You value your time, and you don't want to spend an undue amount of time converting all of the quantities and prices into common units.  And so there is a possibility that you will select a price quote that is strictly worse than another option, by your own lights.  (Now I want to be clear:  we are assuming that you have all of the vendors' price quotes at your fingertips.  In other words, this is a processing problem, not a search problem.  A search problem has roughly the same dynamic, though.)

And then there is the possibility that you will spend the "wrong" amount of time converting units and currencies.  If you spend too little time, you are likely to end up paying too much.  If you spend too much time, you will end up "paying" more in time than you gain in improved price.  And so the right amount of time to spend processing the data is a function of the expected gains from each calculation.

But calculating the expected gains for each calculation is itself difficult.  I suppose you could do a handful of conversions, then see what the distribution of prices and quantities is, and then use that to predict what the lowest price will be if you convert x prices.  But doing this calculation itself takes time.  You might choose to make an informal calculation - glance at the prices you've converted so far, see that they are in a tight range, and simply pick the best one.  Or see that they are all over the place, and convert a few more.  This economizes on meta-processing but it risks failing to optimize on processing itself.
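
Something like the following sketch captures that informal rule.  It reuses the hypothetical usd_per_gram helper and quotes from the earlier sketch, and the batch size and the "tight enough" threshold are arbitrary assumptions, not claims about the right values.

def pick_quote(quotes, convert, batch=3, spread_threshold=0.10):
    # Convert quotes one at a time (the costly "processing" step).  After each
    # batch, glance at the spread of per-unit prices seen so far: if they sit
    # in a tight range, stop; if they are all over the place, keep converting.
    seen = []
    for i, quote in enumerate(quotes, start=1):
        seen.append((convert(quote), quote))
        if i % batch == 0:
            prices = [p for p, _ in seen]
            spread = (max(prices) - min(prices)) / min(prices)
            if spread <= spread_threshold:
                break  # further conversions are unlikely to pay for their time
    return min(seen, key=lambda pair: pair[0])[1]  # best of what got converted

# e.g. pick_quote(quotes, usd_per_gram), with quotes and usd_per_gram as above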

Anyway, enough.  This is not a new problem; there is a whole concept of "satisficing" that addresses it.  My point is just that we seem to be in a similar position when we reason our way through decisions where the evidence is inherently limited.  We have to set thresholds uncomfortably low, or we have to abandon rational thought altogether.  And it seems to me that as you set thresholds low enough, and bring in enough informal and subjective evidence, you essentially turn a reasoning process into an emotional one.  That's not quite right.  Your beliefs become more like attitudes, more "political," the more you rely on low thresholds.  This is why deciding that you love someone is as much a choice as it is a conclusion.  Our language in this area is (maybe appropriately) fuzzy.  You determine that you love someone, or you determine to love someone.

So your beliefs end up merging seamlessly into your values and dispositions.  And so of course you are open to accusations of being "unscientific" or lacking rigor, or whatever, but that is just how life is lived on the low-evidence end of the spectrum.

6 Comments:

Blogger Zed said...

I'm with you until midway through the penultimate paragraph, but I don't see how it leads you to the idea that "beliefs are like attitudes." It doesn't seem right to say that anything you act on is a "belief"; I might have a heuristic that says always choose heads on a coin-flip, but that surely does not imply a _belief_ in the proposition that the coin will come up heads. Similarly, I don't (if I'm sensible) _believe_ that a satisficing solution is the global maximum. In the absence of good evidence one tends to overweight the non-consequentialist (and therefore not evidence-dependent) parts of one's moral code, but I do not see how this is breaking down the distinction between beliefs and attitudes.

11:32 AM  
Blogger James said...

Yeah, I was actually writing a follow-up post roughly along those lines. You could think of it this way. If your evidence is limited, then you don't put much confidence in your conclusion, and it doesn't count as a "belief" (or it is a qualified belief). So in other words you maintain mental reservations even as you act on the conclusions that you have reached based on the available evidence.

I think a lot of the time we don't act that way, though. A lot of our strongest beliefs are not rigorously supported with evidence in the traditional sense. I think it's highly likely that if the Irish had ruled themselves, the famine would have been far less devastating (because they would have closed their ports - for most of the famine, Ireland was a net exporter of food). This is common sense, I think, but my evidence is thin. Nevertheless it is a strongly held belief and it feeds directly into my (also strong) belief that democracy is a good form of government.

I guess we can get into a discussion about what counts as good evidence. Maybe if you have enough informal, diffuse evidence it should be given a lot of weight. So then maybe a different way of putting my point is: at some level of abstraction, your values interact strongly with your beliefs by the mechanism of methodological choices in your reasoning. (For instance, given almost limitless evidence, your "procedural" choices about how to weigh evidence end up being outcome-determinative.)

I'll have more later.

12:59 PM  
Blogger Grobstein said...

"Belief" in the pure sense (a la Sarang's comment) is an idealization rather than a psychological reality, I think, because in practice we don't thoroughly wall off epistemic from practical dispositions.

A nice example is trust. Trust seems to be paradigmatically epistemic. I trust someone if I believe that their utterances are sincere and reliable, which is a pure fact question about the world. But in practice we treat trust as a matter of "political" stance -- we trust someone partly as a way of publicly esteeming them.

Similarly, people seem to claim to "believe that a satisficing solution is the global maximum," because of internal preferences for aligning beliefs with practical attitudes (cf. "cognitive dissonance").

1:35 PM  
Blogger Zed said...

I think we use a very close approximation to the purely epistemic notion of belief whenever the requisite action is low-stakes enough -- so the concept seems to be valuable. A lot of people like Herodotus and Gibbon but one cannot argue from there to _trusting_ them, and I think that sort of argumentative leap would usually be seen as a mistake.

(Similarly, I think even in cases where people _want_ to think a satisficing solution is the global maximum they do not defend this reasoning but try to come up with less lizard-brain justifications. Rationalization is the homage that &c. James is, as I read him, trying to justify certain procedures rather than describe behavior, so I'm not sure how far the trust example in fact helps him.)

3:28 PM  
Blogger Grobstein said...

Yeah the concept is certainly useful. To the extent that we deviate from the purely epistemic conception, I'm not sure this is a mark against the purely epistemic concept. It might be a useful ideal even if it is often a bad descriptor, or it might be useful in specialized contexts.

Per trust specifically, the contamination effect I'm pointing to is most salient in transactions between actual people, rather than evaluations of the work of dead authors. "Do you trust me?" / "I trust you." is a fairly common form of verbal transaction and kind of puzzling on the purely epistemic concept.

But I guess you are right to point out that James is mostly saying something else and somewhat deeper. There's a philosophical slogan due to Quine or Lewis that goes something like, "Probability requires certainty." The idea is like: to say that something is 50% likely requires taking as given (certain) a bunch of background assumptions (viz. prior distributions), and without them no numbers can be assigned; instead we must just be vaguely "uncertain" about everything. We can use reliable-seeming heuristic procedures to assign priors, but their reliability is always also uncertain, etc. The "No Free Lunch Theorem" is a formalization of this sort of idea.

I think the further assertion that "beliefs are like attitudes" in this context comes from the impossibility of ultimate justification. Beliefs are supported "ultimately" by a kind of stance towards the evidence, and if the adoption of one stance or another is not the sort of thing that can be justified then maybe that makes it less like a paradigm "belief."

11:30 AM  
Blogger Grobstein said...

Some philosophers do think there are things you can know with certainty. Descartes famously thought this about "clear and distinct" mental impressions.

Lately I've become interested in arguments for Ockham's Razor, the bias towards simple explanations. Is there reason to think Ockham's Razor tends to produce true beliefs? Can such reasons be framed without reference to assumptions about the world? (If your reason for preferring simple beliefs is that the world prefers simple truths, you seem to be trapped in circular reasoning.)

A line of papers from Kevin Kelly argues that Ockham's Razor is rational because it minimizes how many times you have to revise your conclusions in the worst case. It's hard for me to see 1) whether this is a reason at all, and 2) whether it is true with any kind of generality. http://www.hss.cmu.edu/philosophy/kelly/papers/kellyinfo14.pdf

Other arguments appeal to the relative ease of testing / disproving simple theories, concerns about statistical overfitting, etc.

I'm not really sure what I make of this literature but I bring it up because it seems to me the kind of thing that can in principle address the sorts of concerns James is raising.

11:46 AM  
