Now let's say:
- I'd pay $2000 not to have the side effects I've seen described
- I estimate my life is worth $10M[1]
Estimate probabilities at:
- P(dying | rabies & no treatment) = 100%
- P(dying | rabies & treatment) = 0%
- P(side effects | treatment) = 50%
- P(rabies | raccoon was rabid) = 0.1% (since no noticeable skin break)
- P(raccoon was rabid | observed behavior) = 6% (apparently the behavior is perfectly normal, so this is just the average incidence of rabies in raccoons[4])
Expected losses given treatment = .5 * $2k = $1000
Expected losses given no treatment = .001 * .06 * $10M = $600
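The arithmetic can be checked with a quick Python sketch (the script and variable names are mine, not part of the original calculation):

```python
# Expected-loss comparison for the rabies-shot decision,
# using the probability and dollar estimates above.

cost_side_effects = 2_000      # what I'd pay to avoid the side effects
value_of_life = 10_000_000     # see footnote 1

p_side_effects = 0.50          # P(side effects | treatment)
p_rabid = 0.06                 # P(raccoon was rabid | observed behavior)
p_rabies_if_rabid = 0.001      # P(rabies | raccoon was rabid)
p_die_untreated = 1.00         # P(dying | rabies & no treatment)

loss_treatment = p_side_effects * cost_side_effects
loss_no_treatment = p_rabid * p_rabies_if_rabid * p_die_untreated * value_of_life

print(f"treatment:    ${loss_treatment:,.0f}")     # $1,000
print(f"no treatment: ${loss_no_treatment:,.0f}")  # $600
```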
Of course, this is close enough to be extremely sensitive to P(rabies | raccoon was rabid), which I don't actually know. I washed up and I don't think I even touched the creature, but if I missed a scratch, I'm in trouble. Calibrated confidence makes me somewhat iffy about the 0.1% estimate; if it's 5% instead, we're looking at $30k instead of $600, and I should berate the doctor until she gives me a shot.
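To make that sensitivity concrete, here's a sweep over a few hypothetical values of P(rabies | raccoon was rabid) (a sketch of mine, not a real risk model):

```python
# How the no-treatment expected loss moves with the one probability
# I'm least sure of: P(rabies | raccoon was rabid).
value_of_life = 10_000_000
p_rabid = 0.06  # P(raccoon was rabid | observed behavior)

for p_rabies_if_rabid in (0.001, 0.01, 0.05):
    loss = p_rabid * p_rabies_if_rabid * value_of_life
    print(f"P = {p_rabies_if_rabid:.3f} -> expected loss ${loss:,.0f}")
```

At 0.001 the expected loss is $600, below the $1,000 cost of treatment; at 0.05 it's $30,000 and the shot wins easily.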
But also, I think my estimate above doesn't include my social values, on which I effectively place a very high dollar value. One of these values might be phrased as "trust the experts or become an expert", another as "don't whine". Wah, math is hard.
P.S.: the following Bulgarian folk dance[2] is brought to you by probabilistic, b0st0n, and dsaklad.
[1] It's tempting to say one's life is worth $∞; and indeed, if I had $X, and I had to pay $X to stay alive, I probably would, no matter how large $X was. But there's got to be a maximum dollar amount that comes into play when you're talking about a differential chance of death rather than a certainty: you can rationalize crossing the street, going snowmobiling, or whatever, because there's something that's "worth it"; and "it" is (probability of death) * (value of your life). In wrongful death lawsuits and engineering calculations, this is normally set to an actuarial estimate of your expected lifetime earnings.
[2] Not a Bulgarian folk dance.[3]
[3] see also
[4] To justify the use of the name "Bayes", here is an application of Bayes' Rule: P(rabid | aggressive) = P(aggressive | rabid) P(rabid) / P(aggressive) = 0.06 * P(aggressive | rabid) / P(aggressive); my supposition that P(aggressive | rabid) ~= P(aggressive) doesn't actually seem very sensible when written out like that. If the ratio is even as high as 2, the decision flips the other way.
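That flip can be checked numerically. Below, `likelihood_ratio` stands for P(aggressive | rabid) / P(aggressive); my supposition above amounts to assuming it's 1 (a sketch with names of my own choosing):

```python
# Bayes' Rule: P(rabid | aggressive)
#   = P(aggressive | rabid) * P(rabid) / P(aggressive)
#   = likelihood_ratio * P(rabid)
value_of_life = 10_000_000
p_rabies_if_rabid = 0.001
base_rate = 0.06           # average rabies incidence in raccoons
loss_treatment = 1_000     # expected loss with treatment, from the main text

for likelihood_ratio in (1.0, 2.0):
    p_rabid = likelihood_ratio * base_rate
    loss_no_treatment = p_rabid * p_rabies_if_rabid * value_of_life
    better_to_treat = loss_no_treatment > loss_treatment
    print(likelihood_ratio, round(loss_no_treatment), better_to_treat)
    # ratio 1.0 -> $600, don't treat; ratio 2.0 -> $1200, treat
```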