Tuesday, March 18, 2014
By Robert H.
Are we allowed to start worrying about nuclear war without seeming weird? Is the Cold War back on enough for that? 'Cause I was googling around a little, and it turns out America has a surprisingly good shot at totally succeeding in a first strike against Russia, which is obviously bad.
I say "surprisingly" not because I think the chances are high, but because I would have thought our chances of surviving a full scale nuclear war without the Russians hitting a single target would be zero. More than that, I would have thought the odds of just losing three or four cities would be zero. I would have thought the odds of only losing a third of our population were zero. Basically I thought MAD was still a thing, and both Russia and America could destroy each other under any conceivable nuclear scenario. The only way to win is not to play, how about a nice game of chess?
But the author I was reading wants to play up how risky a first strike would be, and yet he still leaves me with the impression that we have an outside chance of pulling it off totally.
Worrying. It makes me wonder how much further we would have to develop our ABM capability before we could shoot down any Russian nukes likely to survive a first strike, which makes me wonder if that possibility fuels erratic behavior on Russia's part. The knowledge that an age of American nuclear primacy might be right around the corner would scare me shitless if I were Russian, and obviously the worse your future position the more aggressively you play your current advantages. What's worse, the less likely you are to survive a first strike, the more likely you are to launch one, which means we are more likely to launch one, which means etc. The balance of terror doesn't work without the balance.
On the bright side, nuclear war could solve the Fermi paradox.
Thursday, March 6, 2014
Uncertain
By Robert H.
Andrew Cohen says some weird things about uncertainty:
We are, indeed, fallible. I don’t think anything follows from this with regard to toleration (see Chapter 7, section E). It is perfectly reasonable to think toleration is a value while recognizing one’s own fallibility. One may be wrong, but to say one thinks X is to say, “given all else I know, I think X and I will maintain X until shown that X is false.” As Joseph Schumpeter said “To realise the relative validity of one’s convictions and yet stand for them unflinchingly, is what distinguishes a civilized man from a barbarian.” Indeed, it seems entirely natural to be willing to stand for one’s beliefs unflinchingly, recognizing one’s judgments may nonetheless be wrong.
To maybe state the obvious, it is perfectly possible to hold the belief you think is most likely to be correct without thinking it is likely to be correct. For example, say a delicious torta could be behind one of three doors, and you happen to think there is a 49 percent chance it is behind door one, a 20 percent chance it is behind door two, and a 31 percent chance it is behind door three. In that situation you might believe that you should open door one, but you would also think that the torta is probably not behind door one. This is one way to hold a belief without being confident you are correct.
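A quick sketch of that arithmetic (plain Python, using the made-up probabilities from the example above): the best available pick is door one, and yet those same numbers say door one is probably wrong.

```python
# Door probabilities from the torta example above (they sum to 1).
doors = {"door one": 0.49, "door two": 0.20, "door three": 0.31}

# The belief most likely to be correct is the one with the highest probability...
best_door = max(doors, key=doors.get)

# ...but it is still more likely to be wrong than right: 1 - 0.49 = 0.51.
print(best_door, "is the best pick, but P(wrong) =", 1 - doors[best_door])
```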
But we could have even more options: maybe not opening any doors is a possibility, and if you wait two days the torta will simply be given to you. Now we have to weigh our preferences: a chance at a torta now, or a certain torta later. Here we may say, "I prefer to wait," and firmly believe it, but acknowledge that it's all a matter of taste. This is another way to hold a belief without being confident you are correct: acknowledging that you are judging by subjective criteria that vary from person to person.
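A minimal sketch of that trade-off, assuming a made-up per-day "impatience" discount `delta` that is pure taste rather than evidence; two people facing the same probabilities can rationally choose differently:

```python
def prefer_waiting(p_now: float, delta: float, days: int = 2) -> bool:
    """Compare the expected torta from opening now against a certain,
    but delayed, torta discounted by personal impatience."""
    expected_now = p_now * 1.0     # open door one today: a 49% shot at one torta
    certain_later = delta ** days  # one guaranteed torta, discounted per day of waiting
    return certain_later > expected_now

print(prefer_waiting(0.49, delta=0.95))  # patient eater: 0.9025 > 0.49 -> True, wait
print(prefer_waiting(0.49, delta=0.60))  # impatient eater: 0.36 < 0.49 -> False, open now
```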
Lastly, you could simply not know what other people know. You've conducted some sort of investigation and come up with some sort of probability, but are you confident telling someone else to pick your door if they've conducted a separate investigation? Maybe they've peeked behind door three! Maybe they have a better sense of smell! This is another reason to hold a belief but not be confident in it: it may be the best belief based on what you know, but you may not know enough.
Cohen seems to say, if I am reading him right, that knowing your beliefs could be wrong shouldn't encourage you to be more tolerant of other beliefs. I'm not so sure. Imagine 100 doors, and you think door number 2 is the most likely to hold the prize, at a 2 percent chance. Is your tolerance of people advocating for door number 1 as low as it would be if there were only two doors and door number 2 had a 99 percent chance of being correct?
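To put rough numbers on that intuition (my sketch, not anything Cohen computes): compare how much more likely your door is than a rival's door in each scenario.

```python
# How many times more likely is my door than yours?
def odds_ratio(p_mine: float, p_yours: float) -> float:
    return p_mine / p_yours

# 100 doors: mine at 2%, a rival's at roughly 1% (the remaining 98%
# spread over 99 doors). I am barely more likely to be right than they are.
print(odds_ratio(0.02, 0.01))  # ~2

# 2 doors: mine at 99%, the rival's at 1%. Here intolerance is cheap.
print(odds_ratio(0.99, 0.01))  # ~99
```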
Of course, it's possible that there are a lot more "2 doors with an obvious answer" problems than "100 doors with a tricky answer" problems in the world. I wouldn't bet on life being simple, though.