Andrew Cohen says some weird things about uncertainty:
We are, indeed, fallible. I don’t think anything follows from this with regard to toleration (see Chapter 7, section E). It is perfectly reasonable to think toleration is a value while recognizing one’s own fallibility. One may be wrong, but to say one thinks X is to say, “given all else I know, I think X and I will maintain X until shown that X is false.” As Joseph Schumpeter said “To realise the relative validity of one’s convictions and yet stand for them unflinchingly, is what distinguishes a civilized man from a barbarian.” Indeed, it seems entirely natural to be willing to stand for one’s beliefs unflinchingly, recognizing one’s judgments may nonetheless be wrong.
To state what is perhaps obvious, it is perfectly possible to hold the belief you think is most likely to be correct without thinking it is likely to be correct. For example, say a delicious torta could be behind one of three doors, and you happen to think there is a 49 percent chance it is behind door one, a 30 percent chance it is behind door two, and a 21 percent chance it is behind door three. In that situation you might believe that you should open door one, but you would also think that the torta is probably not behind door one. This is one way to hold a belief without being confident you are correct.
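The gap between "most likely option" and "likely to be correct" can be made concrete in a few lines. A minimal sketch, using the door probabilities from the example above (the variable names are mine):

```python
# Three doors, one torta. Door one is the single most likely location,
# yet the torta is probably NOT behind it (49 percent < 50 percent).
probs = {"door one": 0.49, "door two": 0.30, "door three": 0.21}

best_door = max(probs, key=probs.get)  # the belief you should act on
p_best = probs[best_door]              # your confidence in that belief

print(best_door)       # door one: the single best guess
print(p_best > 0.5)    # False: the best guess is still probably wrong
```

The point of the sketch is just that `max` and "more likely than not" come apart whenever probability is spread across three or more options.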
But we could have even more options: maybe not opening any doors is a possibility, and if you wait two days the torta will simply be given to you. Now we have to weigh our preferences: a chance of a torta now against a certain torta later. Here we may say, "I prefer to wait," and firmly believe it, while acknowledging that it's all a matter of taste. This is another way to hold a belief without being confident you are correct: acknowledging that you are judging by subjective criteria that vary from person to person.
Lastly, you could simply not know what other people know. You've conducted some sort of investigation and come up with some sort of probability, but are you confident telling someone else to pick your door if they've conducted a separate investigation? Maybe they've peeked behind door three! Maybe they have a better sense of smell! This is another reason to hold a belief but not be confident in it: it may be the best belief based on what you know, but you may not know enough.
Cohen seems to say, if I am reading him right, that knowing your beliefs could be wrong shouldn't make you any more tolerant of other beliefs. I'm not so sure. Imagine 100 doors, and you think door number two is most likely to hold the prize, at a 2 percent chance. Is your tolerance of people advocating for door number one as low as it would be if there were only two doors and door number two had a 99 percent chance of being correct?
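One way to see why the two cases feel different is to ask how much more likely your door is than a rival's. A rough sketch, using the post's numbers plus one assumption of mine: in the 100-door case I give the rival's door a 1 percent chance, roughly the average share of the remaining probability:

```python
# Easy case: 2 doors, yours at 99 percent, the rival's at 1 percent.
easy_mine, easy_rival = 0.99, 0.01
# Hard case: 100 doors, yours at 2 percent, a rival's at (assumed) 1 percent.
hard_mine, hard_rival = 0.02, 0.01

# How many times more likely is your door than the rival's?
print(round(easy_mine / easy_rival))  # 99: the rival view looks hopeless
print(round(hard_mine / hard_rival))  # 2: the rival view is nearly as good
```

On this framing, tolerance tracks the ratio, not just which door comes out on top: in the hard case the advocate for door number one is barely worse positioned than you are.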
Of course, it's possible that there are a lot more "2 doors with an obvious answer" problems than "100 doors with a tricky answer" problems in the world. I wouldn't bet on life being simple, though.