Mistakes come in two forms. The first is a false alarm, in which you overestimate the likelihood that an event will occur; the second is an oversight, in which you underestimate the likelihood that the event will occur. Suppressing the probability of an oversight makes a false alarm more likely, and vice versa. There are plenty of examples; I'll give just three (and a small simulation after the list):
- Statisticians distinguish between "type one" errors, rejecting a null hypothesis that is actually true, and "type two" errors, failing to reject a null hypothesis that is actually false. If the null hypothesis is that a given event will not happen, then type one errors can be thought of as false alarms, and type two errors as oversights.
- A lifeguard can choose to pay less attention to each momentary dip underwater, and thus lower his stress from false alarms. But he inevitably does so at the cost of increasing the risk of an oversight--not noticing when someone is underwater for too long.
- Rhodopsin switches conformational states in response to photon exposure. We can think of a false alarm as rhodopsin changing states even though no photon has hit it, and an oversight as rhodopsin failing to switch states despite photon exposure. Evolution seems to have selected strongly for minimizing false alarms rather than minimizing oversights. (That is, oversights still occur ~30% of the time; see here)
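To make the trade-off concrete, here is a minimal sketch (not from any of the examples above; every number in it is invented for illustration) of a threshold detector: it flags an "event" whenever a noisy reading exceeds some cutoff. Raising the cutoff suppresses false alarms but produces more oversights, and lowering it does the reverse--the same tension as the statistician's type one versus type two errors.

```python
# Hypothetical threshold detector: decide "event" if a noisy reading exceeds a cutoff.
# All parameters (means, noise level, thresholds) are made-up illustration values.
import random

random.seed(0)

N = 100_000
NOISE_ONLY_MEAN = 0.0   # typical reading when no event occurs
EVENT_MEAN = 1.0        # typical reading when an event occurs
SIGMA = 1.0             # measurement noise

def rates(threshold):
    """Return (false_alarm_rate, oversight_rate) for a given decision threshold."""
    # False alarm: no event, but the reading still crosses the threshold.
    false_alarms = sum(
        random.gauss(NOISE_ONLY_MEAN, SIGMA) > threshold for _ in range(N)
    ) / N
    # Oversight: an event occurred, but the reading stayed below the threshold.
    oversights = sum(
        random.gauss(EVENT_MEAN, SIGMA) <= threshold for _ in range(N)
    ) / N
    return false_alarms, oversights

for t in (-0.5, 0.0, 0.5, 1.0, 1.5):
    fa, ov = rates(t)
    print(f"threshold={t:+.1f}  false alarms={fa:.3f}  oversights={ov:.3f}")
```

Running it, you can watch one error rate fall as the other climbs; you can move the threshold around, but you can't push both kinds of mistake to zero at once.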
(Above photo credit goes wholly to flickr user Abhijit Patil)