Hubris
We decide based on the amount and the quality of the information we possess. The easiest case is when the outcome is already known (the coin flip came up heads, now what?); there is nothing to predict, the event has already occurred. The next case is when we do not know the outcome but we do know its probabilities (this is a biased coin, with a 55% chance of heads). We know how to solve this type of problem, and we call it risk management. The harder case is when not even the probabilities are known (oops, this coin might even have two identical faces...). The literature usually calls this uncertainty, which is a different, uglier beast than risk. And humans do not like uncertainty.

We try to solve uncertainty problems with what we know about risk, by simplifying our models -- and possibly forgetting that we did so -- until we can analyse the problem and compute a solution with the tools (like statistics) and assumptions (like normality) we understand. Sometimes our assumptions are not far off, and our decisions are good approximations/predictions of what happens. Sometimes we miss the mark and the chosen action ends in disaster. Black swans, an expression popularized by Nassim Taleb, are examples of this: we assume that extraordinary events are impossible, and one of them occurs nonetheless. This is the price of recklessness or laziness or both; the hubris of assuming too much.
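To make the coin story concrete, here is a minimal Python sketch (the function names, stakes and alternative biases are illustrative assumptions; only the 55% figure comes from the text above): it computes the value of an even-money bet on heads under an assumed 55% bias, the risk case, and then simulates the same bet when the coin's true bias is not what we modelled, the uncertainty case.

```python
import random

def expected_value(p_heads: float, stake: float = 1.0) -> float:
    """Expected profit per flip of an even-money bet on heads, given win probability p_heads."""
    return p_heads * stake - (1.0 - p_heads) * stake

def simulate(true_p_heads: float, stake: float = 1.0, flips: int = 10_000, seed: int = 0) -> float:
    """Average realised profit per flip when the coin's *true* bias is true_p_heads."""
    rng = random.Random(seed)
    profit = sum(stake if rng.random() < true_p_heads else -stake for _ in range(flips))
    return profit / flips

assumed_p = 0.55                      # risk: we model a coin we believe is 55% heads
print(expected_value(assumed_p))      # +0.10 per flip, so under the model the bet looks good

# uncertainty: the coin in our hand may not match the model at all
for true_p in (0.55, 0.50, 0.45):     # hypothetical true biases, for illustration only
    print(true_p, simulate(true_p))   # the "good" decision degrades into a steady loss
```

Under the assumed bias the bet earns about 0.10 per flip; if the true bias is 0.45 it loses about as much. The arithmetic is trivial, which is the point: the danger is not in the computation but in forgetting that the 55% was an assumption.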