There’s an obsession with examining what works better than something else, whether it’s best practices, methodologies, the logic by which money is made (a business model), the kinds of services we want to offer, and so on. This obsession may be driven by the people who implemented one of those things and survived.

To explain how viewing others’ success with caution can open up different ways of thinking, I chose to highlight the story of Abraham Wald as told by Jordan Ellenberg in his book “How Not To Be Wrong: The Power of Mathematical Thinking”.

Abraham Wald was born in what is now Cluj, in Romania, in the year 1902. A mathematician, Wald finished his studies at the University of Vienna by 1930.

To make a long story short, Wald worked for a classified program during the Second World War: the SRG (Statistical Research Group). In the words of its director, W. Allen Wallis, never had there been so much statistical talent in one place. That place was an office a few streets away from Columbia University in New York.

That office housed some famous people, such as:

- **Frederick Mosteller**, who would later found the statistics department at Harvard University.
- **Leonard Savage**, a pioneer of decision theory. Savage was practically blind.
- **Norbert Wiener**, the father of cybernetics.

Getting to the point of this story: the SRG received the most extravagant and complex questions one could imagine. One of them is the highlight of this article.

The United States air forces sent the aforementioned statistics team the following problem: “We don’t want the enemy shooting down any more of our planes, so we could reinforce them with armor, but doing so makes the machines heavier, less maneuverable, and more fuel-hungry. Neither too much reinforcement nor too little is a valid option.”

The SRG was asked to find the optimal point. In addition, it was provided with data detailing the number of bullet holes the planes had when returning from their missions over Europe.

Wald himself gave the answer to the problem: “The reinforcement doesn’t go where the bullet holes are, but where they aren’t.”

The same answer, seen from another angle, is really a question: where are the missing bullet holes on the planes that never returned? The analogy is looking only at the records of patients recovering in a hospital: the ones who aren’t there are missing because they never had a chance to recover.

From a mathematical point of view, the idea is that the probability that a combat plane returns after being machine-gunned in its engine is close to zero, so engine hits are almost entirely absent from the data on returning planes. Wald’s approach would persist through the following wars (Korea, Vietnam).

To mathematicians, the problem is known as “survivorship bias”, and it usually appears disguised in different contexts. A simple explanation of the bias is our tendency to focus on the survivors and not on those who perished. After every event that has survivors, the non-survivors are destroyed or removed from the equation.
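The effect can be made concrete with a small simulation. The sketch below is hypothetical, not Wald's actual model: it assumes planes take a few hits uniformly at random in either the engine or the fuselage, with made-up survival probabilities under which engine hits are far more lethal. Counting holes only on the planes that return then makes the fuselage look like the dangerous area, even though both sections are hit equally often.

```python
import random

random.seed(42)

def simulate(num_planes=10_000, hits_per_plane=3):
    """Count bullet holes by section, once over all planes and once
    over only the planes that make it back.

    The survival probabilities are invented for illustration: an
    engine hit downs the plane 80% of the time, a fuselage hit only
    5% of the time.
    """
    survival_after_hit = {"engine": 0.2, "fuselage": 0.95}
    observed = {"engine": 0, "fuselage": 0}  # holes on returning planes
    actual = {"engine": 0, "fuselage": 0}    # holes on all planes

    for _ in range(num_planes):
        hits = [random.choice(["engine", "fuselage"])
                for _ in range(hits_per_plane)]
        for section in hits:
            actual[section] += 1
        # The plane returns only if it survives every hit it took.
        returned = all(random.random() < survival_after_hit[s] for s in hits)
        if returned:
            for section in hits:
                observed[section] += 1
    return observed, actual

observed, actual = simulate()
# Hits are dealt evenly between sections, yet the returning planes
# show far more fuselage holes than engine holes: the engine hits
# "disappear" along with the planes that never came back.
print("observed on returning planes:", observed)
print("actual across all planes:   ", actual)
```

An analyst who saw only the `observed` counts would armor the fuselage; Wald's insight is that the scarcity of engine holes in the surviving sample is exactly the evidence that engine hits are fatal.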

In business, this logical error is repeated frequently. We look only at the models that succeeded in the past, and we examine the difficulties they faced so we don’t make the same mistakes. We leave the non-surviving businesses behind, which keeps us from considering which ones they were and the reasons they didn’t work.

Wald, as you can imagine, told the same story, but with equations.