Advice from “successful entrepreneurs” might be unreliable due to Survivor Bias. What’s real, and what’s random?

Do you read articles written by a founder who failed three times, never finding success?
No, because you want to learn from success, not hear “lessons learned” from someone who hasn’t yet learned those lessons themselves.
However, the fact that you are learning only from success is a deeper problem than you imagine.
A few stories will expose the extent of this fallacy.
Bullet holes: A brain teaser

During World War II the English sent daily bombing raids into Germany. Many planes never returned; those that did were often riddled with bullet holes from anti-air machine guns and German fighters.
Wanting to improve the odds of getting a crew home alive, English engineers studied the locations of the bullet holes. Where the planes were hit most, they reasoned, is where they should attach heavy armor plating. Sure enough, a pattern emerged: Bullets clustered on the wings, tail, and rear gunner’s station. Few bullets were found in the main cockpit or fuel tanks.

The logical conclusion is that they should add armor plating to the spots that get hit most often by bullets. But that’s wrong.
Planes with bullets in the cockpit or fuel tanks didn’t make it home; the bullet holes in returning planes were “found” in places that were by definition relatively benign. The real data is in the planes that were shot down, not the ones that survived.
This is a literal example of “survivor bias”—drawing conclusions only from data that is available or convenient and thus systematically biasing your results.
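The effect is easy to reproduce with a toy simulation. The zones, hit counts, and lethality numbers below are invented for illustration, not taken from the actual WWII data; the point is only that hits are spread evenly across all planes, yet the holes you can count on the returners cluster away from the deadly spots.

```python
import random

random.seed(0)

# Hypothetical model: every plane takes three hits, uniformly across four
# zones, but hits to the cockpit or fuel tank usually bring the plane down.
ZONES = ["wings", "tail", "cockpit", "fuel tank"]
LETHALITY = {"wings": 0.1, "tail": 0.1, "cockpit": 0.8, "fuel tank": 0.8}

actual = {z: 0 for z in ZONES}     # holes across *all* planes
observed = {z: 0 for z in ZONES}   # holes counted on *returning* planes only

for _ in range(10_000):
    hits = [random.choice(ZONES) for _ in range(3)]
    for z in hits:
        actual[z] += 1
    # the plane comes home only if every hit fails to be lethal
    survived = all(random.random() > LETHALITY[z] for z in hits)
    if survived:
        for z in hits:
            observed[z] += 1

print("all planes: ", actual)      # roughly uniform across zones
print("returners:  ", observed)    # cockpit / fuel-tank holes nearly absent
```

The engineers could only ever see the `observed` dictionary; the uniform `actual` distribution was lost with the downed planes.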
Doesn’t most business advice suffer from this fallacy? You read about successes, but what about the businesses that “never made it home”? Like the downed planes, could failure contain more lessons than success?
Burying the other evidence

Scientific journals favor extraordinary results, so studies whose results are statistically insignificant aren’t published. Rather, they are abandoned or silently stowed away in academic filing cabinets.
This practice is called the “file-drawer effect,” and it’s a particularly insidious form of survivor bias because it is invisible. Peter Norvig sums it up nicely:
When a published paper proclaims “statistically, this could only happen by chance one in twenty times,” it is quite possible that similar experiments have been performed twenty times, but have not been published.
Pharmaceutical companies have exploited this effect to skew results intentionally. It’s gotten so bad that journals are calling for a public database to prevent fraud:
More than two-thirds of studies of anti-depressants given to depressed children, for instance, found the medications were no better than sugar pills, but companies published only the positive trials.
If all the studies had been registered from the start, doctors would have learned that the positive data were only a fraction of the total.
—Washington Post
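Norvig’s one-in-twenty arithmetic can be checked directly. The sketch below is a hypothetical setup, not any real drug trial: a treatment with exactly zero effect is tested over and over, and only results clearing the conventional p &lt; 0.05 bar (|z| &gt; 1.96) count as “publishable.”

```python
import random
import statistics

random.seed(1)

def run_trial(n=30):
    """One null trial: treatment and control drawn from the same distribution."""
    treat = [random.gauss(0, 1) for _ in range(n)]
    ctrl = [random.gauss(0, 1) for _ in range(n)]
    diff = statistics.mean(treat) - statistics.mean(ctrl)
    # crude z-statistic on the difference of means
    se = (statistics.pstdev(treat) ** 2 / n + statistics.pstdev(ctrl) ** 2 / n) ** 0.5
    return diff / se

trials = 2000
published = sum(1 for _ in range(trials) if abs(run_trial()) > 1.96)
print(f"{published} of {trials} null trials look 'significant'")  # roughly 1 in 20
```

If only those `published` trials reach the journal, the drug looks effective even though it does nothing; the other nineteen-odd trials sit in the file drawer.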
Doesn’t most business advice suffer from this fallacy? Harvard Business School’s famous case studies include only success stories. To paraphrase Norvig, what if twenty other coffee shops had the same ideas, same product, and same dedication as Starbucks, but failed? How does that affect what we can learn from Starbucks’s success?
Experimental proof of ESP

Dr. Joseph Rhine brought the rigor of experimental psychology to the study of the paranormal, and ESP (Extra Sensory Perception) in particular. He made waves in the 1930s with controlled experiments testing whether a person was able to predict the order of the cards in a shuffled Zener deck (with symbols like circle, square, star, and wavy lines).
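Chance alone produces Rhine’s “telepaths.” In the toy simulation below (the threshold of 9 correct guesses is my assumption, chosen because it sits about two standard deviations above the chance average of 5 on a 25-card, 5-symbol deck), a crop of purely random guessers gets screened in, then melts away on the retest:

```python
import random

random.seed(2)

def score():
    """One pass through a 25-card Zener deck, guessing among 5 symbols at random."""
    return sum(random.randrange(5) == random.randrange(5) for _ in range(25))

people = [score() for _ in range(500)]
gifted = [p for p in people if p >= 9]        # "strong telepathic ability"
retest = [score() for _ in gifted]            # same people, fresh deck

still_gifted = sum(1 for s in retest if s >= 9)
print(f"{len(gifted)} screened in; {still_gifted} still look 'telepathic' on retest")
```

Nobody in this simulation has any ability at all, yet a second or third round of screening reliably leaves a tiny group who have scored well every time so far, purely by luck.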
In a typical experiment, 500 people are screened for “strong telepathic ability,” measured by significantly above-average performance in a 25-card deck. Those selected are tested again; most drop away. Tested a third time, perhaps one p