The Spectre of Math

November 26, 2014

Law of large numbers: idiots, monkeys, CEOs, politicians, and nuclear war

Filed under: Economics,Mathematics,Politics — jlebl @ 1:26 am

Something that seems to be ignored by many people is the law of large numbers.  Suppose you take an action that has a 99% rate of success.  That’s a 1% chance of failure.  Tiny!  Well, do it enough times, and failures will happen.  Given enough candidates, each with a 1% chance of winning, one of them will win.  Then everybody is surprised (but shouldn’t be).  Suppose that in each of the 435 congressional races, the leading candidate had, according to the polls, a 99% chance of winning, and the second candidate a 1% chance.  I would expect 4 or 5 of the underdogs to win.  If they didn’t, we were wrong about the 99%.
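
The congressional arithmetic above is easy to check.  A minimal sketch, using nothing beyond the post’s own numbers (435 races, a 1% underdog in each, treated as independent):

```python
# 435 races, each with an underdog who has a 1% chance of winning.
races = 435
p_underdog = 0.01

# Expected number of upsets, by linearity of expectation:
expected_upsets = races * p_underdog  # 4.35

# Probability that at least one underdog wins somewhere:
p_at_least_one = 1 - (1 - p_underdog) ** races

print(f"expected upsets: {expected_upsets:.2f}")
print(f"chance of at least one upset: {p_at_least_one:.1%}")
```

The expected count is 4.35, matching the “4 or 5” in the text, and the chance of seeing no upset at all is barely over 1%.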

Or how about entrepreneurs.  Suppose you take 100 idiots.  They each get a totally wacky idea for a new business that has a 1% chance of success.  One of them will likely succeed.  Was it because he was smart?  No, there were enough idiots.  We should not overvalue success if we do not know how many other similar people failed, and how likely success was.  What if you have a person who started two businesses that each had a 1% chance of success?  Was that person a genius?  Or did you just have 10,000 idiots?  You have surely heard that giving typewriters to monkeys will eventually (if you have enough monkeys and time) produce the works of Shakespeare.  Does this mean that Shakespeare was a monkey?  No.  There weren’t enough idiots (or monkeys) trying.  Plus the odds of typing random sentences, even grammatically correct ones, and ending up with something as good as Shakespeare are astronomically low.  Shakespeare was, with a very very very high degree of confidence, not a monkey.  I can’t say the same for Steve Jobs.  The chance of Jobs having been a monkey is still somewhat smaller than for your average successful CEO.  Think of the really important decisions that a CEO has to make; there aren’t that many.  If we simplified the situation and went simply with yes/no decisions on strategic things, there are only a few in the lifetime of a company.  Most decisions are irrelevant to the success, and they even out: make a lot of decisions that each cause a small relative change and you will likely end up where you started (again the law of large numbers).  But there are a few that can make or break a company.  Given how many companies go bust, clearly there are many, many CEOs making the wrong make-or-break decisions.  So suppose you hired a CEO, he made a decision to focus on a certain product and drop everything else, and you made it big.  Does that mean your CEO was a genius?  Flipping a coin gives you a 50% chance of success too.
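
Both numerical claims in that paragraph check out.  A quick sketch, assuming the ventures are independent:

```python
# 100 idiots, each with a 1%-chance business idea.
n = 100
p = 0.01

# Chance that at least one of the 100 succeeds:
p_at_least_one = 1 - (1 - p) ** n
print(f"at least one success: {p_at_least_one:.1%}")  # about 63%

# The two-business version: succeeding twice has probability 0.01**2,
# so among 10,000 such founders you expect about one double success.
p_double = p ** 2
print(f"expected double successes among 10,000: {10_000 * p_double:.2f}")
```

So “one of them will likely succeed” is fair (roughly a 63% chance of at least one winner), and one double winner among 10,000 random founders is exactly what you’d expect.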

Same with stock traders.  Look and you will find traders whose last 10 bets were huge successes.  Does that mean they are geniuses?  Or does it simply mean there are lots of stock traders making fairly random decisions, so some of them must be successful?  If there are enough of them, there will be some whose last 10 bets were good.  If it was 10 yes/no decisions, then you just need 1024 idiots for one of them to get all of them right.  They don’t have to know anything.  Let’s take a different example: suppose you find someone who, out of a pool of 100 stocks, has picked the most successful one each year for the last 3 years.  This person can be a total and complete idiot, as long as there were a million people making those choices.  The chance of that person picking the right stock this year is most likely 1 in 100.  Don’t believe people trying to sell you their surefire way to predict the stock market, even if they are not lying about their past success.
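
Both of those numbers follow directly.  A sketch of the arithmetic, again assuming purely random, independent picks:

```python
# A perfect record on 10 random yes/no calls has probability (1/2)**10.
p_perfect = 0.5 ** 10          # 1/1024
traders = 1024
expected_perfect = traders * p_perfect            # exactly 1
p_at_least_one = 1 - (1 - p_perfect) ** traders   # about 63%
print(f"expected perfect records: {expected_perfect:.0f}")
print(f"chance of at least one: {p_at_least_one:.1%}")

# Picking the best of 100 stocks three years running at random:
p_streak = (1 / 100) ** 3      # one in a million
pickers = 1_000_000
print(f"expected 3-year streaks among a million pickers: "
      f"{pickers * p_streak:.2f}")
```

Among 1024 coin-flipping traders you expect exactly one flawless 10-bet record, and among a million random stock pickers you expect about one “three-time champion” — no skill required in either case.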

OK.  A more serious example of the law of large numbers: Suppose your country undertakes a military operation that has a 99% chance of success and a 1% chance of dooming your country.  Suppose your country keeps doing this.  Each time, it seems completely safe.  Yet eventually your country will lose.  Start enough wars, even with overwhelming odds in your favor, and your luck will run out.  Statistically that’s a sure thing.  If you want your country to be around in 100 years, do not do things that have even an ever so tiny chance of backfiring and dooming that country to failure.  You can probably guess the (rather longish) list of countries I am thinking of, which with good odds won’t be here in 100, 200, or 500 years.
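
The decay is worth seeing in numbers.  A minimal sketch, treating each 1%-chance-of-doom gamble as independent:

```python
# Surviving n independent gambles, each with a 1% chance of doom,
# has probability 0.99**n, which decays toward zero.
for n in (10, 50, 100, 500):
    print(f"after {n:3d} gambles: {0.99 ** n:.1%} chance of survival")
```

After 100 such “completely safe” operations the survival odds are only about 37%, and after 500 they are under 1%.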

Let’s end on a positive note:  With essentially 100% probability, humankind will eventually destroy itself with nuclear (or other similarly destructive) weapons.  There seem to be conflicts arising every few decades that have a chance of deteriorating into nuclear war.  A small chance, but a positive one.  Since that seems likely to repeat itself over and over, eventually one of those wars will go nuclear.  It can’t quite be 100%, since there is a chance that we will all die in some apocalyptic natural disaster (possibly of our own making) before any nuclear war.  After all, there is also a small chance that everybody on earth gets a heart attack at the same exact time.  Even if we make sure we don’t do anything else dangerous (such as keeping nuclear weapons), civilization will end one day in a massive simultaneous heart attack.
