Thinking Fast and Slow

Parts of this book by Daniel Kahneman were interesting reading, and other parts were too much about psychological research to hold my attention. You will enjoy the book if you are a fan of heuristics, which the Merriam-Webster Dictionary defines as "involving or serving as an aid to learning, discovery, or problem-solving by experimental and especially trial-and-error methods." The book mentions that "applying a rule of thumb" is another way of describing heuristic thinking. The book makes a strong point that we are comfortable relying on our intuition to reach a quick decision about a problem or situation, and that decision is often wrong. The dust cover of the book "…explains the two systems that drive the way we think. System 1 is fast, intuitive, and emotional; System 2 is slower, more deliberative, and more logical." There are numerous examples of how our System 1 analysis often misleads us. The conclusion includes the observation that when you have an overconfident intuition that you are thinking correctly, you should "…recognize signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2."

I have the feeling I should have forced myself to read the passages I found tedious more carefully. Much of what I did read with interest provided all sorts of warnings that, if heeded, will lead to better decisions. Chapter 20, "The Illusion of Validity," describes two completely different situations that I found equally interesting. The first is about how the author was fulfilling his Israeli Army obligation by evaluating candidates for officer training. Certain difficult tasks were assigned (such as eight candidates lifting a heavy log over a barricade) to observe who was a leader, cooperative, stubborn, submissive, arrogant, patient, hot-tempered, persistent, or a quitter. The evaluators were confident of their analysis of the various candidates, yet the evaluations proved to have virtually no value in predicting the eventual performance of a person as an officer in real-world military situations. The author firmly believed in his evaluations, but was shocked at how often he was wrong. He calls this "cognitive illusion." I should admit that this example was of interest to me because I was a participant in similar exercises while I was in the U.S. Army Infantry Officer Candidate School a few decades ago.

The other primary subject in Chapter 20 that is important to me today describes the ability of "market analysis experts" to make decisions about where money should be invested or when investments should be sold. There is a huge financial industry built on the illusion that there are skilled analysts who can correctly predict the performance of stocks, and the book calls this the "illusion of skill." Billions of stock shares are traded daily. Some are sold by people who are convinced that the full or nearly full value of the stocks has been reached, while the buyers are convinced they are getting a bargain. A study of 10,000 trading accounts reached the conclusion that doing nothing would have, on average, been a better investment strategy than the selling and buying of most traders. The author was asked to evaluate the performance of twenty-five "wealth advisors." Bonuses were given to each year's high performers, yet it was found that there was virtually no correlation between an advisor's results in one year and the next over the full eight years. "The results resembled what you would expect from a dice-rolling contest, not a game of skill." Two of three mutual funds managed by the "very best stock pickers" (which charge higher fees than the passively managed baskets of stocks in Exchange-Traded Funds) underperform the overall market.
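The "dice-rolling contest" claim can be illustrated with a small simulation of my own (not from the book): if twenty-five advisors' annual results were driven by pure luck, the correlation between any two years' results would hover near zero, which is essentially what Kahneman reported finding in the real data.

```python
import random

random.seed(42)
ADVISORS, YEARS = 25, 8

# Each advisor's "performance" in each year is an independent random draw,
# i.e. pure luck with no persistent skill component.
results = [[random.gauss(0, 1) for _ in range(ADVISORS)] for _ in range(YEARS)]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Average the correlation over every pair of years.
pairs = [(i, j) for i in range(YEARS) for j in range(i + 1, YEARS)]
avg_corr = sum(pearson(results[i], results[j]) for i, j in pairs) / len(pairs)
print(f"average year-to-year correlation: {avg_corr:.3f}")  # near zero
```

If skill mattered, the same advisors would tend to rank high year after year and the average correlation would be clearly positive; random draws keep it near zero.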

I was relieved that the author did not advise people to ignore signals from a sixth sense as another example of when they should dismiss the messages of intuition. He gives the example of a firefighter who "sensed" something was wrong and ordered everyone out just before the floor collapsed. The firefighter realized later that he had registered several cues (the noise of the fire had grown quiet while his ears had gotten quite hot) that led him to sense a disaster was about to happen.

An interesting demonstration that people can become effectively blind when intensely focusing comes from the book "The Invisible Gorilla," by Christopher Chabris and Daniel Simons. People are told to watch a film of two teams of basketball players, one team wearing white shirts and the other black. Those being tested are told to count the passes made by players wearing white shirts and ignore passes made by those wearing black. "Halfway through the video, a woman wearing a gorilla suit appears, crosses the court, thumps her chest, and moves on." The gorilla is in view for 9 seconds, but about half of the thousands who have watched the video do not notice it. Another example of the impact of concentration: while walking with a friend, ask them to calculate 23 × 78 in their head. The friend will invariably stop walking as they begin concentrating. Another common example is to tell someone to avoid thinking of white bears.

A disturbing discussion concerns "depletion effects in judgment" and the measured fact that restoring sugar to the brain prevents deterioration of performance. Eight parole judges in Israel spent about 6 minutes on each case review. The proportion of paroles granted was about 65% immediately after a food break and dropped to nearly zero for the reviews performed just before the next meal. Make certain the decision makers have just had a meal if you are ever up for parole!

People reading the question, "How many animals of each kind did Moses take into the ark?" seldom see anything wrong with it. The number of people who detect what is wrong with this question is so small that the error has been dubbed the "Moses Illusion." It was Noah, not Moses, who took animals into the ark. But people think of animals going into the ark in a biblical context, and Moses is not out of place in that context.

There is an interesting warning about resistance to stereotyping and the opposition to profiling. The author observes that opposition is a laudable moral position, but it comes with a cost. “The costs are worth paying to achieve a better society, but denying that the costs exist, while satisfying to the soul and politically correct, is not scientifically defensible.”

The author also made me want to finally read Nassim Taleb's book "The Black Swan." I've read about it often and had it recommended to me, so I'm going to put it on my list. Taleb is described as the "trader-philosopher-statistician" who introduced the "narrative fallacy" to describe how flawed stories of the past shape our views of the world and our expectations for the future. He says we constantly fool ourselves "by constructing flimsy accounts of the past and believing they are true." After that warning, the story of the Google founders is told, and it certainly tempts you to believe there were no fallacies in their thinking. They came up with a superior way of searching information on the Internet, started a company, and made many good decisions. They were also lucky: about a year after founding Google they were willing to sell the company for less than a million dollars, and the prospective buyer declined because the price seemed too high. Unfortunately, there are many stories of smart innovators who became hapless competitors of Google. Reading the Google story makes you feel you've learned valuable lessons in what makes a company succeed. However, "…your sense of understanding and learning from the Google story is largely illusory."

Another warning brings to mind the new Michael Lewis book about the "flash crash" on Wall Street and the indictment of high-frequency trading. The author says the sad truth is that simple algorithms are often superior to expert judgment. Experts try to be clever, weighing complex combinations of factors in their evaluations. Algorithms aren't clever; they merely produce results by applying the same simple statistical rules consistently.

Chapter 30, "Rare Events," gives a chilling insight into the power of terrorism. There were 23 suicide bombings on buses in Israel between December 2001 and September 2004, causing 236 fatalities. There was an average of 1.3 million daily bus riders during that period. The author knew the risk was statistically tiny, and that he was at much greater risk of being injured in an accident while driving his car, but he became very nervous if he had to stop his car beside a bus, and he avoided buses because he wanted to think about something else. It illustrates why terrorism is so effective.
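The numbers in that chapter make the point vividly. Here is my own back-of-the-envelope calculation (the period length of roughly 1,000 days is my approximation, not a figure from the book):

```python
# Rough per-ride risk during the bombing period described in Chapter 30.
fatalities = 236            # from the book
daily_riders = 1_300_000    # from the book
days = 1_000                # approx. Dec 2001 - Sep 2004 (my estimate)

total_rides = daily_riders * days          # about 1.3 billion rides
risk_per_ride = fatalities / total_rides   # on the order of 2 in 10 million
print(f"roughly 1 fatality per {total_rides // fatalities:,} bus rides")
```

Even under this crude estimate, the per-ride risk is on the order of one in several million, far below everyday driving risks, yet the fear it produced was enough to change the author's behavior.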

I recommend the book for all its interesting stories and observations. I will not suggest that it is a fast read or that all of it is riveting. However, it is certainly worthwhile if you have plenty of time to invest.

One thought on "Thinking Fast and Slow"

  1. If you are thinking of diving into Taleb, I recommend that you start with “Fooled by Randomness”. I think it is a little more edifying.
