




* He died prematurely of cancer at 59 in 1996. Tversky was on the board of the Dreman Foundation, which is dedicated to the study of investor and economic decision-making.

will be affected by all the new technology. If I don't use it I'll be at a competitive disadvantage." Technology has produced such a sea of data that analysts are foundering in information. The data glut threatens to swamp, rather than enhance, investment decisions.

As the Morgan analyst continued, extracting useful information from the 49 databases the bank subscribes to is like finding a needle in a haystack. "The more data you get, the less information you have," he groans. His intuition coincides with the psychological findings. One of the effects of dealing with vast amounts of information, which we touched on in chapter 4, is the expert's mistaken belief that he can make accurate forecasts, despite the difficulties we have examined.

Increased information, as was demonstrated, does not lead to increased accuracy. A large number of studies in cognitive psychology indicate that human judgment is often predictably incorrect. Nor is overconfidence unique to analysts. People in situations of uncertainty are generally overconfident based on the information available to them; they usually believe they are right much more often than they are.

Amos Tversky, one of the leading figures in cognitive psychology, researched expert overconfidence in the stock market.* According to Tversky, "In one study, analysts were asked such questions as what is the probability that the price of a given stock will exceed $X by a given date. On average, analysts were 80% confident but only 60% accurate in their assessments."* The study was repeated numerous times.

In other studies, analysts were asked for their high and low estimates for the price of a stock. The high estimate was to be a number they were 95% sure the actual price would fall below; the low estimate was the price they were 95% sure the stock would remain above. Thus, the high and low estimates should have included 90% of the cases, which is to say that, if the analysts were realistic and unbiased, the number of price movements above and below this range would be 10%. In fact, the estimates missed the range 35% of the time, or three and a half times as often as the forecasters estimated.
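The calibration gap described above can be illustrated with a small simulation. This is not the original study; the spread of outcomes and the width of the quoted interval are assumed values, chosen so that a nominally 90% interval misses about 35% of the time, matching the figures in the text.

```python
import random

random.seed(0)

# Illustrative simulation (assumed parameters, not the study's data):
# a forecaster quotes a "90% interval" for each price move, but the
# interval is too narrow relative to the true spread of outcomes --
# the signature of overconfidence.
N = 100_000
true_sigma = 1.0          # actual spread of outcomes (assumed)
quoted_half_width = 0.93  # forecaster's half-width (assumed; a truly
                          # calibrated 90% interval would use ~1.645)

misses = 0
for _ in range(N):
    outcome = random.gauss(0.0, true_sigma)
    if abs(outcome) > quoted_half_width:
        misses += 1

# Nominal miss rate is 10%; the empirical rate comes out near 35%,
# three and a half times larger.
print(f"nominal miss rate: 10%, empirical: {100 * misses / N:.0f}%")
```

The point of the sketch is only that a modestly too-narrow interval produces a dramatically inflated miss rate; the forecaster's stated confidence and his actual hit rate come apart exactly as the studies report.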

Tversky went on to note that "rather than operating on rational expectations" (with total logic, unaffected by behavior, as efficient market theory assumes investors do), "people are commonly biased in several directions: They are optimistic; they overestimate the chances that they will succeed, and they overestimate their degree of knowledge, in the sense that their confidence far exceeds their hit rate."



Mr. Inside and Mr. Outside

We have seen that forecasters in many professions are far too confident of their estimates, and that the forecasts are too optimistic. Both can be deadly in the marketplace. Research indicates some of the psychological reasons these patterns occur so repeatedly.

Tversky was queried about overconfidence at an investment behavioral conference in 1995. The questioner asked what he thought of the fact that analysts were not very good at forecasting future earnings. He responded, in part, "From the standpoint of the behavioral phenomena ... analysts should be more skeptical of their ability to predict [earnings] than they usually are. Time and time again, we learn that our confidence is misplaced, and our overconfidence leads to bad decisions, so recognizing our limited ability to predict the future is an important lesson to learn." [Italics mine.]**

Tversky was asked at the same conference if analysts and other professional investors learn from their experiences. He replied that, "unfortunately cognitive illusions are not easily unlearned. ... The fact that in the real world people have not learned to eliminate ... overconfidence speaks for itself."* An earlier study by Baruch Fischhoff showed that even when experts are warned about the existence of this problem, forecasters appear unable to adjust for its effects.* This explains why neither analysts nor investors benefit from the findings that earnings estimates cannot be made with precision, and therefore continually suffer the consequences of high forecasting errors.

These findings apply to many other fields. A classic analysis of cognitive psychologists found that it was impossible to predict which psychologists would be good diagnosticians. Further, there were no mechanical forecasting models that could be continuously used to improve judgment. It concluded the only way to resolve the problem was to look at the record of the diagnostician over a substantial period of time.

Researchers have also shown that people can maintain a high degree of confidence in their answers, even when they know the "hit rate" is not very high. The phenomenon has been called "the illusion of validity."* This also helps to explain the belief that analysts can pinpoint their estimates despite the strong evidence to the contrary. People make confident predictions from incomplete and fallible data. There are excellent lessons here for the stock forecaster.



Daniel Kahneman, who for several decades coauthored many important scholarly pieces with Amos Tversky, wrote on this subject in collaboration with Dan Lovallo.*

Forecasters are "excessively prone" to treat each problem as unique, paying no attention to history. Cognitive psychologists note there are two distinct methods of forecasting. The first is called the "inside view." This method is the one overwhelmingly used to forecast earnings estimates and stock prices. The analyst or the stock forecaster focuses entirely on the stock and related aspects such as growth rates, market share, product development, the general market, the economic outlook, and a host of other variables.

The "outside view," on the other hand, ignores the multitude of factors that go into making the individual forecast, and focuses instead on the group of cases believed to be most similar. In the case of earnings estimates, for example, it would zero in on how accurate earnings forecasts have been overall, or how accurate they have been for a specific industry or for the company itself, in deciding how precisely the analyst can estimate and the reliance that can be placed on the forecast.

If the stock market forecaster is to succeed using the inside view, he must capture the critical elements of the future. The outside view, in contrast, is essentially statistical and comparative, and does not attempt to read the future in any detail.
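The outside view described above can be sketched in a few lines of code. The numbers below are entirely hypothetical, used only to show the mechanics: instead of modeling the company in detail, the forecaster consults the historical distribution of forecast errors for comparable cases and reports a range rather than a pinpoint figure.

```python
# Hypothetical past percentage errors of earnings forecasts for
# comparable companies (assumed data, for illustration only).
past_errors = [-0.42, -0.18, -0.09, 0.03, 0.07,
               0.12, 0.21, 0.35, 0.48, 0.66]

point_estimate = 2.00  # the analyst's "inside view" EPS forecast (assumed)

def percentile(sorted_vals, p):
    """Nearest-rank percentile of an already-sorted list."""
    k = round(p * (len(sorted_vals) - 1))
    return sorted_vals[max(0, min(len(sorted_vals) - 1, k))]

errors = sorted(past_errors)

# The outside view replaces the single number with a range implied by
# how wrong similar forecasts have been in the past.
low = point_estimate * (1 + percentile(errors, 0.10))
high = point_estimate * (1 + percentile(errors, 0.90))
print(f"outside-view range: {low:.2f} to {high:.2f} "
      f"around the point estimate of {point_estimate:.2f}")
```

The resulting range is statistical and comparative, in keeping with the outside view: it makes no attempt to read the company's future in detail, only to respect the track record of forecasts like this one.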

Kahneman relates a story to demonstrate the difference. In the mid-1970s, he was involved with a group of experts in developing a curriculum on judgment and decision-making under uncertainty for high schools in Israel. When the team had been in operation for a year and had made some significant progress, discussion turned to how long it would take to complete the project. Everyone in the group, including Kahneman, gave an estimate. The forecasts ranged from 18 to 30 months. Kahneman then asked one of his colleagues, an expert in curriculum development, to think of similar projects he was familiar with, at a parallel stage in time and development. "How long did it take them from that point to complete their projects?" he asked.

After a long pause, the expert replied with obvious discomfort that, first of all, about 40% of the projects were never completed. Of the balance, he said, "I cannot think of any that was completed in less than seven years, nor any that took more than ten." Kahneman then asked if there were any factors that made this team superior in attempting the task. None, said the expert. "Indeed we are slightly below average in terms of our resources and our potential." As experienced as he was with


