April 13, 2014 Jim’s Journal: LOL
WARNING: do NOT read this week’s LOL if you are ready for a good night’s sleep with your jammies on and a hot chocolate (or scotch) in your hand. The contents may be a little disturbing. I’ve just finished Nate Silver’s “The Signal and the Noise” and to me it is a must-read for all investment professionals. However, given the nature of the book—statistical forecasting—and its length—457 pages—many won’t read it. So, that’s where this journal comes in: I’ll try to give you the gems in just a few pages. But note the warning above: this is not feel-good reading for active investment managers whose business is to produce alpha. Silver makes it embarrassingly clear how strong the overconfidence bias is in all of us: “our bias is to think we are better at prediction than we really are.” And elsewhere in the book: “perhaps the central finding of behavioral economics is that most of us are overconfident when we make predictions.”
The title of the book—Signal and the Noise—refers to the goal of any predictive model which is to “capture as much signal as possible and as little noise as possible.” And the content of the book explains how hard that is!
My favorite example of overconfidence from the book concerns the economic forecasts made in November 2007, heading into the financial crisis of 2008. The Survey of Professional Forecasters said that GDP would grow about 2.4% in 2008, slightly below the long-term trend. As Silver writes, “This was a very bad forecast: GDP actually SHRANK by 3.3% once the financial crisis hit. What may be worse is that the economists were extremely confident in their bad prediction. They assigned only a 3% chance to the economy’s shrinking by any margin over the whole of 2008. And they gave it only about a 1-in-500 chance of shrinking by at least 2%, as it did.” Various surveys show that economic forecasts fall outside the 90% confidence interval 25% to 50% of the time. Silver summarizes, “There is almost no chance that the economists have simply been unlucky; they fundamentally overstate the reliability of their predictions.”
When Silver sought out a humble economist to reflect honestly on this dismal record, he chose Jan Hatzius (Goldman Sachs economist). Silver writes: “In November 2007, when most economists still thought a recession of any kind to be unlikely, Hatzius penned a provocative memo entitled ‘Leveraged Losses: Why Mortgage Defaults Matter.’ It warned of a scenario in which millions of homeowners could default on their mortgages and trigger a domino effect on credit and financial markets, producing trillions of dollars in losses and a potentially very severe recession—pretty much exactly the scenario that unfolded.” When asked by Silver why it is that economic forecasters tend to do so poorly, Hatzius responded, “Nobody has a clue.” When Silver probed deeper, by asking why it is that economic forecasters don’t provide confidence intervals for their predictions, Hatzius responded, “Because they’re embarrassed.” The huge confidence intervals around their predictions would indicate that they really don’t have any ability to forecast!
So what CAN investors do to get more accurate economic forecasts? Use the wisdom of crowds. Silver writes, “my research into the Survey of Professional Forecasters suggests that these aggregate forecasts are about 20% more accurate than the typical individual’s forecast at predicting GDP, 10% better at predicting unemployment, and 30% better at predicting inflation.”
And then the real gem for all of us: “This property—group forecasts beat individual ones—has been found true in almost every field in which it has been studied.”
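Silver’s crowd-averaging result is easy to demonstrate with a toy simulation. This is a sketch only: the error model below (each forecast equals the truth plus independent Gaussian noise) is an invented assumption for illustration, not Silver’s data. Averaging the guesses cancels much of the noise.

```python
import random
import statistics

random.seed(42)

TRUE_GDP = 2.0            # hypothetical "true" GDP growth rate, in percent
N_FORECASTERS = 40
N_ROUNDS = 500

individual_errors, crowd_errors = [], []
for _ in range(N_ROUNDS):
    # Assumed error model: forecast = truth + independent Gaussian noise.
    forecasts = [TRUE_GDP + random.gauss(0, 1.0) for _ in range(N_FORECASTERS)]
    crowd = statistics.mean(forecasts)          # the aggregate forecast
    individual_errors.append(statistics.mean(abs(f - TRUE_GDP) for f in forecasts))
    crowd_errors.append(abs(crowd - TRUE_GDP))

print(f"typical individual error: {statistics.mean(individual_errors):.2f}")
print(f"crowd (average) error:    {statistics.mean(crowd_errors):.2f}")
```

With independent errors the crowd’s error shrinks roughly with the square root of the number of forecasters; real forecasters’ errors are correlated, which is one reason the improvement Silver reports is a more modest 10% to 30%.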
As I read Silver, I was on the lookout for the topic of “conviction.” That is, the notion that analysts should show strong conviction for their buy and sell ideas. Many PMs have expressed that as a problem to be solved in their firm: we need these analysts to show more passion for their ideas. Sure enough, Silver has a view: “the amount of confidence [conviction] someone expresses in a prediction is not a good indication of its accuracy—to the contrary, these qualities are often inversely correlated.” Interesting that the investment leaders who prize objectivity—“emotions must be left at the door”—are often the same ones telling their analysts to display a lot of emotion about their buy/sell ideas. I have long agreed with Silver’s viewpoint: I would rather have a level-headed analyst lay out the probabilities of success than a wild-eyed, table-pounding one implore me to buy as much as I can now.
At one point, Silver condemns high conviction by saying, “when a prediction about a complex phenomenon [i.e. the stock market] is expressed with a great deal of confidence [i.e. conviction], it may be a sign that the forecaster has not thought through the problem carefully, has overfit his statistical model, or is more interested in making a name for himself than in getting at the truth.” Better to understand and “admit to what we do not know.”
Silver looks at predictions in many arenas: weather, poker, earthquakes, GDP, terrorism, pandemics. And of course, the stock market. Silver’s view of active management—can it beat the market?—is summarized in the passage below:
“[The] efficient-market hypothesis can’t literally be true. Although some studies (one of Silver’s) seem to provide evidence for Fama’s view that no investor can beat the market at all, others are more equivocal, and only a few identify fairly tangible evidence of trading skill and excess profits. It probably isn’t the mutual funds that are beating Wall Street; they follow too conventional a strategy and sink or swim together. But some hedge funds (not most) very probably beat the market, and some proprietary trading desks at elite firms like Goldman Sachs almost certainly do. There also seems to be rather clear evidence of trading skill among options traders, people who make bets on probabilistic assessments of how much a share price might move. And while most individual, retail-level investors make common mistakes like trading too often and do worse than the market average, a select handful probably do beat the street.”
So Silver is just stating what we all know: that after risk, trading costs, and fees, it is very hard for active managers to show consistent alpha to their clients. A good quote in the book from Henry Blodget:
“Everybody thinks they have this supersmart mutual fund manager. He went to Harvard and has been running money for twenty-five years. How can he not be smart enough to beat the market? The answer is: Because there are nine million of him and they all have a fifty-million dollar budget and computers that are collocated in the New York Stock Exchange. How could you possibly beat that?”
Silver makes the same point later in the chapter on poker when he writes, “it is not so much how good your predictions are in an absolute sense that matters but how good they are relative to the competition. In poker, you can make 95% of your decisions correctly and still lose your shirt at a table full of players who are making the right move 99% of the time. Likewise, beating the stock market requires outpredicting teams of investors in fancy suits with MBAs from Ivy League schools who are paid seven-figure salaries and who have state-of-the-art computer systems at their disposal.” Believe it or not, when FCG asks investment managers, “what is your edge in this competitive market?” we often hear: we have smart and hardworking people. Really? So, show us the shops that have the dumb, lazy people!
Silver elaborates on this point about relative abilities: “if you have strong analytical skills that might be applicable in a number of disciplines, it is very much worth considering the strength of the competition. It is often possible to make a profit by being pretty good at prediction in fields where the competition succumbs to poor incentives, bad habits, or blind adherence to tradition—or because you have better data or technology than they do. It is MUCH harder to be VERY good in fields where everyone else is getting the basics right—and you may be fooling yourself if you think you have much of an edge.” Given all the CFAs who are newly minted every year, it is hard to believe that the relative strength of the competition in the markets is weak.
FCG’s experience shows that only a few active managers have good responses to the question, “what is your competitive edge?” For example, one of FCG’s clients has focused on the “barriers to entry” piece of Porter’s five forces (Michael Porter, Competitive Strategy). Given this firm’s track record of success and their deep knowledge of barriers to entry, we are inclined to agree: they have found an edge. But this answer is very different from “smart people who work hard.”
Silver does hold out some hope for the quants in the investment world. Speaking about poker players, Silver writes the following, but it could just as easily apply to fundamental research: “there is no other game that I know of where humans are so smug, and think that they just play like wizards, and then play so badly. Basically, it’s because they don’t know anything, they think they must be God-like, and the truth is that they aren’t. If computer programs feed on human hubris, then in poker they will eat like kings.” And in investing, good quants will chew up fundamental shops. Unless of course the fundamental shops have created a bona fide edge.
More importantly, for active managers who still want to compete in the “brutally competitive” (a quote from one of my PM clients) markets, Silver offers really good insight and advice about forecasting. Starting, of course, with self-awareness around overconfidence. Silver writes that winning in the markets requires “deliberate and conscious effort. You need to have the presence of mind to ignore [the crowd]. Otherwise you WILL make the same mistakes that everyone else is making.”
But Silver advises investors to understand the consensus: “I pay quite a bit of attention to what the consensus view is—what a market like Intrade is saying—when I make a forecast. It is never an absolute constraint. But the further I move away from that consensus, the stronger my evidence has to be before I come to the view that I have things right and everyone else has it wrong. This attitude, I think, will serve you very well most of the time.” “One way to look at this is that markets are usually very right but occasionally very wrong.” Buffett makes this same point in his baseball analogy of “waiting for the fat pitch.” But as Silver says, “it’s very hard to make a steady career out of that, doing nothing for years at a time.”
So if “sometimes the price is wrong” what tips does Silver have for us?
Silver uses the same chart that Mauboussin and others have used to make the point about skill vs. luck and what it implies about process:
|            | Low luck | High luck          |
|------------|----------|--------------------|
| High skill | Chess    | Poker or Investing |
Silver writes, “It can take a long time for poker players [or investors!] to figure out how good they really are.” I have sat in on performance reviews for PMs and analysts where they make statements about one month’s performance as if it is all SIGNAL! In other words, as if that one month is an accurate statement about whether the investor has skill or not. You might as well assess major league batters on how they did in a single game.
It is so tempting in the investment world to mistake noise for signal, and many firms do it. But the reality for investors is that the environment they live in is VERY noisy, with precious little signal in it. (For this reason, Silver says there is little hope that economies or earthquakes can be accurately forecast: there is virtually no signal among all the noise.) Given this reality, Silver writes, “the only solution when the data is very noisy is to focus more on process than on results. If the sample of predictions is too noisy to determine whether a forecaster is much good, we can instead ask whether he is applying the attitudes and aptitudes that we know are correlated with forecasting success over the long run. In a sense, we’ll be predicting how good his predictions will be.”
And then, giving a nod to FCG’s business of coaching and development, Silver writes of the best poker players, “they don’t take any of their success for granted; they focus as much as they can on self-improvement.” Silver quotes a professional poker player, but it could just as easily be a successful investor: “Anyone who thinks they’ve gotten good enough, good enough that they’ve solved poker, they’re getting ready for a big downswing!”
Surprisingly, Silver hits on another theme of FCG: meditation. (See “Why every investor should meditate” on the FCG website, written 10 years ago!) Speaking about the techniques of a world-class poker coach, Silver writes, “Angelo’s methods for achieving success with his clients are varied and sometimes unconventional: he is an advocate of meditation, for instance, for increasing self-awareness, encouraging them to develop a better sense for which things are and are not within their control.” If you’ve not seen the YouTube video of Ray Dalio (founder of Bridgewater Associates) discussing his own meditation practice, you can follow this link to view it: http://www.youtube.com/watch?v=zM-2hGA-k5E In it, he attributes much of his success to meditation because it provides him the calm mind to avoid reactive behavior. And he covers the same “above/below the line” logic as FCG when he describes the fight/flight response that interferes with good thinking.
And then the final and all-important paragraph of the chapter about poker, which speaks directly to investors:
“When we play poker [or invest], we control our decision-making process but not how the cards come down. If you correctly detect an opponent’s bluff, but he gets a lucky card and wins the hand anyway, you should be pleased rather than angry, because you played the hand as well as you could. The irony is that by being less focused on your results, you may achieve better ones.” (This may sound unusual or ironic to Silver but it is well accepted wisdom in many spiritual disciplines: focus on the actions, not the outcomes.)
So, what do we make of an active equity shop that sponsors a “pick three stocks” contest for their analysts and then gives big prizes to the winners six months later? Three stocks over six months? Isn’t that completely noise? (I know I sound like the president of the “no fun club,” chiding them for their contest, which is meant to be fun.) But the principle is pretty poor: in effect, bet on Red 17, get lucky, then call yourself a genius! One analyst is reported to have taken his winnings and left the firm immediately afterwards! Nearly every firm we have worked with focuses on the performance numbers, NOT the process that generated the numbers. The rules of investing—like the rules of poker—should be stated clearly enough that after the fact one can assess whether the “hand was well played” or not. Silver writes that TV announcers for poker—like many investment leaders—focus “too much on the results [of the poker hands] and not enough on the correct decision-making process.” This sort of evaluation requires that investors keep accurate journals of their decision making…and the emotional state they are in while making each decision. (Only about 10% of investors we’ve met do this.) Without journals, there is no hope of accurately assessing the quality of the research, because “hindsight bias” will alter your view of why you bought or sold a security.
So what is another skill that good investors should develop? Calibration. It has to do with self-awareness: knowing our tendency to be overconfident and scaling back our forecast probabilities accordingly. An example: “the National Weather Service forecasts are admirably well calibrated. When they say there is a 20% chance of rain, it really does rain 20% of the time. They make good use of feedback—to fine-tune their forecasts—and consequently they are honest and accurate.” (The meteorologists at the Weather Channel tend to fudge a little bit for better ratings. But the biases really jump out in the local network news, with accuracy and honesty paying the price.)
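A decision journal makes calibration measurable. Here is a minimal sketch: group predictions by the stated probability and compare it to the observed frequency. The journal entries below are invented for illustration, not real forecast data.

```python
from collections import defaultdict

# Hypothetical journal entries: (stated probability, did the event happen?)
journal = [
    (0.2, False), (0.2, False), (0.2, True), (0.2, False), (0.2, False),
    (0.8, True), (0.8, True), (0.8, False), (0.8, True), (0.8, True),
]

# Bucket outcomes by the probability the forecaster stated.
buckets = defaultdict(list)
for stated_prob, outcome in journal:
    buckets[stated_prob].append(outcome)

# Compare stated probability to observed frequency in each bucket.
calibration = {}
for stated_prob in sorted(buckets):
    outcomes = buckets[stated_prob]
    observed = sum(outcomes) / len(outcomes)
    calibration[stated_prob] = observed
    print(f"said {stated_prob:.0%} -> happened {observed:.0%} of the time")
```

A well-calibrated forecaster’s buckets line up (say 20%, see 20%); an overconfident one says 90% and sees 60%. In practice you would want many entries per bucket before drawing conclusions, for exactly the signal-vs-noise reasons above.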
In describing the role of meteorologists, Silver comes close to describing a valid role for many fundamental analysts working with quant filters: “The National Weather Service keeps two different sets of books: one that shows how well the computers are doing by themselves and another that accounts for how much value the humans are contributing.” (Note: humans add about 25% over the computer guidance alone.) Many investment shops use a combination of quant tools and fundamental analysis, the goal being that fundamental work will add significant value to what the computers can do alone. Apparently with meteorologists, the number (25%) is readily available! We worked with one fundamental equity shop in which one analyst had gone over to the dark side and was preaching heresy: “fundamental research adds NO value; all the alpha is created by the quant model.” The firm solved this problem by removing him from the equity team.
Silver reinforces FCG teachings throughout the book (maybe that’s why I liked it so much!), such as: “Getting feedback about how well our predictions have done is one way—perhaps the essential way—to improve them.” Specifically, he cites the weather forecasters: “meteorologists get a lot of feedback—weather predictions play out daily.” By constantly analyzing their forecast accuracy, the meteorologists refine and improve their methods over time.
Silver does a wonderful job of explaining Bayesian modeling for the lay person. From a Bayesian perspective, “prediction is fundamentally a type of information-processing activity—a matter of using new data to test our hypotheses about the objective world, with the goal of coming to truer and more accurate conceptions about it.”
Silver offers a Bayesian argument for why black-and-white thinking (“all or none”) is harmful to good problem solving. He writes, “If you hold there is a 100% probability that God exists, or a 0% probability, then under Bayes’ theorem, NO amount of evidence could persuade you otherwise.” Bayes’ theorem involves developing a hypothesis to explain some phenomenon and then testing it (the basic scientific method). But if a person’s a priori view is absolute (100% or 0%), then there is nothing to test. We call this “holding your story tightly” and recommend that firms eliminate it from their culture. One of the world-class poker players makes this same point when he says, “It’s important in life to come up with a probability instead of a yes or no.” If investors take one idea away from this book it should be: think probabilistically. Not in absolutes.
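Silver’s point about absolute priors falls straight out of Bayes’ theorem. A minimal sketch (the likelihood numbers are made up for illustration): a prior of exactly 0 or 1 is a fixed point of the update, so no run of evidence can move it, while any prior in between responds to evidence.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior P(hypothesis | evidence) via Bayes' theorem."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

# Evidence assumed 9x more likely if the hypothesis is true (0.9 vs 0.1).
for prior in (0.0, 0.5, 1.0):
    belief = prior
    for _ in range(3):                  # three confirming observations in a row
        belief = bayes_update(belief, 0.9, 0.1)
    print(f"prior {prior:.0%} -> posterior {belief:.4f}")
```

This is the Bayesian version of “holding your story lightly”: keep priors strictly between 0 and 1, and let the evidence do the work.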
On self-awareness, Silver is clear: “Elite chess players [and investors!] tend to be good at metacognition—thinking about the way they think.” Also, Silver says that self-awareness around our own biases, like overconfidence, is hugely important.
In summary, these big ideas may be useful to investment professionals:
- We are all overconfident, so accept it and become more humble in your forecasts.
- Feedback is key to improving your forecasting skill, so look for it.
- Keep a journal. You need to be able to accurately assess the process for each decision.
- Process is more important than results in a noisy environment with little signal.
- Did I mention overconfidence?
See you next time for Michael Lewis’ Flash Boys. Cheers, jw