Grading the Forecasters

Larry Swedroe on data that reveals the disappointing accuracy of market forecasts.


Larry Swedroe, Director of Research, The BAM Alliance

The financial media tend to focus much of their attention on stock market forecasts by so-called gurus. They do so because they know such forecasts grab the investing public’s attention. Investors must believe the forecasts have value or they wouldn’t tune in. Nor would they subscribe to the various investment newsletters and publications, some of which claim to provide “news before the markets know.”

Unfortunately for investors, there’s a whole body of evidence demonstrating that market forecasts have no value (though they supply plenty of fodder for my blog)—their accuracy is no better than one would randomly expect.

For investors who haven’t learned that forecasts should be treated as mere entertainment (or, worse, as what Jane Bryant Quinn called “investment porn”), forecasts actually have negative value, because they can cause those investors to stray from well-developed plans.

Empirical Data On Forecasting

A new contribution to the evidence on the inability to forecast accurately comes from David Bailey, Jonathan Borwein, Amir Salehipour and Marcos Lopez de Prado, authors of the March 2017 study “Evaluation and Ranking of Market Forecasters.”

Their study covered 6,627 market forecasts (specifically for the S&P 500 Index) made by 68 forecasters who employed technical, fundamental and sentiment indicators. The sample period ran from 1998 through 2012.

Their methodology was to compare forecasts for the U.S. stock market to the return of the S&P 500 Index over the future interval(s) most relevant to the forecast horizon. The authors evaluated every stock market forecast against the S&P 500 Index’s actual return over four time periods—typically one month, three months, six months and 12 months.

They then determined whether each forecast was true or false over the time frame for which it was made. Because of the more random nature of short-term returns, they weighted the forecasts as follows (a simple scoring sketch appears after this list):

  • Up to one month: 0.25
  • Up to three months: 0.50
  • Up to nine months: 0.75
  • Beyond nine months (up to two to three years): 1.00
  • If the forecast does not include a time frame, and context does not suggest one: 0.25
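
The weighting scheme above amounts to a simple weighted average of correct calls. Below is a minimal sketch of that arithmetic in Python; the `Forecast` record, the `horizon_weight` helper and the sample data are my own illustrative assumptions rather than anything taken from the study, but the weights mirror the list above.

```python
# Minimal sketch of the weighted-accuracy scoring described above.
# Data structures and helper names are illustrative assumptions, not the study's code.
from dataclasses import dataclass
from typing import Optional

def horizon_weight(horizon_months):
    """Weight a forecast by its horizon; shorter horizons count less."""
    if horizon_months is None:       # no time frame given
        return 0.25
    if horizon_months <= 1:
        return 0.25
    if horizon_months <= 3:
        return 0.50
    if horizon_months <= 9:
        return 0.75
    return 1.00                      # beyond nine months

@dataclass
class Forecast:
    horizon_months: Optional[float]  # e.g. 3 for a three-month call; None if unspecified
    correct: bool                    # did the S&P 500 move as predicted over that horizon?

def weighted_accuracy(forecasts):
    """Weighted share of correct forecasts, in percent."""
    total = sum(horizon_weight(f.horizon_months) for f in forecasts)
    hits = sum(horizon_weight(f.horizon_months) for f in forecasts if f.correct)
    return 100 * hits / total

# Example: three calls by a hypothetical forecaster -> a score of 40%.
calls = [Forecast(1, True), Forecast(6, False), Forecast(None, True)]
print(f"score: {weighted_accuracy(calls):.0f}%")
```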

Following is a summary of the authors’ findings:

  • Across all forecasts, accuracy was 48%—worse than the proverbial flip of a coin.
  • Two-thirds of forecasters had accuracy scores below 50%.
  • About 40% of forecasters had an accuracy score between 40% and 50%.
  • About 3% of forecasters fell in the left tail, with accuracy scores below 20%.
  • About 6% of forecasters fell in the far right tail, with accuracy scores between 70% and 79%.
  • The highest accuracy score was 78% and the lowest was 17%.

The distribution of forecasting accuracy by the gurus examined in the study looks very much like the common bell curve—which is what you would expect from random outcomes. That makes it very difficult to tell if there is any skill present.
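
To see why, consider a quick simulation (my own illustration, not from the study): if 68 forecasters simply flipped a coin on every call, their accuracy scores would still spread out in a roughly bell-shaped cluster around 50%, with a few lucky outliers well above it and a few unlucky ones well below.

```python
# Illustrative simulation: 68 "forecasters" with zero skill (coin flips per call).
# The per-forecaster call count is an assumption chosen for illustration.
import random

random.seed(42)
n_forecasters = 68
n_forecasts = 100                 # roughly 6,627 / 68 calls per forecaster

scores = []
for _ in range(n_forecasters):
    correct = sum(random.random() < 0.5 for _ in range(n_forecasts))
    scores.append(100 * correct / n_forecasts)

# Pure chance still produces a wide, bell-shaped spread around 50%.
print(f"mean {sum(scores) / len(scores):.1f}%, "
      f"min {min(scores):.0f}%, max {max(scores):.0f}%")
```

With no skill anywhere in the simulated sample, some forecasters can still land above 60% and others below 40%, which is why a distribution like the one the authors found is so hard to distinguish from luck.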

Famous Forecaster Scores

There were many well-known forecasters among the results. I’ve highlighted 10 of the more famous, most of whom I’m sure you’ll recognize, along with their forecasting scores:

  • James Dines, founder of The Dines Letter. According to his website, he is “truly a living legend … one of the most accurate and highly regarded investment analysts today.” His forecasting accuracy score was 50%. Not quite the stuff of which legends are made.
  • Ben Zacks, a co-founder of well-known Zacks Investment Research and senior portfolio manager at Zacks Investment Management. His score was 55%.
  • Bob Brinker, host of the widely syndicated MoneyTalk radio program and publisher of the Marketimer newsletter. His score was 46%.
  • Jeremy Grantham, co-founder and chief investment strategist of GMO, a global investment management firm. His score was 42%.
  • Dr. Marc Faber, publisher of the Gloom, Boom & Doom Report. His score was 39%.
  • Jim Cramer, host of CNBC’s Mad Money. His score was 37%, which placed him 50th.
  • John Mauldin, well-known author. According to his website, “his individual investor-readers desperately need to know what his institutional money-manager clients and friends know about the specific investments available to help them succeed in challenging markets.” His score was just 36%.
  • Gary Shilling, Forbes columnist and president of A. Gary Shilling & Co. His score was just 34%.
  • Abby Joseph Cohen, recently retired from her position as president of Goldman Sachs’ Global Market Institute. Her score was just 34%.
  • Robert Prechter, president of Elliott Wave International, publisher of the Elliott Wave Theorist and the author of multiple books. He brought up the rear, with a score of just 17%.

Of course, there were a few forecasters with fairly good records. But only four of the 68 gurus posted scores at or above 70% (among them was David Dreman), and just 11 in total had scores above 60%.

Yet 18 forecasters had scores below 40% (versus just the 11 with scores above the 60% mark), and five had scores below 30% (compared with just four with scores above 70%). It’s also important to keep in mind that the forecasts themselves carry no costs, but implementing strategies based on them does.

Hindsight Is 20/20

As the authors noted, while some forecasts turn out to be uncannily accurate, others lead to significant losses. Unfortunately, it’s extremely difficult to determine ahead of time which will prove accurate. And if you pay attention, even those who provide the forecasts have admitted the difficulty (though they get paid a lot of money to ignore the evidence).

Here’s what Barton Biggs, who at the time was the director of global strategy at Morgan Stanley, had to say: “God made global strategists so that weathermen would look good.” Keep this in mind the next time you find yourself paying attention to some guru’s latest forecast. You’ll be best served by ignoring it.

As I point out in my book, “Think, Act, and Invest Like Warren Buffett,” that’s exactly what Buffett himself does, and what he advises you to do—ignore all forecasts because they tell you nothing about the direction of the market, but a whole lot about the person doing the predicting.

This commentary originally appeared July 28 on ETF.com.


The opinions expressed by featured authors are their own and may not accurately reflect those of the BAM ALLIANCE. This article is for general information only and is not intended to serve as specific financial, accounting or tax advice.

© 2017, The BAM ALLIANCE
