Neil Dutta
Wed, February 8, 2023
If an 'expert' is warning you that the market is about to crash, check their math
Beyond popping up as talking heads on cable news, Wall Street analysts and economists serve an important purpose: helping to guide the financial decisions of a wide array of investors, from average Americans worried about retirement to large institutions deciding where to put billions of dollars. At its best, Wall Street's coterie of experts is supposed to translate the shifting sands of the economy into a cogent, useful investment view.
I've been a sell-side economist for over 15 years, helping clients and the public understand the shape of the economy and what it means for the markets. I've always tried to be clear and to let the data shape my views rather than force it into my preconceived notions. But over the past few years, and especially since the onset of the coronavirus pandemic, I've noticed more and more analysts leaning on prepackaged narratives to drum up fear about the direction of the economy and the stock market. Admittedly, stocks haven't done well over the last year. But instead of providing clients and the general public with clear-eyed views, a new crop of hackneyed, alarmist analysts is using low-quality data to push people away from steady investments and into an alternative ecosystem of products of dubious quality.
Lies, damned lies, and statistics
Data is at the center of any economist's work; it provides crucial insights into the state of the economy and can help gauge what will happen next in the markets. But there are a few types of data points investors should look out for when trying to identify shoddy analysis. In many cases these data points seem sophisticated or like a perfect catchall, but in reality they paint a deceptive or overly simplistic picture of the economy.
One type of data point to be wary of involves vehicles for confirmation, which use old data to confirm what an analyst already believes. Take the popular Leading Economic Index. The idea behind the LEI is simple: It bundles up a series of disparate economic data points and tracks whether they are getting better or worse — promising to signal coming turns in the business cycle. But all the information in the LEI is already stale. The individual data components are released days or weeks before the composite index is published. For example, the most recent LEI was a summation of data from December, which an analyst could point to as a sign of an impending recession. But if you look at January's data, we've already seen positive signs, so there is good reason to expect the LEI to bounce this month — confirming what we already know.
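To make the staleness concrete, here is a minimal sketch in Python. The component names, values, and weights are illustrative assumptions for the example, not the Conference Board's actual inputs or methodology; the point is simply that a composite built from already-published data can be roughly anticipated before it is released.

```python
import pandas as pd

# Illustrative month-over-month % changes for components that have already
# been published individually (hypothetical numbers, not real data).
components = pd.DataFrame(
    {
        "building_permits":        [0.5, -1.2, 0.8],
        "jobless_claims_inverted": [0.2, -0.4, 0.3],  # inverted so "up" means improvement
        "stock_prices":            [1.1, -2.0, 1.5],
        "consumer_expectations":   [-0.3, -0.9, 0.6],
    },
    index=["Nov", "Dec", "Jan"],
)

# Assumed fixed weights for the sketch (not the published weights).
weights = pd.Series({
    "building_permits": 0.3,
    "jobless_claims_inverted": 0.3,
    "stock_prices": 0.2,
    "consumer_expectations": 0.2,
})

# Weighted sum of component changes gives a rough preview of the composite.
composite_change = (components * weights).sum(axis=1)
print(composite_change)
# The December composite turns negative, but the January components already
# point to a bounce before the official composite is even released.
```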
The index is also revamped after every recession — given new weights and components so the new index perfectly signals the recession that just happened. But if you go back and look at the LEI prior to each recession, it typically doesn't signal a clear peak. Instead of being a useful measure to gauge the future health of the economy, the LEI is simply an index built around the latest recession that provides scary signals based on old data.
Other hackneyed data points are deliberately ambiguous or sweeping to the point of abstraction. Take another popular freak-out indicator: the monetary aggregates, which measure how much money, such as cash and bank deposits, is floating around in the economy. The aggregates are painted as a perfect summation of the economy: what better way to track its health than by measuring all the money changing hands among the government, businesses, and households? But the correlation between money growth and the health of the economy has broken down over the years, and scaremongers have been able to fit any change in the supply of dollars into whatever narrative suits them. Weak money growth is a problem, they argue, because having fewer dollars moving around the economy could signal that the system is seizing up. On the other hand, rapid growth of the money supply has been cited by doomsayers as a sign that the only thing supporting the economy is the Federal Reserve handing out new dollars. When a measure is so broad that it becomes a choose-your-own-adventure story, it is hardly useful for quality analysis.
The third type of much-loved, often-dubious tool is the overly fickle indicator. These data points are prone to large swings that produce a lot of false signals but make it easy for analysts to spin up a warning of impending catastrophe.
Take the ISM Manufacturing PMI, which has a near-mythical reputation on Wall Street. The ISM is convenient to use: It is released early in the monthly data cycle, broadly tracks the swings of the economy, and is easy to understand. The ISM is a survey of 300 purchasing managers at a wide variety of manufacturing companies who are asked whether conditions are better or worse relative to the prior month. Are customers ordering more or less? Is it easier or harder to find workers? Are prices for parts higher or lower? If the index falls below 50, then things are getting worse — above 50, and things are improving.
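For readers who want to see the mechanics, here is a minimal sketch using the standard diffusion-index arithmetic (the percentage reporting "better" plus half the percentage reporting "same"; the published ISM figure also applies seasonal adjustment, which is ignored here). It shows why a reading of 50 simply means conditions are holding steady.

```python
def diffusion_index(pct_better: float, pct_same: float, pct_worse: float) -> float:
    """Textbook diffusion-index formula: % 'better' plus half of % 'same'."""
    assert abs(pct_better + pct_same + pct_worse - 100) < 1e-9
    return pct_better + 0.5 * pct_same

# An improving month: 40% better, 45% same, 15% worse.
print(diffusion_index(40, 45, 15))   # 62.5 -> conditions are getting better

# Steady growth: if nothing changes from one month to the next, most managers
# answer "same" and the index sits at 50, whatever the underlying growth rate.
print(diffusion_index(5, 90, 5))     # 50.0 -> no change, not contraction
```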
Let's assume the ISM signals a turning point in the business cycle when it runs below 50 for three consecutive months. Even with that broad reading, the ISM tends to send more false signals than correct ones. In the 1990s, for example, the ISM dipped below 50 for stretches longer than three months several times, and no recession followed. In fact, the ISM is three times as likely to be late in signaling a trough as it is early in signaling a peak. A recession-prediction indicator this is not. Plus, the ISM tells us only about the momentum of the economy. If GDP growth is 4% in January and 4% in February, the ISM would sit around 50 because things are staying exactly the same. Is that bad? No, 4% growth is pretty good. But some would read that ISM print as a sign the economy is on the verge of a major slowdown.
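As a rough illustration of how fickle that rule is, here is a sketch that flags three consecutive sub-50 months. The PMI readings below are invented for the example, not historical data; they mimic a mid-cycle soft patch of the kind the 1990s produced.

```python
# Invented monthly PMI readings: a soft patch that dips below 50 and recovers
# without any recession following.
pmi = [53, 52, 49, 48, 47, 49, 51, 54, 55, 53, 52, 56]

def recession_flags(readings, threshold=50, run_length=3):
    """Return the month indexes where the last `run_length` readings were all
    below `threshold` (a naive rule-of-thumb recession signal)."""
    flags = []
    for i in range(run_length - 1, len(readings)):
        window = readings[i - run_length + 1 : i + 1]
        if all(r < threshold for r in window):
            flags.append(i)
    return flags

print(recession_flags(pmi))  # [4, 5] -> the rule fires during the soft patch,
                             # yet no downturn follows in this made-up series
```

Even a rule that sounds conservative throws false alarms when it is fed noisy survey data.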
Doomsday preppers
Ultimately, statistics are only as good as the analysts and economists using them. Sifting through the hundreds of indexes, surveys, and economic measures requires a discerning eye and a willingness to be humble about the unknowns.
One of the most basic tenets of statistics is that correlation is not causation. Most things in the macroeconomy are correlated. When people are getting more jobs, it usually means businesses are selling more products. Just because two lines on a chart are moving in a similar direction doesn't mean they explain everything happening in the wide world.
I cannot tell you the number of analysts I have come across in this industry who believe they have stumbled on something groundbreaking by publishing a chart of industrial production — a measure of how active factories are — against the ISM — which is, again, a measure of how factory executives feel about the economy. In many cases, the folks who break everything down into one or two "simple" explanations tend to have a conclusion in mind first.
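A quick sketch shows why such a chart proves so little: any two series that ride the same underlying business cycle will correlate tightly, with no causal insight required. The numbers below are simulated, not real industrial-production or ISM data.

```python
import numpy as np

rng = np.random.default_rng(0)

# One shared driver: the overall business cycle (a trend plus gradual swings).
months = np.arange(120)
cycle = 0.05 * months + 2.0 * np.sin(months / 12)

# Two "different" indicators that are both mostly the same cycle plus noise.
factory_output = cycle + rng.normal(0, 0.5, size=months.size)
factory_survey = cycle + rng.normal(0, 0.5, size=months.size)

corr = np.corrcoef(factory_output, factory_survey)[0, 1]
print(round(corr, 2))
# The correlation comes out close to 1: a tight fit, but all it says is that
# both series track the same cycle, not that one explains the other.
```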
A lot of this low-quality research also tilts toward a negative bias: the economy is getting worse, the markets are about to tank, a recession is around the corner. Research shows that humans are hardwired to think negativity sounds smarter, and financial media is also tilted toward a focus on the downside. This creates a perverse incentive for lazy analysts to consistently beat the doom-and-gloom drum. It also makes them hard to discount. In a bull market, when rising stocks lift all boats, these analysts are still making money while arguing the downside just "hasn't happened yet." And when the market shifts and stocks sink, they can crow, "I told you so!" Heads I win; tails you lose.
In the end, the people most hurt by this are ordinary investors. People just trying to save for retirement or sock away some of their paycheck are discouraged from making solid, stable investments and pushed into more defensive positions, or, in the most extreme cases, away from investing altogether. The people who fall prey to these low-quality analysts leave gains on the table and wind up with a smaller pool of savings.
In my view, what makes someone worth listening to is whether their thought process makes sense. It is not enough to point to this indicator or that one. Rather, is the analyst laying out a sequence of events that follows logically? For example, if retail sales cratered in a month but gas prices fell, employment rose, and confidence picked up, would it make sense that sales sank? No, it's more likely to be an anomaly, since all of the other indicators point to Americans having more cash in their pockets. A good analyst would note those other trends and attempt to explain the discrepancy; a cheap analyst would declare that a recession is nigh. In this business, it is important to take a holistic approach.
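One way to picture that holistic check: before treating a single bad print as a turning point, weigh it against the rest of the month's evidence. The sketch below is only a toy illustration of the reasoning, not a real model; the indicator names and the two-out-of-three threshold are assumptions made for the example.

```python
def assess_weak_retail_sales(gas_prices_fell: bool,
                             employment_rose: bool,
                             confidence_rose: bool) -> str:
    """Toy cross-check: does the rest of the month's data support a slowdown story?"""
    supportive_of_households = sum([gas_prices_fell, employment_rose, confidence_rose])
    if supportive_of_households >= 2:
        # Most of the evidence says households have more cash, not less, so the
        # weak sales print is more likely noise or a quirk of the data.
        return "likely an anomaly; wait for revisions and next month's report"
    return "consistent with a genuine slowdown; worth taking seriously"

print(assess_weak_retail_sales(gas_prices_fell=True,
                               employment_rose=True,
                               confidence_rose=True))
```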
There is no single summary statistic that gets you the right "call"; if it sounds too easy to be true, it probably is. No single data point is a substitute for good judgment, which is the best leading indicator of all.
Neil Dutta is Head of Economics at Renaissance Macro Research.