How macroeconomic statistics failed the US

By Felix Salmon
September 27, 2011

There’s one big reason why the current economic weakness in the US has come as such a shock. It’s not the only reason, but it’s an important one, and it hasn’t gotten nearly the attention it deserves: the state of macroeconomic data-gathering in the US is pretty weak.

In particular, the data coming out of the Bureau of Economic Analysis at the beginning of 2009 was way off. Here’s Cardiff Garcia, introducing an interview with Fed economist Jeremy Nalewaik:

The initial GDP estimate for the fourth quarter of 2008 showed that the economy contracted by 3.8 per cent. It was released on January 30, 2009 — about three weeks before Obama’s first stimulus bill passed. That number was continually adjusted down in later revisions, and in July of this year the BEA revised it all the way down to a contraction of 8.9 per cent.

The BEA is happy to try to explain what happened here — but whatever the explanation, the original 3.8% figure was a massive and extremely expensive fail. It was bad enough to get a $700 billion stimulus plan through Congress, but if Congress and the Obama Administration had known the gruesome truth — that the economy was contracting at a rate of well over $1 trillion per year — then more could and would have been done, both at the time and over subsequent months and years. Larry Summers warned at the time that the risks of doing too little were much greater than the risks of doing too much; only now do we know just how right he was on that front. (And even he didn’t push for a stimulus of more than $700 billion.)
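
To put that in dollar terms, here’s a rough back-of-the-envelope calculation. The $14.5 trillion figure for annual US GDP at the time is an illustrative assumption, not a number from the BEA release:

```python
# Back-of-the-envelope: the dollar gap between the initial and revised
# estimates of the Q4 2008 contraction.
# Assumes annual US GDP of roughly $14.5 trillion at the time (illustrative).

gdp_annual = 14.5e12      # rough annual US GDP, in dollars (assumption)
initial_rate = 0.038      # initial estimate: 3.8% annualized contraction
revised_rate = 0.089      # July 2011 revision: 8.9% annualized contraction

initial_loss = gdp_annual * initial_rate   # ~$0.55 trillion per year
revised_loss = gdp_annual * revised_rate   # ~$1.29 trillion per year

print(f"Initial estimate: ~${initial_loss / 1e12:.2f} trillion/year")
print(f"Revised estimate: ~${revised_loss / 1e12:.2f} trillion/year")
print(f"Gap:              ~${(revised_loss - initial_loss) / 1e12:.2f} trillion/year")
```

The precise dollar numbers depend on the GDP level you plug in, but the point stands: the revision more than doubled the measured pace of contraction.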

So what’s being done to beef up the state of America’s macroeconomic statistics so that this kind of monster error doesn’t happen again? The BEA is doing the best it can, but it’s constrained both by its budget and by the quality of economists it can attract.

Here’s how Cardiff ended his interview:

FT Alphaville was recently having a broader discussion about the status of macroeconomic data-gathering in the US with a fellow blogger, Felix Salmon, and he made the point to us that it’s been in secular decline for the last few decades. Do you agree? Is this something that’s come up in your work on output measures?

Some evidence suggests that the measurement errors in GDP growth have become worse in recent years. This may have to do with the increasing importance of services in the economy in recent decades, a sector where the GDP source data has been spotty: historically, the U.S. Census Bureau has not collected spending data for many types of services on a regular basis.

Despite budget constraints, the statistical agencies have mounted a major effort to improve their measurement of services GDP. However, even as they make progress, it is important to keep in mind that there will always be measurement errors of some kind or another in the GDP and GDI source data, so taking some sort of weighted average, as I proposed in the Brookings paper, would be the soundest approach.

Frankly, this just isn’t good enough. Moving to a weighted average of GDP and GDI (gross domestic income) doesn’t improve the quality of our statistics one bit; it’s just an attempt to cope with the fact that neither measure is particularly reliable. As the economy becomes increasingly complex and service-based rather than goods-based, it’s crucial that our statistical architecture keep pace — and it clearly isn’t doing so.
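
For the sake of concreteness, here’s a minimal sketch of what this kind of GDP/GDI averaging amounts to. The equal weights and the sample growth figures below are illustrative assumptions, not numbers from Nalewaik’s Brookings paper or the BEA:

```python
# A minimal sketch of combining the expenditure-side (GDP) and income-side (GDI)
# growth estimates into a single weighted average. The weights and the sample
# growth rates are illustrative assumptions, not official figures.

def combined_growth(gdp_growth: float, gdi_growth: float, gdp_weight: float = 0.5) -> float:
    """Weighted average of GDP and GDI growth, in percent (annualized)."""
    return gdp_weight * gdp_growth + (1 - gdp_weight) * gdi_growth

# Hypothetical annualized quarterly growth estimates, in percent:
gdp_estimate = -3.8
gdi_estimate = -5.9

print(combined_growth(gdp_estimate, gdi_estimate))   # -4.85 with equal weights
```

Averaging two noisy measures can reduce noise, but as argued above, it does nothing to fix the underlying source data.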

When I told Cardiff that the status of macroeconomic data-gathering has been declining for decades, I was making two separate statements — first, that the quality of statistics has been declining, and second, that the status of economists collating such statistics has been declining as well. Once upon a time, extremely well-regarded statisticians put lots of effort into building a system which could measure the economy in real time. Today, I can tell you exactly how many hot young economists dream of working for the BEA on tweaks to the GDP-measurement apparatus: zero.

I’m pessimistic that this is going to change. Putting together macroeconomic statistics is not a prestigious part of the economics profession any more, and government pay scales are pretty meager compared to what good economists earn elsewhere.

Increasingly, the economists in government who craft the policy responses to macroeconomic developments are working on a GIGO (garbage in, garbage out) basis. That, in turn, means more bad responses, more bubbles, more recessions, and in general more macroeconomic volatility. The world is getting messier — and we don’t even have a good basis for measuring just how messy it is any more.
