Bureau of Labor Statistics Commissioner Erika McEntarfer suggested earlier this month that budgetary pressures might require cutting the sample size of the Current Population Survey (CPS) by 5,000 households. She emphasized that the cut isn't a done deal but left the impression that it's the leading option for resolving the survey's near-term money crunch.
The CPS provides the statistical foundation for monthly estimates of the unemployment rate and other key indicators of US labor-market health. The survey's ongoing viability is essential to myriad decision makers — including the Federal Reserve, fiscal policy makers, investors, and state and local governments, to name a few.
- The budget crunch confronting the survey stems from inadequate increases in spending authority, rising pay rates for field representatives who collect the data, and households' declining willingness to participate in the survey.
- A cut in sample size would adversely affect the accuracy of every estimate generated from the survey responses.
- At the national level, the reduction in accuracy would probably be undetectable in most cases. For example, the monthly national unemployment rate is currently surrounded by a 90% confidence interval that extends about 0.20 percentage point in each direction. If the sample size were reduced from 70,000 to 65,000, that interval would widen to about +/- 0.21 ppt, a tiny effect (see the back-of-the-envelope sketch after this list).
- The adverse effects would be more serious for estimates derived from small subsets of the national sample. Such estimates — for example, for minorities, small states, and age-related subgroups of the population — are already much less precise than their national-average counterparts, and would become even less so. But again, the incremental damage from a one-time sample cut of 5,000 would be slight.
- As reassuring as these considerations might seem, there are real dangers. First, BLS is already refraining from publishing some detailed estimates because the statistical accuracy of these estimates falls below established quality thresholds. A cut in sample size presumably would cause more estimates to be dropped from publication.
- Second, the budget pressure on this program is driven heavily by households' diminishing willingness to participate in the survey, and that pressure will only intensify if the response rate continues to slide. In other words, there's no guarantee that a one-time cut of 5,000 would be the end of the story. Any subsequent cuts in sample size would have proportionately larger adverse effects on accuracy.
- Third, there's no fully articulated strategy for how to replace the CPS if the situation continues to deteriorate. The CPS gathers information about aspects of the labor market that are difficult or impossible to measure using "big data" or alternative sources. If the survey response rate were to implode, and the current plan to add an internet self-response mode were to encounter similar challenges, it's not clear how the statistical agencies would proceed.
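As a rough check on the +/- 0.21 ppt figure above, here is a minimal sketch of the arithmetic, assuming the margin of error shrinks with the square root of the number of sampled households. That simple-random-sampling approximation ignores the CPS's stratified, clustered design, and the household counts are just the approximate figures cited above.

```python
# Back-of-the-envelope illustration of how a margin of error scales with sample
# size under a simple 1/sqrt(n) approximation. This is NOT the CPS's actual
# variance-estimation methodology, which reflects its stratified, clustered design.
import math

current_sample = 70_000   # approximate current CPS sample, in households
reduced_sample = 65_000   # sample after a 5,000-household cut
current_moe = 0.20        # current 90% half-width for the national unemployment rate, in ppt

# When the sample shrinks, the margin of error grows by roughly sqrt(n_old / n_new).
scale = math.sqrt(current_sample / reduced_sample)
reduced_moe = current_moe * scale

print(f"scale factor: {scale:.3f}")                           # about 1.038
print(f"new 90% margin of error: +/- {reduced_moe:.2f} ppt")  # about 0.21 ppt
```

The same arithmetic implies that further cuts would erode precision faster, since each additional 5,000-household reduction removes a larger share of a shrinking sample.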
A major guidepost for economic policy making could be jeopardized if robust solutions aren't developed quickly. Modernizing the survey to ensure its long-term viability will require resources that Congress would need to fund.
Some Detailed Estimates Already Not Possible
With a sample of about 60,000 households and a response rate of about 70%, the CPS has been completing roughly 42,000 interviews each month for the past year or so. That sounds like a lot, and indeed it dwarfs the size of many other surveys. Yet it's insufficient for some purposes.
For example, the BLS aims to publish estimates of the unemployment rate for detailed demographic groups in each of the 50 states plus the District of Columbia. However, it doesn't publish estimates when their precision falls below a specified threshold.
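To make that mechanism concrete, here is a minimal sketch of how a publication-quality screen of this kind can work. The relative-standard-error rule and the 0.30 cutoff below are illustrative assumptions, not BLS's actual publication criteria.

```python
# Illustrative publication screen: suppress an estimate when its relative standard
# error exceeds a chosen threshold. The rule and the 0.30 cutoff are hypothetical,
# not BLS's actual publication criteria.
from dataclasses import dataclass

@dataclass
class Estimate:
    group: str
    unemployment_rate: float  # percent
    standard_error: float     # percentage points

MAX_RELATIVE_SE = 0.30  # hypothetical quality threshold

def publishable(est: Estimate) -> bool:
    """Return True if the estimate meets the (hypothetical) precision threshold."""
    return est.standard_error / est.unemployment_rate <= MAX_RELATIVE_SE

estimates = [
    Estimate("State A, all workers", 4.0, 0.4),     # relative SE = 0.10 -> publish
    Estimate("State A, small subgroup", 5.0, 2.0),  # relative SE = 0.40 -> suppress
]

for est in estimates:
    print(f"{est.group}: {'publish' if publishable(est) else 'suppress'}")
```

A smaller sample raises standard errors across the board, so more subgroup estimates would fail a screen like this one.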
The following graph shows the number of states in which the CPS, with its current sample size, provided an adequate basis for estimating the annual average unemployment rate in 2022 for various demographic groups. (Monthly and quarterly estimates are far noisier than these annual averages.)
[Chart: Even Current Sample Is Too Small to Support Some Estimates]
The chart's counts for the population as a whole are essentially replicated for white people, whose large share of the population keeps sample sizes adequate for many estimates. However, for Black, Asian, and Hispanic or Latino people, the counts are markedly lower even at the current sample size. If the sample is cut by 5,000, the number of estimates failing to meet the publication-quality threshold will inevitably rise.
Why the CPS Is So Essential
The CPS collects information that's difficult or impossible to obtain elsewhere. This information allows answers to be developed for several key questions, including:
- If you weren't employed, did you want a job? This attitudinal question can only be answered by the respondent.
- If you wanted a job, did you search for one within the past four weeks? Because relevant search activity can take many different forms, this question is most easily answered by the respondent. Unless respondents have searched within the past four weeks, they're not defined as unemployed for purposes of the headline measure, even if they would like a job.
- If you were at home, what was your workforce status? Conceivably, location-tracking data (the type of resource visionaries often point to as a possible substitute for the CPS) could be used to estimate how many individuals were traveling to known places of work. But for people remaining at home, tracking data probably couldn't distinguish among those who don't want a job, those who want one but can't find one, and those working from home.
One Factor Driving Up Costs
Households' declining willingness to respond has helped drive up the cost of conducting the survey. When a household doesn't respond to an invitation to participate in the CPS, a field representative follows up in hopes of obtaining a response. These follow-up efforts are time-consuming, and hence costly for the survey.
A Situation of Considerable Peril
Much remains uncertain about the situation — including whether the sample size will indeed be cut. But the real danger lurks in the confluence of fundamental forces behind the survey's budget squeeze: flat nominal annual spending authority, rising pay costs, and American households' declining willingness to participate in such surveys.
Unless aggressive action is taken, it's possible a key statistical guidepost for decision makers throughout the economy will be seriously degraded.