Conspiracy at the BLS?

by Menzie Chinn, Econbrowser

Reader rjs writes in defense of Noel Sheppard’s mis-reading of a BLS release:

… with the benchmark revision & 3 different adjustments, i cannot have confidence in anything in this report…absent the seasonal adjustment, the actual number for january non-farm payrolls was a loss of 2,689,000 jobs; knowing that the BLS confidence interval is on the order of plus or minus 100,000; seasonally adjusting that job loss to show 243,000 jobs gained leaves plenty of room for an error in the methodology…

Since several (usually conservative) commentators (e.g., [0]) have raised questions about the BLS methodologies, I thought it of interest to look at how implausible the estimates are – with a focus on the seasonal adjustment procedure (which seems to be black magic to many). First, here’s a graph of the seasonally adjusted and the seasonally unadjusted nonfarm payroll employment series (both easily available to the intellectually curious at the St. Louis Fed’s FRED database, category 11).

Figure 2: Reported nonfarm payroll employment, seasonally adjusted (black), and not seasonally adjusted (blue), in 000’s, 2007M01-2012M01. NBER defined recession dates shaded gray. Source: BLS January release via FRED, series PAYEMS and PAYNSA, respectively.
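For readers who want to replicate the figure, both series can be pulled straight from FRED. A minimal sketch in Python, assuming the pandas-datareader package is installed (the series codes PAYEMS and PAYNSA are from the post; everything else is illustrative):

```python
# Minimal sketch: pull the SA and NSA payroll series from FRED.
# Assumes pandas-datareader is installed; series codes are from the post.
import pandas as pd
from pandas_datareader import data as pdr

start, end = "2007-01-01", "2012-01-01"
payems = pdr.DataReader("PAYEMS", "fred", start, end)  # seasonally adjusted, thousands
paynsa = pdr.DataReader("PAYNSA", "fred", start, end)  # not seasonally adjusted, thousands

nfp = pd.concat([payems, paynsa], axis=1)
print(nfp.tail())  # in January 2012 the NSA level drops while the SA level rises
```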

So, to the uninitiated it must surely be confusing that the not seasonally adjusted (nsa) series is declining while the seasonally adjusted (sa) series is rising. rjs argues that there is too much going on in the January figures to make anything of the rise. I thought it useful to at least examine the seasonal adjustment process. From BLS:

Estimating Methods

Benchmark data

For the establishment survey, annual benchmarks are constructed to realign the sample-based employment totals for March of each year with the UI-based population counts for March. These population counts are less timely than sample-based estimates and are used to provide an annual point-in-time census for employment. For National series, only the March sample-based estimates are replaced with UI counts. For State and metropolitan area series, all available months of UI data are used to replace sample-based estimates. State and area series are based on smaller samples and are, therefore, more vulnerable to both sampling and non-sampling errors than National estimates.

Population counts are derived from the administrative file of employees covered by UI. All employers covered by UI laws are required to report employment and wage information to the appropriate State workforce agency four times a year. Approximately 97 percent of total nonfarm employment within the scope of the establishment survey is covered by UI. A benchmark for the remaining 3 percent is constructed from alternate sources, primarily records from the Railroad Retirement Board and County Business Patterns. The full benchmark developed for March replaces the March sample-based estimate for each basic cell. The monthly sample-based estimates for the year preceding and the year following the benchmark are also then subject to revision.

Monthly estimates for the year preceding the March benchmark are readjusted using a “wedge back” procedure. The difference between the final benchmark level and the previously published March sample estimate is calculated and spread back across the previous 11 months.

The wedge is linear; eleven-twelfths of the March difference is added to the February estimate, ten-twelfths to the January estimate, and so on, back to the previous April estimate, which receives one-twelfth of the March difference. This assumes that the total estimation error since the last benchmark accumulated at a steady rate throughout the current benchmark year.

Estimates for the 7 months following the March benchmark also are recalculated each year. These post-benchmark estimates reflect the application of sample-based monthly changes to new benchmark levels for March and the computation of new business birth/death factors for each month.

Following the revision of basic employment estimates, estimates for women employees and production and nonsupervisory employees are recomputed using the revised all-employee estimates and the previously computed sample ratios of these workers to all employees. All basic series of employment, hours, and earnings are re-aggregated to obtain estimates for each sector and higher level of detail. Other derivative series (such as real earnings and payroll indexes) also are recalculated. New seasonal adjustment factors are calculated and all data series for the previous 5 years are re-seasonally adjusted before full publication of all revised data in February of each year. [emphasis added – mdc]
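The linear “wedge back” described in the excerpt is just a pro-rata spread of the March benchmark revision over the preceding eleven months. A toy illustration (my own sketch with made-up numbers, not BLS code):

```python
# Toy illustration of the linear wedge-back: the March benchmark-minus-sample
# difference is spread in twelfths over the preceding months, so the previous
# April gets 1/12 of the correction and February gets 11/12.
def wedge_back(estimates, march_benchmark):
    """estimates: 12 monthly sample-based levels, previous April through March."""
    diff = march_benchmark - estimates[-1]            # benchmark error at March
    return [est + diff * (i + 1) / 12.0 for i, est in enumerate(estimates)]

# Made-up example: the sample estimates drifted 120 (thousand) below the
# UI-based March population count of 330.
estimates = [100.0 + 10.0 * m for m in range(12)]     # April ... March
revised = wedge_back(estimates, march_benchmark=330.0)
print(revised[-1])   # 330.0: March is replaced by the full benchmark level
print(revised[-2])   # 310.0: February absorbs eleven-twelfths of the 120 gap
```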

In general, one should consult the documentation if one has questions about the NFP series.

Once one has a handle on how the series are created, one can evaluate how sensitive the estimates are to differing approaches. Now, the BLS documentation makes clear that the seasonal adjustment procedure is applied to the components of aggregate employment, and I don’t have ready access to the n.s.a. versions of the components. But we can still apply the seasonal adjustment process to the aggregate n.s.a. series. Below, I report results when I apply the procedure to a series including and excluding temporary Census workers.

Built into many time-series statistical packages is the standard seasonal adjustment procedure, Census X-12 ARIMA.
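In Python, for instance, statsmodels exposes the Census Bureau’s program through x13_arima_analysis. A hedged sketch, reusing the FRED pull above: the wrapper calls X-13ARIMA-SEATS (the successor to X-12-ARIMA), so the x13as executable must be installed separately, and adjusting the aggregate PAYNSA series only approximates the BLS practice of adjusting components and summing.

```python
# Hedged sketch: statsmodels wraps the Census Bureau's X-13ARIMA-SEATS program
# (the successor to X-12-ARIMA); the x13as executable must be installed and on
# the path (or passed via x12path=). Applying it to the aggregate PAYNSA series
# only approximates the BLS procedure, which adjusts components before summing.
from statsmodels.tsa.x13 import x13_arima_analysis

paynsa_series = paynsa["PAYNSA"].asfreq("MS")      # from the FRED sketch above
res = x13_arima_analysis(paynsa_series, log=True)  # multiplicative in levels = additive in logs
sa_x13 = res.seasadj                               # seasonally adjusted levels

print(sa_x13.loc["2011-12":"2012-01"])             # implied December and January levels
```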

Figure 3: Log nonfarm payroll employment, seasonally adjusted by BLS (black), seasonally adjusted using X-12 ARIMA applied to PAYNSA over 2006-2012 (dark red), applied to PAYNSA ex.-temporary Census workers (green). NBER defined recession dates shaded gray. Source: BLS January release via FRED, series PAYEMS and PAYNSA, respectively, NBER; X-12 ARIMA executed in EViews 7.

To see if these results were driven by the Census procedure, I used a generic seasonal adjustment procedure which estimates seasonal factors as deviations from a moving average (the seasonal factors are assumed to be additive, given I am working with the log series). I also estimate the seasonal factors on differing samples: 1967-2012M01, 1987-2012M01, 2007-2012M01, and the last of these using an n.s.a. series excluding temporary Census workers.
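A minimal sketch of that generic procedure (not EViews’ exact routine): statsmodels’ seasonal_decompose estimates additive seasonal factors in precisely this way, as monthly averages of deviations from a centered moving average. Applied to the log series, the factors are additive in logs; the sample can be restricted by slicing the series first (e.g., log_nsa.loc["2007":] for the 2007-2012 estimates).

```python
# Minimal sketch of the generic procedure described above: additive seasonal
# factors estimated as monthly averages of the deviations of the log series
# from a centered moving average, via statsmodels' seasonal_decompose.
import numpy as np
from statsmodels.tsa.seasonal import seasonal_decompose

log_nsa = np.log(paynsa_series)                  # log NSA payrolls, from the sketch above

decomp = seasonal_decompose(log_nsa, model="additive", period=12)
log_sa_generic = log_nsa - decomp.seasonal       # subtract the additive seasonal factors
sa_generic = np.exp(log_sa_generic)              # back to levels, in thousands

print(sa_generic.loc["2011-12":"2012-01"].diff())  # implied January change
```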

Figure 4: Log nonfarm payroll employment, seasonally adjusted by BLS (black), seasonally adjusted using difference from moving average over 1967-2012 period (dark red), over 1987-2012 period (green), over 2007-2012 period (purple), and over 2007-2012 period on series excluding temporary Census workers (orange), in 000’s, 2007M01-2012M01. NBER defined recession dates shaded gray. Source: BLS January release via FRED, series PAYEMS and PAYNSA, respectively. Additive seasonal executed in EViews 7.

As is obvious from Figures 3 and 4, I do not obtain drastically different results when I apply X-12 ARIMA to the aggregate n.s.a. series, or apply a generic adjustment procedure to a sample starting in 2007. In terms of changes in 2012M01, I find the BLS estimate of 243 thousand to be bracketed by 203 thousand (X-12 applied to PAYNSA) and 282.3 thousand (additive seasonal applied to ex.-Census PAYNSA).
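For a quick check along these lines, one can compare the implied December-to-January change across the series constructed in the sketches above (illustrative only; the exact numbers depend on the sample and the options chosen):

```python
# Quick check of the bracketing, reusing payems, sa_x13 and sa_generic from the
# sketches above (illustrative; exact figures depend on sample and options).
jan_change = {
    "BLS (PAYEMS)":            payems["PAYEMS"].diff().loc["2012-01-01"],
    "X-13 on PAYNSA":          sa_x13.diff().loc["2012-01-01"],
    "Additive seasonal (log)": sa_generic.diff().loc["2012-01-01"],
}
for name, chg in jan_change.items():
    print(f"{name}: {chg:+,.0f} thousand")
```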

Figure 5: First difference of log nonfarm payroll employment, seasonally adjusted by BLS (black), seasonally adjusted using X-12 ARIMA applied to PAYNSA over 2006-2012 (dark red), applied to PAYNSA ex.-temporary Census workers (green), seasonally adjusted using additive seasonal factors estimated over 2007-2012M01 period on PAYNSA (purple), on PAYNSA ex.-temporary Census workers. NBER defined recession dates shaded gray. Source: BLS January release via FRED, series PAYEMS and PAYNSA, respectively, NBER; X-12 ARIMA and seasonal adjustment executed in EViews 7.

Finally, as I illustrated in Figure 1 of this post, the household series adjusted to conform to the nonfarm payroll employment concept (the series conservative commentators had urged the BLS to construct) evidences a similar upward trend, and has recorded figures consistently higher than the official establishment series. This is something no one else has remarked upon, to my knowledge.

So, there might be some conspiracy in the bowels of the BLS, busily and deliberately churning out misleading employment data, as some conservative commentators suggest. But a little investigation leads me to doubt that thesis.

Related Articles

Analysis articles about employment

Analysis Blog articles by Menzie Chinn

The Problems With Employment Data Tracking Patterns by John Lounsbury, 5 April 2010

Unemployment: Keeping it Honest by John Lounsbury, Seeking Alpha, 7 August 2009

Are Unemployment Numbers Bogus? by John Lounsbury, Seeking Alpha, 4 June 2009

About the Author

Menzie Chinn is Professor of Public Affairs and Economics at the University of Wisconsin, Madison. His research examines the empirical and policy aspects of macroeconomic interactions between countries. Recent work focuses on how exchange rates behave and identifying the determinants of trade surpluses and deficits.
He is coauthor with Jeffry Frieden of Lost Decades: The Making of America’s Debt Crisis and the Long Recovery (W.W. Norton, 2011), and coauthor with Yin-Wong Cheung and Eiji Fujii of The Economic Integration of Greater China: Real and Financial Linkages and the Prospects for Currency Union (Hong Kong University Press, 2007).
In 2000-2001, Professor Chinn served as Senior Staff Economist for International Finance on the Council of Economic Advisers. He is a Research Associate in the International Finance and Macroeconomics Program of the National Bureau of Economic Research and on the advisory council of the Peterson Institute for International Economics. He has been visiting scholar at the International Monetary Fund, the Congressional Budget Office and the Federal Reserve Board. He is currently an associate editor of the Journal of Money, Credit and Banking. In 2011 he started a two-year term on the Congressional Budget Office’s Panel of Economic Advisers. His work has been cited in The Economist, Financial Times, Reuters, Wall Street Journal, Business Week, and he has been interviewed on CNBC, NPR, and Minnesota Public Radio.
