The National Journal
January 21, 2011
Taming unemployment starts with solving the mystery of the jobs that were supposed to have been created in the past 10 years but weren’t.
America’s jobs crisis began a decade ago. Long before the housing bubble burst and Wall Street melted down, something in our national job-creation machine went horribly wrong.
The years between the brief 2001 recession and the 2008 financial collapse gave us solid growth in our gross national product, soaring corporate profits, and a low unemployment rate—but job creation lagged stubbornly behind, more so than in any economic expansion since World War II.
The Great Recession wiped out what amounts to every U.S. job created in the 21st century. But even if the recession had never happened, if the economy had simply treaded water, the United States would have entered 2010 with 15 million fewer jobs than economists say it should have.
Somehow, rapid advancements in technology and the opening of new international markets paid dividends for American companies but not for American workers. An economy that long thrived on its dynamism, shedding jobs in outdated and less competitive industries and adding them in innovative new fields, fell stagnant in the swirls of the most globalized decade of commerce in human history.
Even now, no one really knows why.
This we do know: The U.S. economy created fewer and fewer jobs as the 2000s wore on. Turnover in the job market slowed as workers clung to the positions they held. Job destruction spiked in each of the decade’s two recessions. In contrast to the pattern of past recessions, when many employers recalled laid-off workers after growth picked up again, this time very few of those jobs came back.
These are the first clues—incomplete, disconcerting, and largely overlooked—to a critical mystery bedeviling a nation struggling to crawl out of near-double-digit unemployment. We know what should have transpired over the past 10 years: the completion of a circle of losses and gains from globalization. Emerging technology helped firms send jobs abroad or replace workers with machines; it should have also spawned domestic investment in innovative industries, companies, and jobs. That investment never happened—not nearly enough of it, in any case.
If we can’t figure out why, we may be doomed to a future that feels like a long jobless recovery, no matter how fast our economy grows. “It’s the trillion-dollar question,” says David E. Altig, senior vice president and research director for the Federal Reserve Bank of Atlanta, where economists are beginning to explore the shifts that have clubbed American workers like a blackjack. “Something big has happened. I really don’t think we have a complete story yet.”
THE LOST DECADE
We certainly didn’t see it coming. At the turn of the millennium, the Bureau of Labor Statistics predicted that the U.S. economy would create nearly 22 million net jobs in the 2000s, only slightly fewer than the boom 1990s yielded. The economists predicted “good opportunities for jobs” and “an optimistic vision for the U.S. economy” through 2010.
Businesses would reap the gains of new trading markets, the projection said, and continue to invest in technologies to boost the productivity of their operations. High-tech jobs would abound, both for systems analysts with four years of college and for computer-support analysts with associate’s degrees. The manufacturing sector would stop a decades-long jobs slide, and technology would lead the turnaround. Hundreds of thousands of newly hired factory workers would make cutting-edge electrical and communications products, including semiconductors, satellites, cable-television equipment, and “cellular phones, modems, and facsimile and answering machines.”
That turnaround never came; manufacturing kept shedding jobs throughout the decade. Politicians, particularly those in the Rust Belt, decried the losses. Hardly anyone, meanwhile, noticed the more damaging shortfall in the national jobs picture: Every major occupational group was running far behind the 2010 job-growth projections—often to the tune of 2 million jobs per group.
The forecasters had called for 22 million new jobs over the decade; at the decade’s economic peak, the actual count stood at only 7 million. Job growth in the 2000s was the lowest of any decade ever recorded by the federal government, stretching back to the 1940s. As a result, workers were extremely vulnerable to the tidal-wave recession that washed away all of the decade’s meager gains.
By their 2008 peak, U.S. payrolls had grown only about 5 percent from the start of the decade. In every previous decade since the Labor Department began tracking employment in the late 1930s, payrolls had grown by at least 20 percent.
The national population grew faster than the labor force; in 2008, about 63 percent of working-age Americans held a job, down from about 65 percent at the start of the decade, reversing decades of improvement in the employment-population ratio. Real middle-class incomes fell from 2000 to 2007—from a median of $58,500 to $56,500, another first in U.S. record-keeping.
It’s easy to see today why such alarming numbers went undetected for so long. The national unemployment rate stayed persistently low, between 4 and 6 percent, until the financial crash. Voters tend to associate the jobless rate with the strength of the economy. But the rate was low not because the economy was adding a lot of jobs, but because fewer people were joining the workforce—specifically, fewer women.
Female workers poured into the labor pool during World War II and steadily throughout the decades that followed. In the late 1990s, that influx leveled off, with about three in five women in the workforce. The plateau was a mathematical blessing for the unemployment rate, which measures the percentage of people in the labor force who want jobs but can’t find them. Once the number of women seeking work stopped climbing, the economy didn’t need to create as many new jobs to keep the jobless rate low.
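To make that arithmetic concrete, here is a minimal sketch, using purely hypothetical round numbers rather than actual Labor Department figures, of how a flat labor force can hold the jobless rate down even when hiring is weak:

    # A minimal sketch with hypothetical round numbers (not BLS data) of why a
    # flat labor force keeps the unemployment rate low even when hiring is weak.
    # Unemployment rate = (labor force - employed) / labor force.

    def unemployment_rate(labor_force, employed):
        """Share of the labor force that is looking for work but not employed."""
        return (labor_force - employed) / labor_force

    # Growing labor force: 2 million new entrants require 2 million new jobs
    # just to keep the rate near 5.3 percent.
    print(round(unemployment_rate(152.0, 144.0) * 100, 1))  # 5.3
    print(round(unemployment_rate(154.0, 146.0) * 100, 1))  # 5.2

    # Flat labor force: only half a million new jobs still nudges the rate down.
    print(round(unemployment_rate(152.0, 144.5) * 100, 1))  # 4.9

The point of the sketch is simply that the denominator matters: with fewer new entrants, a low unemployment rate can coexist with historically weak job creation.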
Blinded by low unemployment, lawmakers and economists overlooked two crucial warning signs of the nation’s deteriorating economic health. One was the percentage of working-age men—the traditional backbone of the U.S. labor force—who held a job. The other was the number of jobs being created each month. Throughout the 2000s, both numbers nose-dived.
A few researchers caught early warning signs of the trend. In 2003, economists Erica L. Groshen and Simon Potter at the Federal Reserve Bank of New York warned in a paper that “structural changes” in the economy appeared to be hindering job creation. Groshen and Potter noted that after the past two recessions, in 1990-91 and 2001, economic growth had picked up long before jobs began to reappear, bucking a long historical trend of growth and jobs returning in tandem. The explanation, Groshen and Potter said, was a shift away from the time-honored American tradition of laying off workers in bad times and recalling them when the clouds parted.
“Most of the jobs added during the recovery have been new positions in different firms and industries, not rehires,” they wrote. “In our view, this shift to new jobs largely explains why the payroll numbers have been so slow to rise: Creating jobs takes longer than recalling workers to their old positions and is riskier” when recovery still appears fragile.
In other words, American companies had adopted a more cold-blooded attitude toward recessions, one that fit the new model of globalization and automation. Technology made it easier to lay off your 100 least-effective workers and ship their jobs to India, or to replace them with a software program that made your remaining workforce dramatically more productive.
That theory would hold true in the next recession, too. But it raised a troubling question: Why didn’t the gains from all that cold-bloodedness offset the costs?
Here is how the evolving global economy is supposed to work: Mature economies with high living standards, such as the United States, ship some of their lower-skill jobs to developing countries where wages are lower. The costs of the outsourced goods and services go down, and the buying power of the developing countries goes up. American firms reap higher profits, which they invest in developing higher-value products that can’t be made elsewhere and sell them to increasingly flush consumers at home and abroad. Laid-off American workers find jobs in the innovative industries that result.
That story has almost entirely come true for corporate America, whose record profits spurred strong GDP growth throughout the 2000s, but not for workers. “A lot of people have been displaced due to technology and outsourcing,” says Mark Thoma, an economics professor at the University of Oregon who writes the popular Economist’s View blog. Those workers have often settled into worse jobs than the ones they lost, he adds, if they have found work at all. “That’s not really what’s supposed to happen.”
Thoma is one of a fleet of economists from top university research departments, regional Fed banks, think tanks, and the wonky economic blogosphere who were asked why U.S. job creation had stalled so spectacularly in the past decade. Liberals and free-market purists alike said, “Good question,” and almost to a person added some form of “I wish we knew the answer.”
Lawmakers have still barely touched the question—they are too focused on taxes, regulation, and government spending, policy areas that hardly any economist has suggested as explanations for our lost decade of job growth. Researchers are just starting to piece together the evidence, and no one can yet finger the culprit.