There’s no doubting that worldwide, kids are out of work. In the United States alone, the unemployment rate for 15- to 24-year-olds is about 16 percent, nearly twice the national average. In parts of Europe, the figures are much worse: Spain’s youth unemployment rate is a whopping 56 percent, representing about 900,000 people.
But do these high numbers represent a global labor market crisis that imperils future growth, as the headlines warn? Maybe not. Maybe instead, they’re evidence of a generation of college graduates determined not to settle, which bodes well for our future.
To understand why, it’s worth a quick detour through history. Until the early 20th century, there was no clear concept of “unemployment.” Modern economics emerged in the late 19th century, at a time when there was an ample supply of labor to feed the relentless maw of industrial production in both Europe and America. Because there was no social safety net, people worked to obtain essentials such as food, clothing and shelter. You had to work to survive, and there was always work to be done and a need for bodies to do it. Many believed that “unemployment” was an option only for vagrants, who were in turn viewed as immoral.
The Great Depression threw those views into question. Millions found themselves unable to find jobs, even when they wanted to. The Bureau of Labor Statistics began calculating an unemployment rate in the 1930s, and with it crafted a definition of what qualified as “the workforce” and of what it meant to be unemployed. A key aspect of the definition was not that you were “out of work” but rather that you were actively looking for a job, yet unable to find one. Unemployment, so defined, pointed to a flaw — either temporary and cyclical, or longer-lasting and structural — in the labor market and, by extension, in the economy as a whole.