The attached policy study was co-authored by Andrew Stern, former president of the Service Employees International Union and a senior fellow at Columbia University. It appeared first in the Winter 2017 edition of National Affairs.
To the graduating classes of 2017, the American workplace of 1957 would seem like a foreign world. Sixty years ago, only a little more than a third of all women went to work, the overwhelming majority of them in low-status careers. A majority of working men could likely be called “hardhats” of one sort or another. They worked in factory, farm, mining or service-industry jobs that often relied on physical strength.
Education and skill levels were far less important than they are today. Just under 42 percent of students finished high school, and fewer than one worker in 10 had a bachelor’s degree. This wasn’t really a problem because, outside the rarefied worlds of law, medicine, high finance and academia, few employers demanded one anyway. Outside of farms, the great majority of full-time jobs had predictable nine-to-five hours; many of those that didn’t were equally predictable shift work at manufacturing plants.
Larger employers managed careers, benefits and training. Most workers were expected, and sometimes required, to retire at 65. This wasn’t usually a problem, of course, since the average life expectancy for men was less than 67 years: Long, expensive retirements were neither common nor expected. While the still-new Social Security system provided a retirement safety net for the relative handful who had long retirements, other government benefits were decidedly modest; there was no Earned Income Tax Credit, no Supplemental Security Income (although some state-level programs served a similar purpose), no Medicare and no Medicaid. Many employees who had “health insurance” actually had hospitalization-only coverage or “mini-med” plans that few today would consider adequate. More than a quarter of the private-sector workforce belonged to labor unions, and, at least for white males, profitable union-heavy employers like General Motors provided ever-rising wages and expanding benefits to nearly everyone who worked there. There’s little doubt that the peak of union power correlated with the creation of a mass middle class, by far the largest in the world — albeit one that largely excluded female and nonwhite workers from its broad prosperity.
Two years later, President Dwight Eisenhower signed what’s arguably the last major addition to the corpus of law concerning unions, the Labor-Management Reporting and Disclosure Act. The law — which established secret-ballot elections for union offices, as well as various anti-corruption measures — was considered relatively minor in 1959. Few thought it would be the last word in labor relations. The other major labor laws — the National Labor Relations Act, the Taft-Hartley Act, and the Fair Labor Standards Act — were still of reasonably recent vintage, dating from the Franklin Roosevelt and Harry Truman administrations. At the time, nearly everyone likely thought further tweaks were inevitable. But they never came.
Indeed, while major laws like the Employee Retirement Income Security Act of 1974 and the Patient Protection and Affordable Care Act of 2010 have changed employee-benefit structures in myriad ways, the fundamental federal rules governing employer-worker relations were written for a different era: for industrial workers doing routine, often physically demanding work on set shift schedules.
The rules governing labor organizations remain just as stringent as they were in the heyday of unions, when laws closely regulated employees’ working conditions. Even in right-to-work states, where workers can opt out of paying union dues, workers in bargaining units are not allowed to negotiate on their own behalf, and unions must represent even those who do not pay dues. Good arguments can be made that it’s unfair to allow workers to gain the benefits of a union contract without paying for them — and equally unfair to force representation on workers who don’t want it. In states that aren’t right-to-work, some workers must pay union-agency fees or make alternative contributions, receiving representation they actively do not want; a good argument can be made that this, too, is unfair.
These arrangements may have been appropriate in the mid-20th century, but we need a discussion about whether they remain appropriate in the 21st-century economy. Indeed, it’s possible to argue both that current laws are too inflexible for those who own enterprises and too restrictive on those who try to organize workers to defend their own interests.
Unions clearly haven’t fared well over the past several decades. Even now, after eight years of an administration that openly supported unionization and with a clear set of National Labor Relations Board rulings that unions say are “pro-worker,” private-sector participation in organized labor stands at its lowest levels — below 7 percent — since the passage of the National Labor Relations Act in 1935. Even when one includes government employees, about a third of whom are covered by union contracts, only a little more than 11 percent of U.S. workers belong to organized labor, down from 28 percent at the high-water mark for unionization in 1954.
There’s little doubt that the decline in organized labor has correlated strongly with stagnation (at best) and decline (in many cases) in real wages paid to less-skilled workers and particularly to less-skilled men, as well as vastly lower labor-force participation rates for men. Of course, the decline of organized labor isn’t the only factor contributing to this trend, but it’s unquestionable that stronger unions correlated strongly with higher real wages for the working class. As former Federal Reserve chairman Ben Bernanke observed, “Whatever the precise mechanism through which lower rates of unionization affected the wage structure, the available research suggests that it can explain between 10 percent and 20 percent of the rise in wage inequality among men during the 1970s and 1980s.”
Even when Democrats generally sympathetic to unions held a 60-seat majority in the U.S. Senate, organized labor’s 2009-2010 push for the Employee Free Choice Act — which would have required employers to recognize a union if a majority of employees signed a union authorization card — came up short. (So-called “card check” is currently allowed only if employers consent to it, which most won’t do.) If private-sector unions want to persist, much less thrive, they’ll need to make significant changes to their strategy, their financial model and the law. The current path toward economic irrelevancy and terminal decline obviously isn’t good for organized labor itself, but it also correlates with a number of negative trends affecting less-skilled workers, particularly wage stagnation.
But if labor organizations perceive management as “winning,” few on the management or ownership side of the equation find the current situation copacetic either. Companies without unions, but with large, modestly skilled workforces, often establish special offices dedicated solely to resisting unionization efforts, at a cost of millions of dollars. Companies with innovative business models — like the sharing-economy companies Uber, Handy, Instacart and Lyft — have faced dozens of state inquiries and a number of lawsuits that call into question whether their labor practices make them “employers.” Questions about who constitutes an employer have also been raised about the entire franchise-based business model, with implications for everything from small home-based carpet-cleaning operations to massive hotel and restaurant chains.
Even conventional large employers would like more flexibility than they currently have to take on “project-based” workers and pursue new business models. While a few state-level proposals have moved forward to create a “third” category of worker — one who is neither an employee nor a contractor — the prospects for similar legislation at the federal level are dim, with little consensus about how to define this new category.
In short, regardless of which side of the labor-management divide one sits on, there’s good reason to be skeptical that national reforms are feasible or that they would change much even if they were enacted. Indeed, efforts to reform and update our federal labor laws to meet new realities have failed for more than a generation. It’s time for a new path, one that takes advantage of one of the most successful public-policy innovations of the past 50 years: waivers from federal law to allow state experimentation.
Such waivers are already allowed under a wide range of laws, including the Social Security Act, the Elementary and Secondary Education Act (2002’s No Child Left Behind Act expanded their use greatly), and the Affordable Care Act. A system of state waivers from major labor laws could similarly give every interest group a chance to try bold reforms the federal framework doesn’t currently allow. If properly structured, such waivers could facilitate experiments with new business and revenue models for labor organizations, provide new opportunities for entrepreneurs, create new jobs and expand prosperity. No one will like everything waivers might make possible, but everyone could find something to like. And in the end, American workers and employers could both be better off.