
Why Is It So Hard for Americans to Get a Decent Raise?

A new answer could change how we think about unions, monopolies, and the minimum wage.

If you were a delivery van driver searching for a new job any time between 2010 and 2013, chances are you wouldn’t have found many businesses competing for your services. In Selma, Alabama, there was, on average, just one company posting help wanted ads for those drivers on the nation’s biggest job board. In all of Orlando, Florida, there were about nine. Nationwide, the average was about two.

The situation for telemarketers wasn’t great either. In any given city or town, approximately three companies were trying to hire for their services. Accountants only had it a little better: Roughly four businesses were posting jobs for them.


Those numbers are based on the findings of a new research paper that may help unlock the mystery of why Americans can’t seem to get a decent raise. Economists have struggled over that question for years now, as wage growth has stagnated and more of the nation’s income has shifted from the pockets of workers into the bank accounts of business owners. Since 1979, inflation-adjusted hourly pay is up just 3.41 percent for the middle 20 percent of Americans, while labor’s overall share of national income has declined sharply since the early 2000s. There are lots of possible explanations, from long-term factors like the rise of automation and the decline of organized labor, to short-term ones, such as the lingering weakness in the job market left over from the Great Recession. But a recent study by a group of labor economists introduces an interesting theory into the mix: Workers’ pay may be lagging because the U.S. is suffering from a shortage of employers.

The paper—written by José Azar of IESE Business School at the University of Navarra, Ioana Marinescu of the University of Pennsylvania, and Marshall Steinbaum of the Roosevelt Institute—argues that, across different cities and different fields, hiring is concentrated among a relatively small number of businesses, which may have given managers the ability to keep wages lower than if there were more companies vying for talent. This is not the same as saying there are simply too many job hunters chasing too few openings—the paper, which is still in an early draft form, is designed to rule out that possibility. Instead, its authors argue that the labor market may be plagued by what economists call a monopsony problem, where a lack of competition among employers gives businesses outsize power over workers, including the ability to tamp down on pay. If the researchers are right, it could have important implications for how we think about antitrust, unions, and the minimum wage.

Monopsony is essentially monopoly’s quieter, less appreciated twin sibling. A monopolist can fix prices because it’s the only seller in the market. The one hospital in a sprawling rural county can charge insurers whatever it likes for emergency room services, for instance, because patients can’t go elsewhere. A monopsonist, on the other hand, can pay whatever it likes for labor or supplies, because it’s the only company buying or hiring. That remote hospital I just mentioned? It can probably get away with lowballing its nurses on salary, because nobody is out there trying to poach them.

You don’t have to look hard to tell that we live in a world where many employers have extraordinary leverage over their workers—just read about the grueling, erratic, computer-generated schedules low-wage workers are forced to navigate, or the widespread proliferation of noncompete agreements. And it’s clear that American industry has consolidated enormously over the decades. Years of mergers and the rise of exceedingly profitable superstars like Google and Facebook have concentrated economic power in fewer corporate boardrooms, and research suggests that America’s transformation into a life-size Monopoly board may be cutting into labor’s share of the economy.

But studying monopsony has traditionally been tricky for economists, because they lacked good data that would let them analyze broad trends specifically in labor market concentration. The new paper hops over that hurdle by using a trove of data from CareerBuilder.com, which publishes about one-third of all online job ads in the country. (Even for economists who are paid to worry about it, industry consolidation sometimes has its upsides.) The team looked at the number of companies advertising jobs in more than two dozen different occupations, from nurses to accountants to telemarketers, in each of the country’s different metro and nonmetro areas between 2010 and 2013. They then calculated local labor market concentration using the awkwardly named Herfindahl-Hirschman Index, or HHI, which antitrust regulators use to analyze the effects of mergers on competition.
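For readers curious about the mechanics: the HHI is just the sum of the squared market shares (expressed as percentages) of every firm in a market, so it runs from near zero, for a market with many tiny players, up to 10,000 for a pure monopoly or monopsony. Here is a minimal sketch in Python; the four-firm job-posting counts are hypothetical numbers for illustration, not figures from the paper.

```python
def hhi(postings):
    """Herfindahl-Hirschman Index from a list of per-firm counts.

    Each firm's share is taken as a percentage of the total,
    then squared and summed, per the standard antitrust formula.
    """
    total = sum(postings)
    return sum((100 * n / total) ** 2 for n in postings)

# Hypothetical local labor market: four employers posting
# 10, 5, 3, and 2 job ads for the same occupation.
score = hhi([10, 5, 3, 2])
print(round(score))  # 3450 -- above the 2,500 "highly concentrated" line

# A single employer doing all the hiring scores the maximum:
print(hhi([20]))  # 10000.0
```

Note that a market can clear the regulators’ 2,500 threshold even with several firms present, as above, so long as hiring is lopsided enough.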


What they found was a bit startling. The Department of Justice and Federal Trade Commission consider a market with an HHI score of 2,500 or more to be highly concentrated—if a merger between two wireless companies left that little competition for cell services, for instance, there’s a good chance the government’s lawyers would challenge it. In their paper, the authors find that America’s local labor markets had a whopping average HHI score of 3,157. Employers also tended to advertise lower pay in cities and towns where fewer businesses were posting jobs—suggesting that the lack of competition among companies was letting them suppress pay. According to one of their calculations, moving from the 25th percentile of labor market concentration to the 75th percentile would lower pay in a metro area by 17 percent.

The degree of concentration, and the effect on wages, tended to be worse in smaller towns than major cities. Places like Alpena, Michigan, and Butte, Montana, had the least competition among employers, while New York, Chicago, and Philadelphia had the most. It also varied by occupation. Equipment mechanics, legal secretaries, telemarketers, and those delivery drivers faced some of the most highly concentrated job markets; registered nurses, corporate salesmen, and customer service representatives had some of the least. But overall, the problem looks pervasive.

If the U.S. really does have the sort of widespread monopsony problem this paper documents, it would be one more important point in the constellation of reasons workers have fallen so far behind this century. It would also change the way we need to think about certain public policy issues.

Take the minimum wage. The classic argument against increasing the pay floor is that it will kill jobs by making hiring more costly than it’s worth. But in a monopsony-afflicted world where companies can artificially depress wages, a higher minimum shouldn’t hurt employment, because it will just force employers to pay workers more in line with the value they produce.


The same goes for collective bargaining. In the perfectly competitive labor markets of economics textbooks, labor unions are basically dead weight that make companies less efficient. In a world where a small clutch of businesses do most of the hiring, unions may actually fix a broken market by giving workers more sway.

Then there’s antitrust. Today, when regulators are evaluating a large merger, they tend to think about how it will affect the prices consumers pay. If two health insurers merge, will Americans end up paying higher premiums? If a wireless company eats its rival, will our cellphone bills shoot up? In principle, the government’s lawyers can also consider what corporate consolidation will do to workers, but that tends to be a back-burner issue. This paper’s findings suggest that Washington needs to think more carefully about how mergers can affect the job market, not just on the national level, but in specific cities and towns, where the marriage of two smaller companies could have a big local impact.

Of course, this is just one early study, and as with most economics research, there are questions to raise about its methods. It’s possible, for instance, that nurses or accountants are offered lower pay in cities where few companies are hiring because the economy isn’t very good there. That’s an especially big concern, since the paper draws its data from the early years of the post-recession recovery, when unemployment was still quite high. Its authors take various approaches to try to account for this, but they may not be foolproof. Harvard University labor economist Lawrence Katz told me that he suspected the findings about market concentration and wages were directionally correct but that they may be a bit “overstated,” because it’s simply hard to control for the health of the labor market.

“They are getting at what is an important and underexplored topic … using a creative approach of using really rich data,” he said. “I don’t know if I would take perfectly seriously the exact quantitative estimates.”


Still, even if the study is only gesturing in the direction of a real problem, it’s a deeply worrisome one. We’re living in an era of industry consolidation. That’s not going away in the foreseeable future. And workers can’t ask for fair pay if there aren’t enough businesses out there competing to hire.
