It’s no secret that the years since the Great Recession have been hard on American workers. Though the unemployment rate has finally dipped below six per cent, real wages for most workers have barely budged since 2007. Indeed, the whole century so far has been tough: wages haven’t grown much since 2000. So it was big news when, last month, Aetna’s C.E.O., Mark Bertolini, announced that the company’s lowest-paid workers would get a substantial raise—from twelve to sixteen dollars an hour, in some cases—as well as improved medical coverage. Bertolini didn’t stop there. He said that it was not “fair” for employees of a Fortune 50 company to be struggling to make ends meet. He explicitly linked the decision to the broader debate about inequality, mentioning that he had given copies of Thomas Piketty’s “Capital in the Twenty-first Century” to all his top executives. “Companies are not just money-making machines,” he told me last week. “For the good of the social order, these are the kinds of investments we should be willing to make.”
Such rhetoric harks back to an earlier era in U.S. labor relations. These days, most of the benefits of economic growth go to people at the top of the income ladder. But in the postwar era, in particular, the wage-setting process was shaped by norms of fairness and internal equity. These norms were bolstered by the strength of the U.S. labor movement, which emphasized the idea of the “living” or “family” wage—that someone doing a full day’s work should be paid enough to live on. But they were embraced by many in the business class, too. Economists are typically skeptical that these kinds of norms play any role in setting wages. If you want to know why wages grew fast in the nineteen-fifties, they would say, look to the economic boom and an American workforce that didn’t have to compete with foreign workers. But this is too narrow a view: the fact that the benefits of economic growth in the postwar era were widely shared had a lot to do with the assumption that companies were responsible not only to their shareholders but also to their workers. That’s why someone like Peter Drucker, the dean of management theorists, could argue that no company’s C.E.O. should be paid more than twenty times what its average employee earned.
That’s not to imply that there aren’t solid business reasons for paying workers more. A substantial body of research suggests that it can make sense to pay above-market wages—economists call them “efficiency wages.” If you pay people better, they are more likely to stay, which saves money; job turnover was costing Aetna a hundred and twenty million dollars a year. Better-paid employees tend to work harder, too. The most famous example in business history is Henry Ford’s decision, in 1914, to start paying his workers the then handsome sum of five dollars a day. Working on the Model T assembly line was an unpleasant job. Workers had been quitting in huge numbers or simply not showing up for work. Once Ford started paying better, job turnover and absenteeism plummeted, and productivity and profits rose.
Read the rest at NewYorker.com