Posts Tagged ‘labor’


It’s about time someone pointed out the obvious: “Bosses are dictators, and workers are their subjects.”

We generally don’t talk that way, of course. However, as Elizabeth Anderson [ht: ja] explains, contemporary workplaces are like private governments, in which employers have dictatorial powers over their workers—and workers have almost no say in how they are governed.

Like Louis XIV’s government, the typical American workplace is kept private from those it governs. Managers often conceal decisions of vital interest to their workers. Often, they don’t even give advance notice of firm closures and layoffs. They are free to sacrifice workers’ dignity in dominating and humiliating their subordinates. Most employer harassment of workers is perfectly legal, as long as bosses mete it out on an equal-opportunity basis. (Walmart and Amazon managers are notorious for berating and belittling their workers.) And workers have virtually no power to hold their bosses accountable for such abuses: They can’t fire their bosses, and can’t sue them for mistreatment except in a very narrow range of cases, mostly having to do with discrimination.

Dictatorship in the workplace—after workers are forced to freely sell their ability to work in the labor market—seems obvious to me and many other heterodox economists. But it’s certainly not obvious to mainstream economists, who, like their classical predecessors, continue to celebrate the freedom and mutual benefit of wage contracts and the efficiency of firms that are ruled by the representatives of the property owners.*

What is even more interesting, at least to me, is the way Anderson mentions the issue of time and then seems to let it slide.

Here’s how she begins her essay:

Consider some facts about how American employers control their workers. Amazon prohibits employees from exchanging casual remarks while on duty, calling this “time theft.” Apple inspects the personal belongings of its retail workers, some of whom lose up to a half-hour of unpaid time every day as they wait in line to be searched. Tyson prevents its poultry workers from using the bathroom. Some have been forced to urinate on themselves while their supervisors mock them.

But then Anderson, after mentioning “time theft,” moves on to the various ways employers exercise dictatorial control over their workers and forgets about time. But isn’t time what the employer-worker relationship is all about—the reason that employers act like dictators and workers are forced to surrender almost all their rights while they are working?

What is mostly absent from Anderson’s analysis is time, especially the distinction between necessary labor-time and surplus labor-time. During part of the workday, employees—whether at Walmart, GM, or Google—work for themselves, and thus receive a wage equal to the value of their ability to work. But they continue working and during those extra hours they aren’t working for themselves, but for their employers. That’s time that’s stolen from the workers, which forms the basis of their employers’ profits.
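The necessary/surplus split described above can be made concrete with a small sketch. All the numbers here are hypothetical, chosen only for illustration: if a worker’s daily wage equals the value produced in some fraction of the day, the remaining hours are surplus labor-time.

```python
# Hypothetical illustration of the necessary/surplus labor-time split.
# Every number below is invented for the example.

def split_workday(hours_worked, daily_wage, value_added_per_hour):
    """Return (necessary, surplus) hours: 'necessary' labor-time is the
    portion of the day in which the worker produces value equal to the
    wage; everything beyond that is surplus labor-time."""
    necessary = daily_wage / value_added_per_hour
    surplus = hours_worked - necessary
    return necessary, surplus

# An 8-hour day at a $120 daily wage, producing $30 of value per hour:
necessary, surplus = split_workday(8, 120, 30)
print(necessary, surplus)  # 4.0 hours necessary, 4.0 hours surplus
```

On these invented numbers, half the day replaces the wage and the other half is the “time theft” the post describes: hours worked for the employer rather than for the worker.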

So, the real “time theft” is not what workers do to their employers but exactly the opposite: what employers do to their workers—when, after necessary labor-time is completed, workers are forced to have the freedom to engage in surplus labor-time.

Thus, when Amazon workers exchange casual remarks while on duty, they’re cutting into the surplus labor-time due to their employers. The half-hour Apple workers wait in line to be searched, for which they are not paid, is a way of making sure that particular activity doesn’t cut into the surplus labor-time due to their employers. By the same token, when Tyson prevents its poultry workers from using the bathroom—such that some are forced to urinate on themselves—less of their time is spent working for themselves and more for their employers.

In other words, under conditions of workplace dictatorship, time is stolen from workers to benefit their employers.

Furthermore, because employers, and not workers, are the ones who appropriate the benefits of surplus labor-time, workers are put in the position of continuing to be forced to have the freedom to sell their ability to work and to submit to the dictates of their employers.

Thus, “time theft” is both a condition and consequence of the private dictatorship of employers in the workplace.

A whole book could in fact be written about this idea of “time theft,” inside and outside the workplace.

For example, inside the workplace, new technologies have the effect both of allowing time to slip out of employers’ grasp—as, for example, when workers appear to be working at their desks but, in fact, are surfing the internet or catching up with friends and family members on Facebook—and allowing employers to tighten their grip—especially when it permits control over the pace of work and new forms of surveillance. Technology seems to cut both ways when it comes to “time theft” in the workplace.

But “time theft” is also important outside the workplace. Consider, for example, the standardization of time—which robs many of us of local traditions of time—as well as the fact that there is a large and growing gap in life expectancy between those at the top and bottom of the economic scale—which means time is being stolen from the poor and distributed to the rich.

I could go on. The important point is that “time theft” is an ongoing problem of contemporary capitalism, both within the dictatorship of the workplace and in the seeming democracy of our lives outside of work.

It’s time someone wrote that book.


*In fact, Oliver Hart and Bengt Holmstrom were awarded the 2016 Nobel Prize in Economics for “proving” that capitalist firms (and not, e.g., worker-owned enterprises) represent the most efficient way to organize production.



New technologies—automation, robotics, artificial intelligence—have created a specter of mass unemployment. But, as critical as I am of existing economic institutions, I don’t see that as the issue, at least at the macro level. The real problem is the distribution of the value that is produced with the assistance of the new technologies—in short, the specter of growing inequality.

David Autor and Anna Salomons (pdf) are the latest to attempt to answer the question about technology and employment in their contribution to the recent ECB Forum on Central Banking. Their empirical work leads to the conclusion that while “industry-level employment robustly falls as industry productivity rises . . . country-level employment generally grows as aggregate productivity rises.”

To me, their results make sense. But for a different reason.


It is clear that, in many sectors—perhaps especially in manufacturing—the growth in output (the red line in the chart above) is due to the growth in labor productivity (the blue line) occasioned by the use of new technologies, which in turn has led to a decline in manufacturing employment (the green line).


But for the U.S. economy as a whole, especially since the end of the Great Recession, the opposite is true: the growth in hours worked has played a much more important role in explaining the growth of output than has the growth in labor productivity.

The fact is, increases in labor productivity—which stem at least in part from labor-saving technologies—have not, at least in recent years, led to massive unemployment. (The losses in jobs that have occurred are much more a cyclical phenomenon, due to the crash of 2007-08 and the long, uneven recovery.)

But that’s not because, as Autor and Salomons (and mainstream economists generally) would have it, there are “positive spillovers” of technological change to the rest of the economy. It’s because, under capitalism, workers are forced to have the freedom to sell their ability to work to employers. There’s no other choice. If workers are displaced from their jobs in one plant or sector, they can’t just remain unemployed. They have to find jobs elsewhere, often at lower wages than they earned before. That’s how capitalism works.

Much the same holds for workers who don’t lose their jobs but who, as new technologies are adopted by their employers, are deskilled and otherwise become appendages of the new machines. They can’t just quit. They remain on the job, even as their working conditions deteriorate and the value of their ability to work falls—and their employers’ profits rise.

What happens, in other words, is the gains from the new technologies that are adopted are distributed unevenly.


This is clear if we look at labor productivity for the economy as a whole (the blue line in the chart above) since the end of the Great Recession, which has increased by 7.5 percent. However, the wage share (the green line) has barely budged and is actually now lower than it was in 2009.


The results are even more dramatic over a long time frame—over periods when labor productivity was growing relatively quickly (from 1947 through the 1970s, and from 1980 until the most recent crash) and when productivity has been growing much more slowly (since 2009).

During the initial period (until 1980), labor productivity (the blue line in the chart) almost doubled while income shares—to the bottom 90 percent (the red line) and the top 1 percent (the green line)—remained relatively constant.

After 1980, however—during periods of first rapid and then slow growth in productivity—the situation changed dramatically: the share of income going to the bottom 90 percent declined, while the share captured by the top 1 percent soared. Even as new technologies were adopted across the economy, the vast majority of people were forced to find work, at stagnant or declining wages, while their employers and corporate executives captured a larger and larger share of the new value that was being created.

Autor and Salomons think they’ve arrived at a conclusion—concerning the “relative neutrality of productivity growth for aggregate labor demand”—that is optimistic.

The conclusions of my analysis are much more disconcerting. The broad sharing of the fruits of technological change, from the end of World War II to the late 1970s, was relatively short-lived. Since then, the conditions within which new technologies have been adopted have created a mass of increasingly desperate workers, who have either been forced to labor in more automated workplaces or have been displaced and thus forced to find employment elsewhere. In both cases, their share of income has declined while the share captured by a tiny group at the top has continued to rise. That’s the “new normal” (from 1980 onward) which looks a lot like the “old normal” of capitalist growth (prior to the first Great Depression), interrupted by a relatively short period (during the three postwar decades) that is becoming increasingly recognized as the exception.

Even more, I can make the case that things would be much better if the adoption of new technologies did in fact displace a large number of labor hours. Then, the decreasing amount of labor that needed to be performed could be spread among all workers, thus lessening the need for everyone to work as many hours as they do today.

But that would require a radically different set of economic institutions, one in which people were not forced to have the freedom to sell their ability to work to someone else. However, that’s not a world Autor and Salomons—or mainstream economists generally—can ever imagine, let alone work to create.



[Chart: capital shares]

Yesterday, I showed that conventional thinking about factor shares has been finally overturned: they are not necessarily constant, especially within existing economic institutions.

In fact, labor’s shares have been declining for decades now.

The opposite is true of capital’s shares: they’ve been rising for almost three decades.

The profit share of national income has, of course, a cyclical (short-term) component. It falls in the period preceding each recession, and begins to rise again during recessions. That’s how capitalism works.

But the profit share (illustrated by the blue line in the chart above, measured on the left side) also exhibits secular (longer-term) movements—and, since 1986 (when it reached a low of 7 percent), it has more than doubled (to a peak of 15.4 percent in 2006) and remains very high (at 13.6 percent in 2016).

Over that same period, the share of income captured by individuals at the top—the top 10 percent and, a smaller group, the top 1 percent (in the red and green lines, respectively, measured on the right)—who receive distributions of the surplus, also increased dramatically. The share of income of the top 10 percent rose by 29.7 percent (from 36.4 to 47.2 percent of total factor income) and of the top 1 percent by even more, 40.2 percent (from 25.4 to 35.6 percent).
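The percentage changes quoted above can be checked in a few lines; all inputs are the shares stated in the text, in percent of total factor income.

```python
# Verify the percent changes quoted above (inputs are the shares
# stated in the text, in percent of total factor income).

def pct_change(start, end):
    return round((end - start) / start * 100, 1)

print(pct_change(7.0, 15.4))   # profit share: 120.0 (more than doubled)
print(pct_change(36.4, 47.2))  # top 10 percent share: 29.7
print(pct_change(25.4, 35.6))  # top 1 percent share: 40.2
```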

To expand the conclusion I reached yesterday: under existing economic institutions, factor shares do in fact change—and they’ve been turning against labor (beginning in the mid-1970s) and in favor of capital (since the mid-1980s) for decades now.

That’s a fundamental change in the class nature of the U.S. economy that needs to be reflected in economists’ theoretical models—which also needs to be corrected in reality, by radically transforming existing economic institutions.

[Chart: labor shares]

When I first began studying economics, the conventional wisdom was that “factor shares”—the shares of national income paid to labor and capital—were relatively constant.

So, there really was no need to worry about the problem of inequality. Poverty, maybe, but not the gap between wages and profits.

Now, of course, all of that has changed. Not only is there increasing recognition that the labor share has changed; it’s been declining for more than four decades.

Even Stephen Cecchetti and Kim Schoenholtz, the authors of the textbook Money, Banking and Financial Markets, have acknowledged that

For at least the past 15 years, and possibly for several decades, labor’s share of national income has been declining and capital’s share has been rising in most advanced and many emerging economies.

Thus, for example, the labor share of national income in the United States has fallen by about 12 percent from 1970 to 2014 (as indicated by the index scale on the left side of the chart above).

But, as it turns out, that’s only part of the story. The share of national income going to workers has declined by even more than that.

There are two main reasons why the “labor share” doesn’t give an accurate picture of the “workers’ share” of national income. First, as Michael D. Giandrea and Shawn A. Sprague explain, the labor share (as calculated by the Bureau of Labor Statistics) includes both employee compensation and the labor compensation of proprietors (and thus a portion, minus the capital share, of the income going to proprietors). Second, the labor share does not account for inequality between the different groups who receive what is officially measured as labor compensation:

the compensation of a highly paid CEO and a low-wage worker would both be included in the labor share.

So, in order to get an accurate picture of workers’ share of national income, we need to turn to other data.

What I’ve done in the chart above is measure (on the right side of the chart) the shares of income going to the bottom 90 percent and the bottom 50 percent of Americans. And, not surprisingly, the declines are even more dramatic: 20 percent for the bottom 90 percent (falling from 66 percent of total factor income in 1970 to 53 percent in 2014) and even more, 45.8 percent, for the bottom 50 percent (from 19 to 10.3 percent between 1970 and 2014). Those are the shares actual workers—not proprietors or CEOs—take home.
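As a quick check of the declines just cited (inputs are the 1970 and 2014 shares from the text, in percent of total factor income):

```python
# Verify the declines quoted above (shares in percent of total
# factor income, 1970 vs. 2014, as stated in the text).

def pct_decline(start, end):
    return round((start - end) / start * 100, 1)

print(pct_decline(66.0, 53.0))  # bottom 90 percent: 19.7 (~20 percent)
print(pct_decline(19.0, 10.3))  # bottom 50 percent: 45.8
```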

Finally, the conventional wisdom has begun to change. Under existing economic institutions, factor shares do in fact change—and they’ve been turning against labor for decades now.

The bottom line, though, is that the situation of workers is even worse than what is indicated by the declining labor share. The workers’ share has fallen even more dramatically in recent decades.

It’s time, then, for the old models—the old theoretical models as well as the models for organizing the economy—to be thrown out and replaced in order to create an economy that actually works for American workers.



First, it was conspicuous consumption. Then, it was conspicuous philanthropy. Now, apparently, it’s conspicuous productivity.

According to Ben Tarnoff,

the acquisition of insanely expensive commodities isn’t the only way that modern elites project power. More recently, another form of status display has emerged. In the new Gilded Age, identifying oneself as a member of the ruling class doesn’t just require conspicuous consumption. It requires conspicuous production.

If conspicuous consumption involves the worship of luxury, conspicuous production involves the worship of labor. It isn’t about how much you spend. It’s about how hard you work.

And that makes a lot of sense, for at least two reasons. First, CEO salaries in the United States continue to be much higher than average workers’ pay—276 times as much in 2015. CEOs need to publicize the long hours they work in order to attempt to justify the large gap between what they take home and what they pay their workers. As Tarnoff explains, “In an era of extreme inequality, elites need to demonstrate to themselves and others that they deserve to own orders of magnitude more wealth than everyone else.”


The problem, of course, is that many American workers are working long hours these days. According to the Bureau of Labor Statistics, in 2015, employed persons ages 25 to 54 who lived in households with children under 18 spent an average of 8.8 hours working or in work-related activities, with the rest of the day devoted to sleeping (7.8 hours), leisure and sports activities (2.6 hours), caring for others, including children (1.2 hours), and other activities.


And, on a weekly basis (taking into account public holidays, annual leaves, and so on), U.S. workers put in almost 25 percent more hours—or about an hour more per workday—than Europeans.


The other reason why conspicuous productivity matters is that, in comparison to the First Gilded Age (when Thorstein Veblen coined the term conspicuous consumption), a larger share of the surplus captured by the top 1 percent takes the form of labor income during the Second Gilded Age. They get—and, so the argument goes, deserve—that large and growing share because they work long hours.

The problem, of course, as I showed the other day, is that the composition of income has changed since 2000. Since then, the capital share of their income has bounced back. Thus, the “working rich” of the late-twentieth century are increasingly living off their capital income, or are in the process of being replaced by their offspring who are living off their inheritances.

This was my conclusion:

It looks then as if those at the top have either turned into or been replaced by rentiers, thus joining the existing owners of capital at the very top—thereby mirroring, after a short interruption, the structure of inequality last seen during the first Gilded Age.

That’s perhaps why conspicuous productivity was invented. Increasingly, those at the top are able to capture a large share of the surplus not because they do, but because they own. But if they can hide that by boasting about the long hours they work, they can attempt to defend their class power.

Or so they hope. . .


Who’s running away with the surplus, those at the top or those at the very top?

In a new study on “income inequality in the 21st century,” Fatih Guvenen and Greg Kaplan note that recent increases in inequality in the United States need to be understood in terms of trends of and, especially, within the top 1 percent. That’s particularly true when, instead of using Social Security data (which capture labor income), they turn to Internal Revenue Service data (which capture all forms of income).

While I agree with Guvenen and Kaplan that historically there have been significant differences between the incomes of the top 1 percent and the top 0.1 percent—those at the top and those at the very top—in my view, they tend to exaggerate the differences and lose sight of the fact that the two groups have become one.

Clearly, as can be seen in the chart above (based on data from Thomas Piketty, Emmanuel Saez, and Gabriel Zucman), the average income of those in the top tenth of one percent has risen much more than that of the top one percent. From 1979 to 2014, the average income of those at the very top has risen 277 percent compared to an increase of 183 percent for those at the top. But, of course, the average incomes of both groups have soared compared to that of the bottom 90 percent, which has increased only 27 percent over the same period.

And while they’re right that the rise in capital income, much more than labor income, helps explain the rising share of income of those at the very top, especially in recent decades, the fact is that both groups—whether in the form of labor or capital income—have managed to capture a rising share of the surplus.

Where do those incomes come from?

The following two charts illustrate the composition of incomes of the top 1 percent and top 0.1 percent, respectively.



One way of making sense of the way those at the top and those at the very top manage to capture a portion of the surplus is by distinguishing between a labor component (in various shades of blue in both charts) and a capital component (in shades of green). When added together, the two components represent the total share of national income that goes to the top 1 percent (which rose from 11.1 to 20.2 percent) and the top 0.1 percent (which rose from 3.9 to 9.3 percent) between 1979 and 2014.
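One derived figure (my arithmetic, not a number reported in the study): the shares just quoted imply that the top 0.1 percent accounts for roughly three-fifths of the top 1 percent’s gain over the period.

```python
# Shares of national income (percent) quoted above, 1979 vs. 2014.
top1  = {1979: 11.1, 2014: 20.2}   # top 1 percent
top01 = {1979: 3.9, 2014: 9.3}     # top 0.1 percent

gain_top1  = top1[2014] - top1[1979]    # 9.1 percentage points
gain_top01 = top01[2014] - top01[1979]  # 5.4 percentage points

# Portion of the top 1 percent's gain captured by the top 0.1 percent:
print(round(gain_top01 / gain_top1 * 100))  # 59 (percent)
```

That proportion is one way of seeing why, as argued above, the two groups are better treated as one.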

The labor component comprises two categories: employee compensation (e.g., payments to CEOs and executives in finance) and the labor part of noncorporate business profits (e.g., partnerships and sole proprietorships). Capital income can be similarly decomposed into various categories: interest paid to pension and insurance funds, net interest, corporate profits, noncorporate profits, and housing rents (net of mortgages).

As can be seen in the top chart above, by 2014 the top 1 percent derived over half of their incomes from capital-related sources. In earlier decades, from the late-1970s to the late-1990s, a much larger share of their income came from labor sources. They were the so-called “working rich.” This process culminated in 2000 when the capital share in top 1 percent incomes reached a low point of 49.4 percent. Since then, however, it has bounced back—to 58.6 percent in 2014. Thus, the “working rich” of the late-twentieth century are increasingly living off their capital income, or are in the process of being replaced by their offspring who are living off their inheritances.

Much the same trend, in an even more exaggerated fashion, is true of those at the very top, the top 0.1 percent (in the lower chart). More than half of their income has always come from capital-related sources. They were never the “working rich”; they were always for the most part “coupon clippers.” The share of their income from capital-related sources was already 60 percent in 1979 and continued to grow (to 63 percent) by 2014.

What this means, in general terms, is that the growth of inequality over recent decades is due to the ability of those at the top and those at the very top to capture a large portion of the growing surplus. But there has also been a change in the nature of that inequality in recent years, at least for those at the top—which is not due to escalating wage inequality, but to a boom in income from the ownership of stocks and bonds. They’ve now joined the ranks of the “coupon clippers,” who are able to use their accumulated wealth to get their share of the surplus.

It looks then as if those at the top have either turned into or been replaced by rentiers, thus joining the existing owners of capital at the very top—thereby mirroring, after a short interruption, the structure of inequality last seen during the first Gilded Age.