Posts Tagged ‘workers’

One of the biggest crime waves in America is not robbery. It is, as Jeff Spross [ht: sm] explains, wage theft.

In dollar terms, what group of Americans steals the most from their fellow citizens each year?

The answer might surprise you: It’s employers, many of whom are committing what’s known as wage theft. It’s not just about underpaying workers. They’re not paying workers what they’re legally owed for the labor they put in.

It takes different forms: not paying workers the federal, state, or local minimum wage; not paying them overtime; or just monkeying around with job titles to avoid regulations.

No one knows exactly how big a problem wage theft is, but in 2012 federal and state agencies recovered $933 million for victims of wage theft. By comparison, all the property taken in all the robberies of all types in 2012, solved or unsolved, amounted to a little under $341 million.

Remember, that $933 million is just the wage theft that’s been addressed by authorities. The full scale of the problem is likely monumentally larger: Research suggests American workers are getting screwed out of $20 billion to $50 billion annually.

Actually, employers steal from workers in at least two different ways: when they don’t pay them what they’re legally owed, and even when they do. In the former case, the laws and enforcement are weak—but at least prosecutors and labor groups are getting more aggressive about pursuing wage theft. Maybe, then, workers will be able to recover the back pay they’re owed and employers, instead of just paying small fines when they’re caught, might actually go to jail.

The latter case, the theft that occurs even when workers are paid what they’re legally owed, is more difficult to address, at least within existing economic institutions. That’s because, under the rules of capitalism, workers receive a wage (which, at least under certain circumstances, equals the value of their labor power). But then, beyond the labor-market exchange, when workers start to produce, they create value equal not only to their wages but also to an additional amount, a surplus. Even when workers receive their legally mandated wages, that extra or surplus-value is appropriated by their employers. It’s legal and, within the ethical code of capitalism, “fair.”
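
In symbols, and purely as a schematic restatement of the paragraph above (the letters are my own shorthand, not anything drawn from the sources cited here):

```latex
% Schematic decomposition of the value created during the working period.
% The symbols are illustrative shorthand for the argument above.
\[
  W \;=\; \underbrace{v}_{\text{wages (value of labor power)}}
     \;+\; \underbrace{s}_{\text{surplus appropriated by the employer}},
  \qquad s > 0
\]
```

Here W stands for the total value the worker creates over the period. Illegal wage theft pushes v below its legally mandated level; the second, perfectly legal form of “theft” is s itself, which remains even when v is paid in full.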

So, within contemporary capitalism, we should be aware of two kinds of wage theft, both committed by employers: the theft of legally mandated wages and the theft that occurs even when workers receive their legally mandated wages.

The first is a case of individual theft, the second a social theft. Both, it seems, are countenanced within contemporary capitalism—and workers are made to suffer as a result.

As I’ve been discussing over the course of the past week, the U.S. healthcare system is a nightmare, at least for workers and their families. It costs more and provides less than in other countries. Employees are being forced to pay higher and higher fees (in the form of premiums, deductibles, and other charges). And it relies on a private health-insurance industry, which is increasingly concentrated and profitable.

What about the other part of the system, the actual provision of health care? I’m thinking, in particular, of the pharmaceutical industry (which I focus on in this post) and hospitals (which I’ll take up in a future post).

According to a recent study by the Wall Street Journal, consumers in the United States nearly always pay more for branded drugs than their counterparts in England (39 of the drugs compared were priced higher in the United States and 1 lower), Norway (37 higher and 3 lower), and Ontario, Canada (28 higher and 2 lower). For example, Lucentis (used to treat wet age-related macular degeneration and other conditions) costs $1,936 in the United States but only $894 in Norway, $1,159 in England, and $1,254 in Ontario. The same is true of many other drugs, from Abraxane (for treating cancer) to Yervoy (for treating skin cancer).

Part of the reason is that, in other countries, public healthcare systems have substantial negotiating power and are able to bargain with pharmaceutical companies for lower prices (or, in the case of Canada’s federal regulatory body, to set maximum prices). The U.S. market, however, “is highly fragmented, with bill payers ranging from employers to insurance companies to federal and state governments.” In particular, Medicare, the largest single U.S. payer for prescription drugs, is legally prohibited from negotiating drug prices.

On the other side of the market, the U.S. pharmaceutical industry has become increasingly concentrated through a wave of numerous and increasingly large merger-and-acquisition deals. According to Capgemini Consulting, since 2010, approximately 200 pharmaceutical and biotech deals have taken place per year in the United States. 2014 saw several of the largest deals in the pharmaceutical industry to date, including the $66-billion purchase of Allergan by Actavis, Merck’s sale of its consumer health unit to Bayer, GSK and Novartis’s multibillion-dollar asset swap, and Novartis’s sale of its animal health unit to Eli Lilly.

Although high-profile, major acquisitions outweigh other deals by value, over 90 percent of deals were relatively small in size (less than $5 billion). Clearly, the motivation in these smaller deals is different.

Failure of bigger pharmaceutical companies to consistently develop new drugs and pressure from shareholders to deliver returns have forced large pharmaceutical companies to look outside for innovative drugs. This has resulted in new drug approvals emerging as a major trigger for acquisitions.

The fragmented, unregulated system of drug purchases in the United States, combined with growing concentration of the pharmaceutical industry, means that health technology—with a 20.9 percent net profit margin—is now the most profitable industry in the country.

High drug prices are one of the key factors behind rising U.S. healthcare costs, and one of the main reasons why American workers pay more and yet receive poorer healthcare than in other rich countries.

Addendum

As if to confirm my analysis of the role of the pharmaceutical industry in creating a nightmarish U.S. healthcare system, we now have the examples of the EpiPen and Pfizer.

As Aaron E. Carroll explains, the story of EpiPens is not just about how expensive they’ve become; it also reveals “so much of what’s wrong with our health care system.”

Epinephrine isn’t an elective medication. It doesn’t last, so people need to purchase the drug repeatedly. There’s little competition, but there are huge hurdles to enter the market, so a company can raise the price again and again with little pushback. The government encourages the product’s use, but makes no effort to control its cost. Insurance coverage shields some from the expense, allowing higher prices, but leaves those most at-risk most exposed to extreme out-of-pocket outlays. The poor are the most likely to consider going without because they can’t afford it.

EpiPens are a perfect example of a health care nightmare. They’re also just a typical example of the dysfunction of the American health care system.

And then we have Pfizer’s purchase of part of AstraZeneca’s antibiotics business, which doesn’t involve the development of any new drugs. For $550 million upfront, plus an unconditional $175 million in January 2019 and possibly a further $850 million plus royalties, Pfizer will gain the right to sell three approved antibiotics and two drugs in clinical trials in most markets outside the U.S. and Canada, plus an additional drug (Merrem) in North America.

It took two and a half years but, on the basis of yesterday’s ruling by the National Labor Relations Board (pdf), research and teaching assistants at Columbia University now have the right to form a union (as GWC-UAW Local 2110).

It comes as no surprise that Columbia’s administration opposed the ruling:

The university said in a statement Tuesday that it’s reviewing the ruling, but that it “disagrees with this outcome because we believe the academic relationship students have with faculty members and departments as part of their studies is not the same as between employer and employee.”

First and foremost, Columbia said, “students serving as research or teaching assistants come to Columbia to gain knowledge and expertise, and we believe there are legitimate concerns about the impact of involving a nonacademic third party in this scholarly training.”

And the consequences of the NLRB ruling extend far beyond Columbia:

NPR’s Yuki Noguchi reports that “only a small fraction of graduate students at public universities are currently represented by unions — but the decision governing private university students is expected to lead to unionization efforts that could organize tens of thousands more.”

The NLRB had long held that students who teach or research at a private university were not employees covered under the National Labor Relations Act, Yuki reports. That changed in 2000, when the board decided a case in favor of students, and changed again with another ruling four years later. Now the NLRB has reversed itself yet again.

In Tuesday’s decision, the board majority wrote that the 2004 ruling “deprived an entire category of workers of the protections of the Act, without a convincing justification in either the statutory language or the policies of the Act.”

Not surprisingly, Yale (where graduate-student employees have been attempting to organize their own union for 25 years) echoed Columbia’s response:

Peter Salovey, president of Yale, said in a separate statement that the “mentorship and training that Yale professors provide to graduate students is essential to educating the next generation of leading scholars” and that he’d “long been concerned that this relationship would become less productive and rewarding under a formal collective bargaining regime, in which professors would be ‘supervisors’ of their graduate student ‘employees.’”

But the American Association of University Professors, which argued in an amicus brief in the Columbia case that collective bargaining can improve graduate students’ academic freedom, applauded the NLRB decision.

“This is a tremendous victory for student workers, and the AAUP stands ready to work with graduate employees to defend their rights, including rights to academic freedom and shared governance participation,” Howard Bunsis, chair of the association’s Collective Bargaining Congress and a professor of accounting at Eastern Michigan University, said in a statement. “Graduate employees deserve a seat at the table and a voice in higher education.”

On Tuesday, I began a series on the unhealthy state of the U.S. healthcare system—starting with the fact that the United States spends far more on health than any other country, yet the life expectancy of the American population is actually shorter than in other countries that spend far less.

Today, I want to look at what U.S. workers are forced to pay to get access to the healthcare system.

According to the Kaiser Family Foundation, about half of the non-elderly population—147 million people in total—are covered by employer-sponsored insurance programs.* The average annual single coverage premium in 2015 was $6,251 and the average family coverage premium was $17,545. Each rose 4 percent over the 2014 average premiums. During the same period, workers’ wages increased only 1.9 percent while prices declined by 0.2 percent.

But the gap is even larger when looked at over the long run. Between 1999 and 2015, workers’ contributions to premiums increased by a whopping 221 percent, even more than the growth in health insurance premiums (203 percent), and far outpacing both inflation (42 percent) and workers’ earnings (56 percent).

Most covered workers face additional out-of-pocket costs when they use health care services. Eighty-one percent of covered workers have a general annual deductible for single coverage that must be met before most services are paid for by the plan.** Since 2010, there has also been a sharp increase in both the percentage of workers on health plans with deductibles and the size of those deductibles. The result has been a 67-percent rise in deductibles (for single coverage) since 2010, far outpacing not only the 24-percent growth in premiums, but also the 10-percent growth in workers’ wages and the 9-percent rise in inflation.
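
As a rough illustration of how far these increases outpace pay, here is a back-of-the-envelope calculation using the growth figures quoted in the last two paragraphs. The “real” adjustment simply nets each cumulative increase against cumulative inflation by compounding; it is my own simplification, not the Kaiser Family Foundation’s methodology.

```python
# Back-of-the-envelope comparison of the cumulative growth figures cited above
# (Kaiser Family Foundation numbers as reported in this post). The "real"
# column nets each cumulative increase against cumulative inflation by
# compounding; it is an illustration, not KFF's own methodology.

def real_growth(nominal_pct, inflation_pct):
    """Cumulative growth net of inflation, treating both as total (compounded) changes."""
    return ((1 + nominal_pct / 100) / (1 + inflation_pct / 100) - 1) * 100

# 1999-2015 (cumulative inflation over the period: 42 percent)
for label, pct in [("worker premium contributions", 221),
                   ("health insurance premiums", 203),
                   ("workers' earnings", 56)]:
    print(f"1999-2015 {label}: +{pct}% nominal, {real_growth(pct, 42):+.0f}% real")

# 2010-2015 (cumulative inflation over the period: 9 percent)
for label, pct in [("deductibles, single coverage", 67),
                   ("premiums", 24),
                   ("workers' wages", 10)]:
    print(f"2010-2015 {label}: +{pct}% nominal, {real_growth(pct, 9):+.0f}% real")
```

On these figures, workers’ pay has barely moved in real terms, while premiums, premium contributions, and deductibles have grown by double or triple digits.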

In recent years, the increase in U.S. health costs has in fact slowed down. But the slowdown has been invisible to American workers, who have been forced to pay much higher premiums and deductibles in order to get access to healthcare for themselves and their families.

 

*Fifty-seven percent of firms offer health benefits to at least some of their employees, covering about 63 percent of workers at those firms.

**Even workers without a general annual deductible often face other types of cost sharing when they use services, such as copayments or coinsurance for office visits and hospitalizations, and when they purchase prescription drugs.

Everyone knows wealth in the United States is distributed even more unequally than the nation’s income (and that’s saying something).

For example, according to a new report from the Congressional Budget Office [ht: ja],

In 2013, families in the top 10 percent of the wealth distribution held 76 percent of all family wealth, families in the 51st to the 90th percentiles held 23 percent, and those in the bottom half of the distribution held 1 percent. Average wealth was about $4 million for families in the top 10 percent of the wealth distribution, $316,000 for families in the 51st to 90th percentiles, and $36,000 for families in the 26th to 50th percentiles. On average, families at or below the 25th percentile were $13,000 in debt.

But, wait, it gets worse. The distribution of wealth among the nation’s families was more unequal in 2013 than it was in 1989. For instance, the difference between the wealth of families at the 90th percentile and the wealth of those in the middle widened from $532,000 to $861,000 over the period (both in 2013 dollars). The share of wealth held by families in the top 10 percent of the wealth distribution increased from 67 percent to 76 percent, whereas the share held by families in the bottom half of the distribution declined from 3 percent to 1 percent.*

Yes, that’s right: in 2013, the bottom half of U.S. families held only 1 percent of the nation’s wealth.

And it gets even worse: the average wealth of families in the bottom half of the distribution was lower in 2013 than in 1989, declining by 19 percent (in contrast to the 153-percent increase for families in the top 10 percent). And the average wealth of families in the bottom quarter was thousands of dollars less in 2013 than it was in 1989.**

So, let’s get this straight. The share of wealth going to the top 10 percent of households, already high, actually increased between 1989 and 2013. And the share held by the bottom 50 percent, already tiny, fell. And, finally, the average wealth for families in the bottom half of the distribution was less in 2013 than in 1989 and many more of them were in debt.

Now, to put things in perspective, the United States had Democratic presidents (Bill Clinton and Barack Obama) during thirteen of the twenty-four years when workers and the poor were being fleeced.

And now they’re being asked to vote for one more Democrat, with the same economic program, because it will “make history”?

 

*To be clear, a large portion of the decline in wealth for the bottom 50 percent occurred after the crash. Still, compared with families in the top half of the distribution, families in the bottom half experienced disproportionately slower growth in wealth between 1989 and 2007, and they had a disproportionately larger decline in wealth after the 2007-09 recession.

**In 1989, families at or below the 25th percentile were about $1,000 in debt. By 2013, they were about $13,000 in debt, on average. Overall indebtedness also increased during the same period: by 2013, 12 percent of families had more debt than assets, and they were, on average, $32,000 in debt.

We all know that some large portion of workers and their jobs are threatened—now and in the future—by new technologies. That’s why I’ve been increasingly writing about the conditions and consequences of robots and automation.

According to McKinsey, “currently demonstrated technologies could automate 45 percent of the activities people are paid to perform and that about 60 percent of all occupations could see 30 percent or more of their constituent activities automated, again with technologies available today.” In their view, it’s not so much that whole occupations will be eliminated in the foreseeable future, but that automation will affect a great many jobs and activities within those jobs.

It means, under existing economic arrangements, automation will occur where it is technically feasible and financially profitable—and, where it does occur, workers will increasingly become appendages of machines.

The prime candidate for automation is what the authors of the report refer to as “predictable physical work,” that is, performing physical activities or operating machinery in a predictable environment. This includes many manufacturing activities but, as it turns out, it’s also true in the service sector. In fact, according to McKinsey, the most readily automatable sector in the U.S. economy is “accommodations and food service.”

But that’s only on technical grounds. Since the prevailing wage in that sector is very low, many of the workers’ activities may not in fact be automated based on cost considerations. In that case, it’s the threat of automation that will most affect workers and their jobs.

But there are many other activities and sectors that might be automated with the existing technologies (including software). These include manufacturing (where “performing physical activities or operating machinery in a predictable environment represents one-third of the workers’ overall time”), retailing (including “packaging objects for shipping and stocking merchandise” as well as “maintaining records of sales, gathering customer or product information, and other data-collection activities”), financial service and insurance (in which “50 percent of the overall time of the workforce. . .is devoted to collecting and processing data, where the technical potential for automation is high”), and so on.

And the two sectors with the lowest technical feasibility for automation? Healthcare and education. But, even in those sectors, a large number of activities are susceptible to being automated—from food preparation to data collection—at least on technical grounds.

So, what’s going to happen with workers and their jobs as automation moves forward (and as new technologies, such as machine learning, are imagined and devised)?

From what we know about the past, the actual history of technology and capitalism, new forms of automation will be invented and made technically feasible where their production is financially profitable, and they will become profitable when it’s possible for one enterprise to use automation to outcompete other enterprises (based on a wide range of factors, from lowering production costs to improving the quality of output) in order to secure higher profits.

And, as in the past, the effects on workers will simply be ignored by their employers. Some of their employees will lose their jobs (as they are replaced by robots and digital technologies); for others, their jobs will be fundamentally transformed (e.g., as their work is surveilled by machines and as they become appendages of the automated processes and technologies). Blue-collar workers already know this. White-collar workers are quickly discovering how and why it might happen to them. In both cases, a changing combination of actual automation and the threat of automation is making their work and their livelihoods less and less secure.

But, according to Steven Pearlstein, workers have no need to worry. The invisible hand will take care of them.

The winners from job-destroying technology hire more gardeners, housekeepers and day-care workers. They take more vacations and eat at more restaurants. They buy more cars and boats and bigger houses. They engage the services of more auto mechanics and personal trainers, psychologists and orthopedic surgeons.

Sure, Pearlstein admits, it may take “years, even decades” for the necessary adjustments to occur. But “People who lose their jobs must have the willingness and wherewithal to find new opportunities, learn new skills, move to new cities.”

Whether or not workers take it upon themselves to adjust to the “creative destruction” Pearlstein and mainstream economists celebrate, it is still the case that the decisions about automation will be taken by their employers, not workers themselves. And the benefits, as always, will be appropriated by the small group at the top, not the mass of employees at the bottom.

Perhaps the only hope for workers—until they are able to change the existing economic institutions—is to imagine a process whereby the tasks their employers currently assign to themselves and their managers will become automated. As a result, those who direct the enterprises will also become superfluous.

Pie in the sky, perhaps. But it’s an invisible hand no less utopian than the one Pearlstein and mainstream economists currently believe in.

Not so fast!

Posted: 18 August 2016 in Uncategorized

Everyone has read or heard the story: the labor market has rebounded and workers, finally, are “getting a little bigger piece of the pie” (according to President Obama, back in June).

And that’s the way it looked—until the Bureau of Labor Statistics revised its data. What was originally reported as a 4.2 percent increase in the first quarter of 2016 now appears to be a 0.4 percent decline (a difference of 4.6 percentage points, in the wrong direction).

What’s more, real hourly compensation for the second quarter (in the nonfarm business sector) is down another 1.1 percent.

So, already in 2016, the decline in real wages has eaten up more than half the gain of 2.8 percent reported in 2015 (and after a mere 1.1 percent gain in 2014).

And, since 2009, real hourly wages have increased only 4 percent.
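
To see where the “more than half” claim comes from, here is the arithmetic spelled out, simply adding the reported quarterly changes. This is rough, illustrative arithmetic, and the underlying BLS figures are subject to further revision, as the postscript below notes.

```python
# Rough check of the "more than half" claim above, simply adding the reported
# quarterly changes in real hourly compensation. Illustrative arithmetic only;
# the underlying BLS figures are revised periodically.
gain_2015 = 2.8        # percent gain reported for 2015
q1_2016 = -0.4         # revised change for the first quarter of 2016
q2_2016 = -1.1         # change for the second quarter of 2016

decline_2016_so_far = -(q1_2016 + q2_2016)   # = 1.5 percentage points
print(f"Decline so far in 2016: {decline_2016_so_far:.1f} points")
print(f"More than half of 2015's gain erased? {decline_2016_so_far > gain_2015 / 2}")
```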

Workers may be getting a little bigger piece of the economic pie since the official end of the Great Recession, but the emphasis should really be on “little.”

 

P.S. I’m not a conspiracy theorist by nature. And I don’t plan to start now. As far as I’m concerned, the revision in the real-wage data should not be understood as any kind of deliberate manipulation by the Bureau of Labor Statistics. But it does represent a cautionary tale about the precision of the numbers we use to understand what is going on in the U.S. economy—and about the willingness of some (like Paul Krugman) to dismiss workers’ anxiety about the state of the economy.