Posts Tagged ‘prices’

WSJ OPINION: THE COST OF DISASTER

Special mention

[Cartoon: This Modern World, 11 October 2017]

[Chart: FRED, wholesale price indexes for pharmaceuticals and for all commodities, 1981 to the present]

Back in June, Kim Hemphill, in her letter to the editor of the Washington Post, challenged the pharmaceutical industry's claim that it must charge high prices for lifesaving drugs in order to recover research and development costs.

The case detailed in the June 11 Business article “Max’s best hope costs $750,000” was yet another example of how the pharmaceutical industry continues to put profits above morals and humanity. . .

Research and development costs are a part of the business pharmaceutical companies are in and should have little, if any, bearing on the ultimate price of a drug. What they charge for these specialty drugs is profit-motivated price gouging, plain and simple.

The fact is, as is clear from the chart above, pharmaceutical prices (at the wholesale level) have risen since 1981 at a much faster rate than prices for all commodities—more than sevenfold, compared to roughly twofold.

Most people, like Ms. Hemphill, think this is a case of “profit-motivated price gouging” on the part of drug companies. But it’s a difficult charge to prove.

Until now.

A new study published in JAMA Internal Medicine directly challenges the industry’s argument that high drug prices reflect the sizable research and development outlays necessary to bring a drug to the U.S. market.

What the authors of the study show is that, in the case of 10 cancer drugs, the median revenue after approval was $1,658.4 million, while the median cost of developing a single cancer drug was only $648.0 million.

Moreover, given that total spending (including a 7 percent cost of capital) to develop these drugs was $9 billion and total revenue to date was $67 billion, the postapproval revenue was more than 7-fold higher than the R&D spending.
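
Those ratios are easy to check. Here is a quick back-of-the-envelope calculation in Python, using only the figures quoted above:

```python
# Back-of-the-envelope check of the JAMA Internal Medicine figures quoted above.
# Medians are in millions of dollars; totals are in millions as well.

median_revenue = 1658.4   # median postapproval revenue per cancer drug
median_cost = 648.0       # median cost of developing a single cancer drug

total_revenue = 67_000    # total revenue to date, all 10 drugs ($67 billion)
total_spending = 9_000    # total R&D spending, incl. 7 percent cost of capital ($9 billion)

print(f"median revenue / median cost: {median_revenue / median_cost:.1f}x")   # ~2.6x
print(f"total revenue / total R&D:    {total_revenue / total_spending:.1f}x") # ~7.4x
```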

[Figure: postapproval revenue compared with research and development spending for 10 cancer drugs, from the JAMA Internal Medicine study]

Thus, as is clear in the figure from the study, development costs are more than recouped in a short period—and some companies boast more than a 10-fold higher revenue than research and development spending.

So, Ms. Hemphill was right: the pharmaceutical industry continues to put profits above morals and humanity.

[Chart: FRED, average hourly earnings of production and nonsupervisory workers and the Consumer Price Index less food and energy]

The latest jobs report from the Bureau of Labor Statistics has the official unemployment rate declining by two-tenths of a percentage point, to 4.5 percent, in March.

And yet, as is clear from the chart above, workers’ wages (average hourly earnings of production and nonsupervisory workers) are barely keeping ahead of inflation (measured by the Consumer Price Index, less food and energy).

Workers are still waiting for their share of the current recovery.

Everyone knows Wall Street is just a racket, soaking up large portions of the surplus and stuffing the pockets of wealthy bankers. That’s especially true since the spectacular crash of 2007-08, when millions lost their jobs and homes.

But many people also harbor the illusion, based on the relentless campaign from mainstream economists and financial journalists, that perhaps Wall Street bankers, even if they’re not doing God’s work, are at least doing something—anything—that is useful for the wider society. Why else would they have been bailed out with taxpayer money and, then as now, be paid such a fortune?

The defense of Wall Street rests on three major arguments, and Lynn Stout (Distinguished Professor of Corporate and Business Law at Cornell Law School) convincingly gives the lie to all of them.

Argument #1: Wall Street helps companies raise capital

If we look at the numbers, it’s obvious that raising capital for companies is only a sideline for most banks, and a minor one at that. Corporations raise capital in the so-called “primary” markets where they sell newly-issued stocks and bonds to investors. However, the vast majority of bankers’ time and effort is devoted to (and most bank profits come from) dealing, trading, and advising investors in the so-called “secondary” market where investors buy and sell existing securities with each other. In 2009, for example, less than 10 percent of the securities industry’s profits came from underwriting new stocks and bonds; the majority came instead from trading commissions and trading profits (Table 1219). This figure reflects the imbalance between the primary issuing market (which is relatively small) and the secondary trading market (which is enormous). In 2010, corporations issued only $131 billion in new stock (Table 1202). That same year, the World Bank reports, more than $15 trillion in stocks were traded in the U.S. secondary market, more than the nation’s GDP. Yet secondary market trading is fundamentally a zero-sum game—if I make money by buying low and selling high, it’s money you lost by buying high and selling low.

Argument #2: Wall Street provides liquidity (i.e., the ability for investors to sell their investments relatively quickly)

The problem with this line of argument is that Wall Street is providing far more liquidity (at a hefty price—remember that half-trillion-dollar payroll) than investors really need. Most of the money invested in stocks, bonds, and other securities comes from individuals who are saving for retirement, either by investing directly or through pension and mutual funds. These long-term investors don’t really need much liquidity, and they certainly don’t need a market where 165 percent of shares are bought and sold every year. They could get by with much less trading—and in fact, they did get by, quite happily. In 1976, when the transactions costs associated with buying and selling securities were much higher, fewer than 20 percent of equity shares changed hands every year. Yet no one was complaining in 1976 about any supposed lack of liquidity. Today we have nearly 10 times more trading, without any apparent benefit for anyone (other than Wall Street bankers and traders) from all that “liquidity.”

Argument #3: Wall Street trading helps allocate society’s resources more efficiently (by ensuring securities are priced accurately)

This argument is based on the notion of “price discovery”–the idea that the promise of speculative profits motivates traders to do research that uncovers socially useful information. The classic example is a wheat futures trader who researches weather patterns. If the trader accurately predicts a drought, the trader buys wheat futures, driving up wheat prices, causing farmers to plant more wheat, helping alleviate the effects of the drought. Thus (the argument goes) the trader’s profits from speculating in wheat futures are just compensation for providing socially valuable “price discovery.” Once again, however, this cheerful banker “just-so story” turns out to be unsupported by any significant evidence. Let’s start with the questionable premise that the average trader earns profits from doing good research. The well-established fact that very few actively-managed mutual funds routinely outperform the market undermines the claim that most trading is driven by truly superior information.

But even more significantly, the fact that a trader with superior information can move prices in the “correct” direction does not necessarily mean that society will benefit. It’s all a question of timing. As famous economist Jack Hirshleifer pointed out many years ago, trading that makes prices more accurate when it’s too late to do anything about it is privately profitable but not socially beneficial. Most Wall Street trading in stocks, bonds, and derivatives moves information into prices only days–sometimes only microseconds–before it would arrive anyway. No real resources are reallocated in such a short time span.

So, if Wall Street doesn’t help raise capital, provide liquidity, or help allocate resources efficiently, what does it do that benefits society?

Doctors and nurses make patients healthier. Firefighters and EMTs save lives. Telecommunications companies and smartphone manufacturers permit people to communicate with each other at a distance. Automobile executives and airline pilots help people close that distance. Teachers and professors help students learn. Wall Street bankers help—mostly just themselves.

As I’ve been discussing over the course of the past week, the U.S. healthcare system is a nightmare, at least for workers and their families. It costs more and provides less than in other countries. Employees are being forced to pay higher and higher fees (in the form of premiums, deductibles, and other charges). And it relies on a private health-insurance industry, which is increasingly concentrated and profitable.

What about the other part of the system, the actual provision of health care? I’m thinking, in particular, of the pharmaceutical industry (which I focus on in this post) and hospitals (which I’ll take up in a future post).

According to a recent study by the Wall Street Journal, consumers in the United States nearly always pay more for branded drugs than their counterparts in England (39 of the drugs sampled were priced higher in the United States, 1 lower), Norway (37 higher, 3 lower), and Ontario, Canada (28 higher, 2 lower). Thus, for example, Lucentis (used to treat wet age-related macular degeneration and other conditions) costs $1,936 in the United States but only $894 in Norway, $1,159 in England, and $1,254 in Ontario. The same is true for many other drugs, from Abraxane (for treating cancer) to Yervoy (for treating skin cancer).

Part of the reason is that, in other countries, public healthcare systems have substantial negotiating power and are able to bargain with pharmaceutical companies for lower prices (or, in the case of Canada’s federal regulatory body, to set maximum prices). The U.S. market, however, “is highly fragmented, with bill payers ranging from employers to insurance companies to federal and state governments.” In particular, Medicare, the largest single U.S. payer for prescription drugs, is legally prohibited from negotiating drug prices.

Pharma

On the other side of the market, the U.S. pharmaceutical industry has become increasingly concentrated through a wave of numerous and increasingly large merger-and-acquisition deals. According to Capgemini Consulting, since 2010, approximately 200 pharmaceutical and biotech deals have taken place per year in the United States. 2014 saw several of the largest deals in the pharmaceutical industry to date, including the $66-billion purchase of Allergan by Actavis, Merck’s unloading of its consumer health unit to Bayer, GSK and Novartis’s multibillion-dollar asset swap, and Novartis’s sale of its animal health unit to Eli Lilly.

Although high-profile major acquisitions outweigh other deals in value, over 90 percent of deals were relatively small (less than $5 billion). Clearly, the motivation behind these smaller deals is different.

Failure of bigger pharmaceutical companies to consistently develop new drugs and pressure from shareholders to deliver returns have forced large pharmaceutical companies to look outside for innovative drugs. This has resulted in new drug approvals emerging as a major trigger for acquisitions.

[Chart: net profit margins by industry, with health technology first at 20.9 percent]

The fragmented, unregulated system of drug purchases in the United States, combined with growing concentration of the pharmaceutical industry, means that health technology—with a 20.9 percent net profit margin—is now the most profitable industry in the country.

High drug prices are one of the key factors behind rising U.S. healthcare costs, and one of the main reasons why American workers pay more and yet receive poorer healthcare than their counterparts in other rich countries.

Addendum

As if to confirm my analysis of the role of the pharmaceutical industry in creating a nightmarish U.S. healthcare system, we now have the examples of the EpiPen and Pfizer.

As Aaron E. Carroll explains, the story of EpiPens is not just about how expensive they’ve become; it also reveals “so much of what’s wrong with our health care system.”

Epinephrine isn’t an elective medication. It doesn’t last, so people need to purchase the drug repeatedly. There’s little competition, but there are huge hurdles to enter the market, so a company can raise the price again and again with little pushback. The government encourages the product’s use, but makes no effort to control its cost. Insurance coverage shields some from the expense, allowing higher prices, but leaves those most at-risk most exposed to extreme out-of-pocket outlays. The poor are the most likely to consider going without because they can’t afford it.

EpiPens are a perfect example of a health care nightmare. They’re also just a typical example of the dysfunction of the American health care system.

And then we have Pfizer’s purchase of part of AstraZeneca’s antibiotics business, which doesn’t involve the development of any new drugs: for $550 million upfront, an unconditional $175 million in January 2019, and possibly a further $850 million plus royalties, Pfizer acquires the right to sell three approved antibiotics and two drugs in clinical trials in most markets outside the United States and Canada, plus an additional drug (Merrem) in North America.

From the very beginning, one of the central claims on behalf of capitalism has been that it leads to increases in productivity—and, as a result, an increase in the wealth of nations. The idea is that, the more national wealth increases (the more commodities are produced per person-hour worked), the higher the living standards of ordinary people will be (i.e., real wages will increase).* We find that story in pretty much every text of mainstream economics, from Adam Smith to Deirdre McCloskey.

That’s why Karl Marx spent so much time (hundreds of pages, in fact) discussing productivity (along with machinery, mechanization, technical change, and so forth) in his critique of political economy. So, John Cassidy gets it wrong.

Marx (not to mention other nineteenth-century critics of capitalism) never denied that there was a connection between increases in productivity and a rise in workers’ wages. That would be silly, both theoretically and empirically. All he ever did was deny that there’s an automatic or necessary relationship between them—and argue, perhaps more important, that rising real wages don’t mean workers aren’t being increasingly exploited.

[Chart: productivity and workers’ real wages, rising in tandem until about 1973 and diverging thereafter]

The first point (one even Cassidy, in a way, concedes) is easy to show: for a time (until 1973 or so, in the United States), workers’ real wages increased at roughly the same rate as productivity. Then (from 1973 onward), productivity continued to grow but workers’ wages stagnated.

One key question, for the pre-1973 period, is why productivity and workers’ real wages increased in tandem. Cassidy assumes that wages grew because of the increases in productivity. Nothing could be further from the truth. The explanation for the increases in productivity (having to do with the growth of manufacturing, capitalist competition, the role of U.S. corporations in the world economy, and so on) is separate from the change in wages (based on fast economic growth, unionization, a shortage of labor power, and so on). There’s simply no automatic relationship between increases in productivity and increases in real wages, which has been confirmed by their divergence after 1973.

The second point is, in my view, even more important. It’s possible for workers’ wages to increase (at or even above the rate of growth of productivity) and for capitalist exploitation to also be rising.

Let me explain. In Marxian theory, the rate of exploitation (s/v) is the ratio of surplus-value (s) to the value of labor power (v). The value of labor power is, in turn, the product of the exchange-value per unit use-value (e), or price, of the commodities in the wage bundle and the quantity of those commodities (q), the real wage. So, we have v = e·q and, in terms of rates of change (to a first approximation), Δv/v ≈ Δe/e + Δq/q. Mathematically, exploitation can increase (Marx referred to this as relative surplus-value) even if real wages are going up (Δq/q is positive), as long as the value of labor power is decreasing (Δv/v is negative), which requires that the change in the price of wage commodities be negative (Δe/e < 0) and greater in absolute value than the change in real wages (|Δe/e| > |Δq/q|).
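
For readers who prefer it compact, here is the same decomposition as a short LaTeX block (the rate-of-change formula is, strictly speaking, a first-order approximation for small changes):

```latex
% Value of labor power: unit price of wage goods (e) times the real wage bundle (q)
\[ v = e\,q \]
% Rates of change, to first order in small changes:
\[ \frac{\Delta v}{v} \;\approx\; \frac{\Delta e}{e} + \frac{\Delta q}{q} \]
% Relative surplus-value: exploitation (s/v) rises when v falls, which is
% compatible with rising real wages provided wage-good prices fall faster:
\[ \frac{\Delta v}{v} < 0
   \quad\Longleftrightarrow\quad
   \left|\frac{\Delta e}{e}\right| > \frac{\Delta q}{q}
   \qquad \text{(with } \Delta e/e < 0,\ \Delta q/q > 0\text{).} \]
```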

For example, in terms of numbers: if real wages increase by 10 percent (workers are buying more things) but the prices of the items in the wage bundle (food, clothing, shelter) decrease by 20 percent, then the value of labor power (what capitalists have to pay to get access to the commodity labor power) will decrease by roughly 10 percent (exactly, by 12 percent, since 0.8 × 1.1 = 0.88). Voilà! Higher real wages can be (and, throughout much of the history of U.S. capitalism, have been) accompanied by rising exploitation.
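
A minimal numerical sketch of that example (the 10 and 20 percent figures are the stylized numbers from the paragraph above, not data):

```python
# Numerical sketch: real wages rise 10 percent while wage-good prices fall
# 20 percent (stylized numbers from the text).

e0 = q0 = 1.0         # normalize the initial price (e) and real wage bundle (q)
de, dq = -0.20, 0.10  # proportional changes in e and q

v0 = e0 * q0                            # initial value of labor power, v = e*q
v1 = (e0 * (1 + de)) * (q0 * (1 + dq))  # value of labor power after the changes

exact = v1 / v0 - 1   # exact change: (0.8)(1.1) - 1 = -0.12
approx = de + dq      # first-order approximation: -0.20 + 0.10 = -0.10

print(f"value of labor power falls by {-exact:.0%} (≈ {-approx:.0%} to first order)")
```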

And that’s precisely one of the effects of increasing productivity: it lowers the exchange-value per unit use-value of wage commodities.** Less labor is embodied in each unit of bread, shirts, and housing. The fact that workers are able to purchase more of those commodities (say, at the same rate as productivity is increasing) doesn’t mean exploitation is not also increasing.

That’s even more the case when real wages are stagnant (Δq/q is equal to zero). Then the decline in the price of wage goods (again, 20 percent) translates directly into an increase in exploitation (via a 20 percent decrease in v).

But what if the rate of growth of domestic productivity begins to decline (as it did after 1973, and even more so in recent years)? Then the domestic contribution to the decline in the price of wage commodities would fall (say, to 10 percent). But, at the same time, if jobs are offshored and cheaper wage goods are imported from abroad (think Walmart), that also lowers the price of wage goods (say, by another 10 percent). So, even with declining domestic productivity growth, the combination of domestic production and imported goods can lead to a decline in the price of wage goods (for a total, as before, of roughly 20 percent) that, with constant real wages, decreases the value of labor power (by roughly 20 percent) and increases the rate of exploitation.
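
And the same stylized arithmetic for the offshoring case (again, the 10 percent contributions are illustrative numbers from the paragraph above):

```python
# Stylized decomposition of the fall in wage-good prices when slower domestic
# productivity growth is combined with cheaper imports (illustrative numbers).

domestic = -0.10   # price decline from (slower) domestic productivity growth
imports = -0.10    # price decline from cheaper imported wage goods (offshoring)

de = (1 + domestic) * (1 + imports) - 1   # combined: -19%, roughly the 20% above
dq = 0.0                                  # real wages held constant

dv = (1 + de) * (1 + dq) - 1              # value of labor power falls with prices
print(f"wage-good prices: {de:+.0%}; value of labor power: {dv:+.0%}")
```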

So, while Cassidy and many others are worried about a slowdown in productivity growth (linking it to workers’ wages and living standards), workers know that increases in productivity don’t automatically lead to increases in real wages. And even if real wages do rise, it’s still likely they’ll be more exploited than before.

Their employers, not they, will be the ones to benefit.

 

*It is merely presumed that the standard of living of those at the top, the capitalists and other members of the 1 percent, would be just fine.

**Another, separate issue is why productivity itself might increase. The Marxian argument is that, during the course of competing over “super-profits”—that is, distributions of the surplus-value among and between capitalist enterprises—capitalists will engage in technical change, which in turn leads to increases in productivity and a decline in the value of the commodities they produce. An interesting question, then, is why productivity in the United States and other advanced countries has slowed down. One reason may be a decline in competitive pressures among enterprises that produce goods and services.

Back in 2014, in a post on inflation, I revealed my suspicion that

the real rate of inflation for consumer goods is higher than the official rate of 2.2 percent (over the past 12 months), thereby understating the extent to which working people are facing rising prices for the commodities they need to purchase in order to maintain themselves and their families.

Well, as it turns out, I was right. According to recent research by Xavier Jaravel, the rate of inflation faced by high-income households is lower than that faced by low-income households.

Why’s that? Because, with rising inequality, firms in the retail sector introduced more products catering to high-income consumers, and competitive pressure in that segment of the market drove down the prices of those products.

And why does it matter? Well, for one, any overall measure of inflation (like the Consumer Price Index) tends to understate the rate of inflation facing low-income consumers. That’s the point I made back in 2014.

The other implication is that, because households with different amounts of income face different prices for the goods they consume, economic inequality is actually worse than we thought.

[Figure: inflation and inequality]

So, here we have another vicious cycle: nominal economic inequality leads to different rates of product innovation (and thus different rates of inflation for high- and low-income consumers), which in turn worsens the degree of real inequality.

That vicious cycle of escalating inequality is, unfortunately, part of the normal workings of our current economic institutions.