Posts Tagged ‘costs’


Last Wednesday, as part of my Unhealthy Healthcare series, I showed that the recent slowdown in U.S. healthcare costs has been invisible to American workers, because they have been forced to pay much higher premiums and deductibles in order to obtain access to healthcare for themselves and their families.

That conclusion has been confirmed by a recent Wall Street Journal article reporting that workers are increasingly feeling the pain of paying for their healthcare.

Middle-class households are finding more of their health-care costs are coming out of their own pockets.

David Cutler, a Harvard health-care economist, said this may be “a story of three Americas.” One group, the rich, can afford health care easily. The poor can access public assistance. But for lower middle- to middle-income Americans, “the income struggles and the health-care struggles together are a really potent issue,” he said.

A June Brookings Institution study found middle-income households now devote the largest share of their spending to health care, 8.9%, a rise of more than three percentage points from 1984 to 2014.

By 2014, middle-income households’ health-care spending was 25% higher than what they were spending before the recession that began in 2007, even as spending fell for other “basic needs” such as food, housing, clothing and transportation, according to an analysis for The Wall Street Journal by Brookings senior fellow Diane Schanzenbach. These households cut back sharply on more discretionary categories like dining out and clothing.

[FRED graph: real median household income, 2007-2014]

While middle-income households were spending 25 percent more on health care, their real incomes actually fell 6.5 percent between 2007 and 2014, from $57,357 to $53,657.
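For concreteness, here is a minimal sketch of the arithmetic behind that 6.5 percent (in Python; the dollar figures are the ones cited above, the variable names are mine):

```python
# Percent change in real income for middle-income households, using the
# 2007 and 2014 figures cited above (variable names are illustrative).
income_2007 = 57_357
income_2014 = 53_657

pct_change = (income_2014 - income_2007) / income_2007 * 100
print(f"Real income change, 2007-2014: {pct_change:.1f}%")  # -6.5%
```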

Clearly, American workers are increasingly being squeezed by their employers at both ends: while they’re at work (since they’re working less and less time for themselves and more for their employers) and while they’re away from work (since they’ve been forced to assume a larger and larger share of the costs of their healthcare).


As I’ve been discussing over the course of the past week, the U.S. healthcare system is a nightmare, at least for workers and their families. It costs more and provides less than in other countries. Employees are being forced to pay higher and higher fees (in the form of premiums, deductibles, and other charges). And it relies on a private health-insurance industry, which is increasingly concentrated and profitable.

What about the other part of the system, the actual provision of health care? I’m thinking, in particular, of the pharmaceutical industry (which I focus on in this post) and hospitals (which I’ll take up in a future post).

According to a recent study by the Wall Street Journal, consumers in the United States nearly always pay more for branded drugs than their counterparts in England (39 drugs priced higher and 1 lower), Norway (37 higher and 3 lower), and Ontario, Canada (28 higher and 2 lower). Thus, for example, Lucentis (which is used to treat wet age-related macular degeneration and other conditions) costs $1,936 in the United States but only $894 in Norway, $1,159 in England, and $1,254 in Ontario. The same is true for many other drugs, from Abraxane (for treating cancer) to Yervoy (for treating skin cancer).
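To put those gaps in relative terms, here is a minimal sketch using only the Lucentis prices quoted above (the price ratios are my own illustration, not a calculation from the WSJ study):

```python
# Ratio of the U.S. price to each foreign price for Lucentis,
# using the prices quoted above (in U.S. dollars).
us_price = 1936
foreign_prices = {"Norway": 894, "England": 1159, "Ontario": 1254}

for place, price in foreign_prices.items():
    print(f"U.S. price is {us_price / price:.2f}x the {place} price")
# Norway: 2.17x, England: 1.67x, Ontario: 1.54x
```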

Part of the reason is that, in other countries, public healthcare systems have substantial negotiating power and are able to bargain with pharmaceutical companies for lower prices (or, in the case of Canada’s federal regulatory body, to set maximum prices). The U.S. market, however, “is highly fragmented, with bill payers ranging from employers to insurance companies to federal and state governments.” In particular, Medicare, the largest single U.S. payer for prescription drugs, is legally prohibited from negotiating drug prices.


On the other side of the market, the U.S. pharmaceutical industry has become increasingly concentrated through a wave of numerous and increasingly large merger-and-acquisition deals. According to Capgemini Consulting, since 2010, approximately 200 pharmaceutical and biotech deals have taken place per year in the United States. 2014 saw several of the largest deals in the pharmaceutical industry to date, including the $66-billion purchase of Allergan by Actavis, Merck unloading its consumer health unit to Bayer, GSK and Novartis’s multibillion-dollar asset swap, as well as Novartis’s sale of its animal health unit to Eli Lilly.

Although high-profile major acquisitions outweigh other deals by value, over 90 percent of deals have been relatively small (less than $5 billion). Clearly, the motivation in these smaller deals is different.

The failure of bigger pharmaceutical companies to consistently develop new drugs, together with pressure from shareholders to deliver returns, has forced large pharmaceutical companies to look outside for innovative drugs. This has resulted in new drug approvals emerging as a major trigger for acquisitions.

[Chart: most profitable U.S. industries by net profit margin]

The fragmented, unregulated system of drug purchases in the United States, combined with growing concentration of the pharmaceutical industry, means that health technology—with a 20.9 percent net profit margin—is now the most profitable industry in the country.

High drug prices are one of the key factors behind rising U.S. healthcare costs, and one of the main reasons why American workers pay more and yet receive poorer healthcare than in other rich countries.

Addendum

As if to confirm my analysis of the role of the pharmaceutical industry in creating a nightmarish U.S. healthcare system, we now have the examples of the EpiPen and Pfizer.

As Aaron E. Carroll explains, the story of EpiPens is not just about how expensive they’ve become; it also reveals “so much of what’s wrong with our health care system.”

Epinephrine isn’t an elective medication. It doesn’t last, so people need to purchase the drug repeatedly. There’s little competition, but there are huge hurdles to enter the market, so a company can raise the price again and again with little pushback. The government encourages the product’s use, but makes no effort to control its cost. Insurance coverage shields some from the expense, allowing higher prices, but leaves those most at-risk most exposed to extreme out-of-pocket outlays. The poor are the most likely to consider going without because they can’t afford it.

EpiPens are a perfect example of a health care nightmare. They’re also just a typical example of the dysfunction of the American health care system.

And then we have Pfizer’s purchase of part of AstraZeneca’s antibiotics business, which doesn’t involve the development of any new drugs. For $550 million upfront, an unconditional $175 million in January 2019, and possibly a further $850 million plus royalties, Pfizer will have the right to sell three approved antibiotics and two drugs in clinical trials in most markets outside the U.S. and Canada, plus an additional drug (Merrem) in North America.

[Kaiser Family Foundation chart]

On Tuesday, I began a series on the unhealthy state of the U.S. healthcare system—starting with the fact that the United States spends far more on health than any other country, yet the life expectancy of the American population is actually shorter than in other countries that spend far less.

Today, I want to look at what U.S. workers are forced to pay to get access to the healthcare system.

According to the Kaiser Family Foundation, about half of the non-elderly population—147 million people in total—are covered by employer-sponsored insurance programs.* The average annual single coverage premium in 2015 was $6,251 and the average family coverage premium was $17,545. Each rose 4 percent over the 2014 average premiums. During the same period, workers’ wages increased only 1.9 percent while prices declined by 0.2 percent.

But the gap is even larger when looked at over the long run. Between 1999 and 2015, workers’ contributions to premiums increased by a whopping 221 percent, even more than the growth in health insurance premiums (203 percent), and far outpacing both inflation (42 percent) and workers’ earnings (56 percent).
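Cumulative figures like these are hard to compare at a glance. Here is a minimal sketch converting them into average annual rates, assuming steady compounding over the sixteen years (the cumulative percentages are the Kaiser Family Foundation’s; the compounding assumption is mine):

```python
# Convert cumulative 1999-2015 growth into average annual growth rates,
# assuming steady compounding over the 16-year period.
years = 2015 - 1999  # 16

cumulative_growth = {
    "worker premium contributions": 2.21,  # +221 percent
    "health insurance premiums": 2.03,     # +203 percent
    "workers' earnings": 0.56,             # +56 percent
    "inflation": 0.42,                     # +42 percent
}

for series, growth in cumulative_growth.items():
    annual = (1 + growth) ** (1 / years) - 1
    print(f"{series}: {annual:.1%} per year")
# Premium contributions compound at roughly 7.6% a year, against
# under 3% a year for earnings and inflation.
```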

[Kaiser Family Foundation chart]

Most covered workers face additional out-of-pocket costs when they use health care services. Eighty-one percent of covered workers have a general annual deductible for single coverage that must be met before most services are paid for by the plan.** Since 2010, there has also been a sharp increase in both the percentage of workers on health plans with deductibles—which require members to pay a certain amount toward their care before the plan starts paying—and the size of those deductibles. The result has been a 67-percent rise in deductibles (for single coverage) since 2010, far outpacing not only the 24-percent growth in premiums, but also the 10-percent growth in workers’ wages and 9-percent rise in inflation.
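Deflating those since-2010 figures by the 9-percent rise in prices makes the squeeze explicit. A rough sketch, applying my own simplified deflation to the KFF numbers:

```python
# Real (inflation-adjusted) growth since 2010, dividing each nominal
# cumulative growth rate by cumulative inflation over the same period.
inflation = 0.09  # cumulative rise in prices since 2010

nominal_growth = {"deductibles": 0.67, "premiums": 0.24, "wages": 0.10}

for series, growth in nominal_growth.items():
    real = (1 + growth) / (1 + inflation) - 1
    print(f"{series}: {real:.1%} real growth since 2010")
# deductibles: ~53% real growth; wages: ~0.9%, essentially flat
```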

In recent years, the increase in U.S. health costs has in fact slowed down. But the slowdown has been invisible to American workers, who have been forced to pay much higher premiums and deductibles in order to get access to healthcare for themselves and their families.

 

*Fifty-seven percent of firms offer health benefits to at least some of their employees, covering about 63 percent of workers at those firms.

**Even workers without a general annual deductible often face other types of cost sharing when they use services, such as copayments or coinsurance for office visits and hospitalizations, and when they purchase prescription drugs.


[ht: ja]

If there’s one area that isn’t contributing to higher college costs and historic levels of student debt, it’s faculty salaries—especially the pay received by adjunct professors.

According to Caroline Fredrickson,

In 1969, almost 80 percent of college faculty members were tenured or tenure track. Today, the numbers have essentially flipped, with two-thirds of faculty now non-tenure and half of those working only part-time, often with several different teaching jobs. . .

To say that these are low-wage jobs is an understatement. Based on data from the American Community Survey, 31 percent of part-time faculty are living near or below the federal poverty line. And, according to the UC Berkeley Labor Center, one in four families of part-time faculty are enrolled in at least one public assistance program like food stamps and Medicaid or qualify for the Earned Income Tax Credit. Known as the “Homeless Prof,” Mary-Faith Cerasoli teaches romance languages and prepares her courses in friends’ apartments when she can crash on a couch, or in her car when the friends can’t take her in. When a student asked to meet with her during office hours, she responded, “Sure, it’s the Pontiac Vibe parked on Stewart Avenue.”

[Demos study, Figure 6]

According to a new study by Demos, the major cause of the rise in college tuition costs is not, as is often believed, administrative bloat or construction binges, but the decline in state funding for higher education.

In the past, state funding for education often rose and fell along with the economy: since higher education funding is viewed as “discretionary” spending, it is often a target for cuts when states are forced to close recessionary holes in their budgets. However, in the past decade, state funding for higher education has diverged from that trend. Six years after the great recession, state higher education funding per student remains 27 percent below its pre-recession level. Unfortunately, declining state support for higher education means that many students today have no choice but to take on significant debt to finance their educations, the negative effects of which are increasingly evident in young people’s lives.

The fact is, public higher education in the United States no longer exists. Because more than half of core educational expenses at “public” 4-year universities are now funded through tuition, a private source of revenue, they have effectively become subsidized private institutions.

Addendum

[Demos study, Figure 2: employees per 1,000 students by category, 1991-2011]

The other interesting piece of information in the Demos study is the enormous increase in part-time faculty. As Figure 2 shows, the number of employees per thousand students changed little between 1991 and 2011. But the composition of universities’ staff has changed dramatically. At both types of institutions, the relative number of full-time faculty has remained approximately constant and the number of executives and administrators has actually slightly decreased relative to the size of the student body. However, both types of institutions are employing substantially more part-time faculty (as well as professional staff—admissions and human resources staff, IT workers, athletic staff, and health workers). At the same time, the relative number of non-professional staff—workers providing clerical, technical, skilled craft, or maintenance services—shrank dramatically.



While a special compensation committee of the University of Kentucky Board of Trustees met Tuesday to discuss whether to increase President Eli Capilouto’s salary, currently $615,825, the Lexington Herald-Leader discovered that the UK president’s pay has increased an average of 9.7 percent a year over the last decade, eclipsing the average annual tuition increase of 7.3 percent and far outpacing the average faculty and staff pay increase of 2.1 percent.
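Compounded over the decade, those average annual raises diverge dramatically. A rough sketch (the rates are the Herald-Leader’s; the ten-year compounding is my assumption):

```python
# Multiply out a decade of the average annual increases reported above.
years = 10
annual_increase = {
    "president's pay": 0.097,
    "tuition": 0.073,
    "faculty and staff pay": 0.021,
}

for series, rate in annual_increase.items():
    multiple = (1 + rate) ** years
    print(f"{series}: {multiple:.2f}x over {years} years")
# president's pay roughly 2.5x, tuition 2.0x, faculty pay only 1.2x
```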

In 2012, analysts at the financial management firm Bain & Company wrote in a white paper for its clients about administrative spending in higher education,

Boards of trustees and presidents need to put their collective foot down on the growth of support and administrative costs. Those costs have grown faster than the cost of instruction across most campuses. In no other industry would overhead costs be allowed to grow at this rate—executives would lose their jobs.

As colleges and universities look to areas where they can make cuts and achieve efficiencies, they should start farthest from the core of teaching and research. Cut from the outside in, and build from the inside out.

The problem, of course, is that the presidents of colleges and universities are the ones benefiting from the increase in administrative spending.


If “all warfare is based on deception” (Sun Tzu), the biggest deception concerns the costs of warfare.

That’s why the work done by the folks at costsofwar.org is so important. They have attempted to estimate the human, economic, and sociopolitical costs of the ongoing wars in Iraq, Afghanistan, and Pakistan. Here are some of their findings:

  • While we know how many US soldiers have died in the wars (just over 6,000), what is startling is what we don’t know about the levels of injury and illness in those who have returned from the wars. New disability claims continue to pour into the VA, with 550,000 just through last fall. Many deaths and injuries among US contractors have not been identified.
  • At least 138,000 civilians have died and more will die in Afghanistan, Iraq, and Pakistan as a result of the fighting at the hands of all parties to the conflict.
  • The armed conflict in Pakistan, which the U.S. helps the Pakistani military fight by funding, equipping and training them, has taken as many lives as the conflict in neighboring Afghanistan.
  • Putting together the conservative numbers of war dead, in uniform and out, brings the total to 236,000.
  • Indirect deaths from the wars, including those related to malnutrition, damaged health infrastructure, and environmental degradation, may far outnumber deaths from combat. While these deaths are difficult to count due to factors such as lack of comparable baseline mortality figures, a 2008 survey by The Geneva Declaration Secretariat estimates that assuming a ratio of four indirect deaths to one direct death in contemporary conflicts would not be unreasonable. [A quick worked version of this ratio appears after the list.]
  • Millions of people have been displaced indefinitely and are living in grossly inadequate conditions. The current number of war refugees and displaced persons — 7,800,000 — is equivalent to all of the people of Connecticut and Kentucky fleeing their homes.
  • The wars have been accompanied by erosions in civil liberties at home and human rights violations abroad.
  • The human and economic costs of these wars will continue for decades, some costs not peaking until mid-century. Many of the wars’ costs are invisible to Americans, buried in a variety of budgets, and so have not been counted or assessed. For example, while most people think the Pentagon war appropriations are equivalent to the wars’ budgetary costs, the true numbers are twice that, and the full economic cost of the wars much larger yet. Conservatively estimated, the war bills already paid and obligated to be paid are $3.2 trillion in constant dollars. A more reasonable estimate puts the number at nearly $4 trillion.
  • As with former US wars, the costs of paying for veterans’ care into the future will be a sizable portion of the full costs of the war.
  • The ripple effects on the U.S. economy have also been significant, including job loss and interest rate increases, and those effects have been underappreciated.
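As flagged in the list, here is a quick worked version of the indirect-death arithmetic (the 4:1 ratio is the Geneva Declaration Secretariat’s assumption and the 236,000 direct deaths are the project’s conservative count; the multiplication is mine and is only as good as that ratio):

```python
# Apply the assumed ratio of four indirect deaths per direct death
# to the conservative direct-death count cited above.
direct_deaths = 236_000
indirect_per_direct = 4

indirect_deaths = direct_deaths * indirect_per_direct
total_deaths = direct_deaths + indirect_deaths
print(f"Implied indirect deaths: {indirect_deaths:,}")  # 944,000
print(f"Implied total war deaths: {total_deaths:,}")    # 1,180,000
```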

The folks at the Costs of War project also admit that “there are many costs of these wars that we have not yet been able to quantify and assess.” Unfortunately, they’ve received absolutely no help from mainstream economists who, through their lack of attention to the costs of war, perpetuate the deception that warfare is cheap.