Posts Tagged ‘models’

Right now, the United States is mired in an economic depression, the Pandemic Depression, not dissimilar to what happened in the 1930s and again after the crash of 2007-08.

Real (inflation-adjusted) gross domestic product contracted by an annual rate of 31.7 percent in the second quarter of 2020 (according to the Bureau of Economic Analysis) and at least 27 million American workers are currently unemployed (counting workers continuing to receive some kind of unemployment benefits, according to my own calculations).* By all accounts—from both macroeconomic data and anecdotes reported in the media—the current situation is an economic and social disaster equivalent to what the United States went through during the first and second Great Depressions.

The question is, does mainstream macroeconomics have anything to offer in terms of insights about the causes of the current crises or what should be done to solve them?

Many readers are, I’m sure, skeptical, given the abysmal track record of mainstream macroeconomic thinking in the United States. Going back just a bit more than a decade, to the Second Great Depression, it’s clear that mainstream macroeconomists failed on all counts: they didn’t predict the crash; they didn’t even include the possibility of such a crash within their basic theory or models; and they certainly didn’t know what to do once the crash occurred.

Can they do any better with the current depression?

The example I want to use was recently posted by Harvard’s Greg Mankiw, the author of the best-selling macroeconomics textbook on the market. I know it’s not the most sophisticated (or, if you prefer, technical or detailed) discussion out there but it does matter: next year, thousands upon thousands of students will receive their basic training in mainstream macroeconomic theory and its application to the Pandemic Depression from Mankiw’s text.

It should come as no surprise that Mankiw uses the macroeconomic model—of aggregate demand and supply—he has so laboriously built up over the course of many chapters to examine what he calls “the economic downturn of 2020.” His basic argument is that, first, aggregate demand declined (shifting to the left, from AD1 to AD2) due to a decline in the velocity of money (one of the exogenous variables that, in mainstream models, determines aggregate demand), and, second, the long-run aggregate supply curve declined (shifting left, from LRAS1 to LRAS2), while the short-run aggregate supply curve (SRAS) stayed the same. The result is a decline in output (the left-facing arrow at the bottom of the diagram).
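The mechanics are easy to reproduce with a back-of-the-envelope calculation. In Mankiw’s framework, aggregate demand comes from the quantity equation M·V = P·Y, so along a horizontal short-run aggregate supply curve (a fixed price level) a fall in velocity translates one-for-one into a fall in output. A minimal sketch, with all numbers my own illustrative assumptions rather than Mankiw’s:

```python
# Back-of-the-envelope version of the AD shift (illustrative numbers only).
# Aggregate demand comes from the quantity equation M*V = P*Y, so along a
# horizontal SRAS (fixed P), output is Y = M*V / P.
M = 1000.0          # money supply (assumed)
V1, V2 = 5.0, 4.0   # velocity before and after the pandemic shock (assumed)
P = 2.0             # price level, fixed along the horizontal SRAS (assumed)

Y1 = M * V1 / P     # output at point A
Y2 = M * V2 / P     # output at point B, after the leftward AD shift
print(Y1, Y2)       # a 20 percent fall in velocity means a 20 percent fall in output
```

Nothing in this arithmetic requires the long-run aggregate supply curve to move at all, which is exactly the point at issue below.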

This is all pretty straightforward stuff. Except: Mankiw wants to argue that it’s the “natural level of output” as represented by the long-run aggregate supply curve, not the perfectly elastic (or horizontal) short-run aggregate supply curve, that shifts to the left. Huh?

His only explanation is that

When a pandemic strikes and many businesses are temporarily closed, aggregate demand falls because people are staying at home rather than spending at those businesses. Because those businesses cannot produce goods and services, the economy’s potential output, as reflected in the LRAS curve, falls as well. The economy moves from point A to point B.

The problem is, there’s nothing in the way Mankiw has derived the long-run aggregate supply curve—from given resources (land, labor, and capital) and technology—that has changed. Instead, the shutdown of many businesses merely means that there’s enormous excess capacity in the economy. The “natural rate of output”—the level of output corresponding to the “natural level of unemployment”—remains as it was.

But Mankiw is trapped by his own model. The benefit of analyzing the current depression in terms of a shift in the long-run aggregate supply curve is that, as soon as the shutdown is lifted, the supply curve shifts back to the right and the economy moves back to its old long-run equilibrium. Problem solved!

And if the long-run aggregate supply curve doesn’t shift back to the right? Well, then, U.S. capitalism has in fact destroyed its resources—especially labor power—and the economy doesn’t recover, at least anytime soon.

Moreover, if he’d shifted the short-run aggregate supply curve instead (up in the diagram), well, then we’d be in the land of inflation—with the price level rising—an even more severe decline in economic activity (with output below that at point B), and no return to long-run equilibrium. But prices are not, in general, rising, which is why he uses the horizontal short-run aggregate supply curve in the first place (to reflect fixed prices, the result of monopoly enterprises).

Not only is Mankiw trapped by the logic of his own model. His analysis—both the model and the accompanying text—leaves out much of what is interesting and important about the Pandemic Depression.

We’ve seen, for example, that U.S. stock markets, after an initial downturn, have soared to new record highs, even as national output declines and unemployment has reached numbers of workers not seen since the Great Depression of the 1930s. That doesn’t even warrant a mention in Mankiw’s analysis—which involves a discussion of assistance to workers and small businesses but nothing about the trillions of dollars available to the Treasury and Federal Reserve to bail out large corporations, keep credit flowing, and boost equity markets.

But there’s an even larger problem in Mankiw’s basic model: all downturns, whether recessions or depressions, are the result of “accidents.”

Some surprise event shifts aggregate supply or aggregate demand, reducing production and employment. Policymakers are eager to return the economy to normal levels of production and employment as quickly as possible.

And the Pandemic Depression? Well, according to Mankiw, it was “by design.” But the distinction is meaningless: in all cases, the downturn occurs because of something outside the model—by some kind of “shock.”

So, capitalism itself is absolved. In Mankiw’s model, and in mainstream macroeconomics more generally, there’s nothing in capitalism itself—how profit rates behave, what decisions capitalists make, the fragility of the financial sector, obscene levels of inequality, and so on—that causes the economy to collapse.

If we step outside the confines of Mankiw’s model, then we can begin to see how U.S. capitalism, while it did not create the novel coronavirus, certainly produced and exacerbated the destructive effects of the pandemic on the American economy. For example, after decades of neglect of the public healthcare system and attempts to shore up the private provision of healthcare in the United States, the country was ill-prepared to diagnose and contain the pandemic. Even more, it worsened the already-grotesque inequalities of healthcare—as well as incomes, wealth, and household finances—it had originally created.

That same economic system also left in the hands of private employers—not the government or workers themselves—the decisions of whether to keep workers employed or, as happened across the country, to furlough or lay off tens of millions of their employees. And, to add to the misery: many of the workers who were supposed to be on temporary layoffs are now finding they’ve lost their jobs permanently and are spending more and more time attempting to find new jobs.

None of those pre-existing economic conditions figures in Mankiw’s analysis. They can’t, because they don’t exist within mainstream macroeconomics, which has been studiously constructed precisely to provide a hydraulic model of macroeconomic equilibrium—starting with full employment and price stability, one or another external “shock” that moves the economy away from there, and then automatic mechanisms to return the economy to its original position—on the basis of aggregate demand and aggregate supply.

And that’s how we get Mankiw’s excuse for the Pandemic Depression:

given the circumstances, a large economic downturn was arguably the best outcome that could be achieved.

———

*Millions more workers are either unemployed but not receiving benefits or involuntarily underemployed, working part-time (often with cuts in pay and benefits) when they prefer to be working full-time.

labor shares

When I first began studying economics, the conventional wisdom was that “factor shares”—the shares of national income paid to labor and capital—were relatively constant.

So, there really was no need to worry about the problem of inequality. Poverty, maybe, but not the gap between wages and profits.

Now, of course, all of that has changed. Not only is there increasing recognition that the labor share does change; it has, in fact, been declining for more than four decades.

Even Stephen Cecchetti and Kim Schoenholtz, the authors of the textbook Money, Banking and Financial Markets, have acknowledged that

For at least the past 15 years, and possibly for several decades, labor’s share of national income has been declining and capital’s share has been rising in most advanced and many emerging economies.

Thus, for example, the labor share of national income in the United States has fallen by about 12 percent from 1970 to 2014 (as indicated by the index scale on the left side of the chart above).

But, as it turns out, that’s only part of the story. The share of national income going to workers has declined by even more than that.

There are two main reasons why the “labor share” doesn’t give an accurate picture of the “workers’ share” of national income. First, as Michael D. Giandrea and Shawn A. Sprague explain, the labor share (as calculated by the Bureau of Labor Statistics) includes both employee compensation and the labor compensation of proprietors (and thus a portion, minus the capital share, of the income going to proprietors). Second, the labor share does not account for inequality between the different groups who receive what is officially measured as labor compensation:

the compensation of a highly paid CEO and a low-wage worker would both be included in the labor share.

So, in order to get an accurate picture of workers’ share of national income, we need to turn to other data.

What I’ve done in the chart above is measure (on the right side of the chart) the shares of income going to the bottom 90 percent and the bottom 50 percent of Americans. And, not surprisingly, the declines are even more dramatic: 20 percent for the bottom 90 percent (falling from 66 percent of total factor income in 1970 to 53 percent in 2014) and even more, 45.8 percent, for the bottom 50 percent (from 19 to 10.3 percent between 1970 and 2014). Those are the shares actual workers—not proprietors or CEOs—take home.
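Those percentage declines are straightforward to verify from the underlying shares. The calculation below uses exactly the numbers cited above:

```python
def pct_decline(start, end):
    """Percentage decline of an income share between two dates."""
    return (start - end) / start * 100

# Shares of total factor income (percent), 1970 vs. 2014, as cited above.
bottom_90 = pct_decline(66, 53)
bottom_50 = pct_decline(19, 10.3)
print(round(bottom_90, 1))  # about 19.7, i.e., roughly 20 percent
print(round(bottom_50, 1))  # 45.8 percent
```

The same function applied to the roughly 12-percent fall in the official labor share makes the comparison direct: the workers’ share has fallen far faster than the labor share.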

Finally, the conventional wisdom has begun to change. Under existing economic institutions, factor shares do in fact change—and they’ve been turning against labor for decades now.

The bottom line, though, is the situation of workers is even worse than what is indicated by the declining labor share. The workers’ share has fallen even more dramatically in recent decades.

It’s time, then, for the old models—the old theoretical models as well as the models for organizing the economy—to be thrown out and replaced in order to create an economy that actually works for American workers.


Back in 2010, Charles Ferguson, the director of Inside Job, exposed the failure of prominent mainstream economists who wrote about and spoke on matters of economic policy to disclose their conflicts of interest in the lead-up to the crash of 2007-08. Reuters followed up by publishing a special report on the lack of a clear standard of disclosure for economists and other academics who testified before the Senate Banking Committee and the House Financial Services Committee between late 2008 and early 2010, as lawmakers debated the biggest overhaul of financial regulation since the 1930s.

Well, economists are still at it, leveraging their academic prestige with secret reports justifying corporate concentration.

That’s according to a new report from ProPublica:

If the government ends up approving the $85 billion AT&T-Time Warner merger, credit won’t necessarily belong to the executives, bankers, lawyers, and lobbyists pushing for the deal. More likely, it will be due to the professors.

A serial acquirer, AT&T must persuade the government to allow every major deal. Again and again, the company has relied on economists from America’s top universities to make its case before the Justice Department or the Federal Trade Commission. Moonlighting for a consulting firm named Compass Lexecon, they represented AT&T when it bought Centennial, DirecTV, and Leap Wireless; and when it tried unsuccessfully to absorb T-Mobile. And now AT&T and Time Warner have hired three top Compass Lexecon economists to counter criticism that the giant deal would harm consumers and concentrate too much media power in one company.

Today, “in front of the government, in many cases the most important advocate is the economist and lawyers come second,” said James Denvir, an antitrust lawyer at Boies, Schiller.

Economists who specialize in antitrust — affiliated with Chicago, Harvard, Princeton, the University of California, Berkeley, and other prestigious universities — reshaped their field through scholarly work showing that mergers create efficiencies of scale that benefit consumers. But they reap their most lucrative paydays by lending their academic authority to mergers their corporate clients propose. Corporate lawyers hire them from Compass Lexecon and half a dozen other firms to sway the government by documenting that a merger won’t be “anti-competitive”: in other words, that it won’t raise retail prices, stifle innovation, or restrict product offerings. Their optimistic forecasts, though, often turn out to be wrong, and the mergers they champion may be hurting the economy.

Right now, the United States is experiencing a wave of corporate mergers and acquisitions, leading to increasing levels of concentration, reminiscent of the first Gilded Age. And, according to ProPublica, a small number of hired guns from economics—who routinely move through the revolving door between government and corporate consulting—have written reports for and testified in favor of dozens of takeovers involving AT&T and many of the country’s other major corporations.

Looking forward, the appointment of Republican former U.S. Federal Trade Commission member Joshua Wright to lead Donald Trump’s transition team that is focused on the Federal Trade Commission may signal even more mergers in the years ahead. Earlier this month Wright expressed his view that

Economists have long rejected the “antitrust by the numbers” approach. Indeed, the quiet consensus among antitrust economists in academia and within the two antitrust agencies is that mergers between competitors do not often lead to market power but do often generate significant benefits for consumers — lower prices and higher quality. Sometimes mergers harm consumers, but those instances are relatively rare.

Because the economic case for a drastic change in merger policy is so weak, the new critics argue more antitrust enforcement is good for political reasons. Big companies have more political power, they say, so more antitrust can reduce this power disparity. Big companies can pay lower wages, so we should allow fewer big firms to merge to protect the working man. And big firms make more money, so using antitrust to prevent firms from becoming big will reduce income inequality too. Whatever the merits of these various policy goals, antitrust is an exceptionally poor tool to use to achieve them. Instead of allowing consumers to decide companies’ fates, courts and regulators decided them based on squishy assessments of impossible things to measure, like accumulated political power. The result was that antitrust became a tool to prevent firms from engaging in behavior that benefited consumers in the marketplace.

And, no doubt, there will be plenty of mainstream economists who will be willing, for large payouts, to present the models that justify a new wave of corporate mergers and acquisitions in the years ahead.


Mark Tansey, “Coastline Measure” (1987)

The pollsters got it wrong again, just as they did with the Brexit vote and the Colombia peace vote. In each case, they incorrectly predicted one side would win—Hillary Clinton, Remain, and yes—and many of us were taken in by the apparent certainty of the results.

I certainly was. In each case, I told family members, friends, and acquaintances it was quite possible the polls were wrong. But still, as the day approached, I found myself believing the “experts.”

It still seems, when it comes to polling, we have a great deal of difficulty with uncertainty:

Berwood Yost of Franklin & Marshall College said he wants to see polling get more comfortable with uncertainty. “The incentives now favor offering a single number that looks similar to other polls instead of really trying to report on the many possible campaign elements that could affect the outcome,” Yost said. “Certainty is rewarded, it seems.”

But election results are not the only area where uncertainty remains a problematic issue. Dani Rodrik thinks mainstream economists would do a better job defending the status quo if they acknowledged their uncertainty about the effects of globalization.

This reluctance to be honest about trade has cost economists their credibility with the public. Worse still, it has fed their opponents’ narrative. Economists’ failure to provide the full picture on trade, with all of the necessary distinctions and caveats, has made it easier to tar trade, often wrongly, with all sorts of ill effects. . .

In short, had economists gone public with the caveats, uncertainties, and skepticism of the seminar room, they might have become better defenders of the world economy.

To be fair, both groups—pollsters and mainstream economists—acknowledge the existence of uncertainty. Pollsters (and especially poll-based modelers, like one of the best, Nate Silver, as I’ve discussed here and here) always say they’re recognizing and capturing uncertainty, for example, in the “error term.”


Even Silver, whose model included a much higher probability of a Donald Trump victory than most others, expressed both defensiveness about and confidence in his forecast:

Despite what you might think, we haven’t been trying to scare anyone with these updates. The goal of a probabilistic model is not to provide deterministic predictions (“Clinton will win Wisconsin”) but instead to provide an assessment of probabilities and risks. In 2012, the risks to Obama were lower than was commonly acknowledged, because of the low number of undecided voters and his unusually robust polling in swing states. In 2016, just the opposite is true: There are lots of undecideds, and Clinton’s polling leads are somewhat thin in swing states. Nonetheless, Clinton is probably going to win, and she could win by a big margin.


As for the mainstream economists, while they may acknowledge exceptions to the rule that “everyone benefits” from free markets and international trade in some of their models and seminar discussions, they acknowledge no uncertainty whatsoever when it comes to celebrating the current economic system in their textbooks and public pronouncements.

So, what’s the alternative? They (and we) need to find better ways of discussing and possibly “modeling” uncertainty. Since the margins of error, different probabilities, and exceptions to the rule are ways of hedging their bets anyway, why not just discuss the range of possible outcomes and all of what is included and excluded, said and unsaid, measurable and unmeasurable, and so forth?

The election pollsters and statisticians may claim the public demands a single projection, prediction, or forecast. By the same token, the mainstream economists are no doubt afraid of letting the barbarian critics through the gates. In both cases, the effect is to narrow the range of relevant factors and possible outcomes that get considered.

One alternative is to open up the models and develop a more robust language to talk about fundamental uncertainty. “We simply don’t know what’s going to happen.” In both cases, that would mean presenting the full range of possible outcomes (including the possibility that there can be still other possibilities, which haven’t been considered) and discussing the biases built into the models themselves (based on the assumptions that have been used to construct them). Instead of the pseudo-rigor associated with deterministic predictions, we’d have a real rigor predicated on uncertainty, including the uncertainty of the modelers themselves.
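One simple way to present “the full range of possible outcomes” rather than a single number is to simulate the forecast’s own error and report the whole distribution. A sketch, with entirely hypothetical inputs (the lead and the error scale are my assumptions, not anyone’s actual model):

```python
import random

random.seed(0)

# Hypothetical inputs: a 3-point polling lead, with total error (sampling
# plus possible systematic bias) of 3.5 points -- itself a guess.
lead, error_sd = 3.0, 3.5

# Simulate many possible election-day margins and sort them.
sims = sorted(random.gauss(lead, error_sd) for _ in range(100_000))
p_win = sum(s > 0 for s in sims) / len(sims)
lo, hi = sims[5_000], sims[95_000]  # central 90% of simulated margins

# Report the distribution, not a single number.
print(f"win probability about {p_win:.0%}, but margins from "
      f"{lo:.1f} to {hi:.1f} points are all consistent with the polls")
```

Notice that even with a clear favorite, the simulated range comfortably includes losing margins—which is exactly the honest statement the single-number forecast suppresses.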

Admitting that they (and therefore we) simply don’t know would be a start.


Stanislas Wolff, “Phlogiston”

The other day, I argued (as I have many times over the years) that contemporary mainstream macroeconomics is in a sorry state.

Mainstream macroeconomists didn’t predict the crash. They didn’t even include the possibility of such a crash within their theory or models. And they certainly didn’t know what to do once the crash occurred.

I’m certainly not the only one who is critical of the basic theory and models of contemporary mainstream macroeconomics. And, at least recently (and, one might say, finally), many of the other critics are themselves mainstream economists—such as MIT emeritus professor and former IMF chief economist Olivier Blanchard (pdf), who has noted that the models that are central to mainstream economic research—so-called dynamic stochastic general equilibrium models—are “seriously flawed.”

Now, one of the most mainstream of the mainstream, Paul Romer (pdf), soon to be chief economist at the World Bank, has taken aim at mainstream macroeconomics.* You can get a taste of the severity of his criticisms from the abstract:

For more than three decades, macroeconomics has gone backwards. The treatment of identification now is no more credible than in the early 1970s but escapes challenge because it is so much more opaque. Macroeconomic theorists dismiss mere facts by feigning an obtuse ignorance about such simple assertions as “tight monetary policy can cause a recession.” Their models attribute fluctuations in aggregate variables to imaginary causal forces that are not influenced by the action that any person takes. A parallel with string theory from physics hints at a general failure mode of science that is triggered when respect for highly regarded leaders evolves into a deference to authority that displaces objective fact from its position as the ultimate determinant of scientific truth.

That’s right: in Romer’s view, macroeconomics (by which he means mainstream macroeconomics) “has gone backwards” for more than three decades.

Romer’s particular concern is with the “identification problem,” which in econometrics has to do with being able to solve for unique values of the parameters of a model (the so-called structural model, usually of simultaneous equations) from the values of the parameters of the reduced form of the model (i.e., the model in which the endogenous variables are expressed as functions of the exogenous variables). A supply-and-demand model of a market is a good example: it is not enough, in attempting to identify the two different supply and demand equations, to solely use observations of different quantities and prices. In particular, it’s impossible to estimate a downward slope (of the demand curve) and an upward slope (of the supply curve) with one linear regression line involving only two variables. That’s because both supply and demand curves can be shifting at the same time, and it can be difficult to disentangle the two effects. That, in a nutshell, is the “identification problem.”
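The supply-and-demand version of the problem is easy to demonstrate numerically: simulate a market in which both curves shift each period, regress observed quantities on observed prices, and the estimated slope recovers neither structural equation. A sketch (the curves and shock sizes are my own assumptions):

```python
import random

random.seed(1)

# Both curves shift every period:
#   demand: q = 10 - 1.0*p + u   (structural slope -1)
#   supply: q =  2 + 1.0*p + v   (structural slope +1)
# We only observe the equilibrium (p, q) pairs.
obs = []
for _ in range(5000):
    u = random.gauss(0, 1)   # demand shifter
    v = random.gauss(0, 1)   # supply shifter
    p = (8 + u - v) / 2      # equilibrium price: 10 - p + u = 2 + p + v
    q = 2 + p + v            # equilibrium quantity
    obs.append((p, q))

# OLS slope of q on p over the observed equilibria
n = len(obs)
mp = sum(p for p, _ in obs) / n
mq = sum(q for _, q in obs) / n
slope = (sum((p - mp) * (q - mq) for p, q in obs)
         / sum((p - mp) ** 2 for p, _ in obs))
print(round(slope, 2))  # close to zero: neither the demand nor the supply slope
```

With both shocks active, the regression line is a variance-weighted blend of the two structural slopes—here, with equal shock variances, roughly zero—which is why identification requires information from outside the price-quantity data.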

The problem is similar in macroeconomic models, and Romer finds that many mainstream economists rely on models that require and presume exogenous shocks—imaginary shocks, which “occur at just the right time and by just the right amount” (hence phlogiston)—to generate the desired results. Thus, in his view, “the real business cycle model explains recessions as exogenous decreases in phlogiston.”

The issue with phlogiston is that it can’t be directly measured. Nor, as it turns out, can many of the other effects invoked by mainstream economists. Here’s how Romer summarizes these imaginary effects:

  • A general type of phlogiston that increases the quantity of consumption goods produced by given inputs
  • An “investment-specific” type of phlogiston that increases the quantity of capital goods produced by given inputs
  • A troll who makes random changes to the wages paid to all workers
  • A gremlin who makes random changes to the price of output
  • Aether, which increases the risk preference of investors
  • Caloric, which makes people want less leisure

So, there you have it: in Romer’s view, contemporary mainstream economists rely on various types of phlogiston, a troll, a gremlin, aether, and caloric. That’s how they attempt to solve the identification problem in their models.

But, for Romer, there’s a second identification problem: mainstream economists continue to build and apply these phlogiston-identified dynamic stochastic general equilibrium models because they have “a sense of identification with the group akin to identification with a religious faith or political platform.”

The conditions for failure are present when a few talented researchers come to be respected for genuine contributions on the cutting edge of mathematical modeling. Admiration evolves into deference to these leaders. Deference leads to effort along the specific lines that the leaders recommend. Because guidance from authority can align the efforts of many researchers, conformity to the facts is no longer needed as a coordinating device. As a result, if facts disconfirm the officially sanctioned theoretical vision, they are subordinated. Eventually, evidence stops being relevant. Progress in the field is judged by the purity of its mathematical theories, as determined by the authorities.

I, for one, have no problem with group identification (I often identify with Marxists and many of the other strangers in the strange land of economics). But when it’s identification with a few leaders, and when it’s an issue of the purity of the mathematics—and not shedding light on what is actually going on out there—well, then, there’s a serious problem.

As it turns out, modern mainstream economics has two identification problems—one in the imaginary solution of the models, the other with the imagined purity of the mathematics. Together, the two identification problems mean that what is often taken to be the cutting edge of modern macroeconomics is in fact seriously flawed—and has become increasingly flawed for more than three decades.

But let me leave the last word to Daniel Drezner, who has lost all patience with mainstream economists’ self-satisfaction with their theories, models, and standing in the world:

this is a complete and total crock.

 

*Other mainstream economists, such as Narayana Kocherlakota and Noah Smith, have expressed their substantial agreement with Romer.


I have argued many times over the years that mainstream economists, especially mainstream macroeconomists, largely ignore the issue of inequality. And when they do see it, they tend to misunderstand both its causes (often attributing it to exogenous events, such as globalization and technical change) and its consequences (often failing to connect it, other than through “political capture,” to events like the crash of 2007-08).

In my view, mainstream economists overlook or forget about the role inequality plays, especially in macroeconomic events, for two major reasons. First, their theoretical and empirical models—either based on a representative agent or undifferentiated macroeconomic relationships (such as consumption and investment)—can be solved without ever conceptualizing or measuring inequality. The models they use create a theoretical blindspot. But, second, even when it’s clear they could include inequality as a significant factor, they don’t. They literally choose not to see inequality as a relevant issue in making sense of macroeconomic fluctuations. So, as I see it, when it comes to inequality, mainstream economics (especially, as I say, mainstream macroeconomics) is haunted by both a theoretical and an ethical problem.


That’s why recent research by Kurt Mitman, Dirk Krueger, and Fabrizio Perri is so interesting. What they show, using a standard macroeconomic model with household heterogeneity to account for an unequal wealth and consumption distribution, is that inequality does in fact matter. In particular, they demonstrate that the aggregate drop in expenditures depends on the distribution of wealth (e.g., it is much larger in an economy with many low-wealth consumers) and that the effects of a given macroeconomic shock are felt very differently in different segments of the wealth distribution (e.g., low-wealth households have little ability to insure themselves against risk, and thus the welfare impact of a recession is significantly larger for them). As a consequence, they make it abundantly clear that ignoring inequality means failing to understand the severity of a macroeconomic downturn and underestimating the welfare costs of a deep recession.
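The intuition can be conveyed with a deliberately crude two-household illustration (my own construction, not the authors’ model): give the same proportional income shock to two economies with identical aggregate income, and the one in which half of income goes to hand-to-mouth households with a high marginal propensity to consume suffers a much larger drop in aggregate spending.

```python
# Crude two-household illustration (my construction, not the authors' model)
# of why the wealth distribution matters for the depth of a downturn.

def spending_drop(households, shock):
    """Aggregate consumption drop: each household cuts spending by its
    marginal propensity to consume (mpc) times its lost income."""
    return sum(income * shock * mpc for income, mpc in households)

shock = 0.10  # everyone loses 10 percent of income

# Same aggregate income (100) in both economies; only the wealth
# distribution, and hence the MPCs, differs. Each entry is (income, mpc).
buffered      = [(50, 0.3), (50, 0.3)]  # both halves can smooth consumption
hand_to_mouth = [(50, 0.3), (50, 0.9)]  # low-wealth half cannot

print(spending_drop(buffered, shock))       # drop in the wealth-buffered economy
print(spending_drop(hand_to_mouth, shock))  # twice as large in the unequal one
```

All the parameter values are invented for illustration; the qualitative point—that the same shock hits harder where low-wealth, high-MPC households predominate—is the paper’s.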

That’s not all the work that needs to be done, of course. Mitman et al. rely on exogenous macroeconomic shocks rather than analyzing how inequality itself plays a role in creating the conditions for an economic downturn. But even their limited attempt to include inequality as a significant factor in an otherwise-mainstream macroeconomic model demonstrates that such work can in fact be done.

In other words, it’s not that mainstream economists can’t make sense of inequality in their models. They simply, for the most part, choose not to.


I know I shouldn’t. But there are so many wrong-headed assertions in the latest Bloomberg column by Noah Smith, “Economics Without Math Is Trendy, But It Doesn’t Add Up,” that I can’t let it pass.

But first let me give him credit for his opening observation, one I myself have made plenty of times on this blog and elsewhere:

There’s no question that mainstream academic macroeconomics failed pretty spectacularly in 2008. It didn’t just fail to predict the crisis — most models, including Nobel Prize-winning ones, didn’t even admit the possibility of a crisis. The vast majority of theories didn’t even include a financial sector.

And in the deep, long recession that followed, mainstream macro theory failed to give policymakers any consistent guidance as to how to respond. Some models recommended fiscal stimulus, some favored forward guidance by the central bank, and others said there was simply nothing at all to be done.

It is, in fact, as Smith himself claims, a “dismal record.”

But then Smith goes off the tracks, with a long series of misleading and mistaken assertions about economics, especially heterodox economics. Let me list some of them:

  • citing a mainstream economist’s characterization of heterodox economics (when he could have, just as easily, sent readers to the Heterodox Economics Directory—or, for that matter, my own blog posts on heterodox economics)
  • presuming that heterodox economics is mostly non-quantitative (although he might have consulted any number of books by economists from various heterodox traditions or journals in which heterodox economists publish articles, many of which contain quantitative—theoretical and empirical—work)
  • equating formal, mathematical, and quantitative (when, in fact, one can find formal models that are neither mathematical nor quantitative)
  • also equating nonquantitative, broad, and vague (when, in fact, there is plenty of nonquantitative work in economics that is quite specific and unambiguous)
  • arguing that nonquantitative economics is uniquely subject to interpretation and reinterpretation (as against, what, the singular meaning of the Arrow-Debreu general equilibrium system or the utility-maximization that serves as the microfoundations of mainstream macroeconomics?)
  • concluding that “heterodox economics hasn’t really produced a replacement for mainstream macro”

Actually, this is the kind of quick and easy dismissal of whole traditions—from Karl Marx to Hyman Minsky—most heterodox economists are quite familiar with.

My own view, for what it’s worth, is that there’s no need for work in economics to be formal, quantitative, or mathematical (however those terms are defined) in order for it to be useful, valuable, or insightful (again, however defined)—including, of course, work in traditions that run from Marx to Minsky, which focused on the possibility of a crisis, warned of an impending crisis, and offered specific guidance about what to do once the crisis broke out.

But if Smith wants some heterodox macroeconomics that uses some combination of formal, mathematical, and quantitative techniques he need look no further than a volume of essays that happens to have been published in 2009 (and therefore written earlier), just as the crisis was spreading across the United States and the world economy. I’m referring to Heterodox Macroeconomics: Keynes, Marx and Globalization, edited by Jonathan P. Goldstein and Michael G. Hillard.

There, Smith will find the equation at the top of the post, which is very simple but contains an idea that one will simply not find in mainstream macroeconomics. It’s merely an income share-weighted version of a Keynesian consumption function (for a two-class world), which has the virtue of placing the distribution of income at the center of the macroeconomic story.* Add to that an investment function, which depends on the profit rate (which in turn depends on the profit share of income and capacity utilization) and you’ve got a system in which “alterations in the distribution of income can have important and potentially offsetting impacts on the level of effective demand.”
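The equation itself appeared as an image at the top of the original post. In the notation of the footnote below (the symbols here are my own choice, following the footnote’s verbal definition), a consumption function of this kind can be written:

```latex
% Two-class, income share-weighted consumption function.
% Notation (illustrative, not necessarily the book's):
%   C      = aggregate consumption, Y = aggregate income
%   c_w    = marginal propensity to consume of labor
%   c_\pi  = marginal propensity to consume of capital
%   \omega = labor's share of income, so (1 - \omega) is the profit share
\[
  C \;=\; \bigl[\, c_w\,\omega \;+\; c_\pi\,(1-\omega) \,\bigr]\, Y ,
  \qquad 0 \le c_\pi < c_w \le 1 .
\]
```

With c_π < c_w, a shift of income from wages to profits (a fall in ω) lowers consumption at any given level of income, which is exactly the sense in which the distribution of income sits at the center of the macroeconomic story.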

And heterodox traditions within macroeconomics have built on these relatively simple ideas, including

a microfounded Keynes–Marx theory of investment that further incorporates the external financing of investment based upon uncertain future profits, the irreversibility of investment and the coercive role of competition on investment. In this approach, the investment function is extended to depend on the profit rate, long-term and short-term heuristics for the firm’s financial robustness and the intensity of competition. It is the interaction of these factors that fundamentally alters the nature of the investment function, particularly the typical role assigned to capacity utilization. The main dynamic of the model is an investment-induced growth-financial safety tradeoff facing the firm. Using this approach, a ceteris paribus increase in the financial fragility of the firm reduces investment and can be used to explain autonomous financial crises. In addition, the typical behavior of the profit rate, particularly changes in income shares, is preserved in this theory. Along these lines, the interaction of the profit rate and financial determinants allows for real sector sources of financial fragility to be incorporated into a macro model. Here, a profit squeeze that shifts expectations of future profits forces firms and lenders to alter their perceptions on short-term and long-term levels of acceptable debt. The responses of these agents can produce a cycle based on increases in financial fragility.

It’s true: such a model does not lead to a specific forecast or prediction. (In fact, it’s more a long-term model than an explanation of short-run instabilities.) But it does provide an understanding of the movements of consumption and investment that help to explain how and why a crisis of capitalism might occur. Therefore, it represents a replacement for the mainstream macroeconomics that exhibited a dismal record with respect to the crash of 2007-08.
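The comparative-static claim in the quoted passage can be put in a deliberately crude sketch. This is emphatically not the model from Goldstein and Hillard’s volume; every functional form, name, and number below is hypothetical, chosen only to make the mechanism concrete:

```python
# Illustrative sketch only: an investment function extended beyond the
# profit rate to include financial fragility and competitive intensity.
# All functional forms, names, and numbers here are hypothetical.

def profit_rate(profit_share, capacity_utilization, output_capital_ratio=0.5):
    # r = (profit share) x (capacity utilization) x (output-capital ratio)
    return profit_share * capacity_utilization * output_capital_ratio

def investment(r, fragility, competition, a=0.8, b=0.5, c=0.2):
    """Investment rate: rises with the profit rate r and with competitive
    pressure, falls as the firm's financial fragility mounts."""
    return max(0.0, a * r - b * fragility + c * competition)

r = profit_rate(profit_share=0.30, capacity_utilization=0.85)
baseline = investment(r, fragility=0.10, competition=0.20)
squeezed = investment(r, fragility=0.40, competition=0.20)

# A ceteris paribus increase in financial fragility reduces investment,
# which is the comparative-static claim in the quoted passage.
assert squeezed < baseline
```

The point of the sketch is simply that once fragility enters the investment function, a profit squeeze that raises perceived fragility can cut investment even with capacity utilization unchanged.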

But maybe it’s not the lack of quantitative analysis in heterodox macroeconomics that troubles Smith so much. Perhaps it’s really the conclusion—the fact that

The current crisis combines the development of under-consumption, over-investment and financial fragility tendencies built up over the last 25 years and associated with a finance-led accumulation regime.

And, for that constellation of problems, there’s no particular advice or quick fix for Smith’s “policymakers and investors”—except, of course, to get rid of capitalism.

 

*Technically, consumption (C) is a function of the marginal propensity to consume of labor, labor’s share of income, the marginal propensity to consume of capital, and the profit share of income.

 


John Lennon (on the B side of “Imagine”) thought that life was hard, “really hard.” I can understand that.

But is modeling inequality really all that hard?

Paul Krugman seems to think so, at least when it comes to the size or personal distribution of income. That’s his excuse for why mainstream economists were late to the inequality party: they just didn’t know how to model it.

And, according to Krugman, not even Marx can be of much help.

Well, let’s see. It’s true, Marx focused on the factor distribution of income—wages, profits, and rent, to laborers, capitalists, and landowners—because his critique was directed at classical political economy. And the classical political economists—especially Smith and Ricardo—did, in fact, focus their attention on factor shares.

That was Marx’s goal in the chapter on the Trinity Formula: to show that what the classicals thought were separate sources of income to the three factors of production all stemmed from value created by labor. Thus, for example, laborers received in the form of wages part of the value they created (“that portion of his labour appears which we call necessary labour”); the rest, the surplus-value, was divided among capitalists (“as dividends proportionate to the share of the social capital each holds”) and landed property (which “is confined to transferring a portion of the produced surplus-value from the pockets of capital to its own”).

It is really just a short step to show both that, in recent decades (from the mid-1970s onward), more surplus-value has been pumped out of the direct producers and that investment bankers, CEOs, and other members of the 1 percent have been able to capture a large share of that mass of surplus-value. That’s how we can connect changing factor (wage and profit) shares to the increasingly unequal individual distribution of income (including the rising percentage of income going to the top 1, .01, and .001 percents).
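The “short step” can even be put as back-of-the-envelope arithmetic. The numbers below are hypothetical, chosen only to illustrate the mechanism, not drawn from any data:

```python
# Hypothetical illustration: connecting factor shares to the personal
# distribution of income. All numbers are made up for the arithmetic.

def top1_income_share(profit_share, surplus_captured_by_top1, wage_share_of_top1=0.0):
    """Income share of the top 1 percent: the fraction of the surplus
    (profit share) they capture, plus any wage income they receive."""
    wage_share = 1.0 - profit_share
    return profit_share * surplus_captured_by_top1 + wage_share * wage_share_of_top1

# Stylized "before": profit share 25%, top 1% captures 30% of it.
before = top1_income_share(0.25, 0.30)
# Stylized "after": profit share 30%, top 1% captures 50% of it.
after = top1_income_share(0.30, 0.50)
print(round(before, 3), round(after, 3))  # prints: 0.075 0.15
```

A rising profit share combined with a rising fraction of the surplus captured at the top mechanically doubles the top group’s income share in this toy example.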

See, that wasn’t so hard. . .


You remember the dialogue:

Queen: Slave in the magic mirror, come from the farthest space, through wind and darkness I summon thee. Speak! Let me see thy face.

Magic Mirror: What wouldst thou know, my Queen?

Queen: Magic Mirror on the wall, who is the fairest one of all?

Magic Mirror: Famed is thy beauty, Majesty. But hold, a lovely maid I see. Rags cannot hide her gentle grace. Alas, she is more fair than thee.

I was reminded of this particular snippet from Snow White and the Seven Dwarfs while reading the various defenses of contemporary macroeconomic models. Mainstream macroeconomists failed to predict the most recent economic crisis, the worst since the Great Depression of the 1930s, but, according to them, everything in macroeconomics is just fine.

There’s David Andolfatto, who argues that the goal of macro models is not really prediction; it is, instead, only conditional forecasts (“IF a set of circumstances hold, THEN a number of events are likely to follow.”). So, in his view, the existing models are mostly fine—as long as they’re supplemented with some “financial market frictions” and a bit of economic history.

Mark Thoma, for his part, mostly agrees with Andolfatto but adds we need to ask the right questions.

we didn’t foresee the need to ask questions (and build models) that would be useful in a financial crisis — we were focused on models that would explain “normal times” (which is connected to the fact that we thought the Great Moderation would continue due to arrogance on behalf of economists leading to the belief that modern policy tools, particularly from the Fed, would prevent major meltdowns, financial or otherwise). That is happening now, so we’ll be much more prepared if history repeats itself, but I have to wonder what other questions we should be asking, but aren’t.

Then, of course, there’s Paul Krugman who (not for the first time) defends hydraulic Keynesianism (aka Hicksian IS/LM models)—”little equilibrium models with some real-world adjustments”—which in his view have been “stunningly successful.”

And, finally, to complete my sample from just the last couple of days, we have Noah Smith, who defends the existing macroeconomic models—because they’re models!—and chides heterodox economists for not having any alternative models to offer.

The issue, as I see it, is not whether there’s a macroeconomic model (e.g., dynamic stochastic general equilibrium, as depicted in the illustration above, or Bastard Keynesian or whatever) that can, with the appropriate external “shock,” generate a boom-and-bust cycle or zero-lower-bound case for government intervention. There’s a whole host of models that can generate such outcomes.

No, there are two real issues that are never even mentioned in these attempts to defend contemporary macroeconomic models. First, what is widely recognized to be the single most important economic problem of our time—the growing inequality in the distribution of income and wealth—doesn’t (and, in models with a single representative agent, simply can’t) play a role, either as a cause of boom-and-bust cycles or as a result of the lopsided recovery that has come from the kinds of fiscal and monetary policies that have been used in recent years.

That’s the specific issue. And then there’s a second, more general issue: the only way you can get an economic crisis from mainstream models (of whatever stripe, using however much math) is via some kind of external shock. The biggest problem with existing models is not that they failed to predict the crisis; it’s that the only possibility of a crisis comes from an exogenous event. The key failure of mainstream macroeconomic models is to exclude from analysis the idea that the “normal” workings of capitalism generate economic crises on a regular basis—some of which are relatively mild recessions, others of which (such as we’ve seen since 2007) are full-scale depressions.  What really should be of interest are theories that generate boom-and-bust cycles based on endogenous events within capitalism itself.

With respect to both these issues, contemporary mainstream macroeconomic models have “stunningly” failed.

I imagine that’s what the slave in the magic mirror, who simply will not lie to the Queen, would say.


Mainstream economics has been a disaster, especially since the crash of 2007-08. It wasn’t able to predict the onset of the crisis. It didn’t even include the possibility of such a crisis. And it certainly hasn’t been a reliable guide to getting out of the crisis.

And yet economist after economist has been stepping forward—even on the liberal side of things—to try to convince us that things are pretty much OK in the land of mainstream economics.

Just the other day, Paul Krugman tried to convince us that, leaving aside the failure to predict the crisis or even to envision the possibility of a crisis occurring, mainstream models “did a pretty good job of predicting how things would play out in the aftermath.” The problem, for Krugman, all comes down to the “bad behavior” of some economists who have been more interested in defending partisan turf than in getting things right.

Now, Mark Thoma wants to argue that the macroeconomic models—including the “dynamic stochastic general equilibrium” models that have become the stock-in-trade of mainstream macroeconomics for the past couple of decades—are just fine. The problem, as Thoma sees it, is not with the theory or the models but with the questions economists have been asking.

What neither Krugman nor Thoma wants to admit is that those very same models—hydraulic IS-LM in the case of Krugman; the rational expectations, dynamic optimization, and representative agents of DSGE in the case of Thoma—actually direct the behavior of economists and delimit the questions they can ask. Those models are so many theoretical lenses on the world, which determine how the economists who use them interpret the world.

I understand: Krugman and Thoma desperately want to keep the precious baby. But that also means we’re stuck with the increasingly dirty bathwater.