Posts Tagged ‘models’


Back in 2010, Charles Ferguson, the director of Inside Job, exposed the failure of prominent mainstream economists who wrote about and spoke on matters of economic policy to disclose their conflicts of interest in the lead-up to the crash of 2007-08. Reuters followed up by publishing a special report on the lack of a clear standard of disclosure for economists and other academics who testified before the Senate Banking Committee and the House Financial Services Committee between late 2008 and early 2010, as lawmakers debated the biggest overhaul of financial regulation since the 1930s.

Well, economists are still at it, leveraging their academic prestige with secret reports justifying corporate concentration.

That’s according to a new report from ProPublica:

If the government ends up approving the $85 billion AT&T-Time Warner merger, credit won’t necessarily belong to the executives, bankers, lawyers, and lobbyists pushing for the deal. More likely, it will be due to the professors.

A serial acquirer, AT&T must persuade the government to allow every major deal. Again and again, the company has relied on economists from America’s top universities to make its case before the Justice Department or the Federal Trade Commission. Moonlighting for a consulting firm named Compass Lexecon, they represented AT&T when it bought Centennial, DirecTV, and Leap Wireless; and when it tried unsuccessfully to absorb T-Mobile. And now AT&T and Time Warner have hired three top Compass Lexecon economists to counter criticism that the giant deal would harm consumers and concentrate too much media power in one company.

Today, “in front of the government, in many cases the most important advocate is the economist and lawyers come second,” said James Denvir, an antitrust lawyer at Boies, Schiller.

Economists who specialize in antitrust — affiliated with Chicago, Harvard, Princeton, the University of California, Berkeley, and other prestigious universities — reshaped their field through scholarly work showing that mergers create efficiencies of scale that benefit consumers. But they reap their most lucrative paydays by lending their academic authority to mergers their corporate clients propose. Corporate lawyers hire them from Compass Lexecon and half a dozen other firms to sway the government by documenting that a merger won’t be “anti-competitive”: in other words, that it won’t raise retail prices, stifle innovation, or restrict product offerings. Their optimistic forecasts, though, often turn out to be wrong, and the mergers they champion may be hurting the economy.

Right now, the United States is experiencing a wave of corporate mergers and acquisitions, leading to increasing levels of concentration, reminiscent of the first Gilded Age. And, according to ProPublica, a small number of hired guns from economics—who routinely move through the revolving door between government and corporate consulting—have written reports for and testified in favor of dozens of takeovers involving AT&T and many of the country’s other major corporations.

Looking forward, the appointment of Joshua Wright, a Republican former member of the U.S. Federal Trade Commission, to lead the Donald Trump transition team focused on that agency may signal even more mergers in the years ahead. Earlier this month, Wright expressed his view that

Economists have long rejected the “antitrust by the numbers” approach. Indeed, the quiet consensus among antitrust economists in academia and within the two antitrust agencies is that mergers between competitors do not often lead to market power but do often generate significant benefits for consumers — lower prices and higher quality. Sometimes mergers harm consumers, but those instances are relatively rare.

Because the economic case for a drastic change in merger policy is so weak, the new critics argue more antitrust enforcement is good for political reasons. Big companies have more political power, they say, so more antitrust can reduce this power disparity. Big companies can pay lower wages, so we should allow fewer big firms to merge to protect the working man. And big firms make more money, so using antitrust to prevent firms from becoming big will reduce income inequality too. Whatever the merits of these various policy goals, antitrust is an exceptionally poor tool to use to achieve them. Instead of allowing consumers to decide companies’ fates, courts and regulators decided them based on squishy assessments of impossible things to measure, like accumulated political power. The result was that antitrust became a tool to prevent firms from engaging in behavior that benefited consumers in the marketplace.

And, no doubt, there will be plenty of mainstream economists who will be willing, for large payouts, to present the models that justify a new wave of corporate mergers and acquisitions in the years ahead.


Mark Tansey, “Coastline Measure” (1987)

The pollsters got it wrong again, just as they did with the Brexit vote and the Colombia peace vote. In each case, they incorrectly predicted one side would win—Hillary Clinton, Remain, and “yes” in Colombia—and many of us were taken in by the apparent certainty of the results.

I certainly was. In each case, I told family members, friends, and acquaintances it was quite possible the polls were wrong. But still, as the day approached, I found myself believing the “experts.”

It still seems, when it comes to polling, we have a great deal of difficulty with uncertainty:

Berwood Yost of Franklin & Marshall College said he wants to see polling get more comfortable with uncertainty. “The incentives now favor offering a single number that looks similar to other polls instead of really trying to report on the many possible campaign elements that could affect the outcome,” Yost said. “Certainty is rewarded, it seems.”

But election results are not the only area where uncertainty poses a problem. Dani Rodrik thinks mainstream economists would do a better job defending the status quo if they acknowledged their uncertainty about the effects of globalization.

This reluctance to be honest about trade has cost economists their credibility with the public. Worse still, it has fed their opponents’ narrative. Economists’ failure to provide the full picture on trade, with all of the necessary distinctions and caveats, has made it easier to tar trade, often wrongly, with all sorts of ill effects. . .

In short, had economists gone public with the caveats, uncertainties, and skepticism of the seminar room, they might have become better defenders of the world economy.

To be fair, both groups—pollsters and mainstream economists—acknowledge the existence of uncertainty. Pollsters (and especially poll-based modelers, like one of the best, Nate Silver, as I’ve discussed here and here) always say they’re recognizing and capturing uncertainty, for example, in the “error term.”


Even Silver, whose model included a much higher probability of a Donald Trump victory than most others, expressed both defensiveness about and confidence in his forecast:

Despite what you might think, we haven’t been trying to scare anyone with these updates. The goal of a probabilistic model is not to provide deterministic predictions (“Clinton will win Wisconsin”) but instead to provide an assessment of probabilities and risks. In 2012, the risks to Obama were lower than was commonly acknowledged, because of the low number of undecided voters and his unusually robust polling in swing states. In 2016, just the opposite is true: There are lots of undecideds, and Clinton’s polling leads are somewhat thin in swing states. Nonetheless, Clinton is probably going to win, and she could win by a big margin.


As for the mainstream economists, while they may acknowledge exceptions to the rule that “everyone benefits” from free markets and international trade in some of their models and seminar discussions, they acknowledge no uncertainty whatsoever when it comes to celebrating the current economic system in their textbooks and public pronouncements.

So, what’s the alternative? They (and we) need to find better ways of discussing and possibly “modeling” uncertainty. Since the margins of error, different probabilities, and exceptions to the rule are ways of hedging their bets anyway, why not just discuss the range of possible outcomes and all of what is included and excluded, said and unsaid, measurable and unmeasurable, and so forth?

The election pollsters and statisticians may claim the public demands a single projection, prediction, or forecast. By the same token, the mainstream economists are no doubt afraid of letting the barbarian critics through the gates. In both cases, the effect is to narrow the range of relevant factors and possible outcomes that get considered.

One alternative is to open up the models and develop a more robust language to talk about fundamental uncertainty. “We simply don’t know what’s going to happen.” In both cases, that would mean presenting the full range of possible outcomes (including the possibility that there can be still other possibilities, which haven’t been considered) and discussing the biases built into the models themselves (based on the assumptions that have been used to construct them). Instead of the pseudo-rigor associated with deterministic predictions, we’d have a real rigor predicated on uncertainty, including the uncertainty of the modelers themselves.
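
Here is what that could look like in practice: a toy sketch (my own, not Silver’s model or anyone else’s) that reports a full range of outcomes rather than a single number, simply by admitting that the polling average itself may be systematically off in either direction.

```python
# Toy illustration of reporting a range of outcomes instead of a point forecast.
# All numbers are purely illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
polled_margin = 3.0     # candidate A leads by 3 points in the polling average
sampling_error = 2.0    # noise from finite poll samples
possible_bias = 3.0     # how far the polls as a whole might be off, either way

margins = (polled_margin
           + rng.normal(0.0, sampling_error, 100_000)   # sampling noise
           + rng.normal(0.0, possible_bias, 100_000))   # shared polling error

print(f"P(A wins): {np.mean(margins > 0):.0%}")
print("5th-95th percentile of the final margin:",
      np.round(np.percentile(margins, [5, 95]), 1))
```

Once the possibility of a shared polling error is on the table, the apparent certainty of a three-point lead dissolves into a wide band of outcomes, including a loss.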

Admitting that they (and therefore we) simply don’t know would be a start.


Stanislas Wolff, “Phlogiston”

The other day, I argued (as I have many times over the years) that contemporary mainstream macroeconomics is in a sorry state.

Mainstream macroeconomists didn’t predict the crash. They didn’t even include the possibility of such a crash within their theory or models. And they certainly didn’t know what to do once the crash occurred.

I’m certainly not the only one who is critical of the basic theory and models of contemporary mainstream macroeconomics. And, at least recently (and, one might say, finally), many of the other critics are themselves mainstream economists—such as MIT emeritus professor and former IMF chief economist Olivier Blanchard (pdf), who has noted that the models that are central to mainstream economic research—so-called dynamic stochastic general equilibrium models—are “seriously flawed.”

Now, one of the most mainstream of the mainstream, Paul Romer (pdf), soon to be chief economist at the World Bank, has taken aim at mainstream macroeconomics.* You can get a taste of the severity of his criticisms from the abstract:

For more than three decades, macroeconomics has gone backwards. The treatment of identification now is no more credible than in the early 1970s but escapes challenge because it is so much more opaque. Macroeconomic theorists dismiss mere facts by feigning an obtuse ignorance about such simple assertions as “tight monetary policy can cause a recession.” Their models attribute fluctuations in aggregate variables to imaginary causal forces that are not influenced by the action that any person takes. A parallel with string theory from physics hints at a general failure mode of science that is triggered when respect for highly regarded leaders evolves into a deference to authority that displaces objective fact from its position as the ultimate determinant of scientific truth.

That’s right: in Romer’s view, macroeconomics (by which he means mainstream macroeconomics) “has gone backwards” for more than three decades.

Romer’s particular concern is with the “identification problem,” which in econometrics has to do with being able to solve for unique values of the parameters of a model (the so-called structural model, usually a system of simultaneous equations) from the values of the parameters of the reduced form of the model (i.e., the model in which the endogenous variables are expressed as functions of the exogenous variables). A supply-and-demand model of a market is a good example: in attempting to identify the two separate supply and demand equations, it is not enough to use only observations of different quantities and prices. In particular, it’s impossible to estimate both a downward slope (of the demand curve) and an upward slope (of the supply curve) with a single linear regression involving only those two variables. That’s because both the supply and demand curves can be shifting at the same time, and it can be difficult to disentangle the two effects. That, in a nutshell, is the “identification problem.”
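
A minimal simulation (my own sketch, not Romer’s) makes the point concrete: if both curves are shifting, a regression of quantity on price recovers neither the demand slope nor the supply slope.

```python
# Simulate a market where both the demand and supply curves shift each period,
# then regress observed quantity on observed price. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
b_demand, b_supply = -2.0, 1.5          # true structural slopes

u_d = rng.normal(0, 5, n)               # demand shifters (tastes, income, ...)
u_s = rng.normal(0, 5, n)               # supply shifters (costs, weather, ...)

# Market-clearing (reduced-form) price and quantity:
# demand: q = 100 - 2p + u_d ; supply: q = 10 + 1.5p + u_s
p = (100 - 10 + u_d - u_s) / (1.5 + 2.0)
q = 10 + b_supply * p + u_s

estimated_slope = np.polyfit(p, q, 1)[0]   # OLS of q on p
print(f"OLS slope: {estimated_slope:.2f} "
      f"(true demand slope: {b_demand}, true supply slope: {b_supply})")
```

The fitted slope is a blend of the two structural slopes that identifies neither; without some outside source of identification (something that shifts only one of the curves), the price-quantity data alone cannot separate them.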

The problem is similar in macroeconomic models, and Romer finds that many mainstream economists rely on models that require and presume exogenous shocks—imaginary shocks, which “occur at just the right time and by just the right amount” (hence phlogiston)—to generate the desired results. Thus, in his view, “the real business cycle model explains recessions as exogenous decreases in phlogiston.”
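
To see what that looks like in the simplest possible terms, here is a stylized sketch (mine, not Romer’s) of an RBC-style “explanation” of the business cycle: feed in an exogenous AR(1) productivity (phlogiston) process and, with inputs held fixed, output simply inherits its ups and downs.

```python
# A bare-bones "real business cycle" caricature: all fluctuations come from an
# assumed exogenous shock series. Parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(2)
rho, sigma, T = 0.95, 0.01, 200          # persistence and size of the shock
log_tfp = np.zeros(T)
for t in range(1, T):
    log_tfp[t] = rho * log_tfp[t - 1] + rng.normal(0.0, sigma)

log_output = log_tfp                     # labor and capital fixed: output tracks the shock
print(f"output ranges from {log_output.min():.1%} to {log_output.max():.1%} "
      "relative to trend, driven entirely by the assumed shock series")
```

Whatever such a model “explains,” the explanation rests entirely on a forcing process that occurs at just the right time and by just the right amount.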

The issue with phlogiston is that it can’t be directly measured. Nor, as it turns out, can many of the other effects invoked by mainstream economists. Here’s how Romer summarizes these imaginary effects:

  • A general type of phlogiston that increases the quantity of consumption goods produced by given inputs
  • An “investment-specific” type of phlogiston that increases the quantity of capital goods produced by given inputs
  • A troll who makes random changes to the wages paid to all workers
  • A gremlin who makes random changes to the price of output
  • Aether, which increases the risk preference of investors
  • Caloric, which makes people want less leisure

So, there you have it: in Romer’s view, contemporary mainstream economists rely on various types of phlogiston, a troll, a gremlin, aether, and caloric. That’s how they attempt to solve the identification problem in their models.

But, for Romer, there’s a second identification problem: mainstream economists continue to build and apply these phlogiston-identified dynamic stochastic general equilibrium models because they have “a sense of identification with the group akin to identification with a religious faith or political platform.”

The conditions for failure are present when a few talented researchers come to be respected for genuine contributions on the cutting edge of mathematical modeling. Admiration evolves into deference to these leaders. Deference leads to effort along the specific lines that the leaders recommend. Because guidance from authority can align the efforts of many researchers, conformity to the facts is no longer needed as a coordinating device. As a result, if facts disconfirm the officially sanctioned theoretical vision, they are subordinated. Eventually, evidence stops being relevant. Progress in the field is judged by the purity of its mathematical theories, as determined by the authorities.

I, for one, have no problem with group identification (I often identify with Marxists and many of the other strangers in the strange land of economics). But when it’s identification with a few leaders, and when it’s an issue of the purity of the mathematics—and not shedding light on what is actually going on out there—well, then, there’s a serious problem.

As it turns out, modern mainstream economics has two identification problems—one in the imaginary solution of the models, the other with the imagined purity of the mathematics. Together, the two identification problems mean that what is often taken to be the cutting edge of modern macroeconomics is in fact seriously flawed—and has become increasingly flawed for more than three decades.

But let me leave the last word to Daniel Drezner, who has lost all patience with mainstream economists’ self-satisfaction with their theories, models, and standing in the world:

this is a complete and total crock.

 

*Other mainstream economists, such as Narayana Kocherlakota and Noah Smith, have expressed their substantial agreement with Romer.


I have argued many times over the years that mainstream economists, especially mainstream macroeconomists, largely ignore the issue of inequality. And when they do see it, they tend to misunderstand both its causes (often attributing it to exogenous events, such as globalization and technical change) and its consequences (often failing to connect it, other than through “political capture,” to events like the crash of 2007-08).

In my view, mainstream economists overlook or forget about the role inequality plays, especially in macroeconomic events, for two major reasons. First, their theoretical and empirical models—based either on a representative agent or on undifferentiated macroeconomic relationships (such as consumption and investment)—can be solved without ever conceptualizing or measuring inequality. The models they use create a theoretical blind spot. But, second, even when it’s clear they could include inequality as a significant factor, they don’t. They literally choose not to see inequality as a relevant issue in making sense of macroeconomic fluctuations. So, as I see it, when it comes to inequality, mainstream economics (especially, as I say, mainstream macroeconomics) is haunted by both a theoretical and an ethical problem.


That’s why recent research by Kurt Mitman, Dirk Krueger, and Fabrizio Perri is so interesting. What they show, using a standard macroeconomic model with household heterogeneity to account for an unequal wealth and consumption distribution, is that inequality does in fact matter. In particular, they demonstrate that the aggregate drop in expenditures depends on the distribution of wealth (e.g., it is much larger in an economy with many low-wealth consumers) and that the effects of a given macroeconomic shock are felt very differently in different segments of the wealth distribution (e.g., low-wealth households have little ability to insure themselves against risk, and thus the welfare impact of a recession is significantly larger for them). As a consequence, they make it abundantly clear that ignoring inequality means failing to understand the severity of a macroeconomic downturn and underestimating the welfare costs of a deep recession.
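
A back-of-the-envelope sketch (mine, not the authors’ model, with purely illustrative numbers) shows why the distribution matters for the aggregate: low-wealth, hand-to-mouth households cut spending by more per dollar of lost income, so an economy with more of them contracts by more after the same shock.

```python
# Aggregate spending response to a common income shock under two different
# wealth distributions. All numbers are illustrative assumptions.
mpc_low_wealth, mpc_high_wealth = 0.8, 0.3   # marginal propensities to consume
income_shock = -0.10                         # a 10 percent drop in everyone's income

for label, share_low_wealth in [("few low-wealth households", 0.2),
                                ("many low-wealth households", 0.6)]:
    aggregate_mpc = (share_low_wealth * mpc_low_wealth
                     + (1 - share_low_wealth) * mpc_high_wealth)
    print(f"{label}: aggregate spending falls by "
          f"{abs(aggregate_mpc * income_shock):.1%}")
```

A representative-agent model, by construction, sees only a single average household and so cannot register that difference at all.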

That’s not all the work that needs to be done, of course. Mitman et al. rely on exogenous macroeconomic shocks rather than analyzing how inequality itself plays a role in creating the conditions for an economic downturn. But even their limited attempt to include inequality as a significant factor in an otherwise-mainstream macroeconomic model demonstrates that such work can in fact be done.

In other words, it’s not that mainstream economists can’t make sense of inequality in their models. They simply, for the most part, choose not to.


I know I shouldn’t. But there are so many wrong-headed assertions in the latest Bloomberg column by Noah Smith, “Economics Without Math Is Trendy, But It Doesn’t Add Up,” that I can’t let it pass.

But first let me give him credit for his opening observation, one I myself have made plenty of times on this blog and elsewhere:

There’s no question that mainstream academic macroeconomics failed pretty spectacularly in 2008. It didn’t just fail to predict the crisis — most models, including Nobel Prize-winning ones, didn’t even admit the possibility of a crisis. The vast majority of theories didn’t even include a financial sector.

And in the deep, long recession that followed, mainstream macro theory failed to give policymakers any consistent guidance as to how to respond. Some models recommended fiscal stimulus, some favored forward guidance by the central bank, and others said there was simply nothing at all to be done.

It is, in fact, as Smith himself claims, a “dismal record.”

But then Smith goes off the tracks, with a long series of misleading and mistaken assertions about economics, especially heterodox economics. Let me list some of them:

  • citing a mainstream economist’s characterization of heterodox economics (when he could have, just as easily, sent readers to the Heterodox Economics Directory—or, for that matter, my own blog posts on heterodox economics)
  • presuming that heterodox economics is mostly non-quantitative (although he might have consulted any number of books by economists from various heterodox traditions or journals in which heterodox economists publish articles, many of which contain quantitative—theoretical and empirical—work)
  • equating formal, mathematical, and quantitative (when, in fact, one can find formal models that are neither mathematical nor quantitative)
  • also equating nonquantitative, broad, and vague (when, in fact, there is plenty of nonquantitative work in economics that is quite specific and unambiguous)
  • arguing that nonquantitative economics is uniquely subject to interpretation and reinterpretation (as against, what, the singular meaning of the Arrow-Debreu general equilibrium system or the utility-maximization that serves as the microfoundations of mainstream macroeconomics?)
  • concluding that “heterodox economics hasn’t really produced a replacement for mainstream macro”

Actually, this is the kind of quick and easy dismissal of whole traditions—from Karl Marx to Hyman Minsky—most heterodox economists are quite familiar with.

My own view, for what it’s worth, is that there’s no need for work in economics to be formal, quantitative, or mathematical (however those terms are defined) in order for it to be useful, valuable, or insightful (again, however defined)—including, of course, work in traditions that run from Marx to Minsky, which focused on the possibility of a crisis, warned of an impending crisis, and offered specific guidance about what to do once the crisis broke out.

But if Smith wants some heterodox macroeconomics that uses some combination of formal, mathematical, and quantitative techniques he need look no further than a volume of essays that happens to have been published in 2009 (and therefore written earlier), just as the crisis was spreading across the United States and the world economy. I’m referring to Heterodox Macroeconomics: Keynes, Marx and Globalization, edited by Jonathan P. Goldstein and Michael G. Hillard.

There, Smith will find the equation at the top of the post, which is very simple but contains an idea that one will not find in mainstream macroeconomics. It’s merely an income share-weighted version of a Keynesian consumption function (for a two-class world), which has the virtue of placing the distribution of income at the center of the macroeconomic story.* Add to that an investment function, which depends on the profit rate (which in turn depends on the profit share of income and capacity utilization), and you’ve got a system in which “alterations in the distribution of income can have important and potentially offsetting impacts on the level of effective demand.”

And heterodox traditions within macroeconomics have built on these relatively simple ideas, including

a microfounded Keynes–Marx theory of investment that further incorporates the external financing of investment based upon uncertain future profits, the irreversibility of investment and the coercive role of competition on investment. In this approach, the investment function is extended to depend on the profit rate, long-term and short-term heuristics for the firm’s financial robustness and the intensity of competition. It is the interaction of these factors that fundamentally alters the nature of the investment function, particularly the typical role assigned to capacity utilization. The main dynamic of the model is an investment-induced growth-financial safety tradeoff facing the firm. Using this approach, a ceteris paribus increase in the financial fragility of the firm reduces investment and can be used to explain autonomous financial crises. In addition, the typical behavior of the profit rate, particularly changes in income shares, is preserved in this theory. Along these lines, the interaction of the profit rate and financial determinants allows for real sector sources of financial fragility to be incorporated into a macro model. Here, a profit squeeze that shifts expectations of future profits forces firms and lenders to alter their perceptions on short-term and long-term levels of acceptable debt. The responses of these agents can produce a cycle based on increases in financial fragility.

It’s true: such a model does not lead to a specific forecast or prediction. (In fact, it’s more a long-term model than an explanation of short-run instabilities.) But it does provide an understanding of the movements of consumption and investment that help to explain how and why a crisis of capitalism might occur. Therefore, it represents a replacement for the mainstream macroeconomics that exhibited a dismal record with respect to the crash of 2007-08.

But maybe it’s not the lack of quantitative analysis in heterodox macroeconomics that troubles Smith so much. Perhaps it’s really the conclusion—the fact that

The current crisis combines the development of under-consumption, over-investment and financial fragility tendencies built up over the last 25 years and associated with a finance-led accumulation regime.

And, for that constellation of problems, there’s no particular advice or quick fix for Smith’s “policymakers and investors”—except, of course, to get rid of capitalism.

 

*Technically, consumption (C) is a function of the marginal propensity to consume of labor, labor’s share of income, the marginal propensity to consume of capital, and the profit share of income.
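
In symbols, one plausible rendering of that consumption function (my notation; the book’s may differ) is

\[
C \;=\; \bigl(c_{w}\,\omega \;+\; c_{\pi}\,(1-\omega)\bigr)\,Y ,
\]

where \(c_{w}\) and \(c_{\pi}\) are the marginal propensities to consume out of wage and profit income, \(\omega\) is labor’s share of income, \(1-\omega\) is the profit share, and \(Y\) is total income. Whenever \(c_{w} > c_{\pi}\), a fall in the wage share lowers aggregate consumption, which is one way the distribution of income enters the macroeconomic story.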

 


John Lennon (on the B side of “Imagine”) thought that life was hard, “really hard.” I can understand that.

But is modeling inequality really all that hard?

Paul Krugman seems to think so, at least when it comes to the size or personal distribution of income. That’s his excuse for why mainstream economists were late to the inequality party: they just didn’t know how to model it.

And, according to Krugman, not even Marx can be of much help.

Well, let’s see. It’s true, Marx focused on the factor distribution of income—wages, profits, and rent, to laborers, capitalists, and landowners—because his critique was directed at classical political economy. And the classical political economists—especially Smith and Ricardo—did, in fact, focus their attention on factor shares.

That was Marx’s goal in the chapter on the Trinity Formula: to show that what the classicals thought were separate sources of income to the three factors of production all stemmed from value created by labor. Thus, for example, laborers received in the form of wages part of the value they created (“that portion of his labour appears which we call necessary labour”); the rest, the surplus-value, was divided among capitalists (“as dividends proportionate to the share of the social capital each holds”) and landed property (which “is confined to transferring a portion of the produced surplus-value from the pockets of capital to its own”).

It is really just a short step to show both that, in recent decades (from the mid-1970s onward), more surplus-value has been pumped out of the direct producers, and that investment bankers, CEOs, and other members of the 1 percent have been able to capture a large share of that mass of surplus-value. That’s how we can connect changing factor (wage and profit) shares to the increasingly unequal individual distribution of income (including the rising percentage of income going to the top 1, .01, and .001 percents).
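
To make the arithmetic concrete, with purely illustrative numbers of my own (and ignoring, for simplicity, any wage income received by the top 1 percent):

\[
\text{top 1 percent income share} \;\approx\; \underbrace{(1-\omega)}_{\text{profit share}} \times \underbrace{s_{1}}_{\text{top 1 percent share of the surplus}} .
\]

If the wage share \(\omega\) falls from 0.65 to 0.58 while the slice of the surplus captured by the top 1 percent rises from 0.25 to 0.45, their share of total income goes from roughly \(0.35 \times 0.25 \approx 0.09\) to \(0.42 \times 0.45 \approx 0.19\), i.e., it more than doubles.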

See, that wasn’t so hard. . .


You remember the dialogue:

Queen: Slave in the magic mirror, come from the farthest space, through wind and darkness I summon thee. Speak! Let me see thy face.

Magic Mirror: What wouldst thou know, my Queen?

Queen: Magic Mirror on the wall, who is the fairest one of all?

Magic Mirror: Famed is thy beauty, Majesty. But hold, a lovely maid I see. Rags cannot hide her gentle grace. Alas, she is more fair than thee.

I was reminded of this particular snippet from Snow White and the Seven Dwarfs while reading the various defenses of contemporary macroeconomic models. Mainstream macroeconomists failed to predict the most recent economic crisis, the worst since the Great Depression of the 1930s, but, according to them, everything in macroeconomics is just fine.

There’s David Andolfatto, who argues that the goal of macro models is not really prediction; it is, instead, only to provide conditional forecasts (“IF a set of circumstances hold, THEN a number of events are likely to follow.”). So, in his view, the existing models are mostly fine—as long as they’re supplemented with some “financial market frictions” and a bit of economic history.

Mark Thoma, for his part, mostly agrees with Andolfatto but adds we need to ask the right questions.

we didn’t foresee the need to ask questions (and build models) that would be useful in a financial crisis — we were focused on models that would explain “normal times” (which is connected to the fact that we thought the Great Moderation would continue due to arrogance on behalf of economists leading to the belief that modern policy tools, particularly from the Fed, would prevent major meltdowns, financial or otherwise). That is happening now, so we’ll be much more prepared if history repeats itself, but I have to wonder what other questions we should be asking, but aren’t.

Then, of course, there’s Paul Krugman, who (not for the first time) defends hydraulic Keynesianism (aka Hicksian IS/LM models)—”little equilibrium models with some real-world adjustments”—which in his view have been “stunningly successful.”

And, finally, to complete my sample from just the last couple of days, we have Noah Smith, who defends the existing macroeconomic models—because they’re models!—and chides heterodox economists for not having any alternative models to offer.

The issue, as I see it, is not whether there’s a macroeconomic model (e.g., dynamic stochastic general equilibrium, as depicted in the illustration above, or Bastard Keynesian or whatever) that can, with the appropriate external “shock,” generate a boom-and-bust cycle or zero-lower-bound case for government intervention. There’s a whole host of models that can generate such outcomes.

No, there are two real issues that are never even mentioned in these attempts to defend contemporary macroeconomic models. First, what is widely recognized to be the single most important economic problem of our time—the growing inequality in the distribution of income and wealth—doesn’t (and, in models with a single representative agent, simply can’t) play a role, either as a cause of boom-and-bust cycles or as a consequence of the lopsided recovery generated by the kinds of fiscal and monetary policies that have been used in recent years.

That’s the specific issue. And then there’s a second, more general issue: the only way you can get an economic crisis out of mainstream models (of whatever stripe, using however much math) is via some kind of external shock. The biggest problem with existing models is not that they failed to predict the crisis; it’s that the only possibility of a crisis they allow comes from an exogenous event. The key failure of mainstream macroeconomic models is to exclude from analysis the idea that the “normal” workings of capitalism generate economic crises on a regular basis—some of which are relatively mild recessions, others of which (such as we’ve seen since 2007) are full-scale depressions. What really should be of interest are theories that generate boom-and-bust cycles based on endogenous events within capitalism itself.
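
One well-known example of such a theory is Richard Goodwin’s growth-cycle model, in which the wage share and the employment rate chase each other in a perpetual boom-and-bust pattern without any external shock ever being applied. Here is a minimal sketch, with illustrative parameters of my own choosing:

```python
# Goodwin-style growth cycle: endogenous oscillations of the wage share (u) and
# the employment rate (v). Parameters are illustrative, not calibrated.
import numpy as np

sigma = 3.0                  # capital-output ratio
g_n = 0.03                   # productivity growth plus labor-force growth
rho, gamma = 0.6, 0.54       # Phillips-curve slope and intercept

u, v = 0.85, 0.92            # initial wage share and employment rate
dt, steps = 0.01, 20_000
u_path = np.empty(steps)
for i in range(steps):
    u = u + u * (rho * v - gamma) * dt          # wages rise when employment is high
    v = v + v * ((1 - u) / sigma - g_n) * dt    # accumulation rises when profits are high
    u_path[i] = u

print(f"wage share cycles between {u_path[steps // 2:].min():.2f} and "
      f"{u_path[steps // 2:].max():.2f}, with no exogenous shock anywhere in sight")
```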

With respect to both these issues, contemporary mainstream macroeconomic models have “stunningly” failed.

I imagine that’s what the slave in the magic mirror, who simply will not lie to the Queen, would say.