Posts Tagged ‘economists’


Jean-Pierre Roy, “The Sultan and the Strange Loop” (2016)

The U word has once again reared its ugly head.

I’m referring, of course, to uncertainty, which at least a few of us are pleased to see return to a prominent place in scientific discourse. The idea that we simply do not know is swirling around us, haunting pretty much every pronouncement by economists, virologists, epidemiological modelers, and the like.

How many people will contract the novel coronavirus? How many fatalities has the virus caused thus far? And how many people will eventually die because of it? Do face masks work? How many workers have been laid off? How severe will the economic meltdown be in the second quarter and for the rest of the year?

We read and hear lots of answers to those questions. But while individual forecasts and predictions are often presented as uniquely “correct,” they differ from one another and change so often that we are forced to admit our knowledge is radically uncertain.

Uncertainty, it seems, erupts every time normalcy is suspended and we are forced to confront the actual workings of scientific practice. It certainly happened during the first Great Depression, when John Maynard Keynes used the idea of radical uncertainty—as against probabilistic risk—to challenge neoclassical economics and its rosy predictions of stable growth and full employment.* And it occurred again during the second Great Depression, when mainstream macroeconomics, especially the so-called dynamic stochastic general equilibrium approach, was criticized for failing to take into account “massive uncertainty,” that is, the impossibility of predicting surprises and situations in which we simply do not know what is going to happen.**


The issue of uncertainty came to the fore again after the election of Donald Trump, which came as a shock to many—even though polls showed a race that was both fairly close and highly uncertain. FiveThirtyEight’s final, pre-election forecast put Hillary Clinton’s chance of winning at 71.4 percent, which elicited quite a few criticisms and attacks, since other models were much more confident about Clinton, variously putting her chances at between 92 and 99 percent. But, as Nate Silver explained just after the election,

one of the reasons to build a model — perhaps the most important reason — is to measure uncertainty and to account for risk. If polling were perfect, you wouldn’t need to do this. . .There was widespread complacency about Clinton’s chances in a way that wasn’t justified by a careful analysis of the data and the uncertainties surrounding it.

In my view, Silver is one of the best when it comes to admitting the enormous gap between what we claim to know and what we actually know (as I argued back in 2012), an admission that is often undermined by the attempt to make the results of models seem more accurate than they are and to conform to expectations.

And that’s just as much the case in the social sciences (including, and perhaps especially, economics) and the natural sciences as it is in weather forecasting. Many, perhaps most, practitioners and pundits operate as if science were a single set of truths and not a discourse, with all the strengths and failings that implies. What I’m referring to are all the uncertainties, not to mention the indeterminisms, linguistic risks and confusions, referrals and deferences to other knowledges and discourses, and embedded assumptions (e.g., in both the data-gathering and the modeling), that attend any practice of discursive production and dissemination.

As Siobhan Roberts recently argued,

Science is full of epistemic uncertainty. Circling the unknowns, inching toward truth through argument and experiment is how progress is made. But science is often expected to be a monolithic collection of all the right answers. As a result, some scientists — and the politicians, policymakers and journalists who depend on them — are reluctant to acknowledge the inherent uncertainties, worried that candor undermines credibility.

What that means, in my view, is that science is always subject to discussion and debate within and between contending positions, and therefore decisions need to be made—about facts, concepts, theories, models, and much else—all along the way.

As it turns out, acknowledging that uncertainty, and therefore openly disclosing the range of possible outcomes, does not undermine public trust in scientific facts and predictions. That was the conclusion of a study recently published in the Proceedings of the National Academy of Sciences.

In the “posttruth” era where facts are increasingly contested, a common assumption is that communicating uncertainty will reduce public trust. . .Results show that whereas people do perceive greater uncertainty when it is communicated, we observed only a small decrease in trust in numbers and trustworthiness of the source, and mostly for verbal uncertainty communication. These results could help reassure all communicators of facts and science that they can be more open and transparent about the limits of human knowledge.

Even if communicating uncertainty does decrease people’s trust in and perceived reliability of scientific facts, including numbers, that in my view is not a bad thing. It serves to challenge the usual presumption (especially widespread these days among liberals, progressives, and others who embrace a Superman theory of truth) that everyone can and should rely on science to make the key decisions.*** The alternative is to admit and accept that decision-making, under uncertainty, is both internal and external to scientific practice. The implication, as I see it, is that the production and communication of scientific facts, as well as their subsequent use by other scientists and the general public, is a contested terrain, full of uncertainty.

Last year, even before the coronavirus pandemic, Scientific American [unfortunately, behind a paywall] published a special issue titled “Truth, Lies, and Uncertainty.” The symposium covers a wide range of topics, from medicine and mathematics to statistics and paleobiology. For those of us in economics, perhaps the most relevant is the article on physics (“Virtually Reality,” by George Musser).

Musser begins by noting that “physics seems to be one of the only domains of human life where truth is clear-cut.”

The laws of physics describe hard reality. They are grounded in mathematical rigor and experimental proof. They give answers, not endless muddle. There is not one physics for you and one physics for me but a single physics for everyone and everywhere.

Or so it appears.

In fact, Musser explains, practicing physicists operate with considerable doubt and uncertainty, on everything from fundamental theories (such as quantum mechanics and string theory) to bench science (“Is a wire broken? Is the code buggy? Is the measurement a statistical fluke?”).

Consider, for example, quantum theory: if you

take quantum theory to be a representation of the world, you are led to think of it as a theory of co-existing alternative realities. Such multiple worlds or parallel universes also seem to be a consequence of cosmological theories: the same processes that gave rise to our universe should beget others as well. Additional parallel universes could exist in higher dimensions of space beyond our view. Those universes are populated with variations on our own universe. There is not a single definite reality.

Although theories that predict a multiverse are entirely objective—no observers or observer-dependent quantities appear in the basic equations—they do not eliminate the observer’s role but merely relocate it. They say that our view of reality is heavily filtered, and we have to take that into account when applying the theory. If we do not see a photon do two contradictory things at once, it does not mean the photon is not doing both. It might just mean we get to see only one of them. Likewise, in cosmology, our mere existence creates a bias in our observations. We necessarily live in a universe that can support human life, so our measurements of the cosmos might not be fully representative.

Musser’s view is that accepting uncertainty in physics actually leads to a better scientific practice, as long as physicists themselves are the ones who attempt to point out problems with their own ideas.

So, if physicists are willing to live with—and even to celebrate—uncertain knowledge, and even if the general public does lose a bit of trust when a degree of uncertainty is revealed, then it’s time for the rest of us (perhaps especially economists) to relinquish the idea of certain scientific knowledge.

Then, as Maggie Koerth recently explained in relation to the coronavirus pandemic, instead of waiting around for “absolute, unequivocal facts” to decide our fate, we can get on with the task of making the “big, serious decisions” that currently face us.

 

*Although, as I explained back in 2011, the idea of fundamental uncertainty was first introduced into mainstream economic discourse by Frank Knight.

**And later central bankers (such as the Bank of England’s Andy Haldane) discovered that admitting uncertainty might actually “enhance understanding and therefore authority.”

***The irony is that “the Left” used to be skeptical about and critical of much of modern science—from phrenology, craniometry, and social Darwinism to the atom bomb, sociobiology, and evolutionary psychology.

 


Obscene levels of economic inequality in the United States are now so obvious they’ve become one of the main topics of public and political discourse (alongside and intertwined with two others, the climate crisis and the impeachment of Donald Trump).*

Most Americans, it seems, are aware of and increasingly incensed by the grotesque and still-growing gap between a tiny group at the top—wealthy individuals and large corporations—and everyone else. And this sense of unfairness and injustice is reflected in both the media and political campaigns. For example, Capital & Main, an award-winning nonprofit publication that reports from California, has launched a twelve-month-long series on economic inequality in America, “United States of Inequality: 2020 and the Great Divide,” leading up to next year’s presidential election. And two of the leading presidential candidates in the Democratic Party, Bernie Sanders and Elizabeth Warren, have responded by making economic inequality one of the signature issues of their primary campaigns, regularly describing the devastating consequences of the enormous gap between the haves and have-nots and proposing policies (such as a wealth tax) to begin to close the gap and mitigate at least some of its effects.**

As if on cue, we’re also seeing a pushback. It should come as no surprise that America’s billionaires—from former Starbucks CEO Howard Schultz to multi-billionaire hedge-fund manager Leon Cooperman—have gone on the offensive, complaining about how the various tax proposals, if enacted, would reduce what they consider to be the fortunes they’ve earned and undermine two areas they alone control: private philanthropy and corporate innovation.*** And, ironically, as Paul Waldman has claimed,

the more billionaires keep talking about how their taxes shouldn’t be raised, the more likely it is that their taxes will in fact be raised, one way or another.

Similarly predictable is the attempt to rejigger the numbers so that inequality in the United States appears to be much less than official sources report. For example, according to the Census Bureau [pdf], in 2018, the top quintile of households (with an average income of $233.9 thousand) had 17 times more than the bottom quintile (whose average income was only $13.8 thousand).**** Phil Gramm and John F. Early argue that “this picture is false” because it focuses only on money income and excludes both taxes and transfer payments.***** Their conclusion?

America already redistributes enough income to compress the income difference between the top and bottom quintiles. . .down to 3.8 to 1 in income received.

There is one kernel of truth in Gramm and Early’s analysis: while the rich pay more in taxes, government transfers make up a much larger share of income of those at the bottom.****** But their calculations dramatically overstate the extent to which taxes and transfers decrease the degree of economic inequality in the United States. That’s because they fail to include unreported capital income, including dividends and interest paid to tax-exempt pension accounts and corporate retained earnings (which are included in other data sets, such as G. Zucman, T. Piketty, and E. Saez, “Distributional National Accounts: Methods and Estimates for the United States” [http://gabriel-zucman.eu/usdina/]).

[Table: average incomes of the top 10 percent and bottom 50 percent, before and after taxes and transfers, 2014]

As is clear in the table above, in 2014 (the last year for which data are available), the system of taxes and transfers reduces the degree of inequality (measured as the ratio of top 10 percent average incomes to bottom 50 percent average incomes) only from 18.7 to 1 down to 10.1 to 1. And if we focus on post-tax cash incomes (thus excluding non-cash transfers, essentially Medicaid and Medicare), the compression is even smaller: to 11.8 to 1. In both cases, the decrease in inequality is much less than in the Gramm and Early calculations.
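
To make the comparison concrete, here is a minimal sketch (in Python) of the arithmetic behind these figures. The ratios are the ones cited above and in the Census/Gramm-Early discussion; the percentage reductions are derived here and do not appear in the sources, and the two measures are not strictly comparable (quintiles in one case, the top 10 percent versus the bottom 50 percent in the other).

```python
# Sketch of the arithmetic behind the comparison above. The ratios come from
# the figures cited in the text; the percentage reductions are derived here.
# Note that the measures differ: Gramm and Early compare the top and bottom
# quintiles, while the distributional national accounts compare the average
# incomes of the top 10 percent and the bottom 50 percent.

def compression(pre, post):
    """Percentage by which taxes and transfers shrink the top-to-bottom ratio."""
    return (pre - post) / pre * 100

# Gramm and Early (quintiles): 17 to 1 before, 3.8 to 1 after
print(f"Gramm-Early: {compression(17, 3.8):.0f}% reduction")                    # ~78%

# Distributional national accounts (top 10% vs. bottom 50%), 2014
print(f"All taxes and transfers: {compression(18.7, 10.1):.0f}% reduction")     # ~46%
print(f"Post-tax cash incomes only: {compression(18.7, 11.8):.0f}% reduction")  # ~37%
```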

The fact is, there are severe limits on what taxes and transfers can achieve in the face of the massive changes in the pre-tax distribution of income that have occurred in the United States since 1979. 

[Table: growth of average pre-tax incomes by income group, 1979–2014]

As readers can see in the table above, while the average pre-tax incomes of the bottom 50 percent of Americans stagnated from 1979 to 2014, those of the top 10 percent increased by 100 percent and the incomes of the top 1 percent soared by even more, 183 percent.

If we compare the real incomes of the same groups after taxes and transfers, it’s clear that while the incomes of the bottom 50 percent of Americans did in fact inch upward from 1979 to 2014 (by a total of 18 percent, or only 0.5 percent a year), progressive taxes and transfers did not hamper the upsurge of income at the top: the average post-tax incomes of the top 10 percent doubled (by 2.86 percent a year) and those of the top 1 percent grew by more than 160 percent (by 4.8 percent a year).*******
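
A note on the arithmetic: the per-year figures in the preceding paragraphs appear to be simple averages, that is, total percentage growth divided by the 35 years between 1979 and 2014, rather than compound annual rates. Here is a minimal sketch using only the totals cited in the text (the 168 percent figure for the top 1 percent is implied by the text’s 4.8 percent a year, not stated directly).

```python
# Sketch of the annualization arithmetic, assuming the per-year figures in the
# text are simple averages: total percentage growth divided by the 35 years
# between 1979 and 2014 (not compound annual growth rates).
YEARS = 2014 - 1979  # 35

def simple_annual(total_pct, years=YEARS):
    return total_pct / years

print(f"bottom 50%:  18% total -> {simple_annual(18):.2f}% per year")   # ~0.51
print(f"top 10%:    100% total -> {simple_annual(100):.2f}% per year")  # ~2.86
# 4.8% a year implies roughly 4.8 * 35 = 168% total, consistent with the
# text's "more than 160 percent" for the top 1 percent.
print(f"top 1%:     168% total -> {simple_annual(168):.2f}% per year")  # ~4.80
```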

The small group at the top continues to pull away from everyone else, both before and after taxes and transfers.

In my view, the degree of economic inequality in the United States is so severe that it can’t be sidetracked by billionaire complaints or swept away by the calculations of conservative economists. And, for that matter, it can’t be solved by enacting more taxes on the ultra-rich and more transfer payments for the rest of Americans. The problem is simply too large and systemic.

Only by understanding and attacking the roots of the inequality that has characterized the U.S. economy for decades now will we be able to close the enormous gap that has undermined the American Dream and shredded the fabric of political and social life in the United States.

 

*But, contra New York University historian Timothy Naftali, this is not the first time “we are having a national political conversation about billionaires in American life.” In fact, I’d argue, it’s a recurring debate in American history, stretching back at least to the rise of populism in the late-nineteenth century (and perhaps earlier, for example, to Shays’ Rebellion) and including the strike wave after the Panic of 1873, the anti-trust movement of the early-twentieth century, the crash of 1929 and the First Great Depression, and most recently the attacks on finance and the Occupy Wall Street movement during the Second Great Depression. In all those cases, Americans engaged in an intense national discussion of inequality and the role of the economic elite in political and social life.

**Even centrist Democrats have taken up, if only timidly, the banner of the anti-inequality campaign. For example, Rep. Brendan Boyle (D-PA), who has endorsed Joe Biden for the Democratic nomination, told The Washington Post he is crafting a new wealth tax proposal to introduce in the House of Representatives. And Rep. Don Beyer (D-VA), who last month endorsed South Bend Mayor Pete Buttigieg, has released a plan (with Sen. Chris Van Hollen of Maryland) for a new surtax on incomes over $2 million.

***The one area they don’t mention, which they also seek to control, is American politics—through lobbying, campaign donations, and the like. Wealthy individuals and large corporations attempt to exert such control although, as we just saw in Seattle—with Amazon’s $1.5 million campaign to unseat a socialist member of Seattle’s city council, Kshama Sawant—they’re not always successful.

****Money income includes the following categories: earnings; unemployment compensation; workers’ compensation; Social Security; supplemental security income; public assistance; veterans’ payments; survivor benefits; disability benefits; pension or retirement income; interest; dividends; rents, royalties, and estates and trusts; educational assistance; alimony; child support; financial assistance from outside of the household; and other income. Measured in terms of earnings alone, the ratio of top to bottom rises to an astounding 60 to 1.

*****The Wall Street Journal column doesn’t explain how the alternative calculations were conducted. But Early, in a Cato Institute report [pdf], does explain their methodology.

******According to my calculations from the most comprehensive source (from G. Zucman, T. Piketty, and E. Saez, “Distributional National Accounts: Methods and Estimates for the United States” [http://gabriel-zucman.eu/usdina/]), in 2014, the bottom 50 percent of Americans received 74 percent of their post-tax income from transfers while, for the top percent, it was 19.5 percent.

*******What of the billionaires? Between 1979 and 2014, the average real post-tax incomes of the top .001 percent grew by 387 percent (or 11.1 percent a year), almost as much as their pre-tax incomes.

I cringe when I listen to or watch these interviews. But here it is, with the Real News Network.

The interview was based on my recent blog post, “Economics of poverty, or the poverty of economics.”

I also want to recommend a recent piece by Ingrid Harvold Kvangraven [ht: ms], who argues that

The interventions considered by the Nobel laureates tend to be removed from analyses of power and wider social change. In fact, the Nobel committee specifically gave it to Banerjee, Duflo and Kremer for addressing “smaller, more manageable questions,” rather than big ideas. While such small interventions might generate positive results at the micro-level, they do little to challenge the systems that produce the problems.

For example, rather than challenging the cuts to the school systems that are forced by austerity, the focus of the randomistas directs our attention to absenteeism of teachers, the effects of school meals and the number of teachers in the classroom on learning. Meanwhile, their lack of challenge to the existing economic order is perhaps also precisely one of the secrets to media and donor appeal, and ultimately also their success.

Exactly!

It’s the revenge of neoclassical economics, as reflected in this year’s prize in economics, which focuses attention on poor people’s “bad” decisions and away from the structural causes of poverty.

As I argued the other day on Twitter, it’s like saying the climate crisis will be solved by individuals turning off lights and recycling their garbage. Not bad things to do, certainly. But, together, all those individual efforts make up only 1-2 percent of the solution. The climate crisis cannot be solved unless and until we direct attention to the real, structural causes. Here, I’m thinking not only of the fossil fuel industry, but also of the way the rest of contemporary capitalist economies are organized around the use of fossil fuels—in the production of goods and services, from cars to digital information. Such a system generates enormous profits, which flow to a tiny group at the top, and continues to destroy the commons, where most of us live and work.

It’s that system that needs to be radically transformed. And as long as economists are lauded for focusing on technical issues around the margins and not on the real causes—of Third World poverty, global warming, and much else—the discipline of economics will continue to be impoverished.


Yesterday, the 2019 winners of the so-called Nobel Prize in Economics were announced. Abhijit Banerjee, Esther Duflo, and Michael Kremer were recognized for improving “our ability to fight global poverty” and for transforming development economics into “a flourishing field of research” through their experiment-based approach.

The Royal Swedish Academy of Sciences declared:

This year’s Laureates have introduced a new approach to obtaining reliable answers about the best ways to fight global poverty. In brief, it involves dividing this issue into smaller, more manageable, questions–for example, the most effective interventions for improving educational outcomes or child health. They have shown that these smaller, more precise, questions are often best answered via carefully designed experiments among the people who are most affected.

As in every year, mainstream economists lined up to laud the choice. Dani Rodrik declared it “a richly deserved recognition.” Richard Thaler, who won the award in 2017 (here’s a link to my analysis), extended his congratulations to Banerjee, Duflo, and Kremer and to the committee “for making a prize that seemed inevitable happen sooner rather than later.” And Paul Krugman, the 2008 Nobel laureate, referred to it as “a very heartening prize—evidence-based economics with a real social purpose.”

Nothing new there. To a one, mainstream economists always use the occasion of the Nobel Prize to applaud themselves and their shared approach to economic and social analysis—a celebration of private property, free markets, and individual incentives.

What is novel this time around is that the winners include the first woman economist to win the prize (Duflo) and only the third non-white economist (Banerjee).*

But what about the content of their work? I’ve discussed the work of Duflo and Banerjee on numerous occasions on this blog (e.g., here, here, and here).

As it turns out, I’ve written a longer commentary on the “new development economics” as part of a symposium on my book Development and Globalization: A Marxian Class Analysis, which is forthcoming in the journal Rethinking Marxism.

I begin by noting that the idea of Banerjee, Duflo, Kremer, and the other new development economists is that asking “big questions” (e.g., about whether or not foreign aid works) is less important than the narrower ones concerning which particular development projects should be funded and how such projects should be organized. For this, they propose field experiments and randomized controlled trials—to design development projects such that people can be “nudged,” with the appropriate incentives, toward the kinds of behaviors and outcomes presupposed within mainstream economic theory.
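
For readers unfamiliar with the method, here is a minimal, purely hypothetical sketch (in Python) of the randomized-evaluation logic described above: units are randomly assigned to treatment and control groups, and the effect of an intervention is estimated as a difference in mean outcomes. All of the numbers are invented for illustration; none come from the laureates’ actual studies.

```python
# A minimal, hypothetical sketch of a randomized evaluation: random assignment
# to treatment and control, then the average effect of an intervention (say,
# an extra teacher per classroom) estimated as a difference in means.
# All numbers are made up for illustration.
import random
import statistics

random.seed(0)

schools = list(range(200))
treated = set(random.sample(schools, 100))  # random assignment

def test_score(school):
    baseline = random.gauss(50, 10)                     # baseline score
    return baseline + (5 if school in treated else 0)   # assumed 5-point effect

scores = {s: test_score(s) for s in schools}
treated_mean = statistics.mean(scores[s] for s in schools if s in treated)
control_mean = statistics.mean(scores[s] for s in schools if s not in treated)

print(f"estimated effect of the intervention: {treated_mean - control_mean:.1f} points")
```

However precise, estimates of this kind address only the “smaller, more manageable questions” the Royal Swedish Academy describes; the criticisms that follow concern what such estimates leave out.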

Here we are, then, in the aftermath of the Second Great Depression—in the uneven recovery from capitalism’s most severe set of crises since the Great Depression of the 1930s and, at the same time, amid a blossoming of interest in and discussion of socialism—and the best mainstream economists have to offer is a combination of big data, field experiments, and random trials. How is that an adequate response to grotesque and still-rising levels of economic inequality (as shown, e.g., by the World Inequality Lab), precarious employment for hundreds of millions of new and older workers (which has been demonstrated by the International Labour Organization), half a billion people projected to still be struggling to survive below the extreme-poverty line by 2030 (according to the World Bank), and the wage share falling in many countries (which even the International Monetary Fund acknowledges), as most of the world’s population are forced to have the freedom to sell their ability to work to a relatively small group of employers for stagnant or falling wages? Or, for that matter, to the reawakening of the rich socialist tradition, both as a critique of capitalism and as a way of imagining and enacting alternative economic and social institutions?

I go on to raise three critical issues concerning the kind of development economics that has been recognized by this year’s Nobel prize. First, the presumption that analytical techniques are neutral and that the facts alone can adjudicate the debate over which development projects are successful and which are not is informed by an epistemological essentialism—in particular, a naïve empiricism—that many of us thought had been effectively challenged and ultimately superseded within contemporary economic and social theory. Clearly, mainstream development economists ignore or reject the idea that different theories have, as both condition and consequence, different techniques of analysis and different sets of facts.

The second point is that class is missing from the analytical and policy-related work being conducted by mainstream development economists today, at least as a concept that is explicitly discussed and utilized in their research. One might argue that class is lurking in the background—a specter that haunts every attempt to “understand how poor people make decisions,” to design effective anti-poverty programs, to help workers acquire better skills so that they can be rewarded with higher wages, and so on. They are the classes that have been disciplined and punished by the existing set of economic and social institutions, and the worry, of course, is that those institutions have lost their legitimacy precisely because of their uneven class implications. Class tensions may thus be simmering under the surface, but that’s different from being overtly discussed and deployed—both theoretically and empirically—to make sense of the ravages of contemporary capitalism. That step remains beyond mainstream development economics.

The third problem is that the new development economists, like their colleagues in other areas of mainstream economics, take as given and homogeneous the subjectivity of both economists and economic agents. Economists (whether their mindset is that of the theoretician, engineer, or plumber) are seen as disinterested experts who consider the “economic problem” (of the “immense accumulation of commodities” by individuals and nations) as a transhistorical and transcultural phenomenon, and whose role is to tell policymakers and poor and working people what projects will and will not reach the stated goal. Economic agents, the objects of economic theory and policy, are considered to be rational decision-makers who are attempting (via their saving and spending decisions, their participation in labor markets, and much else) to obtain as many goods and services as possible. Importantly, neither economists nor agents are understood to be constituted—in multiple and changing ways—by the various and contending theories that together comprise the arena of economic discourse.

The Nobel committee has recognized the work of Banerjee, Duflo, and Kremer as already having “helped to alleviate global poverty.” My own view is that it demonstrates, once again, the poverty of mainstream economics.

 

*The only other woman in the 50-year history of the Nobel Prize in Economics was Elinor Ostrom (2009), a political scientist; the other non-white winners were Sir Arthur Lewis (1979) and Amartya Sen (1998).

 
