
China Financial Crisis Art

Chen Wenling, “What You See Might Not Be Real” (2009)

I’ll admit, there are times when I regret the fact that I’m a relativist. Wouldn’t it be nice, I say to myself on occasion, to be able to claim—beyond a shadow of a doubt, to my students, colleagues, or readers of this blog—that something or other (neoclassical economics or capitalism or name your poison) is wrong and that the alternative (Marxian economics or socialism or what have you) is absolutely correct?

But then I read a defense of capital-T truth—such as David Roberts’s [ht: ja] attack on the alt-right and fake news and his presumption that the liberal mainstream is uniquely capable of upholding “truth, justice, and the American way”—and I thank my lucky stars that I don’t have to make such outlandish, embarrassing arguments. Fortunately, my relativism means I’m not saddled with the mainstream liberals’ delusion that they have, if not God, at least Superman on their side.

I’ve been over this epistemological terrain before (e.g., here, here, and here). But it seems, in the current conjuncture, mainstream liberals—in their zeal to attack Donald Trump and the right-wing media’s defense of his administration’s outlandish claims about a wide variety of issues, from climate change to the Mueller investigation—increasingly invoke and rely on an absolutist theory of knowledge. And then, of course, claim for themselves the correct side in the current debates.

As Roberts sees it, the United States

is experiencing a deep epistemic breach, a split not just in what we value or want, but in who we trust, how we come to know things, and what we believe we know — what we believe exists, is true, has happened and is happening.

The primary source of this breach, to make a long story short, is the US conservative movement’s rejection of the mainstream institutions devoted to gathering and disseminating knowledge (journalism, science, the academy) — the ones society has appointed as referees in matters of factual dispute.

In their place, the right has created its own parallel set of institutions, most notably its own media ecosystem.

Consider the assumptions built into those statements for a moment. Roberts believes that society has appointed a unique set of mainstream institutions—journalism, science, the academy—to serve as referees when it comes to adjudicating the facts in play. Nowhere does he discuss how, historically, those institutions came to occupy such an exalted position. Perhaps even more important, he never considers the disputes—about the facts and much else—that exist among journalists, scientists, and academics. And, finally, Roberts never mentions all the times, in recent years and over the centuries, when members of those institutions got it wrong.

What about the reporting on the weapons of mass destruction in Iraq? Or the Tuskegee Study of Untreated Syphilis in the Negro Male? Or the university professors and presidents, at Yale, Harvard, and elsewhere, who supported and helped devise the U.S. war in Vietnam?

The list could go on.

There is, in fact, good reason not to simply accept the “facts” as gathered and disseminated by mainstream institutions. Historically, we have often been misled, and even mangled and killed, by those supposed facts. And, epistemologically, the members of those institutions—not to mention others, located in different institutions—produce and disseminate alternative sets of facts.

Maybe that’s Roberts’s problem. He actually thinks facts are gathered, as if they’re just out there in the world, waiting to be plucked, harvested, or dug up like fruits and vegetables by people who have no particular interest in which facts find their way into their baskets.

Alternatively, we might see those facts as being created and manufactured, through a process of knowledge-production, which relies on concepts and theories that are set to work on the raw materials generated by still other concepts and theories. The implication is that different sets of concepts and theories lead to the production of different knowledges—different sets of facts and their discursive and social conditions of existence.

I have no doubt that many journalists, scientists, and academics “see themselves as beholden to values and standards that transcend party or faction.” But that doesn’t mean they actually operate that way, somehow above and apart from the paradigms they use and the social influences exerted on them and the institutions where they work.

As far as Roberts is concerned, only the “far right” rejects the “very idea of neutral, binding arbiters” and adheres to a “tribal epistemology.” And mainstream liberals? Well, supposedly, they have the facts on their side.

If one side rejects the epistemic authority of society’s core institutions and practices, there’s just nothing left to be done. Truth cannot speak for itself, like the voice of God from above. It can only speak through human institutions and practices.

For Roberts, it’s either epistemic authority or nihilism. Absolute truth or an “epistemic gulf” that separates an “increasingly large chunk of Americans,” who believe “a whole bunch of crazy things,” from liberal Democrats.

What Roberts can’t abide is that we “live in different worlds, with different stories and facts shaping our lives.” But, from a relativist perspective, that’s all we’ve ever had, inside and outside the institutions of journalism, science, and the academy. Throughout their entire history. Different stories and different sets of facts.

And that hasn’t stopped the conversation—the discussion and debate within and between those different, often incommensurable, stories and facts. The only time the conversation ends is when one set of stories and facts is imposed on and used to stamp out all the others. A project always carried out in the name of Truth.

Clearly, Roberts mourns the passing of a time of epistemological certainty and universal agreement that never existed.

Roberts instead should mourn the effects of a Superman theory of knowledge that got him and other mainstream liberals into trouble in the first place. In recent years, they and their cherished facts simply haven’t been persuasive to a large and perhaps growing part of the population.

And the rest of us are suffering the consequences.

 

[Cartoon]

Special mention

“Trumpcare” (Mark Streeter/Savannah Morning News)

[Cartoon: Priggee]

The election and administration of Donald Trump have focused attention on the many symbols of racism and white supremacy that still exist across the United States. They’re a national disgrace. Fortunately, we’re also witnessing renewed efforts to dethrone Confederate monuments and other such symbols as part of a long-overdue campaign to rethink Americans’ history as a nation.

In economics, the problem is not monuments but the discipline itself. It’s the most disgraceful discipline in the academy. Therefore, we should dethrone ourselves.


In the United States, thanks to the work of the Southern Poverty Law Center, we know there are over 700 monuments and statues to the Confederacy, as well as scores of public schools, counties and cities, and military bases named for Confederate leaders and icons.


We also know those symbols do not represent any kind of shared heritage but, instead, conceal the real history of the Confederate States of America and the seven decades of Jim Crow segregation and oppression that followed the Reconstruction era. In fact, most of them were dedicated not immediately after the Civil War, but during two key periods in U.S. history:

The first began around 1900, amid the period in which states were enacting Jim Crow laws to disenfranchise the newly freed African Americans and re-segregate society. This spike lasted well into the 1920s, a period that saw a dramatic resurgence of the Ku Klux Klan, which had been born in the immediate aftermath of the Civil War.

The second spike began in the early 1950s and lasted through the 1960s, as the civil rights movement led to a backlash among segregationists. These two periods also coincided with the 50th and 100th anniversaries of the Civil War.

The problem, of course, is that those statues have stayed up for so long because, like so many other features of our everyday landscape, they became so familiar that Americans hardly even noticed they were there.

It should come as no surprise, then, that a majority of Americans (62 percent) believe statues honoring leaders of the Confederacy should remain. However, a similar majority (55 percent) said they disapproved of Trump’s response to the deadly violence that occurred at a white supremacist rally in Charlottesville. As a result, I expect Americans will be engaged in a new conversation about their history—especially the most disgraceful episodes of slavery, white supremacy, and racism—and what those symbols represent today.

The discipline of economics has a similar problem—not of statues but of sexism and hostility to women. It’s been so much a feature of our everyday academic landscape that economists hardly even noticed it was there.

They didn’t notice until reports surfaced—in the New York Times and the Washington Post—concerning Alice Wu’s senior thesis in economics at the University of California-Berkeley. Wu analyzed over a million posts on the anonymous online message board Economics Job Market Rumors to examine how economists talk about women in the profession.

According to Wu,

Gender stereotyping can take a subtle or implicit form that makes it difficult to measure and analyze in economics. In addition, people tend not to reveal their true beliefs about gender if they care about political and social correctness in public. The anonymity on the Economics Job Market Rumors forum, however, removes such barriers, and thus provides a natural setting to study the existence and extent of gender stereotyping in this academic community online.

And the results of her analysis? The 30 words most associated with women were (in order, from top to bottom): hotter, lesbian, bb (Internet terminology for “baby”), sexism, tits, anal, marrying, feminazi, slut, hot, vagina, boobs, pregnant, pregnancy, cute, marry, levy, gorgeous, horny, crush, beautiful, secretary, dump, shopping, date, nonprofit, intentions, sexy, dated, and prostitute.

In contrast, the terms most associated with men included mathematician, pricing, adviser, textbook, motivated, Wharton, goals, Nobel, and philosopher. Indeed, the only derogatory terms in the list were bully and homo.
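Rankings like these come from measuring which words are most strongly associated with posts that refer to women versus posts that refer to men. As a minimal, hypothetical sketch of such an association analysis—this is not Wu’s actual code or model, and the tiny corpus below is invented—one can rank words by their smoothed log-odds of appearing in female-referencing rather than male-referencing posts:

```python
from collections import Counter
import math

def gender_associated_words(posts_female, posts_male, min_count=1):
    """Rank vocabulary by smoothed log-odds of appearing in
    female-referencing vs. male-referencing posts.
    Higher score = more strongly female-associated."""
    # Word counts in each sub-corpus
    cf = Counter(w for p in posts_female for w in p.lower().split())
    cm = Counter(w for p in posts_male for w in p.lower().split())
    nf, nm = sum(cf.values()), sum(cm.values())
    vocab = {w for w in set(cf) | set(cm) if cf[w] + cm[w] >= min_count}
    scores = {}
    for w in vocab:
        # Add-one smoothing avoids log(0) for words seen on only one side
        pf = (cf[w] + 1) / (nf + len(vocab))
        pm = (cm[w] + 1) / (nm + len(vocab))
        scores[w] = math.log(pf / pm)
    return sorted(scores, key=scores.get, reverse=True)

# Invented toy posts, for illustration only
posts_f = ["she is cute", "is she married"]
posts_m = ["he wrote a textbook", "he is motivated"]
ranking = gender_associated_words(posts_f, posts_m)
```

On a real corpus this ranking would run over hundreds of thousands of vocabulary items, and Wu’s published analysis rests on a more careful predictive model of gendered posts; the log-odds ranking above only illustrates the underlying idea of scoring words by how disproportionately they co-occur with references to one gender.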

In my experience, that’s a pretty accurate description of how women and men are unequally seen, treated, and talked about in economics—and that’s been true for much of the history of the discipline.*

But, of course, that’s not the only reason economics is the most bankrupt, disgraceful discipline in the entire academy. It has long shunned and punished economists who endeavor to use theories and methods that fall outside mainstream economics—denying jobs, research funding, publication outlets, and honorifics to their “colleagues” who have the temerity to teach and do research utilizing other discourses and paradigms, from Marxism to feminism. 

Even the attempt to convince economists to adopt a code of ethics—like those in many other disciplines, from anthropology to medicine—was treated with disdain.

Sure, there’s a Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel. And, in the United States, both a Council of Economic Advisers and a National Economic Council—but no White House Council of Social Advisers.

Economics may have national and international prominence. But it’s time we give up the hand-wringing and admit there is no standard of decency or intelligence (with the possible exception of mathematics) that economists don’t fail on.

We are, in short, a collective disgrace. That’s why we should dethrone ourselves.

 

*A history that includes Joan Robinson, who should have won the Nobel Prize in Economics but didn’t (because, of course, she was a non-neoclassical, female economist) and can’t (because she’s dead).

[FRED chart: jobs eliminated in the United States, by month]

Mainstream economists and politicians have answers for everything.

Lose your job? Well, that’s just globalization and technology at work. Not much that can be done about that.

And if you still want a job? Then just move to where the jobs are—and make sure your children go to college in order to prepare themselves for the jobs that will be available in the future.

The fact is, they’re not particularly good answers. And people know it. That’s why working-class voters are questioning business as usual and registering their protest by supporting—in the case of Brexit, the 2016 U.S. presidential election, the 2017 snap election in Britain, and so on—alternative positions and politicians.

On the first point, it’s not simply globalization and technology. Large corporations, which employ most people, are the ones that decide—in the context of a global economy and by developing and adopting new technologies—when and where some jobs will be destroyed and new ones created. They use the surplus they appropriate from their existing workers to determine the pattern of job destruction and creation, in order to get even more surplus.

Thus, in April 2017 (according to the data in the chart at the top of the post), employers eliminated 1.6 million jobs in the United States. In January 2009, things were even worse: corporations destroyed 2.6 million jobs across the U.S. economy. Of course, they also create new jobs—often in different companies, industries, regions, and countries. That leaves individual workers with the sole decision of whether or not to chase those jobs, since as a group they have absolutely no say in when or where old jobs are destroyed and new ones created.

What about their children and the advice to go to college? We already know the idea that higher education successfully levels the playing field across students with different backgrounds is a myth (and sending more kids to college doesn’t do much, if anything, to lower inequality).

Now we’re learning that, when states suffer a widespread loss of jobs, the damage extends to the next generation, where college attendance drops among the poorest students.

That’s the conclusion of new research by Elizabeth O. Ananat and her coauthors, just published in Science (unfortunately behind a paywall). What they found is that

local job losses can both worsen adolescent mental health and lower academic performance and, thus, can increase income inequality in college attendance, particularly among African-American students and those from the poorest families.

Their argument is that macro-level job losses are best understood as “community-level traumas” that negatively affect the learning ability and the mental health not only of young people who experience job loss within their own families, but also of the other children in states where the destruction of jobs is widespread.

So, the problem can’t be solved by leaving individual workers with only the freedom to chase after jobs and send their children to college. Nor is the predicament confined to the white working class. In fact, the effects of job losses are similar, but even worse, among African-American youth.

That’s why Ananat argues that

white working class people and African-American working class people are in the same boat due to job destruction. Imagine the policies we could have if folks found common ground over that.

And, I would add, those policies need to go beyond the “active labor market policies”—such as “rigorous job training and active matching of worker skills to employer needs”—the authors, along with mainstream economists and politicians, put forward.*

We also need to reconsider the fact that, within existing economic institutions, employers are the only ones who get to decide when and where jobs are destroyed and created. Giving workers the ability to participate as a group in the decisions about jobs—within existing enterprises, and by assisting them to form their own enterprises—would improve their own mental health and that of the members of the wider community.

Such a change would also transform young people’s decisions about whether or not to go to college. It’s not just about jobs in the new economy. It would allow them to demand, as women in Lawrence, Massachusetts did over a century ago, both “bread and roses.”

 

*Policies to help “disadvantaged workers, especially African Americans, Hispanics and rural residents,” also need to go beyond encouraging the Fed to keep interest-rates low. That still leaves job decisions in the hands of employers.

[Chart: shares of U.S. wealth owned by the top 1 percent, bottom 90 percent, and bottom 50 percent]

One of the most pernicious myths in the United States is that higher education successfully levels the playing field across students with different backgrounds and therefore reduces wealth inequality.

The reality is quite different—for the population as a whole and, especially, for racial and ethnic minorities.

As is clear from the chart above, the share of wealth owned by the top 1 percent has risen dramatically since the mid-1970s, from 22.9 percent in 1976 to 38.6 percent in 2014. Meanwhile, the share owned by the bottom 90 percent has declined, from 34.2 percent to 27 percent. And that of the bottom 50 percent? It has remained a negligible amount, falling from 0.9 percent to zero.

During that same period, according to the U.S. Census Bureau (pdf), the proportion of Americans aged 25 to 29 with a bachelor’s degree or higher rose from 24 percent to 36 percent. (For the entire population 25 and older, the percentage with that level of education rose from 15 to 33.)

So, no, higher education has not leveled the playing field or reduced wealth inequality. In fact, quite the opposite appears to be the case.

And that’s true, too, for racial and ethnic disparities in wealth. As William R. Emmons and Lowell R. Ricketts (pdf) of the Federal Reserve Bank of St. Louis have concluded,

Despite generations of generally rising college-graduation rates, higher education’s promise of significantly reducing income and wealth disparities across all races and ethnicities remains largely unfulfilled . . . rather than promoting economic equality across all races and ethnicities, higher education unintentionally has become an engine for growing disparities.

[Chart: median family wealth by education level and race/ethnicity, 2013]

Thus, for example, median Hispanic and black wealth levels decline relative to similarly educated whites as education increases until the very top. Moreover, only about 7 percent of black families and 5 percent of Hispanic families have postgraduate degrees, and wealth disparities remain large even there.

Darrick Hamilton and William A. Darity, Jr. (pdf), who participated in the same symposium, go even further. According to them, the United States has a fundamental problem in discussing wealth disparities according to race and ethnicity:

Much of the framing around wealth disparity, including the use of alternative financial service products, focuses on the poor financial choices and decisionmaking on the part of largely Black, Latino, and poor borrowers, which is often tied to a culture of poverty thesis regarding an undervaluing and low acquisition of education.

Thus, while they agree that a college degree is positively associated with wealth within racial and ethnic groups, it is still the case that it does little to address the massive wealth gap across such groups.

And yet the myth persists. American elites and policymakers still choose to emphasize the economic returns to education as the panacea for socially established wealth disparities and structural barriers to racial and ethnic economic inclusion.

The question is, why?

According to Hamilton and Darity, such a view

follows from a neoliberal perspective, where the free market, as long as individual agents are properly incentivized, is supposed to be the solution to all our problems, economic or otherwise. The transcendence of Barack Obama becomes the ideal symbolism and spokesperson of this political perspective. His ascendency becomes an allegory of hard work, merit, efficiency, social mobility, freedom and fairness, individual agency, and personal responsibility. The neoliberal ideology is not limited to race. It more generally places the onus on individual actions, and more broadly leads to deficiency narratives for low achievement, but this is especially the case when considering race and other stigmatized workers. Perhaps the greatest rhetorical victory of this paradigm is convincing the masses that implicit in unfettered markets is the “American Dream”—the hope that, even if your lot in life is subpar, with patience and individual hard work, you can turn your proverbial “rags into riches.”

And so the myth of college and the American Dream is perpetuated, while the unequal distribution of wealth—across the entire population, and especially with respect to ethnic and racial minorities—which has been growing for decades, continues unabated.


Back in graduate school, I was a member of SUPE, Students United for Public Education. We conducted a study in which we showed that the very rich and seemingly private Harvard University received more public monies than our own poorly funded and very public University of Massachusetts-Amherst.

A new study, by Open Books (pdf), broadens that study by investigating the amount of public monies that are funneled to the eight Ivy League schools: Harvard, Princeton, Yale, Cornell, Columbia, Dartmouth, Penn, and Brown.

The amount of taxpayer-funded payments and benefits—$41.59 billion over a six-year period (FY2010-FY2015)—is by itself extraordinary: annually, the Ivy League receives more money from the federal government ($4.31 billion) than sixteen states do.

But we’re also talking about universities whose endowment funds (in 2015) exceeded $119 billion, which is equivalent to nearly $2 million per undergraduate student. In FY2014, the balance sheet for all Ivy League colleges showed just under $195 billion in accumulated gross assets—equivalent to $3.35 million per undergraduate student. The Ivy League also employs 47 administrators who each earn more than $1 million per year (two executives each earned $20 million between 2010 and 2014). And, in a five-year period (2010-2014), the Ivy League spent $17.8 million on lobbying, which included issues mostly related to their endowment, federal contracting, immigration and student aid.

The bottom line is clear: the Ivy League schools are nominally private universities that receive vast amounts of public financing—much more than the public colleges and universities that educate most students in the United States.