Posts Tagged ‘physics’

In a recent article, Dan Falk [ht: ja] identifies a fundamental problem in contemporary physics:

many physicists working today have been led astray by mathematics — seduced by equations that might be “beautiful” or “elegant” but which lack obvious connection to the real world.

What struck me is that, if you changed physics and physicists to economics and economists, you’d get the exact same article. And the same set of problems.

Economists—especially mainstream economists but, truth be told, not a few heterodox economists—are obsessed with mathematics and formal modeling as the only correct methods for achieving capital-t truth. Mathematical modeling for them represents the best, most scientific way of producing, disseminating, and determining the veracity of economic knowledge—because it is logical, concise, precise, and elegant.* In that sense, mathematics represents what can only be described as a utopia for the practice of modern economics.**

Mathematical utopianism in economics is based on elevating mathematics to the status of a special code or language. It is considered both a neutral language and, at the same time, a language uniquely capable of capturing the essence of reality. Thus, economists see mathematics as having both an underprivileged and overprivileged status vis-à-vis other languages.

Let me explain. On one hand, mathematics is understood to be a neutral medium into which all statements of each theory, and the statements of all theories, can be translated without modifying them. Mathematics, in this view, is devoid of content. It is neutral with respect to the various theories where it is applied. Partial and general equilibrium, game theory, and mathematical programming are concepts that serve to communicate the content of a theory without changing that content. Similarly, mathematical operations on the mathematized objects of analysis are considered to be purely formal. Thus, as a result of the conceptual neutrality of the methods and procedures of mathematical formalization, the objects of analysis are said to be unaffected by their mathematical manipulation.

On the other hand, mathematics is considered to be uniquely capable of interpreting theory in its ability to separate the rational kernel from the intuitional (vague, imprecise) husk, the essential from the inessential. It becomes the unique standard of logic, consistency, and proof. Once intuitions are formed, mathematical models can be constructed that prove (or not) the logical consistency of the theory. Other languages are considered incapable of doing this because the operations of mathematics have an essential truth value that other languages do not possess. Mathematical statements, for example, are considered to be based on the necessity of arriving at conclusions as a result of following universal mathematical rules.

It is in these two senses that mathematics is considered to be a special language or code. It is more important than other languages in that it is uniquely capable of generating truth statements. It is also less important in that it is conceived to have no impact on what is being thought and communicated.

The notion of mathematics as a special code is linked, in turn, to the twin pillars of traditional epistemology, empiricism and rationalism. The oversight of mathematics implied by its underprivileged status is informed by an empiricist conception of knowledge: mathematics is considered to be a universal instrument of representation. It is used as a tool to express the statements of a discourse that already, always has an essential grasp on the real. It is the universal language in and through which the objects (and the statements about those objects) of different economic and social theories can all be expressed. In other words, the role of mathematics is to express the various “intuitive” statements of the theorist in a neutral language such that they can be measured against reality.

The underprivileged position of mathematics that is linked to an empiricist epistemology contrasts sharply with the overprivileged status of mathematics. This overprivileged conception of mathematics is associated with a rationalist theory of knowledge wherein the subject-object dichotomy is reversed. Here the subject becomes the active participant in discovering knowledge by operating on the theoretical model of reality. In this sense, the logical structure of theory—not the purported correspondence of theory to the facts—becomes the privileged or absolute standard of the process of theorizing. Reality, in turn, is said to correspond to the rational order of thought. The laws that govern reality are deduced from the singular set of mathematical models in and through which the essence of reality can be grasped.***

The conception of mathematics as a mere language contains, however, the seeds of its own destruction. The notion of language as a simple medium through which ideas are communicated has long been challenged—since language is both constitutive of, and constituted by, the process of theorizing. The use of mathematics in economics thus may be reconceptualized as a discursive condition of theories, which constrains and limits, and is partly determined by, those theories. Mathematical concepts—such as the equilibrium position associated with the solution to a set of simultaneous equations, the exogenous status of the rules of a game, or the definition of a series of overlapping value functions to optimize an overall goal—partly determine the notions of relation and causality among the theoretical objects designated by the theories in which the means of mathematical formalization are utilized. They are not the neutral conceptual tools to which the propositions of different theories can be reduced. Similarly, the rationalist idea of abstraction, of simplification, also leads to a fundamental problem. It implies that there is a noise that ultimately escapes the “fictional” mathematical model. It implies an empirical distance between the model and its domain of interpretation, the empirical concrete. And that distance is conceived to be part of the empirical concrete itself. There is a part of reality that necessarily escapes the model. Thus, rationalist deductions from the model cannot produce the truth of the real because something is always “missing.”
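
To make concrete what “the equilibrium position associated with the solution to a set of simultaneous equations” looks like in practice, here is a minimal sketch (my own illustration, with made-up linear demand and supply curves, not an example drawn from the text):

    # A made-up linear market: demand q = a - b*p, supply q = c + d*p.
    # "Equilibrium" is simply the price at which the two equations are
    # simultaneously satisfied.
    a, b = 100.0, 2.0   # hypothetical demand intercept and slope
    c, d = 10.0, 1.0    # hypothetical supply intercept and slope

    p_star = (a - c) / (b + d)   # solve a - b*p = c + d*p for p
    q_star = a - b * p_star
    print(f"equilibrium price {p_star:.1f}, quantity {q_star:.1f}")
    # Once a theory is cast in this form, its notion of "the economy" is
    # whatever satisfies the equations, which is the kind of discursive
    # effect described above.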

So, what’s the alternative? As I see it, there is a double movement that involves both the rejection of mathematics as the discovery of an extra-mathematical reality and the critique of the notion that mathematics merely expresses the form in which otherwise nonmathematical theories are communicated. Thus, for example, it is possible (using, e.g., the insights of Ludwig Wittgenstein and Edmund Husserl) to argue that mathematics is a historical, social invention, not a form of discovery of an independent reality; it is not discovered “out there,” but invented and reinvented over time based on rules that are handed down by mathematicians and the actual users of mathematics (such as economists). By the same token, we can see mathematics as introducing both new concepts and new forms of reasoning into other domains, such as economics and, for that matter, physics (which is exactly what Gaston Bachelard has argued).

This double movement has various effects. It means that there are no grounds for considering mathematics to be a privileged language with respect to other, nonmathematical languages. There is, for example, no logical necessity inherent in the use of the mathematical language. The theorist makes choices about the kinds of mathematics that are used, about the steps from one mathematical argument to another, and whether or not any mathematics will be used at all. Different uses (or not) of mathematics and different kinds of mathematics will have determinate effects on the discourse in question. Discourses change as they are mathematized—they are changed, not in the direction of becoming more (or less) scientific, but by transforming the way the objects of the discourse are constructed, and the way statements are made about those objects.

Ultimately, this deconstruction of mathematics as a special code leads to a rejection of the conception of mathematics as a special language of representation. The status of mathematics is both more representational and less representational than allowed by the discourse of representation. More, in the sense that mathematics has effects on the very structure of the mathematized theory; mathematics is not neutral. Less, to the extent that the use of mathematics does not guarantee the scientificity of the theory in question; it is merely one discursive strategy among others.

One alternative approach to making sense of the use of mathematics in economic theory is to consider mathematics not in terms of representation, but as a form of “illustration.” For economists, mathematical concepts and models can be understood as metaphors or heuristic devices that illustrate part of the contradictory movement of economic and social processes. These concepts and models can be used, where appropriate, to consider in artificial isolation one or another moment in the course of the constant movement and change in the economy and society. Mathematics may be used, then, to illustrate the statements of economic theory but, like all metaphors (in economics as in literature and other areas of social thought), it outlives its usefulness and then has to be dismantled.

As I see it, this conception of mathematical models as illustrative metaphors does not constitute a flat rejection of their use in economic theory. Rather, it accords to mathematical concepts and models a discursive status different from the one that is attributed to them in the work of mathematical economists. It accepts the possibility—but not the necessity—of using mathematical propositions as metaphors that are borrowed from outside of economic theory and transformed to teach and develop some of the concepts and statements of one or another economic theory.

Deconstructing the status of mathematics as a special code has the advantage of transforming both the way economics is done within any particular theory and the way the debate between different economic theories itself is conducted. It undermines the Truth-effect associated with mathematical utopianism and focuses attention, instead, on the conditions and consequences of different ways of thinking about the economy.

That debate—about the effects of different languages on economics, and the effects of different economic theories on the wider society—has its own utopian moment: transforming economics into a space not of blind obedience to mathematical protocols, but of real theoretical and political choices.

 


*From time to time, there have been a few admonishments from among economists themselves. Oskar Morgenstern (e.g., in his essay “Limits to the Uses of Mathematics in Economics,” published in 1963) and, more forcefully, Nicholas Georgescu-Roegen (especially in his 1971 The Entropy Law and the Economic Process), Philip Mirowski (e.g., in More Heat Than Light, in 1989), and Paul Romer have indicated some of the problems associated with the wholesale mathematization of economics. However, even their limited criticisms have been ignored for the most part by economists.

In recent years, students (such as the members of the International Student Initiative for Pluralism in Economics) have been at the forefront of questioning the fetishism of mathematical methods in economics:

It is clear that maths and statistics are crucial to our discipline. But all too often students learn to master quantitative methods without ever discussing if and why they should be used, the choice of assumptions and the applicability of results. Also, there are important aspects of economics which cannot be understood using exclusively quantitative methods: sound economic inquiry requires that quantitative methods are complemented by methods used by other social sciences. For instance, the understanding of institutions and culture could be greatly enhanced if qualitative analysis was given more attention in economics curricula. Nevertheless, most economics students never take a single class in qualitative methods.

Their pleas, too, have been mostly greeted with indifference or contempt by economists.

**As I see it, the current fad of relying on randomized experiments and big data does not really undo the longstanding utopian claims associated with mathematical modeling, since the formal models are still there in the background, orienting the issues (including the choice of data sets) taken up in the new experimental and data-heavy approach to economics. In addition, there is the problem that others—such as John P. A. Ioannidis et al. (unfortunately behind a paywall)—have discovered: most empirical studies in economics rely on data sets that are much too small to reliably detect effects of the size they report. This means that a sizable fraction of the findings reported by economists are simply the result of publication bias—the tendency of academic journals to report accidental results that only appear to be statistically significant.
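
To see the mechanics at work, here is a toy simulation sketch (my own illustration, not drawn from Ioannidis et al.; the numbers for the true effect, the noise, and the sample size are made up) of how low statistical power combined with a publish-only-if-significant filter inflates published estimates:

    # Toy simulation: low power plus a significance filter exaggerates
    # reported effects. All numbers (true_effect, sigma, n) are made up.
    import random
    import statistics

    random.seed(0)

    true_effect = 0.2   # hypothetical true effect size
    sigma = 1.0         # noise in individual observations
    n = 30              # small sample: an underpowered study
    studies = 10_000    # number of simulated studies

    published = []
    for _ in range(studies):
        sample = [random.gauss(true_effect, sigma) for _ in range(n)]
        mean = statistics.mean(sample)
        se = statistics.stdev(sample) / n ** 0.5
        # crude two-sided test at roughly the 5 percent level
        if abs(mean) / se > 1.96:
            published.append(mean)

    print(f"true effect:                {true_effect:.2f}")
    print(f"share of studies published: {len(published) / studies:.2%}")
    print(f"mean published estimate:    {statistics.mean(published):.2f}")
    # With these made-up numbers, only a minority of studies clear the
    # significance bar, and those that do report an effect well above the truth.

Nothing hinges on the particular numbers: whenever power is low and only “significant” results get reported, the surviving estimates overstate the underlying effect.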

***Economists often move back and forth between the two otherwise diametrically opposed conceptions of mathematics because they represent two sides of the same epistemological coin: although each reverses the order of proof of the other, both empiricism and rationalism presume the same fundamental terms and some form of correspondence between them. In this sense, they are variant forms of an “essentialist” conception of the process of theorizing. Both of them invoke an absolute epistemological standard to guarantee the (singular, unique) scientificity of the production of economic knowledge.

Mark Tansey, “Coastline Measure” (1987)

 

I’ve been over this before.

But I continue to be amazed at the ubiquitous, facile references to science, evidence, and facts and the derision that is directed at the proposition that we live in a post-truth world. On topics as diverse as climate change, globalization, and the role of the working class in electing Donald Trump, commentators invoke Truth, with a capital t, as an obvious, unproblematic characteristic of making statements about what is going on in the world.

To me, they’re about as silly—and dangerous—as attempting to measure the coastline using a tape measure.

This is the case even in studies, such as those conducted by Tali Sharot [ht: ja], about the supposed diminishing influence of evidence and the existence of confirmation bias.

The very first thing we need to realize is that beliefs are like fast cars, designer shoes, chocolate cupcakes and exotic holidays: they affect our well-being and happiness. So just as we aspire to fill our fridge with fresh fare and our wardrobe with nice attire, we try to fill our minds with information that makes us feel strong and right, and to avoid information that makes us confused or insecure.

In the words of Harper Lee, “people generally see what they look for and hear what they listen for.”

It’s not only in the domain of politics that people cherry-pick news; it is apparent when it comes to our health, wealth and relationships.

At one level, this makes sense to me. There’s a great deal of confirmation bias when we try to make sense of various dimensions of our lives and the world in which we live.

But. . .

I also think people are curious about things—information, experiences, and so on—that don’t seem to fit their existing theories or discourses. And, when they do attempt to make sense of those new things, their ideas change (and, of course, as their ideas change, they see things in new ways).

Perhaps even more important, while people like Sharot acknowledge that people often “accept evidence that confirms their preconceived notions and assess counter evidence with a critical eye,” they never consider the possibility that the people who are conducting the research concerning confirmation bias are themselves subject to that same bias.

Why is it always people out there—you know, “the ones who are thinking about health, wealth, and relationships”—who cherry-pick the facts? What about the so-called scientists, including the ones who invoke the Truth; why aren’t they also subject to confirmation bias?

Sharot invokes “the way our brain works”—without ever acknowledging that she and her coinvestigators also use one theory, and ignore or reject other theories, to make sense of the brain and the diverse ways we process information. Others rely on the “scientific evidence” concerning climate change or the gains from globalization or the existence of a resentful white (but not black or Hispanic) working class, which in their view others deny because they don’t believe the obvious “facts.”

What’s the difference?

I can pretty much guess the kind of response that will be offered (because I see it all the time, especially in economics): the distinction between everyday confirmation bias and real, Truth-based knowledge stems from the use of the “scientific method.”

The problem, of course, is there are different scientific methods, different ways of producing knowledge—whether in economics or cognitive neuroscience, political science or physics, anthropology or chemistry. All of those forms of knowledge production are just as conditioned and conditional as the way nonscientists produce (and consume and disseminate) knowledges about other aspects of the world.

As for me, I can’t wait for this period of fake interest in capital-t Truth to pass. Maybe then we can return to the much more interesting discussion of the conditionality of all forms of knowledge production.

 

It’s clear we are in the midst of an acute period of inequality: not only of grotesque levels of economic inequality (which are now well documented) but also of a wide-ranging discussion of the conditions and consequences of that extreme inequality (which appears to be taking off).

There are, of course, the deniers, like my dear friend Deirdre McCloskey. “What inequality?” is her mantra. The only thing that matters is economic growth, such that the amount of stuff people have today is much more than they’ve had throughout much of human history. OK, but that doesn’t tell us much about how that growth took place (it’s the surplus, Deirdre) or what its consequences are (on the majority who actually produce the surplus versus the tiny minority who appropriate it).

And then there are those who are actually thinking seriously about inequality, some of whose work is published in the latest issue of Science (a lot of which, unfortunately, is behind a paywall). Leave aside the silly article on econophysics (really, the existing distribution of income is a kind of “natural inequality,” which is what you would get from entropy?), the article that focuses on the psychological pathologies of the poor (what about those of the rich?), and the fact that all the economics is narrowly confined to mainstream theories (which have done more to deflect attention from inequality over the course of the past three decades, in contrast to the wide range of heterodox theories that have actually focused on it). Just the fact that a special issue of such a prestigious journal is devoted to the problem of inequality tells us something about how it has risen to the top of our agenda.

And it offers lots here to think about: the types of inequality that can be found in the archeological record (Heather Pringle), the absence of fundamental inequalities in hunter-gatherer societies (Elizabeth Pennisi), the devastating effects of inequality on health (Emily Underwood), growing inequality in developing countries (Mara Hvistendahl and Martin Ravallion), the intergenerational transmission of inequality via unequal maternal circumstances and health at birth (Anna Aizer and Janet Currie), and finally a dire warning about what will happen if current inequalities continue to grow (Angus Deaton):

The distribution of wealth is more unequal than the distribution of income, and very high incomes will eventually pupate into very large fortunes, ultimately leading to a hereditary dystopia of idle rich.

The pair of articles by economists—one by Thomas Piketty and Emmanuel Saez, the other by David Autor—tells us a great deal about how the issue of inequality is being framed within mainstream economics (since, as I wrote above, all the various types of nonmainstream economics are simply ignored in the issue). For Piketty and Saez, it’s all about the inequality (both income and wealth) that separates the top 1 percent (and, within that, the top .1 percent and .01 percent) from everyone else, while Autor’s piece focuses on the inequality of earnings within the bottom 99 percent. The debate comes down to seeing inequality as a result of high CEO incomes and returns on accumulated wealth (especially when the rate of return on wealth is greater than the overall growth rate, leading to more concentration of wealth) versus the inequality that derives from earnings based on different levels and kinds of skill (presuming that earnings are equal to marginal productivities). In other words, it’s a (mostly) classical approach—which focuses on scarce wealth concentrated in the hands of the already rich—versus a (thoroughly) neoclassical approach—according to which scarce skills attract higher earnings. The solution from the classical perspective is a global tax on wealth; from the neoclassical viewpoint, all we need is an increase in education and skills for those at the bottom.
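
As a rough illustration of the classical mechanism just mentioned (the rate of return on wealth exceeding the overall growth rate), here is a stylized sketch with made-up numbers, my own and not taken from Piketty and Saez:

    # Stylized illustration of r > g: if the return on wealth (r) exceeds the
    # growth rate of income (g) and capital income is fully reinvested (a
    # strong, simplifying assumption), the wealth-to-income ratio keeps rising.
    r, g = 0.05, 0.02           # hypothetical return on wealth, income growth
    wealth, income = 5.0, 1.0   # arbitrary starting values

    for year in range(51):
        if year % 10 == 0:
            print(f"year {year:2d}: wealth/income = {wealth / income:5.1f}")
        wealth *= 1 + r
        income *= 1 + g
    # With these numbers the ratio roughly quadruples over fifty years,
    # which is the concentration dynamic at issue in the classical story.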

Here’s what I find interesting about the debate, not only between the economists but throughout the entire special issue: it’s all about economic inequality—what it is (absolute or relative), how it can be measured (within and across nations, and over time), what its causes and consequences are (including not only the health of individuals but also of society as a whole), and so on—but there’s not a single mention of class.

Not literally. The word class doesn’t appear in any of the articles or reviews. But class is the specter that, in my view, haunts this entire debate. We saw it back in the First Great Depression. And now we’re seeing it rear its ugly head once again, in the midst of the Second Great Depression. We didn’t solve it then. Perhaps, now, we’re ready to tackle it.

And, if we don’t, we’ll be faced with even more inequality all the time.

Update

As if on cue, the latest issue of the American Spectator focuses on what they consider to be the “new class warfare”—using as a threat the universal symbol of “off with their heads.”

For which Gavin Mueller offers the only appropriate response:

Remember this: no matter how many country clubbers flip through Piketty’s book, at bottom, the rich hate us. They disdain us. They mock us. And they fear us, even though the current balance of forces favors them overwhelmingly and sometimes “common ruin of the contending classes” seems like an optimistic outcome.

Yet I have to fall back on some advice I got as a kid: If the American Spectator wants to cry about class warfare, we should give them something to cry about.

 

Those of us in economics have known about the physics-envy of neoclassical economists since at least 1989 (when More Heat than Light: Economics as Social Physics was published). Physicist Lee Smolin, it seems, discovered the same phenomenon after 2007, when he started to read neoclassical economics.

This is from the transcript:

So why is the notion of equilibrium so powerful? I think part of the answer is this idea of physics envy, that economists thought that what they’re doing was more scientific, hence more correct, if it looked like physics. And physics had this timeless picture in which what really mattered, as we were saying before, is the whole history of the system. And in physics there’s also a big notion of coming to equilibrium which is, although however, it’s important to say, a different notion of equilibrium. And somehow people in economics got seduced into this model which again works in the small – if you have a small little corner of the economy, a small market – it may work for a while to characterize approximately what’s going on. Arbitrage is not always there. It’s not always, I mean, arbitrage, arbitrage does get eaten up. There are market forces which do push you towards equilibria. There’s some truth in it.

But the whole thing is a disaster if I can say that as an outsider. And it led indirectly – it wasn’t the only reason why regulations were lifted on markets and trading through the decades, but when people were making arguments to Congress, to the President’s office that the economy would be better off without regulation, this was the “scientific rationale for it” and led to the very unstable situation of the last economic crisis.

And indeed there’s still a very dangerous and unstable situation in the world economy because – well, I’m not an economist. I’m not gonna pontificate about the problems in the economy, but one could see how the idea of timelessness gave false comfort to an unsuccessful scientific theory in the realm of economics.

And that’s what happens when only neoclassical economics is taught—when students are not introduced to theories other than neoclassical economics or, for that matter, to the history and philosophy of science. They equate economics with the idea of presuming a unique timeless equilibrium outside of history and fail to learn that plenty of theoretical alternatives exist.

I used to understand physics. Now, I don’t. Not at all.

I don’t understand dark energy and dark matter. And I don’t understand the latest “observation” of the Higgs boson in the data from the U.S. Tevatron accelerator before it was shut down. They’re beyond my 20-year-old (even then, partial and incomplete) understanding of physics.

I do, however, get the statistics (and the coin-toss arithmetic in the list below is easy to verify, as the short calculation after it shows):

  • Particle physics has an accepted definition for a “discovery”: a five-sigma level of certainty
  • The number of standard deviations, or sigmas, is a measure of how unlikely it is that an experimental result is simply down to chance rather than a real effect
  • Similarly, tossing a coin and getting a number of heads in a row may just be chance, rather than a sign of a “loaded” coin
  • The “three sigma” level represents about the same likelihood as tossing more than eight heads in a row
  • Five sigma, on the other hand, would correspond to tossing more than 20 in a row
  • With independent confirmation by other experiments, five-sigma findings become accepted discoveries
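
Here is that check (my own sketch, not from the original source), converting the one-sided Gaussian tail probabilities for three and five sigma into the number of consecutive heads with roughly the same probability of occurring by chance:

    # Convert 3-sigma and 5-sigma one-sided tail probabilities into the
    # number of consecutive heads of a fair coin with about the same
    # probability of occurring by chance.
    import math

    def tail_probability(sigmas: float) -> float:
        """One-sided Gaussian tail probability beyond `sigmas` standard deviations."""
        return 0.5 * math.erfc(sigmas / math.sqrt(2))

    for sigmas in (3, 5):
        p = tail_probability(sigmas)
        heads = math.log2(1 / p)   # n heads in a row has probability 2**-n
        print(f"{sigmas} sigma: p ~ {p:.2e}, about {heads:.1f} heads in a row")

    # Approximate output:
    # 3 sigma: p ~ 1.35e-03, about 9.5 heads in a row
    # 5 sigma: p ~ 2.87e-07, about 21.7 heads in a row

So “more than eight heads in a row” and “more than 20 in a row” are indeed about the right orders of magnitude.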

And I’m quite sympathetic to the spontaneous philosophy of the physicists:

Most professional physicists would say that finding the Higgs in precisely the form that theory predicts would actually be a disappointment. Large-scale projects such as the LHC [Large Hadron Collider] are built with the aim of expanding knowledge, and confirming the existence of the Higgs right where we expect it – while it would be a triumph for our understanding of physics – would be far less exciting than not finding it. If future studies definitively confirm that the Higgs does not exist, much if not all of the Standard Model would have to be rewritten. That in turn would launch new lines of enquiry that would almost certainly revolutionise our understanding of the Universe, in much the same way as something missing in physics a century ago led to the development of the revolutionary ideas of quantum mechanics.

But I still have no idea what a bump in the data between 115 and 135 gigaelectronvolts looks like or what it would mean to rewrite the Standard Model.