David Golumbia’s response to Brian Lennon’s “Gaming the System.”
Playing with Rules
L. “Weberian conflation of capitalist rationalism with evangelical Christianity, which G. doesn’t do much to differentiate.” (6,756)
G. Fair enough - I think Weber is tremendous on these subjects, especially on the idea of capitalist rationalism itself, and he had a lot of relevant and correct things to say about the conjoined nature of religious and economic beliefs. But the book’s “failure to differentiate” between them is proffered, no doubt much too softly, to draw attention to their nearly identical overt commitments to a transcendental signified; my hope is to shed light on the (largely) unexplored shared conceptual ground between the high techno-cultural gamers of the neoliberal elite (Rove, Scalito, Beck, and so on) and the evangelical doctrine that informs so much of the mass they carefully manipulate to support their views. That shared ground is an explicit absolutism about meaning that they claim undergirds all of their practice - on the one hand the “original intent” of the framers of the Constitution, and on the other a specifically literalist reading of the Bible. It is the style of interpretation that is precisely the same, and in both cases critical observers will often note that practice virtually never follows theory - that is, the literalist/transcendentalist position about meaning is almost always an ideological construction, one that power and force override at the last moment. Which is exactly how the Supreme Court operates today: literalism when it fits the right’s vision of political authority, and absolute political authority otherwise. That is to say, when the right wants to actually accomplish its goals, its overt Aristotelianism gives way to a radically Wittgensteinian philosophy of language (or as Frank Luntz puts it, channeling Stanley Fish: It’s Not What You Say, It’s What They Hear).
L. “[G.], far from accepting that Chomsky has any place on the left at all, near or far, banishes him unceremoniously to the right” (6,869)
G. I must apologize if the book gives this impression; I fully accept the overt left-anarchist-libertarianism of Chomsky’s political work, and try to point at this at least obliquely, e.g., “while Chomsky’s real-world politics are widely understood to be radically left, his institutional politics are often described exactly as authoritarian” (33); it is Chomsky qua linguist whom I am trying to situate in a rightist politics. I find his political writings incredibly helpful, almost unique, and authentically leftist. At the same time, I reject Chomsky’s persistent claim that the political and scientific views he holds are unrelated both in the world at large and in Chomsky’s own mind; I see this claim itself as part of rightist discourse inside the academy. Because Chomsky refuses to accept any political conditioning of “scientific” discourse (and because he, uniquely, tries to identify linguistics almost exclusively with the hard sciences), his views have been easy to integrate with the rightist parts of certain disciplines, which Chomsky refuses to see as rightist because he denies the interconnections between such spheres altogether. I do not even mean to assert that Chomsky’s internal institutional politics are rightist (indeed, my use of “institutional” in the book is probably mistaken, because it is really Chomsky’s disciplinary position as a linguist that is his specifically rightist aspect); my understanding is that he has been at least somewhat “activist” in his promotion of women, minorities, and even the study of endangered languages to some extent, during his career at MIT. There is more to say about the psychological frame out of which Chomsky sees the world: why, that is, he chooses to occupy “rightist” intellectual and “leftist” public political positions. My goal is to bring into broader view and consideration the largely unspoken politics of his linguistic views; I do not mean in any way to question the overt politics of his political views. 
To put the matter most starkly, and also speculatively: it would be no surprise to learn that Chomsky’s linguistic views, especially in their Fodorian cognitive-science frame, have many adherents and even some advocates in conservative think tanks (the RAND Corporation, the Cato Institute, etc.) and the bodies of the national security state, and that few there who are interested in the topic subscribe to alternative views. One suspects that Chomsky’s political writings are treated with ridicule, when they are treated at all, in those venues (as a rule; I am sure there are exceptions).
L. A “risk… that remains, in the end, fairly distant” (12,841)
G. The risks I’m talking about seem anything but distant - many of them have already been realized. The title of your review points toward a signal ambiguity in the word gaming. We think we know what it is to play a game, but in your title the word takes on a more sinister meaning. In the book I discuss the practices of corporations like Enron and the early Microsoft (and Ballmer and Gates in their Harvard days) in which winning the game involves something much more like what a surface account of that game would call cheating. Is cheating part of playing a game, or not? When you game a system, are you playing the game, or not?
While there certainly are examples of raw computing power being used to dominate industries (I discuss Walmart and Apple, and Microsoft deployed both power and cheating strategies in its first decades), even more apparent are examples where computers are used to mystify the object which is supposed to be at the center of some transaction, and thereby allow the mystifiers to mask activity that would in fact be “against the rules” were it done in public view.
We see these effects everywhere. One critical example is the world of high finance and the economic instability in recent world markets. Institutions and people are to blame and should be held responsible, much more responsible than many of them have been; but it’s worth meditating on the persistence of and even critical roles played by computational practices and tropes in so many of these events.
In the first category are the ways in which what we might call the literal and effective use of computation has led to a concentration of risk, in terms of both speed and the amount of capital at risk:
* Long-Term Capital Management (LTCM), the high-flying hedge fund of the 1990s backed by capitalists as central as Myron Scholes and Robert Merton (who were two-thirds responsible for the Black-Scholes option pricing model, itself deeply computational and somewhat obscurantist, and in another sense ‘the origin’ of the current crisis; see Holmes 2009; “Long-Term Capital Management”), collapsed under huge amounts of leverage (i.e., credit), borrowed because, according to LTCM’s proprietary computer simulations, the positions should have been safe. But the “key assumption that the models depended on was the high correlation between the long and short positions… During LTCM’s crisis, however, this correlation dropped to 80%. Stress-testing against this lower correlation might have led LTCM to assume less leverage in taking this bet” (“Case Study: Long-Term Capital Management”).
* A variety of technologies today lumped under the heading “automated trading” employ computational practices at a level of sophistication such that human beings literally cannot understand the strategies being executed by the computer; these ingest not simply the quantitative information one might expect but also “simple news” like standard corporate reports, and “more difficult to understand news… [such as] market sentiment (deciding if the news is good or bad)” (“Automated Trading”). This results in two kinds of risk: first, that it is simply not possible to manage a “black box” whose operation one cannot comprehend; and second, that the presence of such “black boxes” on both sides of trades creates a highly unstable market, so that, “as the markets discovered in the collapse of August 2007, a lot of hedge funds using sophisticated computer models were making very similar bets. Rather than offsetting each other’s risks, they were reinforcing them” (Kuttner 2007).
* A particular kind of automated trading, pioneered by Bernard Madoff, is “high-speed trading” - which should be read with all the Virilio one can muster - in which a legitimate effort to extract the tiny profits available from trading just that much faster than one’s opponent has now led to a world that sounds like a kind of machine warfare: “The biggest volume generators at the moment are high-frequency trading firms you’ve never heard of - GETCO, founded a whopping 10 years ago, is the granddaddy - that try to get ahead of millisecond-by-millisecond price movements and take advantage of rebates paid by exchanges to those who create liquidity (that is, offer to buy or sell stock at a certain price and assume the market risk)” (Altaffer 2009). The practice raises questions of systemic risk that, alongside many others, have not been addressed by regulators. In the words of the Wall Street Journal, there is a second concern, more like cheating: “regulators worry that certain unscrupulous participants in the market with ultrafast computer technology could game these orders, trading ahead of them and affecting the price of the security” (Patterson and Rogow 2009).
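The stress-testing point in the LTCM example can be made concrete with a toy calculation. This is a minimal sketch with made-up volatilities, not LTCM’s actual model: the risk of a long-short “convergence” trade depends sharply on the assumed correlation between its two legs, so a model that assumes a correlation of 0.95 will sanction far more leverage than one stress-tested at 0.80.

```python
import math

def long_short_vol(sigma_long, sigma_short, correlation):
    """Volatility of a long-short pair: the two variances add,
    minus twice the covariance between the legs."""
    variance = (sigma_long ** 2 + sigma_short ** 2
                - 2 * correlation * sigma_long * sigma_short)
    return math.sqrt(variance)

# Hypothetical legs, each with 10% volatility.
assumed = long_short_vol(0.10, 0.10, 0.95)   # correlation the model assumed
stressed = long_short_vol(0.10, 0.10, 0.80)  # correlation seen in the 1998 crisis

# If leverage is sized to a fixed risk budget, the "safe" position under the
# assumed correlation carries twice the volatility once correlation breaks down.
print(assumed, stressed, stressed / assumed)  # ratio is exactly 2.0 here
```

The numbers are invented, but the structural point matches the case study quoted above: stress-testing against the lower correlation would have implied taking on much less leverage.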
But second, and more prevalent, are the Enron-type stories, in which computerization and its “complexities” are used to hide outright fraud - cheating as a hyper-rational way of winning the game:
* Madoff’s two chief programmers played a critical role in his huge “Ponzi scheme,” not because they wrote programs per se, but because “they used their special computer skills to create sophisticated, credible and entirely phony trading records that were critical to the success of Madoff’s scheme” (“Madoff’s Former IT Experts Arrested”).
* I put “Ponzi scheme” in quotation marks because it is now clear that much of the entire market bubble was due to a different kind of computational deception. As Frontline recently reported, CFTC Chair Brooksley Born alerted Congress to the fact that an entirely unregulated area of the market had opened up: the OTC derivatives market. By “unregulated” what is meant is that, unlike the stock of a public corporation like Microsoft, an OTC derivative offers absolutely no way to establish the value of the security. An unregulated market means that entities can essentially create money on their own, because nobody knows whether their securities are worth what they say. Frontline forces the viewer to wonder just how clearly even Greenspan and Rubin understood that the economy was built on illusory support (“Brooksley Born”; “The Warning”).
* In recent years, the easiest way to accomplish the creation of imaginary money has been to securitize. The entire operation of securitization is something like writing an option on an underlying instrument - that is, trading cotton futures rather than trading cotton - and even the regulated part of this market has a profound distorting effect on the underlying “real” market in which it is generally said to lessen risk. A cotton market in which you can hedge your bets against price fluctuations allows you to weather good and bad years; but trading in the leveraged instruments without regard to actual cotton production can, if taken to extremes, become an end in itself, one that can for this reason have destructive effects within the same system (see Holmes 2009 on options markets in general and on the interestedness of the Black-Scholes model used to price them, itself only implementable with computers in the 1970s).
This was bad enough, but coupled with unregulated underlying instruments (junk mortgages with variable interest rates), any sense of rational pricing was lost. Yet because computers demand discrete input for the value of instruments, there was nothing particularly less “real” about these kinds of instruments than any others - they show up on your screen, they have a number next to them as a value, you trade them with your buddy. When a Goldman Sachs trader passed a CMO for $1,000,000 to his buddy at Lehman Brothers, and the underlying value was actually $2, but all the screens - from both proprietary trading systems, and from the internal programmers and analysts who built the securities in the first place - said it was worth the million: who was gaming whom? And with whose money?
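The Black-Scholes formula invoked above is itself compact enough to state; what made it “computational” was the need to evaluate it continuously across thousands of instruments. A minimal sketch of the standard European call price, with illustrative parameters (nothing here is drawn from any particular trading system):

```python
import math

def norm_cdf(x):
    """Standard normal cumulative distribution, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(spot, strike, t, rate, vol):
    """Black-Scholes price of a European call option.

    spot: current price of the underlying; strike: exercise price;
    t: time to expiry in years; rate: risk-free rate; vol: volatility.
    """
    d1 = (math.log(spot / strike) + (rate + 0.5 * vol ** 2) * t) / (vol * math.sqrt(t))
    d2 = d1 - vol * math.sqrt(t)
    return spot * norm_cdf(d1) - strike * math.exp(-rate * t) * norm_cdf(d2)

# Illustrative inputs: at-the-money call, one year out, 5% rate, 20% volatility.
price = black_scholes_call(100.0, 100.0, 1.0, 0.05, 0.20)
print(round(price, 2))  # about 10.45
```

The formula’s inputs are the crux of the critique above: the model produces a precise-looking number for any instrument whatsoever, provided someone supplies a volatility and an underlying value - which is exactly what could not be established for unregulated OTC derivatives.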
On this second note, though, and to return to a theme I brought up in my first response, reflecting on current politics, it is all too easy to see not just some recent events, but perhaps more frighteningly the majority of recent events, as being subject to exactly this kind of computationalist gaming. In particular, capital itself - embodied in the form of the ownership class - seems to have “learned” how to execute this strategy. Supreme Court decisions like Bush v. Gore and Citizens United look more than anything like capital taking control of institutions in a hierarchical fashion: the ownership class, the concentration of capital, dictating exactly what it wants from constitutional power, over against exactly the principles it used to install itself into power. That is to say that the overt game was to install Alito, Roberts, even Scalia and Thomas as if they conformed to a principle to which a certain popular assent had been precalculated, even if the final results (the two cases in question) emerge as among the most “activist,” “legislative,” and “political” of decisions in US Supreme Court history (and even if some, but not all, of the intellects behind this gaming have concocted bizarre stories to justify their actions on literalist grounds). In its crudest form: “Let’s say we believe in literal interpretation because such semantic theories seem as if they must be the plainest form of common sense; once we get power we will rule exactly how we please.” Or, “it’s not what you say, it’s what they hear” - a level of meta-manipulation that their doctrine explicitly invalidates (as nihilist or relativist, and in the former case at least this seems an accurate characterization).
And thus the ground often shifts from “original intent” to “textual literalism” (equally problematic absolutisms), and from “strict adherence” and “stare decisis” to novel invocations of the equal-protection clause to deny voters the right to an accurate count of their votes - or of the First Amendment to deny individuals an equal voice in the political conversation that the framers clearly thought was the central site of democracy itself. What are any of these but super-rationalist gamings of all the available data, as if combed over by supercomputer, looking for the one loophole that on any putatively logical basis might justify your position, no matter how absurd? But what other logic rules our democracy? Surely Jefferson, were he alive today, would agree with all this talk of his views, up to and including the eradication of them from educational texts.
The rhetoric proclaims that computers will democratize politics - the human control of human beings. Yet the evidence suggests we aren’t even democratizing the human control of computers.
L. The “relations of production of The Cultural Logic of Computation itself (not least in its status as a ‘tenure book’)” (5,888)
G. As if.
Altaffer, Mary. 2009. “Bernard Madoff’s Other Legacy: High-Speed Stock Trading.” Time (Aug 24 2009).
“Automated Trading.” Wikipedia entry.
“Brooksley Born.” Wikipedia entry.
Bush v. Gore (00-949) 531 U.S. 98 (2000). US Supreme Court Decision.
“Case Study: Long-Term Capital Management.” Sungard Data Systems.
Citizens United v. Federal Election Commission (08-205). 558 U.S. ___ (2010). US Supreme Court Decision.
“Financial Crisis of 2007-2010.” Wikipedia entry.
Golumbia, David. 2009. “The Digital Securitization of Labor.” Internet as Playground and Factory conference, New School for Social Research, November.
Holmes, Brian. 2009. “Is It Written In the Stars? Global Finance, Precarious Destinies.” (Nov 6 2009).
Kuttner, Robert. 2007. “The Bubble Economy.” The American Prospect (Sep 24 2007).
“Long-Term Capital Management.” Wikipedia entry.
Luntz, Frank. 2006. Words that Work: It’s Not What You Say, It’s What They Hear. New York: Hyperion.
“Madoff’s Former IT Experts Arrested Over $65bn Fraud.” The Guardian UK (Nov 13 2009).
Patterson, Scott, and Geoffrey Rogow. 2009. “What’s Behind High-Frequency Trading?” The Wall Street Journal (Aug 1 2009).
“The Warning.” 2009. Frontline episode. PBS (Oct 20 2009).
Throughout this riposte, Golumbia directly quotes and responds to assertions made by Brian Lennon in his book review of Golumbia’s The Cultural Logic of Computation and Mark McGurl’s The Program Era: Postwar Fiction and the Rise of Creative Writing. Lennon’s review can be accessed here: Gaming the System as well as via the “Riposte To” link above.
Additionally, longer quotes that provide a bit more context are also provided in the glosses below. Here, for example, is the entire sentence from which this quote is taken: “But the other filament in Golumbia’s theoretical braid, one which sits at some odds with the spirit of this poststructuralist salvage operation, is a Weberian conflation of capitalist rationalism with evangelical Christianity, which Golumbia doesn’t do much to differentiate, in The Cultural Logic of Computation, from that counter-mystification through which modern liberal and left Euro-Atlantic secular intellectuals have imagined themselves somehow undefiled by the permeative cultural Christianity they discern in the enterprises of their declared opponents.”
From Gaming the System: “Working both centrifugally and centripetally from the relations of production of The Cultural Logic of Computation itself (not least in its status as a “tenure book”), Golumbia seats the female or feminized operators of a domestic workforce democratized by war’s exigency at the controls of the computer as world-war machine, suggestively linking the feminized technocratic class of the intellectuals to the subjugation-within-subjugation of the human computer under masculinist technocratic administration.”
From Gaming the System: “Such dubiety is conspicuous in Golumbia’s inventive critique of Chomsky, which, far from accepting that Chomsky has any place on the left at all, near or far, banishes him unceremoniously to the right of The Cultural Logic of Computation’s epistemo-political fold. Chomsky as “citation champ” drives the entire rightish intellective formation of computationalism, rerouting one academic discipline (linguistics) away from its leftish culturalism and exerting regressive pressure on a host of related orders, as well (psychology, philosophy, cognitive science, and computer science).”
From Gaming the System: “Without a doubt, McGurl’s comparatively steady poise is an asset, in so far as in its best pages, the work of The Program Era invites a response remote from the customary conflict mode, with its irresistibly predictable autocritical “problematizations.” But as we have noted, that, perhaps, is only one way of marking, in its contradistinctive change, in both of these undeductibly appraisable works, the danger of ending by merely, as it were, gaming the System: the custom Golumbia marks as a “style of authority,” and for which his final example is Bill Gates and Steve Ballmer, two Harvard mathematics majors assured of their place in the ruling class, ignoring their courses, then cramming “like mad” for the final exam (199). That that risk remains in the end fairly distant, here, wastes nothing of the categorical imperative to regard it.”