The Cold War Never Ended III

So last time I wrote another installment of my “Cold War Never Ended” series and tried to unravel something of the weird collective psychology surrounding the Cold War. Norman Mailer covered much of this ground in his 1957 essay The White Negro, which you probably know (if you know it at all) as an overheated and arm-waving piece on hipsters and race.*

Mailer going for the hipster look circa 1957.

Now, you might not give a damn about hipsters and probably find Mailer’s racial attitudes retrograde and weird, but The White Negro is still required reading, because (for my money, anyway) it is the best thing ever written on the Cold War as a phenomenon of mass psychology. Mailer defined the Cold War not in geopolitical terms, but as a unique event in the history of human consciousness:

Probably, we will never be able to determine the psychic havoc of the concentration camps and the atom bomb upon the unconscious mind of almost everyone alive in these years. For the first time in civilized history, perhaps for the first time in all of history, we have been forced to live with the suppressed knowledge that the smallest facets of our personality or the most minor projection of our ideas, or indeed the absence of ideas and the absence of personality could mean equally well that we might still be doomed to die as a cipher in some vast statistical operation in which our teeth would be counted, and our hair would be saved, but our death itself would be unknown, unhonored, and unremarked, a death which could not follow with dignity as a possible consequence to serious actions we had chosen, but rather a death by deus ex machina in a gas chamber or a radioactive city; and so if in the midst of civilization—that civilization founded upon the Faustian urge to dominate nature by mastering time, mastering the links of social cause and effect—in the middle of an economic civilization founded upon the confidence that time could indeed be subjected to our will, our psyche was subjected itself to the intolerable anxiety that death being causeless, life was causeless as well, and time deprived of cause and effect had come to a stop.

The phrase that sticks with me is the last one: “the intolerable anxiety that death being causeless, life was causeless as well, and time deprived of cause and effect had come to a stop.” If you live within an abiding awareness that human time can and probably will end, and end for no reason and to no purpose, then that finitude and absurdity infects every segment of time that we can experience. What can it mean to pursue a career, fall in love, raise children, care for gardens and pets, build a community, etc., when all these things are bracketed within an immense absurdity? What must we feel when we know that we have made this absurdity ourselves?

In these circumstances, it’s hard to avoid the feeling that you are just waiting around to die, and so anything you do has the quality of a distraction from the thought of death. (It has often been said that postwar consumerism was the flip side of postwar anxiety.) You do not really believe in anything, because nothing can go anywhere. Everything has the quality of dumb-show; activities feel like reflex actions, like movements made for the mere purpose of moving, not for any particular end. If there’s no future, what’s the point of doing anything now? Why work for a promotion? Why save for junior’s college fund? And yet here I am, doing things, filing reports or cooking hot dogs for the kids or taking the train to work, as if any of this will be here in 5 years, or even 5 months. The world becomes a little less real.

Under such circumstances, you’d think we’d all be running around having orgies and doing coke and murdering our enemies and generally living it up, but no, it turns out you just keep doing the usual stuff, just hanging around, waiting for the executioner to deliver the fatal blow. Maybe we would be living it up if we knew when that blow would fall, but the hellish ingenuity of the trap is that you can’t know. It might fall anytime, or never. It’s like the threat uttered by the Mystery Man in David Lynch’s Lost Highway: “In the East, the Far East, when a person is sentenced to death, they’re sent to a place where they can’t escape, never knowing when an executioner may step up behind them, and fire a bullet into the back of their head.”


It turns out that, in the Cold War, the fatal shot was never fired. But how would you have known that in 1957? So you couldn’t even give yourself over to real excess. But neither could you live in a present made meaningful by its place in an unfolding pattern to be realized in the future. Mailer admired hipsters because (so he thought) they were the ones who really did live as if there was no tomorrow; they just cut through the Gordian knot by accepting the executioner’s terms. But to do that would be to turn one’s back on everything, on career, family, home, security, everything. And it turns out that not too many people have the stomach for that. Mailer saw quite clearly that the Cold War was above all a challenge to the courage of human beings, and perhaps the most intolerable thing about it was that we could see, just as clearly, that we were failing that test. It’s easier to carry out a dumbshow version of our lives, to perform an unmeaning series of rote actions intended to simulate a purposeful life, than to live out the truth of our condition.

This is the psychological condition that Mailer wanted to understand. The question his essay sought to answer is the obvious one: what is to be done? How can we make ourselves at home in this impossible state? Mailer’s answer was that hipsters alone could live within this collective psychosis, since they were themselves “psychopaths,” which is to say, unmoored from time, purpose, history, and conscience. I will admit that I find this thesis interesting but basically wrong, for reasons I discuss at length in my book. But for my purposes here I’m only interested in his insight that the Cold War is primarily a historical moment of mass consciousness and only secondarily a military/economic/political clash.

The Cold War never ended, and neither has this series. Stay tuned for installment no. 4.

*BTW, check out Frank Conniff (you know, TV’s Frank) pretending to be Mailer, bantering with Lord Byron (Dana Gould), and reading from The White Negro on Paul F. Tompkins’ Dead Authors Podcast. LOL, as the kids say.

Posted in Cold War, Criticism, Hipsters, Historiography, Politics

The Cold War Never Ended, II

OK, so what I was saying last time was that


Actually, from one point of view, saying that the Cold War never ended is not so far from common sense. Putin’s Russia has been pursuing the same kind of authoritarian and bellicose policies as his Soviet predecessors, and op-ed columnists are saying it’s the Cold War all over again. And maybe there’s something to that. But I’m not saying that the Cold War has returned, I’m saying it never ended.

We ordinarily understand the Cold War as a geopolitical conflict between adversaries motivated by that characteristically modern thing, ideology. Or perhaps it was not so modern at all; the conflict of social philosophies might find its echo in the religious wars of the Middle Ages. But no matter; we understand that this was a conflict over ideology between adversaries equipped with weapons that, if used, would destroy all human life forever. This, too, was a modern situation, and unarguably so: many times before had the religious imagination conceived the end of the world through the intervention of some god, but at no other point in human history had the end of the world been grasped in a scientific way.

Science is (among other things) a way to understand reality as reality. Perhaps its greatest single contribution has been the idea that there is an “objective truth” that brackets and transcends individual human experiences (which are “subjective”) and in which things exist and events happen whether or not people are there to experience them or believe in them or otherwise participate in their reality.*

And what was different about the end of the world that nuclear weapons promised was that it would take place in that realm, in objective reality. This meant mass death for real, the final death, no kali yuga and the turning of the great wheel back to the beginning, no Last Judgment and afterlife, but a kind of Satanic inversion of the Last Judgment, with the arrow of time going thunk into the end of human history but without the long sought-for goal of theophany, without any goal at all, in fact, without meaning and without redemption. The scientific worldview had given the individual this idea of death (which was hard enough), but now death-as-the-end, the true terrifying nothing of death, was encountered on the species level.

What nuclear mass death promised was very modern indeed: human death without human meaning. Death would come at random: someone somewhere in the vast military-industrial apparatus would screw up, push the wrong button or something, and the missiles would fly. You would never find out what it was, of course, because you would be dead. So all you knew was that you could die at any time, with no warning, for no cause that you would ever discover, and that your death likewise would cause nothing to happen, because all causality, all the affairs of human beings, would have come to a complete and eternal stop.

And above all there was nothing you could do to stop this from happening. The Cold War created vast superstates, but it also created the possibility for its citizens to glimpse (however fleetingly and partially) their true vastness. The Cold War state was its own aesthetic, a Cold War sublime. The state had its own ends, and if they were not yours it didn’t matter, nothing you could do would budge the state one iota from its blind and irresistible movements. And if you tried to deflect the state from its course, to try your strength against this Leviathan and fight its purposes, nothing would happen except that, for a terrible moment, the state might return your gaze, and you would then see (just for an instant) the true size and strength of the animal you had just provoked.

The state, then, was the god of the new age, or rather the demiurge: a blind, cruel, destroying god that (through some meta-historical reversal or enantiodromia) had reincarnated the God of the Old Testament, “the great fuming dyspeptic God who raged round his punishment laboratory.”** But the Biblical God is bigger than any destruction He can propose; He will always outlast us. When the bombs drop, though, the state will die like everything else. The state is godlike in its power and reach but in the end is merely what Freud called the prosthetic god, the human being extended and amplified by his technologies, like Ripley in her mech suit.

Being destroyed by God is comforting, in a way: at least He noticed. At least He cared enough to kill me. But the terror of the Cold War was the possibility of death at the hands of the prosthetic god we had made of ourselves, the state, which had taken the place of God. Maybe we killed God, or sent Him packing, His services no longer required. Most likely, He never existed in the first place. But (and this is the worst thought of all) maybe He did exist, and maybe He once cared for us, but that’s over now: He’s given up on us, He’s walked out and left us alone to kill ourselves. Deus absconditus.

*Then again, maybe you don’t think that anyone had to invent that way of thinking—it just seems unproblematically there, like clouds and feeling hungry at lunchtime. “But isn’t reality, uh, reality?” I hear you asking. Well, no, for most human beings throughout most of history this is a very strange way to view the world—actually, an unthinkable one.

**Anthony Burgess, Little Wilson and Big God, 59.

Posted in Cold War, High Weirdness, Politics, Religion

The Cold War Never Ended I

*shuffles papers awkwardly after long and unexplained silence*


After he got shot with a pink laser beam from space, Philip K. Dick realized that time was an illusion created by some malign force to blind us and keep us in bondage, and that we are living within a “black iron prison” of occlusion, unable to perceive the true nature of reality. Time had in fact ceased in 70 A.D., and everything that had transpired since was just an illusion. In fact, it was still some time in the first century of the Roman Empire. This is why Dick liked to say that “the Empire never ended.”

R. Crumb, detail from “The Religious Experience of Philip K. Dick”

I guess that sounds kind of weird when you just write it all down like that. I’m not really going to write any more on the nature and possible explanations for Dick’s “2-3-74” theophany, especially as it has been written about so well elsewhere. (Go read VALIS.) What interests me here is the resonance of that phrase, “the empire never ended,” and the weird kind of time-perceptual hiccup it induces. What if things we thought were history were really still present? In a sense, that happens all the time. Causality has a long reach, and we’re still dealing with fallout from the Cold War, the World Wars, the age of empire, even (more distantly) the Roman Empire. But that’s not the same thing as saying the Roman Empire never ended. The latter claim only really makes sense if (a) we are “occluded” (one of PKD’s favorite words), i.e. we have been deceived in some way so comprehensive it amounts to some kind of brainwashing or Matrix-like virtual reality long con; or (b) the power that might wish us to remain occluded has not messed with our perceptions, because it has not had to. It has simply disappeared from sight and continues its unofficial work through its official proxies—though of course to maintain the illusion of not existing, the secret sources of power must pull the odd string and kill the odd witness. The first explanation is prime PKD thematic territory; the second is the stuff of Illuminati conspiracy theories.

This kind of thinking is not very respectable for a humanities academic. I can get away with writing about it in my usual historicizing way, but if I start actually making this kind of argument myself I will revoke my scholarly immunity and find myself exiled to Cranksville, shelved in the New Age/Occult/Paranormal section of the bookstore. But what the hell, I’m going to go right ahead and make exactly this kind of argument. I’m not 100% serious, but then I’m not totally kidding, either. Think of this as what Marshall McLuhan called a probe, a thought-experiment whose usefulness we may measure not by asking “is this really true?” or even “are you kidding me?” but “is it useful for stimulating fresh thoughts and perceptions?” And in the end I want to suggest that this argument is not quite as weird as I am making it out to be at the front end. Maybe something of this idea can be assimilated to respectable thought after all. But we have to start out weird.

Actually, no, we don’t have to, but it will be more fun if we do.

So here is my thesis in occult historiography:


OK, there’s a teaser (for anyone who’s still reading, that is. Are you still reading?). This whole thing (much of which I’ve already written) is pretty long, so I’m going to post it in installments over the next few days.

Posted in High Weirdness, Historiography, Politics | 2 Comments

Across the Wide Missouri

Tonight at Sabbath services, while we said the Kaddish—the prayer commemorating people’s deaths—I thought of James Erb, friend and colleague at the University of Richmond, a small, titularly Baptist college in Richmond, VA. I was at UR for my first real job(s), the two years after I finished at Stanford: two successive sabbatical-replacement Visiting Assistant Professorships, first as piano faculty, then as musicologist. (And thus the eleventh-hour change of direction was made, despite my D.M.A.) Former students from the University of Richmond have alerted me to the fact that Jim died on November 11, and I’ve been thinking about him.

You know the folksong “Shenandoah”? The heartbreaking choral version that ends with a canon? That’s his arrangement. “I got lucky, really,” he told me once. “I’ve spent my life trying to get another success like that.” I’ve heard several of his arrangements, and they were all superb; that particular one just hit the sweet spot. So? He spent a wonderfully productive life conducting choirs at UR, interrupted by four years at Harvard when he got a Ph.D. in Musicology with a dissertation on Lassus. The Richmond students who remember him talk about his artistic standards, his inspiration, etc. He was my office next-door-neighbor my first year, and I have some memories of my own.

First, the interview. It was only a one-year position in a small, liberal-arts music department, and he was not present for the interview. So he insisted that he be allowed to drive me back to the airport, because he wouldn’t get to meet me otherwise. I—a baby doc just out of grad school—was suitably impressed, and we hit it off: both of us voluble, excitable, delighted to discuss music we liked, with ironic senses of humor, etc. After I got the job, we next spoke at length at a music faculty mixer: one of those delightfully convivial, alcohol-lubricated conversations where shared musical tastes are discovered. I was telling him how much I loved some of Rossini’s Sins of My Old Age, especially “La Passeggiata,” which I had sung in choir when I was a sophomore at Cal Poly Pomona, under the late (lamented) voice teacher and choral conductor Charles Lindsley. “Oh, I’d love to do that with my singers,” said Jim, “but the real problem is that the piano part is so damned hard…” Well. That was my cue. “I’d love to do it!” I said. “That’s a great piano part!” So I learned it and played it, with the chamber singers, on their concert, and they all came in concert dress and did it on my piano recital in December. The piano part is a real kick, and it was fun to work with Jim and the singers on this. This is precisely what colleges and universities are good for: partners in crime, people who begin conversations with “Y’know what I’ve always wanted to do…?” and off you go.

During my two years there, I observed how much the students loved him (“He’s. So. awesome!” said one, as he brushed by her, speaking in a Mel Blanc voice), how they’d go to the gates of hell for him. This is your successful conductor, choral, band, or orchestral: the job description is “make them play or sing better than they actually are.” I also saw him once several years later at an AMS meeting, with Debbie and baby Ben, and we had a fun breakfast chatting. But that isn’t the main point: I have a specific debt to acknowledge.

In Fall 1993, I—white male phenotype in the dog days of Affirmative Action,* D.M.A. and not a Ph.D., years on the job market already—was despairing of ever getting a job, and was in a very dark frame of mind. We had a baby, Debbie was a grad student, doing her best to shore me up emotionally…and she informed me that I would continue applying until I found a job, because she didn’t intend to live with me if I was unhappy; she’d done that before and didn’t like it. (There is a traditional Jewish blessing, the ayshet-chayil: “A virtuous woman, who can find?” It’s a passage from the Book of Proverbs. I’m here to tell you, and others who know about rock-like support when needed will agree with me: it’s real. No joke.)

So there’s this fairly late-breaking job at the University of Northern Colorado. It was year-to-year, not even tenure-track (that happened later), but it was one of, I think, three positions nationally that entire year. I thought it was just another sabbatical replacement, so I initially ignored it, but Debbie forced me to apply. And I got an interview, and hit it off with everyone pretty well, and things looked good.  UNC was critically interested in new blood on the history side, and wanted to be sure—so the Chair of the Search, as it turned out, knew some people in Richmond and made some calls. One of his friends had actually played in an ensemble with me, and said good things, for which I was grateful. But more importantly, Jim Erb—may his soul sail on to glory—heard that I was being considered here, and made an unsolicited call of support. “I want to talk to you about Jonathan Bellman,” I heard he said. And he sang my praises, as colleague, pianist, academic…. And I got the gig. And the rest, as they say, is history.

The effect of a call like that cannot be overestimated, particularly when one considers that UNC eventually became an ideal fit for me, then (when she got her job) for the thrice-feared Dr. K, and for our family. Jim Erb, a busy man, did not have to make that call, and I believe it happened at a key time. Thus, the door opened, and here we are. I will tell that story for the rest of my life, always with this moral: nothing worthwhile ever goes strictly according to protocol. There is always a break, a glitch, a recovered fumble, a something…with an angel there to make it work out right.

Jim’s “Shenandoah” was the last thing we heard before hitting the road for California. We were unemployed, apprehensive, and frankly scared about some troubling but inconclusive medical reports on the pregnancy-in-progress (which amounted to nothing, thankfully). Final packing of the car, emptying of the apartment, and Jim’s song comes on the radio. We stopped, listened, and Debbie said in a choked voice, “That’s it. Turn it off,” and we hit the interstate, having heard the single most appropriate Leaving Richmond song possible. And four months later, we had to fly back to Richmond for Ben to be delivered in a Richmond hospital (medical care COBRA payments; you don’t want to know), and Jim insisted I borrow his brand-new Camry for that long weekend, while he was gone at the AMS meeting in Pittsburgh. Lemons there were aplenty, and the Richmonders made us lemonade of the sweetest kind, Jim at the front of the line.

As we say in Yiddish: !פורן געזונד  Go well! Soar, journey. Your menschlakhkayt and humanity have always been a model for me of How One Acts, What We Do. Thanks from the bottom of my heart. Ave, atque vale!

*Don’t even bother getting offended. That was the reality, and plenty of people with far fewer qualifications but different plumbing got tenure-track gigs while I didn’t get phone calls. You’re welcome to your opinion of the justice of the situation, but it was demoralizing as hell for those of us on whom, in many ways, the rules had been changed. I was there, and I saw it, and I’ll never forget how I felt during that period.


Posted in Ave Atque Vale | 3 Comments

On Disparagement

“Ironic dismissal of passionate commitment to ideals is really just a more sophisticated way of being lazy.”  —Christopher J. Smith, Prof. of Music at Texas Tech, Director of the Vernacular Music Center, Balor of the Bouzouki

When my wife was growing up in Sunnyvale, her favorite music store was in Palo Alto, a shop called Melody Lane. In the mid-1980s, just after we were married and had moved back to the west coast, we stopped in there for a piano score and the youngish guy behind the counter asked what I was doing. I smiled and happily explained: I’d just gotten a job accompanying for the San Francisco Ballet school. “Great!” he snorted sarcastically. “We’ll see how that goes…” This rankled—I mean, a job playing music, with benefits? That’s something of a holy grail, whenever and wherever. Debbie and I noted it, discussed it, and life went on. Two years later, I got into the Stanford DMA program, and again found myself at Melody Lane to buy something. Same guy, same response: embittered judgment, dismissal, etc. “Great. Good luck with that.” From that moment on, I made it a point to be utterly jolly and excited about whatever I was buying there (“Hi! I’m looking for a two-piano version of that Liszt concerto that Jay Rosenblatt discovered and reconstructed! Maybe I’ll get to learn it!!”) just to make him grumble and belittle. This went on for years. I’d go in there gleefully, knowing that any excitement I showed would provoke a Pavlovian response of disparagement. Or maybe not even Pavlovian—for me, it was more like poking the brain of a pithed frog to get it to twitch.

The Gentle Reader may be forgiven for considering me to be sadistic, here, though I don’t believe it to be so. The principle behind this strategy had been taught to me by the clarinetist in my chamber group at the University of Illinois, in 1981–82: we were assigned a coach who would only carp and nag, considering herself far above musicians of our mean caliber, and the violinist and I were getting discouraged. “Oh,” smiled Kurt, “watch me. I put myself down, and she can’t help piling on. Everything I say, she adds something mean, so I make her keep doing it. That way, I’m in control!” So we tried it. “Well, if I could figure out my fingering…EVER…” “Screwed up my bowing…AGAIN…” And our coach was helpless. Every self-deprecating remark we made was followed by a snarling insult from her, and we began to have a good time—she became our unwitting marionette. It was hilarious, and (yes) she’d asked for it: pissing on students is not “coaching,” regardless of which impressive institution you studied at or how good you think you are.

To this day, I adore our piece, Bartók’s Contrasts, which holds a special place in my pedagogical history.

I’d gloss the comment of my feared friend Chris to this extent: ironic dismissal of passionate commitment to ideals, or indeed to anything, is as unsophisticated as anything on earth—simply a sneering “Huh-uh, no you can’t” with more syllables. Beyond being lazy, it is cowardly: the tacit acknowledgment that someone’s commitment, passion, and action have called you out, and cast your ironically superior pose into the light for what it is.

It follows, somewhat counter-intuitively, that such put-downs should be welcomed, because they tell the aspirer something very important about the critic. Someone snidely puts down your efforts, or ideals, or aspirations, or beliefs? Somewhere in your soul there should be a radiant smile, because you’ve now learned something very important.

Posted in Ethics, whatever

Exoticism and Racism and the Whole Damn Thing

Writing for the Atlantic group online (initially Quartz, “a digitally native news outlet launched by Atlantic Media in September 2012—it provides a 24/7 digital guide to the new global economy designed to serve business professionals who travel the world, are focused on international markets, and value critical thinking,” later on the Atlantic website itself), Far East Specialist Gwynn Guilford takes on musical exoticism in “It’s Time to Stop Using ‘Exoticism’ as an Excuse for Opera’s Racism” (July 23, 2014). My suspicion is that Guilford did not provide this title herself, because the article is more nuanced than that. For this I give her credit, and I also give her credit for consulting with and quoting authentic jan-yew-wine musicologists. Still, I can’t help but wonder where someone with no stated musical or theatrical background gets off writing so ambitiously about what ought to be happening in opera. I may have opinions about (say) the Chinese government’s approaches to environmental and food safety, but I’d be setting myself up for a pretty thorough drubbing were I to go public with such thoughts, given my lack of background. As usual, an expectation of disciplinary preparation doesn’t seem to apply to the arts.

Guilford’s starting point was worse, far worse. The columnist Sharon Pian Chan had written a July 13 article in the Seattle Times about a forthcoming production of Gilbert and Sullivan’s Mikado, titled “Yellowface in Your Face,” which observes that “The opera is a fossil from an era when America was as homogeneous as milk, planes did not depart daily for other continents and immigrants did not fuel the economy,” continues with the dark and highly problematic observation that “the caricature of Japanese people as strange and barbarous was used to justify the internment of Japanese Americans during World War II,” and concludes “But this production? This is the wrong show—wrong for Seattle, wrong for this country, and wrong for this century.” And, suddenly becoming a dramaturge, she counsels, “The Seattle Gilbert & Sullivan Society could, for instance, partner with the Asian-American theater group Pork Filled Players to reinterpret the opera.”

Yes, the art world should be grateful for her informed advice. Briefly:

Gilbert & Sullivan were English, and the entire purpose of the high-camp invented “Japanese” was to satirize English politics and mores. Nothing at all to do with the U.S. So, the stuff about the homogeneous U.S. and the internments? Entirely irrelevant—the stated rationale for internment was fear of potential disloyalty and fifth-column activity in case of a Japanese ground attack on the west coast, not at all what she is claiming—but thanks for feeling the need to spew it and blame a country and culture entirely uninvolved. When the character Yum Yum says

Yes, I am indeed beautiful! Sometimes I sit and wonder, in my artless Japanese way, why it is that I am so much more attractive than anybody else in the whole world. Can this be vanity? No! Nature is lovely and rejoices in her loveliness. I am a child of Nature, and take after my mother.

…she is satirizing any number of things, none of them being actual Japanese. It is true that opera is an artistic artifact that touches on myriad aspects of its own culture, but it is likewise true that two-dimensional, anachronistic critique of operas a century and a half old makes the critic look stupid, not the opera.

I might point out, for those who remember the old National Lampoon satire magazine, that they used to excel at this. Anything having to do with Black people was a ludicrous send-up of urban white fears. “Help! Negroes!” characters would say, when Black people so much as appeared, and the satire got far subtler—ridiculous, outsized takes on sexual mythology, etc. It would be easy to criticize the magazine for its use of stereotypes, but the stereotypes were the entire point: this is what you, reader, might be thinking, so we’ll blow it out of proportion and make you sob with laughter. At yourself!

Sharon Pian Chan wanted an excuse to write about “Yellowface,” and so she did. Awkwardly enough, her closing suggestion about teaming up with an Asian theater group seems to echo the Miss Saigon kerfuffle, where demands were made about Asian actors being cast in the American production of that musical—including lead Lea Salonga, who as a Filipina was apparently the wrong sort of Asian. It ended up going nowhere, after a lot of press, and looked like little more than a hiring shakedown: our people could use the work, so here is a windy, self-righteous argument that on closer inspection looks like a defense of/insistence upon stereotyped casting, as long as lead roles are involved. It seems awfully close to a demand for racially “appropriate” casting at a time when many directors are abandoning such two-dimensional approaches, and indeed when many artists of Asian descent seek to go beyond such roles—not a comfortable ideological position, it seems to me, to stake out.

Guilford’s main point about traditional “Asian” roles is stated toward the beginning:

The funny thing is, many more serious operas—Madame Butterfly and Turandot come to mind—do exactly the same thing. And it’s always been done that way. This is peculiar behavior for an industry said to be “dying.” When directors preserve cultural cliches simply because they were exotic a century ago, there’s an opportunity cost to those choices: the chance to move audiences anew. The tighter they cling to tradition for tradition’s sake, the more they rob the world’s most powerful art form of its relevance.

“Said to be dying” is not much for an outsider to build a case on: the precarious existences of opera companies have much to do with boards, financial (mis)management, and so on. Interest in opera as an art form flourishes in universities and in many smaller companies, and before we roll our eyes about externally supported museum culture we should remember that symphonic wind ensembles flourish there too, and their military, civic, and pedagogically entwined history is the very opposite of an elite museum culture. Further, a call to “make it more relevant” is a fairly naïve thing to make—“relevant” concepts of (say) Shakespeare plays are hit-and-miss, depending on the quality of the concept and how thoroughly it has been thought through. So, relevant to whom? This one writer? There are a couple of further issues, too:

Opera itself is about nothing if not stereotypes. Cultural stereotypes, yes, like the meek, dependent Asian female. But what about gender stereotypes? Violetta in Verdi’s Traviata, Tosca in Puccini’s eponymous opera: flawed past, too much “generosity” (insert primal scream here—oh, and did I mention Grizabella in Cats?), a possible chance for redemption in a harshly judgmental environment. These are hardly original, nuanced characters, yet one doesn’t hear gender representatives demanding rehabilitation; such stereotypes are the very essence of the entertainment form. Theatrical make-up is, after all, the extension of ancient Greek masks: exaggeration of features to help the audience follow the conceit. Hence Black actors in the mid-twentieth century putting on make-up to do Blackface, manly men wearing make-up that makes them look more like pitiless tyrants (another operatic stereotype), and so on. The innocent girl, the ardent lover, the disapproving father—these are all cultural two-dimensionalities without which opera would not exist.

And for everyone taking offense at a racial depiction, I’ll raise you one disapproving father. Not that I’d know anything about that, of course.

The essential problem here is that the multiplicities of meaning in musical exoticism—a phrase that can become, in the hands of cruder composers and critics alike, something of a blunt instrument—are brushed aside in favor of surface meaning only. That’s hardly how art, even entertainment-art, works, and to her credit Guilford’s discussion moves in this direction: layers of meaning may be added by either intuitive or counter-intuitive ethnic casting, which would have lain well outside composers’ expectations. Nor is this a recent realization; Eastman professor Ralph P. Locke, who has thought and written about exoticism for decades, has written about the various contexts and concentrations of exoticism one finds in opera: including musical style, or not including music but including text, gesture, costumes, scenery, and so on. The various possible combinations of exotic content make for myriad interpretive possibilities. Why, then, is it necessary to take offense and proscribe this or that approach? The Seattle production followed, apparently, traditional lines (as did a stupendously good Opera à la Carte one I saw in my late teens). Why are opera’s gender stereotypes, for example, shrugged off while action is demanded on the most comic, campy, and exaggerated ethnic ones? (And yes, I’m aware that the racially insulting lyrics to “I’ve Got a Little List” have been changed. That’s a different case: they are completely non-integral to the work or even the meaning of the song—indeed, they are sufficiently dated as to dilute the song’s effect.)

Audiences are not ineducable, and my civil rights-era upbringing leads me to become especially annoyed when someone else instructs me about what I should find offensive. Often, such guardians are against innovative interpretations; in this case, by contrast, the traditional approach is to be eschewed, but a particular kind of innovation prescribed. I have difficulty understanding how this kind of pre-defined “relevance” helps the cause of any art.

To close with a fragment of a new theme: I note that popular music frequently avoids the kind of censure that seems to be fair game in opera. The Stones’ “Brown Sugar”? “A rocker so compelling that it discourages exegesis,” says Robert Christgau primly. So the almighty Mick gets a pass: the race and gender stuff in the lyrics is apparently off limits for criticism, regardless of how much the original song is played on oldies stations. I mean, come on! It’s the Stones! Don’t be like that!

Especially given how problematic the “Yellowface” oversimplification is, I’m going to say that this Gilbert and Sullivan classic gets the same privilege.

Posted in Concert Culture, Criticism, Current Affairs, Ethics, Opera, Politics | 4 Comments

How to put a dollar sign on everything on this planet

Capitalism makes everything fungible.


But it’s how capitalism makes things fungible that’s interesting. And in looking into the question, we also see the relationship between capitalism and scientific ways of knowing.

For the kind of thinking I want to discuss here, the main thing is to find the most basic unit. For the physical sciences, this quest for the smallest, the foundational particle has led to atoms, and then electrons and protons and neutrons, and then quarks, and now strings, which may or may not exist outside the mathematical equations used to describe them. When you get all the way down there, the solidity of the fundamental particle begins to waver and break, and the very notion of “particle” becomes fraught. Perhaps there is finally no real distinction between the smallest particle and the calculation of it. Could it be the same with capitalism? At first thought, the basic unit of capital is perhaps the smallest unit of currency—say, a penny. But this is naïve: stock trades hinge on fractions of cents, and fractions can keep receding into infinity. And in any event, the principles of fungibility extend even to places where no exact currency amounts are specified. How much do you have to spend to hold your table at a Starbucks? It depends. How are you dressed? Commodity relations drift from things to people—this is old news. But how do we make sense of it?

Perhaps the foundational unit of capital is not to be measured in currency but in time. Again, though, the smallest common unit of the clock can be infinitely subdivided: microsecond, nanosecond, picosecond, femtosecond. However, time subdivided is still time quantified. As with the distance traveled by the arrow in Zeno’s paradox, we can keep subdividing forever and still come up with some definite amount. The main thing is not any particular subdivision of time, but that time is subdivided. Time itself has no marks; humans superimpose the marks of the clock face upon it. The quantization of time is our invention and exists only insofar as we act as if it does. But everybody does, and this has incalculable effects on everything in our lives.

In considering the foundational unit of capital and seeking the smallest unit, we are going about it the wrong way. Whether we are talking about currency or time, what is foundational is not some particular entity but the very principle of quantization—the assumption that, for a quality (money or time) to enter into reasonable discussion, it must be put on a footing whereby it can be counted on a uniform scale. Once we have determined the scale, we can assign value. Once we can assign value, we can establish a basis for exchange. The unit of measure is just a detail.

The foundational unit of capital is not a unit at all but is itself the principle of exchange. The principle of exchange is the very fact of exchangeability. The idea that there are identity relations that could be set up between unlike things forms the necessary basis of exchange. Human presence is not the same kind of thing as a cup of coffee, and yet a capitalist society will seek always to find the common denominator between them, so they can both be entered into calculation. The value of your presence at Starbucks on a busy afternoon is variable, some complex and moving equation of the latte you just paid for plus your own social capital (itself indexed by dress, grooming, race, gender, the possession of a smartphone or laptop, etc.) plus the time of day (early morning rush or midmorning lull), itself given a value by a whole set of other equations derived from the start times and break times of the sum total of workers in a given area, etc., etc. Perhaps no human being could ever run all the numbers. But there is some instinctive rule-of-thumb by which we all operate and which we assume in the course of our daily lives, and the foundational understanding that vouchsafes that rule of thumb and makes it real in all our human interactions is the abiding awareness that the intangibles of human life are in fact not so intangible, that they have been (or eventually will be) figured out, quantized, assigned a value, placed on a scale. That somehow there is a value to be assigned to a human presence in any given place at any given time. That there is a value to human beings, and a value to everything that humans value. Everything is quantized. Or, more precisely (though more clumsily), everything becomes part of the regime of quantification.

Imagine a scene from the libertarian utopia, that dream that is to the neoliberal state what the classless society is to Marxists. In this utopia, all schools have been privatized, market principles hold inflexible sway, and the best education can be had for top dollar. Only rich people can send their children to the best schools, but this is as it should be, as wealth is an index of hard work, innovation, and entrepreneurship. Wealth is a matter of will: poor people are poor because they have failed. But wait, says someone, we cannot blame children for their parents’ idleness and stupidity, can we? And the thoughtful libertarian scratches his head and says no, I guess we can’t, but (here he brightens a little) there is, after all, no problem that the market cannot solve. And behold, a market solution appears: sports apparel corporations are willing to tattoo children with their logos and in return will fund tuition at expensive schools. So in return for turning their children into walking billboards, poor parents can afford decent education.

I doubt that, for the libertarian true believer, there is anything wrong with this arrangement. But I also doubt that most parents would happily consent to it. We feel that children have some quality that should be protected from exploitation, and however we might define “exploitation,” this thought experiment offers a pretty good example of it.

But my libertarian interlocutor would spot the weakness of that last sentence right away. What do you mean by “some quality that should be protected”? Define it! Well, uh, maybe we could call it innocence . . . Then what do you mean by innocence, and how do you propose to “protect” it? And at this point our conversation might take a predictable turn. Libertarians would try to pin me down to what specific harm concerns me, and by “specific” they would mean some measurable harm. This puts me in a bind.

On the one hand, I could say that innocence cannot be measured or assigned a dollar value, but in a positivist age (as ours surely is), such an answer is no answer at all, because qualities that cannot be measured or assigned a value cannot be said to exist at all. Or at least such qualities cannot render themselves accessible to rational speech, which amounts to much the same thing. Such talk is what the logical positivists of yore meant by “nonsense.”

But if I concede the positivist’s threshold assumption that there is no quality in human life we can meaningfully discuss that cannot be quantified, then I am forced to redefine a metaphysical term like “innocence” in physical terms—which is to say, redefining it to mean something that suits the terms of my interlocutor’s worldview. And this amounts to redefining it out of existence. Innocence with a dollar sign on it is not what I mean by innocence. Maybe not what you mean by it either. One of the things we want children to be innocent of is the inescapable aspect of valuation and exchange that torments their parents. Children need unconditional love and deserve to feel that the care they receive is likewise unconditional. If everything has a value, then love must as well. In the libertarian utopia, there is no unconditional anything. The tattoo I’m imagining in my thought-experiment is the ineradicable mark of the exchange of parental care for cash on the barrelhead. It is a mark of the supremacy of money over whatever it is we mean by “unconditional love and care.” This is why it is indecent. In the face of such indecency, I would have to insist: children deserve* protection from a world that demands something of them for the mere fact of their existence. They will learn the brutal truth soon enough; grant them a few years of innocence.

But for me to make this argument at all, I cannot place it on the grounds of quantification. To demand that I do so is to demand that I surrender the warrant of my argument, which is that you can’t put a dollar sign on everything.


This same debater’s trick — insisting that your interlocutor’s terms be redefined in such a way as to confirm the warrant of your own argument — is played by neopositivists of all stripes. Richard Dawkins is often accused of being ignorant of the religions he attacks, and he will sometimes respond by saying that one can only be responsible for knowing facts, not lies or fairy tales, and will then ask his critics to name one fact that religion has uncovered. A glance at atheist Twitter feeds will confirm that this is a popular argumentative move among Dawkins’ followers, who will sometimes turn it around and ask what facts Dawkins has gotten wrong.

[Image: Dawkins flow chart]

Fig. 1: Something I retweeted

[Image: positivist tweet]

Fig. 2: Some guy responding to the thing I retweeted

But such arguments are misleading, because they assume that facts are all that count, and in religion they surely aren’t. It is only fundamentalism—the strange twin of neopositivism—that insists that religious knowledge lays claim to facts in the same way that science does. For that matter, while there are certainly facts regarding poetry and music, poetry and music themselves don’t concern facts, either. There are a lot of music-analytical and music-historical facts about, say, Wagner’s Ring cycle, but what facts might we learn from listening to that music? Do we learn metallurgical principles from the forging scene in Siegfried? Can the Rheinmaidens teach us to swim? Of course you would have to be an idiot to expect such a thing. Centuries ago, Jonathan Swift lampooned such excesses of scientific zeal by parodying those who complained of Homer’s factual inaccuracies: “We freely acknowledge Him to be the inventor of the Compass, of Gunpowder, and the Circulation of the Blood: but I challenge any of his admirers to show me in all his Writings, a complete account of the Spleen. Does he not also leave us wholly to seek in the Art of Political Wagering? What can be more defective and unsatisfactory than his long Dissertation upon Tea? And as to his method of salivation without mercury, so much celebrated of late, it is to my own knowledge and experience a thing very little to be relied on.”** It is ironic that neopositivists, so keen on the progress of knowledge — so insistent that the only legitimate knowledge is that which progresses — should be making the same crass blunders as their intellectual forebears from three centuries ago.

One thing that this line of thought might suggest is the intimate relationship between neoliberalism and neopositivism—the latter the intellectual operating system of the former. This comes as unwelcome news for those who despise both the Christian and libertarian Right and see scientific skepticism as their salvation — people who are psyched when Neil deGrasse Tyson appears on the Jon Stewart show or who put xkcd cartoons on their Facebook walls. That’s most of my friends. That’s me, for the most part.*** If you think that intelligent-design activists are endangering American education and that climate-change denialists are endangering the entire planet — and I do — then it is natural to assume that scientific skepticism is a force of progress against the forces of reaction. We might then assume that our side is a plucky little band of dissidents manning the barricades against the hegemonic Right. It feels disloyal to question the philosophy that motivates our allies against the obscurantists. But what if, in adopting the neopositivist worldview, I were to discover that “I was not shoring up the revolutionary barricades: instead, I was cheering on the Tsar’s cavalry”?****

Here is what’s at stake in all this. Most people think of philosophy as just another abstruse academic pursuit, or worse. Unsurprisingly, neopositivists hate and fear nonpositivist philosophy, since it can undermine their arguments in ways they cannot effectively defend. But what I want to suggest is that philosophy matters, because whether or not you know it, you have a philosophy, and it is probably making you miserable. And it is probably the very philosophy you hoped would make your life better. 

It is often said that the most basic problem facing the left today is the problem of hope, or rather the seeming impossibility of imagining a future that looks any different from the present. As Fredric Jameson wrote, it is easier to imagine the end of the world than to imagine the end of capitalism. The Left finds itself confined to figuring out an agenda that seems always to have been dictated on someone else’s terms, and to shrinking its visions down to suit the confining worldview of some invisible master. I want to suggest that the master is a philosophy, a way of thinking, and that it is how our best friends think.

Somehow we have to find a way out of this. The chains that bind us are forged in the mind, and it is in the mind that they must be broken. Karl Marx famously said that philosophers have only interpreted the world, while the point is to change it. In our own time, it seems, we will only change the world when we find a new way to interpret it.

*What do you mean by “deserve”? asks the libertarian. This is perhaps the root of my quarrel with libertarianism: the libertarian does not believe in any inalienable right save one, the right to property. You have no right to clean drinking water, medicine, education, food, shelter, or anything else, only the right to buy these things. This is why the “freedom” espoused by libertarians feels so much like slavery to those who are not libertarians. Despite whatever flattering ideas libertarians might cherish about their own beliefs — and I know what I’m talking about, as a kind of libertarianism is one of the many stupid political ideas I have tried on and abandoned over the years — libertarianism is actually about control and domination. As Corey Robin has noted, even Robert Nozick was moved to write that his fellow libertarians were “filled, paradoxically, with resentment at other freer ways of being.”

**Jonathan Swift, Tale of a Tub, section V, “A Digression in the Modern Kind”

***Well, except for the xkcd part. Some of the jokes are OK, don’t get me wrong, but for the most part this cartoon, which treats illustration as a mere delivery device for verbal content, is the best illustration I can think of for the threadbare, talky, tacky, boring aesthetic sense of contemporary geek culture.

****This is from an essay titled “Why I Am No Longer a Skeptic,” by Stephen Bonds. This essay appears to have been taken down, but you can use the Wayback Machine to find it.

Posted in Philosophy, Politics, Uncategorized | 2 Comments