A Good Offense

I’m sure I’ve held forth on this subject before, but will do so again at a friend’s pointed request. A new cousin (new to me, that is—I have found several in recent years, and it is delightful) has posted several links to my Facebook page on the subject of music education, whether or not it should be “defended,” and related issues. One of these is Peter Greene’s “Stop Defending Music Education” on HuffPo (“Defend it because music is awesome in ways that no other field is awesome.”). Music Parent Tony (from the Music Parent blog) takes the By Any Means Necessary position, observing that the gap between utopia and reality justifies the debates over funding priorities, and rightly throwing some arched-eyebrow shade on the Greene quote in the parentheses, above. And an interview clip with English stand-up comedian Stewart Lee—an unashamed artistic idealist—points up the problem with fighting battles on your opponent’s turf. If you win the argument on their terms, you still don’t win; the best you can hope for in this case is bullying someone into allotting a few more dollars in student awesomeness so those troublesome parents will go away. And given the increasing prevalence of gunmetal-gray utilitarianism amongst our legislators and even some school administrators, all that means is that they consolidate their propaganda and stomp out arts programs once and for all next year, helped (doubtless) by burgeoning anti-tax sentiment and a timely twitch in the financial markets.

I see Greene’s point, but “music is freakin’ magical”—another of his phrases—isn’t much of a defense. It’s universal it’s great it’s important it’s human it makes us who we are…look, however much I empathize, this kind of effusion doesn’t get the job done because it sounds like the stereotype of the incoherent artist, a type easy to dismiss with smiling condescension and what-alternatives-do-I-have withdrawal of financial support. Those with less artistic experience than we have, those who don’t know what we know, cannot be expected to be persuaded by ranting and arm-waving. Righteousness is of no help here, and we cannot rely upon it.

So, what to do? How to advocate for, if not defend, Music Education as a priority? I start a new semester in two days, and my School of Music has a long and prominent history in Music Education, Bands, Jazz, Orchestra, Choirs, etc.—it is a so-called “mainstage” school, and if I resented that when I first came here, I understand it better now. No, our primary focus isn’t on producing virtuoso carbon-copies of whatever Famous Teacher loans his or her name and occasional high-maintenance presence to the institution, and we are not a musical think-tank of Bœthian pretension. Now, hot players are indeed made here and I take no little pride in many of our academic achievements, but they aren’t the main focus. So, with my own School of Music in mind (what we corporately aspire to, and what we strive to send students and teachers out into the world to do), I’ll offer my own take. In this case, I think that the best defense is a good offense.

Forget test scores, forget the idiotic “Mozart Effect.” People with serious musical experience, in school or out, know the following:

1) how to take (and benefit from) constant criticism;

2) how to perform under pressure;

3) how to self-correct much faster than others;

4) how to communicate non-verbally;

5) how to work alone, with independence and self-motivation; and

6) how to work collaboratively with others, especially under highly stressful, high-risk conditions and negotiating various wild personalities.


Not enough? Here’s something more global: good parents and teachers know that if a young person finds That Thing, the thing that can consume you, you’ll never give up the feeling you get from it. Now, that Thing can change; it may well not always be Pokémon or Music or Star Trek or Baseball or Chess. But the sense of something grabbing you by the throat and dragging you along with it because it’s just that great is both a tremendous gift and as good an investment in future success, stability, productivity, and happiness as you will ever find.

Not everyone can, should, or wants to be a professional in music; there are many other aspects to life. But thinking as a potential employer, how would you feel about hiring someone who possesses the qualities outlined in my six items, above? That’s why music, theater, and art should be primary expenditures in the public education system. And, of course, young people keep up on the academics just so they can be involved with such activities—as they do with sports, which never seem to be fighting for their curricular existence. It’s a win-win-win-win-win, which is why the shriveled minds that think of the arts as an “extra” are doing exponential damage to the educational system.

We all have That Teacher—a high-school choir or band director, a piano teacher, a scholar-mentor, and the most fortunate among us have several—who passed the flame on. Music is wonderful on its own; no argument, Mr. Greene. It also makes better citizens of people—a point understood by sages from at least Aristotle to Shinichi Suzuki—and (unavoidably, folks) makes more resourceful, creative, independent, dependable workers. I see no benefit to keeping that particular light under a bushel.

Want to kick the local and national economy in the tail? Make sure the schools have competently taught and managed arts programs with sufficient funding.

Or you could go the low-tax, I-built-that, finger-wagging austerity route. Let us know how that works out for you. Our kids—musicians’ kids—will still thrive because we prioritize that. Sorry about yours; turns out hangin’ in the mall, owing to a lack of in-school options, isn’t much of a catalyst for excellence or heuristic for valuable skills, but we warned you.

My favorite line from the Boss: “No retreat, baby, no surrender.” A good year, everyone!

Posted in Academia, Current Affairs, Education | 5 Comments

The doctrine of performerly abnegation

If you spend much time around classical musicians, you will be very familiar with the idea that those who perform best, interpret least. Stick to the score and change nothing, not so much as an accent mark or slur. Don’t put your own personality into the music — your job is to be a window through which the composition is viewed, a perfectly transparent piece of polished glass. Above all, respect the composer’s intentions. You can always discover what they are if you just look hard enough at the score, and, once divined, they will always prove superior to any intentions of your own. The composer is always right, and to the degree that your own ideas are distinguishable from the composer’s, you are always wrong. The highest attainment of the performing musician is to realize the will of the composer and make it one’s own. The greatest performance is the one in which the performer disappears and the composer stands fully before us. Failing that, we should at least have a performance of the sort the composer would have heard in his own day. Richard Taruskin dismissed the latter notion as “Wellsian time-travel fantasies.” But the stronger version of this ideal, that a composer’s very self is to be invoked in a performance, resembles nothing so much as the invocation of a god.

Let’s call this the doctrine of performerly abnegation. Anyone who has spent five minutes in a music conservatory or has read sputtering one-star Amazon reviews of classical recordings knows what I am talking about. It’s omnipresent in classical music culture (or what I like to call the maestro-industrial complex). I don’t even know why I’m going to waste my time arguing against this point of view. Musicologists have been debunking it for decades, but their scholarly objections have not made the slightest difference to what the great majority of classical performers think and say. The doctrine of performerly abnegation is not a scholarly position, but a religious one, and religious ideas are not susceptible to scholarly disproof. It is hard, maybe impossible, for human beings to live up to the doctrine of performerly abnegation, just as it is for us to live up to religious ideals generally. The doctrine’s validity is vouchsafed by the strenuousness of the aspirant’s attempts to live in accordance with it. Belief, as I’ve said elsewhere, is a practice, not a one-time buy-in. To practice a belief is to sustain it against resistance; the resistance is provided by the cost the belief imposes. If you object that the doctrine of performerly abnegation puts the performer in an abject role, you miss the point: that is exactly why the doctrine is so compelling.

If the guiding ideal of classical music performance is a fundamentally religious one, then it follows that composers are its gods. This is a familiar idea, but there is one aspect of it that interests me. Gods are imagined in very different ways by different religious traditions, but it seems safe to say that gods, however much we may imagine them in our image, always occupy a different category of being from ourselves. Just how and in what respect they might be different seems infinitely variable, but we can get a sense of the difference by thinking of what kinds of interactions we can imagine between ourselves and a god, be it the God of Abraham or a minor deity from some archaic pantheon. We can worship a god, plead with a god, invoke, praise, bargain with, rage at, and even curse a god. But there seems something a bit odd about the idea of collaborating with a god. Whatever the mode of our interaction with a god, it always presumes that the god possesses some quality that exempts it from the level-playing-field assumptions that underlie all true collaborations.

A collaborator can say “I dunno, I don’t think that’s working. What if we try it like this?” A collaborator not only can challenge your idea but suggest an alternative, and if you are truly collaborating back, you are bound to give such suggestions a hearing. And you do so not out of mere politeness: if attending to a collaborator’s idea is only done for show, it’s not a real collaboration. Each party in a collaboration must be persuadable by the other, which in turn means that each must be able to change his or her mind. Which in turn suggests that each party in a collaboration must be able to entertain the possibility that the other possesses some valuable and missing insight.

This is exactly what we can’t imagine gods doing. I once heard Hans-Ulrich Gumbrecht give a talk in which he argued that the qualities we imagine of gods are always the qualities we miss in ourselves. Individual human beings are limited by space and time, so we imagine the Abrahamic God as omnipresent and eternal. Human beings do not control the weather and fear its capricious violence, so Northern European pagans imagined Thor (Donar, Thunar) as the master of storms. And so on.* Which means that a god is precisely the one entity that can never collaborate with humans. By definition, their powers begin where ours leave off. In the god’s domain, you cannot know something the god doesn’t. Why on earth would a god listen to your suggestions?

Here is my main problem with the theology of classical music performance: we assume a kind of creative relationship between performers and composers that bears no resemblance to how human beings (not gods) actually go about their business. Think about it. If you’re a performer, have you ever, in your life, created an interpretation entirely out of your own head? With no reference whatsoever to anything anyone ever told you? Obviously not. Even if you haven’t taken a lesson in 20 years, your playing still bears the marks of every teacher you ever had, to say nothing of all the musicians you ever played with, all the music you ever played, etc. Your creative work is massively, unavoidably collaborative.

The same goes for writers. At the very beginning of my book Dig, in the Acknowledgments section, I wrote this:

There is an esoteric book somewhere that lists the author as “The Interdependent Universe,” and this seems pretty apt to me. The interdependent universe is the real author of every book: no book was ever written without the support of a thousand and one connections, large and small, between its author and everyone he or she encounters in the process of writing.

If you have ever gotten anything into print, you know this is true. Nothing gets published without editor suggestions, reader reports, colleague comments on early drafts, spousal encouragement of your half-formed new idea one morning over coffee, etc. Even writing on a blog, like this one, where I don’t have an editor and can write anything I please, is conditioned by my relationship with a readership. And unless you are a deranged egomaniac, you have to admit that at least sometimes people will point out something in your writing that never occurred to you. It could be small, like the fact that you used the word “delineate” 23 times in a single chapter.


Or it could be something larger, like an unsavory implication in a certain line of argument or the possible application of your idea to a problem you’ve never even heard about.

I’m not a composer, but I have to imagine that composers learn a great deal from the performers of their music. The composers I’ve known certainly have. But classical music culture remains enthralled by the story of Beethoven scorning Ignaz Schuppanzigh’s “wretched fiddle” and heeding only his muse. This story bears out one of the great formative myths of the maestro-industrial complex — the myth of the composer-god, perched atop a holy mountain in lonely contemplation, his mind fully sufficient to itself and fully knowable only to itself.

klinger beethoven

Max Klinger’s Beethoven Monument, 1902

So classical music culture takes the composer to be something other than a human being. You and I might collaborate in our creative work as a matter of course, but the composer does not. You and I cannot know our own minds perfectly: however carefully I plan and execute a piece of writing, there will always be meanings latent in what I write that I will never notice unless someone else points them out to me. The meanings I intend will always be exceeded by the sum total of meanings available in anything I create. I call some of the shots, but I don’t call all the shots.

But the doctrine of performerly abnegation insists, against all reason and experience, that a composer’s intention perfectly matches the total of available meanings in a piece of music. That any meaning the composer did not consciously intend is necessarily an illegitimate meaning. That the composer is self-consciously aware — in the sense that I am aware that I am writing this parenthetical clause — of every meaning in every piece of music he composes.

If composers are actual, according-to-Hoyle gods, then sure, this makes sense. But if we agree that the great composers were, in the end, human beings (granted, very unusual human beings), then this picture of the composer’s mind cannot possibly be true.

The performer is the collaborator of the composer. It doesn’t make any difference if the composer has been dead for centuries. As David Byrne points out,

interpreting a written score, reading music notation, is itself a form of collaboration. The performer is remaking and in some ways rewriting the piece every time he plays it. The vagueness and ambiguities of notation allow for this, and it’s not entirely a bad thing. A lot of music stays relevant thanks to the opportunities for liberal interpretation by new artists. [David Byrne, How Music Works (McSweeney’s, 2012), 187-88.]

“Liberal interpretation,” though, is exactly what is missing from the maestro-industrial complex — is in fact systematically excluded and repressed by the doctrine of performerly abnegation.

*Please don’t take me to be dismissing these gods, or any gods, as merely the fictional creations of human beings. For one thing, there’s no “merely” about the creation of stories. Story-telling is powerful magic. And for another thing, how do you know you aren’t a figure in a story being told by some god? Perhaps a god you yourself have created through your own storytelling? I rather like the idea that the world forms as we tell stories about the gods just as they in turn tell stories about us, each act of storytelling generating the other — a paradoxical mutual co-creation something like the famous M. C. Escher “Drawing Hands” lithograph:

DrawingHands

Though please don’t think that this metaphysics is “what I really think” about the nature of the universe. I try to “really think” as few things as possible. Metaphysical speculations presented herein are for amusement only. No warranty expressed or implied. Blog may contain nuts.


Posted in Concert Culture, Performance, Religion, the Maestro-Industrial Complex | 5 Comments



credit: Jim Lears


This is part 3 of my series on magical thinking in everyday life (“You’re Soaking In It”). Part 1 is here; part 2 is here.

So last time I wrote about the little rationalist demon that sits on the shoulder of practically everyone in the modern secular West. I tried to show that this demon is in one way or another a tiresome, demanding spirit. Religious fundamentalists are hardened in their fanaticism by the demon’s provoking insistence that their beliefs are nonsense; rationalist fundamentalists are goaded to an irrational denial of their own beliefs in their attempts to appease the demon. When the demon says “jump,” the former group says “go away” and the latter group says “how high?” Either way, the demon gets a satisfyingly outsized reaction. And demons like it that way, since they are not so very different from us and enjoy the attention.

As I wrote last time, fundamentalists can only accept one truth and despise pluralism — for example, the idea that there can be multiple gods or kinds of knowledge.* They place the greatest possible value upon purity, and pluralism is impure. But most of us are impure. Depending on our mood and situation, we’re sort of religious, sort of rationalist, and, if we are completely honest, given to the odd bout of magical thinking as well. Maybe you have a deck of tarot cards gathering dust on a high shelf somewhere, or maybe you’re like Ivy in that little story I wrote a while back and, in the right mood, will make a decision out of a feeling that disparate things in your world have come together in a certain patterned and meaningful way.

But at such moments of magical thinking, the rationalist demon will come for you too, just as it does for your purer brethren. When it makes its presence felt, you don’t say “get thee behind me Satan” and neither do you snap right to attention; your response is kind of impure and mixed up as well. You might grudgingly set to tidying away your irrational notions but (like a sullen teenager told to clean his room) maybe mumble “oh, fuck you” under your breath. Or else you rebel against the demon and ignore him — but then you feel guilty about it and eventually give him his due. This is what I meant when I wrote that we tend not to trust our own experience and, when called to account for our actions, will forget whatever magical thinking might have impelled them.**

I have been writing about our demon as if rationalism and skepticism were his only qualities. Did you notice, though, that in my last post I wrote almost as much about belief as I did about skepticism? This is unavoidable, as the one term implies the other. How can you believe in something unless there is the possibility that you might not believe in it? How can you be skeptical without first having held a belief that failed you? I wrote last time about the strange symmetry between hardcore believers and hardcore skeptics, and this symmetry is due to the way each group tries to maintain a pure state of belief or skepticism, as if it were possible for one term to exist entirely without reference to the other. Their futile attempt at purity drives them to repress what they cannot recognize in themselves. As I wrote last time, the shadow side of the skeptic is belief, and the shadow side of the believer is skepticism.


The ace of spades from Marshall McLuhan’s “Distant Early Warning” card deck

So now we’ve learned something about our demon: he does not represent a single simple idea, but an idea polarized, pointing in two contrary directions. All your better sort of demons have two horns on their heads, and our demon’s horns are the horns of a dilemma — that paradigmatically modern dilemma between skepticism and belief.

You’ll notice that I’ve pushed the demon-on-the-shoulder metaphor pretty far by now. Is it even a metaphor any more? Well, I don’t mean to imply that a rationalist demon with horns and maybe a pitchfork exists in the same way as my computer exists. But at this point he is something more than a figure of speech. I am treating him as if he exists in some way (exactly how is not important right now), and, by invoking the mighty magical formula AS IF, I obtain a magical result: a very abstract idea is somehow much more easily comprehensible.

Remember Ramsey Dukes’s distinction between asking what things are and asking whether they work. If we are interested only in whether an idea works to make sense of something — for example, whether the personification of a habitual and inhibiting rationalism makes such an abstract idea easier to understand — then we can enable this line of inquiry by setting aside the question of whether it exists (or even the more philosophical question of what it means for something to “exist”) and proceeding to our investigations by assuming it does. We say, “let’s proceed as if a complex abstract dynamic within intellectual history is an entity with intelligence and volition” and let the procedure be justified by what it yields.

What I am doing is to personify a complex problem of human thought and behavior. As Dukes has repeatedly pointed out, personification is a good strategy for human beings, whose brains have evolved to be extraordinarily sensitive to the subtle behavioral cues of fellow human beings and can read phenomena more acutely when they are treated as if they are humans as well. Much of my argument in this post is potted from Dukes’s Uncle Ramsey’s Little Book of Demons, which in turn works through ideas developed in an earlier book of essays, What I Did In My Holidays: Essays on Black Magic, Satanism, Devil Worship, and other Niceties. That rather blood-curdling title hides a down-to-earth and humane notion: we do best when we negotiate with and reconcile ourselves to intelligent entities — be they demons or our neighbors — rather than blindly accept or reject them. “When you start trying to treat phenomena as if they were human you learn many things. One of them is how better to treat your fellow humans as if they too were human.” (URLBoD, 85.) “Do unto others as you would have them do unto you” turns out to be the dread secret at the heart of demonic pacts — understanding, of course, that by “demon” we mean any apparently non-sentient phenomenon to which we are according some measure of our own sentience. The more of our own sentience we accord the world, the more subtle our interactions with it.

This is the way out of our impasse: it is how we can relate to the pervasive and limiting rationalism of our age without either saying “go away” or “do you want fries with that?” Robert Anton Wilson wrote that “belief is the death of intelligence,” but as I pointed out last time, neither he nor anyone else could transcend belief altogether. Wilson’s way of handling belief was to turn belief and skepticism into entities he could invoke at need or pleasure. In Cosmic Trigger, he writes his autobiography not from the usual standpoint of some presumed unitary self, but as a shifting array of different personae: “the Skeptic,” “the Materialist,” “the Fool,” “the Shaman,” “the Reporter,” and so on. When skepticism is called for, the Skeptic takes the reins, but he is politely asked to step aside when circumstances call for someone else to step up. Each persona is treated in an as-if way: if we call up the Skeptic, we are not saying that his point of view is natural and inevitable, but only that we are treating his viewpoint in a more limited and heuristic way. We don’t believe that it is really true; we act as if it were really true. Thus we arrive at what Wilson calls “model agnosticism,” which assumes that no one model of reality can be entirely congruent with reality itself.

This is what my rationalist demon is saying to me right now:

stfu

Penn Jillette is the perfect embodiment of my rationalist demon. In real life, Jillette is a kind of Vegas Dawkins, a blowhard pseudo-skeptic who indirectly inspired the South Park satire of the New Atheist movement. But though he’s not who I would choose to play me in a Hollywood movie about my life, he is (I confess) rather like me: a big guy with a loud voice, confident in his views, taking up a lot of space in a room. Or at least Jillette is like one part of me, which is the side that likes to call bullshit on bad ideas and gets pleasure from dispatching them with a certain merry brutality. Dukes calls such a personality an “ideas yob” — sort of like the yobs who vandalize parks and beat up strangers at soccer matches, only with ideas. After years of feeling helpless, uncontrollable, and immoderate rage at the sight of trampled flowers and damaged trees, Dukes realized that “vandalism and rabid anti-vandalism represent the two horns of the same demon” and asked himself, “how could a sweet, loving person like myself … harbor anything akin to a demon of vandalism in my soul?”

It took a lot of soul searching and dialogue to discover that I am an ideas yob, in that I have a demon in me that, when it sees neat and tidy prejudices, cliched arguments set out in rows with well trimmed dogma, feels like any vandal faced with prissy flower beds and twee tree saplings. It wants to sing “Ere we go, ere we go, ere we go” and trample over the lot of them. (URLBoD 104-05.)

As Dukes points out elsewhere, every demon we harbor offers some service in return for the torments we suffer at its hands — otherwise, we wouldn’t put up with them. For me, thinking of my rationalism as an intelligent entity makes it clearer to me that, however much I am now trying to wriggle out from its powerful grip and imagine philosophical justifications for a kind of controlled irrationality, I cannot do so without his help. I am calling upon him right now, in writing this post.

manservant hecubus


He has been with me throughout my academic career, indeed throughout my life, arming me against toxic ideas and the tyranny of received wisdom. In Hermetic symbolism, intellect is symbolized by the sword — a weapon that cuts through illusions, divides truth from falsehood, and dissects all things, rendering them available to critical analysis.*** And you need that weapon, especially if you spend any time investigating irrational systems of thought, because without the ability to say “bullshit!” you make yourself prey to uncontrolled belief and also to those who can leverage your credulity for power (a.k.a. weird cults). Or you might just go insane. Magicians aren’t the only ones who think everything is connected to everything else. Paranoiacs do too.

Visualizing my rationalist demon as Penn Jillette allows me to see him a bit more in the round. He’s a fun guy to have around — sometimes. But you have to know where to draw the line. He reminds me of an old friend (let’s call him “Magnus”) who phones me sometimes. Magnus is grandiloquent, grandstanding, jovial, bursting with life and energy, supremely confident in his opinions and not terribly interested in yours. If I let him, he would talk all day, holding forth with his Correct Views on Everything. He is living proof that you do not need to be female to get mansplained to within an inch of your life. He and I go way back, and I am always delighted to hear his voice on the other end of the line, because I genuinely enjoy his good-humored damning of whatever it is that’s exercising him. A bit of this goes a long way, though, and at a certain point in the conversation I will suddenly remember an urgent appointment and (politely) shut down his monologue. But even if I were in no mood for Magnus’s conversation, I wouldn’t yell “go away” into my phone when he calls — that’s no way to treat an old friend, nor even a casual acquaintance. Neither, though, would I simply listen for as long as he wanted to talk. I have learned how to take what I value from Magnus’s friendship and avoid having my patience abused. I have learned to enjoy his worldview without taking on board those of his ideas that don’t get me anywhere.

What I am suggesting is that troublesome idea-entities, like our rationalist demon, are best handled in the same way. Knowing when to invite the demon out for a drink and when to spend time in other, perhaps more congenial company, is both a personal skill and an intellectual one. It is intellectual work conducted in the spirit of glorious impurity and messy humanity that our lives as social animals bequeath to us. It is not at all how we are trained in the academy, though, which I think is one reason why so much writing in the humanities is so flavorless, humorless, and somehow as distant from actual humanity as can readily be imagined. We allow ourselves to become besotted by the ideal of airtight logical consistency, which isn’t something you find often among human beings, or in their works of imagination.

Perhaps irrationality has a place in the humanities, because irrationality is the condition of humanity generally. And perhaps irrationality is already at home in the humanities, and we just don’t know it. And perhaps that’s not a bad thing. I want to consider this some time soon.

*I guess you could have a polytheist fundamentalist, though — someone who insists on the One Truth that there are many gods. Likewise, it has often been noted that dogmatic postmodernists believe in the One Truth that all truth is relative. But this is only a slightly more paradoxical version of monism.

**”Weird shit happens to us all the time, but if we can’t find an explanation for it that fits our education and cultural norms, we file it away in a mental folder marked ‘awkward/miscellaneous’ and never look at it again.”

***The words critic, critique, critical, etc., all come from the Greek root word krinein, which means both “to decide” and “to separate.” The symbolism whereby the cutting blade represents the intellect is very widespread: in Buddhism, Manjushri, the bodhisattva of wisdom, is usually pictured with a sword.

Posted in Magic, Philosophy | 1 Comment

A Visual Allegory for the Death of Classical Music

frontier psych

Image | Posted on by


This is the second installment of my “Magic: You’re Soaking in It” series. I don’t know if there will be a third part. I have at least one other projected series (the one on Sun Ra’s Space is the Place) still hanging fire. Well, let’s write this one and see where we are . . .

In the previous installment of this post, I argued that magic (you know, the pentagrams-and-incense kind of magic, not the sawing-ladies-in-half kind), however outré it may seem, is actually a part of our everyday experience: you’re soaking in it. Or, as the Insane Clown Posse would say,

big icp

For those that need one, an explanation.

And to make this point, I wrote a little story about Ivy, who used augury to answer an important question about the direction of her life. At the end I suggested that Ivy would probably not tell many people about her moment of personal revelation and indeed might eventually forget about it entirely. But at the time, the revelation would carry real weight.

I came up with this example when explaining to a friend what I wanted to do in my next book. Maybe something like what I have described in Ivy’s story has never happened to you, but my friend — a major intellectual presence in my field and a hard-headed, no-nonsense kind of person — clearly knew what I was talking about.

I got to the end of my story and said to my friend, so, imagine that, at the moment you made your decision, a little rationalist demon appeared on your shoulder and said “wait, what are you doing? How could you make such an important decision on the basis of chance? What possible relationship could there be between the flight of crows and whether or not to dump your boyfriend? Explain yourself!” To which my friend replied, sometimes I wish I could kill that little demon.

I thought this was an interesting response. It suggests three things:

1. The rationalist demon wins just about every time: sooner or later, we know we’ll have to report to him and account for ourselves.

2. The demon’s presence is unwelcome and his authority over us is a burden.

3. The demon’s point of view is not even useful in the kind of situation I have described.

I take no. 3 for granted. I have written a lot in the last couple of years about the ways that calculative rationality cannot account for aesthetic and existential experience, and how our dogged attempts to reduce experience along some quantified scale constitute a kind of barbarism. The meaning of Ivy’s experience doesn’t have anything to do with whatever it is our demon is going on about. Although perhaps I will need to make that argument at greater length elsewhere. We’ll see. But for now, I want to spend more time on nos. 1 and 2.

Regarding no. 1: Rationalism (that is, causal reasoning based on the known workings of the physical universe)* is the regulating background idea by which educated persons in our society judge the value of any particular notion or proposition. This is true even of the most bigoted of religious fundamentalists. It is often and rightly said that fundamentalism is an essentially modern phenomenon. On some level, fundamentalists know just as well as anyone else that their religious ideas are imperiled or outright falsified by rationalism. Fundamentalists may not seem to care much what science has to say about their beliefs, but I would suggest that they are the ones who care the most. They just refuse to meet science half-way. Their response to skeptical modernity is not that of the liberal churchman, for whom biblical tales are metaphors and scripture represents knowledge of a different order from that of science. This schema of different but non-competing knowledges is Stephen Jay Gould’s “non-overlapping magisteria.” Fundamentalists cannot accept two-ness in knowledge, though. For them, there can only be one truth.

And if their truth cannot be proven, demonstrated, or otherwise handled with the tools of rationalism, then they will place all their energy into belief. Belief is what fundamentalists use to sustain an idea in the face of rational disproof. Belief is a practice. One doesn’t simply believe in something for all time: belief is something you need to renew constantly through spiritual practices (prayer, fasting, etc.) and through participation in a community of fellow believers. The more of a headwind your belief faces, the more vigorously you will have to paddle. Fundamentalists are as dogmatic, humorless, unimaginative, and ruthless as they are because they are having to work very hard indeed to sustain their beliefs. Their beliefs are ridiculous in precise proportion to the energy with which they are maintained in the face of opposition. And the force of opposition is what sustains them in their belief. Like Tertullian, they believe because it is absurd. Their belief is the flip side of rationalist unbelief.

And they are confirmed in their desperate mission of belief because of no. 2: they experience the demands of modern rationalism as a burden. But here I should stop calling them “them”: this is something we all have in common. Rationalism too relies on a belief that must be practiced and which demands a certain cost of human energy.

It is not easy to live in a disenchanted universe. In fact, the systematic post-Enlightenment denial of intelligence, agency, and meaning in the affairs of the universe has proven toxic to human life. I have written a certain amount about this myself, and in doing so I have only added a couple of drops to the vast ocean of writing that identifies rational disenchantment as the cognitive signature of modernity and also its curse. From this point of view, religious fundamentalists and hardcore scientific rationalists of the Dawkins variety are both the damaged children of modernity. Both groups face the same yawning chasm of meaninglessness, but the one puts all its energy into negating it and the other into identifying with it.

But there remains a certain symmetry between these groups. Robert Anton Wilson writes in his preface to Cosmic Trigger I (a widely-read and entertaining late-20th-century occult narrative) that, while he has received a lot of great fan mail over the years,

not all the mail I have received about this book has been intelligent and thoughtful. I have received several quite nutty and unintentionally funny poison-pen letters from two groups of dogmatists — Fundamentalist Christians and Fundamentalist Materialists.

The Fundamentalist Christians have told me that I am a slave of Satan and should have the demons expelled with an exorcism. The Fundamentalist Materialists inform me that I am a liar, charlatan, fraud and scoundrel. Aside from this minor difference, the letters are astoundingly similar. Both groups share the same crusading zeal and the same total lack of humor, charity and common human decency.

These intolerable cults have served to confirm me in my agnosticism by presenting further evidence to support my contention that when dogma enters the brain, all intellectual activity ceases. [RAW, Cosmic Trigger I, viii-ix.]


Portrait of Robert Anton Wilson as a skinny, naked Louis CK about to go onstage and do some prop comedy.

Elsewhere in the same passage, Wilson distills that last paragraph into his best-known epigram, “belief is the death of intelligence.”

One thing both groups have in common is their tendency to refuse calling their point of view a belief. A good number of Christians refuse even to acknowledge that Christianity is a religion: if a religion can be any one of the many competing ways of understanding the divine, then Christianity cannot be the same kind of thing, because it alone is true. Or else Christianity is the only religion and the others are . . . uh, well, call them something else. My local Half-Price Books shelves Christian books in the section marked “religion” and everything else (Jewish, Muslim, Buddhist, Pagan, etc.) in “Metaphysical.” If you inhabit this worldview, you might not even acknowledge that your belief is a belief at all, much less (as I have argued earlier) a belief you have to practice and sustain in the face of your own rational doubts — the demon that sits forever on your shoulder.

Scientific rationalists say almost the exact same thing: I don’t “believe” in science, I know science to be true.** I used to say this exact thing myself. But you cannot keep belief entirely out of the picture. At the beginning of his Principles of Psychology, William James writes about the scope of his work and describes intention towards a goal (“the pursuance of future ends and the choice of means for their attainment”) as the hallmark of mental life, commenting that the question of intelligence and purpose is central to “the deepest of all philosophic problems:”

Is the Kosmos an expression of intelligence rational in its inward nature, or a brute external fact pure and simple? If we find ourselves, in contemplating it, unable to banish the impression that it is a realm of final purposes, that it exists for the sake of something, we place intelligence at the heart of it and have a religion. If, on the contrary, in surveying its irremediable flux, we can think of the present only as so much mere mechanical sprouting from the past, occurring with no reference to the future, we are atheists and materialists.

This philosophic conflict animates the quarrel between Darwinian scientists and the proponents of “intelligent design,” though I don’t want to spend any time on that, because this post is already long enough and I don’t want to get distracted from my point, which is, as I say, that you cannot keep belief entirely out of the picture. And the decision to interpret the world as having or not having final purposes is itself a matter of belief. It is not, and cannot be, a scientific proposition. It is a metaphysical idea, a statement about the universe that permits scientific thinking, not itself a statement that can be proved or disproved through scientific (empirical, experimental, repeatable, falsifiable) means. To think you can purge metaphysics from any philosophical system is to dream the impossible logico-mathematical dream of a “set of all sets,” the unshakable logical foundation whose propositions include themselves.

What I am saying is, to arrive at science, you need a bit of belief. However, since scientific rationalism cannot find a place for belief, belief is denied and repressed. It doesn’t go away, though, but goes underground, where it becomes the shadow side of scientism. Remember what I said about the shadow in my last post? “Write a list of everything you dislike most in others. Congratulations, you have just described the part of yourself you don’t like and can’t acknowledge.” Well, then.

So as I wrote earlier, fundamentalist rationalism, like religious fundamentalism, relies on a belief that must be practiced and that demands a constant output of human energy. Each fundamentalism is the shadow of the other: the shadow side of religious fundamentalism is doubt, and the shadow side of rationalist fundamentalism is belief. Most of us, probably, are not fundamentalists of either kind. But whether one is from a religious background or a scientific-atheist background or a nothing-in-particular background, any half-educated person in the modern West will still have that little rationalist demon resting on her shoulder almost all the time, monitoring her thoughts and telling her whether her thoughts are acceptable to the ruling dispensation. A religious person and a secular person will react differently to that demon, but the demon will be there all the same. And either way, that demon will stand over their experience.

We normally associate the word “superstition” with the sort of magical thinking that Ivy entertains in my little hypothetical scenario. But consider the etymology of “superstition”: it comes from a Latin root that means “to stand over.” Superstition is a kind of thinking in which a received idea stands over one’s present experience. “Step on a crack, break your mother’s back.” Really? Did you test that hypothesis? How many mothers with broken backs have you met? Your experience almost certainly contradicts the saying. But if, contrary to your own experience, you jump over the crack anyway, you are letting something you’ve heard, a little rhyme jingling in your ear, stand over your own experience. You don’t really pay attention to your experience; you make your experience fit the received idea. Superstitions are hard to uproot because they monitor and remap our reality as it unfolds. Whatever happens to us that doesn’t fit the script is shuffled off to that mental folder marked “awkward/inconvenient.”

Actual scientists are rightly irritated by the superstitious mishandling of scientific concepts in the popular press — for example, when they hear people say “science tells us,” as if science is some monolithic and unquestionable Vatican-like authority, or when they encounter the rebadged phrenology passed off as neuroscience, a.k.a. neurobollocks, that has gone hand-in-hand with the mindfulness fad. But that’s not really my point. I call superstition anything that stands over our experience and insists that we discount it for an idea we must take on faith or authority. And from this point of view, the rationalist demon that would tell Ivy not to pay attention to her own emotional experience is a figure of superstitious dread.

So am I saying, like my friend, that we should kill that little demon? No. Because magic is nothing if not an excellent way of bargaining with demons. But that subject will have to wait for another time. For now, I will simply say that while Robert Anton Wilson called belief “the death of intelligence,” he also acknowledged that he was no more able to transcend it than anyone else. How he dealt with belief, though, says much about how magical thinking works, and what it might have to offer scholars in the humanities.

*[added several days later] To split hairs: “causal reasoning based on the known workings of the physical universe” is rationality, whereas rationalism is the belief (yes, belief — read the whole post) that this style of reasoning is the sole means of establishing truth.

**Notice how often those who go along with the assumptions of scientific rationalism without thinking all that much about them — for example, the clever fellows who do the highly-entertaining Cracked Podcast — will say “I believe in science.” If you called them on it they’d probably backpedal, but this is a case of an offhand vernacular expression being the more accurate (or at least the more honest) one.



Posted in Magic, Philosophy, Religion | 4 Comments

Battlefield medicine

J. F. Martel, the author of Reclaiming Art in the Age of Artifice (a wonderful book about which more soon, I hope) sent out a tweet a few weeks ago:

I remembered this tweet when I read this week’s issue of The New Yorker, which has an essay on Andy Puddicombe, a “mindfulness guru for the tech set.” I hate every word in that clause, with the exception of “for” and “the.” But I will focus on “mindfulness,” a word whose recent misuse rivals that of “literally.” What’s at stake in “mindfulness,” as the word is now used, is meditation, and more particularly meditation conceived and used as a technology. Even more particularly, it is meditation used as “tech,” in the same sense that Scientologists use the word, as a set of mental techniques used to maximize personal success and gain.

I should say up front that I am hardly a neutral observer here, and I am hardly innocent of the self-seeking and self-delusion I detest in the current “mindfulness” fad.* I have my own history with meditation. I started meditating in 2009, and even as recently as that meditation was still a bit ooky and New-Age. When I told people I had gotten into meditation, I, hard-bitten skeptical rationalist that I was, would feel slightly embarrassed about it.** But as I write now, in 2015, meditation has attained the same place in American culture that yoga did in the last decade, and hardly anyone needs to apologize for doing it. To meditate is to be on the winning team: you’re in the same company as half of Silicon Valley and those pretty blonde girls, eyes soulfully closed, that show up everywhere on the covers of news magazines.

I learned to meditate at Sanshinji, a small Zen center in Bloomington that was founded by Shohaku Okumura, a great Soto Zen teacher who (among other things) wrote what is, for my money, the single best introduction to the Zen teachings of Dogen Zenji. I studied and meditated at Sanshinji for four years and took lay ordination there.*** I’m not saying that to establish my bona fides, though. I was a washout as a Zen student, and the only qualification I have for writing this post is my failure. Failure can be instructive, though, or at least amusing.

At the center of Dogen’s teaching is the idea of practice unmotivated by the thought of gain. The practice of zazen (Zen meditation) is called shikantaza, or “just sitting.” Just sitting is not sitting for anything; if you are sitting in order to reduce stress or something you are not just sitting. But the longer you sit, the more this “just sitting” seems like a koan, a riddle that the rational mind cannot solve. What if I am sitting in order to be a nicer person? Or to get better at meditation? Or (in the parlance of the Bodhisattva Vow) to save all beings? Still not “just sitting.” Saving all beings is the main point of Mahayana Buddhism, but still, “just sitting” itself doesn’t have a point. It’s pointless. Which is to say, zazen is good for nothing. And yet it is also the practice of “actualizing the fundamental point” — this is the title of Dogen’s most famous essay, genjokoan. The fundamental point is pointlessness . . . wait, what? If you are confused, well, then, join the club. It takes a lot of zazen to even begin to get a glimpse of what this all means. If it means anything at all. Maybe it doesn’t. Maybe meaning is not the point.

Well, whatever the point was, I missed it, even as I practiced it for hours each day, year after year. Even as I could say, with total conviction, that zazen is good for nothing, I was meditating in order to get stuff — to become nicer and happier, gain insight, to become an expert meditator, and (let’s be real here) to flatter a certain image of myself as a compassionate and insightful and accomplished person. I did not see this as it was happening. My intellectual fluency with the ideas of “just sitting” and “no-gaining-mind” itself became a point of pride that blinded me to the very fact that I was, in fact, pursuing something after all: being the very best no-gaining-mind student ever. Chögyam Trungpa has my number:

The problem is that ego can convert anything to its own use, even spirituality. Ego is constantly attempting to acquire and apply the teachings of spirituality for its own benefit. The teachings are treated as an external thing, external to “me,” a philosophy which we try to imitate. We do not actually want to identify with or become the teachings. So if our teacher speaks of renunciation of ego, we attempt to mimic renunciation of ego. We go through the motions, make the appropriate gestures, but we really do not want to sacrifice any part of our way of life. (Cutting Through Spiritual Materialism, 13.)

Ouch. None of this is pleasant to admit. But I am admitting it anyway, because those of us who have come up against hardships in meditation need to point out, to those who haven’t (not yet, anyway), that meditative practice ain’t no joke. It’s really, really hard, and in ways that are not the same as, say, the practice of a musical instrument or a profession. The “path” (as everyone calls it) is beset on all sides by deadfall traps.


such an apt visual metaphor …

But this aspect of meditation is systematically ignored in the present-day culture of “mindfulness.” Secularized mindfulness meditation à la Jon Kabat-Zinn (mindfulness-based stress reduction, or MBSR) bills itself as merely a technique, not a spiritual path and certainly not a hazardous one. But secularized as it is, MBSR still has its origins in Buddhist metaphysics. Even as those metaphysics are ignored or disavowed, they are still deeply embedded in the practices themselves. And when you repress awareness of anything in your makeup, bad things start to happen. That’s what the shadow self is.**** Suffering and its sidekick, desire for gain, have become the shadow of mindfulness.

OK, to back up, a little Buddhism 101: Siddhartha Gautama (a.k.a. the Buddha) was above all concerned with the problem of suffering. Indeed, he considered suffering the fundamental condition of human existence. Simply put, suffering, or dukkha, is the condition of wanting things to be other than they are. We want stuff (sex, money, love, security, etc.) and not having it makes us full of envy and anxiety. We hate whatever it is that makes the things we love go away — like death, for example, which makes our friends and family go away. And the thing is, everything goes away. Impermanence is the other side of dukkha: it is what dooms our loves to remain unsatisfied and our hates to be ever-renewed. In fact, dukkha might better be translated as “dissatisfaction.” It’s a dynamic force, always turning, like a wheel. Or perhaps like a two-stroke motor: craving and aversion are the up and down strokes for the engine of suffering. What is called mindfulness is the practice of cultivating an equanimity in the face of the cycles of craving and aversion. If it means anything, mindfulness means being present for all sensations, good and bad, and seeing that “good” and “bad” are simply the names we give to them.

Suffering is a bummer, though, and to market “mindfulness” to Americans (who as a rule do not enjoy thinking about suffering) the meditation salesmen need to work on the branding. The New Yorker piece ends with a telling detail. Andy Puddicombe is conferring with one of his clients, a hair salon mogul, and prescribing some new exercises based on the Tibetan Four Ordinary Foundations. This is a practice in which meditators reflect upon (1) the good fortune of rebirth in the human realm,***** (2) impermanence, (3) karma, and (4) suffering. Now, Puddicombe has training as a Tibetan priest: he knows this stuff cold. But I suspect he also knows that

  • no. 1 depends upon a mythology that secular Westerners will find off-puttingly foreign and superstitious-sounding;
  • no. 2 is depressing;
  • no. 3 is a complex and intellectually demanding teaching of interdependent causality that business-minded Westerners will find irritatingly abstract and philosophical;
  • no. 4 is really fucking depressing.

So Puddicombe renamed them. No. 1 becomes “appreciation,” no. 2 becomes “change,” no. 3 becomes “cause and effect,” and no. 4 becomes “acceptance.”

But again, that which you deny will become your shadow. You can try to turn “suffering” into “acceptance,” but it doesn’t matter what you call it, mindfulness meditation remains a practice for the systematic investigation of suffering. Business culture seems to believe that you can change the thing itself if you just change the perception of it: this is “re-branding.” (In this way, marketing is actually very close to magic.) But the thing is, even if you remove it from the terminology, temples, robes, doctrines, and ceremonial forms of Buddhism, “mindfulness,” in its very form and expectations, remains inextricable from Buddhist metaphysics. I’m not saying that I think the fundamental truth of existence is suffering; I’m saying that, if you want to inhabit that worldview, mindfulness is an absolutely crackerjack way to do it.

This might mean a couple of things. First, a whole lot of people who are picking up meditation just to de-stress and enhance their productivity might be in for trouble. If you get deep enough into this practice, you will get an up-close-and-personal view of suffering. You will realize suffering. It will no longer be an idea, but something you see moment-to-moment. This can be a good or bad thing, depending on where you are. Those who have major depression should be very careful. (I wasn’t.) It really pisses me off to see meditation marketed as a cure for depression. Some of the people who are promulgating this stupid and dangerous idea are merely naïve; others should know better.

But more likely, what it means is that suffering will simply be denied, and year by year the practice will be changed to adapt it more and more to the popular understanding of meditation as a kind of mind hack, a wetware productivity app. This is why I liked Martel’s comparison of the wildfire spread of Eastern spirituality to the Romanization of the early Christian church. It’s another good-news-bad-news situation: The good news is that a powerful way of seeing through certain delusions, along with a powerful new (to the West) style of thought, is being made available to exponentially more people now than when it was just a few Beats and hippies showing up to Shunryu Suzuki’s zendo back in the 1960s. (I might write about the already palpable influence of meditation culture on present-day intellectual life in some future blog post.) But the bad news is that, in the process of popularization, the delicate complexity of a contemplative philosophy is brutally simplified and the selfless core of its teaching is corrupted by wealth and power. A clichéd complaint, I know. But some clichés are true.

In the meantime, it suits our neoliberal overlords just fine to have their peons meditate in the break room. One of the dissenting voices quoted in that New Yorker piece, Ronald Purser, comments that mindfulness “keeps us within the fences of the neoliberal capitalist paradigm. It’s saying, ‘It’s your problem, get with the program, fix your stress, and get back to work!’” Meditation is cheap — actually, it’s free. You don’t need specialized equipment. You don’t even need someone to show you how to do it: just sit somewhere quietly and see what happens. That’s a lot cheaper than having to pay for someone’s therapy and meds. As the neoliberal regime turns the screws harder and harder in a kind of vast science-fair experiment to see how much productivity can be squeezed out of people before they just up and die, meditation becomes battlefield medicine. Patch ’em up and send ’em back out there. I always think of a George Grosz satirical drawing of a WWI medico pronouncing a charred skeleton fit for active duty:

[Image: George Grosz, “KV”]

Remember that quote from Chögyam Trungpa’s Cutting Through Spiritual Materialism: “The problem is that ego can convert anything to its own use, even spirituality. Ego is constantly attempting to acquire and apply the teachings of spirituality for its own benefit.” Replace the word “ego” with “capitalism” and see how little the meaning changes. Capitalism is the organization of the ego on a planetary scale.

I still sit zazen, by the way.

*Here’s a clue for self-development: write a list of everything you dislike most in others. Congratulations, you have just described the part of yourself you don’t like and can’t acknowledge. This is what Jungians like to call “the shadow.”

**Notice how I poked fun at hard-bitten, skeptical rationalists in a recent post on magic. See what I mean about the shadow?

***So far as I know, there is no exact equivalent to Zen lay ordination, or jukai, in Western religious traditions. It does not imply the same commitment as priest ordination (I am no priest!) but it is still pretty much the same ceremony and it involves a lot of study and preparation, including and especially the hand-sewing of a miniature robe called a rakusu.

****See note *, above.

*****That is, as opposed to the realms of animals, hungry ghosts, warring spirits, gods, and hell. OK, now some Buddhism 102: In Tibetan Buddhism, these are the realms through which we are said to transmigrate, rebirth after rebirth, and it is only in the human realm that we can practice the Dharma and get a fighting chance of breaking the eternal cycle of suffering. So the point is, you’re supposed to make the most of your lucky break and practice hard enough to ensure at least a favorable rebirth. Those schools of Buddhism that favor ideas of rebirth tend to believe that it takes many, many lifetimes of practice to achieve the cessation of suffering — in other words, enlightenment. The Soto Zen school tends to believe neither in enlightenment nor in rebirth, which makes it a tough sell to the power-of-positive-thinking American consumers of Eastern spirituality. Which makes it richly ironic that “Zen” has become the premium brand-name for pseudo-Buddho crapola of every possible description.

Posted in Current Affairs, Religion, Technology | 4 Comments

Disciplinarity and Gatekeeping

I don’t want to let Phil’s superb post pass without a couple of comments beyond the “TELL IT, REV’RUNT!” that was my initial reaction. “The horizon of a discipline,” he notes, “is inevitably smaller and more limited” than “the full range of other minds [that] constitutes the true horizon that bounds the humanist.” To choose a humbler disciplinary home over the vast green fields of The Humanities means that “you are necessarily accepting a more limited participation in the expressive realm of the human. And that seems like a drag, giving up that pleasure for . . . for what? In part, for power. It’s ‘the discipline’ that tells you what matters; you don’t get to decide.”

This can be particularly and perniciously true in graduate school. Students sometimes feel they need guidance in finding a topic, but the guidance received might have more to do with The Kind Of Department We Want To Be than with the individual student’s disposition and strengths. Phil mentioned UCLA as a standout in cultural studies in musicology, and it has indeed been so for some time. One wonders if other departments, seeking to stake out different turf, try to move students in a particular direction to strengthen that position. “They’re telling me to read up on genre,” a student once told me with a sigh, and I had the distinct feeling that this had more to do with the way her department conceived itself than with her scholarly well-being. Of course, no one department can do everything; such areas as combinatoric theory and psychomusicology are studied at certain institutions but would be impractical in the vast majority of places. In large part, though, one’s graduate school experience has a crucial effect on one’s longer-term place in the discipline, and the kinds of questions one is prepared to ask.

After I got my job here, an administrator (not in music) said in a meeting, “Graduate students don’t understand that the questions are generated in the literature, not out of thin air. That’s where you find the useful research questions.” That’s pure discipline (in the worst sense), and it is dead wrong. The real questions are the ones you’re asking, the ones no one else has even noticed. The Literature, in this context, does nothing but deaden and restrain; you, looking at The Actual Stuff, cannot believe that someone hasn’t yet asked what are to you the most obvious questions. One of the commonest closing gambits of articles, theses, and dissertations is the faux-humble “suggestions for further research,” which are invariably both (even) less interesting than whatever you’ve just fought your way through and safely within its worldview. How likely or even possible is it for an author to suggest something more interesting and worthwhile than (save us!) “the present study”? Too often such suggestions are wan attempts at making a homework-assignment-type study look like part of a bigger, more promising picture. For my money, your question should not be one that someone else suggests you ask in order to validate their stuff. It’s not that you’ll be free of the literature; of course, you have to scour it to make sure that you’ve got adequate background to ask said question, and that someone hasn’t been there first. But if all the great sages of your discipline haven’t even approached a question you find important, then put the pedal to the floor.

Still, as Phil points out, there are still boundaries and we ignore them at our own peril. “You can’t just treat these boundaries as if they aren’t there,” he observes; “where are you going to get your stuff published? If your stuff doesn’t get published, you won’t get a job, or you won’t get to keep the job you already have. That’s real. You better believe you’ve got something invested in the discipline.”

Too true, and while your research questions and topic are your responsibility, the consequences belong to you too, so your place in the discipline is a more multifaceted issue than, simply, this one research project. So it interests you; good. Is it likely to interest anyone else, to expand into something broader, to be the sort of thing a department might find attractive? Sure, anything can be great research, as potential graduate students are assured by faculty seeking to bolster flagging numbers, but if potential colleagues elsewhere find it trivial they won’t be interested in numbering you among their colleagues, and that’s not something you get to appeal, engage, or critique.

The business of becoming someone’s colleague touches on Phil’s comment that our “…discipline is constituted in the jobs we apply for: 19th-century opera, popular music and jazz, music and disability, etc. Academic appointments serve academic fields that exist so we can give degrees in them. Make no mistake, we provide a consumer product, and that product is a diploma.” Two points here: I would prefer not to slight the generalist jobs that many people apply for and only a minority actually land. Specialty—what I’ve just been discussing—is all but irrelevant in such cases. You’re a Debussy scholar? Great; do try to keep publishing, but your 4 + 4 load will include the entire music history sequence, music appreciation, upper-division courses in all periods, and the senior capstone seminar.

And to choose one from among the given examples: how common are listings that request a “music and disability” specialty? The fact that a number of people are working in a thriving subdiscipline does not mean there are jobs in it. To my eye, the specialties for which people are hired still relate rather closely to the major historical periods—she’s a rock-solid nineteenth-century scholar, he’ll anchor the medieval side of things, and let’s see if there are some secondary interests we can parlay into course offerings (Women in Music, History of Rock, Pedagogy, etc. etc.). There’s a reason for this: specialists in major historical periods are far more likely to be able to mentor a variety of kinds of research in those periods than a narrower specialist will. For the vast majority of institutions, more marginal areas are at best secondary. All the fell developments in the economic and cultural ecosystem do not amount to more hires in Music and Disability Studies or even Performance Practices, believe me. So while the intellectual question and subdiscipline you choose should not be defined by the Discipline or Literature in the sense of What Everyone Else Is Doing, never doubt that a reality principle will still obtain.

Finally, of Susan McClary: “…it’s the brilliance and dash of her writing that makes her stand out in our collective memory. There is a lesson in this, graduate students.” Hold on, graduate students—put your pens down for a second. “To achieve style, begin by affecting none,” wrote E. B. White, and that is sage advice. For me, what makes Susan McClary stand out in collective memory is not style so much as content. She was indefatigable in her sharp cultural interrogations of canonical works (e.g. the second movement of Mozart’s Piano Concerto in G major, K. 453; the first movement of Bach’s Brandenburg Concerto No. 5; the opera Carmen; and much more), and in her resistance to what she perceived as thoughtless sacralization, obedient cultural idol-worship, and the safe, lazy concept of the autonomous artwork, which absolved later listeners from any awareness of the cultural contexts that historical composers and their listeners took for granted.

I personally found her writing to be searing and bilious, but that’s not the point. The lesson I would stress for any graduate student is to write like yourself, not like Susan McClary or anyone else. Say what you have to say, but don’t lard it with “brilliance and dash”; simply play your game, as the sports announcers say and I never tire of repeating. Your game. In the vast landscape of American academia, the “searing” written idiom has receded, as has cultural criticism itself to a certain extent, and I have to wonder if it was the bitter tone that eventually wore everyone out. Musicology departments rarely exist in a vacuum, and most music schools still seek out that body of applicants who had a fantastic band conductor or choir teacher and simply want More Of That Sound, rather than those drawn to vinegary critiques of our art.
