A Visual Allegory for the Death of Classical Music

frontier psych


Superstition

This is the second installment of my “Magic: You’re Soaking in It” series. I don’t know if there will be a third part. I have at least one other projected series (the one on Sun Ra’s Space is the Place) still hanging fire. Well, let’s write this one and see where we are . . .

In the previous installment of this post, I argued that magic (you know, the pentagrams-and-incense kind of magic, not the sawing-ladies-in-half kind), however outré it may seem, is actually a part of our everyday experience: you’re soaking in it. Or, as the Insane Clown Posse would say,

big icp

For those that need one, an explanation.

And to make this point, I wrote a little story about Ivy, who used augury to answer an important question about the direction of her life. At the end I suggested that Ivy would probably not tell many people about her moment of personal revelation and indeed might eventually forget about it entirely. But at the time, the revelation would carry real weight.

I came up with this example when explaining to a friend what I wanted to do in my next book. Maybe something like what I have described in Ivy’s story has never happened to you, but my friend — a major intellectual presence in my field and a hard-headed, no-nonsense kind of person — clearly knew what I was talking about.

I got to the end of my story and said to my friend, so, imagine that, at the moment you made your decision, a little rationalist demon appeared on your shoulder and said “wait, what are you doing? How could you make such an important decision on the basis of chance? What possible relationship could there be between the flight of crows and whether or not to dump your boyfriend? Explain yourself!” To which my friend replied, sometimes I wish I could kill that little demon.

I thought this was an interesting response. It suggests three things:

1. The rationalist demon wins just about every time: sooner or later, we know we’ll have to report to him and account for ourselves.

2. The demon’s presence is unwelcome and his authority over us is a burden.

3. The demon’s point of view is not even useful in the kind of situation I have described.

I take no. 3 for granted. I have written a lot in the last couple of years about the ways that calculative rationality cannot account for aesthetic and existential experience, and how our dogged attempts to reduce experience along some quantified scale constitute a kind of barbarism. The meaning of Ivy’s experience doesn’t have anything to do with whatever it is our demon is going on about. Although perhaps I will need to make that argument at greater length elsewhere. We’ll see. But for now, I want to spend more time on nos. 1 and 2.

Regarding no. 1: Rationalism (that is, causal reasoning based on the known workings of the physical universe) is the regulating background idea by which educated persons in our society judge the value of any particular notion or proposition. This is true even of the most bigoted of religious fundamentalists. It is often and rightly said that fundamentalism is an essentially modern phenomenon. On some level, fundamentalists know just as well as anyone else that their religious ideas are imperiled or outright falsified by rationalism. Fundamentalists don’t seem to care much what science has to say about their beliefs, but I would suggest that fundamentalists are the ones who care the most. They just refuse to meet science half-way. Their response to skeptical modernity is not that of the liberal churchman, for whom biblical tales are metaphors and scripture represents knowledge of a different order from that of science. This schema of different but non-competing knowledges is Stephen Jay Gould’s “non-overlapping magisteria.” Fundamentalists cannot accept two-ness in knowledge, though. For them, there can only be one truth.

And if their truth cannot be proven, demonstrated, or otherwise handled with the tools of rationalism, then they will place all their energy into belief. Belief is what fundamentalists use to sustain an idea in the face of rational disproof. Belief is a practice. One doesn’t simply believe in something for all time: belief is something you need to renew constantly through spiritual practices (prayer, fasting, etc.) and through participation in a community of fellow believers. The more of a headwind your belief faces, the more vigorously you will have to paddle. Fundamentalists are as dogmatic, humorless, unimaginative, and ruthless as they are because they are having to work very hard indeed to sustain their beliefs. Their beliefs are ridiculous in precise proportion to the energy with which they are maintained in the face of opposition. And the force of opposition is what sustains them in their belief. Like Tertullian, they believe because it is absurd. Their belief is the flip side of rationalist unbelief.

And they are confirmed in their desperate mission of belief because of no. 2: they experience the demands of modern rationalism as a burden. But here I should stop calling them “them”: this is something we all have in common. Rationalism too relies on a belief that must be practiced and that exacts a certain cost in human energy.

It is not easy to live in a disenchanted universe. In fact, the systematic post-Enlightenment denial of intelligence, agency, and meaning in the affairs of the universe has proven toxic to human life. I have written a certain amount about this myself, and in doing so I have only added a couple of drops to the vast ocean of writing that identifies rational disenchantment as the cognitive signature of modernity and also its curse. From this point of view, religious fundamentalists and hardcore scientific rationalists of the Dawkins variety are both the damaged children of modernity. Both groups face the same yawning chasm of meaninglessness, but the one puts all its energy into negating it and the other into identifying with it.

But there remains a certain symmetry between these groups. Robert Anton Wilson writes in his preface to Cosmic Trigger I (a widely-read and entertaining late-20th-century occult narrative) that, while he has received a lot of great fan mail over the years,

not all the mail I have received about this book has been intelligent and thoughtful. I have received several quite nutty and unintentionally funny poison-pen letters from two groups of dogmatists — Fundamentalist Christians and Fundamentalist Materialists.

The Fundamentalist Christians have told me that I am a slave of Satan and should have the demons expelled with an exorcism. The Fundamentalist Materialists inform me that I am a liar, charlatan, fraud and scoundrel. Aside from this minor difference, the letters are astoundingly similar. Both groups share the same crusading zeal and the same total lack of humor, charity and common human decency.

These intolerable cults have served to confirm me in my agnosticism by presenting further evidence to support my contention that when dogma enters the brain, all intellectual activity ceases. [RAW, Cosmic Trigger I, viii-ix.]

wilson-cosmic-trigger-ii-new-falcon

Portrait of Robert Anton Wilson as a skinny, naked Louis CK about to go onstage and do some prop comedy.

Elsewhere in the same passage, Wilson distills that last paragraph into his best-known epigram, “belief is the death of intelligence.”

One thing both groups have in common is their tendency to refuse calling their point of view a belief. A good number of Christians refuse even to acknowledge that Christianity is a religion: if a religion can be any one of the many competing ways of understanding the divine, then Christianity cannot be the same kind of thing, because it alone is true. Or else Christianity is the only religion and the others are . . . uh, well, call them something else. My local Half-Price Books shelves Christian books in the section marked “religion” and everything else (Jewish, Muslim, Buddhist, Pagan, etc.) in “Metaphysical.” If you inhabit this worldview, you might not even acknowledge that your belief is a belief at all, much less (as I have argued earlier) a belief you have to practice and sustain in the face of your own rational doubts — the demon that sits forever on your shoulder.

Scientific rationalists say almost the exact same thing: I don’t “believe” in science, I know science to be true.* I used to say this exact thing myself. But you cannot keep belief entirely out of the picture. At the beginning of his Principles of Psychology, William James writes about the scope of his work and describes intention towards a goal (“the pursuance of future ends and the choice of means for their attainment”) as the hallmark of mental life, commenting that the question of intelligence and purpose is central to “the deepest of all philosophic problems:”

Is the Kosmos an expression of intelligence rational in its inward nature, or a brute external fact pure and simple? If we find ourselves, in contemplating it, unable to banish the impression that it is a realm of final purposes, that it exists for the sake of something, we place intelligence at the heart of it and have a religion. If, on the contrary, in surveying its irremediable flux, we can think of the present only as so much mere mechanical sprouting from the past, occurring with no reference to the future, we are atheists and materialists.

This philosophic conflict animates the quarrel between Darwinian scientists and the proponents of “intelligent design,” though I don’t want to spend any time on that, because this post is already long enough and I don’t want to get distracted from my point, which is, as I say, that you cannot keep belief entirely out of the picture. And the decision to interpret the world as having or not having final purposes is itself a matter of belief. It is not, and cannot be, a scientific proposition. It is a metaphysical idea, a statement about the universe that permits scientific thinking, not itself a statement that can be proved or disproved through scientific (empirical, experimental, repeatable, falsifiable) means. To think you can purge metaphysics from any philosophical system is to dream the impossible logico-mathematical dream of a “set of all sets,” the unshakable logical foundation whose propositions include themselves.

What I am saying is, to arrive at science, you need a bit of belief. However, since scientific rationalism cannot find a place for belief, belief is denied and repressed. It doesn’t go away, though, but goes underground, where it becomes the shadow side of scientism. Remember what I said about the shadow in my last post? “Write a list of everything you dislike most in others. Congratulations, you have just described the part of yourself you don’t like and can’t acknowledge.” Well, then.

So as I wrote earlier, fundamentalist rationalism, like religious fundamentalism, relies on a belief that must be practiced and that demands a constant output of human energy. Each fundamentalism is the shadow of the other: the shadow side of religious fundamentalism is doubt, and the shadow side of rationalist fundamentalism is belief. Most of us, probably, are not fundamentalists of either kind. But whether one is from a religious background or a scientific-atheist background or a nothing-in-particular background, any half-educated person in the modern West will still have that little rationalist demon resting on her shoulder almost all the time, monitoring her thoughts and telling her whether her thoughts are acceptable to the ruling dispensation. A religious person and a secular person will react differently to that demon, but the demon will be there all the same. And either way, that demon will stand over their experience.

We normally associate the word “superstition” with the sort of magical thinking that Ivy entertains in my little hypothetical scenario. But consider the etymology of “superstition”: it comes from a Latin root that means “to stand over.” Superstition is a kind of thinking in which a received idea stands over one’s present experience. “Step on a crack, break your mother’s back.” Really? Did you test that hypothesis? How many mothers with broken backs have you met? Your experience almost certainly contradicts the saying. But if, contrary to your own experience, you jump over the crack anyway, you are letting something you’ve heard, a little rhyme jingling in your ear, stand over your own experience. You don’t really pay attention to your experience; you make your experience fit the received idea. Superstitions are hard to uproot because they monitor and remap our reality as it unfolds. Whatever happens to us that doesn’t fit the script is shuffled off to that mental folder marked “awkward/inconvenient.”

Actual scientists are rightly irritated by the superstitious mishandling of scientific concepts in the popular press — for example, when they hear people say “science tells us,” as if science is some monolithic and unquestionable Vatican-like authority, or when they encounter the rebadged phrenology passed off as neuroscience, a.k.a. neurobollocks, that has gone hand-in-hand with the mindfulness fad. But that’s not really my point. I call superstition anything that stands over our experience and insists that we discount it for an idea we must take on faith or authority. And from this point of view, the rationalist demon that would tell Ivy not to pay attention to her own emotional experience is a figure of superstitious dread.

So am I saying, like my friend, that we should kill that little demon? No. Because magic is nothing if not an excellent way of bargaining with demons. But that subject will have to wait for another time. For now, I will simply say that while Robert Anton Wilson called belief “the death of intelligence,” he also acknowledged that he was no more able to transcend it than anyone else. How he dealt with belief, though, says much about how magical thinking works, and what it might have to offer scholars in the humanities.


*Notice how often those who go along with the assumptions of scientific rationalism without thinking all that much about them — for example, the clever fellows who do the highly-entertaining Cracked Podcast — will say “I believe in science.” If you called them on it they’d probably backpedal, but this is a case of an offhand vernacular expression being the more accurate (or at least the more honest) one.


Posted in Magic, Philosophy, Religion | 2 Comments

Battlefield medicine

J. F. Martel, the author of Reclaiming Art in the Age of Artifice (a wonderful book about which more soon, I hope), sent out a tweet a few weeks ago comparing the wildfire spread of Eastern spirituality in the West to the Romanization of the early Christian church.

I remembered this tweet when I read this week’s issue of The New Yorker, which has an essay on Andy Puddicombe, a “mindfulness guru for the tech set.” I hate every word in that clause, with the exception of “for” and “the.” But I will focus on “mindfulness,” a word whose recent misuse rivals that of “literally.” What’s at stake in “mindfulness,” as the word is now used, is meditation, and more particularly meditation conceived and used as a technology. Even more particularly, it is meditation used as “tech,” in the same sense that Scientologists use the word, as a set of mental techniques used to maximize personal success and gain.

I should say up front that I am hardly a neutral observer here, and I am hardly innocent of the self-seeking and self-delusion I detest in the current “mindfulness” fad.* I have my own history with meditation. I started meditating in 2009, and even as recently as that, meditation was still a bit ooky and New-Age. When I told people I had gotten into meditation, I, hard-bitten skeptical rationalist that I was, would feel slightly embarrassed about it.** But as I write now, in 2015, meditation has attained the same place in American culture that yoga did in the last decade, and hardly anyone needs to apologize for doing it. To meditate is to be on the winning team: you’re in the same company as half of Silicon Valley and those pretty blonde girls, eyes soulfully closed, who show up everywhere on the covers of news magazines.

I learned to meditate at Sanshinji, a small Zen center in Bloomington that was founded by Shohaku Okumura, a great Soto Zen teacher who (among other things) wrote what is, for my money, the single best introduction to the Zen teachings of Dogen Zenji. I studied and meditated at Sanshinji for four years and took lay ordination there.*** I’m not saying that to establish my bona fides, though. I was a washout as a Zen student, and the only qualification I have for writing this post is my failure. Failure can be instructive, though, or at least amusing.

At the center of Dogen’s teaching is the idea of practice unmotivated by the thought of gain. The practice of zazen (Zen meditation) is called shikantaza, or “just sitting.” Just sitting is not sitting for anything; if you are sitting in order to reduce stress or something you are not just sitting. But the longer you sit, the more this “just sitting” seems like a koan, a riddle that the rational mind cannot solve. What if I am sitting in order to be a nicer person? Or to get better at meditation? Or (in the parlance of the Bodhisattva Vow) to save all beings? Still not “just sitting.” Saving all beings is the main point of Mahayana Buddhism, but still, “just sitting” itself doesn’t have a point. It’s pointless. Which is to say, zazen is good for nothing. And yet it is also the practice of “actualizing the fundamental point” — this is the title of Dogen’s most famous essay, genjokoan. The fundamental point is pointlessness . . . wait, what? If you are confused, well, then, join the club. It takes a lot of zazen to even begin to get a glimpse of what this all means. If it means anything at all. Maybe it doesn’t. Maybe meaning is not the point.

Well, whatever the point was, I missed it, even as I practiced it for hours each day, year after year. Even as I could say, with total conviction, that zazen is good for nothing, I was meditating in order to get stuff — to become nicer and happier, to gain insight, to become an expert meditator, and (let’s be real here) to flatter a certain image of myself as a compassionate and insightful and accomplished person. I did not see this as it was happening. My intellectual fluency with the ideas of “just sitting” and “no-gaining-mind” itself became a point of pride that blinded me to the fact that I was pursuing something after all: being the very best no-gaining-mind student ever. Chögyam Trungpa has my number:

The problem is that ego can convert anything to its own use, even spirituality. Ego is constantly attempting to acquire and apply the teachings of spirituality for its own benefit. The teachings are treated as an external thing, external to “me,” a philosophy which we try to imitate. We do not actually want to identify with or become the teachings. So if our teacher speaks of renunciation of ego, we attempt to mimic renunciation of ego. We go through the motions, make the appropriate gestures, but we really do not want to sacrifice any part of our way of life. (Cutting Through Spiritual Materialism, 13.)

Ouch. None of this is pleasant to admit. But I am admitting it anyway, because those of us who have come up against hardships in meditation need to point out, to those who haven’t (not yet, anyway), that meditative practice ain’t no joke. It’s really, really hard, and in ways that are not the same as, say, the practice of a musical instrument or a profession. The “path” (as everyone calls it) is beset on all sides by deadfall traps.

deadfall

such an apt visual metaphor …

But this aspect of meditation is systematically ignored in the present-day culture of “mindfulness.” Secularized mindfulness meditation à la Jon Kabat-Zinn (mindfulness-based stress reduction, or MBSR) bills itself as merely a technique, not a spiritual path and certainly not a hazardous one. But secularized as it is, MBSR still has its origins in Buddhist metaphysics. Even as those metaphysics are ignored or disavowed, they are still deeply embedded in the practices themselves. And when you repress awareness of anything in your makeup, bad things start to happen. That’s what the shadow self is.**** Suffering and its sidekick, desire for gain, have become the shadow of mindfulness.

OK, to back up, a little Buddhism 101: Siddhartha Gautama (a.k.a. the Buddha) was above all concerned with the problem of suffering. Indeed, he considered suffering the fundamental condition of human existence. Simply put, suffering, or dukkha, is the condition of wanting things to be other than they are. We want stuff (sex, money, love, security, etc.) and not having it makes us full of envy and anxiety. We hate whatever it is that makes the things we love go away — like death, for example, which makes our friends and family go away. And the thing is, everything goes away. Impermanence is the other side of dukkha: it is what dooms our loves to remain unsatisfied and our hates to be ever-renewed. In fact, dukkha might better be translated as “dissatisfaction.” It’s a dynamic force, always turning, like a wheel. Or perhaps like a two-stroke motor: craving and aversion are the up and down strokes for the engine of suffering. What is called mindfulness is the practice of cultivating an equanimity in the face of the cycles of craving and aversion. If it means anything, mindfulness means being present for all sensations, good and bad, and seeing that “good” and “bad” are simply the names we give to them.

Suffering is a bummer, though, and to market “mindfulness” to Americans (who as a rule do not enjoy thinking about suffering) the meditation salesmen need to work on the branding. The New Yorker piece ends with a telling detail. Andy Puddicombe is conferring with one of his clients, a hair salon mogul, and prescribing some new exercises based on the Tibetan Four Ordinary Foundations. This is a practice in which meditators reflect upon (1) the good fortune of rebirth in the human realm,***** (2) impermanence, (3) karma, and (4) suffering. Now, Puddicombe has training as a Tibetan priest: he knows this stuff cold. But I suspect he also knows that

  • no. 1 depends upon a mythology that secular Westerners will find off-puttingly foreign and superstitious-sounding;
  • no. 2 is depressing;
  • no. 3 is a complex and intellectually demanding teaching of interdependent causality that business-minded Westerners will find irritatingly abstract and philosophical;
  • no. 4 is really fucking depressing.

So Puddicombe renamed them. No. 1 becomes “appreciation,” no. 2 becomes “change,” no. 3 becomes “cause and effect,” and no. 4 becomes “acceptance.”

But again, that which you deny will become your shadow. You can try to turn “suffering” into “acceptance,” but it doesn’t matter what you call it: mindfulness meditation remains a practice for the systematic investigation of suffering. Business culture seems to believe that you can change the thing itself if you just change the perception of it: this is “re-branding.” (In this way, marketing is actually very close to magic.) But the thing is, even if you remove it from the terminology, temples, robes, doctrines, and ceremonial forms of Buddhism, “mindfulness,” in its very form and expectations, remains inextricable from Buddhist metaphysics. I’m not saying that I think the fundamental truth of existence is suffering; I’m saying that, if you want to inhabit that worldview, mindfulness is an absolutely crackerjack way to do it.

This might mean a couple of things. First, a whole lot of people who are picking up meditation just to de-stress and enhance their productivity might be in for trouble. If you get deep enough into this practice, you will get an up-close-and-personal view of suffering. You will realize suffering. It will no longer be an idea, but something you see moment-to-moment. This can be a good or bad thing, depending on where you are. Those who have major depression should be very careful. (I wasn’t.) It really pisses me off to see meditation marketed as a cure for depression. Some of the people who are promulgating this stupid and dangerous idea are merely naïve; others should know better.

But more likely, what it means is that suffering will simply be denied, and year by year the practice will be changed to adapt it more and more to the popular understanding of meditation as a kind of mind hack, a wetware productivity app. This is why I liked Martel’s comparison of the wildfire spread of Eastern spirituality to the Romanization of the early Christian church. It’s another good-news-bad-news situation: The good news is that a powerful way of seeing through certain delusions, along with a powerful new (to the West) style of thought, is being made available to exponentially more people now than when it was just a few Beats and hippies showing up to Shunryu Suzuki’s zendo back in the 1960s. (I might write about the already palpable influence of meditation culture on present-day intellectual life in some future blog post.) But the bad news is that, in the process of popularization, the delicate complexity of a contemplative philosophy is brutally simplified and the selfless core of its teaching is corrupted by wealth and power. A clichéd complaint, I know. But some clichés are true.

In the meantime, it suits our neoliberal overlords just fine to have their peons meditate in the break room. One of the dissenting voices quoted in that New Yorker piece, Ronald Purser, comments that mindfulness “keeps us within the fences of the neoliberal capitalist paradigm. It’s saying, ‘It’s your problem, get with the program, fix your stress, and get back to work!’” Meditation is cheap — actually, it’s free. You don’t need specialized equipment. You don’t even need someone to show you how to do it: just sit somewhere quietly and see what happens. That’s a lot cheaper than having to pay for someone’s therapy and meds. As the neoliberal regime turns the screws harder and harder in a kind of vast science-fair experiment to see how much productivity can be squeezed out of people before they just up and die, meditation becomes battlefield medicine. Patch ’em up and send ’em back out there. I always think of a George Grosz satirical drawing of a WWI medico pronouncing a charred skeleton fit for active duty:

grosz KV

Remember that quote from Chögyam Trungpa’s Spiritual Materialism: “The problem is that ego can convert anything to its own use, even spirituality. Ego is constantly attempting to acquire and apply the teachings of spirituality for its own benefit.” Replace the word “ego” with “capitalism” and see how little the meaning changes. Capitalism is the organization of the ego on a planetary scale.

I still sit zazen, by the way.

*Here’s a clue for self-development: write a list of everything you dislike most in others. Congratulations, you have just described the part of yourself you don’t like and can’t acknowledge. This is what Jungians like to call “the shadow.”

**Notice how I poked fun of hard-bitten, skeptical rationalists in a recent post on magic. See what I mean about the shadow?

***So far as I know, there is no exact equivalent to Zen lay ordination, or jukai, in Western religious traditions. It does not imply the same commitment as priest ordination (I am no priest!) but it is still pretty much the same ceremony and it involves a lot of study and preparation, including and especially the hand-sewing of a miniature robe called a rakusu.

****See note *, above.

*****That is, as opposed to the realms of animals, hungry ghosts, warring spirits, gods, and hell. OK, now some Buddhism 102: In Tibetan Buddhism, these are the realms through which we are said to transmigrate, rebirth after rebirth, and it is only in the human realm that we can practice the Dharma and get a fighting chance of breaking the eternal cycle of suffering. So the point is, you’re supposed to make the most of your lucky break and practice hard enough to ensure at least a favorable rebirth. Those schools of Buddhism that favor ideas of rebirth tend to believe that it takes many, many lifetimes of practice to achieve the cessation of suffering — in other words, enlightenment. The Soto Zen school tends to believe neither in enlightenment nor in rebirth, which makes it a tough sell to the power-of-positive-thinking American consumers of Eastern spirituality. Which makes it richly ironic that “Zen” has become the premium brand-name for pseudo-Buddho crapola of every possible description.

Posted in Current Affairs, Religion, Technology | 4 Comments

Disciplinarity and Gatekeeping

I don’t want to let Phil’s superb post pass without a couple of comments beyond the “TELL IT, REV’RUNT!” that was my initial reaction. “The horizon of a discipline,” he notes, “is inevitably smaller and more limited” than “the full range of other minds [that] constitutes the true horizon that bounds the humanist.” To choose a humbler disciplinary home over the vast green fields of The Humanities means that “you are necessarily accepting a more limited participation in the expressive realm of the human. And that seems like a drag, giving up that pleasure for . . . for what? In part, for power. It’s ‘the discipline’ that tells you what matters; you don’t get to decide.”

This can be particularly and perniciously true in graduate school. Students sometimes feel they need guidance in finding a topic, but the guidance received might have more to do with The Kind Of Department We Want To Be than with the individual student’s disposition and strength. Phil mentioned UCLA as a standout in cultural studies in musicology, and it has indeed been so for some time. One wonders if other departments, seeking to stake out different turf, try to move students in a particular direction to strengthen that position. “They’re telling me to read up on genre,” a student once told me with a sigh, and I had the distinct feeling that this had more to do with the way her department conceived itself than with her scholarly well-being. Of course, no one department can do everything; such areas as combinatoric theory and psychomusicology are studied at certain institutions but would be impractical in the vast majority of places. In large part, though, one’s graduate school experience has a crucial effect on one’s longer-term place in the discipline, and the kinds of questions one is prepared to ask.

After I got my job here, an administrator (not in music) said in a meeting, “Graduate students don’t understand that the questions are generated in the literature, not out of thin air. That’s where you find the useful research questions.” That’s pure discipline (in the worst sense), and it is dead wrong. The real questions are the ones you’re asking and that no one else even notices. The Literature, in this context, does nothing but deaden and restrain; you, looking at The Actual Stuff, cannot believe that someone hasn’t yet asked what are to you these most obvious questions. One of the commonest closing gambits of articles, theses, and dissertations is the faux-humble “suggestions for further research,” which are invariably both (even) less interesting than whatever you’ve just fought your way through and safely within its worldview. How likely or even possible is it for an author to suggest something more interesting and worthwhile than (save us!) “the present study”? Too often such suggestions are wan attempts at making a homework-assignment-type study look like part of a bigger, more promising picture. For my money, your question is not one that another individual is suggesting that you ask in order to validate their stuff. It’s not that you’ll be free of the literature; of course, you have to scour it to make sure that you’ve got adequate background to ask said question, and that someone hasn’t been there first. If all the great sages of your discipline haven’t even approached a question you find to be important, then put the pedal to the floor.

Still, as Phil points out, there are boundaries, and we ignore them at our own peril. “You can’t just treat these boundaries as if they aren’t there,” he observes; “where are you going to get your stuff published? If your stuff doesn’t get published, you won’t get a job, or you won’t get to keep the job you already have. That’s real. You better believe you’ve got something invested in the discipline.”

Too true, and while your research questions and topic are your responsibility, the consequences belong to you too, so your place in the discipline is a more multifaceted issue than, simply, this one research project. So it interests you; good. Is it likely to interest anyone else, to expand into something broader, to be the sort of thing a department might find attractive? Sure, anything can be great research, as potential graduate students are assured by faculty seeking to bolster flagging numbers, but if potential colleagues elsewhere find it trivial they won’t be interested in numbering you among their colleagues, and that’s not something you get to appeal, engage, or critique.

The business of becoming someone’s colleague touches on Phil’s comment that our “…discipline is constituted in the jobs we apply for: 19th-century opera, popular music and jazz, music and disability, etc. Academic appointments serve academic fields that exist so we can give degrees in them. Make no mistake, we provide a consumer product, and that product is a diploma.” Two points, here: I would prefer not to slight the generalist jobs that people apply for and a vast minority actually land. Specialty—what I’ve just been discussing—is all but irrelevant in such cases. You’re a Debussy scholar? Great; do try to keep publishing, but your 4 + 4 load will include the entire music history sequence, music appreciation, upper division courses in all periods, and the senior capstone seminar.

And to choose one from among the given examples: how common are listings that request a “music and disability” specialty? Fact: that a certain number of people are working in a thriving subdiscipline does not mean there are jobs in it. To my eye, the specialties for which people are hired still relate rather closely to the major historical periods—she’s a rock-solid nineteenth-century scholar, he’ll anchor the medieval side of things, and let’s see if there are some secondary interests we can parlay into course offerings (Women in Music, History of Rock, Pedagogy, etc. etc.). There’s a reason for this; specialists in major historical periods are far more likely to be able to mentor a variety of kinds of research in those periods than the narrower specialist is. For the vast majority of institutions, more marginal areas are at best secondary. All the fell developments in the economic and cultural ecosystem do not amount to more hires in Music and Disability Studies or even Performance Practices, believe me. So while the intellectual question and subdiscipline you choose should not be defined by the Discipline or Literature in the sense of What Everyone Else is Doing, never doubt that a reality principle will still obtain.

Finally, of Susan McClary: “…it’s the brilliance and dash of her writing that makes her stand out in our collective memory. There is a lesson in this, graduate students.” Hold on, graduate students—put your pens down for a second. “To achieve style, begin by affecting none,” wrote E. B. White, and that is sage advice. For me, what makes Susan McClary stand out in collective memory is not style so much as content: she was indefatigable in her sharp cultural interrogations of canonical works (e.g. Mozart’s G Major piano concerto K. 453/II, Bach Brandenburg Concerto #5/I, the opera Carmen, and much more), and her resistance to what she perceived as sacralization without thought, obedient cultural idol-worship, and the safe and lazy concept of the autonomous artwork which absolved later listeners from awareness of the cultural contexts taken for granted by historical composers and their listeners.

I personally found her writing to be searing and bilious, but that’s not the point. The lesson I would stress for any graduate student is to write like you, not like Susan McClary or anyone else. Say what you have to say, but don’t lard it with “brilliance and dash”— simply play your game, as the sports announcers say and I never tire of repeating. Your game. In the vast landscape of American academia, the “searing” written idiom has receded, as has cultural criticism itself to a certain extent, and I have to wonder if it was the bitter tone that eventually wore everyone out. Musicology departments rarely exist in a vacuum, and most music schools still search out that body of applicants who had a fantastic band conductor or choir teacher and simply want More Of That Sound rather than those drawn to vinegary critiques of our art.

Posted in Academia

Disciplinarity (Or, Musicology is Anything You Can Get Away With)

I like musicologists. I like books and journals of musicology, musicology conferences and symposia, dinners with musicological colleagues near and far, musicological gossip and chit-chat. I like the concrete manifestations of musicology in the world. But I do not love that abstraction, “the discipline of musicology.” Like every academic discipline, musicology is nice in the concrete but lousy in the abstract.

A discipline is the claim of the general against the particular, the many against the one. It represents a limit placed on individual intellectual autonomy. For this reason alone I would not cross the street to save “the discipline” if it were being attacked by a giant octopus.

octopus attack

Let’s face it, I only wrote that last sentence so I could use this image.

When you come upon a piece of scholarship that looks relevant to something you’re working on and yet also looks like it will take a tiresome lot of extra work to figure out (like if it’s something in a style that you don’t especially enjoy reading or involves things you don’t do well), invoking “the discipline” is a great way to say “not my job.” If an idea is not widely discussed in “the discipline,” you can say it’s marginal or unimportant. If a lot of people are suddenly talking about that same idea, you can say it’s “trendy.”

trendy flowchart

“Trendy,” in this case, would mean you’re saying that an unwelcome idea is in the discipline but not of it; you can’t say it’s not already here, but you can deny it full rights of citizenship. No matter what, you have a reason not to consider new ideas. You therefore also have a reason not to talk to new people.

Too bad. The primary pleasure that scholarship offers is the chance to encounter other minds and thereby expand one’s own. The full range of other minds constitutes the true horizon that bounds the humanist; nothing human should be alien to us. The horizon of a discipline is inevitably smaller and more limited. If you choose to live within the latter, you are necessarily accepting a more limited participation in the expressive realm of the human. And that seems like a drag, giving up that pleasure for . . . for what?

In part, for power. It’s “the discipline” that tells you what matters; you don’t get to decide. “The discipline” stands over your work, telling you how much it matters and where it belongs. Unless you work your way up to being the discipline, in which case I guess the reward is getting to push people around. “The discipline” is an enabling ideology, a validation of power and hierarchy. To assimilate your own values to “the discipline,” to make “the discipline” the object of your care and protection, is to take on the role of the gatekeeper. And there are always academics with a lot invested in gatekeeping.

But do we have to have gates in the first place? There is a word I’ve borrowed from Teilhard de Chardin, the noösphere, which relates to the idea that the entire world is a networked system of information within which ideas constantly circulate. A conversation between any two people within the noösphere would constitute a single strand of it; conversations among multiple interlocutors are ganglions, knots of exfoliating thought-branches. But the whole networked structure, like the structure of the internet, is just that, networked.

map of the internet

A map of the internet.

There are no gates anywhere; people talk; talk gets out. You see concentrations within a vast cloud of interconnections that perhaps relate to particular kinds of conversation, talk about politics or sports or music. But they are only regions of greater or lesser density within the cloud of connections. Nothing is really separate from anything else. Why treat music as if it were? As if it were a family quarrel or an inside joke? Again: do we need gates at all?

Of course we do. You can’t just be a professor of “various awesome things.” A professor’s academic appointment is always in some field and indeed in some particular and advertised subfield. The discipline is constituted in the jobs we apply for: 19th-century opera, popular music and jazz, music and disability, etc. Academic appointments serve academic fields that exist so we can give degrees in them. Make no mistake, we provide a consumer product, and that product is a diploma.

The departments so formed will determine what they think a well-trained professional in the field should look like. Students so formed will go on to perpetuate some version of the same disciplinary identity or perhaps to challenge it, but either way their conversations will assume the priority of whatever disciplinary boundaries they have been given. Those boundaries continue to live on, if only as phantoms to be invoked and then banished. Fields exist because boundaries exist. Once a boundary springs up, it simultaneously defines both an inside and an outside.

When I’ve presented my work at gatherings in the interdisciplinary humanities (which 90% of the time means cultural studies in literature and film) I suddenly feel the boundaries of my own disciplinarity, if only because of the boundaries I encounter in others, and the way I see myself reflected in their perception. “Your work is so . . . phenomenological,” they say. That means that I am inclined, at some point in my talk, to consider what something sounds like.* But I can’t help but hear a tinge of suspicion, or maybe disdain, when people call my stuff “phenomenological” – the more political cult-studs types harbor a lingering prejudice against what they take to be a fetishistic and bourgeois attachment to the form and appearance of an artwork, perhaps out of a lingering Brechtian assumption that critique confronts (and trumps) the aesthetic, and that if you are too busy listening you won’t be thinking. And I bristle at this, and what comes to my mind is that such scholars believe their job is only to fashion an abstract socio-cultural theory such that, should some expressive text wander across their path, they’ll be able to decode the everliving shit out of it. Which, entranced by “theory,” they almost never get around to doing.

You see how, whenever a boundary appears, it suddenly marks an inside and an outside? There we were, getting along so well, and now we’re quarreling.

You can’t just treat these boundaries as if they aren’t there. Where are you going to get your stuff published? If your stuff doesn’t get published, you won’t get a job, or you won’t get to keep the job you already have. That’s real. You better believe you’ve got something invested in the discipline.

But disciplinary boundaries change over time, clearly, and they do so because individuals question and overwrite them. Resisting disciplinarity is a gamble, but a calculated one, and sometimes it pays off. Look at Susan McClary. You don’t have to be a huge fan of McClary’s particular brand of hermeneutics to appreciate the license her work gave to all of us coming up behind her. She took a lot of crap — the critical response to Feminine Endings was perhaps the most epic bout of mansplaining in the history of musicology — but she got the chocolate of old-school musicology in with the peanut butter of cultural studies, did it with style, and she got away with it.** And once you know someone has gotten away with something, you can try to get away with something of your own. Andy Warhol liked to say (and Marshall McLuhan liked to repeat) that art is anything you can get away with. That’s my definition of musicology, too: musicology is anything you can get away with. Inasmuch as I’ve been published in musicological journals, read by musicologists, presented at musicological conferences, and engaged in musicological chit-chat, I’m a musicologist. I’m happy to wear that jacket.

Now, the example of Susan McClary might have some of you scratching your heads: if anyone has set the terms of “the discipline of musicology” in the past 20+ years, surely it would be her. How is she some kind of rebel against the system? Well, she isn’t anymore. This brings me to what I might call Ford’s Law of Interdisciplinarity:

Any successful interdisciplinary work will reconstitute the discipline on new ground.

You can’t have a permanent site of interdisciplinary anarchy for the same reason you can’t have a stable anarchist utopia. As the anarchist theorist Hakim Bey wrote,

The slogan ‘Revolution!’ has mutated from tocsin to toxin, a malign pseudo-Gnostic fate-trap, a nightmare where no matter how we struggle we never escape that evil Aeon, that incubus the State, one State after another, every ‘heaven’ ruled by yet one more evil angel.

Or, put in academic terms: once someone has written something successful in, say, sound studies, soon there will be other sound-studies scholars, then AMS evening sessions on sound studies, edited anthologies of sound studies, and then someone will post a job description for a “scholar in 20th century music and media, with an emphasis in sound studies.” And so it goes. Something that started off sounding the notes in the cracks between the keys has become its own tuning system. And this is inevitable.

So what I want to imagine here is a humanist’s version of Hakim Bey’s “Temporary Autonomous Zone.” The TAZ is an anarchist utopia on the model of a big party: people show up, make their own rules, get away with stuff, and disperse before the law shows up or before they themselves become the law. Burning Man is probably the most famous example of the TAZ in action, although Burning Man has recently incorporated itself as a nonprofit, which doubtless does not surprise Bey at all. Bey is notable (and often reviled) among anarchists for conceding that power and hierarchy are unavoidable in all human institutions, and that the dream of a true anarchist utopia — a stable, permanent state of statelessness — is just that, a dream. For Bey, freedom is to be found in those transient moments between its emergence and its institutionalization; the moment freedom takes institutional form, it is no longer freedom. Freedom is bounded not only by limits of space, but of time.

Interdisciplinarity is likewise a TAZ. It is not a thing, and certainly not a durable thing, but a process, an action, and it must always make itself anew in cycles of birth and death. Interdisciplinarity is an event. A scholar doing interdisciplinary work is like someone throwing a party. It’s fun while it lasts, and you can take pictures, but you can’t make the party last forever: sooner or later, everyone has to go home. Don’t get too attached; call it beautiful, and move on.

*For musicologists, talking about “the notes on the page” (or, more recently, the sounds in my ear) is second nature. Right now UCLA is the hottest place for cultural-studies approaches in musicology, but even so the books its faculty have published (Elisabeth Le Guin’s Boccherini’s Body, Mitchell Morris’s The Persistence of Sentiment, and Robert Fink’s Repeating Ourselves all come to mind) have a lot of musical specificity. That Teutonic sauropod, Carl Dahlhaus, might irritate some of us when he insists that the proper business of a musicologist is to write about (say) the C major prelude and fugue from WTC I in such a way that we will have something different to say about the C major prelude and fugue from WTC II. But we all have been trained to think this way, and, regardless of our chosen idiom within musicology, this is usually how we write as well. I’m not saying that this is always the case or that it necessarily should be — it’s just that musical specificity, in whatever form it takes, lies at the heart of our disciplinary identity, and that you only realize how true that is when you hang out with humanities people from outside our field.

**The fact that she was (and is) such a stylish writer is very much to the point here. McClary was not the only scholar making the Reese’s peanut butter cup of cultural musicology in the early 1990s, but it’s the brilliance and dash of her writing that makes her stand out in our collective memory. There is a lesson in this, graduate students.

Posted in Academia, Musicology | 5 Comments

And now, the Republican response to the SCOTUS ruling on marriage equality . . .

obannion

As someone said on Twitter, it was a bad week for people who like to put people in their place. Which means it was a bad week for the Republican Party. Various conservatives, from Justice Scalia on down (and it goes very far down indeed) reacted pretty much exactly like O’Bannion, the bullying asshole in Richard Linklater’s Dazed and Confused, who derives sadistic joy from hazing high-school freshies and gets his comeuppance at the end. The spectacle of a nominal adult driven to berserk but completely impotent rage (to the point of smashing his own beloved implement of harassment and humiliation) is, of course, highly entertaining.

Posted in Politics, Uncategorized

Thoughts on writing the 600th Dial M post

According to the WordPress dashboard, this is the 600th Dial M post. Actually, I think we passed that point a while ago: when I was migrating the old Typepad site to WordPress I took advantage of the opportunity to delete a few of my old posts that seemed too ephemeral, fatuous, or mean-spirited to be worth keeping around. I should also mention that, in the migration, I deleted the old Typepad site, which turned out to be a mistake. It broke every internal link, stripped all the pictures from our posts, and, most irritatingly, someone immediately bought the old Dial M domain and put up a site for “princess cut engagement rings.” This summer, my graduate assistant Alex is rebuilding the old posts and putting everything back in order, so I’m happy about that. It means that there will be a proper public archive of what is, by now, a pretty large body of work. When I went up for tenure I had my GA count up all the words I had written for the site between 2006 and 2010, so I could give the Promotion and Tenure Committee some quantified measure of all the work I had put in on Dial M. It came to 189,000 words — about the equivalent of two books. (And remember, that doesn’t count what Jonathan wrote.) Now, a page of blogging does not equal a page of published scholarship. I didn’t really write the equivalent of two books, or even one. But Jonathan and I have done something, and I still don’t know exactly what that something is.*

To be sure, I have written about blogging often enough in the past. But reading back over those nature-of-the-medium posts I find that, in this matter as in almost all others, I no longer really agree with anything I once thought. Over the years I have found a variety of ways to argue that blogging is (or will be) a big deal for scholars. But it isn’t, and at this point I don’t think it will be. To be sure, people will continue to write various kinds of things and publish them on their personal websites, but the idea that there is a “blogosphere” within which a characteristic kind of writing and intellectual engagement can be cultivated — that ship has sailed.

But that’s not to say that such a thing never existed. I think it did, to some degree, in the years of our first Dial M stint (2006-2010). Back then, what excited me about the medium itself (as opposed to any particular bit of content we or anyone else generated for it) was the feeling that this was an autonomous intellectual/creative sphere where people could write whatever they pleased without having to satisfy any institutional gatekeepers or participate in someone else’s business plan. The 90/10 chaff-to-wheat ratio unavoidable in all human endeavors would remain in effect, but the good stuff could generate its own conversations and its own audience, and we wouldn’t have to ask anyone’s permission to do it. The authority to bestow attention on any given conversation was distributed throughout the network of interlinked blogs — that blogosphere you heard so much about in those days. Writing would beget writing. No-one was getting paid, but we were creating, and that was what mattered.

That was the idea, anyway. And again, that really did sort of happen, at least for a while. There was a lot of writing being done in those days and a lot of blogs launched, some borne along for a good many years by their high-minded ambitions. Yes, yes, most blogs were and are crap. But when the medium was generating its own heat, there was energy behind a certain kind of writing, located somewhere between epistolic and essayistic registers, that at its best was exciting and vital.

Perhaps the zeitgeist has moved on from blogging to podcasts. I know I listen to a lot more podcasts these days than I read blogs, and I don’t think I’m alone in that. But last year my old friend Ken Woods put his finger on what I think is the real cause for the change, which is social media, especially Twitter and Facebook. Do read his post on it. I’m having a hard time refraining from just quoting the whole thing here. But this, for me, is the crux:

… this shift of power is, really, all about power. Blogging used to be primarily fed by a decentralized system of mutual support among individual bloggers (even a site like Blognoggle was basically a blog) and readers. Blogging platforms gave individuals not only the power to publish to a world-wide audience (a truly historical breakthrough), but to decide which other writers to support.  It was a highly democratic and completely decentralized system.

Facebook is neither. Every post that goes up on Facebook is there because Facebook is looking to make money or gather information from it.  Running a blog in 2008 meant that when I published a popular new post, it wasn’t just bringing in new readers who might take an interest in me, it was also increasing the value of  the blogs of fellow bloggers on my blogroll. More readers for Ken meant more readers for Pliable. More readers for Alex Ross meant more readers for Jessica Duchen or Jeremy Denk.

Now, more readers for Ken means more profit and more power for Facebook. More readers for Pliable means more profit and more power for Facebook. I’d say well over 55% of my readership comes here from FB, especially for the most popular posts, the ones that really take off. Almost all of the remainder come from Twitter or Google searches.  Almost every reader for almost everything I write is making money for or enhancing the power of one of those three companies. 

I got into a Twitter beef with a couple of musicologist friends** in which I vented my spleen about this — my resentment at having to bend the knee to companies that perform psychological experiments on us and sell us out to government and corporate surveillance. I know I wouldn’t be on Twitter if I weren’t still writing for Dial M, but at the end of the day I really do want people to read my stuff, and as Ken points out, all the old ways people had of following blogs are basically gone. It’s social media or nothing. And that pisses me off, because I remember what it was like before.

The dominance of social media means that a blogger can have readers, but not a readership. People read the things they see on Facebook. My post on depression got more readers than anything I’ve ever written because it went up on a lot of FB walls . . . but that doesn’t change the overall pattern of site traffic. The old traffic pattern, which was driven by loyal readers who wanted to follow a given conversation — that’s gone. So the blogger is back in the same situation he was in when dealing with academic publishers: he has a gatekeeper. To be sure, he can still keep writing his stuff and no-one is going to stop him, but one of the main incentives to writing — the sense that you’re having a conversation with people, or that your writing is doing something in the noösphere — is now contingent on pleasing a boss. It’s just that the boss is now a collective abstraction. The question is now not “what does the editor want?” but “what does social media want?” This is not at all the same question as “what does my readership want?” It feels a whole lot less free.

So what’s the alternative? Putting all my elective writing energy into tweets and status updates? Not my style. The fact remains that, for me, writing is something I have an itch to do regardless of the conditions under which it is read. My book is right now somewhere around the 1,200,000th best-selling book on Amazon, which makes it about half as popular as Ramsey Dukes’s Uncle Ramsey’s Little Book of Demons. But I’m still glad I wrote that book exactly the way I wrote it.

regret nothing

Academics who read my post on Dukes might have thought that this was a case of an academic bestowing legitimacy on a crank writer, but so far as the mainstream is concerned, we’re all cranks.

And this is one reason I am going to keep writing at Dial M. There is something noble, somehow, about being a crank. Or at least kind of fun. In his wonderful essay “A Crank’s Progress,” Dukes imagines forming a Crank’s Union dedicated to the mutual encouragement and support of all marginal, obscure, and enthusiastic writers:

 … the first task of the Crank’s Union [is] to admit that it is exciting to be a crank; to recognize the thrill of seeing sudden connections of meaning between unrelated phenomena; to pit oneself against vertigo on the brink of the unknown, and then to jump.

In an earlier essay I pointed out that in an ordered and over-safe world there is a real need for dangerous ideas — that is a message that cranks would well understand as they live on the brink. When the News of the World published a poisonous article about a group of well-meaning seekers that I belonged to, I felt annoyed and hurt but, on reflection, realized that if ever the tabloids started to write in praise of any group I belonged to it would be time for me to go out and find something weirder to join. So a real crank should not only be proud to be such, but also rather jealous of that status: public recognition should mark the time to move on to crazier pastures.

It’s more fun to be free and unimportant than to be important and dance like an organ-grinder’s monkey on the end of a string. I like to write, and I’m grateful that I live in an age when I can write what I like and still get anyone to read it. So thanks if you’re reading, regardless of how you got here. I’ll be seeing you through the next 600 posts.

*I can tell you what it is according to university P&T committees, though. It’s filed under “service,” which could be defined as the work you’re expected to do but won’t get you tenure. Of course this is unfortunate, but I can’t be bothered to waste my breath complaining about it, because it will never change. Academia is intrinsically hierarchical, and any intellectual work you do that circumvents institutional hierarchy will not and cannot receive institutional rewards.

**To my friends: I apologize. It occurred to me afterwards that you might have inferred from my irritable tweets that I think that other scholars’ writing on Twitter is less valuable than my writing on Dial M. I don’t, and I regret it if it came off that way. What occurs to me now is, I don’t value blogging as such, as a medium, very much at all any more. What I value is writing, and the hegemony of social media works against writing. Twitter can serve scholarly ends, but whatever those ends are, I can’t bring myself to think of it as a genre of writing. It seems more like a placeholder for writing: ideally, you get in a Twitter conversation from which ideas arise in epigrammatic form and you work them out later. That’s what happened to me in this case — I had a bunch of Twitter exchanges that left me frustrated with the limits the medium placed on my ability to express myself coherently and without giving gratuitous offense, and am now using a medium that allows me to write something more considered.

But social media doesn’t thrive on considered essays; it wants cable-news-style heat. Facebook is infinitely more horrible than Twitter, but both are environments engineered to get people talking, and talking loudly. Talking, not reflecting, and not really writing, either, except on the terms of social media. It’s as if these platforms kept the worst parts of the old blogosphere (the noisy, gassy, self-promoting, narcissistic, petulant, and ignorantly opinionated parts) and cut out everything else. Or made everything else serve them. This doesn’t mean that my brilliant and wise friends somehow turn into mouth-breathing internet trolls the moment they open up their Twitter feeds. To think so would be to indulge in the crudest kind of technological determinism. There’s no reason a 21st-century Basho couldn’t express the most sublime thoughts and feelings in 140-character lines. But to do so you would have to swim hard against the current of the medium, which in the end has its own agenda, and I guarantee that it isn’t your agenda.


Posted in Academia, Blogging, Technology, Writing