Battlefield medicine

J. F. Martel, the author of Reclaiming Art in the Age of Artifice (a wonderful book about which more soon, I hope), sent out a tweet a few weeks ago:

I remembered this tweet when I read this week’s issue of The New Yorker, which has an essay on Andy Puddicombe, a “mindfulness guru for the tech set.” I hate every word in that clause, with the exception of “for” and “the.” But I will focus on “mindfulness,” a word whose recent misuse rivals that of “literally.” What’s at stake in “mindfulness,” as the word is now used, is meditation, and more particularly meditation conceived and used as a technology. Even more particularly, it is meditation used as “tech,” in the same sense that Scientologists use the word, as a set of mental techniques used to maximize personal success and gain.

I should say up front that I am hardly a neutral observer here, and I am hardly innocent of the self-seeking and self-delusion I detest in the current “mindfulness” fad.* I have my own history with meditation. I started meditating in 2009, and even as recently as that, meditation was still a bit ooky and New Age. When I told people I had gotten into meditation, I, hard-bitten skeptical rationalist that I was, would feel slightly embarrassed about it.** But as I write now, in 2015, meditation has attained the same place in American culture that yoga did in the last decade, and hardly anyone needs to apologize for doing it. To meditate is to be on the winning team: you’re in the same company as half of Silicon Valley and those pretty blonde girls, eyes soulfully closed, who show up everywhere on the covers of news magazines.

I learned to meditate at Sanshinji, a small Zen center in Bloomington that was founded by Shohaku Okumura, a great Soto Zen teacher who (among other things) wrote what is, for my money, the single best introduction to the Zen teachings of Dogen Zenji. I studied and meditated at Sanshinji for four years and took lay ordination there.*** I’m not saying that to establish my bona fides, though. I was a washout as a Zen student, and the only qualification I have for writing this post is my failure. Failure can be instructive, though, or at least amusing.

At the center of Dogen’s teaching is the idea of practice unmotivated by the thought of gain. The practice of zazen (Zen meditation) is called shikantaza, or “just sitting.” Just sitting is not sitting for anything; if you are sitting in order to reduce stress or something you are not just sitting. But the longer you sit, the more this “just sitting” seems like a koan, a riddle that the rational mind cannot solve. What if I am sitting in order to be a nicer person? Or to get better at meditation? Or (in the parlance of the Bodhisattva Vow) to save all beings? Still not “just sitting.” Saving all beings is the main point of Mahayana Buddhism, but still, “just sitting” itself doesn’t have a point. It’s pointless. Which is to say, zazen is good for nothing. And yet it is also the practice of “actualizing the fundamental point” — this is the title of Dogen’s most famous essay, genjokoan. The fundamental point is pointlessness . . . wait, what? If you are confused, well, then, join the club. It takes a lot of zazen to even begin to get a glimpse of what this all means. If it means anything at all. Maybe it doesn’t. Maybe meaning is not the point.

Well, whatever the point was, I missed it, even as I practiced it for hours each day, year after year. Even as I could say, with total conviction, that zazen is good for nothing, I was meditating in order to get stuff — to become nicer and happier, to gain insight, to become an expert meditator, and (let’s be real here) to flatter a certain image of myself as a compassionate and insightful and accomplished person. I did not see this as it was happening. My intellectual fluency with the ideas of “just sitting” and “no-gaining-mind” itself became a point of pride that blinded me to the fact that I was, after all, pursuing something: being the very best no-gaining-mind student ever. Chögyam Trungpa has my number:

The problem is that ego can convert anything to its own use, even spirituality. Ego is constantly attempting to acquire and apply the teachings of spirituality for its own benefit. The teachings are treated as an external thing, external to “me,” a philosophy which we try to imitate. We do not actually want to identify with or become the teachings. So if our teacher speaks of renunciation of ego, we attempt to mimic renunciation of ego. We go through the motions, make the appropriate gestures, but we really do not want to sacrifice any part of our way of life. (Cutting Through Spiritual Materialism, 13.)

Ouch. None of this is pleasant to admit. But I am admitting it anyway, because those of us who have come up against hardships in meditation need to point out, to those who haven’t (not yet, anyway), that meditative practice ain’t no joke. It’s really, really hard, and in ways that are not the same as, say, the practice of a musical instrument or a profession. The “path” (as everyone calls it) is beset on all sides by deadfall traps.

deadfall

such an apt visual metaphor …

But this aspect of meditation is systematically ignored in the present-day culture of “mindfulness.” Secularized mindfulness meditation à la Jon Kabat-Zinn (mindfulness-based stress reduction, or MBSR) bills itself as merely a technique, not a spiritual path and certainly not a hazardous one. But secularized as it is, MBSR still has its origins in Buddhist metaphysics. Even as those metaphysics are ignored or disavowed, they are still deeply embedded in the practices themselves. And when you repress awareness of anything in your makeup, bad things start to happen. That’s what the shadow self is.**** Suffering and its sidekick, desire for gain, have become the shadow of mindfulness.

OK, to back up, a little Buddhism 101: Siddhartha Gautama (a.k.a. the Buddha) was above all concerned with the problem of suffering. Indeed, he considered suffering the fundamental condition of human existence. Simply put, suffering, or dukkha, is the condition of wanting things to be other than they are. We want stuff (sex, money, love, security, etc.) and not having it makes us full of envy and anxiety. We hate whatever it is that makes the things we love go away — like death, for example, which makes our friends and family go away. And the thing is, everything goes away. Impermanence is the other side of dukkha: it is what dooms our loves to remain unsatisfied and our hates to be ever-renewed. In fact, dukkha might better be translated as “dissatisfaction.” It’s a dynamic force, always turning, like a wheel. Or perhaps like a two-stroke motor: craving and aversion are the up and down strokes of the engine of suffering. What is called mindfulness is the practice of cultivating equanimity in the face of the cycles of craving and aversion. If it means anything, mindfulness means being present for all sensations, good and bad, and seeing that “good” and “bad” are simply the names we give to them.

Suffering is a bummer, though, and to market “mindfulness” to Americans (who as a rule do not enjoy thinking about suffering) the meditation salesmen need to work on the branding. The New Yorker piece ends with a telling detail. Andy Puddicombe is conferring with one of his clients, a hair salon mogul, and prescribing some new exercises based on the Tibetan Four Ordinary Foundations. This is a practice in which meditators reflect upon (1) the good fortune of rebirth in the human realm,***** (2) impermanence, (3) karma, and (4) suffering. Now, Puddicombe has training as a Tibetan priest: he knows this stuff cold. But I suspect he also knows that

  • no. 1 depends upon a mythology that secular Westerners will find off-puttingly foreign and superstitious-sounding;
  • no. 2 is depressing;
  • no. 3 is a complex and intellectually demanding teaching of interdependent causality that business-minded Westerners will find irritatingly abstract and philosophical;
  • no. 4 is really fucking depressing.

So Puddicombe renamed them. No. 1 becomes “appreciation,” no. 2 becomes “change,” no. 3 becomes “cause and effect,” and no. 4 becomes “acceptance.”

But again, that which you deny will become your shadow. You can try to turn “suffering” into “acceptance,” but no matter what you call it, mindfulness meditation remains a practice for the systematic investigation of suffering. Business culture seems to believe that you can change the thing itself if you just change the perception of it: this is “re-branding.” (In this way, marketing is actually very close to magic.) But the thing is, even stripped of the terminology, temples, robes, doctrines, and ceremonial forms of Buddhism, “mindfulness,” in its very form and expectations, remains inextricable from Buddhist metaphysics. I’m not saying that I think the fundamental truth of existence is suffering; I’m saying that, if you want to inhabit that worldview, mindfulness is an absolutely crackerjack way to do it.

This might mean a couple of things. First, a whole lot of people who are picking up meditation just to de-stress and enhance their productivity might be in for trouble. If you get deep enough into this practice, you will get an up-close-and-personal view of suffering. You will realize suffering. It will no longer be an idea, but something you see moment-to-moment. This can be a good or bad thing, depending on where you are. Those who have major depression should be very careful. (I wasn’t.) It really pisses me off to see meditation marketed as a cure for depression. Some of the people who are promulgating this stupid and dangerous idea are merely naïve; others should know better.

But more likely, what it means is that suffering will simply be denied, and year by year the practice will be changed to adapt it more and more to the popular understanding of meditation as a kind of mind hack, a wetware productivity app. This is why I liked Martel’s comparison of the wildfire spread of Eastern spirituality to the Romanization of the early Christian church. It’s another good-news-bad-news situation: The good news is that a powerful way of seeing through certain delusions, along with a powerful new (to the West) style of thought, is being made available to exponentially more people now than when it was just a few Beats and hippies showing up to Shunryu Suzuki’s zendo back in the 1960s. (I might write about the already palpable influence of meditation culture on present-day intellectual life in some future blog post.) But the bad news is that, in the process of popularization, the delicate complexity of a contemplative philosophy is brutally simplified and the selfless core of its teaching is corrupted by wealth and power. A clichéd complaint, I know. But some clichés are true.

In the meantime, it suits our neoliberal overlords just fine to have their peons meditate in the break room. One of the dissenting voices quoted in that New Yorker piece, Ronald Purser, comments that mindfulness “keeps us within the fences of the neoliberal capitalist paradigm. It’s saying, ‘It’s your problem, get with the program, fix your stress, and get back to work!’” Meditation is cheap — actually, it’s free. You don’t need specialized equipment. You don’t even need someone to show you how to do it: just sit somewhere quietly and see what happens. That’s a lot cheaper than having to pay for someone’s therapy and meds. As the neoliberal regime turns the screws harder and harder in a kind of vast science-fair experiment to see how much productivity can be squeezed out of people before they just up and die, meditation becomes battlefield medicine. Patch ’em up and send ’em back out there. I always think of a George Grosz satirical drawing of a WWI medico pronouncing a charred skeleton fit for active duty:

grosz KV

Remember that quote from Chögyam Trungpa’s Spiritual Materialism: “The problem is that ego can convert anything to its own use, even spirituality. Ego is constantly attempting to acquire and apply the teachings of spirituality for its own benefit.” Replace the word “ego” with “capitalism” and see how little the meaning changes. Capitalism is the organization of the ego on a planetary scale.

I still sit zazen, by the way.

*Here’s a clue for self-development: write a list of everything you dislike most in others. Congratulations, you have just described the part of yourself you don’t like and can’t acknowledge. This is what Jungians like to call “the shadow.”

**Notice how I poked fun at hard-bitten, skeptical rationalists in a recent post on magic. See what I mean about the shadow?

***So far as I know, there is no exact equivalent to Zen lay ordination, or jukai, in Western religious traditions. It does not imply the same commitment as priest ordination (I am no priest!), but it is still pretty much the same ceremony, and it involves a lot of study and preparation, including and especially the hand-sewing of a miniature robe called a rakusu.

****See note *, above.

*****That is, as opposed to the realms of animals, hungry ghosts, warring spirits, gods, and hell. OK, now some Buddhism 102: In Tibetan Buddhism, these are the realms through which we are said to transmigrate, rebirth after rebirth, and it is only in the human realm that we can practice the Dharma and get a fighting chance of breaking the eternal cycle of suffering. So the point is, you’re supposed to make the most of your lucky break and practice hard enough to ensure at least a favorable rebirth. Those schools of Buddhism that favor ideas of rebirth tend to believe that it takes many, many lifetimes of practice to achieve the cessation of suffering — in other words, enlightenment. The Soto Zen school tends to believe neither in enlightenment nor in rebirth, which makes it a tough sell to the power-of-positive-thinking American consumers of Eastern spirituality. Which makes it richly ironic that “Zen” has become the premium brand-name for pseudo-Buddho crapola of every possible description.

Posted in Current Affairs, Religion, Technology

Disciplinarity and Gatekeeping

I don’t want to let Phil’s superb post pass without a couple of comments beyond the “TELL IT, REV’RUNT!” that was my initial reaction. “The horizon of a discipline,” he notes, “is inevitably smaller and more limited” than “the full range of other minds [that] constitutes the true horizon that bounds the humanist.” To choose a humbler disciplinary home over the vast green fields of The Humanities means that “you are necessarily accepting a more limited participation in the expressive realm of the human. And that seems like a drag, giving up that pleasure for . . . for what? In part, for power. It’s ‘the discipline’ that tells you what matters; you don’t get to decide.”

This can be particularly and perniciously true in graduate school. Students sometimes feel they need guidance in finding a topic, but the guidance received might have more to do with The Kind Of Department We Want To Be than with the individual student’s disposition and strengths. Phil mentioned UCLA as a standout in cultural studies in musicology, and it has indeed been so for some time. One wonders if other departments, seeking to stake out different turf, try to move students in a particular direction to strengthen that position. “They’re telling me to read up on genre,” a student once told me with a sigh, and I had the distinct feeling that this had more to do with the way her department conceived itself than with her scholarly well-being. Of course, no one department can do everything; such areas as combinatoric theory and psychomusicology are studied at certain institutions but would be impractical in the vast majority of places. In large part, though, one’s graduate school experience has a crucial effect on one’s longer-term place in the discipline, and the kinds of questions one is prepared to ask.

After I got my job here, an administrator (not in music) said in a meeting, “Graduate students don’t understand that the questions are generated in the literature, not out of thin air. That’s where you find the useful research questions.” That’s pure discipline (in the worst sense), and it is dead wrong. The real questions are the ones you’re asking, and no one else even notices. The Literature, in this context, does nothing but deaden and restrain; you, looking at The Actual Stuff, cannot believe that someone hasn’t yet asked what are to you these most obvious questions. One of the commonest closing gambits of articles, theses, and dissertations is the faux-humble “suggestions for further research,” which are invariably both (even) less interesting than whatever you’ve just fought your way through and safely within its worldview. How likely or even possible is it for an author to suggest something more interesting and worthwhile than (save us!) “the present study”? Too often such suggestions are wan attempts at making a homework assignment-type study look like part of a bigger, more promising picture. For my money, your question should not be one that someone else suggests you ask in order to validate their own stuff. It’s not that you’ll be free of the literature; of course, you have to scour it to make sure that you’ve got adequate background to ask said question, and that someone hasn’t been there first. If all the great sages of your discipline haven’t even approached a question you find to be important, then put the pedal to the floor.

Still, as Phil points out, there are still boundaries and we ignore them at our own peril. “You can’t just treat these boundaries as if they aren’t there,” he observes; “where are you going to get your stuff published? If your stuff doesn’t get published, you won’t get a job, or you won’t get to keep the job you already have. That’s real. You better believe you’ve got something invested in the discipline.”

Too true, and while your research questions and topic are your responsibility, the consequences belong to you too, so your place in the discipline is a more multifaceted issue than, simply, this one research project. So it interests you; good. Is it likely to interest anyone else, to expand into something broader, to be the sort of thing a department might find attractive? Sure, anything can be great research, as potential graduate students are assured by faculty seeking to bolster flagging numbers, but if potential colleagues elsewhere find it trivial they won’t be interested in numbering you among their colleagues, and that’s not something you get to appeal, engage, or critique.

The business of becoming someone’s colleague touches on Phil’s comment that our “…discipline is constituted in the jobs we apply for: 19th-century opera, popular music and jazz, music and disability, etc. Academic appointments serve academic fields that exist so we can give degrees in them. Make no mistake, we provide a consumer product, and that product is a diploma.” Two points here: I would prefer not to slight the generalist jobs that people apply for and that the vast majority actually land. Specialty—what I’ve just been discussing—is all but irrelevant in such cases. You’re a Debussy scholar? Great; do try to keep publishing, but your 4 + 4 load will include the entire music history sequence, music appreciation, upper division courses in all periods, and the senior capstone seminar.

And to choose one from among the given examples: how common are listings that request a “music and disability” specialty? Fact: the existence of a certain number of people working in a thriving subdiscipline does not mean there are jobs in it. To my eye, the specialties for which people are hired still relate rather closely to the major historical periods—she’s a rock-solid nineteenth-century scholar, he’ll anchor the medieval side of things, and let’s see if there are some secondary interests we can parlay into course offerings (Women in Music, History of Rock, Pedagogy, etc. etc.). There’s a reason for this: specialists in major historical periods are far more likely to be able to mentor a variety of kinds of research in those periods than the narrower specialist is. For the vast majority of institutions, more marginal areas are at best secondary. All the fell developments in the economic and cultural ecosystem do not amount to more hires in Music and Disability Studies or even Performance Practices, believe me. So while the intellectual question and subdiscipline you choose should not be defined by the Discipline or Literature in the sense of What Everyone Else Is Doing, never doubt that a reality principle will still obtain.

Finally, of Susan McClary: “…it’s the brilliance and dash of her writing that makes her stand out in our collective memory. There is a lesson in this, graduate students.” Hold on, graduate students—put your pens down for a second. “To achieve style, begin by affecting none,” wrote E. B. White, and that is sage advice. For me, what makes Susan McClary stand out in collective memory is not style so much as content: she was indefatigable in her sharp cultural interrogations of canonical works (e.g. Mozart’s G Major piano concerto K. 453/II, Bach Brandenburg Concerto #5/I, the opera Carmen, and much more), and her resistance to what she perceived as sacralization without thought, obedient cultural idol-worship, and the safe and lazy concept of the autonomous artwork which absolved later listeners from awareness of the cultural contexts taken for granted by historical composers and their listeners.

I personally found her writing to be searing and bilious, but that’s not the point. The lesson I would stress for any graduate student is to write like you, not like Susan McClary or anyone else. Say what you have to say, but don’t lard it with “brilliance and dash”— simply play your game, as the sports announcers say and I never tire of repeating. Your game. In the vast landscape of American academia, the “searing” written idiom has receded, as has cultural criticism itself to a certain extent, and I have to wonder if it was the bitter tone that eventually wore everyone out. Musicology departments rarely exist in a vacuum, and most music schools still search out that body of applicants who had a fantastic band conductor or choir teacher and simply want More Of That Sound rather than those drawn to vinegary critiques of our art.

Posted in Academia

Disciplinarity (Or, Musicology is Anything You Can Get Away With)

I like musicologists. I like books and journals of musicology, musicology conferences and symposia, dinners with musicological colleagues near and far, musicological gossip and chit-chat. I like the concrete manifestations of musicology in the world. But I do not love that abstraction, “the discipline of musicology.” Like every academic discipline, musicology is nice in the concrete but lousy in the abstract.

A discipline is the claim of the general against the particular, the many against the one. It represents a limit placed on individual intellectual autonomy. For this reason alone I would not cross the street to save “the discipline” if it were being attacked by a giant octopus.

octopus attack

Let’s face it, I only wrote that last sentence so I could use this image.

When you come upon a piece of scholarship that looks relevant to something you’re working on and yet also looks like it will take a tiresome lot of extra work to figure out (like if it’s something in a style that you don’t especially enjoy reading or involves things you don’t do well), invoking “the discipline” is a great way to say “not my job.” If an idea is not widely discussed in “the discipline,” you can say it’s marginal or unimportant. If a lot of people are suddenly talking about that same idea, you can say it’s “trendy.”

trendy flowchart

“Trendy,” in this case, would mean you’re saying that an unwelcome idea is in the discipline but not of it; you can’t say it’s not already here, but you can deny it full rights of citizenship. No matter what, you have a reason not to consider new ideas. You therefore also have a reason not to talk to new people.

Too bad. The primary pleasure that scholarship offers is the chance to encounter other minds and thereby expand one’s own. The full range of other minds constitutes the true horizon that bounds the humanist; nothing human should be alien to us. The horizon of a discipline is inevitably smaller and more limited. If you choose to live within the latter, you are necessarily accepting a more limited participation in the expressive realm of the human. And that seems like a drag, giving up that pleasure for . . . for what?

In part, for power. It’s “the discipline” that tells you what matters; you don’t get to decide. “The discipline” stands over your work, telling you how much it matters and where it belongs. Unless you work your way up to being the discipline, in which case I guess the reward is getting to push people around. “The discipline” is an enabling ideology, a validation of power and hierarchy. To assimilate your own values to “the discipline,” to make “the discipline” the object of your care and protection, is to take on the role of the gatekeeper. And there are always academics with a lot invested in gatekeeping.

But do we have to have gates in the first place? There is a word I’ve borrowed from Teilhard de Chardin, the noösphere, which names the idea that the entire world is a networked system of information within which ideas constantly circulate. A conversation between any two people within the noösphere would constitute a single strand of it; conversations among multiple interlocutors are ganglia, knots of exfoliating thought-branches. But the whole networked structure, like the structure of the internet, is just that, networked.

map of the internet

A map of the internet.

There are no gates anywhere; people talk; talk gets out. You see concentrations within a vast cloud of interconnections that perhaps relate to particular kinds of conversation, talk about politics or sports or music. But they are only regions of greater or lesser density within the cloud of connections. Nothing is really separate from anything else. Why treat music as if it were? As if it were a family quarrel or an inside joke? Again: do we need gates at all?

Of course we do. You can’t just be a professor of “various awesome things.” A professor’s academic appointment is always in some field and indeed in some particular and advertised subfield. The discipline is constituted in the jobs we apply for: 19th-century opera, popular music and jazz, music and disability, etc. Academic appointments serve academic fields that exist so we can give degrees in them. Make no mistake, we provide a consumer product, and that product is a diploma.

The departments so formed will determine what they think a well-trained professional in the field should look like. Students so formed will go on to perpetuate some version of the same disciplinary identity or perhaps to challenge it, but either way their conversations will assume the priority of whatever disciplinary boundaries they have been given. Those boundaries continue to live on, if only as phantoms to be invoked and then banished. Fields exist because boundaries exist. Once a boundary springs up, it simultaneously defines both an inside and an outside.

When I’ve presented my work at gatherings in the interdisciplinary humanities (which 90% of the time means cultural studies in literature and film), I suddenly feel the boundaries of my own disciplinarity, if only because of the boundaries I encounter in others, and the way I see myself reflected in their perception. “Your work is so . . . phenomenological,” they say. That means that I am inclined, at some point in my talk, to consider what something sounds like.* But I can’t help but hear a tinge of suspicion, or maybe disdain, when people call my stuff “phenomenological” — the more political cult-studs types harbor a lingering prejudice against what they take to be a fetishistic and bourgeois attachment to the form and appearance of an artwork, perhaps out of a lingering Brechtian assumption that critique confronts (and trumps) the aesthetic, and that if you are too busy listening you won’t be thinking. And I bristle at this, and what comes to my mind is that such scholars believe their job is only to fashion an abstract socio-cultural theory such that, should some expressive text wander across their path, they’ll be able to decode the everliving shit out of it. Which, entranced by “theory,” they almost never get around to doing.

You see how, whenever a boundary appears, it suddenly marks an inside and an outside? There we were, getting along so well, and now we’re quarreling.

You can’t just treat these boundaries as if they aren’t there. Where are you going to get your stuff published? If your stuff doesn’t get published, you won’t get a job, or you won’t get to keep the job you already have. That’s real. You better believe you’ve got something invested in the discipline.

But disciplinary boundaries change over time, clearly, and they do so because individuals question and overwrite them. Resisting disciplinarity is a gamble, but a calculated one, and sometimes it pays off. Look at Susan McClary. You don’t have to be a huge fan of McClary’s particular brand of hermeneutics to appreciate the license her work gave to all of us coming up behind her. She took a lot of crap — the critical response to Feminine Endings was perhaps the most epic bout of mansplaining in the history of musicology — but she got the chocolate of old-school musicology in with the peanut butter of cultural studies, did it with style, and she got away with it.** And once you know someone has gotten away with something, you can try to get away with something of your own. Andy Warhol liked to say (and Marshall McLuhan liked to repeat) that art is anything you can get away with. That’s my definition of musicology, too: musicology is anything you can get away with. Inasmuch as I’ve been published in musicological journals, read by musicologists, presented at musicological conferences, and engaged in musicological chit-chat, I’m a musicologist. I’m happy to wear that jacket.

Now, the example of Susan McClary might have some of you scratching your heads: if anyone has set the terms of “the discipline of musicology” in the past 20+ years, surely it would be her. How is she some kind of rebel against the system? Well, she isn’t anymore. This brings me to what I might call Ford’s Law of Interdisciplinarity:

Any successful interdisciplinary work will reconstitute the discipline on new ground.

You can’t have a permanent site of interdisciplinary anarchy for the same reason you can’t have a stable anarchist utopia. As the anarchist theorist Hakim Bey wrote,

The slogan ‘Revolution!’ has mutated from tocsin to toxin, a malign pseudo-Gnostic fate-trap, a nightmare where no matter how we struggle we never escape that evil Aeon, that incubus the State, one State after another, every ‘heaven’ ruled by yet one more evil angel.

Or, put in academic terms: once someone has written something successful in, say, sound studies, soon there will be other sound-studies scholars, then AMS evening sessions on sound studies, edited anthologies of sound studies, and then someone will post a job description for a “scholar in 20th century music and media, with an emphasis in sound studies.” And so it goes. Something that started off sounding the notes in the cracks between the keys has become its own tuning system. And this is inevitable.

So what I want to imagine here is a humanist’s version of Hakim Bey’s “Temporary Autonomous Zone.” The TAZ is an anarchist utopia on the model of a big party: people show up, make their own rules, get away with stuff, and disperse before the law shows up or before they themselves become the law. Burning Man is probably the most famous example of the TAZ in action, although Burning Man has recently incorporated itself as a nonprofit, which doubtless does not surprise Bey at all. Bey is notable (and often reviled) among anarchists for conceding that power and hierarchy are unavoidable in all human institutions, and that the dream of a true anarchist utopia — a stable, permanent state of statelessness — is just that, a dream. For Bey, freedom is to be found in those transient moments between its emergence and its institutionalization; the moment freedom takes institutional form, it is no longer freedom. Freedom is bounded not only by limits of space but of time.

Interdisciplinarity is likewise a TAZ. It is not a thing, and certainly not a durable thing, but a process, an action, and it must always make itself anew in cycles of birth and death. Interdisciplinarity is an event. A scholar doing interdisciplinary work is like someone throwing a party. It’s fun while it lasts, and you can take pictures, but you can’t make the party last forever: sooner or later, everyone has to go home. Don’t get too attached; call it beautiful, and move on.

*For musicologists, talking about “the notes on the page” (or, more recently, the sounds in my ear) is second nature. Right now UCLA is the hottest place for cultural-studies approaches in musicology, but even so the books its faculty have published (Elisabeth Le Guin’s Boccherini’s Body, Mitchell Morris’s The Persistence of Sentiment, and Robert Fink’s Repeating Ourselves all come to mind) have a lot of musical specificity. That Teutonic sauropod, Carl Dahlhaus, might irritate some of us when he insists that the proper business of a musicologist is to write about (say) the C major prelude and fugue from WTC I in such a way that we will have something different to say about the C major prelude and fugue from WTC II. But we all have been trained to think this way, and, regardless of our chosen idiom within musicology, this is usually how we write as well. I’m not saying that this is always the case or that it necessarily should be — it’s just that musical specificity, in whatever form it takes, lies at the heart of our disciplinary identity, and that you only realize how true that is when you hang out with humanities people from outside our field.

**The fact that she was (and is) such a stylish writer is very much to the point here. McClary was not the only scholar making the Reese’s peanut butter cup of cultural musicology in the early 1990s, but it’s the brilliance and dash of her writing that makes her stand out in our collective memory. There is a lesson in this, graduate students.

Posted in Academia, Musicology

And now, the Republican response to the SCOTUS ruling on marriage equality . . .

obannion

As someone said on Twitter, it was a bad week for people who like to put people in their place. Which means it was a bad week for the Republican Party. Various conservatives, from Justice Scalia on down (and it goes very far down indeed), reacted pretty much exactly like O’Bannion, the bullying asshole in Richard Linklater’s Dazed and Confused, who derives sadistic joy from hazing high-school freshies and gets his comeuppance at the end. The spectacle of a nominal adult driven to berserk but completely impotent rage (to the point of smashing his own beloved implement of harassment and humiliation) is, of course, highly entertaining.

Posted in Politics, Uncategorized

Thoughts on writing the 600th Dial M post

According to the WordPress dashboard, this is the 600th Dial M post. Actually, I think we passed that point a while ago: when I was migrating the old Typepad site to WordPress, I took advantage of the opportunity to delete a few of my old posts that seemed too ephemeral, fatuous, or mean-spirited to be worth keeping around. I should also mention that, in the migration, I deleted the old Typepad site, which turned out to be a mistake. It broke every internal link, stripped all the pictures from our posts, and, most irritatingly, someone immediately bought the old Dial M domain and put up a site for “princess cut engagement rings.” This summer, my graduate assistant Alex is rebuilding the old posts and putting everything back in order, so I’m happy about that. It means that there will be a proper public archive of what is, by now, a pretty large body of work. When I went up for tenure I had my GA count up all the words I had written for the site between 2006 and 2010, so I could give the Promotion and Tenure Committee some quantified measure of all the work I had put in on Dial M. It came to 189,000 words — about the equivalent of two books. (And remember, that doesn’t count what Jonathan wrote.) Now, a page of blogging does not equal a page of published scholarship. I didn’t really write the equivalent of two books, or even one. But Jonathan and I have done something, and I still don’t know exactly what that something is.*

To be sure, I have written about blogging often enough in the past. But reading back over those nature-of-the-medium posts, I find that, in this matter as in almost all others, I no longer really agree with anything I once thought. Over the years I have found a variety of ways to argue that blogging is (or will be) a big deal for scholars. But it isn’t, and at this point I don’t think it will be. To be sure, people will continue to write various kinds of things and publish them on their personal websites, but the idea that there is a “blogosphere” within which a characteristic kind of writing and intellectual engagement can be cultivated — that ship has sailed.

But that’s not to say that such a thing never existed. I think it did, to some degree, in the years of our first Dial M stint (2006-2010). Back then, what excited me about the medium itself (as opposed to any particular bit of content we or anyone else generated for it) was the feeling that this was an autonomous intellectual/creative sphere where people could write whatever they pleased without having to satisfy any institutional gatekeepers or participate in someone else’s business plan. The 90/10 chaff-to-wheat ratio unavoidable in all human endeavors would remain in effect, but the good stuff could generate its own conversations and its own audience, and we wouldn’t have to ask anyone’s permission to do it. The authority to bestow attention on any given conversation was distributed throughout the network of interlinked blogs — that blogosphere you heard so much about in those days. Writing would beget writing. No-one was getting paid, but we were creating, and that was what mattered.

That was the idea, anyway. And again, that really did sort of happen, at least for a while. There was a lot of writing being done in those days and a lot of blogs launched, some borne along for a good many years by their high-minded ambitions. Yes, yes, most blogs were and are crap. But when the medium was generating its own heat, there was energy behind a certain kind of writing, located somewhere between epistolic and essayistic registers, that at its best was exciting and vital.

Perhaps the zeitgeist has moved on from blogging to podcasts. I know I listen to a lot more podcasts these days than I read blogs, and I don’t think I’m alone in that. But last year my old friend Ken Woods put his finger on what I think is the real cause for the change, which is social media, especially Twitter and Facebook. Do read his post on it. I’m having a hard time refraining from just quoting the whole thing here. But this, for me, is the crux:

… this shift of power is, really, all about power. Blogging used to be primarily fed by a decentralized system of mutual support among individual bloggers (even a site like Blognoggle was basically a blog) and readers. Blogging platforms gave individuals not only the power to publish to a world-wide audience (a truly historical breakthrough), but to decide which other writers to support.  It was a highly democratic and completely decentralized system.

Facebook is neither. Every post that goes up on Facebook is there because Facebook is looking to make money or gather information from it.  Running a blog in 2008 meant that when I published a popular new post, it wasn’t just bringing in new readers who might take an interest in me, it was also increasing the value of  the blogs of fellow bloggers on my blogroll. More readers for Ken meant more readers for Pliable. More readers for Alex Ross meant more readers for Jessica Duchen or Jeremy Denk.

Now, more readers for Ken means more profit and more power for Facebook. More readers for Pliable means more profit and more power for Facebook. I’d say well over 55% of my readership comes here from FB, especially for the most popular posts, the ones that really take off. Almost all of the remainder come from Twitter or Google searches.  Almost every reader for almost everything I write is making money for or enhancing the power of one of those three companies. 

I got into a Twitter beef with a couple of musicologist friends** in which I vented my spleen about this — my resentment at having to bend the knee to companies that perform psychological experiments on us and sell us out to government and corporate surveillance. I know I wouldn’t be on Twitter if I weren’t still writing for Dial M, but at the end of the day I really do want people to read my stuff, and as Ken points out, all the old ways people had of following blogs are basically gone. It’s social media or nothing. And that pisses me off, because I remember what it was like before.

The dominance of social media means that a blogger can have readers, but not a readership. People read the things they see on Facebook. My post on depression got more readers than anything I’ve ever written because it went up on a lot of FB walls . . . but that doesn’t change the overall pattern of site traffic. The old traffic pattern, which was driven by loyal readers who wanted to follow a given conversation — that’s gone. So the blogger is back in the same situation he was in when dealing with academic publishers: he has a gatekeeper. To be sure, he can still keep writing his stuff and no-one is going to stop him, but one of the main incentives to writing — the sense that you’re having a conversation with people, or that your writing is doing something in the noösphere — is now contingent on pleasing a boss. It’s just that the boss is now a collective abstraction. The question is no longer “what does the editor want?” but “what does social media want?” This is not at all the same question as “what does my readership want?” It feels a whole lot less free.

So what’s the alternative? Putting all my elective writing energy into tweets and status updates? Not my style. The fact remains that, for me, writing is something I have an itch to do regardless of the conditions under which it is read. My book is right now somewhere around the 1,200,000th best-selling book on Amazon, which makes it about half as popular as Ramsey Dukes’s Uncle Ramsey’s Little Book of Demons. But I’m still glad I wrote that book exactly the way I wrote it.

regret nothing

Academics who read my post on Dukes might have thought that this was a case of an academic bestowing legitimacy on a crank writer, but so far as the mainstream is concerned, we’re all cranks.

And this is one reason I am going to keep writing at Dial M. There is something noble, somehow, about being a crank. Or at least kind of fun. In his wonderful essay “A Crank’s Progress,” Dukes imagines forming a Crank’s Union dedicated to the mutual encouragement and support of all marginal, obscure, and enthusiastic writers:

 … the first task of the Crank’s Union [is] to admit that it is exciting to be a crank; to recognize the thrill of seeing sudden connections of meaning between unrelated phenomena; to pit oneself against vertigo on the brink of the unknown, and then to jump.

In an earlier essay I pointed out that in an ordered and over-safe world there is a real need for dangerous ideas — that is a message that cranks would well understand as they live on the brink. When the News of the World published a poisonous article about a group of well-meaning seekers that I belonged to, I felt annoyed and hurt but, on reflection, realized that if ever the tabloids started to write in praise of any group I belonged to it would be time for me to go out and find something weirder to join. So a real crank should not only be proud to be such, but also rather jealous of that status: public recognition should mark the time to move on to crazier pastures.

It’s more fun to be free and unimportant than to be important and dance like an organ-grinder’s monkey on the end of a string. I like to write, and I’m grateful that I live in an age when I can write what I like and still get anyone to read it. So thanks if you’re reading, regardless of how you got here. I’ll be seeing you through the next 600 posts.

*I can tell you what it is according to university P&T committees, though. It’s filed under “service,” which could be defined as the work you’re expected to do but that won’t get you tenure. Of course this is unfortunate, but I can’t be bothered to waste my breath complaining about it, because it will never change. Academia is intrinsically hierarchical, and any intellectual work you do that circumvents institutional hierarchy will not and cannot receive institutional rewards.

**To my friends: I apologize. It occurred to me afterwards that you might have inferred from my irritable tweets that I think that other scholars’ writing on Twitter is less valuable than my writing on Dial M. I don’t, and I regret it if it came off that way. What occurs to me now is, I don’t value blogging as such, as a medium, very much at all any more. What I value is writing, and the hegemony of social media works against writing. Twitter can serve scholarly ends, but whatever those ends are, I can’t bring myself to think of it as a genre of writing. It seems more like a placeholder for writing: ideally, you get in a Twitter conversation from which ideas arise in epigrammatic form and you work them out later. That’s what happened to me in this case — I had a bunch of Twitter exchanges that left me frustrated with the limits the medium placed on my ability to express myself coherently and without giving gratuitous offense, and am now using a medium that allows me to write something more considered.

But social media doesn’t thrive on considered essays; it wants cable-news-style heat. Facebook is infinitely more horrible than Twitter, but both are environments engineered to get people talking, and talking loudly. Talking, not reflecting, and not really writing, either, except on the terms of social media. It’s as if these platforms kept the worst parts of the old blogosphere (the noisy, gassy, self-promoting, narcissistic, petulant, and ignorantly opinionated parts) and cut out everything else. Or made everything else serve them. This doesn’t mean that my brilliant and wise friends somehow turn into mouth-breathing internet trolls the moment they open up their Twitter feeds. To think so would be to indulge in the crudest kind of technological determinism. There’s no reason a 21st-century Basho couldn’t express the most sublime thoughts and feelings in 140-character lines. But to do so you would have to swim hard against the current of the medium, which in the end has its own agenda, and I guarantee that it isn’t your agenda.

 

Posted in Academia, Blogging, Technology, Writing

מע טאר ניט (Me tor nit=Yiddish: “I dassn’t”)

Apologies for my long absence; all continues to be well and insanely busy in my life. Phil has been blogging like a champ, deep philosophical stuff, and I’ve been killing myself with projects, some of which may eventually be mentioned In This Space. I note Phil’s recent pieces with admiration because one of my projects is the closest thing to deep philosophical thought I’ve ever done, and it has almost finished me off — that sort of thing is not, by any means, my strong suit. And it is high time for me to begin blogging again.

And recent pieces on higher education have gotten me off the dime.

A much-linked piece by the pseudonymous professor “Edward Schlosser” has provoked a good deal of hand-wringing from the O Tempora! O Mores! set (whom, of course, ye shall always have with you) about “political correctness,” the unwillingness of students to confront uncomfortable ideas, to be challenged, and so on. I’m not going to bother with the reactions of the right-wing hyena caste (“You broke it, Sir, you bought it” is one typically perceptive observation); the very concept of political correctness is a Pavlovian prompt for them, so we can move beyond their cackling, drooling reactions without further comment. Anyway, a brief summary of the piece itself:

It’s all the fault of political correctness (see the similar comments by impartial cultural scholar and observer Jerry Seinfeld). Schlosser’s “liberal” students terrify him/her because they’re afraid of new, uncomfortable ideas and resentful of positions not their own. And as a result, faculty are afraid to truly teach because the hypersensitive students refuse to tolerate even the possibility of both intellectual challenges and psychological triggers, and so free speech and the true openness of fearless intellectual inquiry—here represented in the person of “Edward Schlosser,” natch—are shut down. Faculty know they won’t be supported by their administrators; “Schlosser” cites adjuncts who did not have contracts renewed because of lily-livered administrators, students bullying faculty by refusing to hear the sort of challenging subject matter that should be the job of higher education…and on and on. One of the money quotes: “I once saw an adjunct not get his contract renewed after students complained that he exposed them to ‘offensive’ texts written by Edward Said and Mark Twain. His response, that the texts were meant to be a little upsetting, only fueled the students’ ire and sealed his fate.”

Is anything this simple? That anecdote sounds like a horror story one hears at the bar about someone this guy’s brother-in-law knew once. I wonder how Said and Twain were presented, and if “a little upsetting” were indeed the words used. I do know that faculty can be—can be, not by definition are—self-righteous as hell when they want to, ah, challenge student complacency, and deaf to their own tone. Personally, I’ve never known students to revolt because I chose to try to make them uncomfortable. Actually, more often it’s “Here we go. Bellman’s playing the ‘uncomfortable’ card again [*eyeroll].” And I remember that feeling from my own education, high school especially; you know when you’re being not only confronted but “confronted,” and often—for your own purposes—you choose not to play, not to take the bait, not to respond as desired. You’re accused of apathy, but maybe the teacher would like you to cry or wag your tail or provide whatever jack-in-the-box response s/he expects from People Your Age…and you see it coming. Maybe it’s on such a simplistic, black-and-white level that you sense you’re being condescended to more than confronted, maybe you don’t like being manipulated this way. And let us face it, there is a certain manipulative component in such “confrontation”—and maybe you just don’t want to play that stupid game. As a cynical California kid, I resented it for its predictability, though I’m not completely innocent: I once received the student comment, “He encourages us to express his opinions.” ZING!

The Great Unmentioned, here, is that in our pedagogical confrontation of student complacency, we often hide, not least from ourselves, a good dollop of paternalistic authority. I tend to rage against that, myself, and always did, so it’s hard to make the argument that it’s always Good For Them… Where is the dividing line between “trying to get students out of their comfort zones” and “gratuitously supercilious in-class posturing”? How willing are faculty to get out of their comfort zones, usually? As if! Student complacency amounts to a sort of cultural piñata for faculty grumbling, but intellectual complacency and rigidity are certainly not unknown among us ourselves.

*          *          *          *          *          *          *

Now, I have had people object to the content I present. A fairly recent example was when I gave a guest lecture in a Music Education class, “Music and Social Change,” a topic I was given. Well, party time, right? The Depression, Racism, the 60s…I don’t think I even made it up to Vietnam. The presentation seemed successful, with participation and discussion and so on, but I was later told that one of the students complained that I showed a YouTube video accompanying Nina Simone’s version of “Strange Fruit”—the song is about lynching—with actual photos of lynched Blacks in the South. She didn’t feel she should have had to look at it; I should have warned them and given them the opportunity to look away. To be clear: no “situation” arose, and the complaints only got to me much later. All I heard was that the professor in the class and another colleague sat this person down and said now look here, this is the part where you go to college and get your doors blown off, you should be thankful, your education is working as it should, etc. When I heard about this, I first thought “Now look here, I saw Holocaust footage in grade school and in junior high and no one asked how I’d feel about that, by God, I was made to confront…”

Self-righteous bona fides established? Good. I’m no longer quite so sure.

The issue of triggers—ideas or images or stimuli that cause unpredictable reactions in students and for this reason are others’ responsibility to avoid—is a recent concern, and another favorite complaint, with the entitled hypersensitivity of the poor spoiled darlings always implied. Sure, worrying about things like that can be annoying, and I suppose students can use such concepts on no more than a fraidy-cat basis. However, if we look at it from the being-human, rather than the why-don’t-you-thank-me-for-confronting-you, angle: do I want to be the guy confronting people about rape in class when one of the students has actually been raped? Actually, a person who suffered that kind of attack doesn’t need to be lectured about his or her comfort zone by me, thanks, and it would be unforgivably hubristic for me to imagine that I had that right. Would it really be OK to have a Vietnamese boat person in class and rub her nose in The Rime of the Ancient Mariner? How about if I, as a student, am taken out of my own comfort zone by a glib professor insisting that I look at things from a Nazi or White Supremacist point of view? I had distant relatives who died during the Second World War, but others have much worse stories. Is that the professor’s right? Am I an entitled little Scheiß for rejecting the prof’s authority to do so? College professors after WWI and WWII had students who came home deeply scarred—shell-shocked, or, as it’s called now, suffering from PTSD. My suspicion is that faculty would not have self-righteously tried to turn students out of their safe places, so to speak. Look at it from the Japanese point of view! Stop clinging to your tired assumptions! Perhaps Lebensraum was a reasonable idea because the Germans felt threatened!

Previously—perhaps during the Golden Age so mistily recalled by my fellow Baby Boomers—people who came from families where violence and trauma occurred were less likely to be in universities at all, since our institutions were peopled by a more privileged class. The problem wasn’t as acute. Today, we rightly seek to extend educational opportunity to everyone, but that means we share our learning space with more people who have been scarred by literal or figurative combat. I cannot make the case to myself that all the adjustment has to be on one side only.

The Point: there is no one right answer. On one level, students need to be made uncomfortable, yes—less secure in what they think they know, bless them, which is usually what they’ve been told by authority figures (parents, ministers, music teachers…). Faculty need to support the integrity of each other’s efforts, to communicate with each other, to bring the hammer down (one way or another) on students who—for reasons of immaturity—try to play cards they have no business playing. The purpose of tenure is to protect faculty from collegial and student caprice, but non-tenurable adjunct faculty need the same protections from student caprice just as students should be protected from faculty caprice. Administrators need to protect faculty, not their own convenience, etc. etc. But none of this is new: these are the standard concerns and circumstances of higher education. It has never been a perfect system; there are humans involved, and humans can be petty and stupid and self-absorbed and lazy. So the education process is in a sense devoted to getting us all, student and teacher alike, out of those particular comfort zones.

What does not help are concern-troll columns about how much better things were before, or somewhere else, with the usual dire hell-in-a-handbasket predictions. Rather: it’s case by case. We try to educate—to challenge students and move them forward while remembering that justice is always to be tempered with mercy, and that we really haven’t walked in their shoes. Are some students spoiled and entitled? Sure, and we can (he said in a low, even tone) address that as needed. Single parents with emotional scars? They’re there too, folks. Do we feel the same about blithely making them uncomfortable? Say I’m a green Ph.D. student from a privileged background, teaching as an adjunct; how far do I really get to go with people who have not shared my advantages and potentially carry wounds of which I cannot even conceive?

Critical thinking requires that, the moment an opinion is stated, on a blog (such as this one) or, especially, in the New York Times or some other high-profile opinion-making organ, the reader ask what the author’s agenda is, whether he or she has skin in the game, and what this is really about. My suspicion is that if we all—students and faculty alike—had to face real scrutiny, we’d cork up fast. We all need to be called on our crap, all the time.

Posted in Academia, Current Affairs, Education, Politics

You’re soaking in it, part 1

If you’re a certain age (old), you might remember Madge the manicurist, who was forever insulting women about their chapped hands and tricking them into soaking their fingertips in Palmolive dishwashing liquid.

As a kid, I never understood these commercials. They seemed like fairy tales or dreams, where people act out of inscrutable inner compulsions and events move along according to a logic sealed off from daylight awareness. Why did these ladies, unasked, place their hands in dishes of green goo? They always seemed surprised to find themselves in this situation, yet would relax when Madge explained that it was dishwashing detergent. But this was even more mysterious. Why did the ladies decide to go along with it? Why did Madge contrive this whole situation in the first place? And how?

Perhaps there is something in successful commercials that communicates with us on the level of dreams. There is a certain form that everyone encounters in dreams: you find yourself in a situation where odd behavior is expected and you go along with it. You just find yourself doing it. And then, at a certain point, you come to a flash of lucidity in your dream, and you realize that you are doing something extremely strange, something your daylight self would never do, and you have been doing it all along.

[Image: still from The Holy Mountain.]

Wait, I’m eating a face.

And in that flash of recognition, in the dream, you find a complicated reconciliation. There is a fragment of rational awareness that registers the face-eating* as peculiar, but it is a fragment embedded in the overwhelmingly persuasive logic of the dream. I find myself in the midst of eating a face, and that’s unsettling . . . but . . . it’s what I need to be doing now. So you keep eating the face, cuz what else are you gonna do? Not eat the face? That doesn’t make sense. Only when you wake up does your rational understanding overwhelm the logic of the dream and you say to yourself, “wow, what a weird dream.”

This post isn’t really about dreams; it’s about magic, again. I’m picking up a thread from my last post, where I ended up suggesting that there is something in magical styles of thought that humanist thinkers (even the odd musicologist) use all the time, whether they know it or not — usually not. Magic: you’re soaking in it.

I want to make this point clearly, and I’m going to keep making it, because most people assume that magic is the enthusiasm of a tiny and peculiar minority—occultists, pagans, new-agers, acidheads, and people from cultures that educated westerners quietly believe are backward.** Just about every time I tell a fellow academic that I’m working on a book that deals with magic, I feel the same frisson of discomfort. I can practically see the thought balloon emerging from my interlocutor’s head: wait, you can’t write a book about something that doesn’t exist, can you? Well, if magic doesn’t exist, I guess you can write about the kooks who believe in it. But hang on, then, are you writing about those kooks, or are you actually one of them? Just how crazy are you?

[Image: “crazy”]

My response.

My response is always a variant of “you’re soaking in it.” We are all doing magic, or at least engaging in magical thinking, all the time. And this is not necessarily a bad thing.

Imagine the following scenario. A woman (let’s call her Ivy) has a furious argument with her boyfriend and goes out for a walk to think some things over. She’s been with this guy for a long time and has a lot invested in the relationship. Only now she’s asking if it’s a real investment or something more like the sunk-costs fallacy you get into with an old lemon of a car — “I can’t get rid of it now, after all the money I’ve put into it!” She thinks back over the argument, thinks about its long, inconclusive prehistory, the cankered bickering over the same damn thing, over and over, whatever that might be: something big and immovable, like money, career, sex, commitment, or kids. She asks whether he will ever really change and whether she can ask him to, what the nature of their relationship is now, was it really that good to start with, does she actually love him or is he a kind of bad habit . . .  that sort of thing.

Let’s imagine the scene in more particular detail. It’s a cheerless winter afternoon, shading into dusk. As the sun gets lower in the sky, the gray overcast of the afternoon starts breaking up into streaks of cloud and blue sky. Ivy looks up from her preoccupied, huddled passage through the streets and is suddenly arrested by the beauty of the sky. Her heart lifts then. She stops and feels a pervading calm, a centeredness within the totality of her surroundings. She feels herself standing at the axis of the big sky full of broken light and clouds. She feels her roots in the ground. She truly experiences this moment in which she is undecided and yet full of potential. She feels connected to everything.

A flight of crows rises noisily into the air. Their collective movement wavers between the direction from which Ivy came, back towards the apartment where her boyfriend is waiting on her return, and the direction she’s walking now, away from home and certainty. She suddenly feels sure that the crows’ flight is telling her something: the direction they take has the same meaning as the direction of her relationship. If they fly homewards, then she too will go home. If they fly away from home, then she too will fly away. If they just keep circling around, taking no particular direction, then that would seem to be a dismally accurate commentary on the current inconclusive state of her relationship. But no matter what, in this charged moment of heightened and centered awareness, when all things seem pregnant with meaning and the world itself seems intent on telling her something about her life, the flight of crows becomes a sign.***

And lo, at that moment the sun comes fully out from behind the broken bank of clouds, and the crows wheel around and fly straight towards it — away from her home, away from her boyfriend, away from her old life, and into the light. And Ivy knows, beyond any doubt, that it’s over. She goes home, tells her boyfriend she loves him (which she does, she now realizes, only she also realizes that it doesn’t change anything), and packs her stuff. “But why?” he asks. Chances are, she’s not going to tell him it was because of the crows. She probably isn’t going to tell her friends about the crows either, when they ask her about her breakup. And as time goes on and her moment of clarity becomes muddied (as all such moments do) by the complications of life, she probably won’t even tell herself. In the story she tells of her own life, the crows will be forgotten and only the rational deliberations behind her decision will remain.

Unless, of course, she decides that magical thinking isn’t totally worthless after all. In which case, she trusts her experiences more than what any authority will tell her about them. But for the most part (and here I must defer full explanation to the next part of this post) we don’t trust our own experiences. Weird shit happens to us all the time, but if we can’t find an explanation for it that fits our education and cultural norms, we file it away in a mental folder marked “awkward/miscellaneous” and never look at it again.

Now, maybe the exact story I told hasn’t happened to you, but something like it probably has. The recently bereaved, for example, very often have a similar kind of experience, where something strange and striking in their environment seems to offer not only a reminder of the lost loved one but a tangible presence — “I felt that she was there with me once more.” Here is an example of such an experience, a short essay in which Michael Shermer, a professional skeptic of the sort I usually make rude comments about, has a genuinely weird experience and doesn’t simply dismiss it or assimilate it to rational explanation. “I savored the experience more than the explanation,” he writes. Not a bad definition of how magical thinking works.

I want to think more about this piece, and its author’s experience, in my next installment. But in the meantime, I want you to consider that, at such strange moments, we experience something like the dissonance of consciousness that I have argued belongs to certain dreams, where something of waking consciousness penetrates dream consciousness and makes itself uncannily felt. Moments of magical thinking present us with the same dissonance, but in reverse: those moments of waking life where we find ourselves thinking magically are moments in which dream consciousness invades the daylight realm.

Does this mean that they don’t matter? Well, they matter insofar as dreams do. But then, how do dreams matter?

*This is an image from Alejandro Jodorowsky’s surreal film The Holy Mountain.

**You will not get your standard American academic to admit s/he believes that any culture is superior to any other, not even under torture. But, dear reader, ask yourself this. If you don’t think that Western culture is better than any other and if you also think magic is bullshit, then how do you square your cultural relativism with the fact that there are lots of places in the world (especially in what used to be called the Third World) where people continue to make offerings to household gods, make decisions with the aid of divination, lay curses on enemies, or believe themselves to be cursed? Most people don’t think about it, or if they do, they quietly file the resulting cognitive dissonance into the mental folder marked “awkward/miscellaneous.” But it is this very question that lay at the center of the debate between Lucien Lévy-Bruhl and E. E. Evans-Pritchard, a debate that shaped the modern discipline of anthropology. Anthropologists have thought long and hard on this problem and have thereby come to a more sophisticated understanding of magic. But for the most part the rest of western intellectual culture has not.

***By the way, the word “augury” originally meant the practice of divination by watching the flight of birds. Dale Pendell’s short poetic treatise on divination is called The Language of Birds, appropriately, and is full of lovely, obscure words for various forms of divination: selenosciamancy (divination by the shadows of moonlight through trees), cledonomancy (divination by hearing chance words), philematomancy (divination by kissing), margaritomancy (divination by heating and roasting pearls), omphalomancy (divination by counting knots on an umbilical cord). Basically, if it manifests in your experience and you can’t control or predict its behavior, you can use it for divination.

Posted in High Weirdness, Magic