The Tattooed Academic

I’ve been in the United States now for 26 years, going on 27. That span of time has seen a lot of changes, large and small. A large one: a wholesale revision of mainstream American attitudes towards LGBTQ civil rights. A small one: when I got here the only beer you could buy was the style I think of as “lawnmower lager.” On a hot day after vigorous yard work, served as near as possible to frozen, it tastes great. Under most other circumstances, it disappoints. And that was the only beer you could get, unless you bought an import. And then Pete’s Wicked Ale appeared, and that seemed to kick off a domestic craft brewing revolution that means you can now find esoterica like Belgian framboise lambic pretty much anywhere. And the same kind of change has been enacted more generally in American cuisine, to the point now that we’ve started rediscovering the humble virtues of the food and drink we’ve spent the last couple of decades running away from. Pabst Blue Ribbon has gone from being what your Dad drinks to being an arch joke to being what you drink ironically to being what you drink with a veneration for American vernacular food culture.

And speaking of Frank Booth, how about tattoos? When I came to the U.S. in 1987, tattoos were associated — in my mind, at least, and in the minds of nervous bougies like me — with the kind of scary lowlife who populates that iconic moment of Blue Velvet. I didn’t know anyone who had a tattoo, or at least anyone who showed it off, until around 1990 or so, when I was slightly shocked (but also curious and sort of psyched) to see a cellist I worked with roll up his sleeve and show off a simple, unstylized outline of a Hill cello he had tattooed on his right bicep. This was a surprise — not only that someone I had filed in one category (non-tattoo) was actually in the other, but that he was sporting a tattoo that wasn’t about being badass or threatening. It didn’t have skulls and flames and snakes and what-all; it was a monument only to my friend’s enthusiasm for Hill cellos.

In retrospect, this innocently geeky tatt was a sign of things to come. One small but telling thing that has changed completely in the last quarter-century has been the mainstreaming of tattoos. I think I first really noticed this when I moved to Austin about a decade ago. When I would take my kids to the Barton Springs Pool, I would see other young parents piling out of minivans and poking Cheerios into the mouths of fretful babies and doing normal parental things, but sleeved up and sometimes with just about every visible square inch of skin below the neck covered in tatts. And it started to occur to me then that the social meaning of tattoos had changed. There were still lots of situations where you would get the stinkeye for having one — obviously, since there still are now — and to be sure a lot of the tattoos appearing incongruously on the bodies of these solidly domesticated and vaguely middle-aged Austinites were still of the skulls-snakes-flames variety, but still, something had changed. With the new phenomenon of the cool geek came the phenomenon of the geek tattoo. And that’s where we are now.

But in the early 1990s, tattoos were generally assumed to be the domain of bikers, barflies, vets, and rockers, and this assumption informs David Foster Wallace’s Infinite Jest. Much of Infinite Jest takes place in a drug recovery center, where addicts from various backgrounds deal with the fallout from the bad choices they made during their years enslaved by various Substances. And in one passage, Wallace makes the tattoo a metonym for all those bad, act-in-haste-repent-in-leisure choices:

Because the whole thing about tattoos is that they’re permanent, of course, irrevocable once gotten–which of course the irrevocability of a tattoo is what jacks up the adrenaline of the intoxicated decision to sit down in the chair and actually get it (the tattoo)–but the chilling thing about the intoxication is that it seems to make you consider only the adrenaline of the moment itself, not the irrevocability that produces the adrenaline. It’s like the intoxication keeps your tattoo-type person from being able to project his imagination past the adrenaline of the impulse and even consider the permanent consequences that are producing the buzz of excitement.

Now, this passage is told from the perspective of Tiny Ewell, an alcoholic lawyer who is himself untattooed but becomes obsessed with the tattoos of his fellow recovery center residents. So the assumption here that “your tattoo-type person” is by nature impulsive and self-destructive is perhaps only Ewell’s, but I suspect it also reflects Wallace’s own point of view. It was, after all, pretty much the usual opinion of intellectual middle-class types in the mid-1990s.

Not that Wallace doesn’t have a point. The whole thing with tattoos is that they’re permanent, while the motivation to get them (whether impulsive or well-thought-out) almost certainly is not. A recent NYT chin-stroker makes this point, as does your Mom and at least half your friends. From this point of view, the basic reality of tattoos is permanence. This becomes particularly obvious in the comical-sad litany of misbegotten tattoos in Infinite Jest: tattoos of the names of girls who left, enthusiasms that waned, gods that failed, and mad whims that sometimes cannot even be recalled. So you’re stuck with a swastika on your chest or something possibly even scarier, and what are you gonna do? DFW takes several pages to consider the question, but it all pretty much comes down to resignation and covering up. And yes, I wouldn’t want to be in this situation, and probably you wouldn’t either.

(Though even in the roughest cases there is some hope.)

But. It seems to me that it is at least as likely that what tattoos really signify is not permanence but impermanence. When I look back over my life, I can find very few things I have consistently loved, believed in, or stood for. I can hardly think of a single thing I felt strongly about in 1990 that I would care to have on my skin now. As much as people go on about their identity as Christians, Trekkies, vegans, or whatever, I always feel that such talk usually makes a false assumption about the stability of any human identity. So given this, my first question would be, what kind of tattoo would be impermanence-proof? I used to think that this would be the really interesting thing about tattoos: they challenge you with a really tough question. But since I can never think of a foolproof answer, I’m not so sure. So now I’m thinking the real question re. tattoos is, how do you square yourself with the impermanence of your identity? From this point of view, the very fact that we ask of tattoos what they likely can’t deliver is the most interesting thing about them. Seen this way, a tattoo is like a stick in a stream. Sometimes the water looks still — maybe you can’t tell if it’s running or not — but if you poke a stick into it you can tell it’s moving. Maybe the value of a tattoo is in part that you probably will grow past the point where it made sense to get it. The value of the image is less what it says about who you really are than how it expresses some past version of yourself. The tattoo’s intrinsic aesthetic value (if any) is compounded with something subtler — a kind of curatorship of the self. Maybe the likelihood of obsolescence is a feature, not a bug.

Or whatever. So theorizes someone who (like Tiny Ewell) doesn’t have any tattoos himself but just bugs his friends to show him theirs. A few fellow academics have been kind enough to share photos and descriptions of their tatts, all of which are gloriously cool and original and seem as likely as any to stand the test of time. The one that got me thinking of this whole line of thought belongs to Brad Osborn, a music theorist at KU whose tatt will be immediately identifiable to music theory nerds:

[image: Brad Osborn’s tatt]

Robin James, a philosopher and sound artist at UNC Charlotte, has a full sleeve:

[image: Robin James’s sleeve]

Robin has a thoughtful approach to conceiving and planning a tattoo, and to me it really paid off. She writes,

I just wanted a visually interesting tattoo–something more “design-y” than representational, something that looked good from a distance (you can still see the pattern/design). The two images are Sputnik 1 & Voyager 1. I picked those because I’m a longtime scifi/space opera fan, so that sort of imagery is really appealing to me, as is the mid-century design of those ships. I guess I also picked them b/c they’re historically significant, but mainly I picked them b/c they look good on my arm.

Phil Gentry, a musicology prof. at U. of Delaware and longtime musicoloblogger probably well-known to most of the Dial M readership, has a tattoo of a score by John Cage:

[image: Phil Gentry’s Cage tatt]

Phil writes,

My standard joke is that I got it so that when other John Cage scholars challenge me, I can whip it out and question their commitment to the subject. In reality, I got it in graduate school for sappy sentimental reasons–it’s one of Cage’s “62 mesostics re: merce cunningham,” which is to say, closest thing to a love song Cage wrote.

Those are the academic tattoos that colleagues have shared with me, but in the interests of fueling my creepy obsession with other people’s skin art, I strongly encourage my tattooed readers to send me pictures of their own. It’s a new reader challenge! Haven’t had one of those since like 2008 or something. Drop me a note at my uni email: fordp at indiana dot edu

A few more geeky academic/artsy/intellectual tatts before you go. One I wish I’d thought of is Jonathan Lethem’s tattoo of the Ubik aerosol can from the cover of one of Philip K. Dick’s trippiest and best novels:

[image: Jonathan Lethem’s Ubik tattoo]

You can’t really see it; this is the design:

[image: the Ubik cover design]

Another writer, Carey Harrison, has the entire first page of Theodor Adorno’s Minima Moralia tattooed on his back:

[image: Carey Harrison’s Adorno tattoo]

Librarians, for some reason, seem particularly prone to getting tattoos, and as librarians tend to be awesome, so too are their tatts. My favorite is a card from some alternative-universe tarot deck — Arcanum XXII, “The Librarian”:

[image: “The Librarian” tarot tatt]

If you do a Google image search of “music tattoos,” you get a pretty mixed bag. A lot of music tatts land in what my wife calls “piano scarf” territory. (You know, like those music-novelty gifts bedecked with treble clefs and noteheads that your aunt gives you for your birthday because she knows you’re a musician.) But as Phil’s tatt shows, avant-garde graphic notation offers real scope for a distinctive and visually interesting design. Here’s one last tatt — a score to Brian Eno’s Music for Airports:

[image: Music for Airports score tattoo]

I don’t think I’ve exhausted this topic yet.


End-of-year death spiral

We have now entered the “death spiral” stage of the academic year.

[image: scream]



Roads Not Traveled

One of the perquisites of the dozing, decrepit professorial alte kocker (er, maturing academic) consists of invitations to visit other institutions, perhaps to participate in a colloquium or give a lecture. Maybe the subject of your lecture is within your research area; maybe it’s on a subject they chose and they’re just trusting you to come up with something. It may not be just the lecture; perhaps they also think it’d be a good idea for their students to meet you and spend time with you; you have one or a dozen friends there, and they know you well enough to know that whatever you are, it isn’t aloof.

I’m sitting in Indianapolis Airport following the second of these this semester, and the last of my big extras (others included chairing yet another faculty search, which is punishing but hugely important work, and doing concert commentary and fetch-and-carry for a local Bach St. Matthew Passion event). Both lectures were pretty stressful to produce—national papers are a half-hour in length, but these were 45 minutes to an hour. As they were at institutions very unlike my own, they provided opportunities for reflection: what would professional life have been like at a different kind of school? On the job market, one applies (well, I sure as hell applied) everywhere: state undergraduate institutions, liberal arts colleges, regional state universities (which is where I ended up), research-extensive flagships and Ivies and Ivy-equivalents. Who can tell? One is confident enough of one’s own research brilliance, teaching brilliance, everything brilliance (particularly if one has a family to support) that it’s all about finding a door—any door—to kick open. Of course, the more experience you gain, the more you realize (I just had this conversation with a friend) how important The Fit is, not just of them to you, young’un, but of you to them. So maybe you’re not really Jedi Academy material, or appropriate for the Bucolic University of the Rolling Verdant Hills, or ready to be Chair of Music at Aristotle’s Lyceum.

Spoiler alert: I’m at the right place, which features hardworking and often terrific students, frighteningly harmonious and collaborative and mutually supportive faculty, miraculous accomplishments given what we’ll call problematic funding streams, and buckets of intangibles.

But what would it be like at a liberal arts school? In mid-February I gave a lecture at Drake University in Des Moines, IA. This is a liberal arts college—no graduate degree programs in music, so no TAs, all courses taught by “real” faculty, lots of extra attention, students who tend to make and take opportunities for things like writing for the newspaper, experimenting with this and that paraprofessional activity, and who tend to be lucid, confident, funny, and socially assured. The university had a kind of ongoing Event that was spearheaded by a Religious Studies professor and, y’know, willing henchmen from other departments—this year’s subject was “Ineffability,” and a friend there put my name forward as someone delusional enough to think he had something to say about it (er, someone who might take on the issue). So it was the sort of event that we really don’t do at my place: funding was found, I gave a lecture well outside my comfort zone (in a local church, noch) to a mixed multitude of students and faculty and the curious, met people, fielded questions, spent time with students (learning of their interests, dispositions, how much they enjoyed their school, etc.). They clearly benefited from the faculty attention they received, and to a large extent repaid it with effort.

I think I could have been a faculty member in such a situation. Actually, I was—my first two years were two Visiting Assistant Professor positions at Bucolic University. It was a great experience. And that wasn’t the kind of place where I ultimately landed.

So I’m getting ready to wing my way home from Indiana University—Bloomington, home base to Phil-im-Bart Ford (in Tübingen, Germany, they refer to one of the founding figures of their city as “Eberhard-im-Bart,” E-with-the-huge-beard) and several other friends. I met a batch of really bright, lively grad students (loads of crazy-interesting research ideas), spent time with friends (including a great 2+-hour chat with Phil), and hit ’em with the earliest version of something I hope to include in the book I’ll be writing this fall. The takeaway: really good, helpful, penetrating questions and comments from faculty, from students, and from faculty friends afterwards. Taking out your recital pieces for friends before a performance, you know? And did I mention the food and drink?

I could, I think, have taught at such a place also. Fantastic library, lots of resources for bringing people in and looking after grad students, really good esprit de corps among the musicologists (which translates into mentoring the hell out of their students). I didn’t really land at a place like this, either.

Soon I’ll have rough drafts and stuff to read at home, but no more travel is planned until the summer. And I ended up at a place with aspects of both of the above kinds of institutions, and where efforts are made to customize situations and duties to faculty. As I say, there are also those countless intangibles.

From the faculty perspective, from the student perspective, these are all ideal places, depending on one’s personal disposition. There is an absolute necessity for all such. This is a key point because we still constantly encounter the looking-over-one’s-shoulder addiction to “prestige” (no way for those quotes to be scare-enough or ironic-enough), and the idea of prestige really is crap—abject crap. You’re too good to teach X students? You simply can’t do any kind of research or creative activity without loads of supplemental funding and TLC? Please. Sappy conclusion: grow where you’re planted, and you may well find that although you never saw it coming, you were planted in a situation you were put on earth to be in. “’Tis a gift to come down where you ought to be,” sang the Shakers, and it is a glistening, resonant Truth.


Hum me a few bars

A stupid joke for April Fools’ Day, in honor of my Dad, who knew a million of ’em:

A guy walks into a bar and orders a beer. There’s a piano player entertaining the clientele, and his gimmick is, he has a monkey that walks down the bar collecting tips. The guy with the beer doesn’t give him anything, so the monkey keeps coming back and trying to get a tip. Finally the guy (a cheapskate and a mean drunk besides) smacks the monkey, which retaliates by jumping up on the guy’s beer glass and lowering his genitals into the guy’s beer. The guy angrily strides over to the piano player and yells “hey, do you know your monkey dipped his balls in my beer?” And the piano player responds, “No, but if you hum me a few bars I can fake it.”

Well, maybe that’s not the best joke in the world, but (1) it’s music-related, and this blog is after all about “music and related matters,” and (2) it’s a good example of what you can do in an academic blog that you can’t do in an academic publication. That’s something I like about blogging: I get to do whatever the hell I want to do. Think I’m being self-indulgent? Insufficiently music-related? Reinscribing phallogocentrism?

[image]

And I don’t have to. That’s the best part.

In academic essays and monographs, you have to care, pretty much by definition. Caring what other people think of you is the engine that drives academic writing. And yet it’s more complicated than that, too, because you also have to be original. The academic-humanities biz is founded on a contradiction between its demand for novelty and the mechanism by which novelty is judged.

On the one hand, newness is the coin of the academic realm. The final test of all Ph.D. graduate students, the dissertation, demands that they say something that no-one has ever said before: maybe it’s a study of a previously unstudied manuscript, broadcast, genre, idea, etc., or maybe it’s a new way of looking at something already well-known, but either way, it has to be new. In this way, the dissertation is the model for all forms of academic communication. Every piece of academic writing takes place in a virtual space of conversation, a noösphere, for which the price of admission is saying something new. And while there are greater and lesser forms of newness, it is still nonetheless true that new newness, real originality, is what wins you the biggest rewards. To do what Edward Said did when he wrote Orientalism — to give fellow scholars a new vocabulary to describe the world and even a new way of seeing it, and to change the game so profoundly that for generations no-one will be able to enter your part of the noösphere without invoking your ideas, even if they don’t like them — this is the greatest glory to which an academic can aspire.

But: how do you know what’s new? How do you know if something represents a useful or relevant kind of newness? The only way to tell is by showing it to experts in the field — that is, through peer review. And how will your peers judge your work? Inevitably, in terms of what’s already been done, what is already known. If you start off an academic essay with the monkey joke, even if you have a good reason for doing so,* you’ve got some ‘splainin’ to do.

[image]

I’m pretty sure every published academic has worn that look at one time or another. You try something in a piece of academic writing that you know is a little out, but you want to see if you can get away with it, and then you get back comments from peer reviewers. Shit, they didn’t go for it. “Revise and resubmit.” You lose the monkey joke. Your writing gets more normal-looking, less personal, more like everyone else. You have to be original, but it has to be a kind of domesticated originality. Be original, but be original in a recognizable and respectable way. All academic writing is impaled on the horns of this dilemma.

The last time I wrote, I ended up saying that a lot of the books I’ve assigned in my “current readings in cultural studies” class seem to be exercises in rattling the bars of the cage — that is, they seem to show that their authors have become aware of some taboo (often unspoken) in academic writing. They reach the point where any well-socialized academic knows that s/he will be going out, walking on thin ice, getting into the realm of can-I-get-away-with-this — in short, they discover for themselves the limits of the cage that has been fashioned for them by the peer-review structure of academic knowledge production. And they begin to test the bars of that cage, sometimes one at a time, sometimes banging away on a bunch of them at the same time.

I define “cultural studies” very loosely: in my syllabus, I write that “cultural studies” is more an idiom than a discipline, methodology, or genre. The term denotes a motley assortment of scholarly topics and approaches that have little in common save a kind of lingua franca that lets their parent disciplines talk to one another. A lingua franca is a border language that allows neighboring peoples to trade with one another; this particular lingua franca is marked by its oppositional temper, its interpretive templates of race, class, and gender, and its “theoretical” (i.e. sort of like philosophy but not exactly) style of thought. But it seems to me that as we get further and further away from the heroic era of “high theory” that gave us the basic vocabulary and race-class-gender problematics of present-day cultural studies, that identity is shifting somewhat. It now seems to me that “cultural studies” is (or at least can be) that area of the academic humanities in which every publication can be an opportunity to test the bars of the cage.

Here is one such bar: obviously shitty writing. Let’s face it, a lot of the time, reading academic writing feels like chewing through cinderblocks, and this can be especially true of cult-studs writing. I’m not talking about difficult or obscure writing; I’m talking about a fugliness born of a utilitarianism that treats words as a mere delivery device for “content.” As if “style” and “content” were separable; as if beauty of language must always be subordinate to message, Expression the poor relation of Idea.

Joshua Clover’s 1989: Bob Dylan Didn’t Have This to Sing About bills itself as an exercise in “lyrical theory,” which is to say, it is a difficult and sometimes obscure work in the Adorno-Benjamin mold, but it is also the work of a working poet who can use lyrical language to do the work of theory. The sometimes surprising beauty of its sentences introduces a vector of complexity that is not simply assimilated to “content.” Even if you aren’t a true-believer Adornian (and I’m not) — even if you think everything Clover says is bullshit — the performance of the text is valuable in its own right. In this way it’s a lot like Wayne Koestenbaum’s The Queen’s Throat, which is likewise the work of a poet.

But most humanities academics take it for granted that while good writing will improve their chances of getting published, it’s not anywhere near the main point of the exercise. The idea that one’s writerly expression might be part of the intellectual substance of one’s submitted work — as if the proper qualification for writing were not a Ph.D. but an MFA — is alien and even upsetting. And yet there are some ideas that simply cannot be approached without lyrical virtuosity; I’ve often felt that some aspects of musical performance (saying what something actually sounds like) call for these kinds of skills. The fact that academics don’t usually see those skills as something that can reasonably be demanded of us means that the ideas that call for those skills will remain unexplored. A taboo; a bar of the cage.

I have a bunch more bars to hum for you, but I have to run off and teach . . . .

*See my essay, “We’re in the Monkey: Mediated Simian Presence in the Interwar American Film Musical,” forthcoming in the Journal of the American Musicological Society 67, no. 2 (2014).**

**April Fools.


Bars of the Cage II: For Amusement Only

So are academics condemned to go around clutching their brows in existential agony? No. For the most part, we don’t. Sure, there are some people who really commit to seeing the world as the so-called world, everything in quotation marks, all perspectives ironized. Those people are often the ones who never finish their dissertation and seem somehow damaged by their time in graduate school, like this guy from the comic universe of The Onion. Poor guy can’t even order Mexican take-out without

analyzing the menu’s content as a text, or ‘text,’ subjecting it to a rigorous critical reevaluation informed by Derrida, De Man, etc., as a construct, or ‘construct,’ made up of multi-varied and, in fact, often self-contradictory messages, or ‘meanings,’ derived from the cultural signifiers evoked by the menu, or ‘menu,’ and the resultant assumptions within not only the mind of the menu’s ‘authors’ and ‘readers,’ but also within the larger context of our current postmodern media environment. Man, I’ve got to finish my dissertation before I end up in a rubber room.

According to the article, this graduate student is currently “considering taking a leave of absence from his graduate studies to spend several months living in his mother’s basement in Elmira, NY.”

No, you can’t commit to that way of looking at the world, because it will make you a little crazy. Sorry, I’m going to indulge in a little autobiography here: when I was in graduate school, I was that guy. I went slightly crazy in exactly this way. I didn’t move into my Mom’s basement, but I did take a couple of years off from graduate school (circa 1997-1998) and learn how to be a normal(-ish) member of society with a normal(-ish) job that didn’t require me to unpack the Lacanian implications of anything.

[image]

Then, within four months, my Dad died and my son was born, and this gave me furiously to think. Life is short; better make it count. What was a worthy thing to do with my life? I knew I still wanted to be a scholar, and what interested me, as a scholar, were the intellectual problems that had occupied me before dropping out, but viewed from a different angle. When I returned to my program (tanned, rested, and ready) to write my dissertation, I had learned how to use the ironizing worldview in a more strictly instrumental way, as a tool I could wield against ideas I wished to cut down to size. These ideas were the same ones that sent me spinning off into the aether earlier in graduate school—the mental habits of the theory-driven, abstraction-drunk whiz kid. I came to see this cognitive pattern as just one possible choice among many, and as such something that has a history, a beginning, a middle, and an end. I could see the roots of this cognitive style in postwar hip culture and, further back, in the avant-gardes of the early 20th century. I was no longer beholden to this worldview, but could give it a taste of its own medicine and ironize the tar out of it. In this way, I was free. At the head of my dissertation I quoted Nietzsche: “I have attempted to describe a feeling that has frequently tormented me; I take my revenge on it by making it public.”* Kind of pompous, sure, but that’s how I felt. I had realized I couldn’t solve the conundrums enacted by the endless self-reflexivities of theory, but I could justify not having to care more about them than I wanted to. To me, this felt like a big deal, coming out of the late-nineties cultural-theory boom market. (Keep in mind, the University of Minnesota, especially the CSDS program on the East Bank, was a place of huge intellectual energy around the concerns of the “high theory” age.) It was my own little declaration of intellectual independence. On these terms, I could finish my dissertation with equanimity.

To be honest, my dissertation turned out to be a bit of a mess — my reach exceeded my grasp, and I really didn’t yet have a handle on the ideas I was trying to understand — but some of its concerns have carried over into the book I ended up writing. The main thing they have in common is that trick of ironizing the ironists, getting hip to the hipsters. And please note, I still think I’m right! I’m not taking back anything. I couldn’t have gotten a handle on the ideas I wrote about without ironizing them. If I hadn’t, I would have been left with the usual binary, are-you-for-it-or-are-you-against-it option: write yet another sentimental ode to countercultural transgression (the liberal academic option), or write a grumpy screed about how all this transgression is ruining America (the conservative think-tank option). Historicism was a way out of another bind.

But lately I’ve been having second thoughts about historicism. Because it seems to me that those of us in the irony game tend to manage the toxic, crazy-making aspects of this worldview by treating them in exactly the same limited, ends-means, instrumentalizing kind of way I learned to do when I decided to return to my program and write my dissertation. In self-defense, we learn to reserve our critical selves for academic work. Once we learn the trick, we become historical ironists, deconstructive skeptics, putters-of-things-in-scare-quotes, but (as the sign on the pinball machine says) for amusement only.

[image]

It’s not like you actually live that way. To vary a formula from Marx, we teach in the morning, deconstruct in the afternoon (or whenever it is we’re writing our professional publications), and meet social obligations in the evening. But the person you are when you go to your daughter’s violin recital is not the person who’s deconstructing stuff. You’re listening to her play the Bach double concerto, not sitting there stewing about the reified social forms of quote-unquote art music. Or if you are, you will not have a very good time and will probably end up spoiling it for everyone. You’d better learn to compartmentalize, because being full-on ironic and relativizing all the time will mess with your life.

But then . . . if you treat the ironic and historicizing perspective as a tool and nothing more, something you do for professional purposes and then set aside when you’re not working, then your academic writing comes to seem like a game, not how you actually see the world. You’re just a tourist in Ironyville. The high mission of the intellectual is surely to see things in an original way and to enlighten others by showing them the world in the brighter and truer colors revealed by that unique perception. But what does it mean if you’re only seeing things in that way because it’s comme il faut, a convention, something you do to maintain your professional poise? You discover a loss of integrity between your intellectual work and your life. You find you’re a long way down from the “high seriousness” of someone like Adorno, for whom “it is part of morality not to be at home in one’s home.”

For the contemporary scholar in the critical humanities, Adorno’s critical attitude of homelessness degenerates into a mere convention of that highly specialized literary genre, the academic monograph. A critical stance becomes one of those things which, if missing, would probably occasion some suspicious questioning from the anonymous academic reviewers who adjudicate the merit of your work and decide whether and where your stuff gets published. Without a patina of historicist irony, you come off as “uncritical,” and no-one wants that.

But on some level this must be unsatisfying. As humanists, nothing human is alien to us, at least ideally. So how is it we find ourselves in the grip of a taboo — a taboo on the uncritical, the unironic, the unhistoricized? Such a taboo, usually unspoken and unavowed, is what I call a bar in the cage.

As writers plying a minor literary genre, we become aware of the conventions that bind us and limit our movement. That’s one thing our training allows us to do, after all: notice when something limits our free action, notice when something isn’t being said. And if you’ve been trained that way, once you become aware of a limit on your action, you start to feel suffocated. If you are keeping it real, as a humanist and a free, thinking person, you are going to insist on saying the thing that no-one is saying.

Of course, sometimes keeping it real can go wrong.

Right now I’m teaching M603, my “current readings in cultural studies” seminar. We’re reading mostly recent academic monographs, along with a few articles. For me right now, monographs are more interesting to teach than single essays, because a monograph is a larger canvas on which to spread your ideas and also offers you fewer places to hide. You can tell what writers hold dear, what they want to protect, what they want to find a place for. And sometimes that means rubbing up against the tacit conventions and rules, maybe deciding that it’s time to keep it real and do something that people are going to think is weird. You rattle the bars of the cage, or even try to break the bars. It seems to me that a lot of recent cultural-studies(ish) monographs on music are exercises in rattling the bars of the cage.

Next time: some examples of what I mean by that last sentence.

*Friedrich Nietzsche, “History in the Service and Disservice of Life,” trans. Gary Brown, in Unmodern Observations, ed. William Arrowsmith (New Haven: Yale University Press, 1990), 87.


Bars of the Cage I: quote-unquote “Irony”

So hey, I’m back. Long absences are just gonna happen. Sometimes it might be long absences punctuated by flurries of posting, when I think I have something to say.

So, irony. Or “irony.” See what I did there? When you put things in “scare quotes,” you do something to those words. There is some subtle shift in meaning, not so much a shift between entirely different ideas as the same idea seen from two different perspectives.

If I say, I saw Jessica’s new boyfriend, you will understand me to be saying that I saw Jessica’s new boyfriend. His name is Steve. He’s into golf. OK but now with the magical interposition of scare quotes, I can say that I saw Jessica’s new “boyfriend.”

[image: Dr. Evil making air quotes]

Nudge nudge, wink wink. The matter at hand is unchanged — there is still Jessica, and there is still Steve who likes golf — but I am seeing them from a different angle. “Boyfriend” is what Jessica calls Steve. It’s what she’s telling her friends and probably what she’s telling herself. But is Steve really a boyfriend or just some dude she’s hooking up with? Whatever Jessica might be telling people, I remain agnostic. Steve may or may not really be Jessica’s boyfriend; all I can say is that that’s what she’s calling him. Steve is Jessica’s so-called boyfriend. In German, the word is sogenannte, which (as I heard one German professor remark) is a word loaded with verbal aggression of a peculiarly academic sort.

To view the world as occupied by nothing but things that are only ever so-called — indeed, perhaps, to view the world itself as merely sogenannte, “the world” rather than the world — to see things this way is to see them ironically. I like Kenneth Burke’s definition of irony as a perspective of perspectives. In irony, at least two perspectives are always at play: 1. the ordinary, taken-for-granted perspective of those who are participating in a social situation, where Steve is Jessica’s boyfriend; and 2. another, more abstract, more meta perspective, where Steve is Jessica’s “boyfriend.” Perspective no. 2 brackets perspective no. 1; scare quotes simply act out this bracketing in typography. Perspective 1, seen from Perspective 2, is just something some people think, probably wrongly; Perspective 1 is the contingent and limited perspective of those who have some skin in whatever game they are playing.

Perspective 2 — irony — is the perspective that does not believe, does not take sides, has no skin in the game, and fancies itself distinct from what it beholds. (Or at least it thinks itself merely in but not of the social world it observes.) So when one sees things ironically, one sees Perspective 1 or indeed any number of contending worldviews within the more abstract frame of Perspective 2; one has a high and remote vantage point from which to view various perspectives fighting it out in the world. Thus Burke’s definition of irony as a perspective of perspectives.

In my book, I argue that hipness is a sensibility (or what cultural-studies types most often call, after Raymond Williams, a structure of feeling) marked above all by this ironic, bracketing style of perception. As a scholar, my strategy was to bracket hipness itself, to turn the weapons of irony on the ironists. Early on I write, If hip culture offers us a good deal of delusion and posturing to go with its great works of imagination, my aim has been to understand a little better, without sentimentality or anger, the roots of its destructive illusions and profitable conceits alike. Or as I often say to people who ask me about my book, I’m not in the business of saying what’s hip, just what people at certain places and times thought was hip, and why.

And in this respect I’m not at all unusual. This is what humanities scholars usually do. But in scholarship we tend to think of it as historicism, and humanities scholarship, especially when it deals with ideas that we moderns tend to think are weird, almost always falls back on some kind of historicism. A Medievalist, for example, might want to ask whether people in the middle ages really believed in the miracles that were so much a part of their life and collective imagination. A historicist would say, in effect, that people thought a lot of crazy shit back then. Or, as Steven Justice puts it (a bit more elegantly):

By a simple device, the miracle story, which earlier historians had blushed at … become [for later cultural historians] a nearly bottomless resource of metaphor and metonym, first for discovering demotic consensus and then for unmasking its hidden coercions. All you had to do was shift attention from the truth of a miracle story, or your source’s investment in that truth, to its meaning. [Steven Justice, "Did the Middle Ages Believe in Their Miracles?," Representations 103, no. 1 (2008), 2.]

This gets at something important: by marking truth as unknowable or at least unknown, the historicist method gives up on truth and settles for meaning. Did the saint really curse the laundry girls so that their hair immediately fell out? I ain’t saying he did and I ain’t saying he didn’t; I’m only interested in what the story meant to Medieval people. In the writings of historicist Medievalists, accounts of miracles became “symptomatic fictions laid out for diagnosis, which required neither belief nor disbelief from their authors and audiences, but served as an entrée into their experience and the structures that organized it.” Or, put another way, the only truth we accept is the truth that there are multiple and irreconcilable perspectives, whether on miracles or hipsters or anything else.

In other words, a historicist is not interested in miracles, only “miracles.”

This historicism is second nature for humanistic scholars. It is perhaps the first and most important cognitive shift we undergo when we go to graduate school. As undergrad piano majors or whatever, we learn that Beethoven is a genius. When we go to graduate school in musicology, we learn that Beethoven was a “genius.” One of the reasons that music performance majors and music scholars don’t quite trust each other is that the bracketing and historicist perspective seems snotty and disrespectful in the same way as saying “Steve is Jessica’s ‘boyfriend’” while rolling your eyes and making little finger-twitching air quote gestures. “Why can’t you believe in anything?”, asks the performer. “Why can’t you be more critical?”, asks the musicologist.

To be critical is to take an ironic view of history. The academic prizes her critical intellect above all; it is what wards off the darkness and superstition of the past and frees her from the social coercion that necessarily follows from superstition. Without our ability to understand the “common sense” of our times as just another limited point of view, we would be cheerleaders for the status quo. Indeed, we would have no way to form any idea about our situation; we would only have the ideas given us by our situation.

And indeed it’s not just a few humanities scholars who make a habit of irony. In some accounts (like Louis Sass’s impressive Madness and Modernism) the perspectival abstractions of irony constitute the cognitive signature of modernity itself. The characteristic action of the modern thinker is to strip humanity of its illusions. Darwin revokes our unique and privileged place in creation and reveals us to be a kind of bipedal, mostly hairless ape. Marx suggests that our ideas of beauty and morality are mere adornments and alibis for the brute workings of political economy. Freud leaves us without even the dignity of our will and reason: it turns out that we are puppets on strings pulled by our unconscious drives, which we cannot control and indeed can scarcely even perceive. And Nietzsche tells us God is dead. Whatever we hold true is just something we think because of class bias, sexual neurosis, religious bigotry, or whatever.

What then is true? If you are a Marxist, then the only truth is the history given us by class conflict; if you are a Darwinian, you can only put your faith in the material processes of natural selection; if you are a Freudian, unconscious drives constitute the bedrock of reality. But the characteristically modern anxiety is the fear that even these bedrock monisms are just belated attempts to reassert a capital-T Truth in a situation where no such Truth can be believed. The fury with which Marxists, Freudians, and materialist neo-Darwinians insist on the truth of their chosen -isms hints at bad conscience. Deep down, we might suspect that there’s no real reason to privilege these or any other points of view, just a threshold decision born of secret desperation. In Darwin’s Dangerous Idea, Daniel Dennett writes that Darwinian materialism is like the “universal acid” of comic-book fantasy:

Universal acid is a liquid so corrosive that it will eat through anything! The problem is: what do you keep it in? It dissolves glass bottles and stainless-steel canisters as readily as paper bags. What would happen if you somehow came upon or created a dollop of universal acid? Would the whole planet eventually be destroyed? What would it leave in its wake? After everything had been transformed by its encounter with universal acid, what would the world look like?

What I am suggesting is that it is not Darwin’s or Marx’s or Freud’s or anyone else’s idea that is the universal acid: it is the ironic perspective within which these ideas, and indeed every possible idea, is just another point of view, and once you start seeing the world this way there is no way to stop. Irony just keeps on dissolving everything it touches. Conservatives like to call this “relativism” and blame postmodernism for it, but the postmodernists are late to the party: vertigo in the face of a meaningless plurality of perspectives is the basic condition of modernity. “Postmodernism,” in this context, is just modernism whistling past the graveyard.

The condition of modernity offers an uncomfortable existential situation. You feel yourself cut loose in a world stripped of illusions. You are gnawed by the pervading awareness, or at least the suspicion, that there is no truth anymore, only an endless series of perspectives. You have no place of certainty to rest and begin to suspect that in this superfluity of meaning, of multiple and divergent perspectives, meaning itself has become debased. Everything has a meaning, but nothing is any longer meaningful.

And if you feel this way, how do you get through your life? It’s a lot easier to go to church, attend junior’s school show, go to work, enjoy the Superbowl, etc., when you just kind of accept these things at face value. But what if you can’t anymore? When you drive your kid to her violin lesson, do you think “here I am, participating in a social ritual aimed at enhancing my child’s social capital and thus my own standing in my community?” I hope not, because that’s how depressed people think. When you’re the guy who’s got depression, you become suddenly aware that there’s a big difference between what something is in a bare empirical way and what you can take from it. A child laughing and dancing by the sea can represent youth and joyful vitality, the promise of renewal, the eternal dance of life . . . or you might just be looking at a meat robot lumbering about at the beginning of a long wind-down into entropy.

All of which is to say that the ironizing, historicizing point of view is toxic to human life. Does that mean that humanities professors are wandering around clutching their brows in existential despair? Not exactly. But this is getting long, so I will write a second installment soon. Or soon-ish.


Love is the beginning and love is the end

This one goes out to my baby:

Exactly 20 years ago, on Feb. 2, 1994, Helen and I went on our first date. We got dinner at It’s Greek to Me in Minneapolis and realized (probably before we finished the appetizer) that we never wanted to be apart, ever. So we never have been. What would those 20 years have been like otherwise? Don’t even want to think about it. Can’t picture it, really. Who I am is who I have become because we have been together.

Not sure what the song* has to do with any of this, but the aching sweetness of it somehow reminds me of my wife, and what she means to me. Something to do with “love is the beginning and love is the end.” Something to do with “you cannot separate the part from the whole.”

Love you, babe.

*The studio version is pretty amazing, too.
