So are academics condemned to go around clutching their brows in existential agony? No. For the most part, we don’t. Sure, there are some people who really commit to seeing the world as the so-called world, everything in quotation marks, all perspectives ironized. Those people are often the ones who never finish their dissertation and seem somehow damaged by their time in graduate school, like this guy from the comic universe of The Onion. Poor guy can’t even order Mexican take-out without
analyzing the menu’s content as a text, or ‘text,’ subjecting it to a rigorous critical reevaluation informed by Derrida, De Man, etc., as a construct, or ‘construct,’ made up of multi-varied and, in fact, often self-contradictory messages, or ‘meanings,’ derived from the cultural signifiers evoked by the menu, or ‘menu,’ and the resultant assumptions within not only the mind of the menu’s ‘authors’ and ‘readers,’ but also within the larger context of our current postmodern media environment. Man, I’ve got to finish my dissertation before I end up in a rubber room.
According to the article, this graduate student is currently “considering taking a leave of absence from his graduate studies to spend several months living in his mother’s basement in Elmira, NY.”
No, you can’t commit to that way of looking at the world, because it will make you a little crazy. Sorry, I’m going to indulge in a little autobiography here: when I was in graduate school, I was that guy. I went slightly crazy in exactly this way. I didn’t move into my Mom’s basement, but I did take a couple of years off from graduate school (circa 1997-1998) and learn how to be a normal(-ish) member of society with a normal(-ish) job that didn’t require me to unpack the Lacanian implications of anything.
Then, within four months, my Dad died and my son was born, and this gave me furiously to think. Life is short; better make it count. What was a worthy thing to do with my life? I knew I still wanted to be a scholar, and what interested me, as a scholar, were the intellectual problems that had occupied me before dropping out, but viewed from a different angle. When I returned to my program (tanned, rested, and ready) to write my dissertation, I had learned how to use the ironizing worldview in a more strictly instrumental way, as a tool I could wield against ideas I wished to cut down to size. These ideas were the same ones that had sent me spinning off into the aether earlier in graduate school—the mental habits of the theory-driven, abstraction-drunk whiz kid. I came to see this cognitive pattern as just one possible choice among many, and as such something that has a history, a beginning, middle, and end. I could see the roots of this cognitive style in postwar hip culture and, further back, the avant-gardes of the early 20th century. I was no longer beholden to this worldview, but could give it a taste of its own medicine and ironize the tar out of it. In this way, I was free. At the head of my dissertation I quoted Nietzsche: “I have attempted to describe a feeling that has frequently tormented me; I take my revenge on it by making it public.”* Kind of pompous, sure, but that’s how I felt. I had realized I couldn’t solve the conundrums enacted by the endless self-reflexivities of theory, but I could justify not having to care more about them than I wanted to. To me, this felt like a big deal, coming out of the late-nineties cultural-theory boom market. (Keep in mind, the University of Minnesota, especially the CSDS program on the East Bank, was a place of huge intellectual energy around the concerns of the “high theory” age.) It was my own little declaration of intellectual independence. On these terms, I could finish my dissertation with equanimity.
To be honest, my dissertation turned out to be a bit of a mess — my reach exceeded my grasp, and I really didn’t yet have a handle on the ideas I was trying to understand — but some of its concerns carried over into the book I ended up writing. The main thing they have in common is that trick of ironizing the ironists, getting hip to the hipsters. And please note, I still think I’m right! I’m not taking back anything. I couldn’t have gotten a handle on the ideas I wrote about without ironizing them. If I hadn’t, I would have been left with the usual binary, are-you-for-it-or-are-you-against-it option: write yet another sentimental ode to countercultural transgression (the liberal academic option), or write a grumpy screed about how all this transgression is ruining America (the conservative think-tank option). Historicism was a way out of another bind.
But lately I’ve been having second thoughts about historicism. Because it seems to me that those of us in the irony game tend to manage the toxic, crazy-making aspects of this worldview by treating them in exactly the same limited, ends-means, instrumentalizing kind of way I learned to do when I decided to return to my program and write my dissertation. In self-defense, we learn to reserve our critical selves for academic work. Once we learn the trick, we become historical ironists, deconstructive skeptics, putters-of-things-in-scare-quotes, but (as the sign on the pinball machine says) for amusement only.
It’s not like you actually live that way. To vary a formula from Marx, we teach in the morning, deconstruct in the afternoon (or whenever it is we’re writing our professional publications), and meet social obligations in the evening. But the person you are when you go to your daughter’s violin recital is not the person who’s deconstructing stuff. You’re listening to her play the Bach double concerto, not sitting there stewing about the reified social forms of quote-unquote art music. Or if you are, you will not have a very good time and will probably end up spoiling it for everyone. You’d better learn to compartmentalize, because being full-on ironic and relativizing all the time will mess with your life.
But then . . . if you treat the ironic and historicizing perspective as a tool and nothing more, something you do for professional purposes and then set aside when you’re not working, then your academic writing comes to seem like a game, not how you actually see the world. You’re just a tourist in Ironyville. The high mission of the intellectual is surely to see things in an original way and to enlighten others by showing them the world in the brighter and truer colors revealed by that unique perception. But what does it mean if you’re only seeing things in that way because it’s comme il faut, a convention, something you do to maintain your professional poise? You discover a loss of integrity between your intellectual work and your life. You find you’re a long way down from the “high seriousness” of someone like Adorno, for whom “it is part of morality not to be at home in one’s home.”
For the contemporary scholar in the critical humanities, Adorno’s critical attitude of homelessness degenerates into a mere convention of that highly specialized literary genre, the academic monograph. A critical stance becomes one of those things which, if missing, would probably occasion some suspicious questioning from the anonymous academic reviewers who adjudicate the merit of your work and decide whether and where your stuff gets published. Without a patina of historicist irony, you come off as “uncritical,” and no-one wants that.
But on some level this must be unsatisfying. As humanists, nothing human is alien to us, at least ideally. So how is it we find ourselves in the grip of a taboo — a taboo on the uncritical, the unironic, the unhistoricized? Such a taboo, usually unspoken and unavowed, is what I call a bar in the cage.
As writers plying a minor literary genre, we become aware of the conventions that bind us and limit our movement. That’s one thing our training allows us to do, after all: notice when something limits our free action, notice when something isn’t being said. And if you’ve been trained that way, once you become aware of a limit on your action, you start to feel suffocated. If you are keeping it real, as a humanist and a free, thinking person, you are going to insist on saying the thing that no-one is saying.
Right now I’m teaching M603, my “current readings in cultural studies” seminar. We’re reading mostly recent academic monographs, along with a few articles. For me right now, monographs are more interesting to teach than single essays, because a monograph is a larger canvas on which to spread your ideas and also offers you fewer places to hide. You can tell what writers hold dear, what they want to protect, what they want to find a place for. And sometimes that means rubbing up against the tacit conventions and rules, maybe deciding that it’s time to keep it real and do something that people are going to think is weird. You rattle the bars of the cage, or even try to break the bars. It seems to me that a lot of recent cultural-studies(ish) monographs on music are exercises in rattling the bars of the cage.
Next time: some examples of what I mean by that last sentence.
*Friedrich Nietzsche, “History in the Service and Disservice of Life,” trans. Gary Brown, in Unmodern Observations, ed. William Arrowsmith (New Haven: Yale University Press, 1990), 87.