In which I announce the birth and death of Weird Studies

I’ve been writing a lot about drugs lately. I’m really not a drug guy, though. My own daily intake of drugs (not counting mandatory middle-aged-guy stuff like blood pressure medication) is

  • 2 cups of strong coffee
  • 1 beer (2 on weekends)

Not exactly Hunter S. Thompson.

However, the point of this series of posts has never really been about drugs as such. As I’ve insisted from the beginning, there is something larger at stake in thinking about drugs: the boundaries of what we consider to be normal and acceptable mental functioning, and how those boundaries are sustained and enforced in academia.

One thing I have emphasized in my series on drugs and cognitive liberty is how present-day humanities academia, for all its outspoken concern for otherness and subalternity, has tacitly assumed that meaningful human difference is material — differences of race, class, and gender, especially. Cognitive difference (roughly speaking, not only what you think but how you think: strategy rather than tactics) is not especially interesting to humanities academics. This shows not only in their lack of curiosity about psychedelic research and their baffling indifference to the oceanic injustice of the drug war.* For all that they talk a good game about “privilege,” religious privilege is low on their list of priorities. And while “religious privilege” can mean “Christian privilege” if you live in Anadarko, Oklahoma, it can also mean the privilege of secular humanities academics for whom any metaphysical commitment other than standard-issue scientific naturalism is moronic and regressive, or, at best, daffy and woo. As Nicholas Kristof has written recently, liberal academics exercise this privilege especially in hiring decisions, where they routinely discriminate against Christian evangelicals. I doubt that an outspoken Thelemite would do much better.

Erik Davis’s podcast Expanding Mind is a gazetteer of contemporary forms of cognitive difference — religious, magical, psychedelic, philosophical, artistic, scientific — and a few months ago he made an offhand comment, during a conversation with the philosopher Timothy Morton, that perhaps “weird” could be a cognitive-diversity equivalent of “queer.” A scholar dedicated to inner exploration through psychedelic drugs would be Weird; so would someone who writes about magic without sanitizing it with an ironizing historicism. So would a professor who is a Scientologist, or who for that matter is an evangelical Christian. All of them fall outside what modern habits of mind have set as the boundaries of cognitive normality.

I like Davis’s notion of using “Weird” as a cognate of “Queer,” as it gives a dignified and formal character to the offhand name I have given to the tendency of my own thinking and writing in recent years. It suddenly becomes possible to imagine something called “Weird Studies.” Has anyone thought of creating the field of Weird Studies? Quick Google search … No? CALLING IT! I hereby announce the birth of Weird Studies.

Hey, I just invented an academic subfield. Now that’s a good day of work right there.

yabba dabba doo

But what would it take for my hypothetical Weird Studies to claim the kind of intellectual legitimacy that Queer Studies (or Disability Studies, or Critical Race Studies, etc.) has earned in the last couple of decades? It would have to overcome a vast, systemic inertia. For the rest of this post, I will try to explain what I mean by this.

***

For years I have been teaching Gerald Graff’s maxim that the most fundamental form of academic communication is some version of the following:

Whereas so-and-so says X, I say Y.

The relationship between X and Y may be one of emendation, minor correction, slavish emulation, brusque dismissal, cruel mockery, and so on. Knowing the relationship between X and Y means knowing why a text was written and how to make a text of your own. Learning how to identify existing positions on a given topic, and consequently reading everything that has been written on that topic, is the necessary precondition of having anything to say yourself.

This is a basic skill of academic writing, and I would be the last person to suggest that we abandon it. Academic conversations are unthinkable without it. For that matter, so are many conversations in real life. But because it is so basic to our conversations, it is like water to fishes, the medium in which we move and not something to which we pay any particular attention. And when it begins to poison us — as I think it does — we don’t even notice.

In the first chapter of A Pluralistic Universe, William James writes of the deadening effect of German academicism on philosophy, and particularly the iron insistence on Graff’s X/Y form:

[In our universities we teach that] you must tie your opinion to Aristotle’s or Spinoza’s; you must define it by its distance from Kant’s; you must refute your rival’s view by identifying it with Protagoras’s. Thus does all spontaneity of thought, all freshness of conception, get destroyed. Everything you touch is shopworn. The over-technicality and consequent dreariness of the younger disciples at our American universities is appalling. It comes from too much following of German models and manners. Let me fervently express the hope that in this country you will hark back to the more humane English tradition. American students have to regain direct relations with our subject by painful individual effort in later life. Some of us have done so. Some of the younger ones, I fear, never will, so strong are the professional shop-habits already.

In this chapter, James attacks three academic vices, all interconnected with one another. The first, in the passage above, is the staleness of thought that results from the insistence that every idea, every insight, must be stated in terms of its antecedents. Nietzsche, always an incisive psychologist, writes that a certain amount of forgetting is necessary to the health of the human organism: if someone were unable to forget anything, “like a true disciple of Heraclitus, he will end by scarcely daring to lift a finger.”** The “do I dare to eat a peach” mood of neurasthenic over-refinement that afflicts highly intellectualized people — for example, the hip young intellectuals in the postwar New York scene that Seymour Krim wrote about — comes in part from an inability to forget anything one has read, and, even more, from a deep insecurity about letting anyone think you haven’t read it. The toxic atmosphere of the Partisan Review in its glory days was one in which high literary standards were enforced by the fear of being caught out for not having read something, or for having read it too superficially, and by the backstabbing bitchiness and intellectual snobbery that preyed upon that fear.

Like Krim, I believe that such an atmosphere is bad for writers. The writing of the PR milieu that is still readable — George Orwell’s “London Letters,” Dwight Macdonald’s “Masscult and Midcult,” and Norman Mailer’s “The White Negro,” for example — was created by people who were widely read but who could forget enough of their reading to write something original.*** As Orwell wrote, good books are written by people who are not frightened, and fear is what dogs the scholarly writer above all — fear of being found out and revealed to the world as a lightweight, a showboat, a fraud!

This fear drives the fetish for total bibliographic mastery. It is the fear of articulating a thought without proving that someone more respectable than oneself has already thought it. Scholars brandish citations like cops flashing their badges, believing that without the warrant, you can’t make the collar. This is especially true of those who have yet to climb the mountain of promotion and tenure and whose careers are still very much at the mercy of anonymous peer reviewers: it’s a fear that gives the writing of many graduate students and newly minted Ph.D.s a sheen of writerly flop-sweat. Anyone who has reviewed humanities monograph proposals for academic presses has read rebadged dissertations frontloaded with a horse-choking introductory chapter of abstract, “theoretical” framing, in which a concept is adapted from Theorist A to fit with the praxis of Theorist B and triangulated with a key term adapted from Theorist C. The author’s own position comes to seem like a complicated diplomatic negotiation between rival camps. Or perhaps the obligatory “theoretical introduction” is more like calling a trick pool shot: in this book, I’m going to bank Žižek off Latour to knock the Agamben 8-ball into the Adorno pocket.

And the cultural-theory books are by no means the only offenders here. Positivism, the philosophy that animates books of seemingly opposite temper, is a style of thought formed, root and branch, out of the fear of saying an unverified thing. It underwrites the sort of book whose author is paralyzed by fear of missing something someone has written on his topic, and who therefore has parceled out a topic so parochial that he cannot fail to read everything relevant to it.

It would be easy to dismiss all such academic writing by saying that no one cares about those books. But this is untrue. Their authors care quite a bit about them, and so do their editors, their promotion and tenure committees, the small knot of fellow-scholars working on the same topic (most of whom, the nice ones at least, end up writing external tenure review letters for the author), and the graduate students who aspire to write on the same topic and, because there is now a book on that topic, must take it seriously. Even the most forgettable academic monograph takes up a certain amount of space in the discipline.

But it’s an artificial space. And this is the second point of James’s criticism:

… In Germany the forms are so professionalized that anybody who has gained a teaching chair and written a book, however distorted and eccentric, has the legal right to figure forever in the history of the subject like a fly in amber. All later comers have the duty of quoting him and measuring their opinions with his opinion. Such are the rules of the professorial game—they think and write from each other and for each other and at each other exclusively. With this exclusion of the open air all true perspective gets lost, extremes and oddities count as much as sanities, and command the same attention; and if by chance any one writes popularly and about results only, with his mind directly focussed on the subject, it is reckoned oberflaechliches zeug and ganz unwissenschaftlich.

There are certain ideas from which flows a stanchless river of academic prose, year after year, decade after decade, simply because each successive generation of scholars is compelled to read and respond to the previous one’s books, which in turn respond to the previous one’s books, und so weiter. In this way are the social and political concerns of the 1960s—the basic trinity of race, class, and gender, with its extensions—preserved in cultural studies. Scholars such as Gerald Graff were complaining about the trapped-in-amber persistence of 1960s forms of critique more than two decades ago, when I was starting graduate school. Here we are in 2016, and surprisingly little has changed. The stocks of various cultural theorists go up and down (right now we are buying social network theory and selling Terry Eagleton), and refinements are made to the basic template. The concept of “gender,” for example, is now more friable than it was 20 years ago. But year after year, one monograph after another appears, each working some combination of race, class, and gender “interpretive lenses” in apposition to whatever topic lies at hand.

Some years ago I gave a guest talk at the University of Somewhere-Or-Other on the Beats and their early experiments with sound recording. An American Studies professor asked me, “How would all this look if we put it through the race/class/gender machine?” At first I thought she was speaking ironically about the invariant appearance of these interpretive categories, but nope, she meant exactly what she said. It became clear that she believed that the categories of race, class, and gender could be deployed in an automatic, almost algorithmic process of running a script and getting a certain set of results. Professional scholarly work can be automated and put on an efficient basis in the best American fashion: find a cultural phenomenon, put it in the machine, turn the crank, and out pops a ready-made interpretation.

This mechanical process of interpretation becomes particularly apparent in peer reviews. The lazy reviewer can flip quickly through a submission, and if the manuscript says something about race and class, the reviewer can write “it is troubling that the author has not considered the complex vectors of gender as they relate to the embodiment of raced and classed selves” or whatnot. Or, if the submission deals with gender and race but not class, the reviewer can write “I find it curious that the author has failed to consider the way class privileges mediations of race and gender identity.” And so on. In this way, that $50 honorarium gets to be easy money.

And the aspiring scholar who gets put through the machine is obliged, by the conventions of academic publishing, to respond to these thoughtless and superficial reviews. You are not required to accept every suggestion, but you have to give your reasons for not accepting one, and there is an unspoken understanding that you will meet your reviewers halfway and put at least some effort into adding a bit on the modalities of race to your piece on Das Lied von der Erde. You might, in your innermost heart, wish to write about something else, but because everyone else is writing about race, class, and gender, and because what James called “the rules of the professorial game” dictate that you must relate all that you think and write to what your fellow-scholars have thought and written, your writing becomes, by degrees, pretty much the same as everyone else’s. You express yourself in the same ways, use the same cant words, translate your ideas into the same idioms and forms, and impossibly picayune squabbles over the modalities of racial mediation, or the mediation of gender modalities, or the vectors of class in the gender mediations of race modalities, or whatever, become, to you, as absolutely real and compelling as a house fire.

I have been picking on the mechanical imposition of the race/class/gender interpretive template in the academic humanities, but I should point out, first, that this is hardly the only such template to be so imposed: there is always some template, and this one just happens to be ours. And second, there is nothing wrong with writing about race, class, and gender. Nothing at all — if that’s what you really want to do. The problem is not the constituent materials of an institutional template, or even that there is an institutional template at all; given the way academia works, some template is inevitable. The real problem is that we no longer see that our work is structured by such templates.

Historical awareness starts with realizing that it didn’t have to turn out this way. Anyone who has spent any time in humanities academia has learned this basic trick. Valentine’s Day rolls around and you annoy all your friends by commenting that the conventions of romantic love are not just how human beings love at all times and places, but are culturally and historically contingent conventions that do certain kinds of ideological heavy lifting. Well, fellow humanities academics, apply this insight to yourselves. Do you really believe that, out of all the infinite possibilities of the things that we could say or think about culture, the particular categories we keep recycling just happen to be the only real and valid ones? Do you believe that scholars before the 1960s all toiled in darkness, awaiting the final appearance of enlightenment and truth as represented by your own good selves? Or do you recognize that there might be contingent reasons for your choice of these scholarly idioms? That, in fact, you did not really choose those idioms at all?

My point is James’s: we “think and write from each other and for each other and at each other exclusively. With this exclusion of the open air all true perspective gets lost.” And with the failure to attain a wider perspective, our writing becomes as inbred and parochial as our subject-matter, with a crabbed gracelessness enforced by the belief that to write straightforwardly is to confess one’s naïveté. And this is James’s third and final point:

Professor Paulsen has recently written some feeling lines about this over-professionalism, from the reign of which in Germany his own writings, which sin by being ‘literary,’ have suffered loss of credit. Philosophy, he says, has long assumed in Germany the character of being an esoteric and occult science. There is a genuine fear of popularity. Simplicity of statement is deemed synonymous with hollowness and shallowness. He recalls an old professor saying to him once: ‘Yes, we philosophers, whenever we wish, can go so far that in a couple of sentences we can put ourselves where nobody can follow us.’ The professor said this with conscious pride, but he ought to have been ashamed of it. Great as technique is, results are greater.

Now that I come to think of it, forget I even said anything about Weird Studies. Because if it somehow happens that Weird Studies (a) comes into being, and (b) overcomes the institutional inertia I’ve written about here, it will end up becoming an orthodoxy as ironclad as any that James could have imagined. Which would, of course, be the least Weird thing ever. In fact, it’s already happened, give it up, it’s all over. The moment you can imagine the Weird taking on the form of a peer-reviewed edited anthology, it is no longer the Weird. I hereby announce the death of Weird Studies.

*And the American gulag rolls on every day — check out this recent New Yorker piece on how prisoners are being tortured and murdered in Florida prisons.

**Friedrich Nietzsche, “History in the Service and Disservice of Life,” in Unmodern Observations, trans. William Arrowsmith (New Haven: Yale University Press, 1990), 89.

***I have written about Mailer’s attempts to reach escape velocity from his own education in Dig: Sound and Music in Hip Culture, which also exemplifies some of the problems I am writing about here.

irony

 

About Phil Ford

Chairman of the Committee for the Memorial to the Victims of Modernism
This entry was posted in Academia, Cognitive liberty, Drugs, Intellectuals, Weird Studies, Writing.

4 Responses to In which I announce the birth and death of Weird Studies

  1. Tommi Uschanov says:

    Long-time reader of both your publications and the blog, but first time as a commenter.

    Last year, there was a fairly remarkable paper in the American Historical Review:

    http://ahr.oxfordjournals.org/content/120/3/787.extract

    It called for discussing whatever was… well, weird, about Ancient Athens (the author’s own area of specialization) without “sanitizing it with an ironizing historicism”, as you so aptly put it. And indeed it called on historians more generally to do the same for whatever they may have occasion to discuss about the past that happens to be weird.

    “When the Santal, a tribal people of Bengal and Bihar, rebelled against British forces and local landlords in 1855, they were, by their own account, simply acting on the orders of their ‘lord,’ the god Thakur. Yet as soon as one attempts to historicize this event, to tell a story in the ways prescribed by our discipline’s ‘European’ codes and protocols, one loses the ability to express the central role played here by Thakur. The best one can do within the limits of our historicism is to resort to the ways and means of cultural history, to ‘anthropologize’ Thakur’s divine agency, rationalizing it as the ‘religious belief’ of his human devotees, who can be the only ‘real,’ material agents. Thus, even the most sensitive efforts to write a ‘good’ subaltern history of the Santal revolt, one that restores full agency to a historically oppressed people, will end up denying the truth of the event as it was actually experienced by the Santal themselves.”

    “To date, scholarly efforts have focused largely on confronting historicism’s epistemological and methodological limitations, but a consensus alternative has yet to emerge. Our more urgent task should be to confront historicism’s ontological limitations, which seem to be altogether more fundamental. Before we can rethink our conventional ways of knowing and representing non-modern realities, we need to reconsider the very nature of realness itself. To help us retrieve all those pasts that we have lost in translation, we need a historicism that can make sense of each non-modern lifeworld on its own ontological terms, as a distinct real world in its own right.”

    I was actually reminded of you when I read this not long ago!

  2. philphord says:

    Hi Tommi,

    Thanks so much for getting in touch — and thanks especially for the citation to this wonderful article, which has been obsessing me all day. I find that there is already a name for the kind of cognitive/conceptual adjustment on the part of scholars that I have been calling for in my posts on magic — “the ontological turn.” From the footnotes of the Anderson article I traced this piece

    http://aotcpress.com/articles/common_nonsense/

    which takes up controversies around this “ontological turn” in anthropology. It seems to me, reading this last piece, that what these scholars are trying to do is to perform some of the same kinds of subtle mental adjustments I described in my “gnosis” post. (Not that that post is the last word on the subject — it now seems clumsy and partly wrong to me.) In other words, these “ontological turn” guys are trying to figure out how to think magically, and they are using the language of ontology (rather than that of magic per se) to do it. But I am not expressing this well — I need to think more about it and perhaps write a full-dress piece on it. But I am very grateful to you for bringing this scholarship to my attention.

  3. Elizabeth Newton says:

    This post made me smile and think so much — thanks!

  4. philphord says:

    Hey, thanks! Nice to see you round these parts! I ain’t on Twitter no more, so I don’t have enough E. Newton in my life.
