Othering and smOthering

In the comments of my last post, Elizabeth Upton warns me against Othering the Medieval mind. It’s a point well-taken. If we accept the idea that certain ways of thinking about the past constitute a sort of epistemic violence — or at least an epistemic boorishness, drowning out the voices of other peoples with our own self-satisfied monologue — then Othering is what happens when we ignore those things we might have in common with another subjectivity. In saying “they’re not like us,” we deny other subjectivities their full share in the humanity we presume for ourselves. This is obviously what’s at work in the mentality of colonialism, and it’s also the standard criticism of musical exotica. To “exoticize the Other” is to make a spectacle of cultural difference, so that those we designate as Others are separated from us as if by Plexiglass walls and made exhibits in a human zoo.

Elizabeth is not the first medievalist to object to Charles Taylor’s A Secular Age (which I leaned on quite heavily in my last post) and what appears to be its deployment of a venerable trope of Othering, the old “Age of Faith” bit. Steven Justice writes,

You can almost hear medievalist palms slap medievalist foreheads in despair: trying to describe premodern religious belief, [Taylor] conjures an enchanted world where spirits and demons were part of life as lived (“it wasn’t possible to entertain seriously the idea that they might be unreal”), where what we would call ideas were encountered as realities (“no . . . distinction, between experience and its construal, arose”), and where possession, mystical union, the presence of God, must “be seen as a fact of experience, not a matter of ‘theory,’ or ‘belief,’” “a fact of [their] world.” Medieval studies thought it had stricken such notions from respectable academic discourse: one of its most confident boasts is that it dashed the notion of an “age of faith.” [Justice, “Did the Middle Ages Believe in Their Miracles?”, 1]

If Taylor is assuming that such an “Age of Faith” lies on the far side of an impassable epistemic gap, then sure, I have a problem with this too. Writes Justice, “the problem arises neither in explaining the category of belief [the “Age of Faith” trope] nor in bracketing it [what scholars do to avoid the “Age of Faith” trope]; the problem arises in creating the category, in imagining belief as a distinct kind of cognitive state or activity, and then declaring it the defining property or defining problem of certain centuries.” [Justice, 12]

Justice wants to think of belief as practice: belief is not a “black box,” a placeholder concept whose contents remain a matter of mystery or indifference, but a set of readily legible “mechanisms that enjoined belief, and backed the injunction by that system of regulation and pedagogy, of sanctions and rewards, that R. W. Southern memorably called ‘truth-enforcement.’” [Justice, “Did the Middle Ages Believe in Their Miracles?”, 12]

For those who read my long series on magic, this should sound familiar. I argued that such a practice of belief is the sine qua non of the modern magician, and in doing so imagined a character named Ivy, “a thoughtful child of a skeptical age, who must figure out how to play with strange and archaic ideas of daimons and augurs while remaining in commerce with the modern world in which she lives.” I wrote that “such a person would have to manage a kind of double-jointedness of belief, an ability not only to put away disbelief but to assert powerful positive beliefs while holding them in a state of play.” Understanding Ivy’s practice of belief would go a long way toward answering the question I meant to answer in my series, which is how magic can continue to thrive in a supposedly disenchanted age.

But while I am in broad agreement with Justice — and indeed this idea of belief as practice supplies something that is missing in A Secular Age — I’m still not quite happy with his account of how disciplines of belief work:

These structures would have been pointless unless belief were something amenable to command, and they would have been redundant had it been second nature, the reflex experience of an enchanted world. Mental states cannot be enjoined, while choice can; the will, not the intellect, can obey. This is what it meant to describe belief as faith, as a discipline of fidelity undertaking and maintaining commitment to a series of putatively true propositions: the content of the commitment is cognitive—one commits oneself to the position that the propositions are true—but the mind encounters the commitment itself as something alien, peremptory, and rebarbative. The first duty every Christian undertook was to maintain assent to a series of propositions: “Do you believe in God the Father. . . ? in Jesus Christ his only Son. . . ? in the Holy Spirit. . . ?” the baptismal ritual asked; that ordinarily others answered for one—parents and godparents engaging infants to a matter on which they had no say—left one faced with the prospect of accommodating cognitive habits to the maintenance of truths often repellent to natural dispositions. Belief concerned what was uncertain, difficult, inaccessible. [Justice, “Did the Middle Ages Believe in Their Miracles?”, 12]

To which I would ask, difficult and inaccessible to whom? What are these “natural dispositions”? Natural in whose terms? In modern terms, of course. “Natural” here connotes the assumption, nearly universal among educated moderns, that there are no spiritual agents in the world, and that some outside force — in this case, “mechanisms of truth-enforcement” — is needed to make them appear. Justice is suggesting that human beings, left to their own devices (ah, another venerable trope, the “state of nature”!), will naturally see no God, no Son, no Holy Spirit, because such spooks aren’t materially there. What is materially there is what is “natural.” This is the bedrock assumption of scientific naturalism, the belief system of educated moderns that Taylor lays bare with pitiless clarity. In this passage we sense what Taylor calls the “subtraction narrative” lurking close by: reality is simply what we moderns believe it to be, a world of brute matter evacuated of spirit, and any other way of understanding reality is something extra added in, something in need of explanation. Taylor’s A Secular Age might trade in the “Age of Faith” trope, but its great contribution is to point out that ours is an age of faith, too. And not many scholars have wrapped their minds around what that might really mean.

For the historian, Othering isn’t the only risk at play here. If Othering is the Solve of academia’s identity-political alchemy, then Appropriation is its Coagula. Appropriation is the opposite error of asserting not difference but sameness between another subjectivity and your own. If you have ever objected to white kids affecting African American styles or to someone insisting on colorblind politics (“I don’t see race, I see people”), you are objecting to their assumption that there is no difference between their own subjectivity and someone else’s.*

The phenomenon I’m talking about here is broader than what is normally meant by Appropriation, though. What I am really writing about is an error of the historical imagination — the shoehorning of Medieval people, or Ancient Greeks, or modern magicians, etc., into our consensus habits of mind. Perhaps we need some other name for it. For now, I will call it “smOthering”: the hidden flip side of Othering, a different reaction to the same basic inability to reckon with a noetic difference between ourselves and our subject. Whereas Othering involves throwing up our hands (whaddaya gonna do with these primitives), smOthering begins with the assumption that there must be a fundamental similarity of mind and proceeds to smother all real differences in a forced consensus. “Just like us” makes the Other vanish just as effectively as “not like us.”

One very common way of doing this is to find something that simply cannot be made to fit a modern view of reality — say, William Blake’s notion that the sun is a host of angels — and to assume that it is a metaphor, that the sun is like a host of angels. We can only assume that Blake is a reasonable sort of creature, much like ourselves. (And of course no-one sets the standard of reasonableness quite so well as a present-day humanities professor!) So we cannot believe that Blake is really saying that the sun is literally, actually a host of angels:

What are we to say of the man who fixes his eye on the sun and does not see the sun, but sees instead a chorus of flaming seraphim announcing the glory of God? Surely we shall have to set him down as mad … unless he can coin his queer vision into the legal tender of elegant verse. Then, perhaps, we shall see fit to assign him a special status, a pigeonhole: call him “poet” and allow him to validate his claim to intellectual respectability by way of metaphorical license. Then we can say, “He did not really see what he says he saw. No, not at all. He only put it that way to lend color to his speech … as poets are in the professional habit of doing. It is a lyrical turn of phrase, you see: just that and nothing more.” And doubtless all the best, all the most objective scholarship on the subject would support us in our perfectly sensible interpretation. It would tell us, for example, that the poet Blake, under the influence of Swedenborgian mysticism, developed a style based on esoteric visionary correspondences and was, besides, a notorious, if gifted, eccentric. Etc. Etc. Footnote.

This is Theodore Roszak, in The Making of a Counter Culture (1969), a dashing assault on the modern “myth of objective consciousness” and a call for us to cultivate a premodern, mystical way of seeing the world. But Roszak was not so foolish as to imagine that his fellow academics would be much help on this score. What is striking about this passage is how little has changed since 1969: a present-day scholar, however “postmodern,” would almost certainly end up smothering the challenging weirdness of Blake’s vision in much the same way. “Art” has become a catch-all category used to stash any thoughts that cannot otherwise be assimilated to the modern’s naïve construal. A scholar of magic is faced with a stark choice: call it art or call it madness. Calling it madness is Othering; calling it art is smOthering.**

Next time: the Ontological Turn to the rescue! Maybe!

*I should mention in passing that I think our concern over cultural appropriation is ludicrously overblown, and that in many, perhaps most instances, it’s a bullshit concept. The flap over “cultural appropriation” in the General Tso’s Chicken served at Oberlin’s dining halls marks the reductio ad absurdum of this style of critique. But that’s an argument for another day.

**Well, you could call it “religion,” but this introduces a whole different set of problems. Again, that’s an argument for another day.

Posted in Academia, Weird Studies | 3 Comments

Better Off Not Knowing?

It seems relevant, on a day when an important Republican operative defended the recitation of a plagiarized speech by the wife of the Republican candidate for President of the United States by saying it came from My Little Pony, to ruminate a bit on sources and where they come from.  Rather than talk about a corrupt, vain, narcissistic publicity hound who stops at nothing to get attention, I’ll instead offer a little cautionary tale about Mick Jagger.

Here’s a passage from Stephen Vincent Benét’s 1937 story The Devil and Daniel Webster; the 1941 film of the same name retains the speech, I think.  Here, Scratch (= the devil) offers his résumé as an American:

“And who with better right?” said the stranger, with one of his terrible smiles.  “When the first wrong was done to the first Indian, I was there.  When the first slaver put out for the Congo, I stood on her deck.  Am I not in your books and stories and beliefs, from the first settlements on? Am I not spoken of, still, in every church in New England? ’Tis true the North claims me for a Southerner, and the South for a Northerner, but I am neither.  I am merely an honest American like yourself—and of the best descent—for, to tell the truth, Mr. Webster, though I don’t like to boast of it, my name is older in this country than yours.”  Stephen Vincent Benét, The Devil and Daniel Webster (New York: Farrar and Rinehart, 1937), 39.

Mick Jagger’s song “Sympathy for the Devil” (1968, off Beggars Banquet) rings familiar here.  Here are a couple of relevant verses, just to illustrate:

And I was ’round when Jesus Christ
Had his moment of doubt and pain
Made damn sure that Pilate
Washed his hands and sealed his fate

*****

I stuck around St. Petersburg
When I saw it was a time for a change
Killed the czar and his ministers
Anastasia screamed in vain

Now, here’s Mick, contextualized and quoted in Wikipedia:

In a 1995 interview with Rolling Stone, Jagger said, “I think that was taken from an old idea of Baudelaire’s, I think, but I could be wrong. Sometimes when I look at my Baudelaire books, I can’t see it in there. But it was an idea I got from French writing. And I just took a couple of lines and expanded on it. I wrote it as sort of like a Bob Dylan song.” (Jann Wenner, “Jagger Remembers,” Rolling Stone, 14 December 1995.)

Sourcing Baudelaire has more art-school credibility than an American story or film, I suppose, even though the spitting impieties feel far more American (to me, certainly) than French.  I can’t be the first one pointing out this correspondence, but I don’t recall having heard it before.  In an interview I once read, Warren Zevon admitted: “Sometimes you’re better off not knowing where things are from.”  And, of course, there’s the cliché that a student borrows, but only a genius can steal.  (Stravinsky?)  And again, we have Brahms railing against people who seek to identify the sources of tunes, peering into the composer’s closet as it were (I’ve never found that to be entirely what it seems to be on the surface).  What about stealing something that isn’t there?

I’m sure that every musicologist has had the experience of looking at a score, apprehending it, “hearing” it even, and beginning to formulate a brilliant, discerning critical opinion, only to have it blown to pieces when the work is finally heard in real time.  It is a common occurrence for musicians to think they’ve invented something original when in fact they got it from another musician.  The case can be made that it is easier to do this unconsciously with music than with words, and I would agree.

Still, the moral of this story is: own your $λ¡†.

Thus endeth the lesson.  I would appreciate knowing if someone else has already pointed out the bit about the Benét novel.  It seems so obvious, but I don’t recall having seen it.

Posted in Current Affairs, Ethics, Rock | 1 Comment

The weird and the naïve

In a post I wrote entirely too long ago (sorry), I imagined a new scholarly subfield called “Weird Studies,” which would presumably do for weirdos what Queer Studies has done for queer sexuality: formalize and fortify a marginal social identity by giving it academic sanction. A guy lurking around the local occult bookstore is just some weirdo; a guy publishing a 10,000-word peer-reviewed essay titled “Tentacles, Pentacles, and Testicles: Weird Masculinities in Contemporary Occult Publishing” is professionally, capital-W Weird.

But of course there is something off-putting about the notion of institutionalized Weird: H. P. Lovecraft didn’t have tenure. So, having begun my last post by announcing the birth of Weird Studies, I ended it by announcing its death — a death that lacks only the formality of having happened.

All the same, I’m going to take this thought-experiment just a little further.

The question I asked last time was, what would it take for Weird Studies to become a real academic subdiscipline? I suggested that it would have to overcome a certain generic inertia, but here I would like to suggest that it would also have to overcome the modern academic’s naïve construal of reality.

Using my occult powers, I can hear the affronted inner monologues of my academic readers. Naïve? Moi? I’ve spent decades and a small fortune in student loans to learn how to see through every naïve assumption that human beings can make! I now see every worldview as a mere artifact of culture and history!

Let me ask you a question, then: what if I were to claim in all seriousness that I can use occult powers to hear your inner thoughts? Obviously, I’m kidding around here, but what if I weren’t? That would be weird, right? Imagining that you can hear other people’s thoughts is the sort of thing that paranoids and deluded New Agers believe. Even if you yourself have had so-called paranormal experiences*, you would think twice about saying so in an academic publication. You wouldn’t want to come off as naïve, or worse. But if the worldview in which telepathy makes sense is naïve, what about the worldview that says it is naïve? If you’re the sort of person who believes that all worldviews are a product of historical, social, and cultural contingencies (and if you’re a postmodern academic humanist, you probably are), then do you exempt your own worldview from this sweeping claim? Wouldn’t that be naïve?

Much as we might like to think otherwise, our own worldview clearly has its characteristic and largely unspoken assumptions, and one of them is that the contents of mind cannot be communicated outside of material channels. You can tell me that you object to my calling you naïve; you can write me an angry email; you can flip me the bird. But I can’t just stare into your eyes and know what you’re thinking.

holy moment

We know this is not a universal belief. We can understand that people have seen things differently at other times and places. But we cannot see with their eyes, and past a certain point we don’t try. Scholars know that the ancient Romans respected the power of dreams to foretell important events, for instance, but they understand this as an interesting thing that people used to believe; they don’t treat it as a viewpoint available to themselves. (This is historicism in a nutshell.) A scholar might tell you how the Romans examined the entrails of sheep for omens of the future, but s/he will probably not argue that they would have gotten better results by examining burned turtle shells. The Roman worldview is bracketed off and treated as just another way of looking at things. Scholars converse with their (virtual, historical) interlocutors the way they would talk to a homeless person who’s yelling at trees: listening and nodding politely to whatever he’s saying and then walking away thinking “that guy is fucking crazy.”

Postmodern academic humanists like to say that our own viewpoint is yet another way of looking at things, no better or worse than any other. But we don’t really believe this, any more than we think the guy yelling at trees makes some good points. Even if we decide that all points of view are relative, we mean that all points of view are relative to our own, which we tacitly hold to be absolute. The term “naïve construal” refers to those beliefs we hold so unselfconsciously that we simply cannot think of them as just another way of looking at things. When I say that the modern academic has a naïve construal of reality, I don’t mean it’s particularly naïve; it’s just that all human beings have naïve construals, and we moderns are no different, even if we think we are.

In his epic philosophical history A Secular Age, Charles Taylor writes that such a construal does not even rise to the level of an explicit theory — say, a Cartesian theory of mind and matter. Rather, it is “our contemporary lived understanding; that is, the way we naïvely take things to be. We might say: the construal we just live in, without ever being aware of it as a construal, or — for most of us — without ever even formulating it.” [Charles Taylor, A Secular Age, 30.]

Taylor is here trying to explain the difference between our own construal and that of an “enchanted” age — between what Max Weber called the rational “disenchantment” (Entzauberung) of modernity, on the one hand, and on the other the Medieval Christian world, in which human beings find themselves vulnerable to the influence of demons, angels, saints, curses, healing miracles, and a host of unseen powers. Much of Taylor’s work in A Secular Age is to fashion a narrative of how the West** moved from enchantment to disenchantment. Now, the usual narrative of how that happened is what Taylor calls a “subtraction narrative,” in which disenchantment pertains to a true understanding of the world while enchantment is something extra added in: subtract this gratuitous enchantment and the truth at last shines forth unimpeded. Taylor sets about showing how this story is itself a pious and stacked-deck fiction. Our default perspective of scientific naturalism is unique in human history and could never have occurred to us without some specific adjustments within Christian doctrine.

I can’t possibly do justice to the full complexity of Taylor’s argument — A Secular Age is almost 800 very dense pages long — but here is one very simplified and abridged version of one aspect of it:

Christian theologians have always been at pains to make God as singular, unbounded, unchallenged, and total an entity as possible. Consequently, at a certain point they reject the notion that natural laws (in Aristotelian terms, the essences of things) manifest God’s actions, because that would suggest that God’s power is limited by His creation. (If it is the nature of water to be wet, and water is wet because God made it so, then He couldn’t also make it dry and crunchy.) Nominalist theologians including William of Ockham (he of the famous razor) thus denied that things have intrinsic essences or meanings; there are only things in themselves and the names or concepts*** we humans apply to them. In this way, God becomes ever more loftily absolute, abstract, remote, and untouched — a principle rather than an entity.**** He’s still in the cosmological picture: he’s the principle that makes meaning possible; but he’s not the author of the particular meanings that appear in our sublunary world. We’re on our own here. God is evacuated from the world here-and-now, and with that, we come a step closer to the Deist God-as-watchmaker, or for that matter the Deus Absconditus who was once in the world but is now hiding out, like a deadbeat Dad who went out to buy a pack of smokes one day and never came back. Once we have moved God this far to the side, it makes little practical difference if we omit him entirely from the picture. It is only a short step from a Deist, who imagines a clockmaker-God winding up His creation and letting it run on its own, to a scientific naturalist, who imagines only the clock.

Think of it this way. When you’re young, you have a transformative musical experience — you hear a great work of music, like a Bach passion, and you are swept away by its power. You think, this is a work of genius; its power and greatness are intrinsic to what it is; it has the same power and greatness now as it did when it was composed, and it will be as great and powerful in a year, or 100 years, or 10,000 years; it is as great here in the United States as it was in Leipzig, and it would be just as great on Alpha Centauri; the genius that went into it makes it essentially different from other pieces of music. Then you go to college and are taught that genius is not in the music, but something certain people at certain times have attributed to the music, for various reasons that seemed important at the time; in other words, you come to see that genius is historically and culturally contingent, whereas formerly you thought it was unchanging and universal. In short, what we took for a work of genius is merely a “work of genius.” I have written at length about the difference those scare quotes make, and here I would simply suggest that the difference between the un-scare-quoted work of genius and the “work of genius” is the difference between an idea of music as having an essential meaning and the nominalist idea that such qualities as genius, power, and greatness are simply conventional words or concepts we attach to things.

The academic habit of bracketing everything with scare quotes and reducing meaning to the mere exchange of conventional signs among human actors is supposed to be a postmodern thing, but Taylor argues that its roots are in Medieval theology — the very roots of the modern, it turns out. The difference is that the Medieval nominalists did not take God out of the picture; they simply moved Him very far away. They did this to protect Him, but they unwittingly set in motion a chain of historical events that has led to us, their philosophical heirs, simply assuming that God is not a part of the picture, and never was.

That last innovation is what leads us ineluctably towards the subtraction narrative, which is the warrant for our own naïve construal. We believe in a world evacuated of spiritual agency — even, oddly, if we still believe in God — and this absence is a capital-T Truth, something that is and was always true, just waiting for us to unveil it. However, what seems to us an eternal and universal Truth turns out to be just as contingent and historically determined as anyone else’s worldview. There’s a story behind how we came to think what we think. (The whole point of A Secular Age is to tell it.)

Once we can see our own construal of reality as just that, a construal, we can start asking what its features are. Taylor addresses this early in A Secular Age by contrasting enchantment with disenchantment:

Let me start with the enchanted world, the world of spirits, demons, moral forces which our predecessors acknowledged. The process of disenchantment is the disappearance of this world, and the substitution of what we live today: a world in which the only locus of thoughts, feelings, spiritual élan is what we call minds; the only minds in the cosmos are those of humans (grosso modo, with apologies to possible Martians or extra-terrestrials); and minds are bounded, so that these thoughts, feelings etc. are situated “within” them. [Charles Taylor, A Secular Age, 29-30.]

Now, recall where this post started, with my suggestion that it would come off a bit weird if I were to express a straightforward belief in telepathy. We can now see a bit better what’s weird about it: telepathy violates rule no. 3, that minds are bounded. Well, it’s not a “rule,” exactly, but rather an aspect of Taylor’s “construal we just live in, without ever being aware of it as a construal, or — for most of us — without ever even formulating it.” It’s just how you’re supposed to think.

What is weird is precisely what lies outside this unspoken set of assumptions. “Weird Studies” would be academic work consciously conducted outside the naïve construal of modernity. I wrote at the beginning that a hypothetical Weird Studies would have to overcome this construal, but it would be just as true to say that it would be constituted by it as well — its negative image, like the death-mask that takes the imprint of a face.

Next time: part 2!

*For example, a precognitive dream such as Mark Twain had of his brother’s death. Jeffrey Kripal has taken to calling such experiences “super natural” — no hyphen, just a space. See his most recent book, The Super Natural: A New Vision of the Unexplained, a collaboration with Whitley Strieber that offers a toolkit for busting out of the modern’s naïve construal.

**One of the questions that comes up when thinkers try to analyze or explain the beliefs and assumptions of us moderns is, who is this “we”? Calling it “the West” gestures at a certain historical reality, but it is still very imprecise. It is, after all, quite possible for people in the West to believe in telepathy: many modern people have had the experience of knowing things through seemingly inexplicable means. And “modernity” is a notoriously roomy conceptual basket: you can put a lot of stuff in there, a lot of people and cultural artifacts and social phenomena, and its contents are correspondingly so varied that they lack any single defining characteristic. So Haiti plays its part in modernity, but the widespread practice of Haitian Vodou rests upon assumptions and beliefs that run counter to the “naïve construal” of modernity I describe here. For that matter, the widespread American belief in angels does as well.

So I am mostly interested in discussing the worldview of those intellectual and cultural elites that set the tone of discussion, and the way the discussions they set are shaped by the default assumptions of modernity. Sure, a lot of Americans believe in angels. For all I know, Pamela Paul believes in angels. But as the editor of The New York Times Book Review, Paul is not going to run a bunch of stories on books about angels, unless it’s some “religion in America” special issue, in which case the tenor of the reviews will be one of anthropological curiosity or “what this tells us about America today.” The NYTBR isn’t going to tell us anything about angels as such, like how they talk to people or what their powers are; it’ll tell us about what people think about angels. In other words, the default mode of the NYTBR and similar tone-setting flagship organs of modern thought and opinion is historicist irony.

What I mean to say is, there is a difference between what individuals in a given society believe — a wide spectrum indeed — and what beliefs are represented in those spheres that express the consensus of a given culture. As Taylor points out in A Secular Age, early Medieval Christians blended church doctrines with holdover pagan beliefs; it was quite possible for an English peasant to pray to the Christian god every Sunday but make offerings to Freyr for a good harvest. But for the most part one looks in vain for any record of paganism in official documents. The proverbial researcher from Mars, looking for signs of belief in telepathy or angels among 21st-century American humanities professors, would have a hard time finding any direct textual evidence of it, at least in academic publications. But of course our own lived reality is far messier.

***Yes, names and concepts aren’t the same thing, and yes, I’m glossing over a lot of detail.

****In such a way of thinking, there is no room for magic, because magic is a kind of spiritual agency in the world, and if God’s hand has been withdrawn from our affairs, then whose hand is left to move in mysterious ways, its wonders to perform? There can be no miraculous cures or visitations of incorporeal spirits, because that domain of power belongs to God alone, and He’s not here. Christians have always said that magic is wicked sacrilege, but now they start saying there’s no such thing. Sometimes, confusingly, they say both.

Posted in Academia, Philosophy, The Modern, Weird Studies | 4 Comments

20th

My wife and I were married 20 years ago today. Happy anniversary, babe.

wedding

That’s the Weisman Art Museum in Minneapolis, by the way. It was a beautiful place to get married. It was, among other things, gloriously reverberant. One friend later said something like “I went to a great concert, and some people got married.” (A twist on the old joke: “I went to the fights and a hockey game broke out.”) Whatever else might be said about the life of a musician, you at least have a lot of musician friends who can play at your wedding.

The program:

Mia Hynes and Troy Gardner played the prelude, which was the first movement of the Brahms G major violin sonata. Mia then played the processional, which was the Beethoven Bagatelle in E-flat, op. 126, no. 3.

Finally, Timothy Dunne played Domenico Scarlatti’s G major sonata K. 13 as the recessional.

Thank you, old friends, for making our wedding so beautiful.

Thank you, Helen, for making my life so beautiful.


Posted in Life | 1 Comment

Hey Hey!


Note: I’m in the middle of a summer that is not letting up—writing, editing, planning, traveling—and it’s going to go right up to the new semester (and probably into the next three years, I should live that long).  Several likely blogpost subjects have come up, and I’ve done half-drafts of a couple, and…nothing.  Quoth the Red Queen: “It takes all the running you can do,” etc.  Phil has written on Deep Stuff, and I feel like I haven’t a thought in my head most of the time, save what I’m trying to cobble into the next paper/book chapter/detailed letter to a grad student/E-mail to colleagues, etc., at any given moment.  I apologize.  So I’m writing about something that just makes me happy.

——————

How long has it been since you drove on a hot day, windows down, stereo up, hearing a new good-time album?  For those who remember the ’60s or early ’70s, I’m guessing a long time.  For those born after, you’re excused.  There’s a particular kind of wackiness that can be heard on a lot of albums from that time: killer, ear-candy singles, occasional hilarious bits of session chaff, or stuff left on vocal tracks either accidentally or to suggest the in-the-moment recording experience, a cover version or two, and best of all, the one or two utter turkeys.  But you listen hungrily, song by song, while (as Randy Newman put it) “Santa Ana winds blowin’ hot from the north!”

Folks, the Monkees are back with a new album.  A sweet-as-an-orange-popsicle feel-good project on Rhino Records, Good Times is that album.  The whole band is there, including the departed Manchester Cowboy himself, Davy Jones—via a version of a Neil Diamond song, “Love to Love,” done back in the day, never released, and requiring only minimal punch-up.  Davy sounds great, the arrangement is school-of-I’m-a-Believer, and it’s the knife-edge between giggling and having a lump in your throat, because all the pieces are there but the person no longer is.  Others from back in the day that deserved and received resuscitation: “Good Times” (a Harry Nilsson song; he’s playing on it, even) and a cover of Carole King’s “Wasn’t Born to Follow”…enjoyable, though I myself can only really conceive of that one in the McGuinn/Byrds incarnation.  There’s a Boyce and Hart number, and there’s another one—“Gotta Give It Time”—in the trying-to-convince-her-to-go-the-distance genre (think “I Think We’re Alone Now” by Tommy James and a gazillion others).  All upbeat, catchy, good-humored.  As for the turkey, well…sorry, Mr. Nesmith.  “I Know What I Know” has apt period hypersensitivity and turgidity, but that verisimilitude doesn’t make it any easier to listen to.  Even if he wrote it back then and brought it back for this precise reason, it’s pretty leaden.

So the first point is: BUY THIS!  It makes me happy, and I like things that make me happy.  Slap it on the iPod and fire up the Millennium Falcon or whatever you’re currently driving and turn it up.

My second point is a bit more serious.  In the later eighteenth century, music for Kenner und Liebhaber—both for cognoscenti and for those who like a toe-tappin’ good tune, in other words—was particularly valued.  The broader potential audience didn’t hurt, of course, but there was a particular value in composing fine music that was so accessible that it could wear its learning lightly.  C. P. E. Bach, a musician’s musician, published music with that German phrase in the title, and Mozart and his father discuss similar ideas in their correspondence: it was high praise indeed to write music everyone could enjoy but which highly educated musicians could admire and dissect on their own professional level.  In my own work, I think about this a lot; music that does not require repeated listenings and self-education—and, indeed, conquest of initial resistance—tends to be distrusted.  How good can it be, the thinking seems to run, if I can just immediately like it?  We descendants of the German tradition hardly want our music to be easy, after all.  Chopin, Dvořák, and Gershwin have all paid a critical price for the appeal of their music…

Good Times repays repeated listenings because it works on several levels.  You get these wonderful, catchy, chirpy songs (old and new alike), yes.  But you also get the inside jokes in the songwriting, in the recording, and a variety of references to 1960s albums and musical genres: a reprised version of one of the songs, some poppy sitar-psychedelia, everything about the pacing.  Your non-music-major friends will be the Kenner on this trip, getting jokes and references that leave you behind.  The entire thing is a love note from the Monkees to their past and to their fans, and…y’know?  You’ve been working hard.  You deserve this.

Posted in Old Honkytonk Monkeyshines, Pop Aesthetics, Pop Culture, Rock | 1 Comment

Misc. Gifs

Dedicated readers already know that I like animated GIFs. I love finding funny little moments in online videos and giffing them up. At a certain point, though, I end up with a backlog of neat animations for which I’ve never found a good purpose. Not yet, anyway. Rather than let them languish on my hard drive, I’ll share a few with you. Maybe Music Theory Augmented or When In Musicology can think of what to do with them.
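(Sidebar for the curious: the giffing itself is no dark art. Here is a minimal sketch of one way to carve a GIF out of a video clip, assuming you have Python and ffmpeg installed — the filenames and timestamps below are hypothetical placeholders, not any clip in particular.)

```python
# A minimal sketch of gif-making, assuming ffmpeg is installed.
# "clip.mp4", "weave.gif", and the timestamps are hypothetical.
import subprocess

def make_gif(src, dest, start, seconds, width=480, fps=15):
    """Cut a few seconds out of src, starting at start, and write a looping GIF."""
    # The fps and scale filters keep the file small; lanczos gives a cleaner downscale.
    filters = f"fps={fps},scale={width}:-1:flags=lanczos"
    subprocess.run(
        ["ffmpeg", "-ss", start, "-t", str(seconds), "-i", src, "-vf", filters, dest],
        check=True,
    )

make_gif("clip.mp4", "weave.gif", start="00:01:07", seconds=2)
```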

Boxing offers many metaphors for life: “saved by the bell,” “down for the count,” “throw in the towel,” “come up to scratch” (from the old bareknuckle days, when answering the start of a round meant walking up to a scratch mark on the ground), and my favorite: “didn’t lay a glove on him.” Meaning, well, this:

weave

And even better, this: Muhammad Ali slipping something like 20 punches and then wiggling his butt at the hapless Michael Dokes for good measure:

ali

Let’s see, academic situations to which these clips could apply … a successful dissertation defense? Overwhelming superiority demonstrated in the Q&A of your conference paper? The effectiveness of evaluation forms as a medium of revenge for disgruntled students?

But if you’re looking for a pugilistic visual metaphor where the punches actually land, you could do worse than this one, from the brawl between Don Frye and Yoshihiro Takayama, back in the early days of MMA, before they discovered defense.

beating

I’m sure we’ve all seen Q&A sessions that looked like that.

“But what about brawls in space? With robots and monsters?” Got you covered.

punchout

Eric Henry and Syd Garon, “Sneak Attack,” from “Wave Twisters”

“But what if I wish to convey the idea that, while I have not yet committed an act of violence, it’s absolutely the next item on my agenda?”

rarr!

Genndy Tartakovsky, “Chapter Six,” from “Clone Wars”

Patrick Dunn, one of the few working academics writing openly about magic and the occult, wrote that a clip of a raccoon trying to eat cotton candy is “the perfect metaphor for writing occult books.” Or writing anything at all, I might add.

raccoon

Sometimes I just like to make GIFs of moments in films that give me a little shudder of strangeness, like this clip from Joseph Cornell’s surrealist cut-up, Rose Hobart:

rose hobart

Or this odd scene with Grace Zabriskie in David Lynch’s Inland Empire:

murder

This seems like a reaction GIF waiting to happen. There are just so many situations a professional academic encounters in the course of a work day that call for brutal fucking murder.

Cartoons are especially good for GIFs. They loop easily, and good animation offers tiny fugitive moments in which the full anatomy of a gesture is laid bare. Like,

fighting back tears:

pearl cries

Rebecca Sugar, “Cry for Help,” from “Steven Universe”

or getting away with imposture:

leopold2

This is from the Looney Tunes short, Long-Haired Hare, in which Bugs defeats his foe through the novel expedient of pretending to be Leopold Stokowski. The little gesture enshrined in this GIF combines Bugs’ maestro impersonation with that little giveaway hitch of his right eyebrow. It’s like, Am I getting away with this? Yes? Then how about …

leopold

THIS!!!


Posted in GIF post

On John Crowley’s Chemical Wedding and the Switched On Pop podcast

I’m working on another long-winded blog post to follow up on the last one, but it will take me a while to think through it. So in the meantime I want to write about a couple of things about which I am currently very enthusiastic. They are my whole shit.

Whole shit no. 1: John Crowley’s Chemical Wedding

https://i1.wp.com/con-or-bust.org/wp-content/uploads/2016/05/6e91009b67fd4afad0cc4ce9cf2c513f.jpg

John Crowley is publishing The Chemical Wedding, his retelling of The Chymical Wedding of Christian Rosenkreutz. The latter is a somewhat mysterious book published in 1616 by a secret society known generally as the Rosicrucians. Musicologists know of the Rosicrucians (if at all) through Erik Satie’s association with a group of their self-proclaimed spiritual heirs. I could go into a thing about how the Rosicrucians were an esoteric society of secret teachers with a mission to bring spiritual enlightenment to a world riven by religious conflict, or maybe the whole thing was a prank. But I won’t, because there’s quite enough history-of-weird-ideas stuff on this blog, and what I really want to write about is John Crowley, who, if you put a gun to my head and forced me to choose, I would say is my favorite novelist. At least, he wrote my favorite novel, Little, Big, which I tell all my friends they need to read (grabbing them by the lapels and shaking them, if necessary, to make my sincerity truly felt and known) and which they generally bog down in by about page 100. “You’re supposed to get bogged down!” I tell (yell at) them, but they never believe me.

An old friend, whose favorite novel was Thomas Mann’s The Magic Mountain, once told me that its formidable reputation for Teutonic longueurs was actually deserved, but that I should read it anyway, because longueurs are the point. The novel starts off with its protagonist, Hans Castorp, going about his ordinary life and preparing for a career in shipbuilding or some shit, and this part of the story is like our own daily lives, neither particularly eventful nor particularly uneventful. Then Castorp travels into the mountains to visit a cousin at a tuberculosis sanatorium and at first plans to stay only for a few days, but that turns into a couple of weeks, and then his departure is delayed again … and this part is very uneventful. It is at this point that most readers will put the novel down and give it up. Well, I guess literature is supposed to be boring …

But my friend’s point was that Mann is trying to do something very difficult here: he is trying, in prose, to convey the feeling of time slowing down. Or maybe it’s not quite right to say that time is slowing down — it’s just as true to say that years slide by in no time at all — but it’s certainly true that time begins to move differently; it takes on a different texture; its passage feels different from before. Castorp doesn’t really notice this, though, any more than we notice how different our dreams feel from the waking life we have left behind. He simply finds himself deep in the dream, folded into a blanket of congealed time. And Mann wanted to give the reader exactly that feeling too. Which is, you must admit, some trick to pull off.

So my theory is that Little, Big pulls off a similar trick. The novel starts, like The Magic Mountain, with a strenuously normal young man on a journey into a realm where time means something different — a realm from which he will never really return. The journey starts off in the world as we recognize it, on cracked blacktops past industrial parks and weed-choked vacant lots, and moves by degrees deeper into an old-weird-American magical realist hinterwelt. The young man becomes the half-unwilling protagonist (or maybe just a bit player?) in a multi-generational story of a family living in a shambling architectural folly surrounded by a wood inhabited (maybe) by fairies. I can’t possibly do justice to this long and complex novel in a single paragraph, but it’s enough here to say that one of the running conceits of the novel is the cyclical time of the seasons, and the long second section of the book takes place in the near-stasis of drowsy summer. So when you start feeling like Little, Big is going nowhere and think goddammit, Phil Ford just got me to waste ten bucks on this stupid thing, keep reading. Fall is coming, and then winter, and then spring again …

But anyway, the point is, The Chemical Wedding, an allegorical tale of spiritual enlightenment that is also just a really fascinating and peculiar story, is being retold by my favorite novelist. Really, that’s all I have to say about that. Oh, and there’s a Kickstarter for the hardback edition, which will include woodcut illustrations by Theo Fadel and promises to be a really lovely book. So pitch in if you like lovely books.


Whole shit no. 2: The Switched On Pop podcast

I was vaguely aware that the Switched On Pop podcast existed, but I never listened to it until a couple of days ago. It is now my whole shit.

I’ll admit, sometimes I roll my eyes when I hear people talking about “public musicology” as if all that is required is musicology plus the public. Which will somehow just show up if you put a simplified version of your usual academic stuff on a blog. As if the only thing that’s been preventing us from finding a public is the right platform. As if the public would turn up if only the New York Times would run my 10,000-word piece on organ tuning in 16th-century Zurich, suitably condensed.

The thing I have never once heard brought up in discussions of “public musicology” is the most obvious point, which is that to do public musicology, you need to entertain a public, which in turn means that you have to take entertainment seriously. But academics hate the idea of calling what we do “entertainment.” So vulgar! To call a scholar “entertaining” conjures a mental image of someone in striped pants furiously mugging with a cane and a straw boater. And this is not entirely unjustified; we’ve all seen would-be popularizers of classical music trying way too hard to be fun and madcap.

jolson

Painful to watch, really.

The thing is, entertainment has its own rules; it is its own discipline. If you want to be a public scholar, you need to learn that discipline, and it isn’t any easier than what you learned in graduate school. And you will need to stop thinking that you’re too good for the ol’ hat-and-cane, or that you’re doing anyone any favors by condescending to explain Mozart to them. Entertainment isn’t a means to an end; it is its own end. Public musicology isn’t just the spoonful of sugar that lets the medicine of classical music go down. If you’re doing it right, it is work that stands on its own merits.

The number of people who are doing it right is vanishingly small. But let’s add Nate Sloan to the roll call right now. Nate is a Ph.D. candidate in musicology at Stanford University, and he and his friend and fellow musician Charlie Harding have created 36 episodes (and counting) of a podcast in which they get deep into the details of pop songcraft. I started with the most recent episode, which deals with the Jonas Brothers and their recent bids for pop-culture reinvention. Before listening to this show I knew nothing and cared less about the Jonas Brothers; if you had told me, before I started listening to the podcast, that I was going to hear close analyses of “Cake by the Ocean” and “Close” followed by a comparison between a Disneyfied boy band and W. A. Mozart, I would have thought, ugh, this shit again.

jolson

I would have been wrong. Listen. This is what public musicology should sound like. My 14-year-old daughter overheard me listening to it while I was cooking dinner and was at first suspicious (“is that music theory?”) but is now hooked on the show. That’s a test as rigorous as peer review right there.

Bonus shit: My Hairless Face

Last week I was trimming my beard and just kept on going. So I am now, for the first time in 23 years, beardless.

beardless!

Consider what that means: the last time I shaved, it was during the first months of Bill Clinton’s first term. And the web, for all practical purposes, didn’t exist. My wife and I have been together for 22 years and she had never seen my face. (She took it pretty well.) I have been having fun walking around Bloomington for the past few days and seeing people I’ve known for years walk right past me without a flicker of recognition. This was OK for a while, but at a certain point I got tired of having to say “Hi! It’s me! Phil!” So herewith a picture of my hairless (if five-o’clock-shadowed) face.

Posted in Books, Life, Podcasts, Pop Culture | 2 Comments