Tuesday, September 28, 2010

A Tribute to the Horse, Or: The Anatomy of an Obsession

I believe it’s an Arab legend that says God took the wind and shaped it into the first horse. To non-horse-people this probably has about as much emotional impact as saying God took the Slinky and shaped it into the first cat. To horse-people it is the stuff of ecstasy. Could there be any myth more fitting than one that harnesses the wind into the arch of a neck, the toss of a mane, the thunder of hooves? Human beauty may inspire more poetry, but I think equine beauty would inspire better, if only somebody more talented than myself would write it. That’s a sidelong apology for waxing poetic as I always do when the subject turns to horses, but this post is my way of explaining myself, if I can.

When it comes down to it, I think no language has a term for the horse that is beautiful enough to suit that prince of animals: horse, cheval, caballo, certainly not the unfortunate pferd—who came up with that one? At least Icelandic’s hestur has something of the rush of wind in it, and Latin’s equus has a liquid, flowing beauty, if only it weren’t for the word’s recent association with a Daniel Radcliffe play. So, beautiful or not, horse is the term we have, but the thing itself is so much greater.

It’s not just that the horse is the most beautiful creature God put on this earth (me? Biased?)—though I seem to harp on aesthetics more than anything else. What makes the horse’s beauty unique is that it is the rare kind that man can touch without spoiling it. There are few things as wonderful as a horse and rider working in tandem, both loving the ride, both supporting and encouraging each other. If dog is man’s best friend (and it’s true that a dog on a bad day is more effusive than a horse on a good day), horse is his only partner, in everything from farming to warfare. That partnership has too often been abused, and it almost always falls short of perfection (have you ever seen a rider take a header when his horse decides he doesn’t feel like taking that jump after all?), but I’m an optimist, and I refuse to be deterred.

I’ve always been a lover of all things with fur. I gravitate toward animals at social gatherings the way some people gravitate toward children or the open bar. But horses, horses are special. Through no fault of my poor parents, who don’t like the creatures themselves but very kindly indulged me, I have been madly obsessed with horses ever since I can remember. Collecting toy horses, drawing horses (I was forbidden in middle school art class from picking horses as the subject for any more projects), taking pictures of horses, reading about horses, watching for horses on every drive through the country. And, of course, when the opportunity presented itself, riding horses.

I must admit I am not a particularly good rider—I have barely enough leg muscle to make myself go, much less a horse, and my balance is questionable at best (as evidenced by my remarkable tendency to slide right off the saddle at the slightest provocation). But what kind of obsessee would I be if I let that stop me? Whenever I’m around “real horse-people”—people who own horses, are in “the business” of horses, grew up in the saddle—I inevitably reveal myself to know very little about horsemanship and to be capable of even less. (Witness last weekend, when I was given multiple sets of instructions on how to lead a lazy horse faster, all of which completely failed in execution. YOU may be able to lead a horse to water, but it remains an open question for me.) But what does that matter to one obsessed? I make myself useful mucking stalls and sweeping aisles—anything to be near horses, to have the sweet smell of hay and saddle leather around me. I think it’s a very good thing that horses are entirely unimpressed by admirers, because if they were any more impressionable sort of being I think I would prove to be a terrible sycophant.

Now, it’s hard to be romantic about horses when you’re covered in dirt and working with the average old nag you find in most barns—maybe that’s why “real horse-people” tend to make fun of their horses more than they boast about them—and any horse can be just as exasperating and uninspiring for a horse-person as a screaming toddler is to a kid-person. They bite. They kick. They roll in the mud right after their bath. They stretch their necks out and drag their feet so that they amble like cows. They run you into fenceposts when you’re riding. And do they really have to drool so much when they pull their heads out of the water bucket? But then, a magical moment will happen all of a sudden between a horse and his human—he’ll arch his neck and throw his shoulders into his trot like a dancer showing off for the audience, he’ll prick his ears and look just like the most beautiful study Leonardo ever dreamed of, he’ll rest his face against your chest and sigh softly into your shirt. And those are the moments that even down-to-earth “real horse-people” secretly delight in and come back for, again and again.

So I remain unabashedly obsessed, and unabashedly romantic. In fact, when I see movies I’m usually more enamored of the hero’s horse than of the hero—give me Silver and you can keep the Lone Ranger; give me Tornado and…we’ll negotiate about Zorro. When Shadowfax materialized on the screen, unearthly and breathtaking, I sincerely forgot that Gandalf and Legolas were even there. I think it brought tears to my eyes.

It has become a joke with my family and my friends that I turn into a five-year-old whenever I see a horse: whether it’s onscreen or out the window, I have an embarrassing tendency to squeak, “Horse!” and point him out, even if he’s nothing more than a spot in the background. I try very hard to suppress this in professional situations.

Looking over this post, I know I haven’t done justice to my fixation, or indeed to the poetry that is a horse at his best—horse-people will know what I mean without my being able to articulate it, and non-horse-people can just smile and shake their heads. I can’t make someone not already inclined to it catch their breath at the flare of a nostril, or go lightheaded at the sound of hooves on turf, or skip with excitement at the thought of getting to pet the pony at the petting zoo. For a horse-person, no horse is too average to invoke the ideal, and no contact is too trivial to set the heart racing.

If you don’t believe me, stop by my apartment: it’s plastered with pictures of horses cut from old calendars. They may be the most convenient and economical method of covering bare wall space, but I’ve been known to spend long and happy moments just admiring whichever picture my eye happens to fall upon, no matter how often I’ve seen it. Such is the way of obsession, and long may it live! We should all have such things that set us thinking of beauty and joy in a world that’s got far too little of either.

Tuesday, September 21, 2010

The Ricecapades: Confessions of a Haphazard Chef

(As promised, some lighter reading this time around!)

I love cooking. And several people in my life (my mother, my grandmother, even my first boyfriend) took great pains to set me on the right track and instruct me in that mysterious art of turning raw eggs, dry beans, and baking chocolate into omelets, soups, and cakes. In fact, some of my most cherished moments past and present are exchanges of recipes and cooking secrets, but sometimes I do get the feeling I’m probably the most recalcitrant student who ever entered a kitchen.

You know how everybody jokes that if you put a sign up next to a red button that says, “Do not push,” you’re effectively guaranteeing that the next person who stops by will reach right out and push the button? I’m rather like that in the kitchen: let a recipe say, “Fold, do not stir,” and I’ll immediately start to wonder, “What would happen if I stirred?” And so I do. Of course, if the recipe had said, “Do not stir or your meringues will turn to toffee,” I would have folded. I learn either by thorough explanation or by (sometimes calamitous) experimentation—those are apparently my only two methods.

One of the reasons for my cavalier attitude toward instructions, I suppose, is that following them does not guarantee smooth sailing. Take, for instance, the instructions on the rice bag: “Fluff with a fork.” The first time I tried that I hit a pocket of steam that burned my knuckles. Instinctively I pulled my hand away, with the fork still in my grip, thus flinging sticky rice ALL OVER THE KITCHEN. I was cleaning rice off the stove, the floor, the walls, the cabinets, and even the window for a week. Laughing the whole time, by the way. More recently, an attempt to pour rice one-handed out of the bag and into a measuring cup resulted in at least half a cup of rice scattering in all directions across the counter and the floor. Hence the title of this post. At least it wasn’t sticky this time. Apparently I should just be forbidden from cooking with rice—though I have to say I never burn it to the bottom of the pan!

But my other reason for disregarding instruction is the sheer joy of experimentation. I’m like a kid with a chemistry set, only I get to eat what I make (if it’s palatable) instead of just having to clean it up afterwards. What’s the fun in following a recipe when you think you might have a better (i.e. tastier, or maybe faster) way of doing it? Sometimes your way turns out to be the wrongest way possible, but you never know until you try.

Therefore, I substitute shamelessly, and I measure with the precision of a chimpanzee. My excuse for the first is that while I lived in Philadelphia and was cooking on a shoestring budget, it became a bit of a game of mine to substitute or completely replace at least one ingredient in every recipe I made. My excuse for the second is that when I was in Ireland I had no measuring cups or spoons and learned to cook using the hit-or-miss “eyeball” method. One steers clear of messing around too much with the chemically reactive ingredients, of course (yeast is yeast and baking soda is baking soda, and neither—I have discovered—is the same thing as baking powder), but everything else is fair game! Real cooks will no doubt be horrified by my propensity to substitute a tablespoon of “Italian Seasoning” for pretty much any savory spice I don’t have on hand or have never heard of. “Use 1 C butter, not margarine.” Why not? In goes the Country Crock! I should note that I have never had a disaster caused by substituting margarine for butter. “Add a dash of cayenne pepper.” Is a dash bigger or smaller than a pinch? Shake the spice bottle over the pot until something comes out. I should also note I’ve never had a disaster caused by using a pinch instead of a dash—though shaking spice bottles without a shaker lid can, indeed, cause catastrophes.

Here, for your entertainment and not necessarily for your reference, are some substitutes I have tried in recent memory, for better or for worse:

Honey is NOT a satisfactory substitute for syrup when applied to pancakes. Syrup, however, can be a tasty substitute for molasses in bread dough.

Italian Seasoning, however flexible, is NOT good in three-bean salad that calls for fresh basil.

Skim milk CAN stand in for whole milk, whatever they tell you—nothing is going to blow up or fail to bake because it’s got less fat in it. It may turn out less thick or less moist, but that’s what you’ve got to expect when you’re cooking with milk-flavored water instead of the real thing!

Likewise, I’ve never, ever been able to tell the difference in baking and cooking with non-fat sour cream as opposed to full fat. Though maybe I’m a philistine.

Canned chicken works just fine in pies and chicken salad—any dish that has enough other flavors to disguise the slight tuna-fishiness that the can leaves behind.

If you’re going to substitute cocoa powder for baking chocolate, use shortening and NOT vegetable oil if the chocolate is supposed to set (like in icing)! I had a very droopy cake result from using oil instead of shortening. Think of that scene in Sleeping Beauty where Fauna the fairy decorates the cake before she bakes it.

Blueberry pie filling is NOT the same as canned blueberries. Pie filling can, however, make a very nice pancake or bread loaf—but it will turn out the color of Barney the Purple Dinosaur. So will your teeth.

And of course we’ve already established that salt and baking soda do not accomplish the same thing when added to boiling pasta. (See my blog from last September.)

I make myself out to be a real dunce in the kitchen, but I promise I actually make very tasty things—or at least they’re tasty to me. And I’ve only given myself food poisoning once! How else are we supposed to discover new things? (Though suddenly I’m foreseeing a precipitous drop in the number of invitations I receive to potluck dinners….)

Sunday, September 12, 2010

Going Medieval

I wrote this essay a while ago in Iceland, but I was reminded of it by a recent discussion of periodization in our grad student medieval colloquium. (Yes, medievalists are so enthusiastic that they need a colloquium to share the thoughts that spill over when class time runs out.) It turns out my views here have already been expressed by Hegel (go figure), but I hope I’m at least more readable than he is.

Disclaimer: avert thine eyes if a little Catholic dogma will offend thee! Check back next time when I tread more neutral ground. I am contemplating a post entitled “Ricecapades.”

Some Modern Medieval Thoughts on History

As a budding medievalist, I had always been puzzled by an old question posed by some of the foundational scholars of literary studies: when did Western culture develop a sense of history? The reason I was so puzzled is that scholars traditionally answered this question by saying “in the Renaissance,” and this tradition, transmuted into Core Principles, has become a persistent if often disputed pillar of academic thought over the past hundred years. But this answer made no sense to me. Had those making this claim never read Chaucer, with his comical aside that “we don’t court today the way Troylus and Cresseyde did back in ancient Troy, but they got along well enough in love anyway”? Had they never encountered the pervasive sense of nostalgia that cast the eyes of medieval thinkers back upon the Classical era as one of nobility and prelapsarian achievement from which their present day had fallen? Did they honestly believe that the entire medieval culture existed in a Freudian infantile state, not knowing that “then” isn’t “now” and “now” isn’t “later”—that humanity had collectively not yet reached the Lacanian mirror phase and learned to distinguish “I” from “Other”? The notion struck me as ridiculous.

The notion IS, in fact, ridiculous, and it took me a very long time to realize that this is not what those scholars (or at least the good ones) were suggesting. In fact, scholarship of the past twenty years has pretty well booted that old answer out the door—the thing is, many scholars still speak (and worse, teach) as though the Renaissance is when history came to be. But on a personal level I have recently discovered that the question itself had always seemed odd to me because those who asked it were speaking from an entirely different perspective than mine. I had never felt that the medieval sense of history was infantile or primitive because, in fact, I share it.

When medieval thinkers reflected on the past, it was with a sense that the past was still a part of them. All the figures of history had some significance for the present, whether as precedent or symbol. This attitude is the founding principle of typological readings of the Bible—indeed, of the interpretation of all history: David was a prefigure of Christ and the Vandals who sacked Rome were a prefigure of the Vikings who raided Northumbria. Nothing in the past was entirely remote from the present. This is the attitude cultivated by the Communion of Saints. The dead were no longer “here” in a physical sense, but the physical was all that separated them from the present, and even that barrier was permeable; they interacted with the “now” through miracles and were accessible by relic and by prayer. Sometimes, they even sat up in their corporeal bodies and spoke! Even with such extremes set aside, in a time organized around saints’ feast days and a geography organized around pilgrimage, Peter and Paul were no more remote than a father or a brother who died last spring.

Granted, the medieval sense of what actually constituted history was significantly different from what has counted as history since the Renaissance; legend and myth were just as potent as factual events, and for much of the medieval period only the most penetrating and exacting minds drew strict lines between them. As it turns out, there’s much to be said for such a broad view of reality—but that’s a soap box for another derby. The definition of history is not what those foundational scholars I mentioned were primarily interested in. If it were, they would have asked instead when Western culture began to define myth as a genre. And even then, the answer would not be “in the Renaissance,” because such thinkers as Bede and even as far back as Augustine had a clear sense of the fantastic as opposed to the factual. What our scholars were really asking about was when Western culture began to encircle the individual in his discrete moment with a fence that segregated him from everything that was not “I” and “now.” Essentially, they were asking when the Communion of Saints died.

And now we can understand why the answer has so often been “in the Renaissance.” With the rise of humanism, man began to see himself as a remote being, unconnected to the past, separated from the future, and cut off from all the other remote beings floating in space around him as if by a cosmic accident. This at last is the “individual” that scholars of early modern literature have held the Renaissance “discovered.” I believe that current critical thought would say instead that the Renaissance CONSTRUCTED it—that is, this perception of individualism is a cultural phenomenon and not a natural one, or even the only one to which a civilized person might adhere—but the old ways are deeply ingrained in the academy and the old thoughts refuse to die. I myself would side with the “construction” rather than the “discovery” of the individual, of course, but not just because I am a scholar. Rather, because I am a Catholic. “Catholic,” by its very nature, encompasses universality—not an artificial conglomeration of discrete parts but an organic whole of creation. No man is an island; no moment is independent of what came before and what will come after. There is still a train of thought that holds Peter and Paul at no greater distance than a departed father or brother. The Communion of Saints has not died at all.

But, in some circles, it has been forgotten. When scholars of fifty years ago looked blandly back on the past and sought the moment in history when history itself came into being, they were on a quest that would have been incomprehensible to medieval thinkers, and it has resulted in the relegation of medieval thought, at least in some quarters, to the realm of childish credulity and simplicity. But the notion should be incomprehensible to a Catholic thinker as well. The reason I did not understand the answer is that I could not understand the question: the medieval sense of history, in its fundamental form, is my own. One of the most encouraging trends in the academy is an increased awareness of the intellectual validity and productivity of viewpoints other than the disinterested and narrowly scientific, and the academy will be all the richer if this more flexible thinking permeates the foundational structures that underlie university scholarship and instruction: it will open higher learning back up to seeing this innately Catholic viewpoint as more than a mere relic of the era before history began to be.