Universally Monstrous: Filmland’s Famous Fiends and The Humanity of Horror

Although lauded for their archetypal images, gloriously gothic sets and stark cinematography, the Universal monster movies of the 1930s and 40s are often criticized for deviations from — and oversimplifications of — their literary, folkloric and mythological source material.

Where Karloff’s creature could only grunt, Mary Shelley’s man-made man read Milton. Stoker’s wild man of a vampire, a pointy-eared, bushy-browed, mustached aristocrat, is a far cry from Lugosi’s thin-lipped, slickly coiffed and quaintly caped count. And what self-respecting werewolf — save Lon Chaney Jr.’s — would run amok in dress pants and a button-down shirt?

Yet, more than seventy-five years after the release of Universal’s Dracula (1931), Frankenstein (1931), [the superior] Bride of Frankenstein (1935) and The Wolf Man (1941), the studio’s monsters remain the definitive versions of these pop culture icons, and for good reason.

A picture is worth a thousand words. And while it is undeniable that twenty-first-century audiences have the advantage of 24-hour access to a constant stream of content, cinemagoers of the thirties and forties had something that very few living have today: a singularity of shared cultural experience.

Born of the shadows of German Expressionism, with sights and sounds the silent age couldn’t offer, horror films of the thirties and early forties were one of the few forms of escape for audiences under the tremendous weight of the real horrors of the Great Depression. That shared experience was like an atom bomb dropped on the cultural landscape. Yet unlike the fears of the bomb (and all manner of death from the skies) that defined horror films of the fifties, the monsters of Universal’s heyday had an odd innocence and air of dread that made them all the more… human.

The penultimate, heartbreaking scene from James Whale’s Bride of Frankenstein is, on its own, reason enough that Universal’s monsters leave an indelible mark on most viewers. As Karloff’s creature reaches out for his newly made bride, beckoning with shaking hands, her rejection is palpable. Filled with tension, the scene is all at once comic, hopeful, repulsive and devastating.

Horror works best when it plays on a range of emotions. The sustained beat of much of modern horror, with its often relentless assault of gore, certainly has its place. But one need only watch something like the recent, brilliant episode of THE WALKING DEAD entitled “The Grove” to see the difference between monsters and the truly monstrous.

Beneath the superficiality of often short-sighted scripting, Universal’s monster movies were effective because of their humanity. Commenting on the Frankenstein monster’s makeup alone (Jack Pierce’s, and later Hammer Horror’s version), Stephen King, in Danse Macabre, sums up that effectiveness simply:

“there is… something so sad, so miserable there that our hearts actually go out to the creature even as they are shrinking away from it in fear and disgust.”

Long after any blood hits the screen, the lasting impressions of the horrible and grotesque in cinema lie in their ability to engage our sense of dread. Whether we cringe or jump, laugh uneasily or sit frozen, pulse pounding, watching horror films provides us with release: release from knowing what lies at the heart of nature, from the thing within the human condition that sublimates yet secretly knows that we all must inevitably confront the horror that comes with simply being. It is a horror that ends in death.

“We belong dead,” says Karloff’s creature at the end of Bride of Frankenstein, emotionally devastated as he literally brings down the house. And while that action may seem quaint and comical to modern, more sophisticated (and possibly jaded) audiences, Universal’s monsters share with currently popular, quality shows like THE WALKING DEAD an understanding of the human condition, and therefore an undeniably powerful undercurrent of what true horror can be.

From Janus to Mnemosyne: Memories of the Year to Come

Time is elastic. 

When I wrote of the fluidity of time, I should instead have recognized its elasticity.

It’s a relatively simple equation: a year divided by the total years lived yields a percentage of perception. To a sixty-five-year-old man, a year is roughly 1.5% of a life lived. For him, a mere twelve months fly by. But to that man’s five-year-old grandson, that same year is 20% of the child’s entire life. For that boy, summer seems an eternity away.
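That arithmetic is simple enough to sketch in a few lines of Python (the function name and the one-year window are my own, purely for illustration):

    def perceived_share(years_lived: float, window_years: float = 1.0) -> float:
        """Fraction of a life lived that a given stretch of time represents."""
        return window_years / years_lived

    # The essay's two examples: one year as a share of a life lived so far.
    for age in (65, 5):
        print(f"At age {age}, a year is {perceived_share(age):.1%} of life lived")
    # At age 65, a year is 1.5% of life lived
    # At age 5, a year is 20.0% of life lived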

At the beginning of a new year, time seems so pressing. There’s an increased weight to the present. Yet none of us are prescient, and all of us wait. If only time could be sped up. Run backward. Paused. Played back again. Is that memory?

We can remember, but can we will ourselves forward — beyond the next moment? Like the tension on a rubber band, can we stretch our minds and snap across time?

MENTAL TIME TRAVEL

Chronesthesia. Episodic memory. Mental time travel. First suggested by Endel Tulving in the 1980s, mental time travel refers to the ability to be aware in the present of both one’s past and one’s future.

A process of episodic thinking, travel to the past draws on the memory of autobiographical events. I recall celebrating New Year’s Eve in Times Square this year. I had a lot to drink.

Travel to the future is the recall and integration of relevant information from memory coupled with the projection and processing of self-reference in subjective time. I will celebrate New Year’s Eve in Times Square again this year. But I won’t drink as much.

For Tulving, awareness of past and future comes down to the perception of self in subjective time. We presume we were present in Times Square on New Year’s Eve because we can remember the time, the place, the sum of our sensory data and even our emotions from a given moment. We then assume we can be present at a future point in history because we can use relevant episodic memory to conceive of a time, place, emotions and sensory information familiar to us and associated with that time.

We think in cycles. And while much of the pattern is due to our conditioning from the calendar, some of it is simply hard-wired into our DNA. We look back not in a line but in a circle. We can project forward because we know the wheel will come around again.

Which takes us back to January. And the timing of this post.

While looking forward to the future has been a part of the human condition pretty much since there was a human condition, celebrating the new year in January is a relatively new phenomenon (and writing about it near the end of the month? sheer procrastination).

MARCH MADNESS

Rituals give meaning to the passage of time. From the primitive cogitations of earliest man to the shallow observations of Ryan Seacrest, the passage of time necessitates ritual meaning. But the date of that celebration has been debated for millennia.

Bust of Janus at the Vatican (Wikimedia Commons)

Four thousand years ago, the Babylonians celebrated their new year in late March, around the time of the vernal equinox. Others among the ancients, including the Egyptians, Persians and Phoenicians, instead began their year with the fall equinox. And the Greeks marked their new year with the winter solstice. Janus, for his part, was a Roman god: a deity of doorways and passages, endings and beginnings, looking forward and back to both new year and old.

January did not even exist until around 700 B.C.E., when Numa Pompilius, the successor to Romulus and second king of Rome, added it. The first time the new year was celebrated on January 1st was in Rome in 153 B.C.E., when the start of the year was moved from March to January to honor newly elected officials of the Roman Republic. Still, as with most pagan customs, the celebration of the new year in March continued for quite some time.

January 1st was officially instituted as the beginning of the new year in 46 B.C.E., when Julius Caesar adopted a solar-based calendar (hence the name Julian calendar). The practice would last for six hundred years, until the Council of Tours, where the celebration of January 1st was deemed pagan and the new year was moved to December 25th, the birth of Christ. Celebrations at the vernal equinox were absorbed into Annunciation Day (the day Mary was told by the angel Gabriel that she would bear a child [i.e., the conception, and therefore nine months before his birth]).

Still, many countries stuck to March as the start of the new year. The Church responded by celebrating January 1st as the day of Christ’s circumcision (now better known as the Solemnity of Mary), thereby raising the significance of the first day of the Julian calendar to some higher purpose that could hope to compete with an equinox. In 1582, the Gregorian calendar (named for Pope Gregory XIII) was instituted as a replacement for the Julian calendar; it made some mathematical corrections to, among other things, realign Easter celebrations with the vernal equinox. With the equinox thus retained as a holiday, January 1st could be reclaimed as the beginning of the year.
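Those “mathematical corrections” included a one-time removal of ten days and a refined leap-year rule: century years count as leap years only when divisible by 400. A minimal sketch of the latter in Python (the function name is my own, for illustration):

    def is_gregorian_leap_year(year: int) -> bool:
        """Gregorian rule: every fourth year, except century years not divisible by 400."""
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    # Under the older Julian rule (simply every fourth year), 1700, 1800 and 1900
    # would all have been leap years; the Gregorian rule skips them, keeping the
    # calendar close enough to the solar year that the equinoxes stop drifting.
    for y in (1600, 1700, 1900, 2000):
        print(y, is_gregorian_leap_year(y))
    # 1600 True, 1700 False, 1900 False, 2000 True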

Most western European countries soon changed the start of the year to January 1 (including Scotland, in 1600). But England, Ireland and the British colonies (including the colonial Americans) took their grand old time, only officially adopting the Gregorian calendar’s January 1 start of the year with the Calendar (New Style) Act 1750 (which took effect in 1752).

A few hundred years later and the issue is finally settled.

ALL LIES IN PERCEPTION

The history of how humanity has measured the new year has little to no bearing on how each of us marks time. It is the individual perceiving time’s passage who determines the beginning or end of the year.

You may think of it as the beginning of the new year. To someone else, it’s Wednesday.

Not surprisingly, our methods are not tied to calendars. Instead, as neuroscience and psychology would have us believe, time, and our perception of it, is tied to memory and expectation.

The year begins in January because that’s how we personally remember it. We are aware of our past and our future because we can conceive that one happened and the other will happen.

We have memories of the year to come.

By Christopher Michael Davis