What does the Hebrew Bible have in common with horror movies? This question is not as strange as it might seem. It only takes a few minutes with the biblical texts to begin to realize that the Bible is filled with all kinds of horror. There are strange figures dripping blood (Isa. 63) and mysterious objects that kill upon touch (2 Sam. 6:7). Women are threatened, pursued, and even dismembered (Judges 19). The “scream queens” of horror are well matched by the screaming women of the Bible, especially in the prophetic literature, where women weep, cry, and howl in pain. (Even when it is men who are crying, their sound is compared to the sound of screaming women, as in Isa. 26:17-18). Even the repetitions of horror—the endless sequels, the killers returned from the (near) grave to haunt another day, the perky college students who just can’t stop going into the basement to find out what’s making that noise—have their parallels in the repetitions of the Hebrew Bible—the people who can’t stop sinning, the God who can’t stop finding new and appalling ways to punish them (in the book of Numbers: miserable food, disease, poisonous snakes, and strange fire, to name but a few). A better question might be not ‘What does the Bible have in common with horror movies?’ but ‘Why is the Bible so much like a horror movie?’.
I first became interested in reading the Bible in dialogue with horror because I was frustrated with the ways biblical scholarship repeated the same explanations about the prophetic literature, often without exploring how else we might understand the texts in all their bloody intensity. Intentionally framing the Bible as a work of horror destabilizes our sense of the familiar, while also opening new possibilities for reading—many of them inspired by new readings of horror films by critics, feminist and otherwise.
What does approaching the Bible like a horror film do for us as readers?
Classifying the Bible as horror lets us identify and address the parts of the text that are truly horrible, without trying to make excuses or sweep them under the interpretive rug. Instead, this approach shows that we might find continuity with other texts and scenes of horror. Isa. 63, the passage I have alluded to above, describes God’s appearance on the horizon, dripping with the blood of those he has trampled. This immediately brings to mind iconic scenes from The Shining and Carrie. In Hosea 2, God, terrorizing Israel (here represented as a woman), threatens to “hedge her up with thorns” and torture her, an image I have long associated with the barbed-wire death scene in Dario Argento’s Suspiria.
Horror also makes us think about gender, and what it means to have a female body. While both male and female bodies are subjected to outrageous violence and sadism, these bodies are not treated equally. In the Hebrew Bible, and in the prophets in particular, female bodies are disproportionately subjected to violence; rarely do they appear without being threatened. A common trope in horror films is that women who have sex are punished with violence, pain, or death; this is the case in Ezekiel 16 and 23 as well. On the other hand, death isn’t always a consequence of sex; God kills Ezekiel’s wife for no reason a few chapters later, in Ezek. 24:15-25. Lowbrow horror films, especially slasher films, are often criticized for their undermotivated violence; characters die “for no good reason.” The same is true, of course, of the Bible as well.
All this leads to another question, one often asked of horror films: What sort of person would want to watch that?—or, in the case of the Bible, What sort of person would read—or believe—that? If the Bible is truly as gruesome as a horror film, wouldn’t it be better simply to reject it? As a Bible scholar and (easily frightened) horror fan, I suggest the answer is no. The pleasure we take in watching Psycho or The Babadook—or I Know What You Did Last Summer or Child’s Play—is not simply sadism or misogyny. Instead, horror lets us complicate our experiences and play with categories. Ezekiel’s vision of the valley of the dry bones, in which the bones are brought back to life, takes on new (and more ominous) implications in light of zombie fictions as well as older films like Re-Animator. Horror also lets us play with gender. Here, one famous example in horror film is the “Final Girl,” the feisty, androgynous, female character who is the slasher film’s sole survivor (the term was coined by Carol Clover in Men, Women, and Chain Saws). The Final Girl helps us to understand a figure like Jael, the Kenite woman in Judges 4 and 5 who stabs the Canaanite general Sisera to death and lives to tell the tale. Like the Final Girls of horror films, Jael is self-directed, independent, vaguely masculinized – Sisera addresses her with a masculine grammatical form – and more than willing to stab a killer to death using a phallic weapon (in her case, a tent peg). Reading Judges 4 and 5 while watching Halloween or Scream draws out unexpected connections of all sorts.
In the summer of 2016, there was lots of buzz around the TV series Stranger Things. Newspapers and websites rushed to provide cheat sheets for millennials on all the echoes and references to the 1980s embedded in the show, from Steven Spielberg and John Hughes movies, to Nightmare on Elm Street or Stand By Me, or the way the whole plot hinged on Stephen King’s key theme of kids navigating the fallout from flawed fathers, figured in supernatural terms.
Not many, however, noted that Stranger Things, with its murderous, tentacled creature unleashed through a trans-dimensional portal into a small town by the experiments of a mad professor, owed virtually everything to the imagination of H. P. Lovecraft. He composed these scenarios over eighty years ago in classic stories like ‘The Dunwich Horror’ and ‘The Shadow over Innsmouth’.
Lovecraft’s influence stands behind many of the key cultural icons of modern Gothic and horror. There would be no Alien series without him, no Species, none of those David Cronenberg body horrors, no Clive Barker, and no Pan’s Labyrinth. (Director Guillermo del Toro has long harboured ambitions to make a blockbuster film of At the Mountains of Madness, the last attempt pipped at the post by Ridley Scott’s awful Prometheus.) There is also a whole post-millennial style of fiction, called ‘The New Weird’, which would be impossible without Lovecraft, although major contemporary writers in the mode, like China Miéville and Jeff VanderMeer, have an ambiguous and vexed relationship to the Old Weird.
It’s now pretty hard to imagine our monsters outside the squishy, tentacular paradigm conjured by Lovecraft. Amazingly, Lovecraft’s horrors have burst out of the chest of a small subculture and rapidly evolved into a creature that has transformed the very shape of our nightmares.
Lovecraft’s ancient god, Cthulhu, has risen from the South Seas as an awful menace that threatens humanity’s existence – but he is also a cuddly toy you can buy on the internet. In the 2016 American presidential race, you could even get a ‘CTHULHU FOR PRESIDENT’ T-shirt, which seemed entirely appropriate.
Lovecraft lost … and found
When Lovecraft died in Providence, Rhode Island, in 1937, he was destined for literary oblivion. He had published only in small-circulation amateur journals and struggled to get his stories into the pulp magazines, cheap fiction that reached a mass readership of millions but paid its writers very poorly. He ought to have found a home in Weird Tales, established in 1923, but his stories were often turned down there. He also wrote for pulps that were just beginning to stabilize around a newly coined term, ‘science fiction’. Lovecraft’s cosmic perspective, placing a fragile human race in the merciless context of astronomical space and time and the murderous competition for survival between biological species, sometimes chimed with editors of new journals like Astounding Science Fiction.
Yet Lovecraft made only a few hundred dollars here and there, his magazine work crumbling into dust, and he only ever published one limited edition book during his lifetime. He was considered a master of horror only amongst a small group of friends and fans, to whom he wrote voluminously throughout his life.
After his death, a number of these dedicated friends tried to interest New York publishing houses in collections of stories, but none were prepared to publish. This rejection prompted the establishment of the Arkham House press, which from 1939 published three volumes of his stories (hang on to those first editions if you happen to have one: they are now worth thousands). Yet when the esteemed American literary critic Edmund Wilson deigned to notice these volumes, he acidly declared that ‘the only real horror of most of these fictions is the horror of bad taste and bad art.’ This effectively condemned Lovecraft to a place entirely outside the literary sphere.
That outer darkness was where he stayed even when a mass paperback edition in the 1960s began to sell in the thousands, and then the hundreds of thousands, and even after horror hit the mainstream after the breakthrough of Rosemary’s Baby in 1968 and Stephen King began to publish in the early 1970s. Just as Gothic fictions were condemned as ‘terror novels’ that endangered public virtue in the eighteenth century, so horror fiction into the 1970s and 1980s was considered the outré, adolescent preserve of moody teens, a disordered sensation fiction that was certainly not literature. Even now, some might question Lovecraft’s appearance in a ‘Classics’ list.
A style for the times
As a writer of the 1920s and 30s, Lovecraft was decidedly not in the Modernist mode that dominates the literary history of that era. He fulminated against Futurism and particularly vented his scorn on T. S. Eliot’s poetry (although he shared Eliot’s very conservative politics). Lovecraft’s tortured style, with sentences that pile up adjectives in tottering heaps in hilarious violation of every creative writing tutorial ever conducted, is about as far from Hemingway, or Raymond Carver’s minimalism, as you could imagine. It is not tasteful.
And yet there is something extremely powerful and evocative that crawls out of Lovecraft’s sentences, which often evoke horrors so effectively precisely because they are so broken and strange. He is pushing to imagine forces that exist beyond human capacities, embodied in the pressure he exerts on the order of literary prose. At his best, Lovecraft’s dime-store pulp sublime really can shatter the niceties of beautiful literary prose in extraordinary ways.
My sense is that in conceiving malignant ancient creatures that stir underground or that arrive from the cold dark of interstellar space, Lovecraft is also speaking anew to a world that feels on the brink of catastrophic change. The sense that humans have irreversibly broken the planet, and that Nature is coming back to exact a terrible revenge, is at the heart of many recent ‘weird’ fictions. Perhaps by now we should have elected H. P. Lovecraft as the Poet Laureate of the Anthropocene. He would get my vote, assuming I hadn’t had my head pulled off by Cthulhu already.
Featured image credit: “Dirt Road” by shrutikhanna. CC0 Public Domain via Pixabay.
Many people watching the UK television drama National Treasure will have made their minds up about the guilt or innocence of the protagonist well before the end of the series. In episode one we learn that this aging celebrity has ‘slept around’ throughout his long marriage, but when an allegation of non-recent sexual assault is made he strenuously denies it. His wife knows about his infidelities and chooses to believe him, but his daughter, who for years has struggled with mental ill-health, substance abuse problems, and fractured relationships, seems to be troubled by memories from her childhood. As the episodes unfold, the series gives the audience a chance to be judge and jury, employing whatever bits of information are available to them and, not least, their own prior assumptions about such cases.
Reporting of sexual abuse has soared in recent years, boosted by encouragement to ‘come forward’ with assurances of anonymity, developments towards mandatory reporting, and by publicity given to shocking cases. National Treasure is reminiscent of several celebrities pursued under Operation Yewtree from which there have been varied outcomes, and trials of former staff in children’s homes and residential schools for juvenile offenders, as well as of priests, doctors and others whose work has given them access to children and positions of authority. The respectability of their roles is now taken as an extra indicator that allegations are true because they have had the power to exploit those under their authority and to deceive others.
Cases of sexual abuse, especially non-recent cases, are particularly difficult because there is no crime scene or physical trace. In the absence of definitive evidence, the verdict is more likely to be influenced by the impressions made by the accused and the complainants, the relative persuasiveness of the prosecution and defence barristers, and the views that jury members already hold about what typically occurs in such cases. Individual jury members may be more emotionally predisposed to believe one party over the other, following personal experience or empathy with victims of abuse or with victims of wrongful allegations, supported by cultural narratives that they find persuasive, and depending on which form of injustice most enrages them and that they want addressed.
Jimmy Savile too had been described as a ‘national treasure’, as had Rolf Harris, now serving a prison sentence. The case against Savile became the exception to prove the rule that allegations of historical abuse should be believed at the expense of the ‘presumption of innocence until proven guilty’. The avalanche of complaints against Savile was quickly followed by further claims of uncovered historical institutional abuse, implicating senior politicians. True or not, these have served as signal crimes, galvanising moral panics and a political will to address them. Theresa May, then Home Secretary, announced in Parliament,
“I believe the whole House will also be united in sending this message to victims of child abuse. If you have suffered and you go to the police about what you have been through, those of us in positions of authority and responsibility will not shirk our duty to support you. We must do everything in our power to do everything we can to help you…”
Sir Keir Starmer, then the Director of Public Prosecutions, was critical of an ‘overcautious approach’ in the policing and prosecution of reported sexual offences. Explaining that ‘We cannot afford another Savile moment’ he called for a collective approach and national consensus that
“a clear line now needs to be drawn in the sand and we need to redouble our efforts to improve the criminal justice response to sexual offending.”
The collective remorse for failing to report suspicions or to believe claims of abuse in the past, and the wish to recognise and learn from past mistakes, has, many argue, led to over-correction and confirmation bias. Believing the victim has become a moral imperative; this includes accepting people as victims at face value and carrying that implication forward in treating them as witnesses, and as vulnerable people in need of protection. In real life, the celebrity in National Treasure is more likely to be found guilty given the climate of public opinion. His accusers would be referred to as victims, and as the subject of police enquiries he would certainly experience huge damage to his professional and personal life. If found guilty, he would face the long custodial sentence that such serious offences warrant.
To some extent, raising the spectre of false allegations has become contentious because it involves questioning the credibility and honesty of complainants, and requires them to give evidence in a way that can be experienced as ‘re-victimising’. Cases in which the defendant is found ‘not guilty’ can create cognitive dissonance for victims’ advocates, because they require entertaining the possibility that people may be motivated to ‘come forward’ for various reasons: persuasion by others, the prospect of a new identity as a survivor, praise for being a brave person whose past failures can be attributed to abuse, or the prospect of financial compensation. Unless those and other possibilities are acknowledged, false allegations can safely be dismissed as rare.
I’m writing this before the final episode of the television drama. Some viewers have suggested that an ambiguous ending would be the brave option, and I agree. Maybe we also need to make more room for doubt and middle ground in some cases, instead of zealous certainty. Researching this subject, I have met many people who claim they have been falsely accused, and while one can make an informed hunch based on detailed facts, narrative, and getting to know the character of the person concerned, there are other cases where one has no idea whether the individual is innocent or guilty. If there is no physical evidence, how can we know for sure? There are people who, because they live alone or are non-conformist, like Christopher Jefferies, may become suspects and be wrongly arrested and charged, but unlike the more fortunate Mr Jefferies, they cannot hope for DNA evidence to absolve them.
That black hole for proof of guilt or innocence is central to the tragedy in such cases: we can never know for sure, and the individuals are left under a cloud of suspicion that they lied, that they did something wicked. There are other cases where we may feel much more confident but can be shown to be wrong: apparent villains may have a heart of gold; respected people in public office may have a private life that is reprehensible or reveals hypocrisy. The recent revelations about the private pursuits of Keith Vaz MP, and his wife’s unawareness of them, validate the argument that people can have a side to their life, or predilections, that those closest to them know nothing about. I’m acutely aware that even acknowledging such instances lays more people open to suspicion if they are accused of such behaviour, making it even more difficult for ‘actually innocent’ people to ever be believed if they are wrongly accused.
It should always be remembered that one case does not fit all. There is also a need for more public recognition of wide differences in types of sexual offences and in the degree of suffering caused, and the folly of lumping all sex offenders together as evil predators who always subject their victims to catastrophic, life-long damage.
Featured image credit: ‘Paparazzi’ by Yahoo. CC BY 2.0 via Flickr.
Ang Lee, the two-time Academy Award-winning director, has noted that we should never underestimate the power of storytelling. Indeed, as a storyteller, Lee has shown through his films the potential of stories to connect people, to heal wounds, to drive change, and to reveal more about ourselves and the world. In particular, Lee has harnessed new technology for storytelling in movies such as Life of Pi (2012) and his upcoming feature film Billy Lynn’s Long Halftime Walk (to be released on 11 November, 2016). It is therefore not surprising that Lee received the International Honor for Excellence Award—the highest honor given by the International Broadcasting Convention (IBC)—during the IBC Awards ceremony in Amsterdam this September. According to IBC CEO Michael Crimp, the award “goes to an individual who has made a significant and valuable contribution to the movie industry, combining technology with creativity to achieve remarkable ends.” Previous winners of this honor include James Cameron, who directed Avatar, and Peter Jackson, director of The Lord of the Rings series and The Hobbit films.
Lee’s 3-D, high frame rate (HFR) film Billy Lynn’s Long Halftime Walk premiered at the 2016 New York Film Festival and will be released by Sony in November. The film, adapted from Ben Fountain’s novel, is about an American war hero who embarks on a victory tour with his squad, including a halftime show at a Thanksgiving Day football game. In this film, Lee and cinematographer John Toll shot at 120 frames per second in an attempt to capture the soldier’s memories of his wartime experiences. The standard frame rate for movies is 24 frames per second (fps), but directors sometimes depart from the standard for special purposes; doing so changes the nature of the image. For example, Peter Jackson previously experimented with 48 fps, a technique known as “high frame rate,” in The Hobbit: An Unexpected Journey (2012).
Nevertheless, critics such as Richard Corliss (Time magazine) and Scott Foundas (Village Voice) complained that the look of The Hobbit resembled that of video games or high-definition television. This dissatisfaction is mostly due to the fact that the sterile hyperrealism and deadly coldness of HFR’s high-definition image disrupts the diegetic realism of normal cinematography. Although ultra-realism aroused criticism among The Hobbit’s fans and critics, in Billy Lynn’s Long Halftime Walk Ang Lee appears to have delved deep into this technology in order to show the intensity of war and its emotional impact on his characters. As New York Film Festival Director Kent Jones said in a statement, “Ang Lee has always gone deep into the nuances of the emotions between his characters, and that’s exactly what drove him to push cinema technology to new levels. It’s all about the faces, the smallest emotional shifts.”
In this regard, one cannot help thinking of Lee’s earlier war film, Ride With The Devil (1999). By going beyond the dualisms associated with the American South, that film was an exception within the Civil War genre. This time, Billy Lynn’s Long Halftime Walk can be considered another exception to customary procedure, as Lee uses digital technology to experiment and play with his viewers’ “suspension of disbelief.” The film will allow us to see whether Lee is able to successfully engage his audiences in that suspension with this new technology. The suspension of disbelief is often considered an essential part of storytelling. Samuel Taylor Coleridge, the poet and literary critic, described the implicit contract in storytelling and aesthetic illusion as “that willing suspension of disbelief for the moment, which constitutes poetic faith.”
Like any creative endeavor, film is only successful to the extent that the audience offers this willing suspension. It is part of an unspoken agreement between a filmmaker and his/her viewers. Over the course of time, however, the adoption of digital technology has somewhat destabilized the “indexical” faith that spectators invested in the credibility of the image in relation to an imagined referent.
By presenting the realities of war and peace in ultra-realistic ways through a war hero’s eyes, Lee has put the issue of “faith,” the implicit contract between filmmaker and viewers, into question. Lee is likely to continue using technological developments to push the limits of viewers’ deep-rooted knowledge about the photographic image, as well as cinematic language, for larger purposes. Will Lee use this ultra-realistic technology to endorse or to question the type of nationalist belief and storytelling that a drama of courage and heroism generally entails?
The film will give us a unique platform to understand and evaluate Lee’s earnest faith in creativity and storytelling.
Featured image credit: Screenshot of a scene from Ang Lee’s ‘Billy Lynn’s Long Halftime Walk’. Fair Use via Sony/Tristar.
Cities in the early days of the United States were mostly quiet at night. People who left the comfort of their own homes at night could often be found walking into puddles, tripping over uneven terrain, or colliding with posts, because virtually no street lighting existed. Urban areas had established curfew times that “were signaled by the ringing of bells, the beating of drums, or the firing of cannons” at an early hour in the evening. With the advent of gas lighting, urban culture transformed in fascinating ways. Here are 12 interesting facts about urban nightlife from Peter C. Baldwin’s article for the Oxford Research Encyclopedia of American History, which shows how times have greatly changed and, remarkably, how some things have remained the same.
1. The Christmas season was an especially popular time for drunken rowdiness at night. Somewhat like 20th-century trick-or-treaters, young men in early 19th-century Philadelphia and New York knocked on doors demanding drinks or small gifts.
2. Plays were banned in New England cities, forcing traveling troupes to bill their performances as “moral dialogues.”
3. Theater audiences used to talk amongst themselves much of the time during performances. With the house lights kept up, audiences paid as much attention to socializing and people-watching as to the performance.
4. With the rise of industrialization, workers could no longer take unscheduled breaks, but had to work steadily. As a result, leisure activities like going for a round of drinks began to be pushed out of the day and into the night.
5. With gas lights in commercial districts and professional police forces replacing poorly-trained and unmotivated night watchmen, growing safety was a key factor in encouraging evening street activity. Restaurants, ice cream parlors, and oyster saloons clustered in the well-lit commercial streets.
6. Hours of work and leisure grew less distinct in the late 19th century. The number of night workers greatly expanded, partly because of the adoption of new processes of continuous production in the iron, glass, paper, and petrochemical industries. These processes were facilitated by the availability of electric power after about 1880, and by the superiority of electric lighting. Better lighting also encouraged additional night work on the docks, and in the manufacturing of textiles and books.
7. A gradual decline in labor hours, slow increases in income, and dwindling moral opposition to commercial entertainment also contributed to the expansion of nightlife. Electric lighting further encouraged the growth of commercial nightlife. In the “bright-light districts” of Chicago, Minneapolis, and many smaller cities, brilliant illumination made downtown streets seem safer and more exciting. Advertising signs, theater marquees, and glowing store windows added to the flood of light from arc and incandescent street lamps.
8. Nightlife at the very end of the 19th century began to shift decisively away from a male-dominated scene. Alternatives to the all-male “homosocial” saloons developed. Mixed-gender restaurants and increasingly lavish hotel dining rooms opened for business, and vaudeville houses succeeded in attracting a mixed-gender mass audience.
9. Changes in patterns of courtship encouraged mixed-gender nightlife as well. Instead of courting young women in private, men in the 20th century began taking them out on “dates” to public entertainment places such as restaurants, theaters, and dance halls. Dating freed young couples from parental supervision, a development assisted greatly by the advent of the automobile.
10. The invention of television decreased the number of people who went out in urban areas. Watching TV with one’s family became a popular alternative to going out at night. Total movie admissions plummeted from 4.1 billion in 1946 to 1.1 billion in 1962.
11. In the late 1940s and 1950s, bebop jazz performances flourished in smaller nightclubs filled with hipsters. Young hipsters at the time listened to bebop jazz, which offered innovative but less danceable music. They sneered at the conventionality of mainstream society, welcomed the sexuality that they associated with black culture, and, in many cases, supplemented their listening experience by using marijuana and heroin. As the dance culture of young whites came to focus on rock, bop fans preferred just to listen to this increasingly intellectualized art form.
12. Rock concerts in Boston were banned at one point. After a minor riot took place outside the Boston Arena on 3 May 1958, Boston police arrested concert promoter and disc jockey Alan Freed, and then Boston Mayor John B. Hynes briefly called off similar musical performances.
Featured image credit: “52nd Street, New York, N.Y., ca. July 1948” by William Gottlieb. Public Domain via The Library of Congress.