A human eyeball shoots out of its socket, and rolls into a gutter. A child returns from the dead and tears the beating heart from his tormentor’s chest. A young man has horrifying visions of his mother’s decomposing corpse. A baby is ripped from its living mother’s womb. A mother tears her son to pieces, and parades around with his head on a stick… These are scenes from the notorious, banned ‘video nasty’ films Eaten Alive, Zombie Flesh Eaters, I Spit on Your Grave, Anthropophagous: The Beast, and Cannibal Holocaust.
Well, no. They could be – but they’re not. All these scenes and images can be found safely inside the respectable covers of Oxford World’s Classics, in the works of Edgar Allan Poe, M.R. James, James Joyce, William Shakespeare, and Euripides. Only the first two of these are avowedly writers of horror, and none of these books comes with any kind of public health warning or age-suitability guideline. What does this mean?
Euripides’s The Bacchae, first performed around 400 BC, is one of the foundational works of the Western literary canon. In describing graphically the actions of Agave and her Maenads, dismembering King Pentheus and putting his head on a pole, it also sets the bar very high for artistic representations of violence and gore. The episode of the baby ripped from the mother’s womb to which I alluded in the first paragraph is from Macbeth, of course – it’s Macduff’s account of his own birth. And Macbeth, though certainly no slouch in the mayhem department, isn’t even Shakespeare’s most violent play. That would be Titus Andronicus, whose opening scene makes the connections between civilization and horror very clear, as Tamora, Queen of the Goths, sees her son brutally killed by the conquering Romans:
See, lord and father, how we have performed
Our Roman rites: Alarbus’ limbs are lopp’d,
And entrails feed the sacrificing fire,
Whose smoke, like incense, doth perfume the sky.
What follows is well known: further mutilation, rape, cannibalism. Shocking, yes; surprising, no. After all, the greater part of the Western literary tradition follows, or celebrates, a faith whose own sacrificial rites have at their heart symbolic representations of torture and cannibalism, the cross and the host. A case could plausibly be made that the Western literary tradition is a tradition of horror. This may be an overstatement, but it’s an argument with which any honest thinker has to engage.
The classic argument adduced in defence of the brutality of tragedy (a form which I have come to think of as highbrow horror) is the Aristotelian concept of catharsis, according to which the act of witnessing artistic representations of cruelty and monstrosity, pity and fear, purges the audience of these emotions, leaving them psychologically healthier. Horror is good for you! I confess I have always had difficulty accepting this hypothesis (though I recognize that many people far more learned and brilliant than me have had no trouble accepting it). It seems to me to be a classic example of an intellectual’s gambit, a theory offered without recourse to any evidence.

And yet catharsis seems to me to be far preferable to another, more common, response to horror: the urge to censor or ban extreme documents and images in the name of public morality. If catharsis is Aristotelian, then this hypothesis is Pavlovian: horror conditions our responses; a tendency to watch violent acts leads inexorably to a tendency to commit violent acts. For many people, this seems to make intuitive sense (on more than one occasion, I’ve noticed people backing away from me when I tell them I work on horror), and it’s the impetus behind the framing of the Video Recordings Act of 1984, after which Cannibal Holocaust and all those other video nasties were banned. As a number of commentators and critics have noted, there’s no evidence for this Pavlovian hypothesis, either. Worse than that, there’s a distinct class animus behind such thinking. You and I, cultured, literate, educated, middle-class folks that we are, are perfectly safe: when we watch Cannibal Holocaust (which I do, even if you don’t) we know what we are seeing; we can contextualize the film, interpret it, recognize it for what it is. The problem, the argument implicitly goes, is not us but them, those festering, semi-bestial proletarians whose extant propensity for violence (always simmering beneath the surface) can only be stoked by watching these films. That’s why no one seriously considers banning The Bacchae or Titus Andronicus – why any suggestion that we do so would be treated as an act of appalling philistinism. They are horror for the educated classes.
Horror is, unquestionably, an extreme art form. Like all avant-garde art, I would suggest, its real purpose is to force its audiences to confront the limits of their own tolerance – including, emphatically, their own tolerance for what is or is not art. Commonly, when hitting these limits, we respond with fear, frustration, and even rage. Even today, this is not an unusual reaction on first reading Finnegans Wake, for example: I see it occasionally in my students, who are (a) voluntarily students of literature, and (b) usually Irish, not to say actual Dubliners. So we shouldn’t be surprised that audiences respond to horror with – well, with horror. But we need to recognize that the reasons for these responses are complex, and are deeply bound up with the meaning and function of art, and of civilization.
Headline image: Pentheus torn apart by Agave and Ino. Attic red-figure lekanis (cosmetics bowl) lid, ca. 450-425 BC, public domain, via Wikimedia Commons
How rapidly does medical knowledge advance? Very quickly if you read modern newspapers, but rather slowly if you study history. Nowhere is this more true than in the fields of neurology and psychiatry.
It was long believed that the study of common disorders of the nervous system began with Greco-Roman medicine – for example epilepsy, “the sacred disease” of Hippocrates, or “melancholia”, now called depression. Our studies have now revealed remarkable Babylonian descriptions of common neuropsychiatric disorders a millennium earlier.
There were several Babylonian Dynasties with their capital at Babylon on the River Euphrates. Best known is the Neo-Babylonian Dynasty (626-539 BC) associated with King Nebuchadnezzar II (604-562 BC) and the capture of Jerusalem (586 BC). But the neuropsychiatric sources we have studied nearly all derive from the Old Babylonian Dynasty of the first half of the second millennium BC, united under King Hammurabi (1792-1750 BC).
The Babylonians made important contributions to mathematics, astronomy, law and medicine, conveyed in the cuneiform script – the earliest form of writing, which began in Mesopotamia in the late 4th millennium BC – impressed into clay tablets with reeds. When Babylon was absorbed into the Persian Empire, cuneiform writing was replaced by Aramaic and simpler alphabetic scripts, and was only deciphered by European scholars in the 19th century AD.
The Babylonians were remarkably acute and objective observers of medical disorders and human behaviour. In texts now held in museums in London, Paris, Berlin and Istanbul, we have studied surprisingly detailed accounts of what we recognise today as epilepsy, stroke, psychoses, obsessive compulsive disorder (OCD), psychopathic behaviour, depression and anxiety. For example, they described most of the common seizure types we know today – tonic-clonic, absence, focal motor, and so on – as well as auras, post-ictal phenomena, provocative factors (such as sleep or emotion), and even a comprehensive account of the schizophrenia-like psychoses of epilepsy.
Early attempts at prognosis included a recognition that numerous seizures in one day (i.e. status epilepticus) could lead to death. They recognised the unilateral nature of stroke involving limbs, face, speech and consciousness, and distinguished the facial weakness of stroke from the isolated facial paralysis we call Bell’s palsy. The modern psychiatrist will recognise an accurate description of an agitated depression, with biological features including insomnia, anorexia, weakness, impaired concentration and memory. The obsessive behaviour described by the Babylonians included such modern categories as contamination, orderliness of objects, aggression, sex, and religion. Accounts of psychopathic behaviour include the liar, the thief, the troublemaker, the sexual offender, the immature delinquent and social misfit, the violent, and the murderer.
The Babylonians had only a superficial knowledge of anatomy, and no knowledge of the function of the brain or spinal cord, or of psychology. They had no systematic classifications of their own and would not have understood our modern diagnostic categories. Some neuropsychiatric disorders, such as stroke or facial palsy, had an evident physical basis requiring the attention of the physician, or asû, who used a plant- and mineral-based pharmacology. Most disorders, such as epilepsy, psychoses and depression, were regarded as supernatural, due to evil demons and spirits or the anger of personal gods, and thus required the intervention of the priest, or ašipu. Other disorders, such as OCD, phobias and psychopathic behaviour, were viewed as a mystery yet to be resolved – a surprisingly open-minded approach.
From the perspective of a modern neurologist or psychiatrist these ancient descriptions of neuropsychiatric phenomenology suggest that the Babylonians were observing many of the common neurological and psychiatric disorders that we recognise today. There is nothing comparable in the ancient Egyptian medical writings and the Babylonians therefore were the first to describe the clinical foundations of modern neurology and psychiatry.
A major and intriguing omission from these entirely objective Babylonian descriptions of neuropsychiatric disorders is any account of subjective thoughts or feelings, such as obsessional thoughts or ruminations in OCD, or suicidal thoughts or sadness in depression. Such subjective phenomena only became a field of description and enquiry in the 17th and 18th centuries AD. This raises interesting questions about the possibly slow evolution of human self-awareness, which is central to the concept of “mental illness” – a concept that only became the province of a professional medical discipline, i.e. psychiatry, in the last 200 years.
The development of linguistics as a scientific discipline is one of the greatest achievements of contemporary thought, as it has led to the discovery of some fundamental principles about the functioning of language. However, most of its recent discoveries have not yet reached the general audience of educated people beyond the specialists. Scholars of classics, in particular, have found it difficult to become involved in the debate, since many recent studies in linguistics have been driven by the need to free the discipline from its subordination to Latin grammar, and have called into question the validity of certain aspects of traditional grammar.
As a consequence, progress made by contemporary linguistics has paradoxically had a negative rather than a positive effect on the teaching of Latin. Although traditional grammars are now outdated, a suitable replacement has not yet been offered, and widespread scepticism has forced many to keep relying on old-fashioned textbooks.
In order to overcome this state of affairs, it is desirable to bring Latin grammar back to its original high-level scientific conception, going beyond a prescriptive attitude and restoring its original theoretical ambition. Although many branches of contemporary linguistics are potentially suited to fulfilling this objective, none of them has yet been fully exploited in teaching. Their advantage over traditional approaches lies in their ability to satisfy the same needs as traditional analytical and philosophical Latin grammar while exploiting new methods, which make it possible to formulate more accurate analyses and theoretical generalizations.
Latin grammar should be presented as an activity which raises the linguistic awareness of its readers, using the most recent tools of modern linguistics. This should not be limited to the traditional Indo-European historical perspective, but should include the comparison of different languages and the attempt to represent the way in which grammatical rules are encoded in the mind.
The hypothesis is that there exists a language faculty underlying all languages, known as Universal Grammar (UG), i.e. a system of variable and invariable factors internalized in the speaker’s mind, which constitutes the basis of the grammar of each language. Understanding the contents of UG amounts to understanding those linguistic phenomena that are common to all languages. In this perspective, it is possible to develop a new method of teaching Latin which aims at strengthening the cognitive skills of the learner’s mind. This method consists in overcoming the rigidity of a purely normative conception of grammatical rules, in order to make them explicit in a formal, synchronic way and thus formulate hypotheses about the mental mechanisms that generate them. This method is an updated enhancement of the old conception of grammatical studies known as progymnasmata, i.e. “gymnastics of the mind”, which introduces the reader to the world of classical scholarship.
On the basis of some recent discoveries made by the neurosciences, it is possible to formulate grammatical rules that represent a better approximation of the implicit and explicit mental operations carried out by the language learner. The desired effect is the activation of the appropriate areas of the brain, i.e. the ones naturally devoted to the processing of linguistic information, thus rendering the process of language acquisition faster and more natural. Indeed, a vast number of recent studies have shown that language learning relies strongly on a constant and unconscious comparison between the second language (L2) and the learner’s mother tongue. By comparing linguistic phenomena across distinct languages and by interpreting the results with updated theoretical tools, we intend to underline the deep similarities among languages rather than their superficial differences. This new teaching perspective represents a fundamental advantage for learners, who can focus their attention on the limits of linguistic variation, making their acquisitional task more manageable. In particular, by overtly reflecting on language and comparing L2 grammars to the structures of the mother tongue, the study of Latin becomes more stimulating and active.
Moreover, as students become aware of the difference between a “mistake”, i.e. a form banned from the standard language, and linguistic “ungrammaticality”, i.e. an option disallowed by the deep structure of the language, they become more critical and aware of the level of their written and oral performance in their mother tongue. From this perspective, it is clear that the study of Latin contributes to the overall linguistic education of learners, and not only to the training of those interested in classical studies. Students should no longer learn by heart the obscure rules of school grammars, often rooted in misconceptions; instead, they should explore the discoveries of centuries of classical scholarship in order to work out actively how languages function and change. In particular, they should focus their attention on the aspects of the target language they already know, before exploring the points of divergence from their mother tongue. Thanks to this revised methodology, the study of Latin loses any passive connotation and becomes an activity which enhances linguistic awareness, meta-linguistic competence, and critical thought.
This summer saw the release of Hercules (Radical Studios, dir. Brett Ratner). Dwayne “The Rock” Johnson took his place in the long line of strongmen to portray Greece’s most enduring icon. It was a lot of fun, and you should go see it. But, as one might expect from a Hollywood piece, the film takes a revisionist approach to the world of Greek myth, especially to its titular hero. A man of enormous sexual appetite, sacker of cities, and murderer of his own family, Hercules is glossed over here as a seeker of justice, characterized by his humanity and humility. And it is once again Hercules, not Heracles: the Romanized version loses the irony of the Greek, “Glory of Hera.”
This is neither the Hercules of ancient myth, nor the Hercules of Steve Moore’s graphic novel, Hercules: The Thracian Wars (Radical Comics, 2008), on which the film is loosely based. It is perhaps not surprising, then, that Moore fought to have his name removed from the project, at least according to his long-time friend Alan Moore. Steve Moore died earlier this year, and buried deep in the closing credits of the film is a dedication in his memory.
When he wrote his comic, Moore strove to fit his story into the world of Greek myth in a “realistic” way. Though the story (and that of its sequel, The Knives of Kush) is original, the characters and setting are consistent with the pseudo-historic Bronze Age of Greek legend. The film jettisons much of this careful integration for little narrative gain. I am never opposed to revisions to the myth (myth, after all, can be defined by its malleability), but why, for instance, set the opening of the film in Macedonia in 358 BCE instead of around 1200? It adds nothing to the story, but confuses anyone with even a passing knowledge of Greek history — our heroes should be rubbing elbows with Philip II of Macedon, Alexander the Great’s father. The answer to this question, I suspect, is a sort of Wikipedial historicity: Hercules and his companions are hired by a fictional King Cotys, a name chosen by Moore as suitably Thracian — and there was a historical Cotys in 358.
The Thracian Wars is set well after Hercules has completed his twelve labors: in the loose chronology of Greek myth, we are somewhere between the Calydonian Boar Hunt and the battle of the Seven Against Thebes. Hercules arrives in Thrace as a mercenary, along with his companions Iolaus, Tydeus, Autolycus, Amphiarus, Atalanta, Meleager, and Meneus, the only character made up by Moore. (The Hollywood production jettisons those characters who might have LGBT overtones: Meneus is Hercules’s male lover, and Meleager is constantly frustrated by, and therefore exposes, Atalanta’s lesbianism.) Though no story of Greek myth involves all these characters, they all belong to roughly the same generation — the generation before the Trojan War. These characters could have interacted in untold stories.
But they don’t interact well. As Moore notes in the afterword to the trade paperback, “Hercules was a murderer, a rapist, a womanizer, subject to catastrophic rages and plainly bisexual…I wouldn’t have wanted to spend much time in his company.” The rest of the band is not much better. Where the film presents a band of brothers, faithful to each other to the death, in the comic these characters loathe each other and are clearly bound not by love of each other but the need to earn a living. They are mercenaries, with little interest in the morality of their actions.
Legendary Greece, then, is without a moral center. Violence and bloodshed are never far away. Sexual activity is fueled only by deceit or lust. The Greek characters speak of their Thracian surroundings as barbaric, but we are never shown any better. The art of the comic articulates this grim reality. Eyes are frequently lost in shadow, for instance, dehumanizing the characters further. Throughout, artist Admira Wijaya deploys a somber color palette of greys, browns, and muted reds to convey a bleak world.
This, then, is the great disconnect of Greek myth with the modern world. In our times, our heroes of popular culture must be morally pure; only black and white values can be understood. So-called “anti-heroes” are occasionally tolerated in marginal media, but even here their transgressions are typically mitigated somehow (think of the recent television series Dexter, in which the serial killer is validated by his targeting of other serial killers — the real bad guys). The heroes of Greek legend — the word “hero” itself only denoted those who performed memorable or noteworthy deeds, without a moral element — often existed solely because they were transgressors. Tantalus, Oedipus, Orestes: their stories are of broken taboos, stories of cannibalism, incest, kin-slaying. Later authors may have complicated their stories, but violation is at the core of their being.
Sure, the common people of ancient Greece benefited from Hercules’s actions as a slayer of monsters, but none of his actions were motivated by altruism. Rather, it was shame at best that moved him: in most tellings, his famous twelve labors were penance for the death of his family at his own hands. Many of his other deeds were motivated by hunger, lust, or just boredom. In the film, Johnson’s Hercules finds a sort of absolution for his past crimes. In the comic, redemption is not an objective; in fact, Hercules doesn’t even seem to recognize the concept.
Hercules is a figure of strength and power, a conqueror of the unknown, a slayer of dragons (and giant boars and lions). The Hercules of Hollywood shows us strength. The Hercules of myth — and of Moore’s comic — shows us the consequences of that strength when it’s not carefully contained. There is a primal energy there, a reflection of that part of our souls that is fascinated with, even desires, transgression. As healthy, moral humans, most of us conquer that fascination. But myth is our reminder that it always, always bears watching. Hollywood isn’t going to help you do that.
Featured image: An engraving from The Labours of Hercules by Hans Sebald Beham, c. 1545. Public domain via Wikimedia Commons.
Hadrian’s Wall has been in the news again recently, for all the wrong reasons. Occasional wits have pondered its significance in the Scottish Referendum, neglecting the fact that it has never marked the Anglo-Scottish border and was certainly not constructed to keep the Scots out. Others have mistakenly insinuated that it is closed for business, following the widely reported demise of the Hadrian’s Wall Trust. And then, of course, there is the Game of Thrones angle: best-selling writer George R R Martin has spoken of the Wall as an inspiration for the great wall of ice that features in his books.
Media coverage of both the Hadrian’s Wall Trust’s demise and Game of Thrones’ rise has sometimes played upon and propagated the notion that Hadrian’s Wall was manned by shivering Italian legionaries guarding the fringes of civilisation – irrespective of the fact that the empire actually entrusted the security of the frontier to its non-citizen soldiers, the auxilia, rather than to its legionaries. The tendency to overemphasise the Italian aspect reflects confusion about what the Roman Empire and its British frontier were about. But Martin, who made no claim to be speaking as a historian when he described how he took the idea of legionaries from Italy, North Africa, and Greece guarding the Wall as a source of inspiration, did at least get one thing right about the Romano-British frontier.
There were indeed Africans on the Wall during the Roman period. In fact, at times there were probably more North Africans than Italians and Greeks. While all these groups were outnumbered by north-west Europeans, who tend to get discussed more often, the North African community was substantial, and its stories warrant telling.
Perhaps the most remarkable tale to survive is an episode in the Historia Augusta (Life of Severus 22) concerning the inspection of the Wall by the emperor Septimius Severus. The emperor, who was himself born in Libya, was confronted by a black soldier, part of the Wall garrison and a noted practical joker. According to the account, the notoriously superstitious emperor saw an omen of death in the soldier’s black skin and in his brandishing of a wreath of cypress branches. And his mood was not further improved when the soldier shouted the macabre double entendre iam deus esto victor (now, victor/conqueror, become a god) – for, properly speaking, a Roman emperor should first die before being divinized. The late Nigerian classicist Lloyd Thompson made a powerful point about this intriguing passage in his seminal work Romans and Blacks: ‘the whole anecdote attributes to this man a disposition to make fun of the superstitious beliefs about black strangers’. In fact we might go further, and note just how much cultural knowledge and confidence this frontier soldier needed to play the joke – he needed to be aware of Roman funerary practices, superstitions, and indeed the practice of emperor worship itself.
Why is this illuminating episode not better known? Perhaps it is because there is something deeply uncomfortable about what could be termed Britain’s first ‘racist joke’, or perhaps the problem lies with the source itself, the notoriously unreliable Historia Augusta. And yet as a properly forensic reading of this part of the text by Professor Tony Birley has shown, the detail included around the encounter is utterly credible, and we can identify places alluded to in it at the western end of the Wall. So it is quite reasonable to believe that this encounter took place.
Not only this, but according to the restoration of the text preferred by Birley and myself, there is a reference to a third African in this passage. The restoration post Maurum apud vallum missum in Britannia indicates that this episode took place after Severus had granted discharge to a soldier of the Mauri (the term from which ‘Moors’ derives). And as Birley has noted, we know that there was a unit of Moors stationed at Burgh-by-Sands on the Solway at this time.
Sadly, Burgh is one of the least explored forts on Hadrian’s Wall, but some sense of what may one day await an extensive campaign of excavation there comes from Transylvania in Romania, where investigations at the home of another Moorish regiment of the Roman army have revealed a temple dedicated to the gods of their homelands. Perhaps, too, evidence of different North African legacies would emerge. The late Vivien Swan, a leading expert in the pottery of the Wall, presented an attractive case that the appearance of new forms of ceramics indicates the introduction of North African cuisine in northern Britain in the second and third centuries AD.
What is clear is that the Mauri of Burgh-by-Sands were not the only North Africans on the Wall. We have an African legionary’s tombstone from Birdoswald, and, from the east coast, the glorious funerary stela set up to commemorate Victor, a freedman (former slave), by his former master, a trooper in a Spanish cavalry regiment. Victor’s monument now stands on display in Arbeia Museum at South Shields, next to the fine, and rather better known, memorial to the Catuvellaunian Regina, freedwoman and wife of Barates from Palmyra in Syria. Together these individuals, and the many other ethnic groups commemorated on the Wall, remind us of just how cosmopolitan the people of Roman frontier society were, and of how a society that stretched from the Solway and the Tyne to the Euphrates was held together.