Tragedies certainly aren’t the most popular types of performances these days. When you hear a film is a tragedy, you might think “outdated Ancient Greek genre, no thanks!” Back in those times, Athenians thought it their civic duty to attend tragic performances of dramas like Antigone or Agamemnon. Were they on to something that we have lost in contemporary Western society? Is there something specifically valuable in a tragic performance that a spectator doesn’t get from other types of performances, such as those of our modern genres of comedy, farce, and melodrama?
Since films reach a greater audience in our culture than plays, after updating Aristotle’s Poetics for the twenty-first century, we analyzed what we call “cinematic tragedies”: films that demonstrate the key components of Aristotelian tragedy. We conclude that a tragedy must consist in the representation of an action that: (1) is complete; (2) is serious; (3) is probable; (4) has universal significance; (5) involves a reversal of fortune (from good to bad); (6) includes recognition (a change in epistemic state from ignorance to knowledge); (7) includes a specific kind of irrevocable suffering (in the form of death, agony, or a terrible wound); (8) has a protagonist who is capable of arousing compassion; and (9) is performed by actors. The effects of the tragedy must include: (10) the arousal in the spectator of pity and fear; and (11) a resolution of pity and fear that is internal to the experience of the drama.
Unlike melodrama (which we hold is the most common film genre), tragedy calls on spectators to ponder thorny moral issues and to navigate them with their own moral compass. One such cinematic tragedy — Into The Wild, 2007, directed by Sean Penn — thematizes the preciousness and precariousness of human life alongside environmental problems, raising questions about human beings’ apparent inability to live on earth without despoiling the beauty and integrity of the biosphere. Other cinematic tragedies deal with a variety of problems with which our modern societies must grapple.
One such topic is illegal immigration, a highly politicized issue that is far more complex than national governments seem equipped to handle, and one that seems especially beyond the powers of the two parties in the American system. Cinematic tragedies that deal with this issue have been produced over several decades and involve immigration into various Western countries, especially the United States; these include Black Girl (France, 1966), El norte (US/UK, 1983), and Sin nombre (Mexico, 2009), the last of which we will expand on here.
In US director Cary Fukunaga’s Sin nombre (which means “Nameless” but which was released in the United States under the Spanish title), Hondurans escaping their harsh political and economic realities risk their lives to reach the United States, through Mexico, on the tops of rail cars. They travel in this manner because, for most of these migrants, there is no legal way to enter the United States. Over the course of the journey, the immigrants endure terrible suffering or die at the hands of gang members who rob, rape, and even kill some of them.
The film focuses on just a few of the multitudes atop the trains: on a teenage Honduran girl, Sayra, migrating with her father and uncle; and on a few of the gang members. One of them, Casper, has had a change of heart and is no longer loyal to the gang after its leader tried to rape Casper’s girlfriend and then killed her. Casper and other gang members are atop the train robbing the migrants, but when the leader tries to rape Sayra, Casper defends her by killing him. Ultimately, Sayra will arrive in the United States. However, she realizes that the cost has been too great—her father has died falling off the train, and she has lost Casper, who is, ironically, shot to death by the pre-pubescent boy whom he himself had trained in the ways of the gang in the opening scenes of the film.
The tremendous losses, and the scenes of suffering, rape, and murder, make it unlikely that the spectator will feel that Sayra’s arrival constitutes a happy ending. In some other aesthetic treatment, Casper’s ultimate death might have been melodramatized as redemptive selflessness for the sake of his new girlfriend. But in Fukunaga’s film, the juxtaposed images imply a continuing cycle of despair and death: Casper’s young killer in Mexico is promoted up the ranks of the gang with a new tattoo, while Sayra’s uncle, back in Honduras after being deported from Mexico, starts the voyage to the United States all over again. Sayra too may face deportation in the future. Following the scene of the reinvigoration of the criminal gang system, as its new young leader gets his first tattoo, the viewer sees Sayra outside a shopping mall in the American southwest. The teenage girl has arrived in the United States and may aspire to participate in advanced consumer capitalism, yet she has lost so much and suffered so undeservingly.
This aesthetic juxtaposition prompts the spectator to attend to the failure of Western political leaders to create a humane system of immigration for the twenty-first century, one that cannot be reached through the entrenched, politicized views of the “two sides of the aisle,” which miss the human story of immigrants’ plight. This film—like all tragedies—promotes spectators’ active pondering; that is, it challenges them to respond in some way.
In the tradition of philosophers as varied as Aristotle, Seneca, Schopenhauer, Nietzsche, Martha Nussbaum, and Bernard Williams, we find that tragedies bring to conscious awareness the most significant moral, social, political, and existential problems of the human condition. A film such as Sin nombre, through its tragic performance, points to one of these terrible necessities with which our contemporary Western culture must grapple. While it doesn’t offer an answer, this cinematic tragedy prompts us to recognize and deal with a seemingly intractable problem, one whose resolution requires moving beyond the current impasse of political debate, even as we in the industrialized nations continue to go shopping and watch movies in the comfort of our malls.
This month marks the 50th anniversary of Disney’s beloved film Mary Poppins, starring the legendary Julie Andrews. Although Andrews was only twenty-nine at the time of the film’s release, she had already established herself as a formidable star with numerous credits to her name and performances opposite Richard Burton, Rex Harrison, and other leading men of stage and screen. Mary Poppins would earn Andrews an Academy Award for Best Actress and serve as a milestone in a career that continues today. Herewith are some of our favorite songs from Andrews’ illustrious career.
“I Could Have Danced All Night” Andrews belted out this song in the 1956 Broadway performance of My Fair Lady. Andrews proved her singing capabilities playing Eliza Doolittle opposite Rex Harrison as Professor Higgins, although she was replaced in the film version (with Audrey Hepburn acting and Marni Nixon dubbing).
“Camelot” Andrews performed the musical’s title song in its 1960 Broadway production. The actress played Queen Guenevere – a title she was apparently comfortable with, later playing Queen Renaldi in Disney’s The Princess Diaries – opposite Richard Burton as King Arthur.
“Impossible; It’s Possible” Starring in another royal role, Andrews played the title character in CBS’ 1957 production of Cinderella, written by Richard Rodgers and Oscar Hammerstein.
“Supercalifragilisticexpialidocious” People are still reciting this tongue twister performed by Andrews in Disney’s 1964 hit film Mary Poppins. In addition to earning her an Oscar, Andrews’ role as the angelic English nanny cemented her name in silver screen history.
“My Favorite Things” Hot on the heels of her success from Mary Poppins, Andrews starred as Maria von Trapp in The Sound of Music, expanding her international fame and branding herself as a singer to be reckoned with in Hollywood and on Broadway.
There’s a scene in the movie High Noon that seems to me to capture an essential feature of our moral lives. Actually, it’s not the entire scene. It’s one moment really, two shots — a facial expression and a movement of the head of Grace Kelly.
The part she’s playing is that of Amy Kane, the wife of Marshal Will Kane (Gary Cooper). Amy Kane is a Quaker, and as such is opposed to violence of any kind. Indeed, she tells Kane she will marry him only if he resigns as marshal of Hadleyville and vows to put down his guns forever. He agrees. But shortly after the wedding Kane learns that four villains have plans to terrorize the town, and he comes to think it is he who must try to stop them. He picks up his guns in preparation to meet the villains, and in so doing breaks his vow to Amy.
Unrelenting in her pacifism, Amy decides to leave Will. She boards the noon train out of town. Then she hears gunfire, and, just as the train is about to depart, she disembarks and rushes back. Meanwhile, Kane is battling the villains. He manages to kill two of them, but the remaining two have him cornered. It looks like the end for Kane. Then one of them falls.
Amy has picked up a gun and shot him in the back.
We briefly glimpse Amy’s face immediately after she has pulled the trigger. She is distraught, stricken. When the camera angle changes to a view from behind, we see her head fall with great sadness under the weight of what she’s done.
What’s going on with Amy at that moment? It’s possible, I suppose, that she believes she shouldn’t have shot the villain, that she let her emotions run away with her, that she thinks she did the wrong thing. But I doubt that’s it. More likely is that when Amy heard the gunshots she decided that the right thing for her to do was return to town and help her husband in his desperate fight. But why then is Amy dismayed? If she performed the action she thought was right, shouldn’t she feel only moral contentment with what she has done?
Grace Kelly could have played it differently. She could have whooped with delight at having offed the bad guy, perhaps dropping some “hasta la vista”-like catchphrase along the way. Or she could have set her ample square jaw in grim determination and gone after the remaining villain, signaling to us her decision to discard namby-pamby pacifism for the robust alternative of visceral western justice. But Amy Kane’s actual reaction is psychologically more plausible — and morally more interesting. While she believes she’s done what she had to do, she’s still dismayed. Why?
What Amy’s reaction shows, I believe, is that morality is pluralist, not monist.
Monistic moral theories tell us that there is one and only one ultimate moral end. If monism is true, in every situation it will always be at least theoretically possible to justify the right course of action by showing that everything of fundamental moral importance supports it. Jeremy Bentham is an example of a moral monist.
He held that pleasure is the single ultimate end. Another example is Immanuel Kant, who held that the single basis for all of morality is the Categorical Imperative. According to monists, successful moral justification will always end at a single point (even if they disagree among themselves about what that point is).
Pluralist moral theories, in contrast, hold that there is a multitude of basic moral principles that can come into conflict with each other. David Hume and W.D. Ross were both moral pluralists. They believed that various kinds of moral conflict can arise — justice can conflict with beneficence, keeping a promise can conflict with obeying the law, courage can conflict with prudence — and that there are no underlying rules that explain how such conflicts are to be resolved.
If Hume and Ross are right and pluralism is true, even after you have given the best justification for a course of action that it is possible to give, you may sometimes have to acknowledge that to follow that course will be to act in conflict with something of fundamental moral importance. Your best justification may fail to make all of the moral ends meet.
With that understanding of monism and pluralism on board, let’s now return to Grace Kelly as Amy Kane. Let’s return to the moment her righteous killing of the bad guy causes her to bow her head in moral remorse.
If we assume monism, Amy’s response will seem paradoxical, weird, in some way inappropriate. If there is one and only one ultimate end, then to think that a course of action is right will be to think that everything of fundamental importance supports it. And it would be paradoxical or weird — inappropriate in some way — for someone to regret doing something in line with everything of fundamental moral importance. If the moral justification of an action ends at a single point, then what could the point be of feeling remorse for doing it?
But Amy’s reaction is perfectly explicable if we take her to have a plurality of potentially conflicting basic moral commitments. Moral pluralists will explain that Amy has decided that in this situation saving Kane from the villains has a fundamental moral importance that overrides the prohibition on killing, even while she continues to believe that there is something fundamentally morally terrible about killing. For pluralists, there is nothing strange about feeling remorse for acting against something one takes to be of fundamental moral importance.
Indeed, feeling remorse in such a situation is just what we should expect. This is why we take Amy’s response to be apt, not paradoxical or weird. We think that she, like most of us, holds a plurality of fundamental moral commitments, one of which she rightly acted on even though it meant having to violate another.
The upshot is this. If you think Grace Kelly played the scene right — and if you think High Noon captures something about our moral lives that “hasta la vista”-type movies do not — then you ought to believe in moral pluralism.
If you share my jealousy of Peter Capaldi and his new guise as the Doctor, then read on to discover how you could become the next Time Lord with a fondness for Earth. However, be warned: you can’t just pick up Matt Smith’s bow-tie from the floor, don Tom Baker’s scarf, and expect to save planet Earth every Saturday at peak viewing time. You’re going to need training. This is where Oxford’s online products can help you. Think of us as your very own Companion guiding you through the dimensions of time, only with a bit more sass. So jump aboard (yes it’s bigger on the inside), press that button over there, pull that lever thingy, and let’s journey through the five things you need to know to become the Doctor.
(1) Regeneration
Being called two-faced may not initially appeal to you. How about twelve-faced? No wait, don’t leave, come back! Part of the appeal of the Doctor is his ability to regenerate and assume many faces. Perhaps the most striking example of regeneration we have on our planet is the hydra, a tiny freshwater creature which is able to completely re-grow a severed head. Even more striking is its ability to grow more than one head if a small incision is made on its body. I don’t think it’s likely the BBC will commission a Doctor with two heads though, so best not to go down that route. Another example of animals capable of regeneration is the Porifera, the sponges commonly seen on underwater rocks. These creatures can regenerate whole bodies from small fragments, which is certainly impressive, but they are not quite as attractive as the David Tennants or Matt Smiths of this world.
(2) Fighting aliens
Although alien invasion narratives only crossed over to mainstream fiction after World War II, the Doctor has been fighting off alien invasions since the Time War with the Daleks and the subsequent destruction of Gallifrey. Alien invasion narratives are tied together by one salient issue: conquer or be conquered. Whether you are battling Weeping Angels or Cybermen, you must first make sure what you are battling is indeed an alien. Yes, that lady with the strange smell you meet every day at the bus stop may appear to be from another dimension, but it’s always better to be sure before you whip out your sonic screwdriver.
(3) Visiting unknown galaxies
The Hubble Ultra Deep Field, an image captured by the Hubble Space Telescope, covers a patch of sky that represents one thirteen-millionth of the area of the whole sky we see from Earth, and this tiny patch of the Universe contains over 10,000 galaxies. One thirteen-millionth of the sky is equivalent to the patch covered by a grain of sand held at arm’s length. When we look at a galaxy ten billion light years away, we are actually only seeing it by the light that left it ten billion years ago. Therefore, telescopes are akin to time machines.
The sheer vastness and mystery of the universe has baffled us for centuries. Doctor Who acts as a gatekeeper to the unknown, helping us imagine fantastical creatures such as the Daleks, all from the comfort of our living rooms.
(4) Operating the T.A.R.D.I.S.
The majority of time-travel narratives avoid the use of a physical time machine. However, the Tardis, a blue police telephone box, journeys through time dimensions and is as important to the plot of Doctor Who as upgrades are to Cybermen. Although it looks like a plain old police telephone box, it has been known to withstand meteorite bombardment, shield itself from laser gun fire, and traverse the time vortex all in one episode. The Tardis’s most striking characteristic, that it is “much bigger on the inside”, is explained by the Fourth Doctor, Tom Baker, using the analogy of a tesseract.
(5) Looking good
It’s all very well saving the Universe every week but what use is that without a signature look? Tom Baker had the scarf, Peter Davison had the pin-stripes, John Hurt even had the brooding frown, so what will your dress-sense say about you? Perhaps you could be the Doctor with a cravat or the time-traveller with a toupee? Whatever your choice, I’m sure you’ll pull it off, you handsome devil you.
Don’t forget a good sense of humour to complement your dashing visage. When Doctor Who was created by Donald Wilson and C.E. Webber in November 1963, the target audience of the show was eight-to-thirteen-year-olds watching as part of a family group on Saturday afternoons. In 2014, it has a worldwide general audience of all ages, claiming over 77 million viewers in the UK, Australia, and the United States. This is largely due to the Doctor’s quick quips and mix of adult and childish humour.
You’ve done it! You’ve conquered the Cybermen, exterminated the Daleks, and saved Earth (we’re eternally grateful of course). Why not take the Tardis for another spin and adventure through more of Oxford’s online products?
Image credit: Doctor Who poster, by Doctor Who Spoilers. CC-BY-SA-2.0 via Flickr.
The anniversaries of conflicts seem to be more likely to capture the public’s attention than any other significant commemorations. When I first began researching the nurses of the First World War in 2004, I was vaguely aware of an increase in media attention: now, ten years on, as my third book leaves the press, I find myself astonished by the level of interest in the subject. The Centenary of the First World War is becoming a significant cultural event. This time, though, much of the attention is focussed on the role of women, and, in particular, of nurses. The recent publication of several nurses’ diaries has increased the public’s fascination with the subject.
A number of television programmes have already been aired. Most of these trace journeys of discovery by celebrity presenters, and are, therefore, somewhat quirky – if not rather random – in their content. The BBC’s project, World War One at Home, has aired numerous stories. I have been involved in some of these – as I have, also, in local projects, such as the impressive recreation of the ‘Stamford Military Hospital’ at Dunham Massey Hall, Cheshire.
Many local radio stories have brought to light the work of individuals whose extraordinary experiences and contributions would otherwise have remained hidden – women such as Kate Luard, sister-in-charge of a casualty clearing station during the Battle of Passchendaele; Margaret Maule, who nursed German prisoners-of-war in Dartford; and Elsie Knocker, a fully-trained nurse who established an aid post on the Belgian front lines. One radio story is particularly poignant: that of Clementina Addison, a British nurse who served with the French Flag Nursing Corps – a unit of fully trained professionals working in French military field hospitals. Clementina cared for hundreds of wounded French ‘poilus’, and died of an unnamed infectious disease as a direct result of her work.
The BBC drama The Crimson Field was just one of a number of television programmes designed to capture the interest of viewers. I was one of the historical advisers to the series. I came ‘on board’ quite late in the process, and discovered just how difficult it is to transform real, historical events into engaging drama. Most of my work took place in the safety of my own office, where I commented on scripts. But I did spend one highly memorable – and pretty terrifying – week in a field in Wiltshire working with the team producing the first two episodes. Providing ‘authentic background detail’ while, at the same time, creating atmosphere and constructing characters who are both credible and interesting is fraught with difficulty for producers and directors. Since its release this spring, The Crimson Field has become quite controversial: many people appear to have loved it, while others complained vociferously about its lack of authentic detail. Of course, it is hard to reconcile the realities of history with the demands of popular drama.
I give talks about the nurses of the First World War, and often people come up to me to ask about The Crimson Field. Surprisingly often, their one objection is that the hospital and the nurses were ‘just too clean’. This makes me smile. In these days of contract-cleaners and hospital-acquired infection, we have forgotten the meticulous attention to detail that the nurses of the past gave to the cleanliness of their wards. The depiction of cleanliness in the drama was, in fact, one of its authentic details.
One of the things I remember most clearly about my work on set with The Crimson Field is the remarkable commitment of director David Evans and leading actor Hermione Norris in recreating a scene in which Matron Grace Carter enters a ward which is in chaos because a patient has become psychotic and is attacking a padre. The matron takes a sedative injection from a nurse, checks the medication, and administers the drug with impeccable professionalism – and this all happens in the space of about three minutes. I remember the intensity of the discussions about how this scene would work, and how many times it was ‘shot’ on the day of filming. But I also remember with some chagrin how, the night after filming, I realised that the injection technique had not been performed entirely correctly. I had to tell David Evans that I had watched the whole sequence six times without noticing that a mistake had been made. Some historical adviser! The entire scene had to be re-filmed. The end result, though, is an impressive piece of hospital drama. Norris looks as though she has been giving intramuscular injections all her life. I shall never forget the professionalism of the director and actors on that set – nor their patience with the absent-minded professor who was their adviser for the week.
In a centenary year, it can be difficult to distinguish between myths and realities. We all want to know the ‘facts’ or the ‘truths’ about the First World War, but we also want to hear good stories – and it is all the better if those stories elide a few facts and heighten the drama of events – because, as human beings, we want to be entertained as well. The important thing, for me, is to fully realise what it is we are commemorating: the significance of the contributions and the enormity of the sacrifices made by our ancestors. Being honest to their memories is the only thing that really matters – the thing that makes all centenary commemoration projects worthwhile.
Image credit: Ministry of Information First World War Collection, from Imperial War Museum Archive. IWM Non Commercial Licence via Wikimedia Commons.