a short essay about killing

the poster for Krzysztof Kieslowski’s A Short Film About Killing (1988)

I don’t believe in the death penalty. In this, I’m in the majority, globally. I’m not sure when exactly I became against it; until at least the age of 12 I was pretty much a proto-fascist with an ‘eye-for-an-eye’ sense of justice, as boys tended to be in those days and for all I know still are. But I know that by the time I saw Krzysztof Kieslowski’s brilliantly grim A Short Film About Killing (Krótki film o zabijaniu) when I was 16 or so I was already anti-death penalty and have remained so ever since.

 

My reasons are, typically, kind of pedantic. There are many obvious arguments against it; there’s the ‘what if you accidentally kill the wrong person’ argument and that’s a pretty strong one – it has happened and does happen and is irreversible. There’s the fact that the death penalty seems to have a negligible effect on the crime rate. In fact, countries with the death penalty on the whole seem to have more rather than fewer murders (not that there’s necessarily a link between those two things). Even from the coldest and most reptilian, utilitarian point of view of getting rid of the problem of prison overcrowding, any possible benefit is negated by the fact that in most countries with the death penalty, prisoners spend years on death row being fed and housed, rather than being quickly and efficiently ‘processed.’ There’s also the Gandalfian(!) argument from The Lord of the Rings; “Many that live deserve death. And some that die deserve life. Can you give it to them? Then do not be too eager to deal out death in judgement. For even the very wise cannot see all ends.” This wasn’t just a handy deus ex machina because Tolkien needed Gollum to survive in order to destroy the ring. It was that, but Tolkien was also a devout and serious Christian and that was his moral outlook. Thank the gods that, unlike his friend CS Lewis, he deliberately left religion out of his books though! In the Biblical commandment Thou Shalt Not Kill, the Christian/Jewish god doesn’t list any exceptions or mitigating circumstances – in that one instance. Of course elsewhere in the Bible there are many circumstances where humans killing humans is considered appropriate and even righteous – the ultimate irony being that Jesus, kind of like an anti-Gollum, has to suffer death through violence to achieve his purpose. Religion is odd; but I’m not a Christian or Jew.

All of those points are relevant, but for me personally, it’s far simpler than that; if you can be legally killed, that means that in the eyes of the state there’s essentially nothing wrong with killing people. I think there is, and I don’t think that it should just be a matter of having the right paperwork. In essence, to kill a murderer is not telling them ‘what you did is wrong‘ so much as ‘you did it wrong‘, which I don’t think is a minor difference. And on top of that, there’s the whole question of who you are handing this responsibility of life and death to. I have a lot of respect for some lawyers, attorneys, judges, police officers etc, but there are others that I wouldn’t trust with my lunch, let alone my (or anyone else’s) life. States have a character, and often it is institutionally biased regarding race, class, gender and sexuality. Giving that kind of power within that kind of framework seems likely to create far more problems than it solves. But even in non-death-penalty countries like the UK we routinely give people the legal right to take other people’s lives; all they have to do is join the armed forces.

British volunteers in the International Brigade, 1937

I’m no more consistent than anyone else and my attitudes have their exceptions and contradictions. I (predictably) don’t philosophically differentiate between the military and mercenaries, because what ‘serving your country’ means in practical terms is carrying out whatever the policy of your government is that week, with no certainty that it won’t be contradicted by a new policy (or a new government) the next week, and if enemies suddenly turn out to be allies or vice versa, the dead remain dead. That said – here’s the contradiction – I’m not a pacifist absolutist either, and I think, or like to think, that if an invading army arrived in my country I’d take up arms against it. These things are particular though; everyone likes to think they’d fight for a good cause, but the Spanish Civil War stands out for the number of anti-fascist fighters from all over the world who took up arms in defence of Spain. But that happened partly because so many people were ready to – and wanted to – fight. Many of those – George Orwell is a prominent and typical example – belonged to the generation who had been just too young to fight in World War One and whose feelings about war – including a considerable amount of survivor’s guilt – had been shaped by it. And the fascist attack on the Spanish republic gave them a clear-cut situation to intervene in, in a way that the more political rise of fascism in Italy and Germany didn’t.

But anyway, the death penalty. People of course do terrible things, but although lots of them are significantly more horrific than a lethal injection or the electric chair, the end result is the same. Being – odd, brief segue but bear with me, it’s relevant – a fan of black metal music, the subject of death and murder is one you come across in a different way from just being, say, a fan of horror movies. Because the poser-ish ‘darkness’ of black metal spills over (though less than it used to) into ‘real life,’ almost as if the kind of art you make bears some relation to the kind of person you are. I won’t go into the tedious-but-fascinating Lords of Chaos stuff about Mayhem & Burzum or Absurd because it’s not quite relevant here, but the story of Smutak (Pavel Selyun), who ran the Morak Production record label in Belarus, is.

In 2012 Selyun discovered that his wife, the artist and singer Frozendark (Victoria Selyunova), was having an affair with the artist, zine editor and musician Kronum (Alexey Vladimirovich Utokva). Sticking with the pseudonyms seems appropriate, so anyway; Smutak murdered both Frozendark and Kronum, dismembered them and was apprehended on the subway three days later with Kronum’s head (or skull; same difference I suppose – some accounts say he boiled the head – I don’t need to know) in a bag. After his arrest, he was imprisoned in Minsk and, after a confession gained under torture and the failure of various appeals, he was executed two years later, by being shot in the back of the head. A horrible postscript that demonstrates how the death penalty punishes the innocent as well as the guilty: after the execution the authorities failed to hand over Smutak’s body to his mother or tell her where he was buried, and the case was handed to the UN Court of Human Rights.

Not many people (and certainly not me) would say that Selyun didn’t ‘deserve’ his treatment. But still. He possibly tortured and definitely killed people and then was tortured and killed. There is a kind of balance there, but it’s one in which the act of torturing and killing itself is made neutral. Whoever tortured and killed Smutak doesn’t need any kind of defence because they did it in the name of the law, but the idea that torturing and killing is morally neutral because you don’t have any emotional investment in the act is an odd one. Smutak had nothing to gain from his actions other than some kind of horrible satisfaction. The person or people who did the same to him got paid for it. Which is morally, what? Better? He reportedly felt the same kind of fear as his victims; well good, I guess, but that did nothing to benefit the victims. It may have pleased the victims’ relatives but I wouldn’t want to examine that kind of pleasure too closely.

The current case of Luigi Mangione is far stranger. It’s the only time I can recall that the supporters (in this case I think ‘fans’ would be just as correct a word) of someone accused of murder want the suspect to be guilty rather than innocent. Whether they would still feel that way if he looked different or had a history of violent crime or had a different kind of political agenda is endlessly debatable, but irrelevant. It looks as if the State will be seeking the death penalty for him, and for all the reasons listed above I think that’s wrong. But assuming that he’s guilty, which obviously one shouldn’t do (and if he isn’t, Jesus Christ, good luck getting a fair trial!), Mangione himself, and some of his fans, should really be okay with it. If he is guilty, he hasn’t done anything to help a single person to get access to healthcare or improve the healthcare system or even effectively protested against it in a way that people with political power can positively react to. UnitedHealthcare still has a CEO, still has dubious political connections and still treats people very badly. That doesn’t mean that it’s an unassailable monolith that can never be changed, but clearly removing one figurehead isn’t how it can be done.

But more to the point; why does the killer (assuming their motives are the ones that are being extrapolated from the crime) care anyway? If actually shooting someone dead in the street is okay, then surely being indirectly responsible for the misery and possible deaths of others is barely even a misdemeanour. It amounts to the kind of Travis Bickle movie logic I’m sure I’ve sneered about elsewhere; complaining about the decay of social values and then committing murder is not reducing the sum total of social decay, it’s adding to it. A society where evil CEOs are shot dead in the street is a society where human beings are shot dead in the streets, and that becoming acceptable is not likely to be the pathway to a more just, equal or happy society.

Michael Haneke’s disturbing Benny’s Video (1992)

What the death penalty does do, and probably a key part of why it’s still used in some countries, is offer a punishment that seems (in the case of murder at least) to fit the crime. Interestingly, public executions – which counterintuitively seem to have no better track record as a deterrent than any other kind – are now vanishingly rare. Part of that is no doubt to do with public disgust and part with institutional secrecy and shame, but I imagine that part of it is also the fear that the public would enjoy it too much. I’m not sure if I would think that if it wasn’t for the spate of Islamic State beheadings that were so widely watched on the internet back in the early 2010s (was it?). I watched one, like most people seem to have, and still wish I hadn’t; but you can’t un-ring a bell. That was at the back of my mind when I wrote about saints and martyrdom for this site and I can bring images of it to mind horribly easily. But even before that it shouldn’t have surprised me – like many other teenage horror movie fans in the pre-internet era I watched exploitation videos like Faces of Death that featured executions, accidents etc, and in doing so realised that I was a horror fan and not whatever fans of that are. I should have learned my lesson there, but it’s undeniable that these things have a murky kind of fascination; since then, thanks to one of my favourite writers, Georges Bataille, I’ve ended up reading about Lingchi (‘Death by a Thousand Cuts’) and looking at the chilling and depressing photos of it, been appalled by postcards of lynchings, seen revolting photographs of soldiers’ desecrated bodies and murder victims… I haven’t gotten used to those images and I hope I never will. Teenage me would no doubt sneer at that because he thought that things that are ‘dark’ are cool, but that seems like a laughable and childish attitude to me now, so I can take his sneering. I seem to be edging towards the point that Michael Haneke is making in Funny Games (1997), which I find a bit tiresome and preachy (even more so the remake), but I’m not. I disagree with the premise of that film because I do think there’s a difference between fictional horror and real horror, and that enjoying one isn’t the same as enjoying the other. I think his 1992 film Benny’s Video makes a similar but much more subtle and complex point far better.

Imprisonment (whatever your views on the justice system) is a pretty unsatisfactory solution for most crimes, but it’s difficult to think of a better one which doesn’t essentially exonerate the kind of behaviour we want to characterise as abnormal or criminal. Stealing from a thief is obviously ‘justice’ in the eye-for-an-eye sense, but as a punishment it’s laughable. Raping a rapist would be grotesque and double the number of rapists in the room every time it happened. But even so, it’s never going to be comfortable that the taxpayer is contributing to the relative comfort of someone like (I’ll only mention dead ones, this isn’t a complaint about the legal system being soft on psychopaths) Fred West. A solution I think I might suggest is one which I’m very dubious about myself, from lots of different humanitarian, psychological and philosophical points of view; why not offer (and that word alone would make people angry) ‘monsters’ – the kind of killers in a category of their own, who admit to horrendous acts of murder and torture and whose guilt is not in doubt – those who will never be allowed freedom – the choice of a lethal injection rather than life imprisonment? That’s a horrible thing to contemplate, but then so is paying for the meals and upkeep of someone like Ian Brady, especially when he essentially had the last laugh, exercising his little bit of power over the families of his victims and having his self-aggrandising bullshit book The Gates of Janus published.

Anyway, that last part was kind of icky and uncomfortable, but so it should be – the whole subject is. So for what it’s worth, those are my thoughts on the death penalty. Time for a shower; until next time, don’t murder anyone please.

meted out to the man

Although Mr Musk’s* statement about Hitler, Stalin and Mao is (surely not unexpectedly) ignorant and abhorrent, he is making a serious point that’s worth remembering, even if his reasons for doing so come from a paranoid (wouldn’t normally go straight for the WW2 analogy, but he already did, so why not?) bunker-mentality sense of self-preservation.
Hitler was the main architect of the Holocaust and other Nazi atrocities from murder to mental/physical torture to the indoctrination of children in a misanthropic ideology, and so he therefore bears a large part of the moral responsibility for it. BUT, he genuinely wasn’t standing there in the streets of Warsaw or the hills of Ukraine, swinging small children by the legs and smashing them to death against walls or leading groups of half-starved prisoners into ravines and machine-gunning them, or even holding a gun to the heads of those who did to make sure they did it.

*nice innit? Sounds kind of like a fox from an old children’s book

Stalin’s policies led, both directly and indirectly, to the deaths of millions, but he wasn’t personally there in the salt mines working people to death, or stabbing them in the head with ice-picks or torturing and shooting them because their vision of communism differed from his, or simply because they refused to agree with him.

Mao Zedong instigated vast, dehumanising programmes that decimated the people of his country through famine and starvation and led campaigns that ruthlessly wiped out political opponents – but he did it with words or with a pen, not with bullets or by actually snatching food from people’s mouths.

In all of those cases, those atrocities happened for two reasons: most importantly, because the instigators wanted them; they would not have happened without those three individuals. But also because others, most of whose names are now unknown to us without a lot of tedious and depressing research, were willing to make them happen. The people who murdered and tortured did those things, some no doubt more enthusiastically than others, because they were paid to do so. Now, there are people ending international aid to starving children, or impeding Ukraine’s fight against the invading forces of Russia, or firing veterans or ‘just’ setting up armed cordons around car dealerships and arresting people that they or their superiors are pretending for ideological reasons to think are dangerous aliens – and whatever the level of enthusiasm, they are essentially doing those things because they are being paid to.

Some of these people (it doesn’t matter which era or regime you apply this to, as bodycam and mobile phone footage testifies) perform additional cruelties which they aren’t specifically being paid for, and that their leaders may never even know about, just because they can and because it gratifies them in some way, while others are simply following the orders they are given.

But ‘just following orders’ – complicity, in a word – wasn’t considered a reasonable defence in the war trials of 1945 and it still isn’t one now. And the reptilian act of formulating and issuing dehumanising orders, even (or perhaps especially?) without personally committing any atrocities oneself, isn’t any kind of defence at all. It was, and should be, part of any prosecution’s case for maximum culpability. Leaders require followers and followers need leaders, but you don’t have to be either.

 

what do you look like?

A few years ago a friend sent me a photograph of the ten-year-old us in our Primary School football team. I was able, without too much thought, to put names to all eleven of the boys, but the biggest surprise was that my initial reaction, for maybe a second but more like two seconds, was not to recognise myself. In my defence, I don’t have any other pictures of me at that age, and even more unusually, in that picture I’m genuinely smiling. Usually I froze when a camera was pointed at me (and still do, if it takes too long), but I must have felt safer than usual in a group shot, because it is a real smile and not the standard grimace that normally happened when I was asked to smile for photographs. I could possibly also be forgiven for my confusion because in contrast with my present self, ten year old me had no eyebrows, a hot-pink-to-puce complexion and unmanageably thick, wavy, fair hair; but even so, that was the face I looked at in the mirror every day for years and, more to the point, that gangly child with comically giant hands actually is me; but what would I know?

My favourite of David Hockney’s self portraits – Self Portrait with Blue Guitar (1977)

In a recent documentary, the artist David Hockney made a remark (paraphrased because I don’t have it to refer to) that resonated with me: your face isn’t for you, it’s for other people. And, as you’d expect of someone who has spent a significant part of his long career scrutinising people and painting portraits of them, he has a point. Everyone around you has a more accurate idea of what you look like than you do. Even when you see someone ‘in real life’ who you are used to seeing in photographs or films, there’s a moment of mental recalibration; even if they look like their image, the human being before you in three dimensions is a whole different scale from the thing you are used to seeing. I remember reading in some kids’ novel that the young footballer me liked (I’m guessing Willard Price but can’t swear to it) that when shown photographs of themselves, the indigenous people of (I think) New Guinea not only weren’t impressed but didn’t recognise them as anything in particular. Like Hockney, they had a point; if the Victorian people who invented photography hadn’t grown up with a tradition of ‘realistic,’ representational art, would they have seen any relationship between themselves as living, breathing, colourful, space-filling three-dimensional organisms and the monochromatic marks on little flat pieces of paper? The response of the fictional New Guinea tribespeople is actually more logical than the response (surprise, wonder, awe) that’s expected of them in the novel.

Hockney went on to say that portrait painting (if the sitter is present with the artist) gives a better idea of a person than photography does. At first this is a harder argument to buy into, but it has its own logic too. A photograph, as he pointed out, is a two-dimensional record of one second in time, whereas the portrait painter creates their also two-dimensional image from spending time in the company of the sitter and focusing on them, a different, deeper kind of focus, since it engages the brain as well as the senses, than the technical one that happens with a lens, light and film or digital imaging software. A camera doesn’t care what you are like, it just sees how you look, from that angle, for that second. Maybe my big 10-year-old smile really is representative of how I was, but from memory it doesn’t represent that period for me at all.

Egon Schiele in his studio c.1915 (left) vs his 1913 self-portrait (right)

But I might never have written this had I not been reading Frank Whitford’s excellent monograph on the Austrian expressionist painter Egon Schiele (Thames & Hudson, 1981). Schiele is famous for (among other things) his twisted, emaciated and fanatically awkward self-portraits. The man he depicts is scrawny, elongated, intense, sometimes almost feline and utterly modern. Schiele in photographs, on the other hand, is quite a different presence. He sometimes has the expected haunted look and the familiar shock of hair, and he poses almost as awkwardly, but otherwise he looks surprisingly dapper, civilised, diminutive, square-faced and elfin. But if we think – and it seems logical that we do – that the photographs show us the ‘real’ Schiele, then the descriptions of those who knew him suggest otherwise. “a slim young man of more than average height… Pale but not sickly, thin in the face, large dark eyes and full longish dark brown hair which stood out in all directions. His manner was a little shy, a little timid and a little self-confident. He did not say much, but when spoken to his face always lit up with the glimmer of a quiet smile.” (Heinrich Benesch, quoted in Whitford, p.66). This description doesn’t exactly clash with the Schiele of the photographs (though he never appears especially tall), but it’s somehow far easier to identify with the dark-eyed, paradoxically shy and confident Schiele of the self-portraits. In his own writings, Schiele seems as tortured and intense as in his paintings, but in photographs he appears confident, knowing and slightly arch. His face, as Hockney says, may not have been for him, but he seems to have captured it in his art in ways that his friends and acquaintances recognised, and which the camera apparently didn’t.

Schiele in 1914 by Josef Anton Trčka (left) vs his 1911 self portrait (right)

Which, what, proves Hockney both right (portraiture is superior to photography) and wrong (Schiele knew his own face)? And anyway, what does that have to do with the 10-year old me? Nothing really, except that the camera, objective and disinterested, captured an aspect of me in that second which may or may not have been “true.” Objectivity and disinterestedness are positive qualities for evaluating facts, but when it comes to human beings, facts and truth have a complicated relationship. Photography, through its “realness,” has issues capturing these complexities, unless the photographer is aware of them and – Diane Arbus and Nan Goldin spring to mind – has the ability to imbue their work with more than the obvious surface information that is the camera’s speciality. But manually-created art, with its human heart and brain directing, naturally takes the relationship between truth and facts in its stride.

One final example that proves nothing really, except to my satisfaction. Around the year 1635, the Spanish painter Diego Velázquez was tasked with painting portraits of the assorted fools, jesters, dwarfs and buffoons whose lives were spent entertaining the Spanish court. Most of these people suffered from mental or physical disabilities (or both) and were prized (I think a more accurate word than ‘valued’ in this context) for their difference from ‘normal’ people; in the same way as carnival “freaks” into the early 20th century in fact. Although these people were privileged compared to what their lives would have been like had they not been adopted by the Royal court, their position in the household was more akin to that of pets than friends or even servants. Juan de Calabazas (“John of Gourds”; a gourd was a traditional jester’s attribute) suffered from unknown mental illnesses and physical tics. In a time and place where formality and manners were rigidly maintained, especially around the monarch – where a misstep in etiquette could have serious or even fatal consequences – buffoons like Juan entertained the court with unfettered, sometimes nonsensical or outrageous speech, impulsive laughter and strange, free behaviour. Whereas in normal society these people would be lucky even to survive, in the Court their behaviour was celebrated and encouraged. Velázquez is rightly famous for the empathy and humanity with which he painted portraits of these marginalised figures, but although, as Wikipedia (why not?) puts it: “Velázquez painted [Juan] in a relatively calm state, further showing Velazquez’s equal show of dignity to all, whether king or jester”, that seems an unusual response to the portrait below. It’s not untrue, but for me at least, Velázquez’s process of humanisation is painful too. The knowledge that this man lived his life as a plaything of the rich and powerful, alive only because they found him funny, is troubling enough. But that pathos seems to be embodied in the picture, and you know, or it feels like you know, that Velázquez didn’t find him funny, or at least not only funny. It’s something like watching David Lynch’s The Elephant Man compared to looking at the Victorian photographs of the real Joseph Merrick. Seeing the photographs is troubling; seeing Lynch’s cinematic portrait is troubling too, but it is also deeply moving.

Juan de Calabazas (c.1635-9) by Diego Velázquez

All of which may just be a way of saying that a camera is a machine and does what it does – recording the exterior of what it’s pointed at – perfectly, while a human being does, and feels, many things simultaneously, probably not perfectly. Well I’m sure we all knew that anyway. I eventually got eyebrows, by the way.

 

most things don’t exist

 

eh, Mel Gibson: but he played a good Hamlet (dir Franco Zeffirelli, 1990)

With apologies to Marcel Proust – but not very vehement apologies, because it’s true – the taste of honey on toast is as powerfully evocative and intensely transporting to me as anything that I can think of. The lips and tongue that made that association happen don’t exist anymore and neither does the face, neither do the eyes, and neither does one of the two brains and/or hearts* that I suppose really made it happen (mine are still there, though). In 21st century Britain, it’s more likely than not that even her bones don’t exist anymore, which makes the traditional preoccupation with returning to dust feel apt and more immediate and (thankfully?) reduces the kind of corpse-fetishising morbidity that seems to have appealed so much to playgoers in the Elizabethan/Jacobean era.

Death & Youth (c.1480-90) by the unknown German artist known as The Master of the Housebook

Thou shell of death,
Once the bright face of my betrothed lady,
When life and beauty naturally fill’d out
These ragged imperfections,
When two heaven-pointed diamonds were set
In those unsightly rings: then ’twas a face
So far beyond the artificial shine
Of any woman’s bought complexion

(The Revenger’s Tragedy (1606/7) by Thomas Middleton and/or Cyril Tourneur, Act one, Scene one)

 

*is the heart in the brain? In one sense obviously not, in another maybe, but the sensations associated with the heart seem often to happen somewhere around the stomach; or is that just me?

More to the point, “here hung those lips that I have kissed I know not how oft“, etc. All of which is beautiful; but for better or worse, a pile of ash isn’t likely to engender the same kind of thoughts or words as Yorick’s – or anybody’s – skull. But anyway, the non-existence of a person – or, even more abstractly, the non-existence of skin that has touched your skin (though technically of course all of the skin involved in those kisses has long since disappeared into dust and been replaced anyway) is an absence that’s strange and dismal to think about. But then most things don’t exist.

Vanitas: Still Life with Skull (c.1671) by an unknown English painter

But honey does exist of course; and the association between human beings and sugary bee vomit goes back probably as long as human beings themselves. There are Mesolithic cave paintings, 8000 years old or more, made by people who don’t exist, depicting people who may never have existed except as drawings, or may have once existed but don’t anymore, plundering beehives for honey. Honey was used by the ancient Egyptians, who no longer exist, in some of their most solemn rites, it had sacred significance for the ancient Greeks, who no longer exist, it was used in medicine in India and China, which do exist now but technically didn’t then, by people who don’t, now. Mohammed recommended it for its healing properties; it’s a symbol of abundance in the Bible and it’s special enough to be kosher despite being the product of unclean insects. It’s one of the five elixirs of Hinduism, Buddha was brought honey by a monkey that no longer exists. The Vikings ate it and used it for medicine too. Honey was the basis of mead, the drink of the Celts who sometimes referred to the island of Britain as the Isle of Honey.

probably my favourite Jesus & Mary Chain song: Just Like Honey (1985)

And so on and on, into modern times. But also (those Elizabethan-Jacobeans again): “The sweetest honey is loathsome in its own deliciousness. And in the taste destroys the appetite.” (William Shakespeare, Romeo and Juliet (c.1595), Act 2, Scene 6). “Your comfortable words are like honey. They relish well in your mouth that’s whole; but in mine that’s wounded they go down as if the sting of the bee were in them.” (John Webster, The White Devil (1612), Act 3, Scene 3). See also “honey trap”. “Man produces evil as a bee produces honey.” “You catch more flies with honey.”

But on the whole, the sweetness of honey is not and has never been sinister. A Taste of Honey, Tupelo Honey, “Wild Honey,” “Honey Pie”, “Just like Honey,” “Me in Honey,” “Put some sugar on it honey,” Pablo Honey, “Honey I Sure Miss You.” Honey to the B. “Honey” is one of the sweetest (yep) of endearments that people use with each other. Winnie-the-Pooh and Bamse covet it. Honey and toast tasted in a kiss at the age of 14 is, in the history of the world, a tiny and trivial thing, but it’s enough to resonate throughout a life, just as honey has resonated through the world’s human cultures. Honey’s Dead. But the mouth that tasted so sweetly of honey doesn’t exist anymore. Which is sad, because loss is sad. But how sad? Most things never exist and even most things that have existed don’t exist now, so maybe the fact that it has existed is enough.

“Most things don’t exist” seems patently untrue: for a thing to be ‘a thing’ it must have some kind of existence, surely? And yet, even leaving aside things and people that no longer exist, we are vastly outnumbered by the things that have never existed, from the profound to the trivial. Profound: well, even avoiding offending people and their beliefs, probably few people would now say that Zeus and his extended family are really living in a real Olympus. Trivially, 70-plus years on from the great age of the automobile, flying cars as imagined by generations of children, as depicted in books and films, are still stubbornly absent from the skies above our roads. The idea of them exists, but even if – headache-inducing notion – it exists as a specific idea (“the idea of a flying car”), rather than just within the general realm of “ideas,” an idea is an idea, a thing perhaps but not the thing that it is about. Is a specific person’s memory of another person a particular thing because it relates to a particular person, or does it exist only under the larger and more various banner of “memories”? Either way, it’s immaterial, because even though the human imagination is a thing that definitely exists, the idea of a flying car is no more a flying car than Leonardo da Vinci’s drawing of a flying machine was a flying machine or than my memory of honey-and-toast kisses is a honey-and-toast kiss.

If you or I picture a human being with electric blue skin, we can imagine it and if we have the talent we can draw it, someone could depict it in a film, but it wouldn’t be the thing itself, because human beings with electric blue skin, like space dolphins, personal teleportation devices, seas of blood, winged horses, articulate sentient faeces and successful alchemical experiments, don’t exist. And depending on the range of your imagination (looking at that list mine seems a bit limited), you could think of infinite numbers of things that don’t exist. There are also, presumably, untold numbers of things that do exist but that we personally don’t know about or that we as a species don’t know about yet. But even if it was possible to make a complete list of all of the things in existence (or things in existence to date; new things are invented or develop or evolve all the time), it would always be possible to think of even more things that don’t exist – simply, in the least imaginative way, by naming variations on, or parodies of, everything that does exist. So supermassive black holes exist? Okay, but what about supertiny pink holes? What about supermedium beige holes? This June, a new snake (disappointingly named Ovophis jenkinsi) was discovered. But what about a version of Ovophis jenkinsi that sings in Spanish or has paper bones or smells like Madonna? They don’t exist.

JAMC Honey’s Dead, 1992

Kind of a creepy segue if you think about it (so please don’t), but like those beautifully-shaped lips that tasted of honey, my mother no longer exists, except as a memory, or lots of different memories, belonging to lots of different people. Presumably she exists in lots of memories as lots of different people who happen to have the same name. But unlike supermedium beige holes, the non-existence of previously-existing things and people is complex, because of the different perspectives they are remembered from. But regardless, they are still fundamentally not things anymore. But even with the ever-growing, almost-infinite number of things, there are, demonstrably, more things that don’t exist. And, without wishing to be horribly negative or repeating things I’ve written before, one of the surprises with the death of a close relative was to find that death does exist. Well, obviously, everyone knows that – but not just as an ending or as the absence of life, as was always known, but as an active, grim-reaper-like force of its own. For me, the evidence for that – which I’m sure could be explained scientifically by a medical professional – is the cold that I mentioned in the previous article. Holding a hand that gets cold seems pretty normal; warmth ebbing away as life ebbs away; that’s logical and natural. But this wasn’t the expected (to me) cooling down of a warm thing to room temperature, like the un-drunk cups of tea which day after day were brought and cooled down because the person they were brought for didn’t really want them anymore, just the idea of them. That cooling felt natural, as did the warming of the glass of water that sat un-drunk at the bedside because the person it was for could no longer hold or see it. That water had been cold but had warmed up to room temperature, but the cold in the hand wasn’t just a settling in line with ambient conditions. It was active cold; hands chilling and then radiating cold in quite an intense way, a coldness that dropped far below room temperature. I mentioned it to a doctor during a brief, unbelievably welcome break to get some air, and she said “Yes, she doesn’t have long left.” Within a few days I wished I’d asked for an explanation of where that cold was coming from; where is it generated? Which organ in the human body can generate cold so quickly and intensely? Does it do it in any other situations? And if not, why not? So, although death can seem abstract, in the same sense that ‘life’ seems abstract, being big and pervasive, death definitely exists. But as what? Don’t know; not a single entity, since it’s incipient in everyone, coded into our DNA: but that coding has nothing to do with getting hit by cars or drowning or being shot, does it? So, a big question mark to that. Keats would say not to question it, just to enjoy the mystery. Well, alright then.

Klaus Nomi as “the Cold Genius” from his 1981 version of Purcell’s “The Cold Song”

But since most things *don’t* exist, and death definitely does, existence is, in universal terms, rare enough to be something like winning the lottery. But like winning the lottery, existence in itself is not any kind of guarantee of happiness or satisfaction or even honey-and-toast kisses; but it at least offers the possibility of those things, whereas non-existence doesn’t offer anything, not even peace, which has to be experienced to exist. We have all not existed before and we will all not exist again; but honey will still be here, for as long as bees are at least. I don’t know if that’s comforting or not. But if you’re reading this – and I’m definitely writing it – we do currently exist, so try to enjoy your lottery win, innit.

Something silly about music next time I think.

Ancient Roman vanitas mosaic showing a skull and the wheel of fortune

who’d have them?

My mother died just about a month ago, and I think she/her death is taking up too much space in my conscious mind to trouble my subconscious or unconscious self too much. It’s interesting to note that even though death is one of the central themes of much of the most important art ever created, and although I am someone with an interest in Art, in the capital A, “high culture” sense, what came into my mind in that room, while holding her hand, was actually a line from a song which turned out to have an accuracy I didn’t realise until then: “it’s so cold, it’s like the cold if you were dead.”* Mum wouldn’t have liked that. And if she wasn’t dead I probably wouldn’t be posting what follows online, even though there’s nothing in it she would object to and even though, as far as I’m aware, she never read a word I wrote: which sounds petulant but it’s not a complaint. Our parents know us too well in one way to want them to know us in other ways, or at least that’s how I think I feel about it.

*Plainsong by The Cure, which luckily I’ve barely been able to stand for many years although I really do love it.

Max Ernst – Max Ernst Showing a Young Girl the Head of his Father (1926/7)

Anyway, last night, for the first time in what feels like decades, I dreamed about my dad. The dream was full of vivid, long-forgotten details, most of which almost immediately receded back into the murk of subconscious memory on waking. Not all of them though; how could I have forgotten his strangely hissing laugh (less sinister than it sounds)? But waking up, what was lurking in my mind as the dream faded was, of all things – pop culture strikes again – lines from Stephen King’s IT (which mum read, but dad didn’t; he was squeamish about horror) and a feeling of dread that wasn’t terrifying or even upsetting, just somehow inevitable and in some way kind of comfortable.

That quote comes from a scene in the book when the young protagonists come across the monster, Pennywise, in an old newspaper clipping from 1945. I had no idea that I had absorbed this paragraph, or at least its final lines, first read when I was 14, completely enough to have known it almost word for word, but there it was (I’ve included the whole paragraph for sense):

The headline: JAPAN SURRENDERS – IT’S OVER! THANK GOD IT’S OVER! A parade was snake-dancing its way along Main Street toward Up-Mile Hill. And there was the clown in the background, wearing his silver suit with the orange buttons, frozen in the matrix of dots that made up the grainy newsprint photo, seeming to suggest (at least to Bill) that nothing was over, no one had surrendered, nothing was won, nil was still the rule, zilch still the custom; seeming to suggest above all that all was still lost.  

Stephen King, IT, 1986, p.584 (in the edition I have)

Pietà or Revolution by Night (1923)

Which is not really fair; dad had his faults but he was not a shape-shifting alien clown that ate kids. And anyway, it wasn’t even a nightmare as such. Details are receding – and have almost vanished even since I made the original note this morning – but essentially, nothing bad happened, we were in a house, dad was there, my siblings were there, offering eye-rolling ‘he’s annoying but what can you do?’ support, but what lingers is the last phase before waking – an interminably long, drawn out scene where I was attempting, unsuccessfully, to make coffee for everyone in an unfamiliar kitchen, but couldn’t find the right spoon, with dad behind me watching with condescending amusement and laughing that hissing laugh. And then I woke up to a Stephen King quote. So thanks for that, brain. One of the hardest lessons to learn and re-learn is that other people are none of your business, or to put it less negatively, that you have no claim on any other human being and they have no claim on you. Except for your parents of course; but that’s that dealt with anyway.

nostalgia isn’t going to be what it was, or something like that

When I was a child there was music which was, whether you liked it or not, inescapable. I have never – and this is not a boast – deliberately or actively listened to a song by Michael Jackson, Madonna, Phil Collins, Duran Duran, Roxette, Take That, Bon Jovi, the Spice Girls… the list isn’t endless, but it is quite long. And yet I know some, or a lot, of songs by all of those artists. And those are just some of the household names. Likewise I have never deliberately listened to “A Horse With No Name” by America, “One Night in Bangkok” by Murray Head or “Would I Lie to You” by Charles & Eddie; and yet, there they are, readily accessible should I wish (I shouldn’t) to hum, whistle or sing them, or just have them play in my head, which I seemingly have little control over.

Black Lace: the unacceptable face(s) of 80s pop

And yet, since the dawn of the 21st century, major stars come and go, like Justin Bieber, or just stay, like Ed Sheeran, Lana Del Rey or Taylor Swift, without ever really entering my consciousness or troubling my ears. I have consulted with samples of “the youth” to see if it’s just me, but no: like me, there are major stars that they have mental images of, but unless they have actively been fans, they couldn’t necessarily tell you the titles of any of their songs and have little to no idea of what they actually sound like. Logical, because they were no more interested in them than I was in Dire Straits or Black Lace; but alas, I know the hits of Dire Straits and Black Lace. And the idea of ‘the Top 40 singles chart’ really has little place in their idea of popular music. Again, ignorance is nothing to be proud of and I literally don’t know what I’m missing. At least my parents could dismiss Madonna or Boy George on the basis that they didn’t like their music. It’s an especially odd situation to find myself in as my main occupation is actually writing about music; but of course, nothing except my own attitude is stopping me from finding out about these artists.

The fact is that no musician is inescapable now. Music is everywhere, and far more accessibly so than it was in the 80s or 90s – and not just new music. If I want to hear Joy Division playing live when they were still called Warsaw or track down the records the Wu-Tang Clan sampled or hear the different version of the Smiths’ first album produced by Troy Tate, it takes about as long to find them as it does to type those words into your phone. Back then, if you had a Walkman you could play tapes, but you had to have the tape (or CD – I know CDs are having a minor renaissance, but is there any more misbegotten, less lamented creature than the CD Walkman?). Or you could – from the 1950s onwards – carry a radio with you and listen to whatever happened to be playing at the time. I imagine fewer people listen to the radio now than they did even 30 years ago, but paradoxically, though there are probably many more – and many more specialised – radio stations now than ever, their specialisation actually feeds the escapability of pop music. Because if I want to hear r’n’b or metal or rap or techno without hearing anything else, or to hear 60s or 70s or 80s or 90s pop without having to put up with their modern-day equivalents, then that’s what I and anyone else will do. I have never wanted to hear “Concrete and Clay” by Unit 4+2 or “Agadoo” or “Come On Eileen” or “Your Woman” by White Town or (god knows) “Crocodile Shoes” by Jimmy Nail; but there was a time when hearing things I wanted to hear but didn’t own meant running the risk of being subjected to these, and many other unwanted songs. As I write these words, “Owner of a Lonely Heart” by Yes, a song that until recently I didn’t know I knew, is playing in my head.

And so, the music library in my head is bigger and more diverse than I ever intended it to be. In a situation where there were only three or four TV channels and a handful of popular radio stations, music was a kind of lingua franca for people, especially for young people. Watching Top of the Pops on a Thursday evening, or later The Word on Friday, was so standard among my age group that you could assume that most people you knew had seen what you saw; that’s a powerful experience – not necessarily a bonding one, but a bond of sorts – that I don’t see an equivalent for now, simply because even if everyone you know watches Netflix, there’s no reason for them to have watched the same thing at the same time as you did. It’s not worse, in some ways it’s obviously better; but it is different. Of course, personal taste back then was still personal taste, and anything not in the mainstream was obscure in a way that no music, however weird or niche, is now obscure, but that was another identity-building thing, whether one liked it or not.

Growing up in a time when this isn’t the case and the only music kids are subjected to is the taste of their parents (admittedly, a minefield) or fragments of songs on TV ads (if they watch normal TV) or on TikTok (if they happen to use TikTok), is a vastly different thing. Taylor Swift is as inescapable a presence now as Madonna was in the 80s, but her music is almost entirely avoidable, and it seems probable that few teenagers who are entirely uninterested in her now will find her hits popping unbidden into their heads in middle age. But conversely, the kids of today are more likely to come across “Owner of a Lonely Heart” on YouTube than I would have been to hear one of the big pop hits of 1943 in the 80s.

Faye Dunaway as Bonnie Parker; a little bit 1930s, a lot 1960s

What this means for the future I don’t know; but surely its implications for pop-culture nostalgia – which has grown from its humble origins in the 60s to an all-encompassing industry – are huge. In the 60s, there was a brief fashion for all things 1920s and 30s, which prefigured the waves of nostalgia that have happened ever since. But for a variety of reasons, some technical, some generational and some commercial, pop culture nostalgia is far more elaborate than ever before. We live in a time when constructs like “The 80s” and “The 90s” are well-defined, marketable eras that mean something to people who weren’t born then, in quite a different way from the 1960s version of the 1920s. Even back then, the entertainment industry could conjure bygone times with an easy shorthand; the 1960s version of the 1920s and 30s meant flappers and cloche hats and Prohibition and the Charleston and was evoked on records like The Beatles’ Honey Pie and seen onstage in The Boy Friend or in the cinema in Bonnie & Clyde. But the actual music of the 20s and 30s was mostly not relatable to youngsters in the way that the actual entertainment of the 80s and 90s still is. Even if a teenager in the 60s did want to watch actual silent movies or listen to actual 20s jazz or dance bands, they would have to find some way of accessing them. In the pre-home video era that meant relying on silent movie revivals in cinemas, or finding old records and having the right equipment to play them on, since old music was then only slowly being reissued in modern formats. The modern teen who loves “the 80s” or “the 90s” is spoiled by comparison, not least because those decades’ major movie franchises like Star Wars, Indiana Jones, Ghostbusters and Jurassic Park are still around and their major musical stars still tour or at least have videos and back catalogues that can be accessed online, often for free.

Supergrass in 1996: a little bit 60s, a lot 70s, entirely 90s

Fashion has always been cyclical, but this feels quite new (which doesn’t mean it is though). Currently, culture feels not like a wasteland but like Eliot’s actual Waste Land, a dissonant kind of poetic collage full of meaning and detritus and feeling and substance and ephemera, but at first glance strangely shapeless. For example, in one of our current pop culture timestreams there seems to be a kind of 90s revival going on, with not only architects of Britpop like the Gallagher brothers and Blur still active, but even minor bands like Shed Seven not only touring the nostalgia circuit but actually getting in the charts. Britpop was notoriously derivative of the past, especially the 60s and 70s. And so, some teenagers and young adults (none of these things being as pervasive as they once were) are now growing up in a time when part of ‘the culture’ is a version of the culture of the 90s, which had reacted to the culture of the 80s by absorbing elements of the culture of the 60s and 70s. And while the artists of 20 or 30 years ago refuse to go away, even modern artists, from alternative rock to mainstream pop stars, make music infused with the sound of 80s synths and 90s rock, and so on and on. Nothing wrong with that of course, but what do you call post-post-modernism? And what will the 2020s revival look like when it rears its head in the 2050s, assuming there is a 2050s? Something half interesting, half familiar, no doubt.

an alan smithee war

an annoying but perhaps necessary note; “Alan (or Allan, or Allen) Smithee” is a pseudonym used by Hollywood film directors when they wish to disown a project

Watch out, this starts off being insultingly elementary, but then gets complicated and probably contradictory, quite quickly.

Countries, States and religions are not monoliths and nor are they sentient. They don’t have feelings, aims, motivations or opinions. So whatever is happening in the Middle East isn’t ‘Judaism versus Islam’ or even ‘Israel versus Palestine’, any more than “the Troubles”* were/are ‘Protestantism versus Catholicism’ or ‘Britain versus Ireland’.

* a euphemism, which, like most names for these things is partly a method of avoiding blame – as we’ll see

Places and atrocities aren’t monoliths either; Srebrenica didn’t massacre anybody**, the Falkland Islands didn’t have a conflict, ‘the Gulf’ didn’t have any wars and neither did Vietnam or Korea. But somebody did. As with Kiefer Sutherland and Woman Wanted in 1999 or Michael Gottlieb and The Shrimp on the Barbie in 1990 and whoever it was that directed Gypsy Angels in 1980, nobody wants to claim these wars afterwards. But while these directors have the handy pseudonym Allan Smithee to use, there is no warmongering equivalent, and so what we get is geography, or flatly descriptive terms like ‘World War One’, which divert the focus from the aggressor(s), with only the occasional exception (The American War of Independence) that even references the real point of the war. But, whether interfered with by the studio or not, Kevin Yagher did direct Hellraiser: Bloodline, just as certain individuals really are responsible for actions which are killing human beings as you read this. Language and the academic study of history will probably help to keep their names quiet as events turn from current affairs into history. Often this evasion happens for purely utilitarian reasons, perhaps even unknowingly, but sometimes it is more sinister.

** see?

As the 60s drew to its messy end, the great Terry “Geezer” Butler wrote lines which, despite the unfortunate repeat/rhyme in the first lines, have a Blakean power and truth:

Generals gathered in their masses
Just like witches at black masses

Black Sabbath, War Pigs, 1970

There is something sinister and even uncanny in the workings of power, in the distance between the avowed and the underlying motivations behind military action. Power politics feels like it is – possibly because intuitively it seems like it should be – cold and logical, rather than human and emotional. It doesn’t take much consideration though to realise that even beneath the chilly, calculated actions of power blocs there are weird and strangely random human desires and opinions, often tied in with personal prestige, which somehow seems more important to the person concerned than not killing people or not having people killed.

Anyway, Geezer went on to say:

Politicians hide themselves away
They only started the war
Why should they go out to fight?
They leave that role to the poor

Still Black Sabbath, War Pigs (1970)

And that’s right too; but does that mean Butler’s ‘poor’ should take no responsibility at all for their actions? In the largest sense they are not to blame for war, or at least for the outbreak of war; and conscripts and draftees are clearly a different class again from those who choose to “go out to fight.” But. As so often, WW2 is perhaps the most extreme and therefore the easiest place to find examples; whatever his orders or reasons, the Nazi soldier (and there were lots of them) who shot a child and threw them in a pit actually did shoot a child and throw them in a pit. His immediate superior may have done so too, but not to that particular child. And neither did Himmler or Adolf Hitler. Personal responsibility is an important thing, but responsibility, especially in war, isn’t just one act and one person. Between the originator and the architects of The Final Solution and the shooter of that one individual child there is a chain of people, any one of whom could have disrupted that chain and, even if only to a tiny degree, affected the outcome. And that tiny degree may have meant that that child, that human being, lived or died. A small thing in a death toll of something over 6 million people; unless you happen to be that person, or related to that person.

As with the naming of wars and atrocities, terms like “genocide” and “the Holocaust” are useful, especially if we want – as we clearly do – to have some kind of coherent, understandable narrative that can be taught and remembered as history. But in their grim way, these are still euphemisms. The term ‘the Holocaust’ memorialises the countless – actually not countless, but still, nearly 80 years later, being counted – victims of the Nazis’ programme of extermination. But the term also makes the Holocaust sound like an event, rather than a process spread out over the best part of a decade, requiring the participation of probably thousands of people who exercised – not without some form of coercion perhaps, but still – their free will in that participation. The Jewish scholar Hillel the Elder’s famous saying, “whosoever saves a life, it is as though he had saved the entire world”, is hard to argue with, insofar as the world only exists for us within our perceptions. Even the knowledge that it is a spinning lump of inorganic and organic matter in space, and that other people populate it who might see it differently, only exists in our perceptions. Or at least try to prove otherwise. And so the converse of Hillel’s saying – which is actually included in it but far less often quoted – is “Whosoever destroys one soul, it is as though he had destroyed the entire world.” Which sounds like an argument for pacifism, but while pacifism is entirely viable and valuable on an individual basis as an exercise of one’s free will* – and on occasion has a real positive effect – one-sided pacifism relies on its opponents not taking a cynically Darwinian approach, which is hopeful at best. Pacifism can only really work if everyone is a pacifist, and everyone isn’t a pacifist.

*the lone pacifist can at least say, ‘these terrible things happened, but I took no part in them’, which is something, especially if they used what peaceful means they could to prevent those terrible things and didn’t unwittingly contribute to the sum total of suffering; but those are murky waters to wade in.

But complicated though it all is, people are to blame for things that happen. Just who to blame is more complicated – more complicated at least than the workable study of history can afford to admit. While countries and religions are useful – if misleading, straw-man – scapegoats, even the more manageable unit of a government is, on close examination, surprisingly hard to pin down. Whereas (the eternally handy example of) Hitler’s Nazi Party or Stalin’s Council of People’s Commissars routinely purged heretics, non-believers and dissidents, thus acting as a genuine, effective focus for their ideologies and therefore for blame and responsibility, most political parties allow for a certain amount of debate and flexibility and therefore blame-deniability. Regardless, when a party delivers a policy, every member of that party is responsible for it, or should publicly distance themselves from it if they aren’t.

The great (indeed Sensational) Scottish singer Alex Harvey said a lot of perceptive things, not least this: "[Something] I learned from studying history. Nobody ever won a war. A hundred thousand dead at Waterloo*. No glory in that. Nobody needs that." Nobody ever won a war; but plenty of people, on both sides of every conflict, have lost one – and, as the simple existence of a second world war attests, many, many people have lost a peace too.

*Modern estimates put it at ‘only’ 11,000 dead, plus another 40,000 or so casualties; but his point stands

But the “causes” of war are at once easily traced and extremely slippery. Actions like the 1939 invasion of Poland by the armies of Germany and the USSR were, as military actions still are, the will of certain individuals, agreed to by other individuals and then acted upon accordingly. You may or may not agree with the actions of your government or the leaders of your faith. You may even have had some say in them, but in most cases you probably haven’t. Some of those dead on the fields of Waterloo were no doubt enthusiastic about their cause, some probably less so. But very few would have had much say in the decisions which took them to Belgium in the first place.

The buck should stop with every person responsible for wars, crimes, atrocities; but just because that’s obviously impossible to record – and even if it wasn’t, too complex to write in a simple narrative – that doesn’t mean the buck should simply not stop anywhere. History being written by the winners often means that guilt is assigned to the losers, but even when that seems fair enough (there really wouldn’t have been a World War Two without Hitler) it’s a simplification (there wouldn’t have been an effective Hitler without the assistance of German industrialists) and a one-sided one (it was a World War because most of the leading participants had already waged unprovoked wars of conquest). That was a long sentence. But does the disgusting history of Western colonialism, the arguably shameful treatment of Germany by the Allies after WW1 and the dubious nature of the Allies and some of their actions make Hitler himself any less personally responsible for the war? And does Hitler’s own guilt make the soldier who shoots a child or unarmed adult civilians, or the airman who drops bombs on them, any less responsible for their own actions?

Again; only human beings do these things, so the least we can do is not act like they are some kind of unfathomable act of nature when we discuss them or name them. Here’s Alex Harvey again: "Whether you like it or not, anybody who’s involved in rock and roll is involved in politics. Anything that involves a big crowd of people listening to what you say is politics." If rock and roll is politics, then actual politics is politics squared; and for as long as we settle, however grudgingly or complacently, for pyramidal power structures for our societies, then the person at the top of that pyramid, enjoying its vistas and rarefied air, should be the one to bear its most sombre responsibilities. But all who enable the pyramid to remain standing should accept their share of it too.

So when you’re helplessly watching something that seems like an unbelievable waste of people’s lives and abilities, pay close attention to who’s doing and saying what, even if you don’t want to, because the credits at the end probably won’t tell you who’s really responsible.


chocolate eggs & bunnies & pregnancy & blood: happy Easter!

ceramic sculpture of a Moon Goddess and her rabbit or hare partner, Mexico, c.700 AD

Imagine a culture so centred on wealth, property and power that it becomes scared of something as fundamental to human existence as sex, and frets endlessly about what it sees as the misuses of sex. A culture that identifies breeding so closely with money, wealth and status, and women so closely with breeding and therefore with sex that, when looking to replace the traditional symbols of birth and regeneration, it rejects sex and even nature and, in the end, makes the embodiment of motherhood a virgin and the embodiment of rebirth a dead man. Unhealthy, you might think; misanthropic even – and yet here we are.

But when that misanthropic culture loses the religious imperative that fuelled it for centuries, what should be waiting but those ancient symbols of fertility; rabbits and eggs. But whereas Christianity in its pure, puritanical form found it hard to assimilate these symbols, preferring instead to just impose its own festival of rebirth on top of the pagan one, capitalism, despite being in so many ways compatible with the Judeo-Christian tradition, is essentially uninterested in spiritual matters. So even though capitalism is mostly pretty okay with Christianity, which creates its own consumer-friendly occasions, it proves to be equally okay with paganism, as long as it can sell us the pagan symbols in a lucrative way.

In Christianity the idea of the life cycle is almost surreally reproduced in the (male) Trinity: God the Father, God the Son and God the Holy Spirit – defined by the Fourth Lateran Council of 1215 as "the Father who begets, the Son who is begotten, and the Holy Spirit who proceeds". There’s no room for anything as earthly or earthy as motherhood. The Virgin Mary is essentially a token female presence, and one with her biological female attributes erased. And yet in every society that has worshipped under the Christian banner, child-bearing has historically only been done by women and child-raising has almost entirely been ‘women’s work’ too. Which makes you think that really, patriarchy is one of the great mysteries of humanity, and the fact that it’s seen by many as the natural order of human society is even stranger.

But anyway; Easter. Easter is a mess, even to begin with; its name is pagan (Ēostre or Ôstara, Goddess of the spring) and its Christian traditions, even when embodied in the tragic idea of a man being murdered/sacrificed by being nailed to a cross, were never entrenched enough to suppress the celebratory, even frivolous feeling that spring traditionally brings. Okay, so Christ rising from the dead is pretty celebratory without being frivolous; but as represented, in the UK at least, by a hot cross bun, with the cross on the top to represent the crucifix and even – to play up the morbid factor that is so central to Christianity – spices that are supposed to allude to the embalming of Christ’s dead body, it’s hardly solemn: it’s a bun.

On the other hand, birth, since the dawn of time and to the present day, is not just a simple cause for rejoicing, and in that the Christian tradition – although it tries to remove the aspects that seem most central to birth to us: women, labour (the word presumably wasn’t chosen accidentally) and procreation – probably tells us more about the seriousness and jeopardy of childbirth than the Easter bunny does.

St Margaret, “reborn” after being eaten by a dragon

Childbirth is the central and most fundamental human experience and, until the 20th century it was one of the most perilous ones, so naturally the church had to address it. And so there’s a ‘patron’ (interesting choice of word) saint of childbirth; clearly the Virgin Mary is too specialist to be identified with (and perhaps it would even be blasphemous to do so?) so instead there’s St Margaret. Not much help; firstly, St Margaret should surely be a ‘matron saint’ but that’s not a thing, and secondly, in herself she has nothing to do with birth, although she was presumably born. Instead she becomes the saint of childbirth through the symbolic act of bursting out of the dragon who ate her – a strange analogy but one that reflects the hazardous nature of childbirth in medieval times, when mortality rates were high, not just for babies but for their mothers. And what mother couldn’t relate to bursting out of a dragon? But Christianity’s real issue with the whole topic of birth has less to do with birth itself than how humans reproduce in the first place. Rabbits and hares may represent – in ancient cultures across the world, from Europe to Mexico and beyond – fecundity, but it’s an animal idea of fertility for its own sake that has nothing to do with the practical or emotional aspects of producing new human beings, or the legal, dynastic and financial ones that the Old Testament and the ancient world generally saw as the purpose of reproduction.

Jan & Hubert Van Eyck’s Eve from the Ghent Altarpiece (completed c 1432)

Pregnancy in Western art was a rarity until fairly recently, and the puritanical ideas inherited by Victorian Christianity shaped art-historical studies to the point that people, until quite recently, tended to deny the evidence of their own eyes. Surely to believe that Jan and Hubert van Eyck’s hyper-realistic Eve – the mother of the human race – from the Ghent Altarpiece (completed in 1432) just has the preferred medieval figure, rather than being pregnant, is perverse, isn’t it? Or that Mrs Arnolfini (Costanza Trenta) in the Arnolfini Portrait of 1434 – who is touching her swollen stomach and who had died, presumably in childbirth, the year before the painting was completed – is just an example of that same fashionable shape, seems ridiculously far-fetched. (My favourite among the many theories about the Arnolfini portrait is Margaret Koster’s, which is explored in Waldemar Januszczak’s excellent short film about the painting.)

To go back to Eve; the idea of the first woman pregnant with the first child makes more sense for the 15th century, which was neither squeamish about nor embarrassed by the realities of life in the same way that the 19th and early 20th century gentlemen who codified the canon of Western art history were. It’s not impossible that she is just the medieval/gothic ideal of femininity as seen in illuminated manuscripts and carvings – small shoulders, small breasts, big hips and stomach – given an unusually realistic treatment, but it’s hard to believe that even in the 15th century the first reaction of viewers – especially given the realism of the picture – wouldn’t have been to assume she was pregnant. Culture and society have changed a lot in the intervening centuries, but biology hasn’t.

For subsequent generations, the status of women and the perils of childbirth and childhood gave pregnant women and babies a strange presence in secular art. While there’s no reason to assume that people were less caring or sentimental about their partners or their children, portraits were rarely about sentiment so much as status. Portraits of women, with the rare exception of Queens, were generally portraits of wives or potential wives, and pregnancy was of crucial dynastic importance. But in times when childbirth was almost as likely to end in death as life for both mother and child, it was presumably a risky thing to record; there are not very many pregnant portraits. Maybe – I should probably have investigated this before writing it – the time a portrait took from commission to completion was also a factor that made it risky? A portrait wasn’t a cheap thing; commissioning one of someone who would quite likely be dead within the next nine months possibly felt like an iffy investment, or (to be less mercenary about it) like courting bad luck? In the generations that followed, female artists – such as Élisabeth Vigée-Lebrun – could celebrate motherhood in self-portraits, but for the kind of reasons mentioned above – and because of contemporary ideas of ‘decency’ – they were hardly likely to portray themselves as obviously pregnant.

Gustav Klimt – Hope 1 (1903)

As time went on and connoisseurship and ‘art history’ became a thing, I don’t think it’s too much of an exaggeration to say that the arbiters of high culture in the paternalistic (at best, misogynistic at worst) society of Europe were intimidated by the female power inherent in the creation of the human race. The other side of that coin is the (slightly titillating) sense of the beauty, magic and wonder of pregnancy that the pro-female (philogynist? There must be a word) Austrian Gustav Klimt brought to art with Hope I. Beautiful though that is, Klimt’s vision isn’t really so far from the pure virgin/corrupt whore binary of medieval times, especially when you see his beautiful female figure of hope and renewal glowing against a background of death and peril. It’s really only when women enter the art world in greater numbers that the symbolic and magical aspects of motherhood are reconciled with the more sombre, earthly spirituality that Christianity preferred to represent in a dying man, and that pregnant women can just be pregnant women.

For me, Paula Modersohn-Becker – one of my favourite painters – is the artist of pregnancy and childbirth, and a painting like her Reclining Mother and Child II (1906) shows all of the human aspects that were embodied in the contorted Christian images of the Virgin Mary, crucifixion and Christ’s rebirth. In her self-portraits, the magic of Klimt without the titillating overtones, the fragility and peril of the older periods, and the prosaic facts of pregnancy and what it does, good and bad, to the body are all acknowledged. For once, it doesn’t seem ironic, only tragic, that Modersohn-Becker would be one of the many thousands of women of her era to die from complications shortly after giving birth.

Paula Modersohn-Becker – Reclining Mother & Child II (1906)
Käthe Kollwitz, 1920

But once the reality had been captured, where to go from there? Anywhere, essentially; after Paula Modersohn-Becker pregnancy becomes just a subject, if a special one; art as creation representing creation. That’s a lofty way of putting it, but for the generation of German artists that followed, ‘realism’ was the whole point, some of the time at least. If Paula Modersohn-Becker represented pregnancy from the point of view of experience, capturing both its beauty and discomfort, Otto Dix the arch-realist gives us just the discomfort. His pregnant mothers are almost all exhausted working class women, heavy, swollen, weighed down by their burden. It’s a beautifully-observed point of view, and an empathetic one, but possibly a very male one too. Although Dix claimed, possibly sincerely, "I’m not that obsessed with making representations of ugliness. Everything I’ve seen is beautiful", he nevertheless took a definite pride in shocking viewers with his art. As he also said: "All art is exorcism. I paint dreams and visions too; the dreams and visions of my time. Painting is the effort to produce order; order in yourself. There is much chaos in me, much chaos in our time." By the time Dix painted these pictures he was a father himself, and his paintings of his family reveal a more tender, if just as incisive, aspect to his art. But when he paints these mothers-to-be, with their hard lives in the terminally unstable Weimar Republic, he paints as a pitiless observer, knowing that his work was challenging and confrontational to the generally conservative audience of his time; a time when, like ours, forces of intolerance and conservatism were closing in on the freedom embodied in art this truthful. It’s notable that, while dealing in the same harsh realities as Dix, but with a socially conscious, rather than clinical eye, the artist Käthe Kollwitz gives her women a more studiedly pitiable, though no less ‘realistic’ aura.

But the fact that Dix’s realism, though ‘objective’, was dramatically heightened is highlighted by a comparison between two paintings, one by Dix and the other by his female student Gussy Hippold-Ahnert, painted in 1931/2 and of – I think – the same model. In Dix’s painting, his most famous painting of a pregnant woman, the mother-to-be’s face is averted, hidden in darkness, and it’s her almost painful roundness and heaviness that is the focus of the picture. In Hippold-Ahnert’s painting, far less dramatically, the mother sits more or less neatly, looking big but not unhappy. It’s a less dynamic and less assured piece of work – but is it any less real? In Dix’s realism, reality is generally harsh and pitiless, with no veneer of politeness or sentimentality. But although that represents a kind of underlying truth, especially about nature, people who are often savage and cruel are nevertheless just as often also polite and sentimental. Gussy’s painting seems less powerful, but she is not showing us, as Dix seems to be, a faceless being representing the eternal but rarely-remarked-on hardship involved in the joyous business of continuing the human species. Instead, she shows us a woman who happens to be pregnant; both paintings are realistic, both are objective and, as with the symbolic sacrifice of Christ and the eternally recurring Easter bunny, both display different aspects of the truth.

Otto Dix – Pregnant Woman (1931) & Gussy Hippold-Ahnert – Pregnant Woman (1932)

Since the 1920s, attitudes towards pregnancy and women have fluctuated, but female artists are no longer the exception within the art world, and so women in art can be women in art and not women as symbols in art. And although male artists have continued – and why not? – to paint pregnant sitters (Lucian Freud’s Pregnant Girl is a beautiful, not uncomplicated example), not surprisingly women do it better. And while I’m not sure if my favourites – Alice Neel and Paula Rego spring to mind – add anything in terms of content and meaning to Paula Modersohn-Becker’s example, what they do add is more experience, wider experience, and therefore they bring a truer reflection of the source and the central experience of humanity to the world. Regardless of whether or not one believes in a god, everyone believes in that creation story; which is kind of more important than an old, bearded man, a young, sacrificed man and a bird; but it doesn’t matter, there’s room in art for everything. Anyway, enjoy your chocolate eggs.

Paula Rego – The First Mass in Brazil (1993)
Bonus picture: my favourite bunny in art: detail from Piero di Cosimo’s Venus, Mars & Cupid (1505)


the cult of maimed perfection

*firstly, may change this title as it possibly sounds like I’m saying the opposite of what I’m saying*

That western culture¹ has issues with women’s bodies² is not a new observation. But it feels like the issues are getting stranger. Recently there have been, both on TV (where the time of showing is important) and online (where it isn’t), cancer awareness campaigns where women who have had mastectomies are shown topless (in the daytime). This is definitely progress – but it simultaneously says two different things with very different implications.
On the one hand it’s – I would say obviously – very positive; it is of course normal to have a life-changing (or life-saving) operation and the scars that come with it, and it can only be helpful to minimise the fear surrounding what is a daunting and scary prospect for millions of people. Normalising in the media things that are already within the normal experience of people – especially when those things have tended to be burdened with taboos – is generally the right thing to do. These scars, after all, are nothing to be ashamed of or that should be glossed over or hidden from view. I hope that not many people would argue with that. But at the same time, isn’t it also saying, ‘yes it’s completely normal and fine for a woman to be seen topless on daytime TV, or on popular social media sites, as long as she’s had her breasts³ cut off?’ That seems less positive to me.

¹ Western culture isn’t alone in this, but ‘write about what you know’ (not always good advice, but still). I’m also aware that this whole article could be seen as a plea for more nudity. I’m not sure that’s what I mean

² might as well say it, this article deals mainly with old fashioned binary distinctions, but misogyny applies equally to trans women and I think what I say about men probably applies equally to trans men. 

³ or her nipples, on social media

Raphael – The Three Graces (1505) nudity acceptable due to classical context

Essentially, this positive and enlightened development seems to be inadvertently(?) reinforcing ancient and (surely!) redundant arguments, in a completely confused way. ‘Non-sexual nudity’, whatever that means, has always been okay with the establishment(s) in some circumstances. Now, one could argue from the context (cancer awareness campaign) that the nudity is desexualised, and I think that’s why it is allowed to be aired at any time of day. In fact, the Ofcom (UK TV regulating authority) rules on nudity – which are aimed at ‘protecting the under-18s’ from nudity, a strange concept, as it always has been* – are pretty simple:

Nudity

1.21: Nudity before the watershed [9 pm in the UK], or when content is likely to be accessed by children […] must be justified by the context.

*Interestingly, Ofcom’s rules about nudity are listed between their rules about ‘Sexual behaviour’ and their rules about ‘Exorcism, the occult and the paranormal’

So presumably, Ofcom (rightly) considers this context to be justified, because the naked body is not being presented in a sexual context. But, at the same time, one thing the cancer awareness film demonstrates – and which I think it’s partly supposed to demonstrate – is that there’s nothing undesirable about the female body post-mastectomy. I mean, possibly that’s just me, projecting the notorious male gaze onto the subject, as if that’s the determining factor in what attractiveness is or isn’t, but let’s ignore that. Of course, the people that devised and created the film are not the same people that determine what can be shown on TV or online and when, but they would surely have been aware of the rules that they were working within.

Even accepting that it’s permitted to show a topless woman on TV during the daytime because it’s ‘de-sexualised nudity’, why is that better? Two opposing sides of that argument – a puritanical, right-wing one and a feminist one – might both be (rightly?) skeptical of me, as a heterosexual male writing about this. But if women have to be de-sexualised to be regarded equally, or taken seriously, to not be somehow reduced by the male gaze (or damaging to the child’s gaze, since nudity on TV tends to be fine after children’s standard bedtimes and on the internet is theoretically policed by child locks) then that seems no less problematic than – and not even very different from – the traditional, paternalistic Western view which sees the Virgin Mary as the ultimate exemplar of female-kind. And if sex or desire is itself the problem then not allowing specifically female nudity is also, typically, reducing the visibility of women for what is in essence a problem of male behaviour.

Sebastiano del Piombo – The Martyrdom of St Agatha (1520)

It’s worth looking at the fact that nudity is even an issue in the first place, considering that we all privately live with it, or in it, every day of our lives. In many world cultures, of course, it isn’t and never has been a problem, unless/until Westerners have interfered with and poisoned those cultures, but it’s widespread enough elsewhere too to be a human, rather than purely western, quirk. It possibly has a little to do with climate, but it definitely has a lot to do with religion.

But the fact is that, in Western culture, even before the era of the Impressionists and their selectively nude women or the (as it now looks, very selectively) permissive society of the 1960s, female nudity has been perfectly acceptable to depict for hundreds of years; as long as the nude female is either mutilated (say, a virtuous martyr like the Roman suicide Lucretia), the victim of alien (non-Christian) assailants (various saints*) or, turning the tables, a heathen herself (various classical figures, plus Biblical villains like Salome; a favourite subject with the same kind of sex & violence frisson as Lucretia).

* I didn’t realise when I posted this article that today (5th February) is the Feast day of St Agatha, the patron saint of – among other things – breast cancer. I’m not a believer in the supernatural or supreme beings, but that’s nice.

Even in Reformation Germany – surely one of the least frisky periods in the history of Western civilisation – in the private chambers of the privileged male viewer, nudity – especially female nudity – was there in abundance, providing it came with various kinds of extenuating nonsense; dressed up (or rather, not dressed up) in the trappings of classical antiquity. Okay, so maybe a woman can’t be flawless like Christ, but she can be nude and beautiful too, as long as she is being murdered, or stabbing herself to preserve her virtue, or is sentenced to everlasting damnation.

Lucas Cranach the Elder – Lucretia (1528)

Men could, in art, and can on TV or anywhere else, be more or less naked (admittedly with a fig-leaf or something similar) at pretty much any time because – I assume – of Jesus. Otherwise how to explain it? The male chest is arguably less aesthetically pleasing than the female one, and certainly less utilitarian in the raising of infants, but in deciding that it is less sexual, our culture makes lots of assumptions and takes its cues from religious, patriarchal roots.

The dissonance between the ways that female and male nudity are treated in our culture has its roots in Christianity and its iconography and although in the UK we’re technically the children of the Reformation, what’s striking is how little difference there really was between the way nudity was treated in the Catholic renaissance and the Protestant one.

In both Catholic and Protestant cultures, the art that was not solely designed for the private, adult (male) gaze was almost entirely religious. Popes and Puritans both found themselves in the same odd position; Jesus must be perfect and preferably therefore beautiful, whatever that meant at the time – but more than that, it would be blasphemous – literally criminal – not to portray Christ as beautiful4. But in addition to being perfect, he must, crucially, be human. Understandably, but ironically, it seemed the obvious way to depict human beauty and perfection was without the burden of clothes. The human aspect is after all how the people of the Renaissance could (and I presume people still can) identify with Christ, in a way that they never do with God in other contexts, where that identification would be as blasphemous as a deliberately ugly Christ.

But how was one supposed to regard the nearly nude, technically beautiful body of Christ? With reverence, of course. But revering and worshipping the naked beautiful body of a perfect human being is not something that a misanthropic (or if that’s too strong, homo-skeptic5) religion can do lightly. Helpfully, the part of Christianity that puts the (nearly) naked figure at the centre of our attention is the human sacrifice ritual of the crucifixion and its aftermath.  That bloody, pain-filled ritual allows the viewer to look at Jesus with pity and empathy and tempers (one would hope; but people) the quality of desire that the naked beautiful body of a perfect human being might be expected to engender. And to that Renaissance audience, the reason for that desire was another, but far more ambiguous subject for artists; Adam and Eve.

4 There are special cases though, see below re Grunewald

5 Doesn’t Alan Partridge call himself homoskeptic at some point? What I mean is – and I’m sure many Christians would take serious issue with this – that Christianity/the Christian God is in theory all-accepting of humans and their frailties, but somehow humans just as they are never seem to be quite good enough to escape negative judgement. Not just for things like murder or adultery that are within their power to not do, but things that are in their nature like envy and greed, and which were placed in their nature by God. And then, making a human being who must be killed for the things that other human beings have done or will by their nature do seems on the one hand, not very different from a horror movie pagan blood sacrifice cult and, on the other, kind of misanthropic

Hans Baldung Grien’s slightly diabolical looking Adam & Eve (1531)

Adam and Eve were a gift to the Renaissance man seeking pervy thrills from his art collection because they are supposed to be sexy. Here are the first humans, made, like Christ, in God’s image and therefore outwardly perfect; and, to begin with, happily nude. But in (almost immediately) sullying the human body, Adam and Eve are fallible where Christ is not. But how to depict the people that brought us the concept of desire except as desirable? Because they are not only not our saviours, but the actual opposite, their nudity can afford to be alluring, as long as the lurking threat of that attraction is acknowledged.

Alongside the problems of the iconography in art came the practical problems of making it; and I think that one of the reasons that, of the four main ‘Turtles’ of the Italian Renaissance,6 Raphael was elevated to the status he enjoyed for centuries, is that his nude women suggested that he might actually have seen some nude women. For all their athletic/aesthetic beauty, figures like Michelangelo’s Night (see below) and his Sistine Chapel Sibyls are the product of someone who found that the church’s strictures on female nudity (no nude models) happened to strike a chord with his own ideas of aesthetic perfection. Likewise,  Leonardo’s odd hybrid woman, the so-called Monna Vanna (possibly posed for by one of his male assistants) seems to demonstrate an uncharacteristic lack of curiosity on the artist’s part.

6 childish

Michelangelo – Night, Basilica di San Lorenzo in Florence (1526-31) and Leonardo(?) Monna Vanna (c.1500)

One way around the problem of naked human beauty was – as it seems still to be – to mutilate the body. Paintings like Matthias Grünewald’s agonised, diseased-looking Jesus (perhaps the most moving depiction of Christ, designed to give comfort and empathy to sufferers of skin diseases) and, on (mostly) a slightly shallower level, the myriad Italian paintings of the martyrdom of St Sebastian, do much the same as those Lucretias and St Agathas; they show the ideal of the body as god intended it, while punishing its perfection so that we can look at it without guilt.

This feels, for all its beauty, like the art of sickness. What kind of response these St Sebastians are supposed to evoke can only be guessed at; and the guesses are rarely ones the original owners of the paintings would have liked. Empathy with and reverence for the martyred saint, obviously; but while Grunewald’s Christ reflects and gives back this sense of shared humanity with the weight of his tortured body and his human suffering, St Sebastian gives us, what? Hope? Various kinds of spiritual (it’s in the eyes) and earthly (relaxed pose and suggestive loincloth) desire?

Grunewald’s agonised Christ from the Isenheim Altarpiece (1515) and one of Pietro Perugino’s fairly comfortable-with-his-situation St Sebastians (1495)

There are lots of fascinating themes and sub-themes here, but for now, there you have it; Christ may have, spiritually, redeemed all of humankind, but aesthetically speaking, women remain (as they say in Narnia) ‘daughters of Eve’.

Nowadays, tired presumably of the restrictions on their lives, men have liberated themselves enough that we don’t even need St Sebastian’s spiritual gaze, or a hint of damnation, to justify our nudity. In what remains an essentially patriarchal society, just advertising a razor, or underwear, or perfume, or chocolate, or taking part in a swimming event, or even just being outside on a warm day is enough to justify our bodies, as long as they don’t veer too far from that Christlike ideal, and as long as they aren’t visibly excited. But even now, women – who can look like humanity’s mother Eve, but not our reborn father Christ – can be more or less naked too, at any time of day they like (on TV or online at least); just as long as they are mutilated.

the law won – police academy and 80s pop culture

In the 2020s, the police may feel beleaguered by the pressure to account for their actions and act within the boundaries of the laws that they are supposed to be upholding, but despite the usual complaints from conservative nostalgists about declining standards of respect, the question of ‘who watches the watchmen’ (or, ‘who will guard the guards’, or however Quis custodiet ipsos custodes? is best translated) is hardly new, and probably wasn’t new even when that line appeared in Juvenal’s Satires in the 2nd century AD.

In the UK (since I’m here), the modern police force and quasi-police forces like the Bow Street Runners have almost always been controversial, from their foundations in the 18th and 19th centuries onwards – and not surprisingly so.

It’s probably true that the majority of people have always wanted to live their lives in peace, but ‘law and order’ is not the same thing as peace. The ‘order’ comes from the enforcement of the law, and ‘the law’ has never been a democratically agreed set of rules. So law and order is always somebody’s law and order, but not everybody’s. As is often pointed out, most of the things which we now regard as barbaric, from slavery and torture to child labour and the lack of universal suffrage, were all technically legal in their time. ‘Respect for the law’ may not just be a different thing from respect for your fellow human beings, it might be (and often has been) the opposite of it; so it’s no wonder that the position of the gatekeepers of the law should often be ambiguous at best.

the Keystone Cops in the nineteen-teens

Popular culture, as it tends to do – whether consciously or not – reflects this uneasy situation. Since the advent of film and television, themes of law enforcement and policing have been at the centre of some of those mediums’ key genres, but the venerable Dixon of Dock Green notwithstanding, the focus is only very rarely on orthodox police officers faithfully following the rules. Drama almost invariably favours the maverick individualist who ‘gets the job done’* over the methodical, ‘by the book’ police officer, who usually becomes a comic foil or worse. And from the Keystone Cops (or sometimes Keystone Kops) in 1912 to the present day, the police in comedies are almost invariably either inept or crooked (or both; but more of that later).

*typically, the writers of Alan Partridge manage to encapsulate this kind of stereotype while also acknowledging the ambiguity of its appeal to a conservatively-minded public. Partridge pitches ‘A detective series based in Norwich called “Swallow“. Swallow is a detective who tackles vandalism. Bit of a maverick, not afraid to break the law if he thinks it’s necessary. He’s not a criminal, you know, but he will, perhaps, travel 80mph on the motorway if, for example, he wants to get somewhere quickly.’ i.e. he is in fact a criminal, but one that fits in with the Partridgean world view

But perhaps the police of the 2020s should think themselves lucky; they are currently enduring one of their periodic crisis points with public opinion, but they aren’t yet (again) a general laughing stock; perhaps because it’s too dangerous for their opponents to laugh at them, for now. But almost everyone used to do it. For the generations growing up in the 70s and 80s, whatever their private views, the actual police force as depicted by mainstream (that is, mostly American) popular culture was almost exclusively either comical or the bad guys, or both.

redneck police: Clifton James as JW Pepper (Live and Let Die), Jackie Gleason as Buford T Justice (Smokey and the Bandit), Ernest Borgnine as ‘Dirty Lyle’ Wallace (Convoy), James Best as Rosco P Coltrane (Dukes of Hazzard)
the same but different; Brian Dennehy as Teasle in First Blood

The idiot/yokel/corrupt/redneck cop has an interesting cinematic bloodline, coming into its own in the 1960s with ambivalent exploitation films like The Wild Angels (1966) and genuine Vietnam-war-era countercultural artefacts like Easy Rider, but modulating into the mainstream – and the mainstream of kids’ entertainment at that – with the emergence of Roger Moore’s more comedic James Bond in Live and Let Die in 1973. This seems to have tonally influenced similar movies like The Moonrunners (1975; which itself gave birth to the iconic TV show The Dukes of Hazzard, 1979-85), Smokey and the Bandit (1977), Any Which Way You Can (1980) and The Cannonball Run (1981) among others. Variations of these characters – police officers concerned more with the relentless pursuit of personal vendettas than actual law enforcement – appeared (sometimes sans the redneck accoutrements) in both dramas (Convoy, 1978) and comedies (The Blues Brothers, 1980), while the more sinister, corrupt but not necessarily inept police that pushed John Rambo to breaking point in First Blood (1982) could also be spotted harassing the (equally, if differently, dysfunctional) Vietnam vets of The A-Team from 1983 to ’85.

iconic movie; iconic poster

In fact, the whole culture of the police force was so obviously beyond redemption as far as the makers of kids’ and teens’ entertainment were concerned, that the only cops who could be the good guys were the aforementioned ‘mavericks.’ These were borderline vigilantes who bent or broke or ignored the rules as they saw fit, but who were inevitably guided by a rigid sense of justice that was generally unappreciated by their superiors. This kind of cop reaches some kind of peak in Paul Verhoeven’s masterly Robocop (1987). Here, just beneath the surface of straightforward fun sci-fi/action movie violent entertainment, the director examines serious questions of ‘law’ vs ‘justice’ and the role of human judgement and morality in negotiating between those two hopefully-related things. Robocop himself is, as the tagline says, ‘part man, part machine; all cop’, but the movie also gives us pure machine-cop in the comical/horrific ED-209, which removes the pesky human element that makes everything so complicated and gives us instead an amoral killing machine. The film also gives us the good and bad of the humans involved, in the persons of Officer Lewis and OCP executive Dick Jones. Lewis (the always-great Nancy Allen) has a sense of justice no less keen than that of her robot counterpart, but her power is limited by the machinations of the corrupt hierarchy of the organisation she works for, and she’s vulnerable to physical injury. Jones (the brilliant Ronny Cox) is very aware of both the practical and moral problems with law enforcement, but he’s more than happy to benefit personally from them.

Part Man, Part Blue Jeans; All Cop

The following year, Peter Weller (Robocop himself) returned in the vastly inferior Shakedown, worthy of mention because it too features unorthodox/mismatched law enforcers (a classic 80s trope; here it’s Weller’s clean-cut lawyer and Sam Elliott’s scruffy, long-haired cop) teaming up to combat a corrupt police force; indeed the movie’s original tagline was Whatever you do… don’t call the cops. And it’s also worthy of mention because its UK (and other territories) title was Blue Jean Cop, though it sadly lacked the ‘part man, part blue jean; all cop’ tagline one would have hoped for. Into the 90s, this kind of thing seemed hopelessly unsophisticated, but even a ‘crooked cops’ masterpiece like James Mangold’s Cop Land (1997) relies, like Robocop, on the police – this time in the only mildly unconventional form of a good, simple-minded cop (Sylvester Stallone) – to police the bad, corrupt, too-clever police, enforcing the rules that they have broken so cavalierly. The film even ends with the explicit statement (via a voiceover) that crime doesn’t pay; despite having just shown the viewer that if you are the police, it mostly seems to, for years, until someone else on the inside decides they don’t like it.

There’s always an ironic focus on ‘the rules’ – ironic because the TV and movie police tend to be bending them à la Starsky and Hutch (and the rest), or ineffectually wringing their hands over that rule-bending, like the strait-laced half of almost every mismatched partnership (classic examples being Judge Reinhold in 1984’s Beverly Hills Cop and Danny Glover in Lethal Weapon, another famous ‘unorthodox cop’ movie from the same year as Robocop) or even disregarding them altogether like Clint Eastwood’s Dirty Harry. So, it’s no surprise that the training of the police and the learning of those rules should become the focus of at least one story. Which brings us to Police Academy.

the spiritual children of the Keystone Cops

Obviously any serious claim one makes for Police Academy is a claim too far. It’s not, nor was it supposed to be, a serious film, or even possibly a good film, and certainly not one with much of a serious message. But its theme is a time-honoured one; going back to the medieval Feast of Fools and even further to the Roman festival of Saturnalia, it’s the world upside down, the Lords of Misrule. And in honouring this tradition, the film tells us a lot about the age that spawned it. Police Academy purports to represent the opposite of what was the approved behaviour of the police in 1984 and yet, despite its (not entirely unfounded) reputation for sexism and crass stereotypes, it remains largely watchable where many similar films do not. But, more surprisingly, it also feels significantly less reactionary than, say, the previous year’s Dirty Harry opus, Sudden Impact.

While it’s a trivial piece of fluff, Police Academy is notable for – unlike many more enlightened films before and since – passing the Bechdel test. Don’t expect anything too deep, from the female characters or anyone else, but it also has noticeably more diversity among its ensemble cast than the Caddyshack/National Lampoon type of films that were in its comedy DNA. Three prominent African-American characters with more than cameo roles in a mainstream Hollywood movie may not seem like much – and it definitely isn’t – but looking at the era it feels almost radical. At this point in Hollywood history, let’s not forget, the idea for a film where a rich white kid finds that the easiest way to get into college is by disguising himself as a black kid not only got picked up by a major studio, but actually made it to the screen.

In that context, these three actors – Marion Ramsey, Michael Winslow and the late Bubba Smith – could look back on a series of movies which may not have been* cinematic masterpieces, but which allowed them to use their formidable comedic talents in a non-token way. More to the point, their race is neither overlooked in a ‘colourblind’ way (they are definitely Black characters rather than just Black actors playing indeterminate characters) nor portrayed in a negative sense. Police Academy is not an enlightened franchise by any means; the whole series essentially runs on stereotypes and bad taste and therefore has the capacity to offend pretty much everyone. But although there are almost certainly racial slurs to be found there, alongside (for sure) gross sexism, homophobia etc, the series is so determined to make fun of every possible point of view that it ends up leaving a far less bad smell behind it than many of its peers did; perhaps most of all the previously alluded to Soul Man (1986).
*ie they definitely aren’t

Despite its essential good nature though, there is a genuine, if mild, kind of subversion to be found in the Police Academy films. With the Dickensian, broadly-drawn characters comes a mildly rebellious agenda (laughing at authority), but it also subverts, in a more subtle (and therefore unintentional? who knows) way, the established pattern of how the police were depicted. Yes, they are a gang, and as such they are stupid and corrupt and vicious and inept, just like the police of Easy Rider, Smokey and the Bandit, The Dukes of Hazzard etc. Unlike all of those films and franchises though, Police Academy offers a simple solution in line with its dorky, good-natured approach; if you don’t want the police to suck, it implies, what you need to do is to recruit people who are not ‘police material.’ In the 1980s those who were not considered traditional ‘police material’ seemingly included ethnic minorities, women, smartasses, nerds, and at least one dangerous gun-worshipper, albeit one with a sense of right and wrong that was less morally dubious than Dirty Harry’s. So ultimately, like its spiritual ancestors, Saturnalia and the Feast of Fools, Police Academy is more like a safety valve that ensures the survival of the status quo than a wrecking ball that ushers in a new society. Indeed, as with Dickens and his poorhouses and brutal mill owners, the message is not – as you might justifiably expect it to be – ‘we need urgent reform’, but instead ‘people should be nicer’. It’s hard to argue with, as far as it goes, but as always seems to be the case*, the police get off lightly in the end.

The Boys in Blue (1982). Christ

*one brutal exception to this rule is roughly the UK equivalent of Police Academy, the risible 1982 Cannon & Ball vehicle The Boys In Blue. After sitting through an impossibly long hour and a half of Tommy and Bobby, the average viewer will want not only to dismantle the police force, but also to set fire to the entire western culture that produced it.