most things don’t exist

 

eh, Mel Gibson: but he played a good Hamlet (dir. Franco Zeffirelli, 1990)

With apologies to Marcel Proust – but not very vehement apologies, because it’s true – the taste of honey on toast is as powerfully evocative and intensely transporting to me as anything that I can think of. The lips and tongue that made that association happen don’t exist anymore and neither does the face, neither do the eyes, and neither does one of the two brains and/or hearts* that I suppose really made it happen (mine are still there, though). In 21st century Britain, it’s more likely than not that even her bones don’t exist anymore, which makes the traditional preoccupation with returning to dust feel apt and more immediate and (thankfully?) reduces the kind of corpse-fetishising morbidity that seems to have appealed so much to playgoers in the Elizabethan/Jacobean era.

Death & Youth (c.1480-90) by the unknown German artist known as The Master of the Housebook

Thou shell of death,
Once the bright face of my betrothed lady,
When life and beauty naturally fill’d out
These ragged imperfections,
When two heaven-pointed diamonds were set
In those unsightly rings: then ’twas a face
So far beyond the artificial shine
Of any woman’s bought complexion

(The Revenger’s Tragedy (1606/7) by Thomas Middleton and/or Cyril Tourneur, Act 1, Scene 1)

 

*is the heart in the brain? In one sense obviously not, in another maybe, but the sensations associated with the heart seem often to happen somewhere around the stomach; or is that just me?

More to the point, “here hung those lips that I have kissed I know not how oft”, etc. All of which is beautiful; but for better or worse, a pile of ash isn’t likely to engender the same kind of thoughts or words as Yorick’s – or anybody’s – skull. But anyway, the non-existence of a person – or, even more abstractly, the non-existence of skin that has touched your skin (though technically of course all of the skin involved in those kisses has long since disappeared into dust and been replaced anyway) – is an absence that’s strange and dismal to think about. But then most things don’t exist.

Vanitas: Still Life with Skull (c.1671) by an unknown English painter

But honey does exist of course; and the association between human beings and sugary bee vomit goes back probably as long as human beings themselves. There are Mesolithic cave paintings, 8000 years old or more, made by people who don’t exist, depicting people who may never have existed except as drawings, or may have once existed but don’t anymore, plundering beehives for honey. Honey was used by the ancient Egyptians, who no longer exist, in some of their most solemn rites; it had sacred significance for the ancient Greeks, who no longer exist; it was used in medicine in India and China, which do exist now but technically didn’t then, by people who don’t, now. Mohammed recommended it for its healing properties; it’s a symbol of abundance in the Bible and it’s special enough to be kosher despite being the product of unclean insects. It’s one of the five elixirs of Hinduism; Buddha was brought honey by a monkey that no longer exists. The Vikings ate it and used it for medicine too. Honey was the basis of mead, the drink of the Celts, who sometimes referred to the island of Britain as the Isle of Honey.

probably my favourite Jesus & Mary Chain song: Just Like Honey (1985)

And so on and on, into modern times. But also (those Elizabethan-Jacobeans again): “The sweetest honey is loathsome in its own deliciousness. And in the taste destroys the appetite.” (William Shakespeare, Romeo and Juliet (c.1595), Act 2, Scene 6). “Your comfortable words are like honey. They relish well in your mouth that’s whole; but in mine that’s wounded they go down as if the sting of the bee were in them.” (John Webster, The White Devil (1612), Act 3, Scene 3). See also “honey trap”; “Man produces evil as a bee produces honey”; “You catch more flies with honey.”

But on the whole, the sweetness of honey is not and has never been sinister. A Taste of Honey, Tupelo Honey, “Wild Honey”, “Honey Pie”, “Just Like Honey”, “Me in Honey”, “Put some sugar on it honey”, Pablo Honey, “Honey I Sure Miss You”, Honey to the B. “Honey” is one of the sweetest (yep) of endearments that people use with each other. Winnie-the-Pooh and Bamse covet it. Honey and toast tasted in a kiss at the age of 14 is, in the history of the world, a tiny and trivial thing, but it’s enough to resonate throughout a life, just as honey has resonated through the world’s human cultures. Honey’s Dead. But the mouth that tasted so sweetly of honey doesn’t exist anymore. Which is sad, because loss is sad. But how sad? Most things never exist and even most things that have existed don’t exist now, so maybe the fact that it has existed is enough.

“Most things don’t exist” seems patently untrue: for a thing to be ‘a thing’ it must have some kind of existence, surely? And yet, even leaving aside things and people that no longer exist, we are vastly outnumbered by the things that have never existed, from the profound to the trivial. Profoundly: even avoiding offending people and their beliefs, probably few people would now say that Zeus and his extended family are really living in a real Olympus. Trivially: 70-plus years on from the great age of the automobile, flying cars as imagined by generations of children, as depicted in books and films, are still stubbornly absent from the skies above our roads. The idea of them exists, but even if – headache-inducing notion – it exists as a specific idea (“the idea of a flying car”), rather than just within the general realm of “ideas”, an idea is an idea, a thing perhaps but not the thing that it is about. Is a specific person’s memory of another person a particular thing because it relates to a particular person, or does it exist only under the larger and more various banner of “memories”? Either way, it’s immaterial, because even though the human imagination is a thing that definitely exists, the idea of a flying car is no more a flying car than Leonardo da Vinci’s drawing of a flying machine was a flying machine, or than my memory of honey-and-toast kisses is a honey-and-toast kiss.

If you or I picture a human being with electric blue skin, we can imagine it and if we have the talent we can draw it, someone could depict it in a film, but it wouldn’t be the thing itself, because human beings with electric blue skin, like space dolphins, personal teleportation devices, seas of blood, winged horses, articulate sentient faeces and successful alchemical experiments, don’t exist. And depending on the range of your imagination (looking at that list mine seems a bit limited), you could think of infinite numbers of things that don’t exist. There are also, presumably, untold numbers of things that do exist but that we personally don’t know about or that we as a species don’t know about yet. But even if it was possible to make a complete list of all of the things in existence (or things in existence to date; new things are invented or develop or evolve all the time), it would always be possible to think of even more things that don’t exist – simply, in the least imaginative way, by naming variations on, or parodies of, everything that does exist. So supermassive black holes exist? Okay, but what about supertiny pink holes? What about supermedium beige holes? This June, a new snake (disappointingly named Ovophis jenkinsi) was discovered. But what about a version of Ovophis jenkinsi that sings in Spanish or has paper bones or smells like Madonna? They don’t exist.

JAMC Honey’s Dead, 1992

Kind of a creepy segue if you think about it (so please don’t), but like those beautifully-shaped lips that tasted of honey, my mother no longer exists, except as a memory, or lots of different memories, belonging to lots of different people. Presumably she exists in lots of memories as lots of different people who happen to have the same name. But unlike supermedium beige holes, the non-existence of previously-existing things and people is complex, because of the different perspectives they are remembered from. But regardless, they are still fundamentally not things anymore. But even with the ever-growing, almost-infinite number of things, there are, demonstrably, more things that don’t exist.

And, without wishing to be horribly negative or to repeat things I’ve written before, one of the surprises with the death of a close relative was to find that death does exist. Well, obviously, everyone knows that – but not just as an ending or as the absence of life, as was always known, but as an active, grim-reaper-like force of its own. For me, the evidence for that – which I’m sure could be explained scientifically by a medical professional – is the cold that I mentioned in the previous article. Holding a hand that gets cold seems pretty normal; warmth ebbing away as life ebbs away; that’s logical and natural. But this wasn’t the expected (to me) cooling down of a warm thing to room temperature, like the un-drunk cups of tea which day after day were brought and cooled down because the person they were brought for didn’t really want them anymore, just the idea of them. That cooling felt natural, as did the warming of the glass of water that sat un-drunk at the bedside because the person it was for could no longer hold or see it. That water had been cold but had warmed up to room temperature; the cold in the hand, though, wasn’t just a settling in line with ambient conditions. It was active cold; hands chilling and then radiating cold in quite an intense way, a coldness that dropped far below room temperature. I mentioned it to a doctor during a brief, unbelievably welcome break to get some air, and she said “Yes, she doesn’t have long left.” Within a few days I wished I’d asked for an explanation of where that cold was coming from; where is it generated? Which organ in the human body can generate cold so quickly and intensely? Does it do it in any other situations? And if not, why not?

So, although death can seem abstract, in the same sense that ‘life’ seems abstract, being big and pervasive, death definitely exists. But as what? Don’t know; not a single entity, since it’s incipient in everyone, coded into our DNA: but that coding has nothing to do with getting hit by cars or drowning or being shot, does it? So, a big question mark to that. Keats would say not to question it, just to enjoy the mystery. Well, alright then.

Klaus Nomi as “the Cold Genius” from his 1981 version of Purcell’s “The Cold Song”

But since most things *don’t* exist, while death definitely does exist, existence is, in universal terms, rare enough to be something like winning the lottery. But like winning the lottery, existence in itself is not any kind of guarantee of happiness or satisfaction or even honey-and-toast kisses; but it at least offers the possibility of those things, whereas non-existence doesn’t offer anything, not even peace, which has to be experienced to exist. We have all not existed before and we will all not exist again; but honey will still be here, for as long as bees are at least. I don’t know if that’s comforting or not. But if you’re reading this – and I’m definitely writing it – we do currently exist, so try to enjoy your lottery win, innit.

Something silly about music next time I think.

Ancient Roman vanitas mosaic showing a skull and the wheel of fortune

who’d have them?

My mother died just about a month ago, and I think she/her death is taking up too much space in my conscious mind to trouble my subconscious or unconscious self too much. It’s interesting to note that even though death is one of the central themes of much of the most important art ever created, and although I am someone with an interest in Art, in the capital A, “high culture” sense, what came into my mind in that room, while holding her hand, was actually a line from a song which turned out to have an accuracy I didn’t realise until then: “it’s so cold, it’s like the cold if you were dead.”* Mum wouldn’t have liked that. And if she wasn’t dead I probably wouldn’t be posting what follows online, even though there’s nothing in it she would object to and even though, as far as I’m aware, she never read a word I wrote: which sounds petulant but it’s not a complaint. Our parents know us too well in one way to want them to know us in other ways, or at least that’s how I think I feel about it.

*Plainsong by The Cure, which luckily I’ve barely been able to stand for many years although I really do love it.

Max Ernst – Max Ernst Showing a Young Girl the Head of his Father (1926/7)

Anyway, last night, for the first time in what feels like decades, I dreamed about my dad. The dream was full of vivid, long forgotten details, most of which almost immediately receded back into the murk of subconscious memory on waking. Not all of them though; how could I have forgotten his strangely hissing laugh (less sinister than it sounds)? But waking up, what was lurking in my mind as the dream faded was, of all things – pop culture strikes again – lines from Stephen King’s IT (which mum read, but dad didn’t; he was squeamish about horror) and a feeling of dread that wasn’t terrifying or even upsetting, just somehow inevitable and in some way kind of comfortable.

That quote comes from a scene in the book when the young protagonists come across the monster, Pennywise, in an old newspaper clipping from 1945. I had no idea that I had absorbed this paragraph, or at least its final lines, first read when I was 14, completely enough to have known it almost word for word, but there it was (I’ve included the whole paragraph for sense):

The headline: JAPAN SURRENDERS – IT’S OVER! THANK GOD IT’S OVER! A parade was snake-dancing its way along Main Street toward Up-Mile Hill. And there was the clown in the background, wearing his silver suit with the orange buttons, frozen in the matrix of dots that made up the grainy newsprint photo, seeming to suggest (at least to Bill) that nothing was over, no one had surrendered, nothing was won, nil was still the rule, zilch still the custom; seeming to suggest above all that all was still lost.  

Stephen King, IT, 1986, p.584 (in the edition I have)

Pietà or Revolution by Night (1923)

Which is not really fair; dad had his faults but he was not a shape-shifting alien clown that ate kids. And anyway, it wasn’t even a nightmare as such. Details are receding – and have almost vanished even since I made the original note this morning – but essentially nothing bad happened: we were in a house, dad was there, my siblings were there, offering eye-rolling ‘he’s annoying but what can you do?’ support; but what lingers is the last phase before waking – an interminably long, drawn out scene where I was attempting, unsuccessfully, to make coffee for everyone in an unfamiliar kitchen, but couldn’t find the right spoon, with dad behind me watching with condescending amusement and laughing that hissing laugh. And then I woke up to a Stephen King quote. So thanks for that, brain. One of the hardest lessons to learn and re-learn is that other people are none of your business, or to put it less negatively, that you have no claim on any other human being and they have no claim on you. Except for your parents of course; but that’s that dealt with anyway.

nostalgia isn’t going to be what it was, or something like that

When I was a child there was music which was, whether you liked it or not, inescapable. I have never – and this is not a boast – deliberately or actively listened to a song by Michael Jackson, Madonna, Phil Collins, Duran Duran, Roxette, Take That, Bon Jovi, the Spice Girls… the list isn’t endless, but it is quite long. And yet I know some, or a lot, of songs by all of those artists. And those are just some of the household names. Likewise I have never deliberately listened to “A Horse With No Name” by America, “One Night in Bangkok” by Murray Head or “Would I Lie to You” by Charles & Eddie; and yet, there they are, readily accessible should I wish (I shouldn’t) to hum, whistle or sing them, or just have them play in my head, which I seemingly have little control over.

Black Lace: the unacceptable face(s) of 80s pop

And yet, since the dawn of the 21st century, major stars come and go, like Justin Bieber, or just stay, like Ed Sheeran, Lana Del Rey or Taylor Swift, without ever really entering my consciousness or troubling my ears. I have consulted with samples of “the youth” to see if it’s just me, but no: like me, they have mental images of the major stars, but unless they have actively been fans, they couldn’t necessarily tell you the titles of any of their songs and have little to no idea of what they actually sound like. Logical, because they were no more interested in them than I was in Dire Straits or Black Lace; but alas, I know the hits of Dire Straits and Black Lace. And the idea of ‘the Top 40 singles chart’ really has little place in their idea of popular music. Again, ignorance is nothing to be proud of and I literally don’t know what I’m missing. At least my parents could dismiss Madonna or Boy George on the basis that they didn’t like their music. It’s an especially odd situation to find myself in as my main occupation is actually writing about music; but of course, nothing except my own attitude is stopping me from finding out about these artists.

The fact is that no musician is inescapable now. Music is everywhere, and far more accessibly so than it was in the 80s or 90s – and not just new music. If I want to hear Joy Division playing live when they were still called Warsaw or track down the records the Wu-Tang Clan sampled or hear the different version of the Smiths’ first album produced by Troy Tate, it takes about as long to find them as it does to type those words into your phone. Back then, if you had a Walkman you could play tapes, but you had to have the tape (or CD – I know CDs are having a minor renaissance, but is there any more misbegotten, less lamented creature than the CD Walkman?). Or you could – from the 1950s onwards – carry a radio with you and listen to whatever happened to be playing at the time. I imagine fewer people listen to the radio now than they did even 30 years ago, but paradoxically, though there are probably many more – and many more specialised – radio stations now than ever, their specialisation actually feeds the escapability of pop music. Because if I want to hear r’n’b or metal or rap or techno without hearing anything else, or to hear 60s or 70s or 80s or 90s pop without having to put up with their modern-day equivalents, then that’s what I and anyone else will do. I have never wanted to hear “Concrete and Clay” by Unit 4+2 or “Agadoo” or “Come On Eileen” or “Your Woman” by White Town or (god knows) “Crocodile Shoes” by Jimmy Nail; but there was a time when hearing things I wanted to hear but didn’t own meant running the risk of being subjected to these, and many other, unwanted songs. As I write these words, “Owner of a Lonely Heart” by Yes, a song that until recently I didn’t know I knew, is playing in my head.

And so, the music library in my head is bigger and more diverse than I ever intended it to be. In a situation where there were only three or four TV channels and a handful of popular radio stations, music was a kind of lingua franca for people, especially for young people. Watching Top of the Pops on a Thursday evening, or later The Word on a Friday, was so standard among my age group that you could assume that most people you knew had seen what you saw; that’s a powerful experience – not necessarily a bonding one, but a bond of sorts – that I don’t see an equivalent for now, simply because even if everyone you know watches Netflix, there’s no reason for them to have watched the same thing at the same time as you did. It’s not worse, in some ways it’s obviously better; but it is different. Of course, personal taste back then was still personal taste, and anything not in the mainstream was obscure in a way that no music, however weird or niche, is now obscure, but that was another identity-building thing, whether one liked it or not.

Growing up in a time when this isn’t the case, and the only music kids are subjected to is the taste of their parents (admittedly, a minefield) or fragments of songs on TV ads (if they watch normal TV) or on TikTok (if they happen to use TikTok), is a vastly different thing. Taylor Swift is as inescapable a presence now as Madonna was in the 80s, but her music is almost entirely avoidable, and it seems probable that few teenagers who are entirely uninterested in her now will find her hits popping unbidden into their heads in middle age. But conversely, the kids of today are more likely to come across “Owner of a Lonely Heart” on YouTube than I would have been to hear one of the big pop hits of 1943 in the 80s.

Faye Dunaway as Bonnie Parker; a little bit 1930s, a lot 1960s

What this means for the future I don’t know; but surely its implications for pop-culture nostalgia – which has grown from its humble origins in the 60s into an all-encompassing industry – are huge. In the 60s, there was a brief fashion for all things 1920s and 30s which prefigured the waves of nostalgia that have happened ever since. But for a variety of reasons, some technical, some generational and some commercial, pop culture nostalgia is far more elaborate than ever before. We live in a time when constructs like “The 80s” and “The 90s” are well-defined, marketable eras that mean something to people who weren’t born then, in quite a different way from the 1960s version of the 1920s. Even back then, the entertainment industry could conjure bygone times with an easy shorthand; the 1960s version of the 1920s and 30s meant flappers and cloche hats and Prohibition and the Charleston, and was evoked on records like The Beatles’ “Honey Pie”, seen onstage in The Boy Friend and in the cinema in Bonnie & Clyde. But the actual music of the 20s and 30s was mostly not relatable to youngsters in the way that the actual entertainment of the 80s and 90s still is. Even if a teenager in the 60s did want to watch actual silent movies or listen to actual 20s jazz or dance bands, they would have to find some way of accessing them. In the pre-home video era that meant relying on silent movie revivals in cinemas, or finding old records and having the right equipment to play them on, since old music was then only slowly being reissued in modern formats. The modern teen who loves “the 80s” or “the 90s” is spoiled by comparison, not least because those decades’ major movie franchises like Star Wars, Indiana Jones, Ghostbusters and Jurassic Park are still around and their major musical stars still tour, or at least have videos and back catalogues that can be accessed online, often for free.

Supergrass in 1996: a little bit 60s, a lot 70s, entirely 90s

Fashion has always been cyclical, but this feels quite new (which doesn’t mean it is though). Currently, culture feels not like a wasteland but like Eliot’s actual Waste Land, a dissonant kind of poetic collage full of meaning and detritus and feeling and substance and ephemera but at first glance strangely shapeless. For example, in one of our current pop culture timestreams there seems to be a kind of 90s revival going on, with architects of Britpop like the Gallagher brothers and Blur still active, and even minor bands like Shed Seven not only touring the nostalgia circuit but actually getting in the charts. Britpop was notoriously derivative of the past, especially the 60s and 70s. And so, some teenagers and young adults (none of these things being as pervasive as they once were) are now growing up in a time when part of ‘the culture’ is a version of the culture of the 90s, which had reacted to the culture of the 80s by absorbing elements of the culture of the 60s and 70s. And while the artists of 20 or 30 years ago refuse to go away, even modern artists, from alternative rock to mainstream pop stars, make music infused with the sound of 80s synths and 90s rock, and so on and on. Nothing wrong with that of course, but what do you call post-post-modernism? And what will the 2020s revival look like when it rears its head in the 2050s, assuming there is a 2050s? Something half interesting, half familiar no doubt.

an alan smithee war

an annoying but perhaps necessary note: “Alan (or Allan, or Allen) Smithee” is a pseudonym used by Hollywood film directors when they wish to disown a project

Watch out, this starts off being insultingly elementary, but then gets complicated and probably contradictory, quite quickly.

Countries, States and religions are not monoliths and nor are they sentient. They don’t have feelings, aims, motivations or opinions. So whatever is happening in the Middle East isn’t ‘Judaism versus Islam’ or even ‘Israel versus Palestine’, any more than “the Troubles”* were/are ‘Protestantism versus Catholicism’ or ‘Britain versus Ireland’.

* a euphemism which, like most names for these things, is partly a method of avoiding blame – as we’ll see

Places and atrocities aren’t monoliths either; Srebrenica didn’t massacre anybody**, the Falkland Islands didn’t have a conflict, ‘the Gulf’ didn’t have any wars and neither did Vietnam or Korea. But somebody did. As with Kiefer Sutherland and Woman Wanted in 1999, or Michael Gottlieb and The Shrimp on the Barbie in 1990, or whoever it was that directed Gypsy Angels in 1980, nobody wants to claim these wars afterwards. But while these directors have the handy pseudonym Allan Smithee to use, there is no warmongering equivalent, and so what we get is geography, or flatly descriptive terms like ‘World War One’, which divert the focus from the aggressor(s), with only the occasional exception (the American War of Independence) even referencing the real point of the war. But, whether interfered with by the studio or not, Kevin Yagher did direct Hellraiser: Bloodline, just as certain individuals really are responsible for actions which are killing human beings as you read this. Language and the academic study of history will probably help to keep their names quiet as events turn from current affairs into history. Often this evasion happens for purely utilitarian reasons, perhaps even unknowingly, but sometimes it is more sinister.

** see?

As the 60s drew to its messy end, the great Terry “Geezer” Butler wrote lines which, despite the unfortunate repeat/rhyme in the first lines, have a Blakean power and truth:

Generals gathered in their masses
Just like witches at black masses

Black Sabbath, War Pigs, 1970

There is something sinister and even uncanny in the workings of power, in the distance between the avowed and the underlying motivations behind military action. Power politics feels like it is – possibly because intuitively it seems like it should be – cold and logical, rather than human and emotional. It doesn’t take much consideration though to realise that even beneath the chilly, calculated actions of power blocs there are weird and strangely random human desires and opinions, often tied in with personal prestige, which somehow seems to that person to be more important than not killing people or not having people killed.

Anyway, Geezer went on to say:

Politicians hide themselves away
They only started the war
Why should they go out to fight?
They leave that role to the poor

Still Black Sabbath, War Pigs (1970)

And that’s right too; but does that mean Butler’s ‘poor’ should take no responsibility at all for their actions? In the largest sense they are not to blame for war or at least for the outbreak of war; and conscripts and draftees are clearly a different class again from those who choose to “go out to fight.” But. As so often, WW2 is perhaps the most extreme and therefore the easiest place to find examples: whatever his orders or reasons, the Nazi soldier (and there were lots of them) who shot a child and threw them in a pit actually did shoot a child and throw them in a pit. His immediate superior may have done so too, but not to that particular child. And neither did Himmler or Adolf Hitler. Personal responsibility is an important thing, but responsibility, especially in war, isn’t just one act and one person. Between the originator and the architects of the Final Solution and the shooter of that one individual child there is a chain of people, any one of whom could have disrupted that chain and, even if only to a tiny degree, affected the outcome. And that tiny degree may have meant that that child, that human being, lived or died. A small thing in a death toll of something over 6 million people; unless you happen to be that person, or related to that person.

As with the naming of wars and atrocities, terms like “genocide” and “the Holocaust” are useful, especially if we want – as we clearly do – to have some kind of coherent, understandable narrative that can be taught and remembered as history. But in their grim way, these are still euphemisms. The term ‘the Holocaust’ memorialises the countless – actually not countless, but still, nearly 80 years later, being counted – victims of the Nazis’ programme of extermination. But the term also makes the Holocaust sound like an event, rather than a process spread out over the best part of a decade, requiring the participation of probably thousands of people who exercised – not without some form of coercion perhaps, but still – their free will in that participation. The Jewish scholar Hillel the Elder’s famous saying, “whosoever saves a life, it is as though he had saved the entire world”, is hard to argue with, insofar as the world only exists for us within our perceptions. Even the knowledge that it is a spinning lump of inorganic and organic matter in space, and that other people populate it who might see it differently, only exists in our perceptions. Or at least try to prove otherwise. And so the converse of Hillel’s saying – which is actually included in it but far less often quoted – is “whosoever destroys one soul, it is as though he had destroyed the entire world.” Which sounds like an argument for pacifism, but while pacifism is entirely viable and valuable on an individual basis as an exercise of one’s free will* – and on occasion has a real positive effect – one-sided pacifism relies on its opponents not taking a cynically Darwinian approach, which is hopeful at best. Pacifism can only really work if everyone is a pacifist, and everyone isn’t a pacifist.

*the lone pacifist can at least say, ‘these terrible things happened, but I took no part in them’, which is something, especially if they used what peaceful means they could to prevent those terrible things and didn’t unwittingly contribute to the sum total of suffering; but those are murky waters to wade in.

But complicated though it all is, people are to blame for things that happen. Just who to blame is more complicated – more complicated at least than the workable study of history can afford to admit. While countries and religions are useful as misleading, straw man scapegoats, even the more manageable unit of a government is, on close examination, surprisingly hard to pin down. Whereas (the eternally handy example of) Hitler’s Nazi Party or Stalin’s Council of People’s Commissars routinely purged heretics, non-believers and dissidents, thus acting as a genuine, effective focus for their ideologies and therefore for blame and responsibility, most political parties allow for a certain amount of debate and flexibility and therefore blame-deniability. Regardless, when a party delivers a policy, every member of that party is responsible for it, or should publicly recuse themselves from it if they aren’t.

The great (indeed Sensational) Scottish singer Alex Harvey said a lot of perceptive things, not least: “[Something] I learned from studying history. Nobody ever won a war. A hundred thousand dead at Waterloo*. No glory in that. Nobody needs that.” Nobody ever won a war; but plenty of people, on both sides of every conflict, have lost one – and, as the simple existence of a second world war attests, many, many people have lost a peace too.

*Modern estimates put it at ‘only’ 11,000 plus another 40,000 or so casualties; but his point stands

But the “causes” of war are at once easily traced and extremely slippery. Actions like the 1939 invasion of Poland by the armies of Germany and the USSR were, as military actions still are, the will of certain individuals, agreed to by other individuals and then acted upon accordingly. You may or may not agree with the actions of your government or the leaders of your faith. You may even have had some say in them, but in most cases you probably haven’t. Some of those dead on the fields of Waterloo were no doubt enthusiastic about their cause, some probably less so. But very few would have had much say in the decisions which took them to Belgium in the first place.

The buck should stop with every person responsible for wars, crimes, atrocities; but just because that’s obviously impossible to record – and, even if it wasn’t, too complex to write in a simple narrative – that doesn’t mean the buck should simply not stop anywhere. History being written by the winners often means that guilt is assigned to the losers, but even when that seems fair enough (there really wouldn’t have been a World War Two without Hitler) it’s a simplification (there wouldn’t have been an effective Hitler without the assistance of German industrialists) and a one-sided one (it was a World War because most of the leading participants had already waged unprovoked wars of conquest). That was a long sentence. But does the disgusting history of Western colonialism, the arguably shameful treatment of Germany by the Allies after WW1 and the dubious nature of the allies and some of their actions make Hitler himself any less personally responsible for the war? And does Hitler’s own guilt make the soldier who shoots a child or unarmed adult civilians, or the airman who drops bombs on them, any less responsible for their own actions?

Again: only human beings do these things, so the least we can do is not act like they are some kind of unfathomable act of nature when we discuss them or name them. Here’s Alex Harvey again: “Whether you like it or not, anybody who’s involved in rock and roll is involved in politics. Anything that involves a big crowd of people listening to what you say is politics.” If rock and roll is politics, then actual politics is politics squared; and for as long as we settle, however grudgingly or complacently, for pyramidal power structures for our societies, the person at the top of that pyramid, enjoying its vistas and rarefied air, should be the one to bear its most sombre responsibilities. But all who enable the pyramid to remain standing should accept their share of it too.

So when you’re helplessly watching something that seems like an unbelievable waste of people’s lives and abilities, pay close attention to who’s doing and saying what, even if you don’t want to, because the credits at the end probably won’t tell you who’s really responsible.