some kind of loss

I remember when Robert Redford’s 1994 film Quiz Show was released, much of the publicity focussed on the idea that a relatively everyday scandal – the discovery in 1957–8 that a popular TV quiz show was rigged – marked the loss of the USA’s innocence. At the time, 20-year-old me could not have been more scornful. The idea that the innocence, whatever that meant, of 160+ million American citizens had somehow survived the relatively recent dropping of two atom bombs on civilians, the uncovering of the Holocaust and the filmed and widely publicised Nuremberg Trials, ongoing racial segregation and lynchings (Emmett Till, the last lynching victim recorded by the Tuskegee Institute – though not the last lynching, by a long way – was murdered aged 14 just three years earlier), the fighting of a war in Korea under the pretence of a ‘police action’ and the rise of and eventual disgust with McCarthyism, but that the scales then dropped from their eyes and the foundations of their way of life began to crumble when the things they were offered as entertainment turned out to be just entertainment and not some kind of bastion of morality and fairness, seemed laughable at best.

Quiz Show (1994)

I can still see my younger self’s point of view; it doesn’t take much consideration to realise that innocence, whether on a personal or a societal level, is a dangerous fetish. And innocence itself is less often a real state and more often an illusion or chimera – or just a point of view. It’s at best a slippery concept, whose opposite can be guilt, corruption or just experience, none of which are precisely the same thing. The final verse of Philip Larkin’s 1964 poem ‘MCMXIV’, about the outbreak of World War One half a century earlier, is justly famous and has a kind of intuitive truth to it:

Never such innocence,
Never before or since,
As changed itself to past
Without a word – the men
Leaving the gardens tidy,
The thousands of marriages,
Lasting a little while longer:
Never such innocence again.

Philip Larkin, ‘MCMXIV’, The Whitsun Weddings, Faber & Faber, 1964, p.28

enlistees in London at the outbreak of WW1 (Imperial War Museum)

Truth, because we know now (and Larkin knew then) what the young men lining up to enlist in the army in 1914 didn’t: that they were about to enter a modern world entirely different from the late Victorian one they had grown up in. But illusion too, because the horrific brutality of WW1 was new only insofar as the people facing the onslaught of modern weaponry were white. There is the scale of it; but though tanks, explosive shells and machine guns firing 500–600 rounds per minute created unprecedented levels of slaughter, the question of whether two sides using such weapons against each other is ‘worse’ than people armed with guns attacking people armed with spears or swords or bows and arrows doesn’t seem worth answering. It’s probably not un-worse, at least.

Even leaving aside the imperialism of the WW1 combatants, the innocence of the Britain that the enlistees were queuing up to leave was dubious at best. The previous few years had been marked by the fight for women’s suffrage and the brutalities inflicted on suffragettes by the authorities, not to mention (the usual) grotesque levels of poverty and inequality; there is more than one reason that so many young men were keen to join the army. Even on a smaller, more localised scale, early 20th century Britain was full of strangely archaic, Tintin-like episodes that seem to have a quaint kind of innocence now which they definitely didn’t have at the time: Latvian anarchists, terrorism, gang violence, the Siege of Sidney Street. And these things were happening all over Europe: the First World War didn’t come out of nowhere.

In a famous riposte to George Orwell on the subject of weekly magazines for boys, Frank Richards, the author of the Billy Bunter series, wrote:

Probably I am older than Mr Orwell: and I can tell him the world went very well then [in 1910]. It had not been improved by the Great War, the General Strike, the outbreak of sex-chatter, by make-up or lipstick, by the present discontents [World War Two], or by Mr Orwell’s thoughts upon the present discontents!

Frank Richards responds: Collected Essays, Journalism and Letters of George Orwell, Volume 1: An Age Like This, Penguin, 1970, p.532

Fair enough, but when Frank Richards (real name Charles Hamilton) was 12 years old, the British Empire was at its height and Jack the Ripper was murdering prostitutes in London. Probably his parents were children during the period when Chartist protesters were being killed by the army, and their parents would have been alive during the period of the Napoleonic Wars and the Peterloo Massacre. Which doesn’t make the First World War any less horrific, but you might as well say the sinking of the Titanic caused the loss of Britain’s innocence.

Advertisement for an account of the Peterloo Massacre

But I’m no better. Even though I wouldn’t use the phrase ‘loss of innocence’, to me it seems like the world has never felt quite the same since 9/11. I’d be fooling myself if I said things were in any real sense better beforehand, and as with WW1, the events of that day didn’t come out of nowhere; it was as much a culmination as it was a beginning. But still, there’s a certain kind of low-level dread that emerged (in me at least) then and which, ever since, always seems to be within easy reach. It came to the fore again in 2004 when the photos of the torture and abuse of prisoners at Abu Ghraib were released, and then again in the early 2010s with the violent rise of the Islamic State, the series of filmed beheadings of westerners that appeared online and on the news, and the murder of Lee Rigby in London. But it’s not a feeling that’s exclusive to Middle East-related matters, and it recurs like a migraine; every white supremacist rally, every attack on a mosque or synagogue, every time the media normalises far-right politics, every new announcement of US government policy brings with it a hint of that particular combination of heavy misery and pit-of-the-stomach dread.

Thomas Hoepker’s notorious 2001 photo of nonchalant 9/11 witnesses

Were things ‘better’ before 9/11? It depends on what you mean and who you are. For me, I was a teenager living in a peaceful and relatively prosperous country, so my specific worries, even the ones that felt existentially soul-crushing at the time, were probably pretty trivial. And it wasn’t entirely a new feeling; I had felt apprehensive and angry in the run-up to the Gulf War in 1990, and before every general election, but that anxiety didn’t really linger in my day-to-day teenage life.

echhh

The same year that Quiz Show was released, I remember watching the unfolding story of Fred and Rosemary West’s murders on breakfast TV for several days in a row with a sense of outraged horror, but for me it didn’t have the lasting, polluting effect of 9/11. I remember watching with actual disbelief (I think the only time I’ve experienced that, apart from seeing 9/11 itself on the news) when breakfast TV broke the story of Princess Diana’s death, and then with irritated disbelief in the days (or weeks?) that followed, at Britain’s reaction to it. Again, that was a different thing. In 1910, when the world ‘went very well’, Paris flooded, the French government massacred protesters in Côte d’Ivoire, Albania revolted against Ottoman rule and Boutros Ghali was assassinated in Cairo, while in Britain King Edward VII died and George V came to the throne, Dr Crippen murdered his wife and was caught and executed, 300 suffragettes fought with police outside Parliament and Captain Scott set off on the British Antarctic Expedition; but Frank Richards was a successful author who had established two very popular weekly papers for children for which he wrote humorous stories; probably life seemed pretty good.

This isn’t nostalgia, exactly. In middle age, when their time is poisoned by as yet unforeseen anxieties, will the teenagers of today look back wistfully at a period when there were still Palestinian people living (however precariously) in Palestine, or when the weather was hot at one predictable time of year and cold at another, and think of it as a better world they once knew, or just a different one? Who knows. At the moment, just catching up with the news every morning feels more and more like doomscrolling and the headlines feel increasingly like “The preparations for Hate Week were in full swing.” Around the time of the 9/11 attacks in 2001, the internet was just beginning to be a normal presence in almost everybody’s home, long before it was something they carried in their pockets and their hands. Since then, everyone has access to everything and, in the words of British heavy metal stalwarts Saxon, innocence is no excuse. Or at least it’s a willed, deliberate choice. But maybe it always was. Is innocence anything to aspire to outside of a court case anyway? I don’t know.

A conversation from Art Spiegelman’s Maus, a Pulitzer Prize-winning book, but one that was recently removed from some American schools, presumably to preserve some kind of innocence, springs to mind:

“Many younger Germans have had it up to HERE with Holocaust stories. These things happened before they were even born. Why should THEY feel guilty?”

“Who am I to say? But a lot of the corporations that flourished in Nazi Germany are richer than ever. I dunno… Maybe EVERYONE has to feel guilty. EVERYONE! FOREVER!”

Art Spiegelman, Maus: A Survivor’s Tale (The Complete Maus, Penguin Books, 2003), p.202


confessions of a godless heathen

Percy Bysshe Shelley, 1819, by Amelia Curran

Ignore the sensationalist headline; there are no confessions here, and I’m not a heathen, I’m an atheist. When I was a teenage atheist, one of my main issues with the idea of god had been neatly summed up well over a century earlier by Shelley in The Necessity of Atheism (1811):

If God wishes to be known, cherished, thanked, why does he not show himself under his favourable features to all these intelligent beings by whom he wishes to be loved and adored? Why not manifest himself to the whole earth in an unequivocal manner, much more capable of convincing us than these private revelations which seem to accuse the Divinity of an annoying partiality for some of his creatures? The all−powerful, should he not have more convincing means by which to show man than these ridiculous metamorphoses, these pretended incarnations, which are attested by writers so little in agreement among themselves?

As an adult atheist I still think that, but I think a lot of other things too. I should possibly point out here that though I don’t believe in any deities, the god I primarily didn’t and don’t believe in was the Christian one, simply because that’s the one who most prominently didn’t and doesn’t exist in my own personal experience. My lack of any kind of religious belief is something I’ve given a lot of thought to over the years and mentioned many times in passing on this website. I’ve never written specifically about it, but several things I’ve recently come across made me want to. One is the slightly dubious, clickbaity claim that (as one headline put it) “God is back” and that Gen Z (or some such amorphous group) is embracing the Catholic church. I’m sure that to some extent that’s true, as the Catholic church is just as evident as ever – the choosing of a new Pope is TV news, and so on – but it’s also true that there have been other, substantially similar news stories about Gen Z embracing astrology and conspiracy theories and feminism and anti-feminism and fretting about world war three. None of those things are mutually exclusive of course (most of them should be; maybe feminism & anti-feminism actually are), and what it seems to add up to is the kind of end-times malaise normally associated with the end of a century or millennium.

I feel like it’s necessary to take those kinds of stories with a pinch of salt though, simply because over the years I’ve read all kinds of similar stories about Gen X which occasionally apply to me and often don’t, but in either case I’ve never been asked my opinion in order to gauge it and neither I presume have most people. And since every generation seems to spawn its own Nazis, centrists, communists and anti-fascists and everything in between, its philanthropists, misanthropes and bystanders, its religious zealots, libertines and atheists (etc, etc, ad nauseam), it seems fair to assume that any theory about a generation, just like any theory about a gender, race or sexuality is going to involve the kinds of generalisations which, once really examined, make the whole theory redundant. Presumably, church attendances are on the rise, but does that mean that belief is on the rise, or just that the desire for belief – quite a different thing – is? Or both? Who knows.

Alongside that, not coincidentally, more and more (inevitably right wing) politicians have been yammering on, at first in the USA and now here, about “Judeo-Christian” values. It seems that this is mostly because they don’t like foreigners and Islam, and are immune to irony. Because insisting on the values of two ancient foreign religions from what we now in the West call the Middle East, while denying the very similar values of another, very similar (though not quite as ancient) religion also from what we now call the Middle East, does seem ironic, especially when one is tying it in with one’s national identity. There’s been a growing rhetoric (again, on the right) that suggests that Christians are becoming an oppressed minority in the UK, which is both tiresome and laughable but nicely (and again not coincidentally) complements the growth of a men’s rights movement that claims feminism (which, like atheism, has arguably only recently begun to have even a fairly minor influence on the power structures underlying British society) has ‘gone too far’ and all that fun stuff.

Although my attitude has changed over the years, I don’t think my views really have. I genuinely think that it’s terrible and damaging that all over the world people are punished or ostracised or oppressed or killed or made to feel bad about themselves for offending arbitrary rules established in the name of imaginary beings. And in a way worse is the idea that there are omniscient, omnipotent beings who would be offended by actions which they must have foreseen at the moment of creation but decided to allow anyway, in order to punish the people who commit them.

That kind of thing seems to be the basis of a lot of atheist polemic. Sometimes I find it entertaining and (depending on the writer) interesting, but, even while still believing every word of it, and feeling that it’s worth insisting on if asked about my views, as a middle aged atheist I wonder about the usefulness of saying it polemically at all. Because – for me at least – the opposite of religious faith isn’t science and logic (though I do believe in those), it’s simply non-faith. And I’m not sure there’s much to learn from that.

It’s not an argument that strengthens any cause, let alone mine, but I have come to think that lack of belief in a god or gods is just as instinctive, reflexive and fundamental as faith in them is. My mother was a Christian in her youth (in an atheist household, oddly for the 1950s) to the point where she considered becoming a nun. During her life, she wavered from various kinds of Christianity, to Taoism and Buddhism and a kind of vague paganism, but – and I think this is the most important point – although she lost her faith in many belief systems over the years, she never lost her essential faith in some kind of benevolent god or spirit at the heart of creation. For me it’s almost the opposite.

I have always been very interested in religions, from Animism to Zoroastrianism, in the exact same way that I’ve always been interested in mythology (I don’t really distinguish between the two), and I find pretty much all religions to some degree fascinating. I love churches and places of worship, I love the atmosphere of ‘holy’ places (even pre-historic places we now assume were once sacred) and I love the imagery and paraphernalia of religions, in the exact same way I love art and history. But it’s good that I’ve never wanted to belong to a faith or to become involved with those mythologies, because I can’t remember a time when I ever believed in even the possibility that a deity of any kind was an actual, real thing. Santa Claus either, for that matter, although presumably at some pre-remembered point I did believe in him (Him?).

I have no idea where my lack of faith came from, but I can pinpoint when I first became aware of it. I went to three ordinary Scottish primary schools, which in the 1980s meant reciting the Lord’s Prayer every morning before class started. Not surprisingly, I still remember most of it, though mysteriously I can’t work out which bit I thought in my childhood mentioned snot; I was quite deaf then, but I definitely remember a snot reference, which always seemed odd. In my memory that daily recital was just part of a greater daily ritual which also involved (in the early years) chanting the alphabet and (through all of primary school) greeting the teacher in monotone unison: “GOOD MOR-NING MI-SIZ WAT-SON” (the phonetic version of Mrs expresses it more accurately), or whoever the teacher happened to be – seemingly there were no male primary school teachers in my day.

I have surprisingly sharp memories of looking round the class during the morning prayer to see who else didn’t have their eyes closed – there were usually a few of us, and sometimes we would try to make each other laugh – but a key part of that memory for me is the sureness of the feeling that I wasn’t talking to anybody. The praying itself wasn’t something I questioned or minded – if anything I quite liked it. It didn’t feel at all ‘bad’ or rebellious not to believe, it just never occurred to me at any point that god was real and might be listening, any more than I remember feeling that the notes put up the chimney to Santa would be read by an old man with a red suit and white beard, or that the carrot for Rudolph would be eaten by an actual reindeer.

At school we went to church (I think) three times a year – at Christmas, Easter and (an anomaly) Harvest Festival – and so folk horror-ish paraphernalia like corn dollies are always associated with church in my mind. The sermons were boring, as were some of the hymns, although others, the ones where the kids invariably sang the wrong lyrics, were fun – but I liked (and like) churches. I liked the musty, chilly smell and the wooden pews and the acoustics and the stained glass windows and especially the holiday feeling of being at school but not at school. And, though they only came into school life at these times of year, I liked the Bible stories too. It seems funny now, but until well into adulthood the image that the word ‘Palestine’ summoned in my mind was an illustration of Jesus wandering around in pink and turquoise robes; I presume it’s from some forgotten book of Bible stories. But to me, stories – sometimes good ones (in the case of the early days of Moses and the last days of Jesus, very good ones), sometimes boring ones – are all that they were.
But where does lack of belief come from? The same place, presumably, as belief.

Bowie in 1976 by Michael Marks

In Word on a Wing (1976), one of my favourite David Bowie songs – also I think one of his most deeply felt and certainly one of his most open and revealing songs – Bowie, then in LA and in the middle of a drug-fuelled existential crisis but soon to withdraw to Berlin to live a relatively austere and private life, sings:
Just because I believe
Don′t mean I don′t think as well
Don’t have to question everything in heaven or hell


For me, that sums up (non-blind) faith perfectly. Essentially, it’s what Keats (those Romantics again!) summarised as ‘negative capability’ (“Negative Capability, that is, when a man is capable of being in uncertainties, mysteries, doubts, without any irritable reaching after fact and reason” – from an 1817 letter to his brothers) but applied to one of the most fundamental human impulses. I completely respect it and see what both Keats and Bowie mean by it, but it’s completely alien to me. Well, not completely: I don’t need to know how a jet engine works to travel by plane, I do indeed have ‘faith’ in it, but what the (nowadays many) commentators who characterise scientific belief as a kind of religious faith seem to overlook is that I don’t believe it because a scientist says it’s true, but because I can actually travel on a jet plane, and even before I did travel on a jet plane I could see that other people travelled on jet planes, that planes really do fly and engines really do work. Which seems like the build-up to some kind of New Atheism gotcha of the ‘if God is real why doesn’t he just prove it’ type popular in the 2000s (essentially a more sneery version of the Shelley quote), but that’s not really me either. Although I am definitely an actual ‘speculative atheist’ and I suppose even an ‘atheist fundamentalist’, and though I genuinely do believe that the world and humanity would be better off without religion, I’m just not sure how much better off it would be.

It’s not that the New Atheists were wrong (or even new, thinking again of Shelley). Most of the arguments that were raised against them are easily picked apart. The idea that there is no morality without religion is so obviously wrong that it seems pointless even to argue against it. The same basics of morality (murder and stealing and cheating and lying are bad, treat people as you wish to be treated etc) are and have been all but universal, though not without different nuances, throughout history and throughout world cultures.
But the problem with lack of faith as certainty (and for myself I really am certain about it) is that its arguments, though more logical – at least up to a point, as we shall see – have precisely as much effect on the certainty of faith as the arguments of faith have on the certainty of non-faith. Logic is no help here.

From my point of view, in the certain absence of a god or gods, religion is purely human, and therefore many of the (in themselves solid) arguments against it are kind of a cop-out. It’s not unreasonable to find it laughable that a supreme supernatural being should care what food you eat on which days, or what you wear or how you like to have your hair. It seems bizarre that an almighty creator who could presumably do whatever it liked would take the time to tell humans which crops they prefer to have planted where, or that male masturbation is bad, rather than simply preventing the possibility of rule-breaking ‘at source’. But the omnipresent invisible elephant in the room is that whether or not a god really felt or feels strongly about these things, whether or not a god had them written down in words, they really were written down in words, by human beings, some of whom definitely did want these rules to exist and to be enforced. And it’s human beings that still enforce them. Also, it’s just as true that primarily or entirely secular societies also have rules and customs regarding things like clothing, food, hairstyles and even names, although they rarely come with threats of severe retribution and never with the threat of ongoing retribution after death. And yes, many of these customs – like the acceptable length of women’s skirts in western society – ultimately derive from religious directives, but any authoritarian society, not only theocracies or weird, nominally religious ones like Nazi Germany, but even states where religion is completely anathema, like Stalinist Russia, Communist East Germany or North Korea under its current regime, is hardly relaxed about the individual’s freedom of expression.

Religious wars and religious persecution are bad, not because they are religious per se, but because wars and persecution are bad. Wars and persecution may often be provoked by religion, but surely if like me you don’t believe in god, then blaming that non-existent creature for religious wars is just euphemistic buck-passing bullshit? The Crusades were horrific, bloody and unjustifiable, but to blame “Christianity” for them, rather than Christians, that is, actual European human beings, is like blaming, or giving credit to, Tengri for Genghis Khan’s conquest of vast tracts of Asia, or suggesting that Jupiter, Neptune and co enabled the Romans to found their empire. “Catholicism” didn’t create the Spanish Inquisition any more than the concept of Nazism created the Holocaust or Islam as a belief system resulted in 9/11 or the Taliban. Left to themselves, religions, ideologies and philosophies don’t do anything; they just sit there. And they all have one common denominator, and it’s not a deity.

This morning, I saw that the Pope had made a statement that some policy or other of the current US administration is “un-Christian and un-American.” Well. I am glad to see anyone with any kind of authority challenging inhumane, intolerant and fascistic regimes. But those actions are only un-Christian insofar as Christ himself wouldn’t like them, according to the Bible. But Christ was one single man-god who acted a certain way and said certain things. All manner of atrocities are entirely in keeping with the actions of two millennia of Christians. As for un-American: again, the acts the Pope condemns are not compatible with the statements made by the founding fathers of the United States of America; but they are probably no worse than the actions carried out by those same founding fathers in their lives, or by many of the successive governments of the USA. Or indeed many, many other governments in the world. And, to be all New Atheism about it, when it comes to the welfare of children, for instance, it’s not like the Catholic church itself has an impressive record. Does that mean the Pope shouldn’t condemn things, or that American people shouldn’t try to hold their government to account using the egalitarian rules set down when the country was founded? Of course not; but invoking some kind of imaginary, ideal standard of behaviour really shouldn’t be necessary to do so. There’s human decency, after all.

Another (non-conclusive, because none of them are) argument for the human, rather than divine nature of religion is that the religions that have survived the longest and strongest in the modern world are those which are most compatible with it. The paternalistic, to varying degrees misogynistic Abrahamic religions all defer their ultimate spiritual rewards (but more on the non-ultimate ones later) until after death. They have no in-built expectation of much material happiness or contentment on this plane of existence and to varying extents they actually value hardship, while prioritising men within the earthly realm. Well, the paths that led us to 21st century culture, especially imperialism and capitalism, are fine with all that. Work and strive now, happiness comes later, unless you are one of the privileged few. Communism in theory isn’t fine with that, but naturally, having been formulated during the Industrial Revolution, when the vast mass of people were already oppressed by a tiny ruling class (itself a mirror image of the earlier rule of Church & monarchical elite vs peasant majority), it is defined by its opposition to capitalism. Early Communism therefore took hardship as a given (there is no proletariat without it) and, in lieu of heaven, deferred the payoff of universal prosperity and equality to some future time when the world revolution has been achieved and all opposition to itself removed. It’s a cliché to say that communism is itself a kind of religion, but the parallels are unignorably consistent; trust the leaders, put up with the shit now, eventually if we’re true to our cause it’ll all work out, if those heretics don’t spoil it.

On the other hand, various older kinds of religions, animism and ‘earth mother’ paganism and so on, value (quite logically) the need to look after the world we live in. It’s not that the religions of the book explicitly say not to, but they aren’t primarily concerned with this world – and imperialism and capitalism and even communism, which have other uses for the material world than care and stewardship, have historically all been fine with that. It’s somehow not very surprising that the aspects of non-Christian religions that became most taboo during the age of imperialism, and therefore attributed to “savage” or primitive cultures – human sacrifice, cannibalism, idol worship and so on – should be parts of Christianity itself. Without human sacrifice, even if it’s only the sacrifice of one special token human, there is no Christianity. The divinity of Christ kind of goes without saying – that’s what makes it a religion. But his humanity is what makes him more than just the Old Testament god. And insisting on his humanity inevitably made the eating of his flesh and drinking of his blood controversial. But seriously, whether someone believes they are literally eating the flesh and drinking the blood of an actual human being or only symbolically doing so, it’s a cannibalistic ritual just as atavistic and visceral as any of the imagined horrors that the Christians of the crusading period or the Europeans who spread their faith across the world believed they had encountered. It doesn’t seem too fanciful to say that what really horrified those Christians was the discovery that things they saw as fundamental to their own civilisation might be just as fundamental to civilisations that they had to believe were inherently inferior in order to destroy them.

Monkey (1979): Buddhism & Taoism that was fun for kids

The fact that there are analogous stories to those in the Bible throughout history and world cultures (death, rebirth, sacrifice, enlightenment) suggests that whether or not one has any faith in these stories, they aren’t ‘just’ stories. In fact, a lesson that stayed with me (because it suits my personality, I suppose) from the 70s TV show Monkey, based on Wu Cheng’en’s 16th century novel Journey to the West – something like “winners make losers, therefore remove yourself from competitions” – purports to be from a Taoist religious text. Eating the fruit of the tree of knowledge (I like to think a banana) and paying the unexpected price for it is, even as a mythological story, one that has real-life analogies all through human history. I remember as a child when plastic Coca-Cola bottles began to replace glass ones. It seemed futuristic and in a weird way utopian – lightweight like a can but resealable, far less risk to your drink if you dropped it than a glass bottle, less broken glass in the streets and parks. Whether or not scientists were already concerned with the problem of plastic’s lifespan or the sheer accumulation of it I don’t know, but kids weren’t, for a few years at least. Which has nothing to do with religion – but the attempt to do good turning out not only to be bad, but to be something that has to be dealt with and paid for down the generations, is hardly an alien idea. And in this case it was made worse not by religion, but by the inability or unwillingness of people under capitalism (myself included) to distinguish between convenience in the sense of people not having to waste half their lives in drudgery and convenience in the sense of not having to get up to change TV channels. There’s probably a parable in there somewhere.

A favourite anti-atheist argument is the ‘intelligent design’/watchmaker one. It’s clearly an empty argument, but my counter arguments would only be convincing to an atheist – and not even to all atheists. The argument, put simplistically, that because a watch, or a computer, or anything human-made and complex didn’t just evolve on its own, but had to be consciously invented, therefore means that life, earth and everything else must have been consciously invented too requires an obvious leap of logic. The universe is not a machine, life is not the same as battery life.

The most complex things in our world seem to be human beings, and human beings also produce other human beings, often with no conscious thought and rarely with any kind of design at all. People are accidentally invented all the time. The idea that creation is accidental or ‘just happens’ is hardly a difficult one to grasp. The people that people produce are every bit as complex as their parents and grandparents, but only occasionally, and in the most superficial way, are they designed. Worse than that, logically, we know how humans are created, but even so it’s hardly unusual for them to be produced even when the people doing it very much desire not to do so. To look at the way that the most complex creations on earth are usually made and to label it “intelligent design” would be a strange thing to do, since it doesn’t necessarily include much intelligence or any actual design. Of course that doesn’t prove that things weren’t originally designed, but the gulf between organic living things and intelligently designed things as we experience them, even at the beginning of the AI age, is so vast that you might as well argue that a cat must have designed clouds because you once saw a cloud that was cat-shaped.

As mentioned in passing before, it’s popular among a certain kind of (usually, but not exclusively, right-wing American) Christian to compare ‘faith’ in science to faith in god, which is a false equivalence, for the jet plane kind of reasons mentioned above – but although I do believe science to be superior in every way to religion – because it learns from experience, for one thing – I do sometimes wonder whether it suffers from being (this sounds very different from how I mean it) homocentric (is ‘anthropocentric’ better? It sounds worse) in a similar kind of way. I remember learning (in a very basic way) about the big bang at school and asking the teacher, not unreasonably I think, *what* was supposed to have exploded and where that came from, and being told “that isn’t a scientifically interesting question.” Well, quite possibly all the teacher meant is that at the current time any answer to that question must be pure speculation of a non-mathematical kind, but teen-me felt that it was basically “science works in mysterious ways” and he/I didn’t like that.

Somewhere in this article I had been going to say that Shakespeare was as right as anybody when he wrote “Nothing will come of nothing”, but now that I’ve reached this point I wonder whether being creatures that are born, who come from somewhere, who live for a while, who are subject to time and then who die and stop existing (or go somewhere else) shapes our understanding of everything. I do believe in the big bang because the evidence around us confirms its likelihood. The universe started, it expanded and at some point it will end. The idea of something that just is, forever, or that exists outside of time, whatever that would mean, seems as incomprehensible as non-existence does. That things, including human beings, do stop existing is in one way obvious – but things breaking down, decomposing, changing from one form to another and (romantically) melding with the universe or (prosaically) enriching the soil or whatever is a process that is understandable. The personality and individual human consciousness switching off and simply not existing is the hard part to take in. As far as we can tell this isn’t a change in energy type; the electrical impulses that are us don’t seem to go anywhere or do anything. But maybe that whole frame of reference – beginning, middle, end – isn’t everything, it’s just the limits of human understanding. Which doesn’t, to me, imply the existence of any kind of creator or supreme being, just that there’s scope there for whatever you care to imagine but which you can never truly know. Keats would be fine with that.

Similarly, to apply logic to the existence of god will always be self-defeating, because logic is (as far as we know) a specifically human way of explaining the universe to its/our own satisfaction. The laws of physics and nature and mathematics do seem to work according to logic, which is very helpful for teaching and learning and science, but human beings themselves routinely defy logic in both profound and trivial ways. Many of the things that humans value most highly are completely resistant to logic, like art and god and love and money. Even something as humble as sports: one human being being able to run faster than another or play a game better than another is only dubiously something to celebrate, and if it is, then logically one might expect people to support only the best teams and athletes. If, alternatively, it’s to do with identification with and loyalty to one’s own area, then fans might be expected to support only teams or athletes from the same geographical location as themselves, which is occasionally how it works, but just as often isn’t. There’s nothing especially logical about the enjoyment of a race or a game in which you aren’t involved for its own sake. Does that mean that logic is a faulty way of understanding the universe? I don’t know; but it is a faulty way of understanding human beings. The idea that god’s existence is a logical reality in a 2 x 2 = 4 way makes about as much sense as the position of the planets at the time of your birth dictating your future.

As Bowie implied, faith needn’t – and in many cases I’m sure doesn’t – preclude seriously considering the implications of one’s belief. But sometimes it does. I’ve never wanted to believe (I don’t really get why anyone would, if they don’t; which is my deficiency), but as an adult I have always wanted to understand people who do. And in general, I find it frustrating to try to do so, as two different but very similar anecdotes about my encounters with people of faith illustrate. I am aware though that these may say more about me than they do about the believers.

In my professional capacity I was once interviewing a prominent American black metal musician whose latest album went on about blasphemy a lot. Given that black metal encompasses everything from orthodox satanists to heathens and pagans and occultists and chaos magicians and nihilists, I asked what I thought was a reasonable question: what meaning does blasphemy have unless one believes in god? Doesn’t the concept of blasphemy essentially reinforce the religion it attacks by affording it some kind of legitimacy?* The musician’s response was the black metal version of ‘these go up to eleven’. I think what he actually said was “Everyone knows what blasphemy is.” And he was right I suppose, but he was also characterising his band as purveyors of simple shock and outrage to the very few people who are still shocked and outraged by blasphemy. Ho hum.*

The archetypal image of black metal, Nattefrost of Carpathian Forest, photo by Peter Beste

*this made me think of an occasion in high school where I muttered “oh for god’s sake” or something like that and my maths teacher said “don’t blaspheme, William!” and I replied “it would only be blasphemy if god existed” and was given a punishment (lines). It was only years later that I realised I deserved the punishment, not because of god, but because I was being a smart arse to a teacher – at the time I just felt righteously angry about the lines.

Likewise, a visit from some very pleasant Jehovah’s Witnesses left me with unexpected admiration for them, but also some frustration; they also left prematurely, which my younger self would have regarded as a victory. The respect was for their answer to the kind of question that seems like a typical smart-arse one, but I was genuinely curious. If there are only 144,000 places in heaven in your religion (I had only recently learned that strange fact) and those are all spoken for already, why are you knocking on people’s doors trying to spread the word about your faith? I hadn’t expected their response, which was something like “Oh, we don’t expect to see heaven. Heaven is for god and the saints and angels, Earth is the paradise that god made for humans, it just needs to be fixed.” A version of Christianity that withholds the promise of paradise even after death was weird to me, but also impressive. Having a faith where you never expect to attain the best bit seems coolly ascetic, but also kind of servile, which it literally is. The fact that servility seems distasteful to me is, I suppose, my weakness not theirs.

I was less impressed with the response to what I felt and still feel is a serious question and not just a cynical gotcha: if god is all you say it is, all-powerful, blah blah, then why create evil? There was a stock answer ready, which was to do with free will and choice, but even though there are holes to be picked in that too (the ‘free will’ of transgressors has nothing to do with the free will of their victims – what about their will?), that wasn’t what I meant. What I was asking is: if you can do whatever you like, can see everything that has ever happened and everything that will ever happen, if you are capable, presumably, of endless satisfaction and happiness, why create ‘bad’ – or, more personally perhaps, why create even the concept of ‘things you don’t like’ at all? To that question, I got the Jehovah’s Witness version of ‘these go up to eleven’ and a quick goodbye. But I genuinely wasn’t trying to catch them out; I really wanted to know what they thought about it, but apparently they didn’t think anything. Having said that, I can see now that I write about it, that interrogating your belief system for the benefit of a stranger who obviously isn’t going to be persuaded to join you is probably not all that attractive. Still, I didn’t knock on their door.

Guy Pearce as Peter Weyland in Ridley Scott’s Prometheus (2012) – something to aspire to?

So much of religion seems to me to be saying that, whatever the wonders and horrors and joys and pains of life, it’s not enough and they want more. But again, that’s not exclusive to religious people. I recently saw an unsettling but also unintentionally funny video in which the PA of a shadowy, influential and incredibly wealthy figure was talking about transhumanism and his master’s ultimate Roy Batty/Weyland-from-Prometheus plan not to die at all. Which feels very sci-fi, but also very late Roman Empire. At the same time, my generation grew up with the rumour that Walt Disney’s head is in a refrigerator, awaiting the day when medical science catches up enough for him to be resurrected. Rebirth and resurrection; there really is nothing new in human history.

detail of the crucifixion from the Isenheim altarpiece (1512 – 6) by Matthias Grünewald

All a bit bleak, maybe; but if religion only offered oppression, judgement, condemnation and war then far fewer people would devote their lives to it. And if the negative aspects of religion all exist independently of religion, then so do the positive aspects, and without the same arbitrary punishment/reward structure underlying it.

Religion offers comfort to people in distress, it offers a sense of community and belonging, it offers contact to people who feel isolated. It offers various kinds of love. I can’t think of many artworks more moving than Matthias Grünewald’s crucifixion from the Isenheim altarpiece (1512-6), painted to comfort people who were suffering from skin diseases, by showing them the scourged Christ’s suffering, which mirrored their own. But just as the Quran didn’t issue a fatwa against Salman Rushdie and the Bible didn’t take babies from unmarried mothers and kill them and bury them in the grounds of institutions, neither do those books feed the poor, embrace the lonely, paint pictures or create a sense of community. Human beings do those things, and they do them regardless of religion. They do it in societies where religious beliefs aren’t based on the Judeo-Christian tradition and they do it in societies where religious beliefs are actively frowned on. After the dissolution of the USSR, few people were nostalgic about the food queues or the secret police, but many were nostalgic about the sense of community that came from masses of people being in the same situation together. And now that capitalism – which, unlikely though it seems, is not always so far removed from Soviet communism – has created its own underclass and hierarchical power structure and pogroms and whatnot, people have also created their own communities, support groups, charities and friendships.

The one positive thing that faith offers that non-faith of my kind doesn’t is a personal relationship with god – and that’s where we came in: you either believe or you don’t. I can completely understand that having a direct line to someone who knows you and understands you better than you know yourself, who accepts and forgives you, could be nice and comforting. Maybe in pre-Christian or non-monotheistic societies that voice was the voice of the ancestors or the spirits of the trees and rivers. I can see how that would be nice too, but for myself I can’t imagine having such a thing or longing for it or even wanting it. For me, you either disbelieve or you don’t.

And maybe that’s really the strongest argument, not against faith, which there is no argument against, but against religions as institutions, as rules and directives of the kind that people are so keen to re-establish. Because if there’s one thing you can see, looking not just at the diversity of religions but at the diversity of beliefs within them, at the different ways that people relate to and communicate with their gods, it’s that god is just as personal and individual as any of its believers and disbelievers and so making an orthodoxy of it can only ever harm more people than it helps.

a plea for comparative rudeness

I had already started writing this, but about half an hour ago the point I want to make was violently reinforced for me. I was waiting for my order in a café, where a radio was on in the background. A senior political figure – not a member of the current government but an elderly-sounding member of the House of Lords who was a veteran of the diplomatic service, I didn’t catch his name – was being interviewed. Before I demonise him too much, I should point out that, even if he did represent the British government, he would have no real power over the situations he was invited to discuss. In a way, that actually makes it worse, because it means he is in a position where he can openly speak his mind and presumably, this was his mind.

George Grosz – The Pillars of Society (1926)

He was being asked about two situations that are more similar than is often portrayed in the media, though one is significantly bloodier. That is, two invasions which are attempted annexations or land-grabs by political leaders with ideological agendas. In the political discourse on the left you hear a lot about how differently the invasions of Ukraine and Palestine are being treated by the political and media establishments (and to a degree the British public), but although there is truth in that, to be fair to the interviewee, he barely differentiated between the two.

When asked about the latest meeting between the Presidents of the United States of America and Russia to discuss the fate of Ukraine – just writing that highlights the essential absurdity of it – the interviewee was reasonable, measured, but oddly wry. While he was clearly concerned about Ukraine and the Ukrainian people, the general tenor of his response was a kind of verbal shrug – a dryly amused ‘what-can-you-do-with-these-guys?’ tone that characterises the way that many of the more serious figures in the British political and media spheres engage with the current administration of the USA and, to a lesser extent (because there’s no need to pretend that he’s an ally), with the government of Vladimir Putin. Moving on to Israel/Palestine/Gaza there was, similarly, some concern about the people currently being attacked, plus a bit of ‘what about Hamas?’ waffle that I don’t think was disingenuous in this case, as it so often is. Because from the point of view of a career diplomat, there is a question about what happens with Hamas after the slaughter stops. It’s a problem that’s been made much worse and much more unavoidable by Benjamin Netanyahu’s much-publicised funding of Hamas, which essentially neutralised any chance of a moderate Palestinian government – but regardless of how they got there, it’s not a situation that will suddenly be resolved, whichever way (to put it coldly) the invasion of Gaza works out.

But when asked about Netanyahu himself, and the actual current Israeli policy, that shrug returned; ‘what-can-you-do-with-these-guys?’ Well, it’s doubtful that a British diplomat, or even a member of the British government can do much to influence someone like Netanyahu – at least not while he has the backing of the US government – but one thing they can do and should do with any rogue politician from any country is to stop acting as if behaving in a consistent, predictable, true-to-character way is the same as behaving in an acceptable way. Given that the UN does have rules, guidelines and standards of conduct, acting as though the leaders of some of its nations are unfathomable forces of nature rather than political figures making conscious policy choices is not helpful, either to the world or to the UN itself, which is only as effective as world leaders make it.

John Heartfield – The Meaning of the Hitler Salute: Little Man asks for Big Gifts (1932) A photomontage made while Hitler was wooing the 1930s equivalent of big tech companies to fund his ideological aims

It would of course be nice if our political/media figures were bluntly critical of despots and would-be authoritarians – but if not, they could at least stop being indulgent towards them or nice about them. There are people who think that diplomacy is, by its nature, damaging and wrong, but though it certainly can be, I believe in it when used appropriately. It’s hard not to believe in it, if like me you grew up during the final phases of the Cold War. That decades of aggressive brinkmanship and paranoia should have ended peacefully with virtually no bloodshed was a barely-credible relief at the time and, given the mental state and emotional temperature of world leaders in the 21st century, it now seems almost miraculous. And that resolution really is a testament to the leaders, and particularly Mikhail Gorbachev, whose sober unflappability wasn’t shared by many politicians then, and doesn’t even seem to be a desired trait among the political class now. There are many times and many situations where sober, reflective diplomacy is desirable.

Conversely, when faced with the actions of hysterical, erratic, devious and capricious idiots or their cynical, opportunistic enablers and hangers-on, or coolly calculating monomaniacs, the kind of reasonable, statesmanlike professional on the radio this morning is at an immediate disadvantage. Acting according to the norms of your profession with people who have no respect for those norms is pointless at best. Even then, that doesn’t negate the whole profession of diplomacy; when meeting with powerful, impetuous morons, being calm and professional is a given and, for many reasons it’s the right thing to do. But to do more than that – to act like the terrified child who wants to appease the bully, or the substitute teacher who wants the scary kids to think they are cool – is a mistake that politicians, unless they happen to be in the final twilight of their careers, will live to regret.

Wyndham Lewis’s 1934 portrait of the highly principled left-wing diplomat Sir Stafford Cripps, originally one of a pair of portraits, the other (now lost) depicting the leader of the British Union of Fascists, Sir Oswald Mosley

Wherever there are tyrants, authoritarians and powerful reactionaries, there is never any shortage of people willing – even against their own interests – to defend and promote them. But we are currently at a strange point in history where these people are so brazen and shamelessly open about their own actions that media commentators and politicians – occasionally with no special vested interest (but often with a financial one) – who do see the shame in those actions are willing to make claims on behalf of their idols which go beyond any statement made by the perpetrators themselves. I would expect this to be a right-wing problem, because I’m a prejudiced left-winger, but it seems to be more of an ideological problem than one specific to either end of the spectrum. And so, even though on the eve of the invasion of Ukraine, Vladimir Putin himself gave a long speech on Russian TV, where he ranted at length about historical grievances, denied Ukraine’s right to exist at all and talked about restoring the old Russian Empire, never once mentioning NATO, British commentators on both the far left and right have repeatedly justified Putin’s actions with reference to the threat to Russia’s borders posed by NATO expansion and so forth. It doesn’t seem that Putin has asked them to legitimise his actions – he doesn’t seem to think his actions need justifying at all, beyond the simple fact that he thinks Russia should own Ukraine – so why embarrass yourself by making claims on his behalf?

Similarly, members of the Israeli government have been blunt about their desire to remove the Palestinian people from Gaza one way or another. Those of us with memories going back a few months may even remember discussion, involving the US government, about turning the area into a resort. There are photos online of members of the IDF standing in ruins holding the banners of real estate companies; there are videos shared by IDF fighters where they laugh as they rake through the underwear drawers of Palestinian women in their deserted houses, where they joke about using children for target practice. And as the Israeli historian Ilan Pappé has discussed, it’s not like these kinds of debates on what to do with Palestine are new or unusual. So why would any western politician or media spokesperson feel the need to frame the situation as a war between two equal states, or talk endlessly about the hostages that the Israeli government seems not to care about? But what about Hamas? Well, no doubt they have their own gloating social media presence, glorifying their inhuman acts, but they aren’t an everyday part of normal, Western social media, and however much the Israeli government likes to frame all criticism of itself as antisemitic Hamas propaganda, I haven’t so far seen a single mainstream politician online or on TV criticise Israeli policy without also condemning Hamas and calling for the freeing of those Israeli hostages. That those hostages are important and should be prioritised should go without saying, just as the fact that almost everyone living in a Western democracy is fundamentally opposed to a repressive theocratic organisation like Hamas should go without saying; and yet it has to be stated again and again because of the way the events – in reality barely a conflict, let alone a war – are being presented.

So, yes, there is definitely a time for diplomacy – it’s both necessary and desirable when negotiating the different cultures, belief systems, nationalities and identities that make up the modern world. And while it would be nice – a relief even – to hear a senior British politician not just commenting on or blandly ‘condemning’ the words or actions of any rogue regime, whether our supposed allies or not, nor just urging them to change their ways, but launching a scathing tirade against them, taking legal action in international courts and cutting all unnecessary ties with them, nobody can realistically expect that; it’s just not how politics works. But if our successive governments have managed to coexist for decades alongside ideologically opposed countries like China or North Korea without the constant threat of war and without feeling the need to openly pander to and flatter their leaders, then it shouldn’t be too much to ask that they do the same with governments whose values are not supposed to be so far removed from our own. Speaking as a citizen of the UK, if the core values of our country really are what we say they are – democracy, tolerance, compassion and all that – then at some point, coming up against the opposite, diplomacy should only go so far.

————————————————————————–

Postscript: on the day I started writing the final version of this, I heard that radio interview. This morning, as I finished it, I saw three news stories that all made this mild call for bluntness seem worse than ineffectual. In one, the Israeli government had targeted and wiped out an entire Al Jazeera news crew. While UK news talks about ‘collateral damage,’ IDF spokespeople have talked proudly about removing “Hamas terrorist” Anas al-Sharif, who neither they nor anybody else believes was a Hamas terrorist, because he demonstrably wasn’t. In the USA, the President, while deflecting questions about the files of a dead paedophile who was once a friend of his, is talking openly about forcibly removing city administrations to ‘re-establish law and order’ in areas which seemed to have no special law and order issues until he created them. Finally, I was watching footage of an old, blind man, a woman in her eighties and an elderly military veteran being arrested by the police for holding placards at a peaceful protest while, elsewhere in the UK, the police impassively watched a mob of people screaming racist abuse at a hotel where refugees from war zones are being housed, and stood casually by as the leader of an admittedly moribund political party danced around and made Nazi salutes. There is no single correct response to these events, but empty diplomacy from the country’s leaders has nothing to offer in any of them.


the end of all songs

A question occurred to me while watching a documentary about Joy Division: is there any better ending to a song than Ian Curtis bellowing FEELING FEELING FEELING FEELING FEELING FEELING FEEEEEELING! as the music clatters to a halt at the end of “Disorder”? Lyrically, despite its explosiveness, it isn’t cathartic, but in a musical way it is – for the listener at least – because until that point, the tempo has been too fast and the lyrics too complex for Curtis’s voice to do whatever the deep, melancholy equivalent of ‘taking flight’ is. There’s an underappreciated art to ending songs and it’s not something that even great bands do infallibly or that all great songs feature. Not all songs need to end with a crescendo or flourish, and very few songs benefit from just grinding to a halt or being cut off mid-flow, but the sense of completeness when a song (especially a relatively short song) ends perfectly is one of the things that makes you want to hear it again.

Ian Curtis in 1979 by Kevin Cummins

“Decades,” the final song on Closer, the final Joy Division album, is one of relatively few songs (given their vast number) where fading out at the end doesn’t seem like a cop out. There’s nothing wrong with fading out a song, but often it just feels like an easy option taken in order to dodge the question of how to end a song properly. Which is fine, except in live performances, where it’s difficult to satisfactorily replicate a fade-out. Partly that’s because of the practicality of it – does the band all try to play more quietly? Do they just get the sound person to turn down the volume, which works, unless you’re close to the stage, which, in that situation, is sub-optimal, since hearing the unamplified sounds from the stage (drums clattering, guitars plinking etc) is kind of a mood-killer? And if so, when do they all stop? There’s also the awkwardness of the audience reaction; the crowd might start cheering/jeering before the song is actually finished, or they might not start until someone in the band indicates that the song is definitely over, which is also not ideal. Basically, it feels artificial – but obviously it has the appeal of being simple – haven’t thought of a proper ending for your song? Just keep playing and fade it out afterwards. But Closer needed to fade into silence and it does.

Another musical ending this week – a seriously clunky segue this but bear with me – was the death of Ozzy Osbourne, a week after what was explicitly intended to be his final performance, a different kind of ending and a very unusual one in the music world where ‘farewell’ tours can become an annual occurrence and no split is too acrimonious to be healed by the prospect of bigger and bigger sums of money.

Ozzy Osbourne in 1974 by Mick Rock

On paper, any kinship between Ozzy and Joy Division seems unlikely to say the least, but the ears say otherwise. Regardless of the punk roots of Joy Division, the only real precursor to a song like “New Dawn Fades” from their 1979 debut album Unknown Pleasures is Black Sabbath. And it’s not only the oppressively doom-laden atmosphere, though that’s important; Bernard Sumner’s opening guitar melody is remarkably like Tony Iommi’s melodic solo from “War Pigs” – a classic song, incidentally, which has one of the worst endings of any great song ever written. Presumably, Black Sabbath had no idea how to end it and so did something worse than a fade out: speeding it up until it ends with a comical squeak. Oh well. But anyway, there are many moments, especially on Unknown Pleasures, where Joy Division sound like a cross between Black Sabbath and the Doors, although I’m sure neither of those things were in the minds of Peter Hook, Bernard Sumner, Stephen Morris and Ian Curtis, any more than they were in the consciousness of the music journalists who lauded the band in ’79, who mostly tended to see punk as year zero, the new beginning from which the influence of anything pretentious or overblown had been erased.

That basic idea was one I also accepted without much thought as a teenage indie fan in the early 90s when Joy Division – by then defunct for a decade – became one of my favourite bands. With the honourable, weekly music paper-approved exception of the Velvet Underground, I was dubious about anything old or anything that I considered overtly commercial. Without giving it much thought I just assumed that mentality came from my reading of Melody Maker and the NME. I had definitely accepted their pre-Britpop genealogy of cool rock music that essentially began with the Velvet Underground and then continued via punk and post-punk into 80s indie guitar music, most of which existed firmly outside of the mainstream of the UK top 40. But reflecting on Ozzy on the news of his death, it seems my snobbery has older roots.

“Mad Housewife”-era Ozzy, c.1986

I don’t remember when I first heard Ozzy Osbourne’s name, but I do remember when I first heard his music. It was 1988 and I was about a year away from growing out of metal, but still immersed in it for the time being. Within metal itself I had fairly wide taste and my favourite bands included many of the biggest metal bands of the era: Iron Maiden, Metallica, Guns ‘n’ Roses, Helloween, Megadeth, Suicidal Tendencies, Queensrÿche, Slayer, Anthrax, plus many more. At that point I mostly discovered music via magazines (especially Metal Forces) and my friends. In addition to my modest collection of records and tapes I had many more cassettes that had been made for me by friends, and I spent a good bit of my spare time making tapes for them; it was fun. And so: Ozzy. A friend had taped a couple of albums for me on a C90 cassette (the odd pairing, it seems now, of Mötley Crüe’s Girls, Girls, Girls and Slayer’s Reign in Blood) and filled up the rest of the tape with random metal songs, among them “Foaming at the Mouth” by Rigor Mortis, “The Brave” by Metal Church, “Screamin’ in the Night” by Krokus and Ozzy’s latest single, “Miracle Man”. I pretty much hated it. I thought Ozzy’s voice was unbearably nasal and awful and the production really harsh and tinny (that was probably just the tape though).

Memorex C90s were pretty dependable
Teenage metal fans were obliged to like Elvira in 1989

By then, I knew who Ozzy was, and was aware of his bat-biting notoriety, though that definitely seemed to be a bigger deal in the USA than it was in the UK (or at least in my corner of rural Scotland). At some point just a little later, Cassandra Peterson, or more accurately Elvira, Mistress of the Dark, presented a short series of metal-related shows for the BBC. One episode included Penelope Spheeris’ fantastic documentary The Decline of Western Civilization Part II: The Metal Years, which includes one of my favourite Ozzy interviews, but also concert footage of Ozzy during his ‘mad housewife’ era, when his image seemed to be based on Jackie Collins’s style at the time. I love that era of Ozzy now, but at the time I thought it was laughably awful. It must have been around that time that I also became aware of Ozzy’s history with Black Sabbath, who I only knew in their then-current incarnation with Tony Martin, which again I now love but at the time thought irremediably middle-aged and boring. The fact that Ozzy’s Black Sabbath was from the 70s meant that I pretty much dismissed them without needing to hear them. When Elvira showed a classic early Led Zeppelin concert in black and white I also found that tiresomely old and dull, especially in comparison with the Napalm Death concert she presented. It’s hard to relate to now, but in the 80s, for me – and I think for most people I knew of my age – the 70s was cheesy, embarrassing and possibly funny, but with no redeeming features. Actually, that’s how the 80s were for a good part of the 90s too; changed days.

Again, like most of the metal fans I knew, I loved metal, but I mostly didn’t like rock. Metal meant precision, virtuosity, heaviness and speed. Rock (to this kind of metal fan) was simplistic, old-fashioned and (worse) commercial. Oddly, I never thought to include the very glam-oriented hair metal bands I liked in the rock camp, which I can now see is where they really belonged. I loved bands like Poison, Faster Pussycat and Pretty Boy Floyd, despite the fact that their very obvious ambition was to be famous and that they wrote schmaltzy ballads. I made the same exception, mysteriously, for Guns ‘n’ Roses, who I loved. But I thought of them as metal, not rock.

Cliff Burton rocking like it’s 1974 (c.1986)

It was a distinction that my parents’ generation seemed simply not to understand. To them and their friends, if you liked Metallica, wasn’t that basically the same as liking Meat Loaf? But I was of the generation for whom, from the earliest days of primary school, the idea of being seen in flared trousers was the stuff of nightmares. That horror of the era we were born in was hard to let go of, which is no doubt partly why the legacy of punk was easy to embrace later. In 1988, when I first heard them, Metallica instantly became one of my favourite bands and …And Justice For All one of my favourite albums. A crucial part of that was that the band, as I first knew them, looked cool to me. When, probably later that year, I first heard Ride the Lightning and Master of Puppets I loved those too, but the sight of the great Cliff Burton (RIP) in his denim bellbottoms, with his middle-parted hair and little moustache, looking like he should have been in Status Quo circa 1974, was extremely cringe-inducing; that was not cool. Not in Scotland in 1988 anyway.

It took a while for that attitude to change. One of the gateway albums that led young teen me away from heavy metal and towards the indie/alternative world was Faith No More’s The Real Thing, which included a cover of “War Pigs.” And at that time the song still felt old fashioned and less good than the rest of the album to me. It was only after a few years of hardcore indie snobbery that my attitude really changed. As my adolescence got to the more painfully introspective stage I stopped listening to metal, having been introduced to things like the Pixies and Ride and simultaneously discovering slightly older music like The Smiths, The Cure, Joy Division and the Jesus & Mary Chain. The part of me that still liked loud and heavy guitars didn’t care so much about precision anymore and so alongside the typical UK indie stuff, I also liked grunge for a while, mainly Mudhoney, Tad and Nirvana, but especially grunge-adjacent weirdness like the Butthole Surfers and Sonic Youth. That would seem to provide an obvious bridge to the hard rock of the 70s, since virtually all grunge-oriented bands referenced Sabbath and Kiss, but no.

a book that shaped my taste in the 90s

In fact, what happened was that in the Britpop era, I loved 70s-influenced bands like Pulp and Suede (I was never a fan of Blur or Oasis) and as Britpop became dull I started to get into the older music that Britpop referenced. At first it was mostly Bowie and Lou Reed, but after reading  Shots From the Hip (referenced a million times on this website) by Charles Shaar Murray, I broadened my horizons to include 70s glam in general (Roxy Music, Eno, Jobriath, Raw Power-era Stooges, but also the bubblegum stuff) and other things that Murray mentioned, whether positively or disparagingly. The latter seems odd but I’ve discovered lots of things I like that way. And suddenly, Ozzy was inescapable (though less so than he is this week).

I bought the Charles Shaar Murray book because Bowie was featured heavily in it; but he also wrote about Black Sabbath. I bought a book by the great photographer Mick Rock, because he had photographed Bowie and Lou Reed and Iggy and John Cale; but who should be in there but Ozzy, looking uncharacteristically thoughtful. I bought old 70s music annuals from the glam and tail-end-of-glam era – Fab 208 maybe – because they had Bowie and Mott the Hoople and Pilot and whatnot in them, but inside there was also mention of Black Sabbath. I remember a paragraph about their then-forthcoming compilation We Sold Our Souls for Rock ‘n’ Roll being especially intriguing.

Birmingham in the 1970s by Peter Trulock

Anyway, one thing led to another and I spent a large chunk of the late 90s and early 2000s immersing myself in the music of the 1970s. At first it was primarily glam, but then all kinds of rock, pop, soul, funk etc. At some point it started including bands that I’d long been aware of and never liked; like Led Zeppelin, Kiss – and Black Sabbath. The first Black Sabbath album I owned was Sabotage, bought for 50 pence in a charity shop. The texture of the sleeve was, interestingly, the same as that of my LP of Joy Division’s Unknown Pleasures, but the imagery was a little less classy, thanks to Bill Ward’s checked underpants being visible through his red tights; oh well. Ozzy sounded pretty much as I remembered from “Miracle Man,” but primed by Charles Shaar Murray’s description of Ozzy [caterwauling] about something or other in a locked basement, and with a more sympathetic production and – crucially – the far more bare and elemental sound of Black Sabbath, so unappealing just a few years earlier, he sounded right. And then, when I heard the earliest Black Sabbath albums, Black Sabbath and Paranoid, both from 1970, one of the things they reminded me of, most unexpectedly, was Joy Division.

Black Sabbath in 1970 by Keef, Joy Division in 1979 by Anton Corbijn

Yes, the whole aura is different, Sabbath were surly and aggressive where Joy Division were solemn and withdrawn, but there’s something about the simplicity of the sound. Geezer and Hooky’s basses took up as much space as Tony and Bernard’s guitars. Bill Ward, like Stephen Morris, was a drummer who brought a strong dance/funk element into the band’s rock music without any sense of incongruity. Ozzy and Ian Curtis are worlds apart as vocalists, but both have a despairing intensity that makes them stand out, even within their respective genres. Both bands were from the grim, grey, hopeless industrial 1970s north of England, but whereas Joy Division were definitively a product of Manchester, with all the gritty coolness that conferred upon them, Sabbath were solidly of Birmingham, with all of the perceived oafishness and lack of credibility that entailed in the music press at least. Both singers were self-destructive too, but the same year that Ian Curtis tragically ended his life, Ozzy was reflecting on his self-destructive behaviour in “Suicide Solution”* and starting his life anew, launching a solo career which, against all expectations, made him an even bigger star and ultimately the icon who is being mourned today, far more widely than I’m sure he would ever have imagined. It was a good ending.

*Ozzy was always a far more thoughtful lyricist than he’s given credit for; I can’t think of any other artist from the aggressively cocky 80s hair metal scene who would have written the glumly confessional anthem “Secret Loser” from Ozzy’s 1986 album The Ultimate Sin

Hulme, Manchester in the 1970s, by David Chadwick

Because I’m a nerd, and not just a music nerd, writing this piece made me think of Michael Moorcock’s elegiac sci-fi/fantasy novel The End of All Songs, published in 1976 – the year that Ian Curtis, Peter Hook and Bernard Sumner met at a Sex Pistols concert in the Lesser Free Trade Hall in Manchester, and the year that Black Sabbath released their seventh album, Technical Ecstasy, generally agreed to be the one where the cracks started to show in the Ozzy-led lineup, but one of my favourites. Moorcock took the title of his novel from a poem by the Victorian writer Ernest Dowson, which feels appropriate to end with, since fading out is kind of a hassle, text-wise.

With pale, indifferent eyes, we sit and wait
For the drop’d curtain and the closing gate:
This is the end of all the songs man sings.
Ernest Dowson, Dregs (1899)

to the victor

Everyone knows that history is written by the victors, but I’m not sure it’s always realised – or I should say that I hadn’t considered – how much of a privilege that is. Of course there are the obvious material, societal privileges that come with winning, but I’m thinking specifically about history and the way it’s transmitted.

This little epiphany came while watching the now ten-year-old show Tony Robinson’s Wild West. It’s not a piece of history that generally interests me, but Robinson belongs to the last generation that grew up with ‘cowboys and Indians’ occupying a major place in popular culture, and he is also a presenter who tends to look at history from a point of view that conservative critics would call postmodern; i.e. he tends to think that history is complicated, acknowledges injustices, calls exploitation exploitation, and that kind of thing – and so it piqued my interest. But one of the unexpected consequences of making an enlightened history show is the focus it places on experts and the way that they come across.

Ancient Greek red-figure pottery c.480 BC

Now, I don’t mean to condemn the historians I’m talking about here; most historians are in some ways at least enthusiasts for their subject, and the best ones have a special, and sometimes quite a nerdy, bond with the period and people that they have chosen to study. When I studied ancient history, one of the best lecturers I had was a specialist on ancient Greece who had seemingly based his physical appearance on the look of Greek men on ancient red-figure pottery; curly hair, big beard and all. And there was a historian of Rome in my Classical Studies course with a ‘Julius Caesar’ haircut. And it feels natural that someone would want to immerse themselves in the civilisation which fascinates them enough to make it their life’s work.

Joe LeFors – lawman & fashion role model

And so it makes sense that some of the historians in Tony Robinson’s Wild West, with their (to British eyes) Edwardian moustaches and waistcoats, should look as though they are tending the bar in a saloon in Calamity Jane, or are speaking to the camera from the frontier while wearing their fringed buckskin jackets. It’s kind of comical, but also isn’t, in the context of that history. The clothes seem like a concession to the romanticism of the Old West, but these are modern historians, and despite being occasionally slightly squeamish about the history they are recounting, and tending to overstate the balance of power in the ‘Indian wars’, they make no serious attempt to whitewash it.

But it feels significant that the Lakota historian talking about the massacre at Wounded Knee in 1890 wasn’t wearing historical fancy dress, and it would seem strange if he was. And it’s the discussion of atrocities like Wounded Knee – which, to paraphrase that historian, is framed as a war crime because that feels historically appropriate, but in fact there was no war; it was just a crime, albeit one committed by a government – and the nakedly genocidal attitudes of the time, encapsulated by a quote from L. Frank Baum, author of – of all things – The Wizard of Oz and its sequels, that make the buckskins, waistcoats and archaic facial hair and fashions of the Western historians seem somewhat incongruous.

Bartender at the Toll Gate Saloon in Black Hawk, Colorado c.1897

Hearing of the death of Sitting Bull in the aftermath of Wounded Knee, Baum wrote in his popular newspaper:

The proud spirit of the original owners of these vast prairies inherited through centuries of fierce and bloody wars for their possession, lingered last in the bosom of Sitting Bull. With his fall the nobility of the Redskin is extinguished, and what few are left are a pack of whining curs who lick the hand that smites them. The Whites, by law of conquest, by justice of civilization, are masters of the American continent, and the best safety of the frontier settlements will be secured by the total annihilation of the few remaining Indians. Why not annihilation? Their glory has fled, their spirit broken, their manhood effaced; better that they die than live the miserable wretches that they are.

Though thankfully not, or not quite, enacted, this wasn’t a controversial view at the time. But even with a little knowledge of the historical context (there had been a government policy explicitly called the Indian Removal Act only 50 years earlier), its knowing blatancy is shocking. It’s one thing to talk or read about massacres of Greeks or Spartans or Persians thousands of years ago; it’s quite another to think of it happening in the time of your own great-grandparents, or to think of it happening elsewhere literally as I write these words.

It’s an obvious, problematic and quite cheesy comparison to make, but I’ll make it. A situation where a historian could sit in a black shirt and swastika armband talking about World War Two – acknowledging quite reasonably the terrible things that were done on both sides, but talking in an essentially neutral way about the formation of modern Europe – while a far more sombre Jewish historian shows us the Auschwitz museum, is almost unimaginable. And yet – judging by the way these things actually work – it’s not an especially outlandish alternate present.

the Emperor Augustus, statue from the 1st century AD with eternally modern hair

Not only is history written by the victors, but the world we live in is forged by them, and generations on from winning an overwhelming victory they are no longer ‘they,’ they are just us. But that takes time. The historian who loves ancient Greece can afford to love it, wherever they come from. Greece was barbaric as well as enlightened, Greece had slaves and was misogynistic – but Greece is only what we call it now for convenience; the collection of independent city states and relatively heterogeneous cultures now called ‘ancient Greece’ is not the same as the country Greece. “Rome” is not Italy. The world – especially but not exclusively the western world – has benefitted from ancient Greek and Roman culture and though it’s easy to argue for its negative influence too, that’s less the fault of those ancient people than of our more recent ancestors, who understood those cultures in the way they could and wanted to understand them.

Regarding America, and Britain and everywhere else, all of that may come, in a thousand years or so, when the state of the world will probably be as unimaginable to us now as the internet would be to Herodotus. But. There are photographs of the mass graves being dug at Wounded Knee and while the winners of the ‘Indian wars’ were able to impress their culture on the country and to define that recent history in terms of expansion, exploration and consolidation, the other side of the story is still living history too. And the losers get to just live with it and tell their stories to whoever will listen.

It’s fair to say that the nature of the conquest of America – not just the 1800s and the Old West, but everything from the founding of the first colonies to the establishment and erosion of the reservations – has never been forgotten, but it’s only relatively recently that it’s been publicly acknowledged. Tony Robinson’s television show came about precisely because he belongs to a generation – my father’s generation – where young boys were entertained by stories of heroic cowboys fighting savage Indians in a way that now looks, despite the inevitable messiness of history as opposed to fiction, like an almost perfect inversion of the facts. And I belong to one of the first generations where it hasn’t been popularly presented that way. It’s never too soon to be passionate about history or to tell the truth about it, but even though it’s always nice to find yourself on the winning team, when that happens it’s probably worth considering what it really means and how you got there. And if the dressing up still feels appropriate then go for it.

a victory over ourselves – versions of 1984 in 2025

Remembering 1984 as someone who was a child then, I find that although the clocks didn’t strike thirteen, the year – as encapsulated by two specific and very different but not unconnected childhood memories, as we’ll see – is almost as alien nowadays as Orwell’s Airstrip One. Of course, I know far more now about both 1984 and Nineteen Eighty-Four than I did at the time. I was aware – thanks mostly I think to John Craven’s Newsround – of the big, defining events of the year.

Surely the greatest ever cover for Nineteen Eighty-Four, by Stuart Hughes, for the 1990 Heinemann New Windmills edition

I knew, for instance, about the Miners’ Strike and the Greenham Common Women’s Peace Camp, but they didn’t have anything like the same impact on me personally as Indiana Jones and the Temple of Doom. I remember Zola Budd tripping or whatever it was at the Olympics and Prince Harry being born, but they weren’t as important to me as Strontium Dog or Judge Dredd. Even Two Tribes by Frankie Goes to Hollywood made a bigger impression on me than most of the big news events, despite the fact that I didn’t like it. My favourite TV show that year was probably Grange Hill – and here we go.

Grange Hill, essentially a kids’ soap opera set in a big comprehensive high school in London, ran for 30 years and I recently discovered that the era of it that I remember most fondly – the series that ran from 1983–86 – is available on YouTube. When I eventually went to high school later in the 80s, my first impression of the school was that it was like Grange Hill, and now I find that despite the silliness and melodrama, Grange Hill still reflects the reality, the look and the texture of my high school experience in the 80s with surprising accuracy.

Grange Hill’s 5th formers in 1984 (back: Stewpot & Glenroy, front: Suzanne & Mr McGuffy)

But anyway, watching old Grange Hill episodes out of nostalgia, I was struck by how good it seems in the context of the 2020s, despite the obvious shortcomings of being made for children. Check out this scene from series seven, episode five, written by Margaret Simpson and aired in January of 1984. In among typical story arcs about headlice and bullying, the Fifth form class (17-year-olds getting towards the end of their time at school) get the opportunity to attend a mock UN conference with representatives from other schools. In a discussion about that, the following exchange occurs between Mr McGuffy (Fraser Cains) and his pupils Suzanne Ross (Susan Tully), Christopher “Stewpot” Stewart (Mark Burdis), Claire Scott (Paula-Ann Bland) and Glenroy (seemingly of no last name) (Steven Woodcock). It’s worth noting that this was the year before Live Aid.

Suzanne [re. the UN]: "It's about as effective as the school council."

Mr McGuffy: "Oh well I wouldn't quite say that. The UN does some excellent work - UNESCO, the Food and Agriculture Organisation, the UN Peacekeeping force..."   [...]

Claire: "What's the conference gonna be about?"

Mr McGuffy: "The world food problem. There was a real UN conference on this topic ten years ago..."

Glenroy: "Didn't solve much then, did they? Millions of people still starvin'"

Stewpot: "Yeah that's cos they ain't got no political clout to do anything about it though, ain't it"

Glenroy: "Naw man, it's because the rich countries keep them that way"

Suzanne: "The only chance a poor country's got is if it's got something we want"

Glenroy: "That's right - they got something the west wants and they'd better watch out because the west starts to mess with their government."

Mr McGuffy: "Well it's clear from what you've all said so far that you're interested in the sort of issues that will be discussed that weekend..."

Suzanne & Claire, 1984

It’s not too much of an exaggeration to say that that is a more mature political discussion than is often heard on Question Time in 2025. Interestingly, it’s not an argument between left and right as such, but between standard, humanitarian and more radical left-wing viewpoints. Needless to say, if it was presented on a TV show that’s popular with ten-year-olds nowadays, a certain demographic would be foaming at the mouth about the BBC indoctrinating the young with “wokeness.” But as a kid this sort of discussion didn’t at all mar my enjoyment of the show – naturally there’s also a lot of comedic stuff in the series about stink bombs and money-making schemes, but one of the reasons that Grange Hill remained popular (and watchable for 8-year-olds and 15-year-olds alike) for so many years was that it refused to talk down to its audience.

The way the writers tackled the obvious big themes – racism, sexism, parents getting divorced, bullying, gangs, sex education etc – is impressive despite being quite broad, especially when the younger pupils are the focus of the storyline, but what makes a bigger impression on me now is the background to it all. It’s a little sad – though true to Thatcher’s Britain – to see, throughout this period, the older pupils’ low-level fretting about unemployment and whether it’s worth being in school at all.

And maybe they were right. In 1984, when Suzanne and Stewpot were 17, a fellow Londoner who could in a parallel universe have been in the year above them at Grange Hill was the 18-year-old model Samantha Fox. That year, she was The Sun newspaper’s “Page 3 Girl of the Year.” She had debuted as a topless model for the paper aged 16, which is far more mind-boggling to a nostalgic middle-aged viewer of Grange Hill than it would have been to me at the time. Presumably, some parts of the anti-woke lobby would not mind Sam’s modelling as much as they would mind the Grange Hill kids’ political awareness, but who knows?

Sam Fox in (approximately) Grange Hill mode c.1986, not sure who took it

Naturally, the intended audience for Page 3 wasn’t primary school children – but everybody knew who Sam Fox was and in the pre-internet, 4-channel TV world of 80s Britain she had a level of fame far beyond that of any porn star 40 years later (arguments about whether or not Page 3 was porn are brain-numbingly stupid, so I won’t go there; and anyway, I don’t mean porn to be a derogatory term). Anyway, Sam (she’ll always be “Sam” to people who grew up in the 80s) and her Page 3 peers made occasional accidental appearances in the classroom, to general hilarity, when the class was spreading old newspapers on our desks to prepare for art classes. It was also pretty standard then to see the “Page Three Stunnas” (as I think The Sun put it) blowing around the playground or covering a fish supper. It wasn’t like growing up with the internet, but in its own way the 80s was an era of gratuitous nudity.

a nice Yugoslav edition of 1984 from 1984

Meanwhile, on Page Three of Nineteen Eighty-Four, Winston Smith – who is, shockingly, a few years younger than I am now – is trying to look back on his own childhood to discern whether things were always as they are now:

“But it was no use, he could not remember: nothing remained of his childhood except a series of bright-lit tableaux, occurring against no background and mostly unintelligible.” – George Orwell, Nineteen Eighty-Four (1949), p.3, New Windmill edition, 1990

By contrast, some of the roots of 2025 are plain to see in 1984, despite the revolution of the internet that happened halfway between then and now. As the opposing poles of the Grange Hill kids and The Sun demonstrate, there were tensions in British society which have never quite been resolved, but they came to some kind of semi-conclusion at the end of the Thatcher era, when ‘Political Correctness,’ the chimerical predecessor of the equally chimerical ‘Woke,’ began to work in its unpredictable (but I think mostly positive) ways.


Most obviously, Page 3 became ever more controversial and was toned down (no nipples) and then vanished from the tabloids altogether for a while (though in the 90s the appearance of “lads’ mags”, which mainstreamed soft porn, made the death of Page 3 kind of a pyrrhic victory). More complicatedly, the kind of confrontational storylines about topics like racism that happened in kids’ shows in the 80s became a little more squeamish, to the point where (for entirely understandable reasons) racist bullies on kids’ shows would rarely use actual racist language and then barely appear at all, replaced by positivity in the shape of more inclusive casting and writing. All of which became pretty quaint as soon as the internet really took off.

a very 1984-looking edition of Nineteen Eighty-Four from 1984

So, that was part of the 1984 that I remember; what Orwell would have made of any of it I don’t know. It wasn’t his nineteen eighty-four, which might have pleased him. For me, it all looks kind of extreme but also refreshingly straightforward, though I’m sure I only think so because I was a child. It’s all very Gen-X isn’t it?

notes in may

May-appropriate art: William Roberts – Bank Holiday in the Park (1923)

This is just what it says*: I intend to post something at least once a month but in lieu of any finished articles, here are various notes I made during May that never got developed into anything more substantial, some of which will probably seem mysterious later since I don’t think I’ll bother to explain the context.

*tragically, that title is a kind of pun though


“Let them eat space travel”

Publishing light-hearted articles that debate whether AI deserves “human rights,” while not covering the erosion of actual human rights because you don’t want to be ‘political’ is political.

The pressure to make politicians and news agencies use the word “genocide” to describe the Israeli government’s attacks on Gaza is understandable (because they are committing genocide, by the normal definition of the word: “the crime of intentionally destroying part or all of a national, ethnic, racial, or religious group, by killing people or by other methods” – Cambridge Dictionary), but it’s also kind of self-defeating. The focus on a word that represents their actions, rather than the actions themselves, creates an instant response which, given the nature of genocide, should be horror, but is more often dismissal; “Genocide? Oh, you’re one of those ‘Free Palestine’ people.” A more important question to ask politicians and the media is possibly: if what is being done to the people of Gaza isn’t genocide, is it therefore somehow okay? Would any non-psychopath presented with an estimated death toll as high as 62,000 Palestinian people since October 7th 2023 think “oh well, it’s a lot but at least it isn’t genocide“? The word is important for moral, legal and factual reasons, but at this point it seems as likely to distract from the reality that we are seeing every day as to really bring it home.

Arthur Wright – May Day in Town (1974)

Even if the climate emergency is allowed to escalate with no serious attempt to alter it for another decade (which would be – or, just as likely, will be – disastrous), it would still be infinitely easier to prevent the Earth from becoming an uninhabitable and Mars-like wasteland than it would be to make Mars into a habitable and Earth-like home – especially for any meaningful number of people. Those who are most determined that colonising Mars is a good idea are essentially not serious people, or at least are not serious about the future of mankind. ‘Conquering new worlds’ is just a fun, romantic and escapist idea that appeals more than looking after the world we have; it’s a typical expression of the political right’s obsession with the welfare of imaginary future people as opposed to the welfare of actual human beings who exist.

Just MIT confirming that the problem with solar energy is there’s not enough profit in it

It’s becoming ever more obvious that the UK has a media problem. For a decade now, a particular politician (don’t even want to type his name) with a consistent track record of being unpopular and not winning elections – never, so far, coming even third or fourth in a general election – has been foisted on the public to the point where he’s an inescapable presence in British culture. He’s on TV, in newspapers and online on all of the major news outlets, far more than the leader of the official opposition, let alone the leaders of the third and fourth largest parties in the country. By this point, this obsession has seriously started to shape public discourse. It’s fuelled essentially by the fact that the small group of very wealthy people in charge of the traditional media are his peers – they support him and his views because they belong to the same millionaire class and milieu. This was the group that made Brexit happen, portraying it as a movement of ‘the people’ when the real impetus for it was the fact that the EU was closing tax loopholes for the millionaire class.

We are now in the Stalinist phase of Brexit (a funny idea, since its adherents are virulently anti-communist), where the only people who have benefitted from Brexit and continue to benefit from it are that ruling class (who still don’t want to pay taxes). As ‘the people’ inevitably fall out of love with Brexit, since it’s damaged the economy, made foreign holidays more difficult and expensive and basically failed to provide any material gains, let alone a rise in the standard of living in the UK, the Brexit ideology becomes stronger and more corrosive, emotive and unhinged. Basically, the media can’t make people satisfied with having less, but it can try to make them angry, and to direct their anger.

The media’s obsession with the views of the man who has become the figurehead for Brexit distracts from the actual views of the public, but naturally it affects them too. Not surprisingly, constant positive coverage of the man and his colleagues has made him and his party more successful – but, after a decade, not much more successful, really, given how inescapable his presence has become. But every little increase in popularity is fed into the circular narrative and framed as an unstoppable rise, as if that rise wasn’t essentially being created by those reporting on it. But really, as with Brexit itself, it’s mostly about money.

All of which raises questions; firstly and most importantly, what can be done about it? The readership of even the most popular newspapers isn’t especially big now, but those newspapers are also major presences on social media and on TV. Most importantly (and quite bizarrely, when you think about it), the major TV broadcasters in the UK still look to newspapers to gauge the political zeitgeist, rather than the other way around (or rather than both TV and newspapers looking to the internet, which would be more accurate but probably not better). The obvious response is to boycott the newspapers and/or TV, but for as long as Parliament still looks to the press barons to find out the mood of the public, that can only remove the governing of the country even further from the lives and opinions of the people. A more positive answer would be to promote alternative politics through what media is available; but again, that can only work up to a point, because if politicians are still in thrall to the same old newspapers and broadcasters then Parliament becomes even more of a closed-off, cannibalistic circuit, isolated from popular opinion except when a general election comes around.

Complaining to one’s MP is probably the most sensible thing to do, but unless they happen to be one of the five MPs currently representing the media’s chosen party in Parliament, they almost certainly agree with you and don’t know what to do about it either. And yet surely it can’t be an insurmountable problem?

A less important, but more heart-breaking question is a hypothetical one: what would the country be like now if the media was obsessed with a party with a progressive political agenda (the Green Party, for instance, is actually more popular than ever and, despite a few cranks and weirdos, mostly a positive force)? What if, instead of spending a decade spreading intolerance, division, hatred, racism etc so that a few millionaire businessmen could pay less tax, they had been pushing ideas of equality and environmentalism into the culture? Money is at the heart of it all really, and this frustrating situation actually led to me taking the unusual (for me) and pointless step of writing an email to the Prime Minister that he will never have read, part of which said:

Surely one of the most effective ways to neutralise the poisonous rhetoric of the far right is not to pursue its populist talking points, but to materially improve the lives of the people of the UK? In the last general election – less than a year ago – a vote for Labour was for most people a vote for change, not for more of the same. If Britain wanted divisive rhetoric, attacking migrants and minorities, there were far more obvious people to vote for.

William Collins – May Day (1811 – 12)

So anyway; hopefully June is better.

nostalgia isn’t going to be what it was, or something like that

When I was a child there was music which was, whether you liked it or not, inescapable. I have never – and this is not a boast – deliberately or actively listened to a song by Michael Jackson, Madonna, Phil Collins, Duran Duran, Roxette, Take That, Bon Jovi, the Spice Girls… the list isn’t endless, but it is quite long. And yet I know some, or a lot, of songs by all of those artists. And those are just some of the household names. Likewise I have never deliberately listened to “A Horse With No Name” by America, “One Night in Bangkok” by Murray Head or “Would I Lie to You” by Charles & Eddie; and yet, there they are, readily accessible should I wish (I shouldn’t) to hum, whistle or sing them, or just have them play in my head, which I seemingly have little control over.

Black Lace: the unacceptable face(s) of 80s pop

And yet, since the dawn of the 21st century, major stars come and go, like Justin Bieber, or just stay, like Ed Sheeran, Lana Del Rey or Taylor Swift, without ever really entering my consciousness or troubling my ears. I have consulted with samples of “the youth” to see if it’s just me, but no: like me, there are major stars that they have mental images of, but unless they have actively been fans, they couldn’t necessarily tell you the titles of any of their songs and have little to no idea of what they actually sound like. Logical, because they were no more interested in them than I was in Dire Straits or Black Lace; but alas, I know the hits of Dire Straits and Black Lace. And the idea of ‘the Top 40 singles chart’ really has little place in their idea of popular music. Again, ignorance is nothing to be proud of and I literally don’t know what I’m missing. At least my parents could dismiss Madonna or Boy George on the basis that they didn’t like their music. It’s an especially odd situation to find myself in as my main occupation is actually writing about music; but of course, nothing except my own attitude is stopping me from finding out about these artists.

The fact is that no musician is inescapable now. Music is everywhere, and far more accessibly so than it was in the 80s or 90s – and not just new music. If I want to hear Joy Division playing live when they were still called Warsaw, or track down the records the Wu-Tang Clan sampled, or hear the different version of the Smiths’ first album produced by Troy Tate, it takes about as long to find them as it does to type those words into your phone. Back then, if you had a Walkman you could play tapes, but you had to have the tape (or CD – I know CDs are having a minor renaissance, but is there any more misbegotten, less lamented creature than the CD Walkman?). Or you could – from the 1950s onwards – carry a radio with you and listen to whatever happened to be playing at the time. I imagine fewer people listen to the radio now than they did even 30 years ago, but paradoxically, though there are probably many more – and many more specialised – radio stations now than ever, their specialisation actually feeds the escapability of pop music. Because if I want to hear r’n’b or metal or rap or techno without hearing anything else, or to hear 60s or 70s or 80s or 90s pop without having to put up with their modern-day equivalents, then that’s what I and anyone else will do. I have never wanted to hear “Concrete and Clay” by Unit 4+2 or “Agadoo” or “Come On Eileen” or “Your Woman” by White Town or (god knows) “Crocodile Shoes” by Jimmy Nail; but there was a time when hearing things I wanted to hear but didn’t own meant running the risk of being subjected to these, and many other, unwanted songs. As I write these words, “Owner of a Lonely Heart” by Yes, a song that until recently I didn’t know I knew, is playing in my head.

And so, the music library in my head is bigger and more diverse than I ever intended it to be. In a situation where there were only three or four TV channels and a handful of popular radio stations, music was a kind of lingua franca for people, especially for young people. Watching Top of the Pops on a Thursday evening, or later The Word on Friday was so standard among my age group that you could assume that most people you knew had seen what you saw; that’s a powerful, not necessarily bonding experience, but a bond of sorts, that I don’t see an equivalent for now, simply because even if everyone you know watches Netflix, there’s no reason for them to have watched the same thing at the same time as you did. It’s not worse, in some ways it’s obviously better; but it is different. Of course, personal taste back then was still personal taste, and anything not in the mainstream was obscure in a way that no music, however weird or niche, is now obscure, but that was another identity-building thing, whether one liked it or not.

Growing up in a time when this isn’t the case, and the only music kids are subjected to is the taste of their parents (admittedly, a minefield) or fragments of songs on TV ads, if they watch normal TV, or on TikTok, if they happen to use TikTok, is a vastly different thing. Taylor Swift is as inescapable a presence now as Madonna was in the 80s, but her music is almost entirely avoidable, and it seems probable that few teenagers who are entirely uninterested in her now will find her hits popping unbidden into their heads in middle age. But conversely, the kids of today are more likely to come across “Owner of a Lonely Heart” on YouTube than I would have been to hear one of the big pop hits of 1943 in the 80s.

Faye Dunaway as Bonnie Parker; a little bit 1930s, a lot 1960s

What this means for the future I don’t know; but surely its implications for pop-culture nostalgia – which has grown from its humble origins in the 60s to an all-encompassing industry – are huge. In the 60s, there was a brief fashion for all things 1920s and 30s which prefigured the waves of nostalgia that have happened ever since. But for a variety of reasons, some technical, some generational and some commercial, pop culture nostalgia is far more elaborate than ever before. We live in a time when constructs like “The 80s” and “The 90s” are well-defined, marketable eras that mean something to people who weren’t born then, in quite a different way from the 1960s version of the 1920s. Even back then, the entertainment industry could conjure bygone times with an easy shorthand; the 1960s version of the 1920s and 30s meant flappers and cloche hats and Prohibition and the Charleston, and was evoked on records like The Beatles’ “Honey Pie” and seen onstage in The Boy Friend or in the cinema in Bonnie & Clyde. But the actual music of the 20s and 30s was mostly not relatable to youngsters in the way that the actual entertainment of the 80s and 90s still is. Even if a teenager in the 60s did want to watch actual silent movies or listen to actual 20s jazz or dance bands, they would have to find some way of accessing them. In the pre-home video era that meant relying on silent movie revivals in cinemas, or finding old records and having the right equipment to play them on, since old music was then only slowly being reissued in modern formats. The modern teen who loves “the 80s” or “the 90s” is spoiled by comparison, not least because its major movie franchises like Star Wars, Indiana Jones, Ghostbusters and Jurassic Park are still around, and its major musical stars still tour or at least have videos and back catalogues that can be accessed online, often for free.

Supergrass in 1996: a little bit 60s, a lot 70s, entirely 90s

Fashion has always been cyclical, but this feels quite new (which doesn’t mean it is, though). Currently, culture feels not like a wasteland but like Eliot’s actual Waste Land, a dissonant kind of poetic collage full of meaning and detritus and feeling and substance and ephemera, but at first glance strangely shapeless. For example, in one of our current pop culture timestreams there seems to be a kind of 90s revival going on, with not only architects of Britpop like the Gallagher brothers and Blur still active, but even minor bands like Shed Seven not only touring the nostalgia circuit but actually getting in the charts. Britpop was notoriously derivative of the past, especially the 60s and 70s. And so, some teenagers and young adults (none of these things being as pervasive as they once were) are now growing up in a time when part of ‘the culture’ is a version of the culture of the 90s, which had reacted to the culture of the 80s by absorbing elements of the culture of the 60s and 70s. And while the artists of 20 or 30 years ago refuse to go away, even modern artists, from alternative rock to mainstream pop stars, make music infused with the sound of 80s synths and 90s rock and so on and on. Nothing wrong with that of course, but what do you call post-post-modernism? And what will the 2020s revival look like when it rears its head in the 2050s, assuming there is a 2050s? Something half interesting, half familiar, no doubt.

jack told him about the thing – updating children’s books

There’s a strange moment near the beginning of the 1982 Puffin Books edition of Robert Westall’s Fathom Five (1979):

Dad never talked about Life and its Meanings; only fried bread and thrushes. ‘What’s got you up so early?’
Jack told him about the thing in the water.
‘It’ll be a mandolin, floated off a sunken ship mevve…’

Puffin Books, 1982, p.35

Strange only, that is, because the hero of the book isn’t called Jack; he’s called Chas. I remember first reading this copy of Fathom Five as a child, being puzzled, then moving on. It was only some time – possibly years – later that I read the blurb on the first page, before the title page:

Fathom Five (1978)

Robert Westall wrote this book straight after his best-selling The Machine Gunners, and it features many of the same characters that appear in the earlier novel. However, when the book was first published the names were changed. In this Puffin edition the original names have been restored.

I have always uncharitably assumed that what actually happened was that Fathom Five didn’t sell as well as Westall or his publishers had hoped and was then rebranded by them as a sequel to the already successful and acclaimed The Machine Gunners in order to boost its sales, but I may be wrong. Either way, it’s apposite at the moment, because a range of children’s books are being altered, apparently for various other reasons, but really for that same commercial one.

It isn’t obvious from the generally hysterical media coverage, but re-writing or tampering with “much-loved” (and that bit is important) children’s books, ostensibly to remove any possible offensiveness, has nothing to do with being PC or (sigh, eyeroll, etc, etc) “woke” – I reluctantly use the word because currently it is the word being used to talk about this issue by every moron who’s paid to have what they would have you believe is a popular (invariably intolerant) opinion. Right-wing tabloids love “woke” because it’s a single, easy-to-spell, easy-to-say syllable that takes up much less space in a headline than “Political Correctness” used to. I think there are also people who like to use it because saying “Political Correctness” feels dry and snooty, and even the abbreviation “PC” has a certain technical, academic quality; but using “woke” allows them to feel cool and in touch with the times. It’s the same kind of frisson that high school teachers get (or did “in my day”) from using teenage slang or mild swearwords in front of the kids; and the cringe factor is about the same too. Hearing someone with a public-school accent decrying “wokeness” is so milk-curdlingly wrong that it’s almost masochistically worth hearing, just to enjoy the uniquely peculiar and relatively rare sensation of having one’s actual flesh creep.

But anyway, the editing of children’s books has nothing to do with people’s supposedly delicate sensibilities. High profile examples – significantly, there are only high profile examples – are of course tried-and-tested bestsellers like Roald Dahl’s Matilda and Charlie and the Chocolate Factory, plus various works by Enid Blyton. “Problematic” though those books might be, the new edits are to do with money, and preventing the cash cow from dying of natural causes (fashion, essentially) like Biggles or Billy Bunter did. Parents aren’t lobbying publishers to have these books edited; “woke” parents generally don’t really want their kids reading racist or offensive books at all. And every year, untold numbers of unfashionable books (like, for example, Fathom Five itself, which is great, regardless of the characters’ names) quietly slip out of print without any fuss being made. What it is, is that the books of Roald Dahl, still being adapted into films and plays, and Enid Blyton, once ubiquitous enough to still have nostalgia value, have made, and continue to make, a lot of money. Publishers naturally realise that some of what those authors wrote is now embarrassingly out of date and, rather than just printing a possibly-off-putting disclaimer at the front of the book, prefer to prevent any chance of damaging sales by seamlessly – well, it should be seamless; in the case of Matilda especially, it seems to be pretty clumsy – editing the book itself.

In an ideal-for-the-publishers-world no-one would even notice that this editing had been done, columnists wouldn’t pick up on it and the kids could go on requesting the books and the parents and schools could supply them and nobody would be upset. But this is precisely the type of trivial issue the (here we go again) “anti-woke” lobby loves. It has no major impact on society, no major impact on children and it has nothing to do with any of the big issues facing the modern world, or even just the UK. It also puts the right-wing commentator in the position they love, of being the honourable victims of modern degraded values, defending their beloved past. Plus, in this case there’s even – uniquely I think – an opportunity for them to take what can be seen as the moral high ground without people with opposing political views automatically disagreeing with them. Even I slightly agree with them. My basic feeling is that if books are to be altered and edited, it should be by, or at least with the approval of, the author. But it’s never quite that simple.

The reason I only slightly agree is that the pretended outrage is just as meaningless as the revising of the texts itself; it’s not a governmental, Stalinist act. The new editions of Matilda etc only add to the mountain of existing Matildas, they don’t actually replace it. If the racist parent prefers fully-leaded, stereotype-laden, unreconstructed imperialist nostalgia, it’s childishly simple for them to get it, without even leaving the comfort of their home. Better still, if they have the time and their love of the past stretches to more analogue pursuits, they can try browsing second hand bookshops and charity shops. It’s possible, even in the 2020s, to track down a copy of the original, 1967 pre-movie Charlie and the Chocolate Factory, or the most virulently offensive Enid Blyton books, not to mention long out-of-print goodies of the Biggles Exterminates the Foreigners type, without too much difficulty. And in many cases one could do it just as cheaply/expensively as by going into Waterstones or WH Smith and buying the latest, watered-down versions.

the big-format illustrated 1967 Charlie I knew as a child

But anyway, books, once owned, have a way of hanging around; I remember being mystified by Mel Stuart’s 1971 film Willy Wonka and the Chocolate Factory the second time I saw it, at some point in the early 80s. I had known the book well since I was very young and the first time I saw the film, at home, in black and white, I thought everybody looked and sounded wrong, especially Gene Wilder, but that was all. When I saw it again, a while later, in colour, I found to my bemusement that the Oompa-Loompas were orange.* This definitely seemed odd – but even so, it’s not exactly the kind of thing that burns away at you and so it was only this year, when the book caused its latest furore, that I discovered that, although my mother had read Charlie and the Chocolate Factory to me in the late ‘70s and then I had read it myself in the early ‘80s, the edition I knew was the large-format 1967 UK hardback edition. This had Faith Jaques’s beautifully detailed illustrations – which is where all of my impressions of the characters still come from – but more importantly, it had the original Oompa-Loompas. A pygmy tribe, “imported” from “the very deepest and darkest part of the African jungle,” they were immediately controversial in the US, where the NAACP understandably took issue with them. Roald Dahl, who presumably wanted the book to sell as well in the US as it did elsewhere, agreed with them (he may have actually seen their point too, but given his character in general I don’t think it does him too much of a disservice to assume the money was the bigger issue) and changed the book. So, no problem there, even if Dahl’s solution – making the Oompa-Loompas a race of blonde, rosy-cheeked white little people, who still live some kind of life of indentured servitude in a chocolate factory – doesn’t seem super-un-problematic when you really think about it; but it was his decision and his book. 
The orange Oompa-Loompas were a more fantastical way around the problem, and one which enhanced the almost psychedelic edge of the film.

If the intention of publishers in 2023 is to make Roald Dahl nice, they are not only wasting their time, they are killing what it is that kids like about his books in the first place. If children must still read Charlie and the Chocolate Factory – and I don’t see why they shouldn’t – they are reading a story so mean-spirited and spitefully funny – and so outdated in so many ways – that it doesn’t really bear fixing. Though it was written in the 60s, Charlie’s poverty-stricken childhood with his extended family feels like something from the pre-war era when Dahl was a non-poverty-stricken child, as does the book’s Billy Bunter-esque excitement about and fascination with chocolate. Are kids even all that rabidly excited about chocolate these days? And is a man luring kids into a chocolate factory to judge them for their sins something that can or should be made nice? I don’t think that’s an entirely frivolous point; as a child I remember Willy Wonka had the same ambiguous quality as another great figure of children’s literature, Dr Seuss’s the Cat in the Hat; which is where Mel Stuart went wrong, title-wise at least. Willy Wonka and the Chocolate Factory is all well and good, but it’s Charlie – a poor, harmless, nice kid who wants some chocolate – that’s the hero, not Wonka, a rich, mischievous adult man whose motives can only be guessed at. And in fact Gene Wilder captures that slightly dangerous quality perfectly. Almost all of Roald Dahl’s books are similarly nasty; but that’s why kids like them. Where necessary, a disclaimer of the ‘this book contains outdated prejudices and stereotypes which may cause offence’ (but hopefully less awkwardly worded) type is surely enough. And anyway, where do you stop? Sanitising Charlie and the Chocolate Factory is patronising and weakens the power of Dahl’s writing, but to sanitise The Twits would be to render the whole book pointless.

*I had a similar epiphany when as a young adult I discovered that Bagpuss was pink and not the relatively normal striped ginger cat I had assumed; the joys of growing up black & white in the colour age!

JJ Fortune’s Race Against Time series – good 80s fun

Anyway; the thing that really makes the updating of books pointless is that kids who like to read tend to read and understand. As a child in the 1980s, I had plenty of entertaining, modern-at-the-time books to read, like the Fighting Fantasy series, the novels of Leon Garfield and Robert Westall or even JJ Fortune’s slightly silly, very cinematic Race Against Time novels, but I also loved books that were much older and felt much older. I loved Capt. WE Johns’ legendary fighter pilot Biggles – especially the WW1-set Biggles books and The Boy Biggles, about the pilot’s childhood adventures in India. I loved Richmal Crompton’s William series (I wonder if William the Dictator (1938), where William and his gang decide to be Nazis, is still in print? I hope so). I loved Willard Price’s Adventure series, about American brothers travelling the world to capture animals for zoos and safari parks. I even liked boarding school stories, especially Anthony Buckeridge’s Jennings books. I also remember very fondly a book called The One-Eyed Trapper by (will look it up) John Morgan Gray (1907-1978; got to love the internet) which was about (actually, the title says it all). Years later at high school, some of the poems of Robert Frost immediately recalled to me the vivid, bracing outdoorsy atmosphere of The One-Eyed Trapper, though I don’t suppose Frost would have appreciated the comparison. I was never much of an Enid Blyton fan, but I did read a couple of her Famous Five and Secret Seven books. My favourite Blytons, though, were the series about her lesser-known, far more awkwardly-named gang of nosy children, the Five Find-Outers (presumably it was because of that awkwardness that the names of the Five Find-Outers books were slightly bland and anonymous things like The Mystery of the Burnt Cottage etc).

William the Dictator – good 1930s fun

These were books from the 1930s, 40s, 50s and 60s, that were set in those eras and written in outdated language and which, as they say, ‘reflected the values and attitudes of the time.’ Relatability is important in fiction up to a point, but it doesn’t need to be literal – children have imaginations after all. I didn’t want William in jeans or Jennings and Darbyshire without their school caps, or wearing trousers instead of shorts. I didn’t need Biggles to talk like a modern pilot (in fact the occasional glossaries of olden-days pilot talk made the books even more entertaining) or the one-eyed trapper to have two eyes and be kind to wildlife. My favourite member of the ‘Five Find-Outers’ was “Fatty”, which is probably not a name you would have given a lead character in a children’s book even in the 1980s. The idea that changing “Fatty” to something more tactful – making him thinner, or even just using his “real” first name, Frederick – would make the books more palatable or less damaging to the young readers of today is ridiculous and patronising. And possibly damaging in itself, to the books at least. Children’s books are mostly escapism, but they are also the most easily absorbed kind of education, and a story from the 1940s, set in a version of the 1940s where the kids look and speak more or less like the children of today and nobody is ever prejudiced against anyone else, doesn’t tell children anything about the actual 1940s.

I’m reminded of the recent movie adaptation of Stephen King’s IT. In the novel’s sections set in the 1950s, one of its heroes, Mike Hanlon, who is African-American, is mercilessly bullied and abused by racist teens when he’s a kid. In the movie version he’s just as bullied, but without any racist abuse. I understand why that’s being done – more explicit racism onscreen is obviously not the solution to any of the world’s problems, especially in a story which only has one substantial Black character – but at the same time, making fictional bullies and villains more egalitarian in their outlook than they were in the source material doesn’t feel like the solution to anything either. But even more to the point, there’s only so much altering you can do to a piece of writing without altering its essential character. There are many problems with the much-publicised passage in the latest edition of Roald Dahl’s Matilda where references to Kipling and so forth are replaced with references to Jane Austen etc, but the biggest one is that it just doesn’t read like Roald Dahl anymore.

All of which is to say that, whatever the rights and wrongs of it, a third party “fixing” literature (or any art form, for that matter) has its limitations. I remember reading an interview with a director of the British Board of Film Classification (Ken Penry maybe?) back in the early ‘90s, discussing John McNaughton’s notorious Henry: Portrait of a Serial Killer. He was concerned about the film – though he didn’t dismiss it as worthless trash – but his main worry was that it couldn’t be meaningfully cut to reduce its horrific elements, because it was the movie’s tone, rather than its content, that was worrisome. A few years earlier, the BBFC had unwittingly made Paul Verhoeven’s Robocop far more brutal by editing a few seconds from the scene where the giant robot ED-209 shoots an executive in a botched demonstration. In the original cut, the shooting goes on for so ludicrously long that it becomes pure black comedy; but cut down a little it becomes a lot less funny and therefore far nastier and (negating the point of the edit) more traumatic for a young audience. There is a reasonable argument that seeing someone get shot to death by a giant robot should be traumatic, but I’m pretty sure that wasn’t the BBFC’s motive in making the cuts, since the movie was rated 18 and theoretically not to be seen by children anyway.

A children’s novel (or at least a novel given to children to read) that comes under fire for mostly understandable reasons in American schools is The Adventures of Tom Sawyer. But though the casual use of ‘the N word’ is possibly removable, what would removing it achieve? What people are objecting to hopefully isn’t really just the language, it’s the era and the society that Mark Twain was writing about. How could you and why would you want to remove that context from the book? Making it into a story where African-Americans are, in the narrative, demonstrably second class citizens but no one ever refers to their status by using nasty names seems in a way more problematic than the racist language itself. Similarly, The Catcher in the Rye has been controversial for decades, but what difference would taking the offensive words out of it make? The only real solution, editing-wise for those who object to the ‘offensive’ material in the book would be to make it so that Holden Caulfield doesn’t get expelled from school, doesn’t hang around bars drinking while underage, doesn’t hire a prostitute and get threatened by her pimp, doesn’t continuously rant about everyone he meets; to make him happier in fact. Well, that’s all very nice and laudable in its way and it’s theoretically what Holden himself would want, but it’s not what JD Salinger would have wanted and whatever book came out of it wouldn’t be The Catcher in the Rye.

Fathom Five 80s rewrite – terrible cover, good book

But, since there is no Stalinist attempt to destroy the books of the past, it’s not all negative. To go back to the Fathom Five example: as a kid I found something fascinating about the phantom “Jack”, and had the internet existed at the time I probably wouldn’t have been able to resist trying to track down an un-revised edition of the book. I still might – but would it be worth it? Well, possibly; authors and artists tampering with their old work is always fascinating, but usually it’s the revised version that is less satisfying. In the preface to the 1928 edition of his then ten-year-old novel Tarr, Wyndham Lewis wrote:

turning back to [Tarr] I have always felt that as regards form it should not appear again as it stood, for it was written with extreme haste, during the first year of the War, during a period of illness and restless convalescence. Accordingly for the present edition I have throughout finished what was rough and given the narrative everywhere a greater precision.

Reading that, you already know that the 1918 text is better, and it is. Lewis was a restless reviser of his written works, but for every improvement he made – and he did make many – he lost some of the explosive quality that keeps his often over-elaborate writing alive. Like Lewis, William Wordsworth tampered with The Prelude – Growth of a Poet’s Mind throughout his life, and as with Lewis, some of the changes he made had less to do with the character of the poem than with the evolving character of the man who wrote it. The Prelude is an autobiographical work, and when Wordsworth first completed the poem in 1805, he was in his mid-30s, a successful youngish poet with some lingering radical tendencies. When he completed the final version, somewhere around 1840, he was a respected, conservative and establishment figure with very mixed feelings about his wilder youth. Both versions are equally valid in their different ways, and if the later version doesn’t really eclipse the first – and has shades of the orange Oompa-Loompa redesign about it – the reader is glad to have both. The point with these examples is that all remain available; if Wyndham Lewis had managed to destroy all the copies of the 1918 Tarr, or Wordsworth had somehow “taped over” the 1805 Prelude, the world would be a poorer place. When it comes to reworking previous triumphs (or failures), literature is no different from the other arts. Some visual artists – Leonardo da Vinci is the classic example – can never stop messing with their work, and the film industry (think of the phenomenon of the “Director’s cut”) and the music industry frequently have these moments too. In 1988, after eight years of complaining about the cheap production of their debut album, Iron Maiden finally decided to re-record its opening track, “Prowler”, with their then-current line-up and the expensive studios now available to them.
Even if original singer Paul Di’Anno hadn’t sung the song better (he did), “Prowler ’88”, oddly tired and flabby-sounding, would still be vastly inferior to the basic-but-vital original; sometimes artists just aren’t the best judges of their own work. U2’s latest venture, essentially re-recording and reworking their greatest hits, has received mixed reviews; but though one has to accept in good faith that the band thinks it was a worthwhile exercise, it’s unlikely that they have enough confidence in the new versions to replace the originals on their actual Greatest Hits from here on in.

Lord of the Rings in drafts

A similar, but reversed, version of the above has taken place with JRR Tolkien. A whole industry has been generated from his decades-long struggle with The Lord of the Rings, but the difference here is that the earlier material was only published posthumously. Tolkien himself probably wouldn’t have been hugely enamoured of the idea of the public reading about the adventures of Bingo Bolger-Baggins, “Trotter” et al, but as a fan it’s fascinating to see the slow evolution not only of the book and its characters, but of Middle-earth itself, with its re-drawn maps and growing sense of newly-uncovered history. In this case, though, Tolkien was the best judge of his work; The History of Middle-earth is vast – an even more, though very differently, epic journey than The Lord of the Rings – but the final draft has, unlike the 1928 Tarr, a sense of life and completeness missing from all of the previous drafts and half-drafts. Partly no doubt this was because – again unlike Tarr – The Lord of the Rings remained a work-in-progress and Tolkien’s main focus for many years; the characters and setting ‘grew in the telling’ (as Tolkien puts it) and reached a kind of three-dimensional quality that is missing from most epic fantasy novels, despite Tolkien’s reticence in so many areas, notably (but not only) sex.

Fiona Shaw’s superb Richard II (1995)

Alongside the concern/faux concern about “wokifying” children’s books, there’s a similar list of complaints from the usual people about the “wokifying” of TV and film adaptations of classic literature (or just literature), but here I think they are simply wrong, with nothing to redeem their wrongness. Firstly, because adaptations are always collaborations – and in a movie adaptation of, say, Barnaby Rudge, the artist isn’t Dickens, whose work is already complete, but those making the film. Adaptations are just that: they adapt, they don’t and can’t precisely transcribe from one art form into another. Early-primary-school-me thought that Gene Wilder was the wrong guy to play Willy Wonka – adult me can see that in the most important way, the spirit-of-the-text way, he’s completely right. He just doesn’t look like the illustrations I knew or sound the way I thought he should sound. I would say the same (in the capturing-the-spirit sense) about Dev Patel’s David in The Personal History of David Copperfield and Fiona Shaw’s Richard II, or the fact that Tilda Swinton could give a note-perfect performance as all the incarnations of the title character in Sally Potter’s Orlando. Colour- and/or gender-blind casting (and all the variations thereof) can give directors and performers ways of finding the real heart of a story – or just revitalising something that has grown stale through familiarity – that conventional casting might not; and unlike replacing the word ‘fat’ with ‘stout’, ‘large’ or ‘fluffy’ in a kid’s book, it keeps the work alive for a new audience, or even for an old one.

Secondly (I think I wrote ‘firstly’ way back there somewhere?), time, scholarship and cultural evolution give us a greater understanding of the context of a novel or play. It’s now clear that Britain, through the 20th century, back into Victorian and even medieval times and beyond, had a much broader ethnic and cultural mix than you might ever suspect from the country’s artistic record. And with that understanding, it becomes clear that the characters who occasionally did appear in British fiction of the 19th century and earlier – whether Jewish, Chinese, Black, gay, whatever – tend to be represented as stereotypes to stress their otherness; but in those stories that otherness has grown rather than lessened over the years, even as the real-life otherness has diminished. In addition, with the passage of time, the gradations between apparently homogenous British characters, even in relatively recent fiction, tend to blend into each other.

Nowadays, Dickens seems to many of us to be full of rich and poor characters, but for Dickens’s audience the social differences between the upper, upper-middle, middle, lower-middle, working and under-classes would seem far more marked than they do today and therefore even a caricature like Fagin in Oliver Twist would be part of a far richer tapestry of caricatures than now, when he stands out in ever more stark relief. We can’t, hopefully don’t want to and anyway shouldn’t change the novels themselves – indeed, the idea of a modern writer being tasked with toning down the character of Fagin or Shylock in The Merchant of Venice highlights how ridiculous the treatment of children’s books is, as well as the devalued position they have in the pantheon of literature. But in adapting the works for the screen, the truer a picture we can paint of the society of the time when the works were written or are set, the more accurately we can capture what contemporary audiences would have experienced and perhaps gain more of an insight into the author’s world-view too.

Thirdly, and on a more trivial level: why not make adaptations more free and imaginative, not only to give a more accurate and nuanced picture of the past, or to ‘breathe new life’ etc, but just for the joyous, creative sake of it? The source material is untouched, after all. Fairly recently, a comedian/actor I had hitherto respected complained online about the inclusion of actors of colour in an episode of Doctor Who, in which the Doctor travels back in time to London in some past era, on the grounds that it was ‘unrealistic.’ Well, if you can readily accept the time-travelling, gender-swapping Time Lord from Gallifrey and its logic-defying time/space machine, but only for as long as olden-days London is populated entirely by white people – as it probably wasn’t, from at least the Roman period onwards – then I don’t know what to tell you.

So maybe the answer is yes, change the books if you must; remove the old words and references, make them into something new and palatably bland as fashion dictates – just don’t destroy the old ones and please, always acknowledge the edits. Let the children of the future wonder about that strange note that says the book they are reading isn’t the same book it used to be, and maybe they will search out the old editions and be educated, shocked or amused in time; it’s all good. But until it happens to obscure books too, let’s not pretend the motives for ‘fixing’ them are purely humanitarian.


music of my mind (whether I like it or not)

Since the age of 13 or so, music has been an important part of my life. I have written about it for various places, including here, here, here, here and, um, here, but more than that, I listen to music that I don’t have to write about pretty much every day.

I was going to write something about my favourite songs or whatever (and may still do), but thinking about it made me tune into the music that plays in my head, almost constantly and seemingly involuntarily, as the general background to my day. Involuntarily, because when tuned into, it becomes obvious that quite a bit of it is stuff that I wouldn’t necessarily listen to at all. Trying to keep track of the music of your mind is difficult, though, because as soon as one focuses on it, one begins – that is, in my case I begin – to influence it. Even when it is music that you like and listen to by choice, it’s rarely anything that seems specific to the present moment in a movie-soundtrack kind of way – at the moment, for instance, it’s Deirdre by the Beach Boys. It’s January (cue January by Pilot – sometimes the conscious mind and/or context does influence these things), so not really a season associated with the Beach Boys, I’m not especially in a Beach Boys kind of mood, I don’t know anyone called Deirdre; but the subconscious mind has determined that that’s what we are playing right now. Playing, but also listening to; it’s peculiar when you think about it.

Though the trombone on Deirdre (which I love) prevents it from being a “cool” choice, this could of course be an opportunity to display cooler-than-thou hipsterism, but as you’ll see in the (mostly DON’T) playlist below, lack of conscious control seems to equate to lack of quality control too. With that in mind, I won’t include things that popped into my head fleetingly, like the immortal Everybody Gonfi Gon by 2 Cowboys or jingles from advertisements by Kwik Fit (or, more locally, Murisons, whatever that is/was). Not that the songs below have all appeared in their entirety – in some cases I don’t even know the whole song, in several I only know a few lines of the lyrics. So anyway, here – as comprehensively as I can make it – is what I have “heard” today, with notes where there’s anything to say and concluding thoughts at the end…

The 5th January 2023 being-playedlist – *warning* contains actual songs

Thank You for Being a Friend (Theme from the Golden Girls).

https://www.youtube.com/watch?v=HV7AXRABSng

I have no idea where this came from or why I should apparently be thinking of it, but it’s been a regular on the ‘playlist’ this week. I’ve noticed that some songs stay in rotation for a while, sometimes evolving along the way. A key feature of these kinds of songs is that the ‘voice’ your brain chooses for them, and the lyrics, might be quite different from the real ones, especially when it’s a song you don’t actually know the lyrics of. I haven’t seen The Golden Girls for decades, or heard the theme tune (I included the video without playing it), so this seems an especially odd one. But perhaps it’s an early morning thing; while writing this it occurred to me that the theme from Happy Days has been popping into my head in the shower a lot recently.

Wham! – Last Christmas

https://www.youtube.com/watch?v=E8gmARGvPlI

It feels like extremely bad taste to be subjected to one of my least favourite festive songs after Christmas, especially since I seem to have successfully avoided this one last year – but oh well; something in the Golden Girls theme apparently suggests it, since the two tend to occur together.

Frank Sinatra – Young At Heart

https://www.youtube.com/watch?v=aZRn4auk4PQ

I’ve never intentionally listened to this song, but I guess it’s part of “the culture.” But at least it’s less mysterious than the Golden Girls theme; on my early morning walk there’s a creaky gate that makes a note that somehow puts this song in my mind, though it took me a few days to realise that’s what was happening.

Magnum – Just Like an Arrow

https://www.youtube.com/watch?v=BJeLByGsOGo

I like this song – and cheesy 80s Magnum generally – a lot, but it’s another one I haven’t intentionally listened to for a long time. Maybe this is my brain’s way of telling me to revisit it?

Jim Diamond – Hi Ho Silver

https://www.youtube.com/watch?v=p6mjSAgxusM

Still stuck in the 80s, but this time in the company of a song I loathe and detest; why, brain, why? Isn’t this another one that’s TV-related in some way? John Logie Baird has a lot to answer for, clearly.

Men at Work – Down Under

https://www.youtube.com/watch?v=XfR9iY5y94s

*Still* in the 80s, but at least it’s a song I don’t dislike. I’m not sure if I’ve ever deliberately listened to this song (you didn’t need to “back in the day”, you heard it everywhere) but it’s been a regular visitor to my brain for many years. There was even a harrowing few weeks (or months – it seemed like a long time) when it formed a weird medley in my mind with Paul Simon’s You Can Call Me Al (one of the few of his songs I actually dislike). Except that You Can Call Me Al had different lyrics at various points. I remember that the flute (recorder?) part of Down Under came in just after the last line of the chorus. Since that time, whenever I’ve heard that song I’ve been half-surprised that the segue doesn’t happen.

The Supremes – Baby Love

https://www.youtube.com/watch?v=ZAWSiWtUK2s

At least most of these are cheerful songs, I guess? This one always makes me think of that objectively quite strange scene in Quentin Tarantino’s (in my opinion) best movie by miles, Jackie Brown.

Mull Historical Society – Barcode Bypass

https://www.youtube.com/watch?v=StWYuUbl4M8

Oh well, they can’t all be cheerful. I’m guessing the opening line “let me get my gloves/and walk the dogs for miles” has something to do with the inclusion of this one. I like it, but its weary melancholy is not at all the mood of most of these.

Slayer – Raining Blood

https://www.youtube.com/watch?v=Gy3BOmvLf2w

Why not, I suppose?

King Crimson – Fallen Angel

https://www.youtube.com/watch?v=eLlmbCkb3As

Mysterious: I like bits and pieces of King Crimson, but I’m surprised to find that I know this song at all, since I don’t even own the album it’s on (Red, 1974) or any compilations etc. I wonder how I know it? I had to look it up from a fragment of lyric that I knew, but sure enough, it’s Fallen Angel. I thought the only song of that title that I knew was the arguably superior one by Poison, but that’s an argument for another day.

Souls of Mischief – ’93 ‘Til Infinity

https://www.youtube.com/watch?v=fXJc2NYwHjw

What this has to do with anything is anyone’s guess; I like it, it’s a classic and all, but I think I heard an alarm of some kind in the distance that somehow morphed into that noise in the background during the “Dial the seven digits” bit. But more importantly, is Tajai really wearing a cricket jumper? And if so, how come he looks cool doing so?

Which brings us up to date, and to Deirdre: but what other wonders lie ahead?

The Beach Boys – Deirdre

https://www.youtube.com/watch?v=IsDYy1l6TQU

Conclusion: Hm. I don’t know: maybe the subconscious mind is almost a separate entity, with different and broader tastes than its conscious host? Or it has a masochistic streak? Or absorbing decades of unwanted stimuli from pop culture means that there has to be a continual processing (with some regrettable but hopefully harmless leakage) in order to function in any kind of normal, rational way, like an overspilling of the dream state into the waking one? Or maybe the brain is constantly making observations and connections that are necessary for its normal functioning (things like intuition and mood) but which the conscious mind has little or no access to, except in this oblique way. A lot of this stuff is from the 80s, when I was growing up and absorbing knowledge etc; whatever – being human is strange sometimes. Hope you’re enjoying whatever your brain is treating you to today!