confessions of a godless heathen

Percy Bysshe Shelley, 1819, by Amelia Curran

Ignore the sensationalist headline; there are no confessions here, and I’m not a heathen, I’m an atheist. When I was a teenage atheist, one of my main issues with the idea of god had been neatly summed up well over a century earlier by Shelley in The Necessity of Atheism (1811):

If God wishes to be known, cherished, thanked, why does he not show himself under his favourable features to all these intelligent beings by whom he wishes to be loved and adored? Why not manifest himself to the whole earth in an unequivocal manner, much more capable of convincing us than these private revelations which seem to accuse the Divinity of an annoying partiality for some of his creatures? The all-powerful, should he not have more convincing means by which to show man than these ridiculous metamorphoses, these pretended incarnations, which are attested by writers so little in agreement among themselves?

As an adult atheist I still think that, but I think a lot of other things too. I should possibly point out here that though I don’t believe in any deities, the god I primarily didn’t and don’t believe in was the Christian one, simply because that’s the one who most prominently didn’t and doesn’t exist in my own personal experience. My lack of any kind of religious belief is something I’ve given a lot of thought to over the years and mentioned many times in passing on this website. I’ve never written specifically about it, but several things I’ve recently come across made me want to. One is the slightly dubious, clickbaity claim that (as one headline put it) “God is back” and that Gen Z (or some such amorphous group) is embracing the Catholic church. I’m sure that to some extent that’s true, as the Catholic church is just as evident as always – the choosing of a new Pope makes the TV news, etc. – but it’s also true that there have been other, substantially similar news stories about Gen Z embracing astrology and conspiracy theories and feminism and anti-feminism and fretting about world war three. None of those things are mutually exclusive of course (most of them should be; maybe feminism & anti-feminism actually are), and what it seems to add up to is that kind of end-times malaise normally associated with the end of a century or millennium.

I feel like it’s necessary to take those kinds of stories with a pinch of salt though, simply because over the years I’ve read all kinds of similar stories about Gen X which occasionally apply to me and often don’t, but in either case I’ve never been asked my opinion in order to gauge it, and neither, I presume, have most people. And since every generation seems to spawn its own Nazis, centrists, communists and anti-fascists and everything in between, its philanthropists, misanthropes and bystanders, its religious zealots, libertines and atheists (etc, etc, ad nauseam), it seems fair to assume that any theory about a generation, just like any theory about a gender, race or sexuality, is going to involve the kinds of generalisations which, once really examined, make the whole theory redundant. Presumably, church attendances are on the rise, but does that mean that belief is on the rise, or just that the desire for belief – quite a different thing – is? Or both? Who knows.

Alongside that, not coincidentally, more and more (inevitably right wing) politicians have been yammering on, at first in the USA and now here, about “Judeo-Christian” values. It seems that this is mostly because they don’t like foreigners and Islam, and are immune to irony. Because insisting on the values of two ancient foreign religions from what we now in the West call the Middle East, while denying the very similar values of another, very similar (though not quite as ancient) religion also from what we now call the Middle East, does seem ironic, especially when one is tying it in with one’s national identity. There’s been a growing rhetoric (again, on the right) that suggests that Christians are becoming an oppressed minority in the UK, which is both tiresome and laughable but nicely (and again not coincidentally) complements the growth of a men’s rights movement that claims feminism (which, like atheism, has arguably only recently begun to have a fairly minor influence, if any, on the power structures underlying British society) has ‘gone too far’ and all that fun stuff.

Although my attitude has changed over the years, I don’t think my views really have. I genuinely think that it’s terrible and damaging that all over the world people are punished or ostracised or oppressed or killed or made to feel bad about themselves for offending arbitrary rules established in the name of imaginary beings. And in a way worse is the idea that there are omniscient, omnipotent beings who would be offended by actions which they must have foreseen at the moment of creation but decided to allow anyway, in order to punish the people who commit them.

That kind of thing seems to be the basis of a lot of atheist polemic. Sometimes I find it entertaining and (depending on the writer) interesting, but, even while still believing every word of it, and feeling that it’s worth insisting on if asked about my views, as a middle aged atheist I wonder about the usefulness of saying it polemically at all. Because – for me at least – the opposite of religious faith isn’t science and logic (though I do believe in those), it’s simply non-faith. And I’m not sure there’s much to learn from that.

It’s not an argument that strengthens any cause, let alone mine, but I have come to think that lack of belief in a god or gods is just as instinctive, reflexive and fundamental as faith in them is. My mother was a Christian in her youth (in an atheist household, oddly for the 1950s) to the point where she considered becoming a nun. During her life she wavered from various kinds of Christianity to Taoism and Buddhism and a kind of vague paganism, but – and I think this is the most important point – although she lost her faith in many belief systems over the years, she never lost her essential faith in some kind of benevolent god or spirit at the heart of creation. For me it’s almost the opposite.

I have always been very interested in religions from Animism to Zoroastrianism in the exact same way that I’ve always been interested in mythology (I don’t really distinguish between the two) and I find pretty much all religions to some degree fascinating. I love churches and places of worship, I love the atmosphere of ‘holy’ places (even pre-historic places we now assume were once sacred) and I love the imagery and paraphernalia of religions, in the exact same way I love art and history. But it’s good that I’ve never wanted to belong to a faith or to become involved with those mythologies, because I can’t remember a time when I ever believed in even the possibility that a deity of any kind was an actual, real thing. Santa Claus either for that matter, although presumably at some pre-remembered point I did believe in him (Him?).

I have no idea where my lack of faith came from but I can pinpoint when I first became aware of it. I went to three ordinary Scottish primary schools, which in the 1980s meant reciting the Lord’s Prayer every morning before class started. Not surprisingly, I still remember most of it, though mysteriously I can’t work out which bit I thought in my childhood mentioned snot; I was quite deaf then, but I definitely remember a snot reference, which always seemed odd. In my memory that daily recital was just part of a greater daily ritual which also involved (in the early years) chanting the alphabet and (through all of primary school) greeting the teacher in monotone unison – the phonetic version of Mrs expresses it more accurately – “GOOD MOR-NING MI-SIZ WAT-SON”, or whoever the teacher happened to be; seemingly there were no male primary school teachers in my day.

I have surprisingly sharp memories of looking round the class during the morning prayer to see who else didn’t have their eyes closed – there were usually a few of us, and sometimes we would try to make each other laugh – but a key part of that memory for me is the sureness of the feeling that I wasn’t talking to anybody. The praying itself wasn’t something I questioned or minded – if anything I quite liked it. It didn’t feel at all ‘bad’ or rebellious not to believe, it just never occurred to me at any point that god was real and might be listening, any more than I remember feeling that the notes put up the chimney to Santa would be read by an old man with a red suit and white beard, or that the carrot for Rudolph would be eaten by an actual reindeer.

At school we went to church (I think) three times a year – at Christmas, Easter and (an anomaly) Harvest Festival – and so folk horror-ish paraphernalia like corn dollies are always associated with church in my mind. The sermons were boring, as were some of the hymns, although others, the ones where the kids invariably sang the wrong lyrics, were fun – but I liked (and like) churches. I liked the musty, chilly smell and the wooden pews and the acoustics and the stained glass windows and especially the holiday feeling of being at school but not at school. And, though they only came into school life at these times of year, I liked the Bible stories too. It seems funny now, but until well into adulthood the image that the word ‘Palestine’ summoned in my mind was an illustration of Jesus wandering around in pink and turquoise robes; I presume it’s from some forgotten book of Bible stories. But to me, stories – sometimes good ones (in the case of the early days of Moses and the last days of Jesus, very good ones), sometimes boring ones – are all that they were.
But where does lack of belief come from? The same place, presumably, as belief.

Bowie in 1976 by Michael Marks

In Word on a Wing (1976), one of my favourite David Bowie songs – also I think one of his most deeply felt and certainly one of his most open and revealing songs – Bowie, then in LA and in the middle of a drug-fuelled existential crisis but soon to withdraw to Berlin to live a relatively austere and private life, sings:
Just because I believe
Don’t mean I don’t think as well
Don’t have to question everything in heaven or hell

 

For me, that sums up (non-blind) faith perfectly. Essentially, it’s what Keats (those romantics again!) summarised as ‘negative capability’ (“Negative Capability, that is, when a man is capable of being in uncertainties, mysteries, doubts, without any irritable reaching after fact and reason” – from an 1817 letter to his brothers) but applied to one of the most fundamental human impulses. I completely respect it and see what both Keats and Bowie mean by it, but it’s completely alien to me. Well, not completely: I don’t need to know how a jet engine works to travel by plane, I do indeed have ‘faith’ in it, but what the (nowadays many) commentators who characterise scientific belief as a kind of religious faith seem to overlook is that I don’t believe it because a scientist says it’s true, but because I can actually travel on a jet plane, and even before I did travel on a jet plane I could see that other people travelled on jet planes, that planes really do fly and engines really do work. Which seems like the build-up to some kind of New Atheism gotcha of the ‘if God is real why doesn’t he just prove it’ type popular in the 2000s (essentially a more sneery version of the Shelley quote), but that’s not really me either. Although I am definitely an actual ‘speculative atheist’ and I suppose even an ‘atheist fundamentalist’, and though I genuinely do believe that the world and humanity would be better off without religion, I’m just not sure how much better off it would be.

It’s not that the New Atheists were wrong (or even new, thinking again of Shelley). Most of the arguments that were raised against them are easily picked apart. The idea that there is no morality without religion is so obviously wrong that it seems pointless even to argue against it. The same basics of morality (murder and stealing and cheating and lying are bad, treat people as you wish to be treated etc) are and have been all but universal, though not without different nuances, throughout history and throughout world cultures.
But the problem with lack of faith as certainty (and for myself I really am certain about it) is that its arguments, though more logical – at least up to a point, as we shall see – have precisely as much effect on the certainty of faith as the arguments of faith have on the certainty of non-faith. Logic is no help here.

From my point of view, in the certain absence of a god or gods, religion is purely human and therefore many of the (in themselves solid) arguments against it are kind of a cop-out. It’s not unreasonable to find it laughable that a supreme supernatural being should care what food you eat on which days, or what you wear or how you like to have your hair. It seems bizarre that an almighty creator who could presumably do whatever it liked would take the time to tell humans which crops they prefer to have planted where, or that male masturbation is bad, rather than simply preventing the possibility of rule-breaking ‘at source’. But the omnipresent invisible elephant in the room is that whether or not a god really felt or feels strongly about these things, whether or not a god had them written down in words, they really were written down in words, by human beings, some of whom definitely did want these rules to exist and to be enforced. And it’s human beings that still enforce them. Also, it’s just as true that primarily secular or entirely secular societies also have rules and customs regarding things like clothing, food, hairstyles and even names, although they rarely come with threats of severe retribution and never with the threat of ongoing retribution after death. And yes, many of these customs – like the acceptable length of women’s skirts in western society – ultimately derive from religious directives, but any authoritarian society, not only theocracies or weird, nominally religious ones like Nazi Germany, but even states where religion is completely anathema like Stalinist Russia, Communist East Germany or North Korea under its current regime, is hardly relaxed about the individual’s freedom of expression.

Religious wars and religious persecution are bad, not because they are religious per se, but because wars and persecution are bad. Wars and persecution may often be provoked by religion, but surely if like me you don’t believe in god, then blaming that non-existent creature for religious wars is just euphemistic buck-passing bullshit? The Crusades were horrific, bloody and unjustifiable, but to blame “Christianity” for them, rather than Christians, that is, actual European human beings, is like blaming, or giving credit to, Tengri for Genghis Khan’s conquest of vast tracts of Asia, or suggesting that Jupiter, Neptune and co enabled the Romans to found their empire. “Catholicism” didn’t create the Spanish Inquisition any more than the concept of Nazism created the Holocaust or Islam as a belief system resulted in 9/11 or the Taliban. Left to themselves, religions, ideologies and philosophies don’t do anything; they just sit there. And they all have one common denominator, and it’s not a deity.

This morning, I saw that the Pope had made a statement that some policy or other of the current US administration is “un-Christian and un-American.” Well. I am glad to see anyone with any kind of authority challenging inhumane, intolerant and fascistic regimes. But those actions are only un-Christian insofar as Christ himself wouldn’t like them, according to the Bible. But Christ was one single man-god who acted a certain way and said certain things. All manner of atrocities are entirely in keeping with the actions of two millennia of Christians. As for un-American, again, the acts the Pope condemns are not compatible with the statements made by the founding fathers of the United States of America; but they are probably no worse than the actions carried out by those same founding fathers in their lives, or many of the successive governments of the USA. Or indeed many, many other governments in the world. And, to be all New Atheism about it, when it comes to the welfare of children for instance, it’s not like the Catholic church itself has an impressive record. Does that mean the Pope shouldn’t condemn things or that American people shouldn’t try to hold their government to account using the egalitarian rules set down when the country was founded? Of course not; but invoking some kind of imaginary, ideal standard of behaviour really shouldn’t be necessary to do so. There’s human decency, after all.

Another (non-conclusive, because none of them are) argument for the human, rather than divine, nature of religion is that the religions that have survived the longest and strongest in the modern world are those which are most compatible with it. The paternalistic, to varying degrees misogynistic Abrahamic religions all defer their ultimate spiritual rewards (but more on the non-ultimate ones later) until after death. They have no in-built expectation of much material happiness or contentment on this plane of existence and to varying extents they actually value hardship, while prioritising men within the earthly realm. Well, the paths that led us to 21st century culture, especially imperialism and capitalism, are fine with all that. Work and strive now, happiness comes later, unless you are one of the privileged few. Communism in theory isn’t fine with that, but naturally, having been formulated during the Industrial Revolution, when the vast mass of people were already oppressed by a tiny ruling class (itself a mirror image of the earlier rule of Church & monarchical elite vs peasant majority), it is defined by its opposition to capitalism. Early Communism therefore took hardship as a given (there is no proletariat without it) and, in lieu of heaven, deferred the payoff of universal prosperity and equality to some future time when the world revolution would have been achieved and all opposition to itself removed. It’s a cliché to say that communism is itself a kind of religion, but the parallels are unignorably consistent; trust the leaders, put up with the shit now, eventually if we’re true to our cause it’ll all work out, if those heretics don’t spoil it.

On the other hand, various older kinds of religions, animism and ‘earth mother’ paganism and so on, value (quite logically) the need to look after the world we live in. It’s not that the religions of the book explicitly say not to, but they aren’t primarily concerned with this world – and imperialism and capitalism and even communism, which have other uses for the material world than care and stewardship, have historically all been fine with that. It’s somehow not very surprising that the aspects of non-Christian religions that became most taboo during the age of imperialism, and therefore attributed to “savage” or primitive cultures – human sacrifice, cannibalism, idol worship and so on – should be parts of Christianity itself. Without human sacrifice, even if it’s only the sacrifice of one special token human, there is no Christianity. The divinity of Christ kind of goes without saying – that’s what makes it a religion. But his humanity is what makes him more than just the Old Testament god. And insisting on his humanity inevitably made the eating of his flesh and drinking of his blood controversial. But seriously, whether someone believes they are literally eating the flesh and drinking the blood of an actual human being or only symbolically doing so, it’s a cannibalistic ritual just as atavistic and visceral as any of the imagined horrors that the Christians of the crusading period or the Europeans who spread their faith across the world believed they had encountered. It doesn’t seem too fanciful to say that what really horrified those Christians was the discovery that things they saw as fundamental to their own civilisation might be just as fundamental to civilisations that they had to believe were inherently inferior in order to destroy them.

Monkey (1979) – Buddhism & Taoism that was fun for kids

The fact that there are analogous stories to those in the Bible throughout history and world cultures (death, rebirth, sacrifice, enlightenment) suggests that whether or not one has any faith in these stories, they aren’t ‘just’ stories. In fact, a lesson that stayed with me (because it suits my personality I suppose) from the 70s TV show Monkey, based on Wu Cheng’en’s 16th century novel Journey to the West – something like “winners make losers, therefore remove yourself from competitions” – purports to be from a Taoist religious text. Eating the fruit of the tree of knowledge (I like to think a banana) and paying the unexpected price for it is, even as a mythological story, one that has real life analogies all through human history. I remember as a child when plastic Coca-Cola bottles began to replace glass ones. It seemed futuristic and in a weird way utopian – lightweight like a can but resealable, far less risk to your drink if you dropped it than a glass bottle; less broken glass in the streets and parks. Whether or not scientists were already concerned with the problem of plastic’s lifespan or the sheer accumulation of it I don’t know, but kids weren’t, for a few years at least. Which has nothing to do with religion – but the attempt to do good turning out not only to be bad, but to be something that has to be dealt with and paid for down the generations, is hardly an alien one. And in this case it was made worse not by religion, but by the inability or unwillingness of people under capitalism (myself included) to distinguish between convenience in the sense of people not having to waste half of their lives in drudgery and convenience in the sense of not having to get up to change TV channels. There’s probably a parable in there somewhere.

A favourite anti-atheist argument is the ‘intelligent design’/watchmaker one. It’s clearly an empty argument, but my counter arguments would only be convincing to an atheist – and not even to all atheists. The argument, put simplistically, that because a watch, or a computer, or anything human-made and complex didn’t just evolve on its own, but had to be consciously invented, therefore means that life, earth and everything else must have been consciously invented too requires an obvious leap of logic. The universe is not a machine, life is not the same as battery life.

The most complex things in our world seem to be human beings, and human beings also produce other human beings, often with no conscious thought and rarely with any kind of design at all. People are accidentally invented all the time. The idea that creation is accidental or ‘just happens’ is hardly a difficult one to grasp. The people that people produce are every bit as complex as their parents and grandparents, but only occasionally, and in the most superficial way, are they designed. Worse than that, logically, we know how humans are created, but even so it’s hardly unusual for them to be produced even when the people doing it very much desire not to do so. To look at the way that the most complex creations on earth are usually made and to label it “intelligent design” would be a strange thing to do, since it doesn’t necessarily include much intelligence or any actual design. Of course that doesn’t prove that things weren’t originally designed, but the gulf between organic living things and intelligently designed things as we experience them, even at the beginning of the AI age, is so vast that you might as well argue that a cat must have designed clouds because you once saw a cloud that was cat-shaped.

As mentioned in passing before, it’s popular among a certain kind of (usually, but not exclusively, right-wing American) Christian to compare ‘faith’ in science to faith in god, which is a false equivalence, for the jet plane kind of reasons mentioned above – but although I do believe science to be superior in every way to religion – because it learns from experience, for one thing – I do sometimes wonder whether it suffers from being (this sounds very different from how I mean it) homocentric (is ‘anthropocentric’ better? It sounds worse) in a similar kind of way. I remember learning (in a very basic way) about the big bang at school and asking the teacher, not unreasonably I think, *what* was supposed to have exploded and where that came from, and being told “that isn’t a scientifically interesting question.” Well, quite possibly all the teacher meant is that at the current time any answer to that question must be pure speculation of a non-mathematical kind, but teen-me felt that it was basically “science works in mysterious ways” and he/I didn’t like that.

Somewhere in this article I had been going to say that Shakespeare was as right as anybody when he wrote “Nothing will come of nothing”, but now that I’ve reached this point I wonder whether being creatures that are born, who come from somewhere, who live for a while, who are subject to time and then who die and stop existing (or go somewhere else) shapes our understanding of everything. I do believe in the big bang because the evidence around us confirms its likelihood. The universe started, it expanded and at some point it will end. The idea of something that just is, forever, or that exists outside of time, whatever that would mean, seems as incomprehensible as non-existence does. That things, including human beings, do stop existing is in one way obvious – but things breaking down, decomposing, changing from one form to another and (romantically) melding with the universe or (prosaically) enriching the soil or whatever is a process that is understandable. The personality and individual human consciousness switching off and simply not existing is the hard part to take in. As far as we can tell this isn’t a change in energy type; the electrical impulses that are us don’t seem to go anywhere or do anything. But maybe that whole frame of reference – beginning, middle, end – isn’t everything, it’s just the limits of human understanding. Which doesn’t, to me, imply the existence of any kind of creator or supreme being, just that there’s scope there for whatever you care to imagine but which you can never truly know. Keats would be fine with that.

Similarly, to apply logic to the existence of god will always be self-defeating, because logic is (as far as we know) a specifically human way of explaining the universe to its/our own satisfaction. The laws of physics and nature and mathematics do seem to work according to logic, which is very helpful for teaching and learning and science, but human beings themselves routinely defy logic in both profound and trivial ways. Many of the things that humans value most highly are completely resistant to logic, like art and god and love and money. Even something as humble as sports: one human being being able to run faster than another or play a game better than another is only dubiously something to celebrate, and if it is, then logically one might expect people to support only the best teams and athletes. If, alternatively, it’s to do with identification with and loyalty to one’s own area, then fans might only be expected to support teams or athletes from the same geographical location as themselves, which is occasionally how it works, but just as often isn’t. There’s nothing especially logical about the enjoyment of a race or a game in which you aren’t involved for its own sake. Does that mean that logic is a faulty way of understanding the universe? I don’t know; but it is a faulty way of understanding human beings. The idea that god’s existence is a logical reality in a 2 x 2 = 4 way makes about as much sense as the position of the planets at the time of your birth dictating your future.

As Bowie implied, faith needn’t – and in many cases I’m sure doesn’t – preclude seriously considering the implications of one’s belief. But sometimes it does. I’ve never wanted to believe (I don’t really get why anyone would, if they don’t; which is my deficiency), but as an adult I have always wanted to understand people who do. And in general, I find it frustrating to try to do so, as two different but very similar anecdotes about my encounters with people of faith illustrate. I am aware though that these may say more about me than they do about the believers.

In my professional capacity I was once interviewing a prominent American black metal musician whose latest album went on about blasphemy a lot. Given that black metal encompasses everything from orthodox satanists to heathens and pagans and occultists and chaos magicians and nihilists, I asked what I thought was a reasonable question: what meaning does blasphemy have unless one believes in god? Doesn’t the concept of blasphemy essentially reinforce the religion it attacks by affording it some kind of legitimacy?* The musician’s response was the black metal version of “these go up to eleven”. I think what he actually said was “Everyone knows what blasphemy is.” And he was right I suppose, but he was also characterising his band as purveyors of simple shock and outrage to the very few people who are still shocked and outraged by blasphemy. Ho hum.

The archetypal image of black metal, Nattefrost of Carpathian Forest, photo by Peter Beste

*this made me think of an occasion in high school where I muttered “oh for god’s sake” or something like that and my maths teacher said “don’t blaspheme, William!” and I replied “it would only be blasphemy if god existed” and was given a punishment (lines). It was only years later that I realised I deserved the punishment, not because of god, but because I was being a smart arse to a teacher – at the time I just felt righteously angry about the lines.

Likewise, a visit from some very pleasant Jehovah’s Witnesses left me with unexpected admiration for them, but also some frustration; they also left prematurely, which my younger self would have regarded as a victory. The respect was for their answer to the kind of question that seems like a typical smart-arse one, but I was genuinely curious. If there are only 144,000 places in heaven in your religion (I had only recently learned that strange fact) and those are all spoken for already, why are you knocking on people’s doors trying to spread the word about your faith? I hadn’t expected their response, which was something like “Oh, we don’t expect to see heaven. Heaven is for god and the saints and angels; Earth is the paradise that god made for humans, it just needs to be fixed.” A version of Christianity that withholds the promise of paradise even after death was weird to me, but also impressive. Having a faith where you never expect to attain the best bit seems coolly ascetic, but also kind of servile, which it literally is. The fact that servility seems distasteful to me is I suppose my weakness, not theirs.

I was less impressed with the response to what I felt and still feel is a serious question and not just a cynical gotcha: if god is all you say it is, all-powerful, blah blah, then why create evil? There was a stock answer ready, which was to do with free will and choice, but even though there are holes to be picked in that too (the ‘free will’ of transgressors has nothing to do with the free will of their victims – what about their will?), that wasn’t what I meant. What I was asking is: if you can do whatever you like, can see everything that has ever happened and everything that will ever happen, if you are capable, presumably, of endless satisfaction and happiness, why create ‘bad’ – or, more personally perhaps, why create even the concept of ‘things you don’t like’ at all? To that question, I got the Jehovah’s Witness version of “these go up to eleven” and a quick goodbye. But I genuinely wasn’t trying to catch them out; I really wanted to know what they thought about it, but apparently they didn’t think anything. Having said that, I can see now that I write about it, that interrogating your belief system for the benefit of a stranger who obviously isn’t going to be persuaded to join you is probably not all that attractive. Still, I didn’t knock on their door.

Guy Pearce as Peter Weyland in Ridley Scott’s Prometheus (2012) – something to aspire to?

So much of religion seems to me to be saying that, whatever the wonders and horrors and joys and pains of life, it’s not enough and they want more. But again, that’s not exclusive to religious people. I recently saw an unsettling but also unintentionally funny video in which the PA of a shadowy, influential and incredibly wealthy figure was talking about transhumanism and his master’s ultimate Roy Batty/Weyland-from-Prometheus plan not to die at all. Which feels very sci-fi, but also very late Roman Empire. At the same time, my generation grew up with the rumour that Walt Disney’s head is in a refrigerator, awaiting resurrection when medical science catches up. Rebirth and resurrection; there really is nothing new in human history.

detail of the crucifixion from the Isenheim altarpiece (1512 – 6) by Matthias Grünewald

All a bit bleak, maybe; but if religion only offered oppression, judgement, condemnation and war then far fewer people would devote their lives to it. And if the negative aspects of religion all exist independently of religion, then so do the positive aspects, and without the same arbitrary punishment/reward structure underlying it.

Religion offers comfort to people in distress, it offers a sense of community and belonging, it offers contact to people who feel isolated. It offers various kinds of love. I can’t think of many artworks more moving than Matthias Grünewald’s crucifixion from the Isenheim altarpiece (1512-6), painted to comfort people who were suffering from skin diseases by showing them the scourged Christ’s suffering, which mirrored their own. But just as the Quran didn’t issue a fatwa against Salman Rushdie and the Bible didn’t take babies from unmarried mothers and kill them and bury them in the grounds of institutions, neither do those books feed the poor, embrace the lonely, paint pictures or create a sense of community. Human beings do those things, and they do them regardless of religion. They do it in societies where religious beliefs aren’t based on the Judeo-Christian tradition and they do it in societies where religious beliefs are actively frowned on. After the dissolution of the USSR, few people were nostalgic about the food queues or the secret police, but many were nostalgic about the sense of community that came from masses of people being in the same situation together. And now that capitalism – which, unlikely though it seems, is not always so far removed from Soviet communism – has created its own underclass and hierarchical power structure and pogroms and whatnot, people have also created their own communities, support groups, charities and friendships.

The one positive thing that faith offers that non-faith of my kind doesn’t is a personal relationship with god – and that’s where we came in; you either believe or you don’t. I can completely understand that having a direct line to someone who knows you and understands you better than you know yourself, who accepts and forgives you, could be nice and comforting. Maybe in pre-Christian or non-monotheistic societies that voice was the voice of the ancestors or the spirits of the trees and rivers. I can see how that would be nice too, but for myself I can’t imagine having such a thing or longing for it or even wanting it. For me, you either disbelieve or you don’t.

And maybe that’s really the strongest argument, not against faith, which there is no argument against, but against religions as institutions, as rules and directives of the kind that people are so keen to re-establish. Because if there’s one thing you can see, looking not just at the diversity of religions but at the diversity of beliefs within them, at the different ways that people relate to and communicate with their gods, it’s that god is just as personal and individual as any of its believers and disbelievers and so making an orthodoxy of it can only ever harm more people than it helps.

the end of all songs

A question occurred to me while watching a documentary about Joy Division: is there any better ending to a song than Ian Curtis bellowing FEELING FEELING FEELING FEELING FEELING FEELING FEEEEEELING! as the music clatters to a halt at the end of “Disorder”? Lyrically, despite its explosiveness, it isn’t cathartic, but in a musical way it is – for the listener at least – because until that point, the tempo has been too fast and the lyrics too complex for Curtis’s voice to do whatever the deep, melancholy equivalent of ‘taking flight’ is. There’s an underappreciated art to ending songs and it’s not something that even great bands do infallibly or that all great songs feature. Not all songs need to end with a crescendo or flourish, and very few songs benefit from just grinding to a halt or being cut off mid-flow, but the sense of completeness when a song (especially a relatively short song) ends perfectly is one of the things that makes you want to hear it again.

Ian Curtis in 1979 by Kevin Cummins

“Decades,” the final song on Closer, the final Joy Division album, is one of relatively few songs (given their vast number) where fading out at the end doesn’t seem like a cop out. There’s nothing wrong with fading out a song, but often it just feels like an easy option taken in order to dodge the question of how to end a song properly. Which is fine, except in live performances, where it’s difficult to satisfactorily replicate a fade-out. Partly that’s because of the practicality of it – does the band all try to play more quietly? Do they just get the sound person to turn down the volume? That works, unless you’re close to the stage, where hearing the unamplified sounds coming from it (drums clattering, guitars plinking etc) is kind of a mood-killer. And if so, when do they all stop? There’s also the awkwardness of the audience reaction; the crowd might start cheering/jeering before the song is actually finished, or they might not start until someone in the band indicates that the song is definitely over, which is also not ideal. Basically, it feels artificial – but obviously it has the appeal of being simple – haven’t thought of a proper ending for your song? Just keep playing and fade it out afterwards. But Closer needed to fade into silence and it does.

Another musical ending this week – a seriously clunky segue this but bear with me – was the death of Ozzy Osbourne, a week after what was explicitly intended to be his final performance, a different kind of ending and a very unusual one in the music world where ‘farewell’ tours can become an annual occurrence and no split is too acrimonious to be healed by the prospect of bigger and bigger sums of money.

Ozzy Osbourne in 1974 by Mick Rock

On paper, any kinship between Ozzy and Joy Division seems unlikely to say the least, but the ears say otherwise. Regardless of the punk roots of Joy Division, the only real precursor to a song like “New Dawn Fades” from their 1979 debut album Unknown Pleasures is Black Sabbath. And it’s not only the oppressively doom-laden atmosphere, though that’s important; Bernard Sumner’s opening guitar melody is remarkably like Tony Iommi’s melodic solo from “War Pigs” – a classic song, incidentally, which has one of the worst endings of any great song ever written. Presumably, Black Sabbath had no idea how to end it and so did something worse than a fade-out: speeding it up until it ends with a comical squeak. Oh well. But anyway, there are many moments, especially on Unknown Pleasures, where Joy Division sound like a cross between Black Sabbath and the Doors, although I’m sure neither of those things was in the minds of Peter Hook, Bernard Sumner, Stephen Morris and Ian Curtis, any more than they were in the consciousness of the music journalists who lauded the band in ’79, who mostly tended to see punk as year zero, the new beginning from which the influence of anything pretentious or overblown had been erased.

That basic idea was one I also accepted without much thought as a teenage indie fan in the early 90s when Joy Division – by then defunct for a decade – became one of my favourite bands. With the honourable, weekly music paper-approved exception of the Velvet Underground, I was dubious about anything old or anything that I considered overtly commercial. Without giving it much thought I just assumed that mentality came from my reading of Melody Maker and the NME. I had definitely accepted their pre-Britpop genealogy of cool rock music that essentially began with the Velvet Underground and then continued via punk and post-punk into 80s indie guitar music, most of which existed firmly outside of the mainstream of the UK top 40. But reflecting on Ozzy on the news of his death, it seems my snobbery has older roots.

“Mad Housewife”-era Ozzy, c.1986

I don’t remember when I first heard Ozzy Osbourne’s name, but I do remember when I first heard his music. It was 1988 and I was about a year away from growing out of metal, but still immersed in it for the time being. Within metal itself I had fairly wide taste and my favourite bands included many of the biggest metal bands of the era; Iron Maiden, Metallica, Guns ‘n’ Roses, Helloween, Megadeth, Suicidal Tendencies, Queensrÿche, Slayer, Anthrax, plus many more. At that point I mostly discovered music via magazines (especially Metal Forces) and my friends. In addition to my modest collection of records and tapes I had many more cassettes that had been made for me by friends and I spent a good bit of my spare time making tapes for them; it was fun. And so: Ozzy. A friend had taped a couple of albums for me on a C90 cassette (the odd pairing, it seems now, of Mötley Crüe’s Girls, Girls, Girls and Slayer’s Reign in Blood) and filled up the rest of the tape with random metal songs, among them “Foaming at the Mouth” by Rigor Mortis, “The Brave” by Metal Church, “Screamin’ in the Night” by Krokus and Ozzy’s latest single, “Miracle Man”. I pretty much hated it. I thought Ozzy’s voice was unbearably nasal and awful and the production really harsh and tinny (that was probably just the tape though).

Memorex C90s were pretty dependable
Teenage metal fans were obliged to like Elvira in 1989

By then, I knew who Ozzy was, and was aware of his bat-biting notoriety, though that definitely seemed to be a bigger deal in the USA than it was in the UK (or at least in my corner of rural Scotland). At some point just a little later, Cassandra Peterson, or more accurately Elvira, Mistress of the Dark presented a short series of metal-related shows for the BBC. One episode included Penelope Spheeris’ fantastic documentary The Decline of Western Civilization Part II: The Metal Years, which includes one of my favourite Ozzy interviews, but also concert footage of Ozzy during his ‘mad housewife’ era when his image seemed to be based on Jackie Collins’s style at the time. I love that era of Ozzy now, but at the time I thought it was laughably awful. It must have been around that time that I also became aware of Ozzy’s history with Black Sabbath, who I only knew in their then-current incarnation with Tony Martin, which again I now love but at the time thought irremediably middle aged and boring. The fact that Ozzy’s Black Sabbath was from the 70s meant that I pretty much dismissed them without needing to hear them. When Elvira showed a classic early Led Zeppelin concert in black and white I also found that tiresomely old and dull, especially in comparison with the Napalm Death concert she presented. It’s hard to relate to now, but in the 80s, for me – and I think for most people I knew of my age – the 70s was cheesy, embarrassing and possibly funny, but with no redeeming features. Actually, that’s how the 80s were for a good part of the 90s too; changed days.

Again, like most of the metal fans I knew, I loved metal, but I mostly didn’t like rock. Metal meant precision, virtuosity, heaviness and speed. Rock (to this kind of metal fan) was simplistic, old-fashioned and (worse) commercial. Oddly, I never thought to include the very glam-oriented hair metal bands I liked in the rock camp, which I can now see is where they really belonged. I loved bands like Poison, Faster Pussycat and Pretty Boy Floyd, despite the fact that their very obvious ambition was to be famous and that they wrote schmaltzy ballads. I made the same exception, mysteriously, for Guns ‘n’ Roses, who I loved. But I thought of them as metal, not rock.

Cliff Burton rocking like it’s 1974 (c.1986)

It was a distinction that my parents’ generation seemed simply not to understand. To them and their friends, if you liked Metallica, wasn’t that basically the same as liking Meat Loaf? But I was of the generation for whom, from the earliest days of primary school, the idea of being seen in flared trousers was the stuff of nightmares. That horror of the era we were born in was hard to let go of, which is no doubt partly why the legacy of punk was easy to embrace later. In 1988, when I first heard them, Metallica instantly became one of my favourite bands and …And Justice For All one of my favourite albums. A crucial part of that was that the band, as I first knew them, looked cool to me. When, probably later that year, I first heard Ride the Lightning and Master of Puppets I loved those too, but the sight of the great Cliff Burton (RIP) in his denim bellbottoms, with his middle-parted hair and little moustache, looking like he should have been in Status Quo circa 1974 was extremely cringe-inducing; that was not cool. Not in Scotland in 1988 anyway.

It took a while for that attitude to change. One of the gateway albums that led young teen me away from heavy metal and towards the indie/alternative world was Faith No More’s The Real Thing, which included a cover of “War Pigs.” And at that time the song still felt old fashioned and less good than the rest of the album to me. It was only after a few years of hardcore indie snobbery that my attitude really changed. As my adolescence got to the more painfully introspective stage I stopped listening to metal, having been introduced to things like the Pixies and Ride and simultaneously discovering slightly older music like The Smiths, The Cure, Joy Division and the Jesus & Mary Chain. The part of me that still liked loud and heavy guitars didn’t care so much about precision anymore and so alongside the typical UK indie stuff, I also liked grunge for a while, mainly Mudhoney, Tad and Nirvana, but especially grunge-adjacent weirdness like the Butthole Surfers and Sonic Youth. That would seem to provide an obvious bridge to the hard rock of the 70s, since virtually all grunge-oriented bands referenced Sabbath and Kiss, but no.

a book that shaped my taste in the 90s

In fact, what happened was that in the Britpop era, I loved 70s-influenced bands like Pulp and Suede (I was never a fan of Blur or Oasis) and as Britpop became dull I started to get into the older music that Britpop referenced. At first it was mostly Bowie and Lou Reed, but after reading Shots From the Hip (referenced a million times on this website) by Charles Shaar Murray, I broadened my horizons to include 70s glam in general (Roxy Music, Eno, Jobriath, Raw Power-era Stooges, but also the bubblegum stuff) and other things that Murray mentioned, whether positively or disparagingly. The latter seems odd but I’ve discovered lots of things I like that way. And suddenly, Ozzy was inescapable (though less so than he is this week).

I bought the Charles Shaar Murray book because Bowie was featured heavily in it; but he also wrote about Black Sabbath. I bought a book by the great photographer Mick Rock, because he had photographed Bowie and Lou Reed and Iggy and John Cale; but who should be in there but Ozzy, looking uncharacteristically thoughtful. I bought old 70s music annuals from the glam and tail-end-of-glam era – Fab 208 maybe – because they had Bowie and Mott the Hoople and Pilot and whatnot in them, but inside there was also mention of Black Sabbath. I remember a paragraph about their then-forthcoming compilation We Sold Our Souls for Rock ‘n’ Roll being especially intriguing.

Birmingham in the 1970s by Peter Trulock

Anyway, one thing led to another and I spent a large chunk of the late 90s and early 2000s immersing myself in the music of the 1970s. At first it was primarily glam, but then all kinds of rock, pop, soul, funk etc. At some point it started including bands that I’d long been aware of and never liked, like Led Zeppelin, Kiss – and Black Sabbath. The first Black Sabbath album I owned was Sabotage, bought for 50 pence in a charity shop. The texture of the sleeve was, interestingly, the same as that of my LP of Joy Division’s Unknown Pleasures, but the imagery was a little less classy, thanks to Bill Ward’s checked underpants being visible through his red tights; oh well. Ozzy sounded pretty much as I remembered from “Miracle Man,” but primed by Charles Shaar Murray’s description of Ozzy [caterwauling] about something or other in a locked basement, and with a more sympathetic production and – crucially – the far more bare and elemental sound of Black Sabbath, so unappealing just a few years earlier, he sounded right. And then, when I heard the earliest Black Sabbath albums, Black Sabbath and Paranoid, both from 1970, one of the things they reminded me of, most unexpectedly, was Joy Division.

Black Sabbath in 1970 by Keef, Joy Division in 1979 by Anton Corbijn

Yes, the whole aura is different, Sabbath were surly and aggressive where Joy Division were solemn and withdrawn, but there’s something about the simplicity of the sound. Geezer and Hooky’s basses took up as much space as Tony and Bernard’s guitars. Bill Ward, like Stephen Morris, was a drummer who brought a strong dance/funk element into the band’s rock music without any sense of incongruity. Ozzy and Ian Curtis are worlds apart as vocalists, but both have a despairing intensity that makes them stand out, even within their respective genres. Both bands were from the grim, grey, hopeless industrial 1970s north of England, but whereas Joy Division were definitively a product of Manchester, with all the gritty coolness that conferred upon them, Sabbath were solidly of Birmingham, with all of the perceived oafishness and lack of credibility that entailed in the music press at least. Both singers were self-destructive too, but the same year that Ian Curtis tragically ended his life, Ozzy was reflecting on his self-destructive behaviour in “Suicide Solution”* and starting his life anew, launching a solo career which, against all expectations, made him an even bigger star and ultimately the icon who is being mourned today, far more widely than I’m sure he would ever have imagined. It was a good ending.

*Ozzy was always a far more thoughtful lyricist than he’s given credit for; I can’t think of any other artist from the aggressively cocky 80s hair metal scene who would have written the glumly confessional anthem “Secret Loser” from Ozzy’s 1986 album The Ultimate Sin

Hulme, Manchester in the 1970s, by David Chadwick

Because I’m a nerd, and not just a music nerd, writing this piece made me think of Michael Moorcock’s elegiac sci-fi/fantasy novel The End of All Songs, published in 1976 – the year that Ian Curtis, Peter Hook and Bernard Sumner met at a Sex Pistols concert in the Lesser Free Trade Hall in Manchester, and the year that Black Sabbath released their seventh album, Technical Ecstasy, generally agreed to be the one where the cracks started to show in the Ozzy-led lineup, but one of my favourites. Moorcock took the title of his novel from a poem by the Victorian writer Ernest Dowson, which feels appropriate to end with, since fading out is kind of a hassle, text-wise.

With pale, indifferent eyes, we sit and wait
For the dropp’d curtain and the closing gate:
This is the end of all the songs man sings.
Ernest Dowson, Dregs (1899)

lost and found in translation

“Nothing was to be seen of the Castle hill; fog and darkness surrounded it; not even the faintest glimmer of light was present to suggest that the Castle was there.” Franz Kafka, The Castle, translated by Jon Calame & Seth Rogoff, 2014, Vitalis Verlag

“The Castle hill was hidden veiled in mist and darkness, nor was there even a glimmer of light to show that a castle was there.” Franz Kafka, The Castle, translated by Willa & Edwin Muir, 1930, my edition Penguin Modern Classics, 1984

The Castle (Penguin, 1984) vs The Castle, (Vitalis, 2014)

I have a possibly bad habit of buying multiple copies of books I love, if I see them for a good price with a cover that I like and don’t already have. Fairly often, I won’t ever read the new-to-me edition unless I happen to be in the mood for that particular book at the time of the purchase, because after all, it’s the same book. Or at least it usually is. I’ve had my 1984 Penguin Modern Classics paperback of Kafka’s The Castle for decades, though it was already second hand when I bought it. I first read the book at high school, a falling-to-bits old hardback from the school library. I have no idea which edition that was, but when I read it again in my early 20s, the novel seemed just as I remembered. That school version was almost certainly some edition of the 1930 translation by the fascinating Scottish couple Willa and Edwin Muir, since they were the first translators of Kafka in English and theirs was, and to some extent still is, the standard version. Indeed, the couple introduced Kafka and his particular aura to the English-reading world; which is quite a big deal when you think about it.

Recently, in a charity shop, I came across a copy of The Castle that I hadn’t seen before, with a cover I was immediately drawn to. It’s from 2014 and though it’s in English it was put out by Vitalis Books, a publisher which, judging by its Wikipedia entry, sounds uniquely suited to the works of Kafka, a German-speaking Czech Jew who was raised in a Yiddish-speaking household:

Vitalis Publishing is the only German literary publisher in the Czech Republic. Founded in 1993 by Austrian-born physician and medical historian Harald Salfellner, it harks back to the cultural heyday of the fin de siècle before 1914, a period of shared German, Czech, and Jewish influence. The publishing program features Czech (Jan Neruda, Božena Němcová), German (Gustav Meyrink, Rainer Maria Rilke), Jewish (Oskar Wiener, Oskar Baum), and Austrian (Adalbert Stifter, Marie von Ebner-Eschenbach) authors as common representatives of Bohemian literature.

My old Penguin paperback of The Castle, which features two chapters not included in the original 1930 UK edition (they were separately translated by Eithne Wilkins & Ernst Kaiser), is from my least favourite stylistic phase of the Penguin Modern Classics series. At that point in the early 80s, the spines were styled in a nostalgic nod to the classic early (orange and white) days of Penguin. It does have a nice cover illustration, by Elizabeth Pyle, but otherwise the design is a little drab. The book is 298 pages of fairly small but readable print. The Vitalis edition is far more stylish; the cover artwork uses a beautifully evocative photograph of (Bohemian?) “Peasant women” from 1918 and a photograph of Friedland (or Frýdlant) castle in Czechia. The type looks around the same size as the Penguin edition’s, but though the book is only slightly bigger than the Penguin, it has 382 pages.

Even allowing for the fact that the Vitalis Castle includes nice, dark, moody and scratchy illustrations by Karel Hruška, it’s a noticeably longer book, and the reason for that is revealed in the two quotes at the top of the page. The Muirs’ prose – like Edwin Muir’s poetry – is terse and spare, but also flexible and evocative. It’s the “voice” that Kafka has had for me since I was a teenager. It also has the benefit – or at least I think it’s a benefit, more on that later – of having been translated close to Kafka’s own time. When that first British edition of The Castle was published and Edwin Muir wrote in his introduction “Franz Kafka’s name, as far as I can discover, is almost unknown to English readers,” he was talking about an author who had only been dead for six years, and the book itself had only been in print in Kafka’s own language for four years.

Calame and Rogoff’s writing is slightly more lyrical to my ears/eyes, a little more long-winded, but in its way just as precise. I very much appreciate the two semi-colons in the first sentence of the passage above. The cumulative effect of their translation is a book which feels familiar but gently different. Another comparison, this time the opening of chapter 10:

“K. stepped out into the windswept street and peered into the darkness.” (Willa & Edwin Muir)
versus
“K. stepped outside onto the wildly windswept steps and peered into the darkness.” (Calame and Rogoff)

Which is the better sentence is just a matter of taste; the Muir version doesn’t feel especially superior to me, but on the other hand it does feel more ‘Kafka-esque’ – but is it? And what about this, from the end of chapter 15?

“And he pressed her hand cordially once more as he swung himself on to the wall of the neighbouring garden.” (Muirs)
versus
“He was still pressing her hand fervently as he swung himself onto the fence of the neighbouring garden.” (C&R)

Well; ‘cordially’ and ‘fervently’ are two very different things, aren’t they? To me, that word choice significantly changes the tone of the passage. And this time, it’s the modern version that feels more redolent of Kafka as I think of him; which isn’t the same as saying it’s a better translation of the original text.
I have no idea whether it impacted on Calame and Rogoff or not, but modern translations of Kafka are made in a world where ‘Kafka-esque’ is a thing, and where Kafka himself – both his image, with those big, dark, suspicion-filled eyes, and the hypersensitive personality of his personal writings, prone to intense feelings of harassment and persecution – colours how we see his work. The Trial in particular feels like that persona, that image, shaped into a novel, and surely nobody embarking on a new translation of the book could be uninfluenced by its familiar Kafka-ness, regardless of how faithful or otherwise they were to the original text.

Faith and Faithfulness

witty (if dated) wordplay in Asterix

There’s a mystery to what faithfulness means in translation – Google Translate and AI are perfectly capable of making word-for-word translations of texts, but they seem somehow unable to make living, readable prose out of them. When I think of books that I’ve only ever read in translation (and I’ve never read more than a few pages in any language other than English or Scots, alas), going all the way back to childhood and the Asterix (René Goscinny, trans. Anthea Bell & Derek Hockridge) and Tintin (Hergé – Leslie Lonsdale-Cooper & Michael Turner) series, I realise how much of the character of those books is owed to their translations. In those particular cases the translations seem almost miraculously good. To capture witty wordplay, puns etc while also keeping the original narrative flowing is a formidable skill. I can’t help thinking that if I read literal translations of those books, or learned to read French myself (let’s not get carried away) and read the originals, I would only discover a new respect for both the translators and the original authors.

wordy whimsy in Tintin

Translating from one language to another seems like it should be a practical rather than an artistic thing, but the extent to which Kafka’s work is ‘Kafka-esque’ in English is in some ways a choice, and as time goes on more and more choices are available to the translator of any text. The obvious ones – whether to be true, word-for-word, to an author’s text or to their ‘voice’ and atmosphere, whether to provide a faithful translation or a ‘good read’ – have always been there. But arguably just as important, and more pressing with the passing of time, is the decision of whether to make a novel or piece of writing true to its time and place or to our own. This isn’t a small thing; it affects both the readability and the character of a book. The right thing to do presumably varies from book to book, but in my experience, you don’t really know what you prefer until you come across something you don’t like.

Dostoevsky presented as a trashy airport novel (with no translator credit)

With The Castle, although the more modern text felt different to me, it wasn’t a difference that spoiled or significantly altered my enjoyment of the book, it was just something I noticed. But those translation choices can be jarring. A recent example of this came when reading two novels by the Finnish author Arto Paasilinna – The Year of the Hare (1975) and The Howling Miller (1981). Both were translated into English from French translations rather than from the Finnish (which I find obscurely annoying), but while The Howling Miller (which I read first) was written in straightforward, simple and clear English prose which felt a bit basic but entirely appropriate to the subject, the translator of The Year of the Hare made the (completely valid) decision to render the casual, slang-filled prose of the French translation (and presumably the Finnish original) into supposedly modern, slang-filled British English, which was deeply irritating and damaged the integrity of the novel. Standard phrases like “bloody hell” or whatever are one thing; so familiar as to seem timeless and universal. But most slang dates quickly, is often generationally specific and can be weirdly embarrassing to read, if it’s not your slang.

Even worse in narrative terms, using regionally specific terms when you don’t change the distinctively ‘foreign’ names of characters or the setting of a book can give a feeling of unreality to the whole text. Quite possibly it’s just me, but reading a passage where a character called Kaarlo Vatanen, living in rural Finland, refers to having “twenty quid” in his pocket is kind of like reading Crime and Punishment and coming across something like “Shit! It’s the pigs!” hollered Raskolnikov. Don’t do that please.

But even though I didn’t like the idiom the translator used for The Year of the Hare, the arguments for doing it are pretty sound. When adapting a foreign, unfamiliar book for a new audience, making it accessible is clearly important. That novel was published in 1975 and implicitly set in that period, so there’s nothing technically wrong with writing it in modern, slangy English, except that it’s not set in Britain and so it feels wrong to pedants. Related but probably more difficult is translating a classic novel into modern English. I’m not really a Dickens fan, but when I think of the few books of his that I’ve read, his prose seems inseparable from his stories and from his period. Does that mean that Tolstoy or Zola’s works should be translated into “Victorian” English? Annoying as that might well be, I’m tempted to say that for me, the answer is yes.

Positives and negatives

It’s a different kind of translation, but making books into films brings these kinds of questions into focus. There have been several film adaptations of HG Wells’s The War of the Worlds, but for all of their virtues, if you return to Wells’s novel it seems obvious within the first few pages that though it’s eminently adaptable, a film of the novel set in 1898 would be far better (but presumably ridiculously expensive to make) than the existing versions. Similarly, no adaptation of Nineteen Eighty-Four has quite captured the stark, bracing, post-war, entirely British greyness (in a good way) of Orwell’s prose. It’s that tone, as much as anything, that people think of as “Orwellian,” even though outside of Nineteen Eighty-Four and (to a far lesser extent) Animal Farm, it’s really not the usual tone of his writing.

The other dystopian novel frequently paired with Nineteen Eighty-Four is Aldous Huxley’s Brave New World, but despite the relative closeness of age, class and education of Orwell (born 1903) and Huxley (born 1894), they could hardly be tonally farther apart. As someone who first read and loved Huxley’s earlier, satirical social comedies like Antic Hay (1923) and Point Counter Point (1928), the thing that struck me most when I first read Brave New World (1932) is how similar its prose is. Although, unlike The War of the Worlds but like Nineteen Eighty-Four, it’s set in the future, any film of it should really be set in a 1930s future and have a slightly old-fashioned, ‘Boy’s Own adventure’ flavour which seems completely at odds with the book’s grim dystopian reputation. When reading the novel, its tone (which feels more post-WW1 1920s than pre-WW2 1930s) feels entirely natural and is part of what makes the book so readable. But is that tone there in modern foreign translations of the book? Possibly not, and when you think about it, why would it be?

The Bible and the Bloody Countess

John Donne: a portrait of the poet as a young dandy

As anyone who has had to “do” Shakespeare at school – or who likes reading him – will know, 16th/17th century writers had a respect for and love of puns that is far removed from their current status as vessels of knowingly lame humour (that said, ‘brave new world’ is from Shakespeare, isn’t it?). It’s sad that that love of wordplay has become so debased, because even though I personally love puns even just as lame humour, it means we now have to consciously think and analyse in order to appreciate the breadth of allusions and associations – and therefore feelings – that a writer could evoke in their readership (or a playwright in their audience) without having to labour a point.

Partly, it was easier to pun meaningfully before spelling was fully standardised. When John Donne wrote The Sun Rising*, it was risqué in the mild way it still is – the poet is complaining about the sunrise because he and his girlfriend don’t want to have to get out of bed – but also in a far more daring way. To a Jacobean audience the sun (or sunne, or sonne) rising would automatically create an association with the son (of God) rising, a pun that transforms and strengthens the meaning of the poem, since, then as now (or more than now), the earthly representatives of God were not especially keen on young unmarried couples lying in bed together.

*published in 1633 but necessarily written earlier – he died in 1631 – and probably quite a lot earlier since he was known as a poet in his youth but a priest and preacher from 1615

And that textual richness is just the intended meanings and associations – but as language evolves so does meaning, and so, whether one likes it or not, do associations. Since the 1960s, seeing the title The Sun Rising may well make people think of Rolf Harris’s 1960 novelty pop hit Sun Arise – a kind of well-intentioned but not unproblematic pastiche of Aboriginal Australian music that was a big hit all over the English-speaking world. Harris’s subsequent career as a popular children’s entertainer and, latterly, a hugely unpopular sexual predator makes the already iffy song even more dubious, but even that creates its own set of unexpected cultural associations. Back in 1971, before settling definitively on a kind of bad taste, pantomime horror modus operandi, the American rock band Alice Cooper (then the name of both the singer and band) experimented with a kind of general absurdist, transgressive approach. To that end, on their third (but first commercially successful) album Love It To Death, alongside paeans to troubled teendom (I’m Eighteen, Is It My Body?) and old horror movies (The Ballad of Dwight Frye), the band recorded an amusingly straight-faced cover of Sun Arise, just to be smartasses. Only 40 years later did the song turn out to be a masterstroke that unexpectedly fit in with their macabre and tasteless raison d’être after all; patience is a virtue, clearly.

But anyway, the idea of translating The Sun Rising, with even its intended meaning intact, into a language that doesn’t share common roots and words with English makes me think of Philip Larkin saying* (wrongly, I think) “A writer can have only one language, if language is going to mean anything to him.” It makes sense in a way – there can be an impersonal quality, especially when reading poetry in translation, that makes lots of translations feel the same, not that that’s always a bad thing necessarily.

*in a 1982 interview with Robert Phillips in the Paris Review (Philip Larkin, Required Writing, p.69)

Another Penguin Classics book I love is the 1965 collection Poems of the Late T’ang, in which A.C. Graham translates the works of seven Chinese poets whose lives span more than a century, from 712 to 858 AD. In his introduction, Graham stresses the differences between poets, contrasting the ‘bare, bleak style’ of Meng Chiao (751 – 814) with the ‘strange and daring’ poetry of Meng Chiao’s friend Han Yü (768 – 824), but although I love both, I don’t really find a huge tonal difference between them (just to quote the first examples of each that he publishes):

Above the gorges one thread of sky:
Cascades in the gorges twine a thousand cords
(opening lines of Sadness of the Gorges)

And

A frosty wind harries the wu-t’ung, (parasol tree)
The crowded leaves stick wilting to the tree
(opening lines of Autumn Thoughts)

It might just be me, but I don’t even detect major differences between the poetry of Tu Fu, writing in the 750s or 60s –

The autumn wastes are each day wilder:
Cold in the river the blue sky stirs
(opening lines of The Autumn Wastes)

and Li Shang-Yin, who was writing almost a century later:

The East wind sighs, the fine rains come:
Beyond the pool of waterlilies, the noise of faint thunder.
(Untitled)

I wouldn’t expect poets in English to write this similarly, but of course the words I am reading are AC Graham’s and not Tu Fu’s or Meng Chiao’s. These are beautiful poems and if there’s a deficiency in them it’s mine, not the poets’ and certainly not the translator’s. In poetry that’s this compressed and distilled there must be a whole world of meaning, allusion and subtlety – the sort of thing I can see (when forced to think about it) in Donne – that AC Graham was aware of but could only explain in footnotes and appendices. And I’m sure that’s exactly what Philip Larkin referred to in his strictures about language – but if a writer can truly have only one language, “if language is going to mean anything to him,” what about translators, who are almost always also writers in their own right? And what about unusual cases like JRR Tolkien or Anthony Burgess?

Burgess’s A Clockwork Orange is one of my favourite novels – it’s also the product, very obviously, of someone who could speak and think, fluently, in a lot of languages – ten is the number he usually gave, ‘with bits and pieces of others’. Burgess created the book’s slang, Nadsat, in order to write about ‘the youth’ in a way that didn’t date like real slang and it definitely worked. Rightly, I think, Burgess didn’t want a glossary of Nadsat terms in the book. Although some publishers have added one anyway, the book works far better if the reader just immerses themselves in the narrator’s voice and his disorienting world. But Burgess was only human, and in perhaps the novel’s weakest moment (because it takes us out of that world) he couldn’t resist pointing out that the language his young narrator Alex speaks isn’t just whimsy on the part of the author:

‘Quaint,’ said Dr Brodsky, like smiling, ‘the dialect of the tribe. Do you know anything of its provenance, Branom?’
‘Odd bits of rhyming slang,’ said Branom, who did not look quite so much like a friend any more. ‘A bit of gipsy talk, too. But most of the roots are Slav. Propaganda. Subliminal penetration.’
‘All right, all right, all right,’ said Dr Brodsky, like impatient and not interested any more.

I’ve always felt that Brodsky’s impatience is really Burgess’s mild embarrassment at finding himself pointing out how clever he is, but who knows? How A Clockwork Orange works in translation I can’t imagine, especially in countries with the Slavic languages Burgess borrows from, but I can imagine it must be both a joy and a nightmare to translate.

I hope, for the sake of its readers, that those who tackle A Clockwork Orange come up with words as horribly effective as Burgess’s. When Alex and his gang (yes, I know they are his droogs) come across a rival gang attacking a child, Alex says that they were “just getting ready to do something on a weepy young devotchka they had there, not more than ten, she creeching away but with her platties still on.” The word “creeching” is clearly just “screeching” without the s, but somehow it seems harsher, more intense, implying a rawness related as much to a croak as a screech; Burgess knew what he was doing. So, in his very different way, did Tolkien, another linguist, who gives the cultures and places of Middle-earth their individual, believable textures via languages that draw on real prototypes in the same way as Burgess’s Nadsat does. It’s also worth comparing Tolkien’s beautifully translated Beowulf with Seamus Heaney’s very different, but equally beautiful one. Both writers have a reverence for the original text and their interpretations are similar enough to suggest fidelity to the original – but they are also different enough to demonstrate just how flexible language can be.

That flexibility suggests that no text is truly beyond translation, and the fact that fictional cultures can be realistically portrayed by the words they and their creators use hints at the power inherent in language. Like any power, it can be used in negative ways as well as good ones. Translations can, or at least could, be withheld when it was felt expedient to do so, though the internet has probably made that more difficult. It seems trivial, but something that was (up until the 1960s, I’d guess) fairly common, and which I’ve occasionally come across in older books, is the translation of a foreign text where the narrative lapses into its original language – or even occasionally into French in books actually written in English – when the writing becomes ‘obscene.’

trashy 70s paperback of non-trashy 50s meditative biography

An example that springs to mind, because I have it, is the 1957 biography of the notorious medieval Hungarian Countess Erzsébet Bathory by the surrealist poet Valentine Penrose (née Hugo). In its English translation – by the also somewhat notorious Scottish writer Alexander Trocchi – Penrose’s text is rendered into sensual English, except, that is, when Bathory’s predatory exploits against young peasant women in her orbit become too explicit, at which point the text falls back into French. No doubt the publisher, John Calder – who specialised in avant-garde literature and especially previously banned books – was wary of obscenity charges, which he would later fall foul of with Trocchi’s Cain’s Book and Hubert Selby Jr’s Last Exit to Brooklyn. Ironically, my 1970s NEL edition, though by design a trashy, titillating paperback, reproduces the Calder text, elisions and all. (It also features a lazy, sensationalist blurb on the cover which reveals that the publisher didn’t know that Valentine Penrose was a woman, which is unnerving.)

But even if British publishers were self-censoring for mostly legal reasons, the clear lesson that comes from old editions of transgressive texts is that those with a classical education – that is, the upper classes, who routinely learned Latin, Greek and French at school, but only they – could be entrusted to read all the sex and violence they liked. I’m in two minds over whether the reason for that is the literally patronising one of ‘protecting the children’ or the more generally patronising one that the upper class could be trusted with that kind of thing but the more animalistic and irrational lower classes might be led astray by it. Either way it’s kind of ironic, given that centuries earlier, the impetus for publishing anything at all in English was to allow the expanding literate population to read the Bible in their own language.

And if the translation of a modern text into modern English can create variations as different as a cordial vs a fervent hand-hold*, imagine the pitfalls inherent in making the translation of an ancient text central to a modern civilisation. And not just ‘an ancient text’ but a collection of various ancient texts, partly written in obscure and difficult language. And add to that that key books of the text purport to be eye-witness accounts which were, however, written in Greek, but reporting on sermons and parables originally delivered in spoken Aramaic.

*if that seems trivial, imagine receiving an invitation to some kind of gathering that begins “you are cordially invited to…” versus “you are fervently invited to…” The second would seem a little alarming to me

We’re used to the fact that almost everything in the Bible is open to interpretation, partly because by now ‘the Christian church’ is actually hundreds of Christian churches, each with its own version of what the Bible means, and that’s just talking about the Bible as it is now, regardless of how accurately modern translations relate to the original text, or how accurately the original text relates to the events it describes. It doesn’t take much reading to discover that things as fundamental to the faith as the monotheistic nature of the Old Testament god, or the Virgin birth in the New Testament, are dependent on translations which may be approximate rather than precise. Just as one example, writers – both scholarly and crank-ish – have observed that the word used to describe Mary’s state, “parthenos”, in ancient Greek texts generally refers only to a young woman and not necessarily, not even usually, a virgin. Getting into murkier waters, it’s therefore been credibly suggested (by Jane Schaberg, among many others) that in the Gospels God only blesses Mary’s pregnancy, rather than causing it himself. Credibly, that is, if one’s main issue with the story of Jesus is the Virgin birth, rather than the existence of God in the first place.

possibly less begetting and smiting in this bible

But however one chooses to interpret it, interpretation is required when looking at events which have come down to us in much the same way as Homer’s Odyssey, and with as many different voices involved along the way. Even if one takes the Bible at face value – notoriously difficult, in its contradictory entirety – and accepts it as truth, it’s a problematic text, to say the least. The Gospels were written down by followers of Jesus – who they knew personally, and worshipped – in the aftermath of his early death. For parts pre-dating their association with him, they are presumably relying on accounts given to them by the man himself. These would be based on his own memories of his youth and childhood, but for the circumstances of his own birth thirty-three years earlier, he presumably only had the accounts of his parents (whether earthly or divine) to rely on. Unless Jesus spoke Greek (I feel like they would have mentioned it if he had), those memories were then translated into a different language with different allusions and associations from his own, before being subjected to centuries of edits and deletions, only later being given ‘authoritative’ editions (different ones for different countries and sects), each of them offering its own, rather than the definitive truth.

So, whether we are reading Homer or Ovid or the Gospel of St Luke, or The Castle, or Asterix the Legionary in English, we are reading an adaptation, a work imagined into existence by more than one writer, and if we’re lucky it’s Willa and Edwin Muir or Anthea Bell and Derek Hockridge. If we’re not so lucky we may end up inadvertently worshipping a false idol or something and, who knows, even facing eternal damnation if we believe in such things. It’s an important job.

credit where it’s due: the translators get (almost) equal billing with the authors

to the victor

Everyone knows that history is written by the victors, but I’m not sure it’s always realised – or I should say that I hadn’t considered – how much of a privilege that is. Of course there are the obvious material, societal privileges that come with winning, but I’m thinking specifically about history and the way it’s transmitted.

This little epiphany came while watching the now ten-year-old show Tony Robinson’s Wild West. It’s not a piece of history that generally interests me, but Robinson belongs to the last generation that grew up with ‘cowboys and Indians’ occupying a major place in popular culture, and is also a presenter who tends to look at history from a point of view that conservative critics would call postmodern; i.e. he tends to think that history is complicated, acknowledges injustices, calls exploitation exploitation and that kind of thing – and so it piqued my interest. But one of the unexpected consequences of making an enlightened history show is the focus it places on experts and the way that they come across.

Ancient Greek red-figure pottery c.480 BC

Now, I don’t mean to condemn the historians I’m talking about here; most historians are in some ways at least enthusiasts for their subject, and the best ones have a special, and sometimes quite a nerdy, bond with the period and people that they have chosen to study. When I studied ancient history, one of the best lecturers I had was a specialist on ancient Greece who had seemingly based his physical appearance on the look of Greek men on ancient red-figure pottery; curly hair, big beard and all. And there was a historian of Rome in my Classical Studies course with a ‘Julius Caesar’ haircut. And it feels natural that someone would want to immerse themselves in the civilisation which fascinates them enough to make it their life’s work.

Joe LeFors – lawman & fashion role model

And so it makes sense that some of the historians in Tony Robinson’s Wild West, with their (to British eyes) Edwardian moustaches and waistcoats should look as though they are tending the bar in a saloon in Calamity Jane, or are speaking to the camera from the frontier, while wearing their fringed buckskin jackets. It’s kind of comical, but also isn’t, in the context of that history. The clothes seem like a concession to the romanticism of the Old West, but these are modern historians, and despite being occasionally slightly squeamish about the history they are recounting, and tending to overstate the balance of power in the ‘Indian wars’ they make no serious attempt to whitewash it.

But it feels significant that the Lakota historian talking about the massacre at Wounded Knee in 1890 wasn’t wearing historical fancy dress, and it would seem strange if he was. And there’s the discussion of atrocities like Wounded Knee, which, to paraphrase that historian, is framed as a war crime because that feels historically appropriate – but in fact there was no war; it was just a crime, albeit one committed by a government. And nakedly genocidal attitudes at the time, encapsulated by a quote from L. Frank Baum, author of – of all things – The Wizard of Oz and its sequels, make the buckskins, waistcoats and archaic facial hair and fashions of the Western historians seem somewhat incongruous.

Bartender at the Toll Gate Saloon in Black Hawk, Colorado c.1897

Hearing of the death of Sitting Bull in the aftermath of Wounded Knee, Baum wrote in his popular newspaper:

The proud spirit of the original owners of these vast prairies inherited through centuries of fierce and bloody wars for their possession, lingered last in the bosom of Sitting Bull. With his fall the nobility of the Redskin is extinguished, and what few are left are a pack of whining curs who lick the hand that smites them. The Whites, by law of conquest, by justice of civilization, are masters of the American continent, and the best safety of the frontier settlements will be secured by the total annihilation of the few remaining Indians. Why not annihilation? Their glory has fled, their spirit broken, their manhood effaced; better that they die than live the miserable wretches that they are.

Though thankfully not, or not quite, enacted, this wasn’t a controversial view at the time. But even with a little knowledge of the historical context (there had been a government policy explicitly called the Indian Removal Act only 50 years earlier), its knowing blatancy is shocking. It’s one thing to talk or read about massacres of Greeks or Spartans or Persians thousands of years ago; it’s quite another to think of it happening in the time of your own great-grandparents, or to think of it happening elsewhere literally as I write these words.

It’s an obvious, problematic and quite cheesy comparison to make, but I’ll make it. A situation where a historian could sit in a black shirt and swastika armband, talking about World War Two – acknowledging quite reasonably the terrible things that were done on both sides, but talking, in an essentially neutral way about the formation of modern Europe, while a far more sombre Jewish historian shows us the Auschwitz museum, is almost unimaginable. And yet – judging by the way these things actually work – it’s not an especially outlandish alternate present.

the Emperor Augustus, statue from the 1st century AD with eternally modern hair

Not only is history written by the victors, but the world we live in is forged by them, and generations on from winning an overwhelming victory they are no longer ‘they,’ they are just us. But that takes time. The historian who loves ancient Greece can afford to love it, wherever they come from. Greece was barbaric as well as enlightened, Greece had slaves and was misogynistic – but Greece is only what we call it now for convenience; the collection of independent city states and relatively heterogeneous cultures now called ‘ancient Greece’ is not the same as the country Greece. “Rome” is not Italy. The world – especially but not exclusively the western world – has benefitted from ancient Greek and Roman culture, and though it’s easy to argue for its negative influence too, that’s less the fault of those ancient people than of our more recent ancestors, who understood those cultures in the way they could and wanted to understand them.

Regarding America, and Britain and everywhere else, all of that may come, in a thousand years or so, when the state of the world will probably be as unimaginable to us now as the internet would be to Herodotus. But. There are photographs of the mass graves being dug at Wounded Knee and while the winners of the ‘Indian wars’ were able to impress their culture on the country and to define that recent history in terms of expansion, exploration and consolidation, the other side of the story is still living history too. And the losers get to just live with it and tell their stories to whoever will listen.

It’s fair to say that the nature of the conquest of America – not just the 1800s and the Old West, but everything from the founding of the first colonies to the establishment and erosion of the reservations – has never been forgotten, but it’s only relatively recently that it’s been publicly acknowledged. Tony Robinson’s television show came about precisely because he belongs to a generation – my father’s generation – where young boys were entertained by stories of heroic cowboys fighting savage Indians in a way that now looks, despite the inevitable messiness of history as opposed to fiction, like an almost perfect inversion of the facts. And I belong to one of the first generations where it hasn’t been popularly presented that way. It’s never too soon to be passionate about history or to tell the truth about it, but even though it’s always nice to find yourself on the winning team, when that happens it’s probably worth considering what it really means and how you got there. And if the dressing up still feels appropriate then go for it.

a victory over ourselves – versions of 1984 in 2025

Remembering 1984 as someone who was a child then, I find that although the clocks didn’t strike thirteen, the year – as encapsulated by two specific and very different but not unconnected childhood memories, as we’ll see – is almost as alien nowadays as Orwell’s Airstrip One. Of course, I know far more now about both 1984 and Nineteen Eighty-Four than I did at the time. I was aware – thanks mostly I think to John Craven’s Newsround – of the big, defining events of the year.

Surely the greatest ever cover for Nineteen Eighty-Four, by Stuart Hughes, for the 1990 Heinemann New Windmills edition

I knew, for instance about the Miners’ Strike and the Greenham Common Women’s Peace Camp, but they didn’t have anything like the same impact on me personally as Indiana Jones and the Temple of Doom. I remember Zola Budd tripping or whatever it was at the Olympics and Prince Harry being born, but they weren’t as important to me as Strontium Dog or Judge Dredd. Even Two Tribes by Frankie Goes to Hollywood made a bigger impression on me than most of the big news events, despite the fact that I didn’t like it. My favourite TV show that year was probably Grange Hill – and here we go.

Grange Hill, essentially a kids’ soap opera set in a big comprehensive high school in London, ran for 30 years, and I recently discovered that the era of it that I remember most fondly – the series that ran from 1983–86 – is available on YouTube. When I eventually went to high school later in the 80s, my first impression of the school was that it was like Grange Hill, and now I find that despite the silliness and melodrama, Grange Hill still reflects the reality, the look and the texture of my high school experience in the 80s with surprising accuracy.

Grange Hill’s 5th formers in 1984 (back: Stewpot & Glenroy, front: Suzanne & Mr McGuffy)

But anyway, watching old Grange Hill episodes out of nostalgia, I was struck by how good it seems in the context of the 2020s, despite the obvious shortcomings of being made for children. Check out this scene from series seven, episode five, written by Margaret Simpson and aired in January of 1984. In among typical story arcs about headlice and bullying, the Fifth form class (17 year olds getting towards the end of their time at school) get the opportunity to attend a mock UN conference with representatives from other schools. In a discussion about that, the following exchange occurs between Mr McGuffy (Fraser Cains) and his pupils Suzanne Ross (Susan Tully), Christopher “Stewpot” Stewart (Mark Burdis), Claire Scott (Paula-Ann Bland) and Glenroy (seemingly of no last name) (Steven Woodcock). It’s worth noting that this was the year before Live Aid.

Suzanne: [re. the UN]:"It's about as effective as the school council."

Mr McGuffy: "Oh well I wouldn't quite say that. The UN does some excellent work - UNESCO, the Food and Agriculture Organisation, the UN Peacekeeping force..."   [...]

Claire: "What's the conference gonna be about?"


Mr McGuffy: "The world food problem. There was a real UN conference on this topic ten years ago..Glenroy: "Didn't solve much then, did they? Millions of people still starvin'"

Stewpot: ”Yeah that's cos they ain't got no political clout to do anything about it though, ain't it"

Glenroy: ”Naw man, it's because the rich countries keep them that waySuzanne: “The only chance a poor country's got is if it's got something we wantGlenroy:That's right - they got something the west wants and they'd better watch out because the west starts to mess with their government."

Mr McGuffy: "Well it's clear from what you've all said so far that you're interested in the sort of issues that will be discussed that weekend..."

Suzanne & Claire, 1984

It’s not too much of an exaggeration to say that that is a more mature political discussion than is often heard on Question Time in 2025. Interestingly, it’s not an argument between left and right as such, but between standard, humanitarian and more radical left-wing viewpoints. Needless to say, if it was presented on a TV show that’s popular with ten year-olds nowadays, a certain demographic would be foaming at the mouth about the BBC indoctrinating the young with “wokeness.” But as a kid this sort of discussion didn’t at all mar my enjoyment of the show – naturally there’s also a lot of comedic stuff in the series about stink bombs and money-making schemes, but one of the reasons that Grange Hill remained popular (and watchable for 8 year-olds and 15 year-olds alike) for so many years was that it refused to talk down to its audience.

The way the writers tackled the obvious big themes – racism, sexism, parents getting divorced, bullying, gangs, sex education etc – is impressive despite being quite broad, especially when the younger pupils are the focus of the storyline, but what makes a bigger impression on me now is the background to it all. It’s a little sad – though true to Thatcher’s Britain – to see, throughout this period, the older pupils’ low-level fretting about unemployment and whether it’s worth being in school at all.

And maybe they were right. In 1984, when Suzanne and Stewpot were 17, a fellow Londoner who could in a parallel universe have been in the year above them at Grange Hill was the 18-year-old model Samantha Fox. That year, she was The Sun newspaper’s “Page 3 Girl of the Year.” She had debuted as a topless model for the paper aged 16, which is far more mind boggling to a nostalgic middle-aged viewer of Grange Hill than it would have been to me at the time. Presumably, some parts of the anti-woke lobby would not mind Sam’s modelling as much as they would mind the Grange Hill kids’ political awareness, but who knows?

Sam Fox in (approximately) Grange Hill mode c.1986, not sure who took it

Naturally, the intended audience for Page 3 wasn’t primary school children – but everybody knew who Sam Fox was, and in the pre-internet, 4-channel TV world of 80s Britain she had a level of fame far beyond that of any porn star 40 years later (arguments about whether or not Page 3 was porn are brain-numbingly stupid, so I won’t go there; and anyway, I don’t mean porn to be a derogatory term). Anyway, Sam (she’ll always be “Sam” to people who grew up in the 80s) and her Page 3 peers made occasional accidental appearances in the classroom, to general hilarity, when we spread old newspapers on our desks to prepare for art classes. It was also pretty standard then to see the “Page Three Stunnas” (as I think The Sun put it) blowing around the playground or covering a fish supper. It wasn’t like growing up with the internet, but in its own way the 80s was an era of gratuitous nudity.

a nice Yugoslav edition of 1984 from 1984

Meanwhile, on Page Three of Nineteen Eighty-Four, Winston Smith – who is, shockingly, a few years younger than I am now – is trying to look back on his own childhood to discern whether things were always as they are now:

“But it was no use, he could not remember: nothing remained of his childhood except a series of bright-lit tableaux, occurring against no background and mostly unintelligible.” – George Orwell, Nineteen Eighty-Four (1949), p.3, New Windmill edition, 1990

By contrast, some of the roots of 2025 are plain to see in 1984, despite the revolution of the internet that happened halfway between then and now. As the opposing poles of the Grange Hill kids and The Sun demonstrate, there were tensions in British society which would never quite be resolved, but which would come to some kind of semi-conclusion at the end of the Thatcher era, when ‘Political Correctness,’ the chimerical predecessor of the equally chimerical ‘Woke,’ began to work in its unpredictable (but I think mostly positive) ways.

 

Most obviously, Page 3 became ever more controversial and was toned down (no nipples) and then vanished from the tabloids altogether for a while (though in the 90s the appearance of “lads’ mags,” which mainstreamed soft porn, made the death of Page 3 kind of a Pyrrhic victory). More complicatedly, the kind of confrontational storylines about topics like racism that happened in kids’ shows in the 80s became a little more squeamish, to the point where (for entirely understandable reasons) racist bullies on kids’ shows would rarely use actual racist language and then barely appear at all, replaced by positivity in the shape of more inclusive casting and writing. All of which became pretty quaint as soon as the internet really took off.

a very 1984-looking edition of Nineteen Eighty-Four from 1984

So, that was part of the 1984 that I remember; what Orwell would have made of any of it I don’t know. It wasn’t his nineteen eighty-four, which might have pleased him. For me, it all looks kind of extreme but also refreshingly straightforward, though I’m sure I only think so because I was a child. It’s all very Gen-X, isn’t it?

of comfort no man speak

Everybody has their comforts, but after trying to analyse some of my own to see why they should be comforting I’ve pretty much come up with nothing, or at least nothing really to add to what I wrote a few years ago; “comforting because it can be a relief to have one’s brain stimulated by something other than worrying about external events.” But that has nothing to do with what it is that makes the specific things comforting. Like many people, I have a small group of books and films and TV shows and so on that I can read or watch or listen to at almost any time, without having to be in the mood for them, and which I would classify as ‘comforting.’ They aren’t necessarily my favourite things, and they definitely weren’t all designed to give comfort, but obscurely they do. But what does that mean or signify? I’ve already said I don’t know, so it’s not exactly a cliffhanger of a question, but let’s see how I got here at least.

I’ve rewritten this part so many times: but in a way that’s apposite. I started writing it at the beginning of a new year, while wars continued to rage in Sudan and Ukraine and something even less noble than a war continued to unfold in Gaza, and as the world prepared for an only partly precedented new, oligarchical (I think at this point that’s the least I could call it) US government. Writing this now, just a few months later, events have unfolded somewhat worse than might have been expected. Those wars still continue and despite signs to the contrary, the situation in Gaza seems if anything bleaker than before. That US administration began the year by talking about taking territory from what had been allies, supporting neo-Nazi and similar political groups across the world, celebrating high profile sex offenders and violent criminals while pretending to care about the victims of sex offenders and violent criminals, and has gone downhill from there. In the original draft of this article I predicted that this Presidential term would be an even more farcical horrorshow (not in the Clockwork Orange/Nadsat sense, although Alex and his Droogs might well enjoy this bit of the 2020s; I suppose what I mean is ‘horror show’) than the same president’s previous one, and since it already feels like the longest presidency of my lifetime I guess I was right. So, between the actual news and the way it never stops coming (hard to remember, but pre-internet ‘the news’ genuinely wasn’t so relentless or inescapable, although events presumably happened at the same rate) it’s important to find comfort somewhere. The obvious, big caveat is that one has to be in a somewhat privileged position to be able to find some comfort in the first place. 
There are people all over the world – including here in the UK – who can only find it, if at all, in things like prayer or philosophy; but regardless, not being so dragged down by current events that you can’t function is kind of important however privileged you are, and even those who find the whole idea of ‘self-love’ inimical have to find comfort somewhere.

But where? And anyway, what does comfort even mean? Well, everyone knows what it means, but though as a word it seems fluffy and soft (Comfort fabric softener, the American word “comforter” referring to a quilt), it actually comes from the Latin confortare, from com- (an intensifier) and fortis (strong), meaning something like “to strengthen greatly” – but let’s not get bogged down in etymology again.

But wherever you find it, the effect of comfort has a mysterious relationship to the things that actually offer us support or soothe our grief and mental distress. Which is not obvious; if you want to laugh, you turn to something funny, which is obviously subjective, but never mind. Sticking to books, because I can: if I want to be amused, lots of things would work – Afternoon Men by Anthony Powell, Sue Townsend’s Adrian Mole books and, less obviously, The Psychopath Test by Jon Ronson always raise a smile or a laugh. Conversely, if you want to be scared or disgusted (in itself a strange and obscure desire, but a common one), you’d probably turn to horror, let’s say HP Lovecraft, Stephen King’s IT or, less generic but not so different, Bret Easton Ellis’s American Psycho. But as you might have guessed if you’ve read anything else on this website, I’d probably count all of those things among my ‘comfort reads.’

not my comfort reads

But whatever I am reading, I’m not alone; people want ‘comfort reads’ and indeed there is a kind of comfort industry these days. Not just these days, but over the years it’s developed from poetry anthologies and books of inspirational quotes to more twee versions of the same thing. I think of books of the Chicken Soup for the Soul kind (I don’t think I made that up; if I recall, my mother owned a little book of that title, full of ‘words of wisdom’ and comforting quotes) as a 90s phenomenon, but that might be wrong. At some point that evolved into the more widespread ‘mindfulness’ (colouring books, crochet, apps). Marketing-wise there have been phenomena like hygge (as far as I’ve seen, books of the Chicken Soup type, but with more crossover into other areas, as with mindfulness) and, in Scotland at least, hygge rebranded, aggravatingly, as ‘coorie.’ In this context ‘coorie’ is a similar concept to ‘hygge’ but it’s not really how I’ve been used to hearing the word used through my life, so something like ‘A Little Book of Coorie’ just doesn’t sound right. But maybe a book of hygge doesn’t either, if you grew up with that word?

People take comfort in pretty much anything that distracts them, so often the best kind of comfort is being active; walking, running, working or eating, and I understand that; nothing keeps you in the moment or prevents brooding like focusing on what you’re doing. But, unless you’re in a warzone or something, it’s when you aren’t busy that the world seems the most oppressive, and while running may keep you occupied, which can be comforting, it isn’t ‘comfortable’ (for me) in the usual sense of the word. Personally, the things I do for comfort are most likely to be the same things I write about most often, because I like them; reading, listening to music, watching films or TV.

Comfort reading, comfort viewing, comfort listening are all familiar ideas, and at first I assumed that the core of what makes them comforting must be their familiarity. And familiarity presumably does have a role to play – I probably wouldn’t turn to a book I knew nothing about for comfort, though I might read something new by an author I already like. Familiarity, though it might be – thinking of my own comfort reads – the only essential ingredient for something to qualify as comforting, is in itself a neutral quality at best and definitely not automatically comforting. But even when things are comforting, does that mean they have anything in common with each other, other than the circular fact of their comforting quality? Okay, it’s getting very annoying writing (and reading) the word comforting now.

Many of the books that I’d call my all-time favourites don’t pass the comfort test; that is, I have to be in the mood for them. I love how diverse and stimulating books like Dawn Ades’ Writings on Art and Anti-Art and Harold Rosenberg’s The Anxious Object are, but although I can dip into them at almost any time, reading an article isn’t the same as reading a book. There are not many novels I like better than The Revenge for Love or The Apes of God by Wyndham Lewis. They are funny and clever and mean-spirited in a way that I love and I’ve read them several times and will probably read them again; but I never turn to Lewis for comfort. But even though he would probably be glad not to be a ‘comfort read,’ that has nothing (as far as I can tell) to do with the content of his books. Some of my ‘comfort reads’ are obvious, and in analysing them I can come up with a list of plausible points that make them comforting, but others less so.

random selection of comfort reads

In that obvious category are books I read when I was young, but that I can still happily read as an adult. There is an element of nostalgia in that I’m sure, and nostalgia in its current form is a complicated kind of comfort. I first read The Lord of the Rings in my early teens but, as I’ve written elsewhere, I had previously had it read to me as a child, so I feel like I’ve always known it. Obviously that is comforting in itself, but there’s also the fact that it is an escapist fantasy; magical and ultimately uplifting, albeit in a bittersweet way. The same goes for my favourites of Michael Moorcock’s heroic fantasy series. I read the Corum, Hawkmoon and Elric series (and various other bits of the Eternal Champion cycle) in my teens and though Moorcock is almost entirely different from Tolkien, the same factors (escapist fantasy, heroic, magical etc) apply. Even the Robert Westall books I read and loved as a kid, though they (The Watch House, The Scarecrows, The Devil on the Road, The Wind Eye, The Machine Gunners, Fathom Five) are often horrific, have the comforting quality that anything you loved when you were 11 has. Not that the books stay the same; as an adult they are, surprisingly, just as creepy as I remembered, but I also notice things I didn’t notice then. Something too mild to be called misogyny, but a little uncomfortable nonetheless and, more impressively, characters that I loved and identified with now seem like horrible little brats, which I think is actually quite clever. But that sense of identification, even with a horrible little brat, has a kind of comfort in it, possibly.

The same happens with (mentioned in too many other things on this site) IT. A genuinely nasty horror novel about a shapeshifting alien that pretends to be a clown and kills and eats children doesn’t at first glance seem like it should be comforting. But if you read it when you were thirteen and identified with the kids rather than the monster, why wouldn’t it be? Having all kinds of horrible adventures with your friends is quite appealing as a child and having them vicariously via a book is the next best thing, or actually a better or at least less perilous one.

But those are books I read during or before adolescence and so the comforting quality comes to them naturally, or so it seems. The same could be true of my favourite Shakespeare plays, which I first read during probably the most intensely unhappy part of my adolescence – but in a weird, counterintuitive way, that adds to the sense of nostalgia. Sue Townsend’s Adrian Mole books are kind of in a category of their own. When I read the first one, Adrian was 13 and I would have been 11. And then I read the second a year or so later, but the others just randomly through the years. I’m not sure I was even aware of them when they were first published, but the ones where Adrian is an adult are just as funny but also significantly more painful. It’s a strange thing to read about the adult life of a character you “knew” when you were both unhappy children. Although she had a huge amount of acclaim and success during her life, I’m still not sure Townsend gets quite the credit she’s due for making Adrian Mole a real person. Making a nerdy teenager’s difficult adolescence – and his cancer treatment as a still-unhappy adult – funny is a real imaginative and empathic achievement. Still, the comfort there could be in the familiar, not just the character but the world he inhabits. Adrian is, reading him as an adult (and as he becomes an adult), surprisingly nuanced; even though he’s uptight and conservative and in a way a Little Englander and terminally unreliable as a teenager, and loses none of those traits as an adult, you somehow know that you can count on him not to be a Nazi or misogynist, no small thing in this day and age.

But if Frodo and Elric and Adrian Mole are characters who I knew from childhood or adolescence, what about A Clockwork Orange, which I first read and immediately loved in my early 20s and which, despite the (complicatedly) happy ending could hardly be called uplifting? Or The Catcher in the Rye, which again I didn’t read until my 20s and have been glad ever since that I didn’t “do” it at school as so many people did. Those books have a lot in common with Adrian Mole, in the sense that they are first-person narratives by troubled teenagers. Not that Alex is “troubled” in the Adrian/Holden Caulfield sense. But maybe it’s that sense of a ‘voice’ that’s comforting? If so, what does that say about the fact that Crash by JG Ballard or worse, American Psycho is also a comfort read for me? I read both of those in my 20s too, and immediately liked them but not in the same way as The Catcher in the Rye. When I read that book, part of me responds to it in the identifying sense; that part of me will probably always feel like Holden Caulfield, even though I didn’t do the things he did or worry about ‘phonies’ as a teenager. I loved Crash from the first time I read the opening paragraphs but although there must be some sense of identification (it immediately felt like one of ‘my’ books) and although I have a lot of affection for Ballard as he comes across in interviews, I don’t find myself reflected in the text, thankfully. Same (even more thankfully) with American Psycho – Patrick Bateman is an engaging, very annoying narrator (more Holden than Alex, interestingly) and I find that, as with Alex in A Clockwork Orange, his voice feels oddly effortless to read. Patrick isn’t as nice(!) or as funny or clever as Alex, but still, there’s something about his neurotic observations and hilariously tedious lists that’s – I don’t know, not soothing to read, exactly, but easy to read. Or something. Hmm.

But if Alex, Adrian, Holden and Patrick feel real, what about actual real people? I didn’t read Jake Adelstein’s Tokyo Vice until I was in my early 30s, but it quickly became a book that I can pick up and enjoy at any time. And yet, though there is a kind of overall narrative and even a sort of happy ending, that isn’t really the main appeal; and in this case it isn’t familiarity either. It’s episodic and easy to dip into (Jon Ronson’s books have that too, and so do George Orwell’s Essays and Journalism and Philip Larkin’s Selected Letters, which is another comfort read from my 20s). The culture of Japan that Adelstein documents as a young reporter has an alien kind of melancholy that is somehow hugely appealing even when it’s tragic. Another true (or at least fact-based) comfort read, Truman Capote’s In Cold Blood, which I only read in my 40s after meaning to read it ever since high school, has no business whatsoever being comforting. So why is it? I’m not getting any closer to an answer.

Predictability presumably has a role to play; as mentioned above, I wouldn’t read a book for the first time as ‘a comfort read’ and even though I said I might read a familiar author that way, it suddenly occurs to me that that is only half true. I would read Stephen King for comfort, but I can think of at least two of his books where the comfort has been undone because the story went off in a direction that I didn’t want it to. That should be a positive thing; predictability, even in genre fiction which is by definition generic to some extent, is the enemy of readability and the last thing you want is to lose interest in a thriller. I’ve never been able to enjoy whodunnit-type thrillers for some reason; my mother loved them and they – Agatha Christie, Ngaio Marsh, Sue Grafton, even Dick Francis – were her comfort reads. Maybe they are too close to puzzles for my taste? Not sure.

So, to summarise: well-loved stories? Sometimes comforting. Identifiable-with characters? Sometimes comforting. Authorial voices? This may be the only unifying factor in all the books I’ve listed, and yet it still seems a nebulous kind of trait, and Robert Westall has little in common with Sue Townsend or Bret Easton Ellis, or (etc, etc). So instead of an actual conclusion, I’ll end with a funny, sad and comforting quote from a very silly, funny but in some ways comforting book; Harry Harrison’s 1965 satirical farce Bill, the Galactic Hero. The book is in lots of ways horrific; Bill, an innocent farm boy, finds himself swept up into the space corps and a series of ridiculous and perilous adventures. The ending of the book is both funny and very bitter, but rewinding to the end of part one, Bill has lost his left arm in combat and had a new one grafted on – but a right arm, which belonged to his best friend:

He wished he could talk to some of his old buddies, then remembered that they were all dead and his spirits dropped further. He tried to cheer himself up but could think of nothing to be cheery about until he discovered that he could shake hands with himself. This made him feel a little better. He lay back on the pillows and shook hands with himself until he fell asleep.

Harry Harrison, Bill the Galactic Hero, p.62 (Victor Gollancz, 1965)

what do you look like?

A few years ago a friend sent me a photograph of the ten-year-old us in our Primary School football team. I was able, without too much thought, to put names to all eleven of the boys, but the biggest surprise was that my initial reaction, for maybe a second but more like two seconds, was not to recognise myself. In my defence, I don’t have any other pictures of me at that age, and even more unusually, in that picture I’m genuinely smiling. Usually I froze when a camera was pointed at me (and still do, if it takes too long), but I must have felt safer than usual in a group shot, because it is a real smile and not the standard grimace that normally happened when I was asked to smile for photographs. I could possibly also be forgiven for my confusion because in contrast with my present self, ten-year-old me had no eyebrows, a hot-pink-to-puce complexion and unmanageably thick, wavy, fair hair; but even so, that was the face I looked at in the mirror every day for years and, more to the point, that gangly child with comically giant hands actually is me; but what would I know?

My favourite of David Hockney’s self portraits – Self Portrait with Blue Guitar (1977)

In a recent documentary, the artist David Hockney made a remark (paraphrased because I don’t have it to refer to) that resonated with me: your face isn’t for you, it’s for other people. And, as you’d expect of someone who has spent a significant part of his long career scrutinizing people and painting portraits of them, he has a point. Everyone around you has a more accurate idea of what you look like than you do. Even when you see someone ‘in real life’ who you are used to seeing in photographs or films, there’s a moment of mental recalibration; even if they look like their image, the human being before you in three dimensions is a whole different scale from the thing you are used to seeing. I remember reading in some kids’ novel that the young footballer me liked (I’m guessing Willard Price but can’t swear to it) that when being shown photographs of themselves, the indigenous people of (I think) New Guinea not only weren’t impressed, but didn’t recognise them as anything in particular. Like Hockney, they had a point; if the Victorian people who invented photography hadn’t grown up with a tradition of ‘realistic,’ representational art would they have seen any relationship between themselves as living, breathing, colourful, space-filling three-dimensional organisms and the monochromatic marks on little flat pieces of paper? The response of the fictional New Guinea tribespeople is actually more logical than the response (surprise, wonder, awe) that’s expected of them in the novel.

Hockney went on further to say that portrait painting (if the sitter is present with the artist) gives a better idea of a person than photography does. At first this is a harder argument to buy into in a way, but it has its own logic too. A photograph, as he pointed out, is a two-dimensional record of one second in time, whereas the portrait painter creates their also two-dimensional image from spending time in the company of the sitter and focusing on them, a different, deeper kind of focus, since it engages the brain as well as the senses, than the technical one that happens with a lens, light and film or digital imaging software. A camera doesn’t care what you are like, it just sees how you look, from that angle, for that second. Maybe my big 10-year-old smile really is representative of how I was, but from memory it doesn’t represent that period for me at all.

Egon Schiele in his studio c.1915 (left) vs his 1913 self-portrait (right)

But I might never have written this had I not been reading Frank Whitford’s excellent monograph on the Austrian expressionist painter Egon Schiele (Thames & Hudson, 1981). Schiele is famous for (among other things) his twisted, emaciated and fanatically awkward self-portraits. The man he depicts is scrawny, elongated, intense, sometimes almost feline and utterly modern. Schiele in photographs, on the other hand, is quite a different presence. He sometimes has the expected haunted look and the familiar shock of hair, and he poses almost as awkwardly, but otherwise he looks surprisingly dapper, civilised, diminutive, square-faced and elfin. But if we think – and it seems logical that we do – that the photographs show us the ‘real’ Schiele, then the descriptions of those who knew him suggest otherwise: “a slim young man of more than average height… Pale but not sickly, thin in the face, large dark eyes and full longish dark brown hair which stood out in all directions. His manner was a little shy, a little timid and a little self-confident. He did not say much, but when spoken to his face always lit up with the glimmer of a quiet smile.” (Heinrich Benesch, quoted in Whitford, p.66) This description doesn’t exactly clash with the Schiele of the photographs (though he never appears especially tall), but it’s somehow far easier to identify with the dark-eyed, paradoxically shy and confident Schiele of the self-portraits. In his own writings, Schiele seems as tortured and intense as in his paintings, but in photographs he appears confident, knowing and slightly arch. His face, as Hockney says, may not have been for him, but he seems to have captured it in his art in ways that his friends and acquaintances recognised, and which the camera apparently didn’t.

Schiele in 1914 by Josef Anton Trčka (left) vs his 1911 self portrait (right)

Which, what, proves Hockney both right (portraiture is superior to photography) and wrong (Schiele knew his own face)? And anyway, what does that have to do with the 10-year old me? Nothing really, except that the camera, objective and disinterested, captured an aspect of me in that second which may or may not have been “true.” Objectivity and disinterestedness are positive qualities for evaluating facts, but when it comes to human beings, facts and truth have a complicated relationship. Photography, through its “realness,” has issues capturing these complexities, unless the photographer is aware of them and – Diane Arbus and Nan Goldin spring to mind – has the ability to imbue their work with more than the obvious surface information that is the camera’s speciality. But manually-created art, with its human heart and brain directing, naturally takes the relationship between truth and facts in its stride.

One final example that proves nothing really, except to my satisfaction. Around the year 1635, the Spanish painter Diego Velázquez was tasked with painting portraits of the assorted fools, jesters, dwarfs and buffoons whose lives were spent entertaining the Spanish court. Most of these people suffered from mental or physical disabilities (or both) and were prized (I think a more accurate word than ‘valued’ in this context) for their difference from ‘normal’ people; in the same way as carnival “freaks” into the early 20th century, in fact. Although these people were comparatively privileged, compared to what their lives would have been like had they not been adopted by the Royal court, their position in the household was more akin to pets than friends or even servants. Juan de Calabazas (“John of Gourds”; a gourd was a traditional jester’s attribute) suffered from unknown mental illnesses and physical tics. In a time and place where formality and manners were rigidly maintained, especially around the monarch – where a misstep in etiquette could have serious or even fatal consequences – buffoons like Juan entertained the court with unfettered, sometimes nonsensical or outrageous speech, impulsive laughter and strange, free behaviour. Whereas in normal society these people would be lucky even to survive, in the Court their behaviour was celebrated and encouraged. Velázquez is rightly famous for the empathy and humanity with which he painted portraits of these marginalised figures, but although, as Wikipedia (why not?) puts it, “Velázquez painted [Juan] in a relatively calm state, further showing Velazquez’s equal show of dignity to all, whether king or jester,” that seems an unusual response to the portrait below. It’s not untrue, but for me at least, Velázquez’s process of humanisation is painful too. The knowledge that this man lived his life as a plaything of the rich and powerful, alive only because they found him funny, is troubling enough.
But that pathos seems to be embodied in the picture and you know, or it feels like you know, that Velázquez didn’t find him funny, or at least not only funny. It’s something like watching David Lynch’s The Elephant Man compared to looking at the Victorian photographs of the real Joseph Merrick. Seeing the photographs is troubling; seeing Lynch’s cinematic portrait is troubling too, but it is also deeply moving.

Juan de Calabazas (c.1635-9) by Diego Velázquez

All of which may just be a way of saying that a camera is a machine and does what it does – recording the exterior of what it’s pointed at – perfectly, while a human being does, and feels, many things simultaneously, probably not perfectly. Well I’m sure we all knew that anyway. I eventually got eyebrows, by the way.

 

most things don’t exist

 

eh, Mel Gibson: but he played a good Hamlet (dir Franco Zeffirelli, 1990)

With apologies to Marcel Proust – but not very vehement apologies, because it’s true – the taste of honey on toast is as powerfully evocative and intensely transporting to me as anything that I can think of. The lips and tongue that made that association happen don’t exist anymore and neither does the face, neither do the eyes, and neither does one of the two brains and/or hearts* that I suppose really made it happen (mine are still there, though). In 21st century Britain, it’s more likely than not that even her bones don’t exist anymore, which makes the traditional preoccupation with returning to dust feel apt and more immediate and (thankfully?) reduces the kind of corpse-fetishising morbidity that seems to have appealed so much to playgoers in the Elizabethan/Jacobean era.

Death & Youth (c.1480-90) by the unknown German artist known as The Master of the Housebook

Thou shell of death,
Once the bright face of my betrothed lady,
When life and beauty naturally fill’d out
These ragged imperfections,
When two heaven-pointed diamonds were set
In those unsightly rings: then ’twas a face
So far beyond the artificial shine
Of any woman’s bought complexion

(The Revenger’s Tragedy (1606/7) by Thomas Middleton and/or Cyril Tourneur, Act one, Scene one)

 

* Is the heart in the brain? In one sense obviously not, in another maybe, but the sensations associated with the heart seem often to happen somewhere around the stomach; or is that just me?

More to the point, “here hung those lips that I have kissed I know not how oft“, etc. All of which is beautiful; but for better or worse, a pile of ash isn’t likely to engender the same kind of thoughts or words as Yorick’s – or anybody’s – skull. But anyway, the non-existence of a person – or, even more abstractly, the non-existence of skin that has touched your skin (though technically of course all of the skin involved in those kisses has long since disappeared into dust and been replaced anyway) is an absence that’s strange and dismal to think about. But then most things don’t exist.

Vanitas: Still Life with Skull (c.1671) by an unknown English painter

But honey does exist of course; and the association between human beings and sugary bee vomit goes back probably as long as human beings themselves. There are Mesolithic cave paintings, 8000 years old or more, made by people who don’t exist, depicting people who may never have existed except as drawings, or may have once existed but don’t anymore, plundering beehives for honey. Honey was used by the ancient Egyptians, who no longer exist, in some of their most solemn rites, it had sacred significance for the ancient Greeks, who no longer exist, it was used in medicine in India and China, which do exist now but technically didn’t then, by people who don’t, now. Mohammed recommended it for its healing properties; it’s a symbol of abundance in the Bible and it’s special enough to be kosher despite being the product of unclean insects. It’s one of the five elixirs of Hinduism, Buddha was brought honey by a monkey that no longer exists. The Vikings ate it and used it for medicine too. Honey was the basis of mead, the drink of the Celts who sometimes referred to the island of Britain as the Isle of Honey.

probably my favourite Jesus & Mary Chain song: Just Like Honey (1985)

And so on and on, into modern times. But also (those Elizabethan-Jacobeans again): “The sweetest honey is loathsome in his own deliciousness, And in the taste confounds the appetite.” (William Shakespeare, Romeo and Juliet (c.1595), Act 2, Scene 6). “Your comfortable words are like honey. They relish well in your mouth that’s whole; but in mine that’s wounded they go down as if the sting of the bee were in them.” (John Webster, The White Devil (1612), Act 3, Scene 3). See also “honey trap,” “Man produces evil as a bee produces honey,” and “You catch more flies with honey.”

But on the whole, the sweetness of honey is not and has never been sinister. A Taste of Honey, Tupelo Honey, “Wild Honey,” “Honey Pie”, “Just like Honey,” “Me in Honey,” “Put some sugar on it honey,” Pablo Honey, “Honey I Sure Miss You.” Honey to the B. “Honey” is one of the sweetest (yep) of endearments that people use with each other. Winnie-the-Pooh and Bamse covet it. Honey and toast tasted in a kiss at the age of 14 is, in the history of the world, a tiny and trivial thing, but it’s enough to resonate throughout a life, just as honey has resonated through the world’s human cultures. Honey’s Dead. But the mouth that tasted so sweetly of honey doesn’t exist anymore. Which is sad, because loss is sad. But how sad? Most things never exist and even most things that have existed don’t exist now, so maybe the fact that it has existed is enough.

“Most things don’t exist” seems patently untrue: for a thing to be ‘a thing’ it must have some kind of existence, surely? And yet, even leaving aside things and people that no longer exist, we are vastly outnumbered by the things that have never existed, from the profound to the trivial. On the profound side – and without wishing to offend people and their beliefs – probably few people would now say that Zeus and his extended family are really living in a real Olympus. Trivially, 70-plus years on from the great age of the automobile, flying cars as imagined by generations of children, as depicted in books and films, are still stubbornly absent from the skies above our roads. The idea of them exists, but even if – headache-inducing notion – it exists as a specific idea (“the idea of a flying car”), rather than just within the general realm of “ideas,” an idea is an idea, a thing perhaps but not the thing that it is about. Is a specific person’s memory of another person a particular thing because it relates to a particular person, or does it exist only under the larger and more various banner of “memories”? Either way, it’s immaterial, because even though the human imagination is a thing that definitely exists, the idea of a flying car is no more a flying car than Leonardo da Vinci’s drawing of a flying machine was a flying machine, or than my memory of honey-and-toast kisses is a honey-and-toast kiss.

If you or I picture a human being with electric blue skin, we can imagine it and if we have the talent we can draw it, someone could depict it in a film, but it wouldn’t be the thing itself, because human beings with electric blue skin, like space dolphins, personal teleportation devices, seas of blood, winged horses, articulate sentient faeces and successful alchemical experiments, don’t exist. And depending on the range of your imagination (looking at that list mine seems a bit limited), you could think of infinite numbers of things that don’t exist. There are also, presumably, untold numbers of things that do exist but that we personally don’t know about, or that we as a species don’t know about yet. But even if it were possible to make a complete list of all of the things in existence (or things in existence to date; new things are invented or develop or evolve all the time), it would always be possible to think of even more things that don’t exist – simply, in the least imaginative way, by naming variations on, or parodies of, everything that does exist. So supermassive black holes exist? Okay, but what about supertiny pink holes? What about supermedium beige holes? This June, a new snake (disappointingly named Ovophis jenkinsi) was discovered. But what about a version of Ovophis jenkinsi that sings in Spanish or has paper bones or smells like Madonna? They don’t exist.

JAMC Honey’s Dead, 1992

Kind of a creepy segue if you think about it (so please don’t), but like those beautifully-shaped lips that tasted of honey, my mother no longer exists, except as a memory, or lots of different memories, belonging to lots of different people. Presumably she exists in lots of memories as lots of different people who happen to have the same name. But unlike supermedium beige holes, the non-existence of previously-existing things and people is complex, because of the different perspectives they are remembered from. Regardless, though, they are still fundamentally not things anymore. And even with the ever-growing, almost-infinite number of things, there are, demonstrably, more things that don’t exist. And, without wishing to be horribly negative or to repeat things I’ve written before, one of the surprises with the death of a close relative was to find that death does exist. Well, obviously, everyone knows that – but not just as an ending or as the absence of life, as was always known, but as an active, grim-reaper-like force of its own. For me, the evidence for that – which I’m sure could be explained scientifically by a medical professional – is the cold that I mentioned in the previous article. Holding a hand that gets cold seems pretty normal; warmth ebbing away as life ebbs away; that’s logical and natural. But this wasn’t the expected (to me) cooling down of a warm thing to room temperature, like the un-drunk cups of tea which day after day were brought and cooled down because the person they were brought for didn’t really want them anymore, just the idea of them. That cooling felt natural, as did the warming of the glass of water that sat un-drunk at the bedside because the person it was for could no longer hold or see it. That water had been cold and had warmed up to room temperature; the cold in the hand, though, wasn’t just a settling in line with ambient conditions.
It was active cold; hands chilling and then radiating cold in quite an intense way, a coldness that dropped far below room temperature. I mentioned it to a doctor during a brief, unbelievably welcome break to get some air, and she said “Yes, she doesn’t have long left.” Within a few days I wished I’d asked for an explanation of where that cold was coming from; where is it generated? Which organ in the human body can generate cold so quickly and intensely? Does it do it in any other situations? And if not, why not? So, although death can seem abstract, in the same sense that ‘life’ seems abstract, being big and pervasive, death definitely exists. But as what? Don’t know; not a single entity, since it’s incipient in everyone, coded into our DNA: but that coding has nothing to do with getting hit by cars or drowning or being shot, does it? So, a big question mark to that. Keats would say not to question it, just to enjoy the mystery. Well alright then.

Klaus Nomi as “the Cold Genius” from his 1981 version of Purcell’s “The Cold Song”

But since most things *don’t* exist, while death definitely does, existence is, in universal terms, rare enough to be something like winning the lottery. But like winning the lottery, existence in itself is not any kind of guarantee of happiness or satisfaction or even honey-and-toast kisses; but it at least offers the possibility of those things, whereas non-existence doesn’t offer anything, not even peace, which has to be experienced to exist. We have all not existed before and we will all not exist again; but honey will still be here, for as long as bees are at least. I don’t know if that’s comforting or not. But if you’re reading this – and I’m definitely writing it – we do currently exist, so try to enjoy your lottery win, innit.

Something silly about music next time I think.

Ancient Roman vanitas mosaic showing a skull and the wheel of fortune

passive-digressive

There are two kinds of people* – those who like forewords, introductions, prefaces, author’s notes, footnotes, appendices, bibliographies, notes on the text, maps etc, and those who don’t. But we’ll get back to that shortly.

* there are more than two kinds of people. Possibly infinite kinds of people. Or maybe there’s only one kind; I’m never sure

A few times recently, I’ve come across the idea (which I think is mainly an American academic one, but I might be completely mistaken about that) that parentheses should only be used when you really have to (but when do you really have to?), because anything that is surplus to the requirements of the main thrust of one’s text is surplus to requirements full stop, and should be left out. But that’s wrong. The criticism can be and is extended to anything that interrupts the flow* of the writing. But that is also wrong. Unless you happen to be writing a manual or a set of directions or instructions, writing isn’t (or needn’t be) a purely utilitarian pursuit, and the joy of reading (or of writing) isn’t in how quickly or efficiently (whatever that means in this context) you can do it. Aside from technical writing, the obvious example where economy may well be valuable is poetry – which, however, is different and should probably have been included in a footnote, because footnotes are useful for interrupting text without separating the point you’re making (in a minute) from the point you’re commenting on or adding to (a few sentences ago), without other, different stuff getting in the way.

*like this¹

¹but bear in mind that people don’t write footnotes by accident – the interruption is deliberate²

²and sometimes funny

Poly-Olbion – that’s how you write a title page to pull in the readers

I would argue (though the evidence of a lot of poetry itself perhaps argues against me – especially the Spenser’s Faerie Queene, Michael Drayton’s Poly-Olbion kind of poetry that I’m quite fond of) that a poem should be** the most economical or at least the most effective way of saying what you have to say – but who’s to say that economical and effective are the same thing anyway?

** poets, ignore this; there is no should be

Clearly (yep), the above is a needlessly convoluted way of writing, and can be soul-achingly annoying to read; but – not that this is an effective defence – I do it on purpose. As anyone who’s read much here before will know, George Orwell is one of my all-time favourite writers, and people love to quote his six rules for writing; but while I would certainly follow them if writing a news story or article where brevity is crucial, otherwise I think it’s more sensible to pick and choose. So:

Never use a metaphor, simile, or other figure of speech which you are used to seeing in print. Absolutely; although sometimes you would use them because they are familiar, if making a specific point, or being amusing. Most people, myself included, just do it by accident – because where does the dividing line fall? In this paragraph I have used “by accident” and “dividing line”, which seem close to being commonly used figures of speech (but then so does “figure of speech”). But would “accidentally” or something like “do it without thinking” be better than “by accident”? Maybe.

Never use a long word where a short one will do. The key point here is will do. In any instance where a writer uses (for example) the word “minuscule”, then “small” or “tiny” would probably “do”. But depending on what it is they are writing about, minuscule or microscopic might “do” even better. Go with the best word, not necessarily the shortest.

If it is possible to cut a word out, always cut it out. Note that Orwell wrote ‘always’ here where he could just have said If it is possible to cut a word out, cut it out. Not everything is a haiku, George.

Never use the passive where you can use the active. Surely it depends what you’re writing? If you are trying, for instance, to pass the blame for an assault from a criminal on to their victim, you might want a headline that says “X stabbed after drug and alcohol binge” rather than “Celebrity kills X.” You kind of see Orwell’s point though.

Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent. Both agree and disagree; as a mostly monolingual person I agree, but some words and phrases (ironically, usually ones in French, a language I have never learned and feel uncomfortable trying to pronounce; raison d’être or enfant terrible for example) just say things more quickly and easily (I can be utilitarian too) than having to really consider and take the time to say what you mean. They are a shorthand that people in general understand. Plus, in the age of smartphones, it really doesn’t do native English speakers any harm to have to look up the meanings of foreign words occasionally (I do this a lot). The other side of the coin (a phrase I’m used to seeing in print) is that with foreign phrases it’s funny to say them in bad translations like “the Tour of France” (which I guess must be correct) or “piece of resistance” (which I am pretty sure isn’t), so as long as you are understood (assuming that you want to be understood) use them any way you like.

Break any of these rules sooner than say anything outright barbarous. It’s hard to guess what George Orwell would have considered outright barbarous (and anyway, couldn’t he have cut “outright”?) but anyone reading books from even 30, 50 or a hundred years ago quickly sees that language evolves along with culture, so that rules – even useful ones – rarely have the permanence of commandments.

So much for Orwell’s rules; I was more heartened to find that something I’ve instinctively done – or not done – is supported by Orwell elsewhere. That is, that I prefer, mostly in the name of cringe-avoidance, not to use slang that post-dates my own youth. Even terms that have become part of normal mainstream usage (the most recent one is probably “woke”) tend to appear with inverted commas if I feel I must use them, because if it’s not something I would be happy to say out loud (I say “woke” with inverted commas too) then I’d prefer not to write it. There is no very logical reason for this, and words that I do comfortably use are no less subject to the whims of fashion, but still; the language you use is part of who you are, and I think Orwell makes a very good case here (fuller version far below somewhere, because even though I have reservations about parts of it, it ends very well):

“Each generation imagines itself to be more intelligent than the one that went before it, and wiser than the one that comes after it. This is an illusion, and one should recognise it as such, but one ought also to stick to one’s world-view, even at the price of seeming old-fashioned: for that world-view springs out of experiences that the younger generation has not had, and to abandon it is to kill one’s intellectual roots.”

Review of A Coat of Many Colours: Occasional Essays by Herbert Read (1945), in The Collected Essays, Journalism and Letters of George Orwell, Volume 4, Penguin, 1968, p. 72

the fold-out map in The Silmarillion is a thing of beauty

Back to those two kinds* of people: I am the kind of person that likes and reads forewords, introductions, prefaces, author’s notes, footnotes, appendices, bibliographies, notes on the text, maps and all of those extras that make a book more interesting/informative/tedious.

 

*I know.

 

In one of my favourite films, Whit Stillman’s Metropolitan (1990), the protagonist Tom Townsend (Edward Clements) says “I don’t read novels. I prefer good literary criticism. That way you get both the novelists’ ideas as well as the critics’ thinking. With fiction I can never forget that none of it really happened, that it’s all just made up by the author.” Well, that is not me; but I do love a good bit of criticism and analysis as well as a good novel. One of my favourite ever pieces of writing of any kind, which I could, but choose not to, recite parts of by heart, is the late Anne Barton’s introduction to the 1980 New Penguin Shakespeare edition of Hamlet*. I love Hamlet, but I’ve read Barton’s introduction many more times than I’ve read the play itself, to the point where phrases and passages have become part of my mind’s furniture. It’s a fascinating piece of writing, because Professor Barton had a fascinating range and depth of knowledge, as well as a passion for her subject; but also and most importantly because she was an excellent writer. If someone is a good enough writer**, you don’t even have to be especially interested in the subject to enjoy what they write. Beyond the introduction/footnote, but related in a way, are the review and essay. Another of my favourite books – mentioned elsewhere I’m sure, as it’s one of the reasons that I have been working as a music writer for the past decade and a half – is Charles Shaar Murray’s Shots from the Hip, a collection of articles and reviews. The relevant point here is that more than half of its articles – including some of my favourites – are about musicians whose work I’m quite keen never to hear under any circumstances, if humanly possible. Similarly, though I find it harder to read Martin Amis’s novels than I used to (just changing taste, not because I think they are less good), I love the collections of his articles, especially The War Against Cliché and Visiting Mrs Nabokov.
I already go on about Orwell too much, but as I must have said somewhere, though I am a fan of his novels, it’s the journalism and criticism that he probably thought of as ephemeral that appeals to me the most.

*All of the New Penguin Shakespeare introductions that I’ve read have been good, but that one is in a different league. John Dover Wilson’s What Happens in Hamlet (1935, though the edition I have mentions WW2 in the introduction, as I remember; I like the introduction) is sometimes easy to disagree with, but it has a similar excitement-of-discovery tone to Anne Barton’s essay

** Good enough, schmood enough; what I really mean is if you like their writing enough. The world has always been full of good writers whose work leaves me cold

a scholarly approach to comics

All this may have started – as I now realise lots of things in my writing seem to have done – with Tolkien. From the first time I read his books myself, I loved that whatever part of Middle-earth and its people you were interested in, there was always more to find out. Appendices, maps, whole books like The Silmarillion which extended the enjoyment and deepened the immersion in Tolkien’s imaginary world. And they were central to that world – for Tolkien, mapping Middle-earth was less making stuff up than it was a detailed exploration of something he had already at least half imagined. Maybe because I always wanted to be a writer myself – and here I am, writing – whenever I’ve really connected with a book, I’ve always wanted to know more. I’ve always been curious about the writer, the background, the process. I’ve mentioned Tintin lots of times in the past too, and my favourite Tintin books were, inevitably, the expanded editions which included Hergé’s sketches and ideas, the pictures and objects and texts that inspired him. I first got one of those Tintin books when I was 9 or so, but as recently as the last few years I bought an in many ways similar expanded edition of one of my favourite books as an adult, JG Ballard’s Crash. It mirrors the Tintins pretty closely: explanatory essays, sketches, notes, ephemera, all kinds of related material. Now just imagine how amazing a graphic novel of Crash in the Belgian ligne claire style would be.*

*a bit like Frank Miller and Geof Darrow’s fantastic-looking but not all that memorable Hard Boiled (1990-92) I guess, only with fewer robots-with-guns shenanigans and more Elizabeth Taylor

a scholarly approach to cautionary 1970s semi-pornography/horror: the expanded Crash

A good introduction or foreword is (I think) important for a collection of poems or a historical text of whatever kind. Background and context and, to a lesser extent, analysis expand the understanding and enjoyment of those kinds of things. An introduction for a modern novel though is a slightly different thing and different also from explanatory notes, appendices and footnotes and it’s probably not by chance that they mainly appear in translations or reprints of books that already enjoyed some kind of zeitgeisty success. When I first read Anne Barton’s introduction to Hamlet, I already knew what Hamlet was about, more or less. And while I don’t think “spoilers” are too much of an issue with fiction (except for whodunnits, which I have so far not managed to enjoy), do you really want to be told what to think of a book before you read it? But a really good introduction will never tell you that. If in doubt, read them afterwards!

Some authors, and many readers, see all of these extraneous things as excess baggage, surplus to requirements, which obviously they really are, and that’s fair enough. If the main text of a novel, a play or whatever, can’t stand on its own then no amount of post-production scaffolding will make it satisfactory.* And presumably, many readers pass their entire lives without finding out or caring why the author wrote what they wrote, or what a book’s place in the pantheon of literature (or just “books”) is. Even as unassailably best-selling an author as Stephen King tends to be a little apologetic about the author’s notes that end so many of his books, despite the fact that nobody who doesn’t read them will ever know that he’s apologetic. Still; I for one would like to assure his publisher that should they ever decide to put together all of those notes, introductions and prefaces in book form, I’ll buy it. But would Stephen King be tempted to write an introduction for it?


* though of course it could still be interesting, like Kafka’s Amerika, Jane Austen’s Sanditon or Tolkien and Hergé (them again) with Unfinished Tales or Tintin and Alph-Art


That Orwell passage in full(er):

“Clearly the young and middle aged ought to try to appreciate one another. But one ought also to recognise that one’s aesthetic judgement is only fully valid between fairly well-defined dates. Not to admit this is to throw away the advantage that one derives from being born into one’s own particular time. Among people now alive there are two very sharp dividing lines. One is between those who can and can’t remember the period before 1914; the other is between those who were adults before 1933 and those who were not.* Other things being equal, who is likely to have a truer vision at the moment, a person of twenty or a person of fifty? One can’t say, though on some points posterity may decide. Each generation imagines itself to be more intelligent than the one that went before it, and wiser than the one that comes after it. This is an illusion, and one should recognise it as such, but one ought also to stick to one’s world-view, even at the price of seeming old-fashioned: for that world-view springs out of experiences that the younger generation has not had, and to abandon it is to kill one’s intellectual roots.”

*nowadays, the people who can or can’t remember life before the internet and those who were adults before 9/11? Or the Trump presidency? Something like that seems right