of comfort no man speak

Everybody has their comforts, but after trying to analyse some of my own to see why they should be comforting I’ve pretty much come up with nothing, or at least nothing really to add to what I wrote a few years ago: “comforting because it can be a relief to have one’s brain stimulated by something other than worrying about external events.” But that has nothing to do with what it is that makes the specific things comforting. Like many people, I have a small group of books and films and TV shows and so on that I can read or watch or listen to at almost any time, without having to be in the mood for them, and which I would classify as ‘comforting.’ They aren’t necessarily my favourite things, and they definitely weren’t all designed to give comfort, but obscurely they do. But what does that mean or signify? I’ve already said I don’t know, so it’s not exactly a cliffhanger of a question, but let’s see how I got here at least.

I’ve rewritten this part so many times: but in a way that’s apposite. I started writing it at the beginning of a new year, while wars continued to rage in Sudan and Ukraine and something even less noble than a war continued to unfold in Gaza, and as the world prepared for an only partly precedented new, oligarchical (I think at this point that’s the least I could call it) US government. Writing this now, just a few months later, I can see that events have unfolded somewhat worse than might have been expected. Those wars still continue and, despite signs to the contrary, the situation in Gaza seems if anything bleaker than before. That US administration began the year by talking about taking territory from what had been allies, supporting neo-Nazi and similar political groups across the world, celebrating high-profile sex offenders and violent criminals while pretending to care about the victims of sex offenders and violent criminals, and has gone downhill from there. In the original draft of this article I predicted that this Presidential term would be an even more farcical horrorshow (not in the Clockwork Orange/Nadsat sense, although Alex and his Droogs might well enjoy this bit of the 2020s; I suppose what I mean is ‘horror show’) than the same president’s previous one, and since it already feels like the longest presidency of my lifetime I guess I was right. So, between the actual news and the way it never stops coming (hard to remember, but pre-internet ‘the news’ genuinely wasn’t so relentless or inescapable, although events presumably happened at the same rate), it’s important to find comfort somewhere. The obvious, big caveat is that one has to be in a somewhat privileged position to be able to find some comfort in the first place.
There are people all over the world – including here in the UK – who can only find it, if at all, in things like prayer or philosophy; but regardless, not being so dragged down by current events that you can’t function is kind of important however privileged you are, and even those who find the whole idea of ‘self-love’ inimical have to find comfort somewhere.

But where? And anyway, what does comfort even mean? Well, everyone knows what it means, but though as a word it seems fluffy and soft (Comfort fabric softener, the American word “comforter” referring to a quilt), it actually comes from the Latin confortare, “to strengthen greatly” – com-, an intensifying prefix, plus fortis, “strong” – but let’s not get bogged down in etymology again.

But wherever you find it, the effect of comfort has a mysterious relationship to the things that actually offer us support or soothe our grief and mental distress. Which is not obvious: if you want to laugh, you turn to something funny – which is obviously subjective, but never mind. Sticking to books, because I can: for me lots of things would work. If I want to be amused, Afternoon Men by Anthony Powell, Sue Townsend’s Adrian Mole books and, less obviously, The Psychopath Test by Jon Ronson always raise a smile or a laugh. Conversely, if you want to be scared or disgusted (in itself a strange and obscure desire, but a common one), you’d probably turn to horror; let’s say HP Lovecraft, Stephen King’s IT or, less generic but not so different, Bret Easton Ellis’s American Psycho. But as you might have guessed if you’ve read anything else on this website, I’d probably count all of those things among my ‘comfort reads.’

not my comfort reads

But whatever I am reading, I’m not alone; people want ‘comfort reads’ and indeed there is a kind of comfort industry these days. Not just these days: over the years it’s developed from poetry anthologies and books of inspirational quotes to more twee versions of the same thing. I think of books of the Chicken Soup for the Soul kind (I don’t think I made that up; if I recall, my mother owned a little book of that title, full of ‘words of wisdom’ and comforting quotes) as a 90s phenomenon, but that might be wrong. At some point that evolved into the more widespread ‘mindfulness’ (colouring books, crochet, apps). Marketing-wise there have been phenomena like hygge (as far as I’ve seen, books of the Chicken Soup type, but with more crossover into other areas, as with mindfulness) and, in Scotland at least, hygge rebranded, aggravatingly, as ‘coorie.’ In this context ‘coorie’ is a similar concept to ‘hygge,’ but it’s not really how I’ve been used to hearing the word used through my life, so something like ‘A Little Book of Coorie‘ just doesn’t sound right. But maybe a book of hygge doesn’t either, if you grew up with that word?

People take comfort in pretty much anything that distracts them, so often the best kind of comfort is being active – walking, running, working or eating – and I understand that; nothing keeps you in the moment or prevents brooding like focusing on what you’re doing. But, unless you’re in a warzone or something, it’s when you aren’t busy that the world seems the most oppressive, and while running may keep you occupied, which can be comforting, it isn’t ‘comfortable’ (for me) in the usual sense of the word. Personally, the things I do for comfort are most likely to be the same things I write about most often, because I like them: reading, listening to music, watching films or TV.

Comfort reading, comfort viewing, comfort listening are all familiar ideas, and at first I assumed that the core of what makes them comforting must be their familiarity. And familiarity presumably does have a role to play – I probably wouldn’t turn to a book I knew nothing about for comfort, though I might read something new by an author I already like. Familiarity, though it might be – thinking of my own comfort reads – the only essential ingredient for something to qualify as comforting, is in itself a neutral quality at best and definitely not automatically comforting. But even when things are comforting, does that mean they have anything in common with each other, other than the circular fact of their comforting quality? Okay, it’s getting very annoying writing (and reading) the word comforting now.

Many of the books that I’d call my all-time favourites don’t pass the comfort test; that is, I have to be in the mood for them. I love how diverse and stimulating books like Dawn Ades’ Writings on Art and Anti-Art and Harold Rosenberg’s The Anxious Object are, but although I can dip into them at almost any time, reading an article isn’t the same as reading a book. There are not many novels I like better than The Revenge for Love or The Apes of God by Wyndham Lewis. They are funny and clever and mean-spirited in a way that I love and I’ve read them several times and will probably read them again; but I never turn to Lewis for comfort. But even though he would probably be glad not to be a ‘comfort read,’ that has nothing (as far as I can tell) to do with the content of his books. Some of my ‘comfort reads’ are obvious, and in analysing them I can come up with a list of plausible points that make them comforting, but others less so.

random selection of comfort reads

In that obvious category are books I read when I was young, but that I can still happily read as an adult. There is an element of nostalgia in that, I’m sure, and nostalgia in its current form is a complicated kind of comfort. I first read The Lord of the Rings in my early teens but, as I’ve written elsewhere, I had previously had it read to me as a child, so I feel like I’ve always known it. Obviously that is comforting in itself, but there’s also the fact that it is an escapist fantasy; magical and ultimately uplifting, albeit in a bittersweet way. The same goes for my favourites of Michael Moorcock’s heroic fantasy series. I read the Corum, Hawkmoon and Elric series (and various other bits of the Eternal Champion cycle) in my teens and though Moorcock is almost entirely different from Tolkien, the same factors (escapist fantasy, heroic, magical etc) apply. Even the Robert Westall books I read and loved as a kid, though they (The Watch House, The Scarecrows, The Devil on the Road, The Wind Eye, The Machine Gunners, Fathom Five) are often horrific, have the comforting quality that anything you loved when you were 11 has. Not that the books stay the same; as an adult they are, surprisingly, just as creepy as I remembered, but I also notice things I didn’t notice then. Something too mild to be called misogyny, but a little uncomfortable nonetheless and, more impressively, characters that I loved and identified with now seem like horrible little brats, which I think is actually quite clever. But that sense of identification, even with a horrible little brat, has a kind of comfort in it, possibly.

The same happens with (mentioned in too many other things on this site) IT. A genuinely nasty horror novel about a shapeshifting alien that pretends to be a clown and kills and eats children doesn’t at first glance seem like it should be comforting. But if you read it when you were thirteen and identified with the kids rather than the monster, why wouldn’t it be? Having all kinds of horrible adventures with your friends is quite appealing as a child, and having them vicariously via a book is the next best thing – or actually a better, or at least less perilous, one.

But those are books I read during or before adolescence, and so the comforting quality comes to them naturally, or so it seems. The same could be true of my favourite Shakespeare plays, which I first read during probably the most intensely unhappy part of my adolescence – but in a weird, counterintuitive way, that adds to the sense of nostalgia. Sue Townsend’s Adrian Mole books are kind of in a category of their own. When I read the first one, Adrian was 13 and I would have been 11. I read the second a year or so later, but the others just randomly through the years. I’m not sure I was even aware of them when they were first published, but the ones where Adrian is an adult are just as funny but also significantly more painful. It’s a strange thing to read about the adult life of a character you “knew” when you were both unhappy children. Although she had a huge amount of acclaim and success during her life, I’m still not sure Townsend gets quite the credit she’s due for making Adrian Mole a real person. Making us laugh at a nerdy teenager’s difficult adolescence, and at his cancer treatment as a still-unhappy adult, is a real imaginative and empathic achievement. Still, the comfort there could be in the familiar – not just the character but the world he inhabits. Adrian is, reading him as an adult (and as he becomes an adult), surprisingly nuanced; even though he’s uptight and conservative, in a way a little Englander, and terminally unreliable as a teenager, and loses none of those traits as an adult, you somehow know that you can count on him not to be a Nazi or misogynist – no small thing in this day and age.

But if Frodo and Elric and Adrian Mole are characters who I knew from childhood or adolescence, what about A Clockwork Orange, which I first read and immediately loved in my early 20s and which, despite the (complicatedly) happy ending, could hardly be called uplifting? Or The Catcher in the Rye, which again I didn’t read until my 20s and have been glad ever since that I didn’t “do” it at school as so many people did. Those books have a lot in common with Adrian Mole, in the sense that they are first-person narratives by troubled teenagers. Not that Alex is “troubled” in the Adrian/Holden Caulfield sense. But maybe it’s that sense of a ‘voice’ that’s comforting? If so, what does that say about the fact that Crash by JG Ballard or, worse, American Psycho is also a comfort read for me? I read both of those in my 20s too, and immediately liked them, but not in the same way as The Catcher in the Rye. When I read that book, part of me responds to it in the identifying sense; that part of me will probably always feel like Holden Caulfield, even though I didn’t do the things he did or worry about ‘phonies’ as a teenager. I loved Crash from the first time I read the opening paragraphs, but although there must be some sense of identification (it immediately felt like one of ‘my’ books) and although I have a lot of affection for Ballard as he comes across in interviews, I don’t find myself reflected in the text, thankfully. Same (even more thankfully) with American Psycho – Patrick Bateman is an engaging, very annoying narrator (more Holden than Alex, interestingly) and I find that, as with Alex in A Clockwork Orange, his voice feels oddly effortless to read. Patrick isn’t as nice(!) or as funny or clever as Alex, but still, there’s something about his neurotic observations and hilariously tedious lists that’s – I don’t know, not soothing to read, exactly, but easy to read. Or something. Hmm.

But if Alex, Adrian, Holden and Patrick feel real, what about actual real people? I didn’t read Jake Adelstein’s Tokyo Vice until I was in my early 30s, but it quickly became a book that I can pick up and enjoy at any time. And yet, though there is a kind of overall narrative and even a sort of happy ending, that isn’t really the main appeal; and in this case it isn’t familiarity either. It’s episodic and easy to dip into (Jon Ronson’s books have that quality too, and so do George Orwell’s Essays and Journalism and Philip Larkin’s Selected Letters, which is another comfort read from my 20s). The culture of Japan that Adelstein documents as a young reporter has an alien kind of melancholy that is somehow hugely appealing even when it’s tragic. Another true (or at least fact-based) comfort read, Truman Capote’s In Cold Blood, which I only read in my 40s after meaning to read it ever since high school, has no business whatsoever being comforting. So why is it? I’m not getting any closer to an answer.

Predictability presumably has a role to play; as mentioned above, I wouldn’t read a book for the first time as ‘a comfort read’ and even though I said I might read a familiar author that way, it suddenly occurs to me that that is only half true. I would read Stephen King for comfort, but I can think of at least two of his books where the comfort has been undone because the story went off in a direction that I didn’t want it to. That should be a positive thing; predictability, even in genre fiction, which is by definition generic to some extent, is the enemy of readability, and the last thing you want is to lose interest in a thriller. I’ve never been able to enjoy whodunnit-type thrillers for some reason; my mother loved them and they – Agatha Christie, Ngaio Marsh, Sue Grafton, even Dick Francis – were her comfort reads. Maybe they are too close to puzzles for my taste? Not sure.

So to summarise: well-loved stories? Sometimes comforting. Identifiable-with characters? Sometimes comforting. Authorial voices? This may be the only unifying factor in all the books I’ve listed, and yet it still seems a nebulous kind of trait, and Robert Westall has little in common with Sue Townsend or Bret Easton Ellis, or (etc, etc). So instead of an actual conclusion, I’ll end with a funny, sad and comforting quote from a very silly, funny but in some ways comforting book: Harry Harrison’s 1965 satirical farce Bill, the Galactic Hero. The book is in lots of ways horrific; Bill, an innocent farm boy, finds himself swept up into the space corps and a series of ridiculous and perilous adventures. The ending of the book is both funny and very bitter, but rewinding to the end of part one: Bill has lost his left arm in combat but had a new one grafted on – a right arm, which belonged to his best friend:

He wished he could talk to some of his old buddies, then remembered that they were all dead and his spirits dropped further. He tried to cheer himself up but could think of nothing to be cheery about until he discovered that he could shake hands with himself. This made him feel a little better. He lay back on the pillows and shook hands with himself until he fell asleep.

Harry Harrison, Bill the Galactic Hero, p.62 (Victor Gollancz, 1965)

meted out to the man

Although Mr Musk’s* statement about Hitler, Stalin and Mao is (surely not unexpectedly) ignorant and abhorrent, he is making a serious point that’s worth remembering, even if his reasons for doing so come from a paranoid (wouldn’t normally go straight for the WW2 analogy, but he already did, so why not?) bunker-mentality sense of self-preservation.
Hitler was the main architect of the Holocaust and other Nazi atrocities from murder to mental/physical torture to the indoctrination of children in a misanthropic ideology, and so he therefore bears a large part of the moral responsibility for it. BUT, he genuinely wasn’t standing there in the streets of Warsaw or the hills of Ukraine, swinging small children by the legs and smashing them to death against walls or leading groups of half-starved prisoners into ravines and machine-gunning them, or even holding a gun to the heads of those who did to make sure they did it.

*nice innit? Sounds kind of like a fox from an old children’s book

Stalin’s policies led, both directly and indirectly, to the deaths of millions, but he wasn’t personally there in the salt mines working people to death, or stabbing them in the head with ice-picks or torturing and shooting them because their vision of communism differed from his, or simply because they refused to agree with him.

Mao Zedong instigated vast, dehumanising programmes that decimated the people of his country through famine and starvation, and led campaigns that ruthlessly wiped out political opponents – but he did it with words or with a pen, not with bullets or by actually snatching food from people’s mouths.

In all of those cases, those atrocities happened for two reasons. Most importantly, because the instigators wanted them; they would not have happened without those three individuals. But also because others, most of whose names are now unknown to us without a lot of tedious and depressing research, were willing to make them happen. The people who murdered and tortured did those things, some no doubt more enthusiastically than others, because they were paid to do so. Now, there are people ending international aid to starving children, or impeding Ukraine’s fight against the invading forces of Russia, or firing veterans or ‘just’ setting up armed cordons around car dealerships and arresting people that they or their superiors are pretending, for ideological reasons, to think are dangerous aliens – and whatever the level of enthusiasm, they are essentially doing those things because they are being paid to.

Some of these people (it doesn’t matter which era or regime you apply this to, as bodycam and mobile phone footage testifies) perform additional cruelties which they aren’t specifically being paid for, and that their leaders may never even know about, just because they can and because it gratifies them in some way, while others are simply following the orders they are given.

But ‘just following orders’ – complicity, in a word – wasn’t considered a reasonable defence in the war trials of 1945 and it still isn’t one now. And the reptilian act of formulating and issuing dehumanising orders, even (or perhaps especially?) without personally committing any atrocities oneself, isn’t any kind of defence at all. It was, and should be, part of any prosecution’s case for maximum culpability. Leaders require followers and followers need leaders, but you don’t have to be either.

 

a dream itself is but a shadow

Sigmund Freud c.1885, with Freudian facial hair

Whether or not you agree with Sigmund Freud that “the dream proves to be the first of a series of abnormal psychic formations” or that “one who cannot explain the origin of the dream pictures will strive in vain to understand [the] phobias, the obsessive and delusional ideas and likewise their therapeutic importance” (The Interpretation of Dreams, 1913, translated by A.A. Brill), dreams are a regular, if not daily/nightly, part of human life regardless of culture, language, age etc, and so not without significance. I could go on about dreams like I did about honey in a previous post, but I won’t – they are too pervasive in popular culture, just everywhere: in books and plays and art and films and songs (“Dreams they complicate/complement my life,” as Michael Stipe wrote). That’s enough of that.

But what about daydreams? If dreams arrive uninvited from the unconscious or subconscious mind, then surely the things we think about, or dwell on, deliberately are even more important. “Dwell on” is an interesting phrase – to dwell is “to live in a place or in a particular way,” BUT ‘dwell’ has a fascinating history that makes it seem like exactly the right word in this situation – from the Old English dwellan “to lead into error, deceive, mislead,” related to dwelian “to be led into error, go wrong in belief or judgment” etc, etc, according to etymonline.com. I’ll put the whole of this in a footnote* because I think it’s fascinating, but the key point is that at some time in the medieval period its largely negative connotations (“to delay”) became modified to mean “to stay.” But I like to think the old meaning of the word lingers in the subtext like dreams in the subconscious.

But I could say something similar about my own use of the word “deliberately” above (“Things we dwell on deliberately”) and even more so the phrase I nearly used instead, which was “on purpose” – but then this would become a ridiculously long and convoluted piece of writing, so that’s enough etymology for now.

The human mind is a powerful thing, even for those of us who don’t believe in telekinesis or remote viewing or ‘psychic powers’ in the explicitly paranormal sense. After all, your mind controls everything you think and nearly everything you do, to the point where separating the mind from the body, as western culture tends to do, becomes almost untenable. Even though the euphemism “unresponsive wakefulness syndrome” has gained some traction in recent years, that’s because the dysphemism (had to look that word up) “persistent vegetative state” names something we fear; that loss of self, or of humanity, offends us. It’s preferable for most of us, as fiction frequently demonstrates, to believe that even in that state dreams of some kind continue in the mind, because as human beings we are fully our mind in a way that we are only occasionally fully our body. One of the fears connected to the loss of self is that we lose the ability to choose what to think about, which is intriguing, because that takes us again into the (might as well use the pompous word) realm of dreams.

The Danish actress Asta Nielsen as Hamlet in 1921

My favourite Shakespeare quote is the last line from this scene in Hamlet (Act 2, Scene 2):

Hamlet: Denmark’s a prison.
Rosencrantz: Then is the world one.
Hamlet: A goodly one, in which there are many confines, wards, and
dungeons, Denmark being one o’ th’ worst.
Rosencrantz: We think not so, my lord.
Hamlet: Why then ’tis none to you; for there is nothing either good or
bad, but thinking makes it so. To me it is a prison.

“There is nothing either good or bad, but thinking makes it so” seems to deny any possibility of objective morality, but its logic is undeniable. After all, you or I may think that [insert one of thousands of examples from current politics and world events] is ‘wrong’, but if [individual in position of power] doesn’t think so, and does the wrong thing, even if all of the worst possible outcomes stem from it, the most you can say is that you, and people who agree with you, think it was wrong. Hitler almost certainly believed, as he went to the grave, that he was a martyr who had failed in his grand plan only because of the betrayal and duplicity of others. I think that’s wrong, you hopefully think that’s wrong, even “history” thinks that’s wrong, but none of that matters to Hitler in his bunker in 1945, any more than Rosencrantz & Guildenstern finding Denmark to be a nice place if only their old friend Hamlet could regain his usual good humour makes any difference to Hamlet.

Anyway, daydreams or reveries (a nice word that feels a bit pretentious to say). The dictionary definitions of ‘reverie’ are mostly very positive: “a series of pleasant thoughts about something you would prefer to be doing or something you would like to achieve in the future”; “a state of abstracted musing”; “a loose or irregular train of thought occurring in musing or meditation; deep musing.” And there’s a school of thought that has been around for a long time but seems even more prevalent today, which values daydreams not just as idle thoughts but as affirmations. Anyone who has tried to change their life through hypnosis or various kinds of therapy will find that daydreaming and visualising are supposed to be important aspects of your journey to a better you. In a way all of these self-help gurus, lifestyle coaches and therapists are saying the same thing; as Oscar Hammerstein put it, “You got to have a dream, / If you don’t have a dream, / How you gonna have a dream come true?” But is making your dreams, even your daydreams, come true necessarily a good thing?

patriotic vapour trails spotted this winter

I seem to remember once reading that if you can focus all of your attention on something for 15 seconds you’ll remember it forever (not sure about the duration; if you Google stuff like this you find there are millions of people offering strategies to improve your memory, which isn’t quite what I was looking for). Whether or not that’s true, every time I see a vapour trail in an otherwise blue sky, I have the same thought/image – actually two thoughts, but “first you look so strong/then you fade away” came later and failed to replace the earlier thought, which must date from the age of 9 or 10 or so. I realise that people telling you their dreams is boring (or so people say, I never find it to be so), but you don’t have to keep reading. I can see the fluffy, white trail against the hot pale blue sky (it’s summer, the sun is incandescent and there are no clouds) and as my eye follows it from its fraying, fading tail to its source, I can see the nose cone of the plane glinting in the sun, black or red and metallic. It looks slow, leisurely even, but the object is travelling at hundreds of miles an hour. I know there’s no pilot inside that warm, clean shell (I can imagine feeling its heat, like putting your hand on the bonnet or roof of a car parked in direct sunshine; only there are rivets studding the surface of this machine). I’m shading my eyes with my hand, watching its somehow benign-looking progress, but I know that it’s on its way to the nearby air force base and that others are simultaneously flying towards other bases and major cities, and soon everything I can see and feel will be vapourised and cease to exist. I had this daydream many times as a child; I have no idea how long it lasted, but I can remember the clarity and metallic taste of it incredibly clearly. Did I want it to happen? Definitely not. Was I scared? No, although I remember an almost physical sense of shaking it off afterwards. Did I think it would happen?
It’s hard to remember, maybe – but I wouldn’t have been alone in that if so. But anyway, the interesting point to me is that this wasn’t a dream that required sleep or the surrender of the conscious mind to the unconscious – I was presumably doing it “on purpose”, although what that purpose was I have no idea; nothing very nice anyway.

Childhood hero: Charles M Schulz’s Charlie Brown

Probably most of us carry a few daydreams around with us, most I’m sure far more pleasant than that one. I can remember a few from my adolescence that were almost tangible then and still feel that way now (I would swear that I can remember what a particular person’s cheek felt like against my fingertips, though I definitely didn’t ever touch it). As my childhood role model Charlie Brown would say, “Augh!” Charles M. Schulz clearly knew about these things and still felt them vividly as an adult (as, more problematically, did Egon Schiele, subject of my previous article; but let’s not go into that). Most of the daydreams we keep with us into adulthood (or create in adulthood) are probably nicer baggage to carry around than the vapour trail one – unless you’re one of those people who fantasises about smashing people’s heads in with an iron bar (who has such a thing as an iron bar? Why iron? Wouldn’t brass do the job just as well, and lead even better?) beyond the teenage years, when violent daydreams are almost inevitable, but hopefully fleeting.

But thinking about your daydreams is odd; they are, like your thoughts and dreams, yours and nobody else’s, but where they come from in their detail seems almost as obscure as dream-dreams. Perhaps Freud would know. I have a couple of daydreams that have been lurking around for decades, but while I don’t believe in telekinesis or even the current obsession with affirmations and ‘manifesting,’ apparently I must be a bit superstitious; because if I wrote them down they might not come true, innit?

 

*Old English dwellan “to lead into error, deceive, mislead,” related to dwelian “to be led into error, go wrong in belief or judgment,” from Proto-Germanic *dwaljana “to delay, hesitate,” *dwelana “go astray” (source also of Old Norse dvelja “to retard, delay,” Danish dvæle “to linger, dwell,” Swedish dväljas “to dwell, reside;” Middle Dutch dwellen “to stun, perplex;” Old High German twellen “to hinder, delay”) from PIE *dhwel-, extended form of root *dheu- (1) “dust, cloud, vapor, smoke” (also forming words with the related notions of “defective perception or wits”).

The apparent sense evolution in Middle English was through “to procrastinate, delay, be tardy in coming” (late 12c.), to “linger, remain, stay, sojourn,” to “make a home, abide as a permanent resident” (mid-14c.). From late 14c. as “remain (in a certain condition or status),” as in phrase dwell upon “keep the attention fixed on.” Related: Dwelled; dwelt (for which see went); dwells.

It had a noun form in Old English, gedweola “error, heresy, madness.” Also compare Middle English dwale “deception, trickery,” from Old English dwala or from a Scandinavian cognate (such as Danish dvale “trance, stupor, stupefaction”); dwale survived into late Middle English as “a sleeping potion, narcotic drink, deadly nightshade.”

what do you look like?

A few years ago a friend sent me a photograph of the ten-year-old us in our Primary School football team. I was able, without too much thought, to put names to all eleven of the boys, but the biggest surprise was that my initial reaction, for maybe a second but more like two, was not to recognise myself. In my defence, I don’t have any other pictures of me at that age and, even more unusually, in that picture I’m genuinely smiling. Usually I froze when a camera was pointed at me (and still do, if it takes too long), but I must have felt safer than usual in a group shot, because it is a real smile and not the standard grimace that normally happened when I was asked to smile for photographs. I could possibly also be forgiven for my confusion because, in contrast with my present self, ten-year-old me had no eyebrows, a hot-pink-to-puce complexion and unmanageably thick, wavy, fair hair; but even so, that was the face I looked at in the mirror every day for years and, more to the point, that gangly child with comically giant hands actually is me; but what would I know?

My favourite of David Hockney’s self portraits – Self Portrait with Blue Guitar (1977)

In a recent documentary, the artist David Hockney made a remark (paraphrased because I don’t have it to refer to) that resonated with me: your face isn’t for you, it’s for other people. And, as you’d expect of someone who has spent a significant part of his long career scrutinizing people and painting portraits of them, he has a point. Everyone around you has a more accurate idea of what you look like than you do. Even when you see someone ‘in real life’ who you are used to seeing in photographs or films, there’s a moment of mental recalibration; even if they look like their image, the human being before you in three dimensions is on a whole different scale from the thing you are used to seeing. I remember reading in some kids’ novel that the young footballer me liked (I’m guessing Willard Price but can’t swear to it) that when shown photographs of themselves, the indigenous people of (I think) New Guinea not only weren’t impressed, but didn’t recognise them as anything in particular. Like Hockney, they had a point; if the Victorian people who invented photography hadn’t grown up with a tradition of ‘realistic,’ representational art, would they have seen any relationship between themselves as living, breathing, colourful, space-filling three-dimensional organisms and the monochromatic marks on little flat pieces of paper? The response of the fictional New Guinea tribespeople is actually more logical than the response (surprise, wonder, awe) that’s expected of them in the novel.

Hockney went on to say that portrait painting (if the sitter is present with the artist) gives a better idea of a person than photography does. At first this is a harder argument to buy into, but it has its own logic too. A photograph, as he pointed out, is a two-dimensional record of one second in time, whereas the portrait painter creates their also-two-dimensional image by spending time in the company of the sitter and focusing on them – a different, deeper kind of focus than the technical one that happens with a lens, light and film or digital imaging software, since it engages the brain as well as the senses. A camera doesn’t care what you are like, it just sees how you look, from that angle, for that second. Maybe my big 10-year-old smile really is representative of how I was, but from memory it doesn’t represent that period for me at all.

Egon Schiele in his studio c.1915 (left) vs his 1913 self-portrait (right)

But I might never have written this had I not been reading Frank Whitford’s excellent monograph on the Austrian expressionist painter Egon Schiele (Thames & Hudson, 1981). Schiele is famous for (among other things) his twisted, emaciated and fanatically awkward self-portraits. The man he depicts is scrawny, elongated, intense, sometimes almost feline and utterly modern. Schiele in photographs, on the other hand, is quite a different presence. He sometimes has the expected haunted look and the familiar shock of hair, and he poses almost as awkwardly, but otherwise he looks surprisingly dapper, civilised, diminutive, square-faced and elfin. But if we think – and it seems logical that we do – that the photographs show us the ‘real’ Schiele, then the descriptions of those who knew him suggest otherwise: “a slim young man of more than average height… Pale but not sickly, thin in the face, large dark eyes and full longish dark brown hair which stood out in all directions. His manner was a little shy, a little timid and a little self-confident. He did not say much, but when spoken to his face always lit up with the glimmer of a quiet smile.” (Heinrich Benesch, quoted in Whitford, p.66) This description doesn’t exactly clash with the Schiele of the photographs (though he never appears especially tall), but it’s somehow far easier to identify with the dark-eyed, paradoxically shy and confident Schiele of the self portraits. In his own writings, Schiele seems as tortured and intense as in his paintings, but in photographs he appears confident, knowing and slightly arch. His face, as Hockney says, may not have been for him, but he seems to have captured it in his art in ways that his friends and acquaintances recognised, and which the camera apparently didn’t.

Schiele in 1914 by Josef Anton Trčka (left) vs his 1911 self portrait (right)

Which, what, proves Hockney both right (portraiture is superior to photography) and wrong (Schiele knew his own face)? And anyway, what does that have to do with the 10-year-old me? Nothing really, except that the camera, objective and disinterested, captured an aspect of me in that second which may or may not have been “true.” Objectivity and disinterestedness are positive qualities for evaluating facts, but when it comes to human beings, facts and truth have a complicated relationship. Photography, through its “realness,” has trouble capturing these complexities, unless the photographer is aware of them and – Diane Arbus and Nan Goldin spring to mind – has the ability to imbue their work with more than the obvious surface information that is the camera’s speciality. But manually-created art, with its human heart and brain directing, naturally takes the relationship between truth and facts in its stride.

One final example that proves nothing really, except to my satisfaction. Around the year 1635, the Spanish painter Diego Velázquez was tasked with painting portraits of the assorted fools, jesters, dwarfs and buffoons whose lives were spent entertaining the Spanish court. Most of these people suffered from mental or physical disabilities (or both) and were prized (I think a more accurate word than ‘valued’ in this context) for their difference from ‘normal’ people; in the same way as carnival “freaks” into the early 20th century, in fact. Although these people were privileged compared to what their lives would have been like had they not been adopted by the Royal court, their position in the household was more akin to that of pets than friends or even servants. Juan de Calabazas (“John of Gourds”; a gourd was a traditional jester’s attribute) suffered from unknown mental illnesses and physical tics. In a time and place where formality and manners were rigidly maintained, especially around the monarch – where a misstep in etiquette could have serious or even fatal consequences – buffoons like Juan entertained the court with unfettered, sometimes nonsensical or outrageous speech, impulsive laughter and strange, free behaviour. Whereas in normal society these people would be lucky even to survive, in the Court their behaviour was celebrated and encouraged. Velázquez is rightly famous for the empathy and humanity with which he painted portraits of these marginalised figures, but although, as Wikipedia (why not?) puts it, “Velázquez painted [Juan] in a relatively calm state, further showing Velazquez’s equal show of dignity to all, whether king or jester”, that seems an unusual response to the portrait below. It’s not untrue, but for me at least, Velázquez’s process of humanisation is painful too. The knowledge that this man lived his life as a plaything of the rich and powerful, alive only because they found him funny, is troubling enough.
But that pathos seems to be embodied in the picture, and you know, or it feels like you know, that Velázquez didn’t find him funny, or at least not only funny. It’s something like watching David Lynch’s The Elephant Man compared to looking at the Victorian photographs of the real Joseph Merrick. Seeing the photographs is troubling; seeing Lynch’s cinematic portrait is troubling too, but it is also deeply moving.

Juan de Calabazas (c.1635-9) by Diego Velázquez

All of which may just be a way of saying that a camera is a machine and does what it does – recording the exterior of what it’s pointed at – perfectly, while a human being does, and feels, many things simultaneously, probably not perfectly. Well, I’m sure we all knew that anyway. I eventually got eyebrows, by the way.

 

henrik palm and the releases of the year 2024 (with typically lengthy disclaimer!)

Way back in April this year Henrik Palm released an album called Nerd Icon (via Svart Records). It’s very good – 80s-inflected melodic hard rock is as good a description as any, I guess, but it has a very individual personality and none of the pomposity or poser quality of that kind of music (no offence to actual 80s rock, which I love). In fact it’s one of my albums of the year (see short list below somewhere). But thinking about ‘albums of the year’ (yes, I probably whinge about this annually), especially in the context of Henrik Palm’s work, makes me realise what a meaningless accolade it is. Not because there isn’t lots of good music produced every year, but because people who love music don’t generally accumulate favourite albums in a real-time, chronological way. The point of recorded music is that it has been recorded and can therefore be enjoyed outside of the time and place that it was made.

To labour the point, if music only moved forwards, with this year’s top 50 (or whatever) albums superseding last year’s and so on, ‘classic albums’ wouldn’t exist, once-obscure artists would remain obscure and people like Nick Drake (obvious example, I know) would only be loved by the shockingly tiny handful of living people who bought his work at the time. But even before the internet that wasn’t the case and it still isn’t, so end of year lists end up being as peculiar a time capsule as the top 40 from years ago is. Yes, they are ranked by quality rather than popularity, but as looking back at these things demonstrates, they are no more reliable for that.

Not an album of this year, but an unexpected favourite

But the reason Henrik Palm illustrates this point for me is that in 2020 he released Poverty Metal; I heard it at the time and quite liked it, but I don’t think I wrote about it anywhere, though it got a surprised mention in 2022 – and to my continued surprise I still play it fairly often. It’s an album as unassuming and quirky (I mean, that is the right word, but bleh) as its title – melodic, sometimes kind of 70s-ish, sometimes not, rarely very metal, often quite delicate and always thoughtful. It’s peculiar, but part of what makes it peculiar is how conventional it is – and at the same time, how unusual it is by the standards of those conventions. I guess it has become one of my favourite albums, which I don’t think anything on my actual 2020 ‘albums of the year’ lists did. And after the dust has settled on 2024, it may be that if any album from this year enters my personal pantheon, it could be one that hasn’t really registered with me yet or that I haven’t even heard.

Now that I’ve undermined it in advance, here’s my ‘albums of the year’ feature.

My favourite albums of this year are two which I (obviously) think are great, but for varying reasons I don’t know if they will stick around in my personal playlist the way Poverty Metal has – but they may.

The first is In Concert by Diamanda Galás (Intravenal Sound Operations). Live albums are interesting in that many people (including myself) can be slightly dismissive of them (“_____ has a new album coming out! Oh, it’s just a live album”), a strange reaction, because if you’re lucky enough to see your favourite artists live you never think “oh, it was just a live performance.” In the context of home listening, none of the ephemeral magic of a live show – the stuff that’s really about you – is present, but theoretically the most important part is. In comparison with Galás’ recent, brilliantly gruelling work (Broken Gargoyles was my album of the year in 2022) the album is simple, or at least unadorned; just her extraordinary voice and uniquely expressive piano. But that’s quite a ‘just’ – and she plays a set of songs that are urgent, deeply moving, haunting, wise, shockingly relevant and occasionally wickedly funny. What more do you want? It’s about as far removed from a stadium band delivering polished versions of their greatest hits as you can get, and though it would no doubt be a fantastic souvenir and reminder if you were lucky enough to see the performance, it’s entirely transporting just as a record. Will it join the Masque of the Red Death trilogy, The Litanies of Satan, The Sporting Life and Broken Gargoyles as one of my favourite Diamanda Galás albums? Who knows? Some of her work takes time to really get to know in a way that In Concert doesn’t, and I feel like I’m still ‘working on’ (not the right phrase) some of her older work – what that means for this album I don’t know, but I do know that nothing this year has cut deeper.

Joint album of the year – The Cure’s Songs of a Lost World

My second album of the year is Songs of a Lost World by The Cure (Fiction), which I reviewed here, and which is just as visceral for me, but for completely different reasons. It is, as an amazing number of people seem to agree, a superb album, moving and memorable and all of that; but I have been a fan of The Cure since I was seventeen and there hasn’t been any point where I stopped listening to them completely. That doesn’t necessarily mean I was predisposed to like it – their last couple of records didn’t do much for me, though they have their moments – but it is relevant to my personal response to it. Even though by any objective methods of analysis (there aren’t any) Songs of a Lost World is probably as good as anything the band has done, will it join Seventeen Seconds, Disintegration, Japanese Whispers and Pornography as one of my all-time favourite Cure albums? Or even Faith, The Top, Kiss Me, Kiss Me, Kiss Me, The Head on the Door and Boys Don’t Cry as my second-tier, almost-favourite Cure albums? Only time will tell, but in that time I will no longer be the me that was most receptive to their music and the band will have to compete with far more music (old, new, whatever) than they ever did when I obsessively listened to them. Then I had no way of getting their work except by buying it or making tapes from friends who owned it. I definitely think I love music just as much as I ever did, but I don’t obsessively listen to anything the way I did in my teens and early 20s. The older Cure records, even the ones I liked relatively less, like Wish and Kiss Me… are imprinted on my brain in a way that just doesn’t get a chance to happen now. But in a way I feel like Songs of a Lost World addresses and encapsulates all of those feelings, which is one of the reasons it’s so good.

Not sure if it’s coincidental or significant that both of my favourite albums of the year are by artists I’ve been listening to for decades, but it’s interesting either way. So anyway, a wee list of honourable mentions and we’re done with this for another year.

Henrik Palm – Nerd Icon (Svart Records) – sort of 80s-ish, sort of metal-ish, 100% individual

 

Myriam Gendron – Mayday (Feeding Tube) – I loved Not So Deep as a Well ten years ago (mentioned in passing here) and love this even more

 

Ihsahn – Ihsahn (Candlelight) – wrote about it here – for me it doesn’t top my favourite Das Seelenbrechen, but it’s as good as any of his others

 

One of my top 3 or 4 albums of all time, John Cale’s Paris 1919, was reissued this year; his latest, POPtical Illusion, was good too

 

Mick Harvey – Five Ways to Say Goodbye (Mute) – lovely autumnal album by ex-Bad Seed and musical genius, more here

 


Nick Cave & the Bad Seeds – Wild God (PIAS Recordings) – for me a good rather than amazing Nick Cave album, but he’s better than most people so still easily made the list, though I’m not sure I like it more than his old colleague’s work

Aara – Eiger (Debemur Morti Productions) – Superior Swiss black metal, conceptual without being pompous and full of great tunes and atmosphere

 

Claire Rousay – Sentiment (Thrill Jockey) – bracingly sparse and desolate but lovely too

 

Alcest – Chants de L’Aurore (Nuclear Blast) – it seems so long ago that I almost forgot about it, but this was (I thought) the best Alcest album for years: beautiful, wistful and generally lovely. I talked to Neige about it at the time; I should post that interview here at some point!

Onwards!

most things don’t exist

 

eh, Mel Gibson: but he played a good Hamlet (dir. Franco Zeffirelli, 1990)

With apologies to Marcel Proust – but not very vehement apologies, because it’s true – the taste of honey on toast is as powerfully evocative and intensely transporting to me as anything that I can think of. The lips and tongue that made that association happen don’t exist anymore and neither does the face, neither do the eyes, and neither does one of the two brains and/or hearts* that I suppose really made it happen (mine are still there, though). In 21st century Britain, it’s more likely than not that even her bones don’t exist anymore, which makes the traditional preoccupation with returning to dust feel apt and more immediate and (thankfully?) reduces the kind of corpse-fetishising morbidity that seems to have appealed so much to playgoers in the Elizabethan/Jacobean era.

Death & Youth (c.1480-90) by the unknown German artist known as The Master of the Housebook

Thou shell of death,
Once the bright face of my betrothed lady,
When life and beauty naturally fill’d out
These ragged imperfections,
When two heaven-pointed diamonds were set
In those unsightly rings: then ’twas a face
So far beyond the artificial shine
Of any woman’s bought complexion

(The Revenger’s Tragedy (1606/7) by Thomas Middleton and/or Cyril Tourneur, Act 1, Scene 1)

 

*Is the heart in the brain? In one sense obviously not, in another maybe, but the sensations associated with the heart seem often to happen somewhere around the stomach; or is that just me?

More to the point, “here hung those lips that I have kissed I know not how oft”, etc. All of which is beautiful; but for better or worse, a pile of ash isn’t likely to engender the same kind of thoughts or words as Yorick’s – or anybody’s – skull. But anyway, the non-existence of a person – or, even more abstractly, the non-existence of skin that has touched your skin (though technically of course all of the skin involved in those kisses has long since disappeared into dust and been replaced anyway) – is an absence that’s strange and dismal to think about. But then most things don’t exist.

Vanitas: Still Life with Skull (c.1671) by an unknown English painter

But honey does exist, of course; and the association between human beings and sugary bee vomit goes back probably as far as human beings themselves. There are Mesolithic cave paintings, 8000 years old or more, made by people who don’t exist, depicting people who may never have existed except as drawings, or may have once existed but don’t anymore, plundering beehives for honey. Honey was used by the ancient Egyptians, who no longer exist, in some of their most solemn rites; it had sacred significance for the ancient Greeks, who no longer exist; it was used in medicine in India and China, which do exist now but technically didn’t then, by people who don’t, now. Mohammed recommended it for its healing properties; it’s a symbol of abundance in the Bible and it’s special enough to be kosher despite being the product of unclean insects. It’s one of the five elixirs of Hinduism, and Buddha was brought honey by a monkey that no longer exists. The Vikings ate it and used it for medicine too. Honey was the basis of mead, the drink of the Celts, who sometimes referred to the island of Britain as the Isle of Honey.

probably my favourite Jesus & Mary Chain song: Just Like Honey (1985)

And so on and on, into modern times. But also (those Elizabethan-Jacobeans again): “The sweetest honey is loathsome in his own deliciousness, And in the taste confounds the appetite.” (William Shakespeare, Romeo and Juliet (c.1595), Act 2, Scene 6). “Your comfortable words are like honey. They relish well in your mouth that’s whole; but in mine that’s wounded they go down as if the sting of the bee were in them.” (John Webster, The White Devil (1612), Act 3, Scene 3). See also “honey trap”; “Man produces evil as a bee produces honey” (William Golding); “You catch more flies with honey.”

But on the whole, the sweetness of honey is not and has never been sinister. A Taste of Honey, Tupelo Honey, “Wild Honey,” “Honey Pie”, “Just like Honey,” “Me in Honey,” “Put some sugar on it honey,” Pablo Honey, “Honey I Sure Miss You.” Honey to the B. “Honey” is one of the sweetest (yep) of endearments that people use with each other. Winnie-the-Pooh and Bamse covet it. Honey and toast tasted in a kiss at the age of 14 is, in the history of the world, a tiny and trivial thing, but it’s enough to resonate throughout a life, just as honey has resonated through the world’s human cultures. Honey’s Dead. But the mouth that tasted so sweetly of honey doesn’t exist anymore. Which is sad, because loss is sad. But how sad? Most things never exist and even most things that have existed don’t exist now, so maybe the fact that it has existed is enough.

“Most things don’t exist” seems patently untrue: for a thing to be ‘a thing’ it must have some kind of existence, surely? And yet, even leaving aside things and people that no longer exist, we are vastly outnumbered by the things that have never existed, from the profound to the trivial. On the profound side – and without wishing to offend people and their beliefs – probably few people would now say that Zeus and his extended family are really living in a real Olympus. On the trivial side, 70-plus years on from the great age of the automobile, flying cars as imagined by generations of children, as depicted in books and films, are still stubbornly absent from the skies above our roads. The idea of them exists, but even if – headache-inducing notion – it exists as a specific idea (“the idea of a flying car”), rather than just within the general realm of “ideas,” an idea is an idea, a thing perhaps but not the thing that it is about. Is a specific person’s memory of another person a particular thing because it relates to a particular person, or does it exist only under the larger and more various banner of “memories”? Either way, it’s immaterial, because even though the human imagination is a thing that definitely exists, the idea of a flying car is no more a flying car than Leonardo da Vinci’s drawing of a flying machine was a flying machine, or than my memory of honey-and-toast kisses is a honey-and-toast kiss.

If you or I picture a human being with electric blue skin, we can imagine it and, if we have the talent, we can draw it; someone could depict it in a film, but it wouldn’t be the thing itself, because human beings with electric blue skin, like space dolphins, personal teleportation devices, seas of blood, winged horses, articulate sentient faeces and successful alchemical experiments, don’t exist. And depending on the range of your imagination (looking at that list, mine seems a bit limited), you could think of infinite numbers of things that don’t exist. There are also, presumably, untold numbers of things that do exist but that we personally don’t know about, or that we as a species don’t know about yet. But even if it was possible to make a complete list of all of the things in existence (or things in existence to date; new things are invented or develop or evolve all the time), it would always be possible to think of even more things that don’t exist – simply, in the least imaginative way, by naming variations on, or parodies of, everything that does exist. So supermassive black holes exist? Okay, but what about supertiny pink holes? What about supermedium beige holes? This June, a new snake (disappointingly named Ovophis jenkinsi) was discovered. But what about a version of Ovophis jenkinsi that sings in Spanish or has paper bones or smells like Madonna? They don’t exist.

JAMC Honey’s Dead, 1992

Kind of a creepy segue if you think about it (so please don’t), but like those beautifully-shaped lips that tasted of honey, my mother no longer exists, except as a memory, or lots of different memories, belonging to lots of different people. Presumably she exists in lots of memories as lots of different people who happen to have the same name. But unlike supermedium beige holes, the non-existence of previously-existing things and people is complex, because of the different perspectives they are remembered from. But regardless, they are still fundamentally not things anymore. But even with the ever-growing, almost-infinite number of things, there are, demonstrably, more things that don’t exist. And, without wishing to be horribly negative or repeat things I’ve written before, one of the surprises with the death of a close relative was to find that death does exist. Well, obviously, everyone knows that – but not just as an ending or as the absence of life, as was always known, but as an active, grim-reaper-like force of its own. For me, the evidence for that – which I’m sure could be explained scientifically by a medical professional – is the cold that I mentioned in the previous article. Holding a hand that gets cold seems pretty normal; warmth ebbing away as life ebbs away; that’s logical and natural. But this wasn’t the expected (to me) cooling down of a warm thing to room temperature, like the un-drunk cups of tea which day after day were brought and cooled down because the person they were brought for didn’t really want them anymore, just the idea of them. That cooling felt natural, as did the warming of the glass of water that sat un-drunk at the bedside because the person it was for could no longer hold or see it. That water had been cold but had warmed up to room temperature; the cold in the hand wasn’t just a settling in line with ambient conditions.
It was active cold; hands chilling and then radiating cold in quite an intense way, a coldness that dropped far below room temperature. I mentioned it to a doctor during a brief, unbelievably welcome break to get some air, and she said “Yes, she doesn’t have long left.” Within a few days I wished I’d asked for an explanation of where that cold was coming from; where is it generated? Which organ in the human body can generate cold so quickly and intensely? Does it do it in any other situations? And if not, why not? So, although death can seem abstract, in the same sense that ‘life’ seems abstract, being big and pervasive, death definitely exists. But as what? Don’t know; not a single entity, since it’s incipient in everyone, coded into our DNA: but that coding has nothing to do with getting hit by cars or drowning or being shot, does it? So, a big question mark to that. Keats would say not to question it, just to enjoy the mystery. Well, alright then.

Klaus Nomi as “the Cold Genius” from his 1981 version of Purcell’s “The Cold Song”

But since most things *don’t* exist, and death definitely does exist, existence is, in universal terms, rare enough to be something like winning the lottery. But like winning the lottery, existence in itself is not any kind of guarantee of happiness or satisfaction or even honey-and-toast kisses; but it at least offers the possibility of those things, whereas non-existence doesn’t offer anything, not even peace, which has to be experienced to exist. We have all not existed before and we will all not exist again; but honey will still be here, for as long as bees are at least. I don’t know if that’s comforting or not. But if you’re reading this – and I’m definitely writing it – we do currently exist, so try to enjoy your lottery win, innit.

Something silly about music next time I think.

Ancient Roman vanitas mosaic showing a skull and the wheel of fortune

who’d have them?

My mother died just about a month ago, and I think she/her death is taking up too much space in my conscious mind to trouble my subconscious or unconscious self too much. It’s interesting to note that even though death is one of the central themes of much of the most important art ever created, and although I am someone with an interest in Art in the capital-A, “high culture” sense, what came into my mind in that room, while holding her hand, was actually a line from a song, which turned out to have an accuracy I didn’t realise until then: “it’s so cold, it’s like the cold if you were dead.”* Mum wouldn’t have liked that. And if she wasn’t dead I probably wouldn’t be posting what follows online, even though there’s nothing in it she would object to and even though, as far as I’m aware, she never read a word I wrote: which sounds petulant but it’s not a complaint. Our parents know us too well in one way for us to want them to know us in other ways, or at least that’s how I think I feel about it.

*Plainsong by The Cure, which luckily I’ve barely been able to stand for many years although I really do love it.

Max Ernst – Max Ernst Showing a Young Girl the Head of his Father (1926/7)

Anyway, last night, for the first time in what feels like decades, I dreamed about my dad. The dream was full of vivid, long-forgotten details, most of which almost immediately receded back into the murk of subconscious memory on waking. Not all of them though; how could I have forgotten his strangely hissing laugh (less sinister than it sounds)? But waking up, what was lurking in my mind as the dream faded was, of all things – pop culture strikes again – lines from Stephen King’s IT (which mum read, but dad didn’t; he was squeamish about horror) and a feeling of dread that wasn’t terrifying or even upsetting, just somehow inevitable and in some way kind of comfortable.

Those lines come from a scene in the book when the young protagonists come across the monster, Pennywise, in an old newspaper clipping from 1945. I had no idea that I had absorbed this paragraph, or at least its final lines, first read when I was 14, completely enough to know it almost word for word, but there it was (I’ve included the whole paragraph for sense):

The headline: JAPAN SURRENDERS – IT’S OVER! THANK GOD IT’S OVER! A parade was snake-dancing its way along Main Street toward Up-Mile Hill. And there was the clown in the background, wearing his silver suit with the orange buttons, frozen in the matrix of dots that made up the grainy newsprint photo, seeming to suggest (at least to Bill) that nothing was over, no one had surrendered, nothing was won, nil was still the rule, zilch still the custom; seeming to suggest above all that all was still lost.  

Stephen King, IT, 1986, p.584 (in the edition I have)

Max Ernst – Pietà or Revolution by Night (1923)

Which is not really fair; dad had his faults but he was not a shape-shifting alien clown that ate kids. And anyway, it wasn’t even a nightmare as such. Details are receding – and have almost vanished even since I made the original note this morning – but essentially, nothing bad happened, we were in a house, dad was there, my siblings were there, offering eye-rolling ‘he’s annoying but what can you do?’ support, but what lingers is the last phase before waking – an interminably long, drawn out scene where I was attempting, unsuccessfully, to make coffee for everyone in an unfamiliar kitchen, but couldn’t find the right spoon, with dad behind me watching with condescending amusement and laughing that hissing laugh. And then I woke up to a Stephen King quote. So thanks for that, brain. One of the hardest lessons to learn and re-learn is that other people are none of your business, or to put it less negatively, that you have no claim on any other human being and they have no claim on you. Except for your parents of course; but that’s that dealt with anyway.

nostalgia isn’t going to be what it was, or something like that

When I was a child there was music which was, whether you liked it or not, inescapable. I have never – and this is not a boast – deliberately or actively listened to a song by Michael Jackson, Madonna, Phil Collins, Duran Duran, Roxette, Take That, Bon Jovi, the Spice Girls… the list isn’t endless, but it is quite long. And yet I know some, or a lot, of songs by all of those artists. And those are just some of the household names. Likewise I have never deliberately listened to “A Horse With No Name” by America, “One Night in Bangkok” by Murray Head or “Would I Lie to You” by Charles & Eddie; and yet, there they are, readily accessible should I wish (I shouldn’t) to hum, whistle or sing them, or just have them play in my head, which I seemingly have little control over.

Black Lace: the unacceptable face(s) of 80s pop

And yet, since the dawn of the 21st century, major stars come and go, like Justin Bieber, or just stay, like Ed Sheeran, Lana Del Rey or Taylor Swift, without ever really entering my consciousness or troubling my ears. I have consulted with samples of “the youth” to see if it’s just me, but no: like me, there are major stars that they have mental images of, but unless they have actively been fans, they couldn’t necessarily tell you the titles of any of their songs and have little to no idea of what they actually sound like. Logical, because they were no more interested in them than I was in Dire Straits or Black Lace; but alas, I know the hits of Dire Straits and Black Lace. And the idea of ‘the Top 40 singles chart’ really has little place in their idea of popular music. Again, ignorance is nothing to be proud of and I literally don’t know what I’m missing. At least my parents could dismiss Madonna or Boy George on the basis that they didn’t like their music. It’s an especially odd situation to find myself in as my main occupation is actually writing about music; but of course, nothing except my own attitude is stopping me from finding out about these artists.

The fact is that no musician is inescapable now. Music is everywhere, and far more accessibly so than it was in the 80s or 90s – and not just new music. If I want to hear Joy Division playing live when they were still called Warsaw or track down the records the Wu-Tang Clan sampled or hear the different version of the Smiths’ first album produced by Troy Tate, it takes about as long to find them as it does to type those words into your phone. Back then, if you had a Walkman you could play tapes, but you had to have the tape (or CD – I know CDs are having a minor renaissance, but is there any more misbegotten, less lamented creature than the CD Walkman?) Or you could – from the 1950s onwards – carry a radio with you and listen to whatever happened to be playing at the time. I imagine fewer people listen to the radio now than they did even 30 years ago, but paradoxically, though there are probably many more – and many more specialised – radio stations now than ever, their specialisation actually feeds the escapability of pop music. Because if I want to hear r’n’b or metal or rap or techno without hearing anything else, or to hear 60s or 70s or 80s or 90s pop without having to put up with their modern-day equivalents, then that’s what I and anyone else will do. I have never wanted to hear “Concrete and Clay” by Unit 4+2 or “Agadoo” or “Come On Eileen” or “Your Woman” by White Town or (god knows) “Crocodile Shoes” by Jimmy Nail; but there was a time when hearing things I wanted to hear but didn’t own meant running the risk of being subjected to these, and many other unwanted songs. As I write these words, “Owner of a Lonely Heart” by Yes, a song that until recently I didn’t know I knew, is playing in my head.

And so, the music library in my head is bigger and more diverse than I ever intended it to be. In a situation where there were only three or four TV channels and a handful of popular radio stations, music was a kind of lingua franca for people, especially for young people. Watching Top of the Pops on a Thursday evening, or later The Word on Friday was so standard among my age group that you could assume that most people you knew had seen what you saw; that’s a powerful, not necessarily bonding experience, but a bond of sorts, that I don’t see an equivalent for now, simply because even if everyone you know watches Netflix, there’s no reason for them to have watched the same thing at the same time as you did. It’s not worse, in some ways it’s obviously better; but it is different. Of course, personal taste back then was still personal taste, and anything not in the mainstream was obscure in a way that no music, however weird or niche, is now obscure, but that was another identity-building thing, whether one liked it or not.

Growing up in a time when this isn’t the case and the only music kids are subjected to is the taste of their parents (admittedly, a minefield) or fragments of songs on TV ads, if they watch normal TV, or on TikTok, if they happen to use TikTok, is a vastly different thing. Taylor Swift is as inescapable a presence now as Madonna was in the 80s, but her music is almost entirely avoidable and it seems probable that few teenagers who are entirely uninterested in her now will find her hits popping unbidden into their heads in middle age. But conversely, the kids of today are more likely to come across “Owner of a Lonely Heart” on YouTube than I would have been to hear one of the big pop hits of 1943 in the 80s.

Faye Dunaway as Bonnie Parker; a little bit 1930s, a lot 1960s

What this means for the future I don’t know; but surely its implications for pop-culture nostalgia – which has grown from its humble origins in the 60s to an all-encompassing industry – are huge. In the 60s, there was a brief fashion for all things 1920s and 30s which prefigured the waves of nostalgia that have happened ever since. But for a variety of reasons, some technical, some generational and some commercial, pop culture nostalgia is far more elaborate than ever before. We live in a time when constructs like “The 80s” and “The 90s” are well-defined, marketable eras that mean something to people who weren’t born then, in quite a different way from the 1960s version of the 1920s. Even back then, the entertainment industry could conjure bygone times with an easy shorthand; the 1960s version of the 1920s and 30s meant flappers and cloche hats and Prohibition and the Charleston, and was evoked on records like The Beatles’ “Honey Pie” and seen onstage in The Boy Friend or in the cinema in Bonnie & Clyde. But the actual music of the 20s and 30s was mostly not relatable to youngsters in the way that the actual entertainment of the 80s and 90s still is. Even if a teenager in the 60s did want to watch actual silent movies or listen to actual 20s jazz or dance bands they would have to find some way of accessing them. In the pre-home video era that meant relying on silent movie revivals in cinemas, or finding old records and having the right equipment to play them on, since old music was then only slowly being reissued in modern formats. The modern teen who loves “the 80s” or “the 90s” is spoiled by comparison, not least because those decades’ major movie franchises like Star Wars, Indiana Jones, Ghostbusters and Jurassic Park are still around and their major musical stars still tour or at least have videos and back catalogues that can be accessed online, often for free.

Supergrass in 1996: a little bit 60s, a lot 70s, entirely 90s

Fashion has always been cyclical, but this feels quite new (which doesn’t mean it is, though). Currently, culture feels not like a wasteland but like Eliot’s actual Waste Land, a dissonant kind of poetic collage full of meaning and detritus and feeling and substance and ephemera but at first glance strangely shapeless. For example, in one of our current pop culture timestreams there seems to be a kind of 90s revival going on, with not only architects of Britpop like the Gallagher brothers and Blur still active, but even minor bands like Shed Seven not only touring the nostalgia circuit but actually getting in the charts. Britpop was notoriously derivative of the past, especially the 60s and 70s. And so, some teenagers and young adults (none of these things being as pervasive as they once were) are now growing up in a time when part of ‘the culture’ is a version of the culture of the 90s, which had reacted to the culture of the 80s by absorbing elements of the culture of the 60s and 70s. And while the artists of 20 or 30 years ago refuse to go away, even modern artists from alternative rock to mainstream pop stars make music infused with the sound of 80s synths and 90s rock and so on and on. Nothing wrong with that of course, but what do you call post-post-modernism? And what will the 2020s revival look like when it rears its head in the 2050s, assuming there is a 2050s? Something half interesting, half familiar no doubt.

an alan smithee war

an annoying but perhaps necessary note; “Alan (or Allan, or Allen) Smithee” is a pseudonym used by Hollywood film directors when they wish to disown a project

Watch out, this starts off being insultingly elementary, but then gets complicated and probably contradictory, quite quickly.

Countries, States and religions are not monoliths and nor are they sentient. They don’t have feelings, aims, motivations or opinions. So whatever is happening in the Middle East isn’t ‘Judaism versus Islam’ or even ‘Israel versus Palestine’, any more than “the Troubles”* were/are ‘Protestantism versus Catholicism’ or ‘Britain versus Ireland’.

* a euphemism, which, like most names for these things is partly a method of avoiding blame – as we’ll see

Places and atrocities aren’t monoliths either; Srebrenica didn’t massacre anybody**, the Falkland islands didn’t have a conflict, ‘the Gulf’ didn’t have any wars and neither did Vietnam or Korea. But somebody did. As with Kiefer Sutherland and Woman Wanted in 1999 or Michael Gottlieb and The Shrimp on the Barbie in 1990 and whoever it was that directed Gypsy Angels in 1980, nobody wants to claim these wars afterwards. But while these directors have the handy pseudonym Allan Smithee to use, there is no warmongering equivalent, and so what we get is geography, or flatly descriptive terms like ‘World War One’, which divert the focus from the aggressor(s) and only the occasional exception (The American War of Independence) that even references the real point of the war. But, whether interfered with by the studio or not, Kevin Yagher did direct Hellraiser: Bloodline, just as certain individuals really are responsible for actions which are killing human beings as you read this. Language and the academic study of history will probably help to keep their names quiet as events turn from current affairs and into history. Often this evasion happens for purely utilitarian reasons, perhaps even unknowingly, but sometimes it is more sinister.

** see?

As the 60s drew to its messy end, the great Terry “Geezer” Butler wrote lines which, despite the unfortunate repeat/rhyme in the first lines, have a Blakean power and truth:

Generals gathered in their masses
Just like witches at black masses

Black Sabbath, War Pigs, 1970

There is something sinister and even uncanny in the workings of power, in the distance between the avowed and the underlying motivations behind military action. Power politics feels like it is – possibly because intuitively it seems like it should be – cold and logical, rather than human and emotional. It doesn’t take much consideration though to realise that even beneath the chilly, calculated actions of power blocs there are weird and strangely random human desires and opinions, often tied in with personal prestige, which somehow seems to that person to be more important than not killing people or not having people killed.

Anyway, Geezer went on to say:

Politicians hide themselves away
They only started the war
Why should they go out to fight?
They leave that role to the poor

Still Black Sabbath, War Pigs (1970)

And that’s right too; but does that mean Butler’s ‘poor’ should take no responsibility at all for their actions? In the largest sense they are not to blame for war or at least for the outbreak of war; and conscripts and draftees are clearly a different class again from those who choose to “go out to fight.” But. As so often WW2 is perhaps the most extreme and therefore the easiest place to find examples; whatever his orders or reasons, the Nazi soldier (and there were lots of them) who shot a child and threw them in a pit, actually did shoot a child and throw them in a pit. His immediate superior may have done so too, but not to that particular child. And neither did Himmler or Adolf Hitler. Personal responsibility is an important thing, but responsibility, especially in war, isn’t just one act and one person. Between the originator and the architects of The Final Solution and the shooter of that one individual child there is a chain of people, any one of whom could have disrupted that chain and even if only to a tiny degree, affected the outcome. And that tiny degree may have meant that that child, that human being, lived or died. A small thing in a death toll of something over 6 million people; unless you happen to be that person, or related to that person.

As with the naming of wars and atrocities, terms like “genocide” and “the Holocaust” are useful, especially if we want – as we clearly do – to have some kind of coherent, understandable narrative that can be taught and remembered as history. But in their grim way, these are still euphemisms. The term ‘the Holocaust’ memorialises the countless – actually not countless, but still, nearly 80 years later, being counted – victims of the Nazis’ programme of extermination. But the term also makes the Holocaust sound like an event, rather than a process spread out over the best part of a decade, requiring the participation of probably thousands of people who exercised – not without some form of coercion perhaps, but still – their free will in that participation. The Jewish scholar Hillel the Elder’s famous saying, ‘whosoever saves a life, it is as though he had saved the entire world’, is hard to argue with, insofar as the world only exists for us within our perceptions. Even the knowledge that it is a spinning lump of inorganic and organic matter in space, and that other people populate it who might see it differently, only exists in our perceptions. Or at least try to prove otherwise. And so the converse of Hillel’s saying – which is actually included in it but far less often quoted – is ‘Whosoever destroys one soul, it is as though he had destroyed the entire world.’ Which sounds like an argument for pacifism, but while pacifism is entirely viable and valuable on an individual basis as an exercise of one’s free will* – and on occasion has a real positive effect – one-sided pacifism relies on its opponents not taking a cynically Darwinian approach, which is hopeful at best. Pacifism can only really work if everyone is a pacifist, and everyone isn’t a pacifist.

*the lone pacifist can at least say, ‘these terrible things happened, but I took no part in them’, which is something, especially if they used what peaceful means they could to prevent those terrible things and didn’t unwittingly contribute to the sum total of suffering; but those are murky waters to wade in.

But complicated though it all is, people are to blame for things that happen. Just who to blame is more complicated – more complicated at least than the workable study of history can afford to admit. While countries and religions are useful as misleading, straw man scapegoats, even the more manageable unit of a government is, on close examination, surprisingly hard to pin down. Whereas (the eternally handy example of) Hitler’s Nazi Party or Stalin’s Council of People’s Commissars routinely purged heretics, non-believers and dissidents, thus acting as a genuine, effective focus for their ideologies and therefore for blame and responsibility, most political parties allow for a certain amount of debate and flexibility and therefore blame-deniability. Regardless, when a party delivers a policy, every member of that party is responsible for it, or should publicly recuse themselves from it if they aren’t.

The great (indeed Sensational) Scottish singer Alex Harvey said a lot of perceptive things, not least: “[Something] I learned from studying history. Nobody ever won a war. A hundred thousand dead at Waterloo*. No glory in that. Nobody needs that.” Nobody ever won a war; but plenty of people, on both sides of every conflict, have lost one – and, as the simple existence of a second world war attests, many, many people have lost a peace too.

*Modern estimates put it at ‘only’ 11,000 plus another 40,000 or so casualties; but his point stands

But the “causes” of war are at once easily traced and extremely slippery. Actions like the 1939 invasion of Poland by the armies of Germany and the USSR were, as military actions still are, the will of certain individuals, agreed to by other individuals and then acted upon accordingly. You may or may not agree with the actions of your government or the leaders of your faith. You may even have had some say in them, but in most cases you probably haven’t. Some of those dead on the fields of Waterloo were no doubt enthusiastic about their cause, some probably less so. But very few would have had much say in the decisions which took them to Belgium in the first place.

The buck should stop with every person responsible for wars, crimes, atrocities; but just because that’s obviously impossible to record – and even if it wasn’t, too complex to write in a simple narrative – that doesn’t mean the buck should simply not stop anywhere. History being written by the winners often means that guilt is assigned to the losers, but even when that seems fair enough (there really wouldn’t have been a World War Two without Hitler) it’s a simplification (there wouldn’t have been an effective Hitler without the assistance of German industrialists) and a one-sided one (it was a World War because most of the leading participants had already had unprovoked wars of conquest). That was a long sentence. But does the disgusting history of Western colonialism, the arguably shameful treatment of Germany by the Allies after WW1 and the dubious nature of the Allies and some of their actions make Hitler himself any less personally responsible for the war? And does Hitler’s own guilt make the soldier who shoots a child or unarmed adult civilians, or the airman who drops bombs on them, any less responsible for their own actions?

Again; only human beings do these things, so the least we can do is not act like they are some kind of unfathomable act of nature when we discuss them or name them. Here’s Alex Harvey again; “Whether you like it or not, anybody who’s involved in rock and roll is involved in politics. Anything that involves a big crowd of people listening to what you say is politics.” If rock and roll is politics, then actual politics is politics squared; and for as long as we settle, however grudgingly or complacently, for pyramidal power structures for our societies then the person at the top of that pyramid, enjoying its vistas and rarefied air should be the one to bear its most sombre responsibilities. But all who enable the pyramid to remain standing should accept their share of it too.

So when you’re helplessly watching something that seems like an unbelievable waste of people’s lives and abilities, pay close attention to who’s doing and saying what, even if you don’t want to, because the credits at the end probably won’t tell you who’s really responsible.

 

 

 

passive-digressive

There are two kinds of people* – those who like forewords, introductions, prefaces, author’s notes, footnotes, appendices, bibliographies, notes on the text, maps etc, and those who don’t. But we’ll get back to that shortly.

* there are more than two kinds of people. Possibly infinite kinds of people. Or maybe there’s only one kind; I’m never sure

A few times recently, I’ve come across the idea (which I think is mainly an American academic one, but I might be completely mistaken about that) that parentheses should only be used when you really have to (but when do you really have to?) because anything that is surplus to the requirements of the main thrust of one’s text is surplus to requirements full stop, and should be left out. But that’s wrong. The criticism can be and is extended to anything that interrupts the flow* of the writing. But that is also wrong. Unless you happen to be writing a manual or a set of directions or instructions, writing isn’t (or needn’t be) a purely utilitarian pursuit and the joy of reading (or of writing) isn’t in how quickly or efficiently (whatever that means in this context) you can do it. Aside from technical writing, the obvious example where economy just may be valuable is poetry – which however is different and should probably have been included in a footnote, because footnotes are useful for interrupting text without separating the point you’re making (in a minute) from the point you’re commenting on or adding to (a few sentences ago), without other, different stuff getting in the way.

*like this¹

¹but bear in mind that people don’t write footnotes by accident – the interruption is deliberate²

²and sometimes funny

Poly-Olbion – that’s how you write a title page to pull in the readers

I would argue (though the evidence of a lot of poetry itself perhaps argues against me – especially the Spenser’s Faerie Queene, Michael Drayton’s Poly-Olbion kind of poetry that I’m quite fond of) that a poem should be** the most economical or at least the most effective way of saying what you have to say – but who’s to say that economical and effective are the same thing anyway?

** poets, ignore this; there is no should be

 

 

 

Clearly (yep), the above is a needlessly convoluted way of writing, and can be soul-achingly annoying to read; but – not that this is an effective defence – I do it on purpose. As anyone who’s read much here before will know, George Orwell is one of my all-time favourite writers, and people love to quote his six rules for writing, but while I would certainly follow them if writing a news story or article where brevity is crucial, otherwise I think it’s more sensible to pick and choose. So;

Never use a metaphor, simile, or other figure of speech which you are used to seeing in print. Absolutely; although sometimes you would use them because they are familiar, if making a specific point, or being amusing. Most people, myself included, just do it by accident; because where does the dividing line fall? In this paragraph I have used “by accident” and “dividing line” which seem close to being commonly used figures of speech (but then so does “figure of speech”). But would “accidentally” or something like “do it without thinking” be better than “by accident?” Maybe.

Never use a long word where a short one will do. The key point here is will do. In any instance where a writer uses (for example) the word “minuscule” then “small” or “tiny” would probably “do”. But depending on what it is they are writing about, minuscule or microscopic might “do” even better. Go with the best word, not necessarily the shortest.

If it is possible to cut a word out, always cut it out. Note that Orwell wrote ‘always’ here where he could just have said If it is possible to cut a word out, cut it out. Not everything is a haiku, George.

Never use the passive where you can use the active. Surely it depends what you’re writing? If you are trying, for instance, to pass the blame for an assault from a criminal on to their victim, you might want a headline that says “X stabbed after drug and alcohol binge” rather than “Celebrity kills X.” You kind of see Orwell’s point though.

Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent. Both agree and disagree; as a mostly monolingual person I agree, but some words and phrases (ironically, usually ones in French, a language I have never learned and feel uncomfortable trying to pronounce; raison d’être or enfant terrible for example) just say things more quickly and easily (I can be utilitarian too) than having to really consider and take the time to say what you mean. They are a shorthand that people in general understand. Plus, in the age of smartphones, it really doesn’t do native English speakers any harm to have to look up the meanings of foreign words occasionally (I do this a lot). The other side of the coin (a phrase I’m used to seeing in print) is that with foreign phrases it’s funny to say them in bad translations like “the Tour of France” (which I guess must be correct) or “piece of resistance” (which I am pretty sure isn’t), so as long as you are understood (assuming that you want to be understood) use them any way you like.

Break any of these rules sooner than say anything outright barbarous. It’s hard to guess what George Orwell would have considered outright barbarous (and anyway, couldn’t he have cut “outright”?) but anyone reading books from even 30, 50 or a hundred years ago quickly sees that language evolves along with culture, so that rules – even useful ones – rarely have the permanence of commandments.

So much for Orwell’s rules; I was more heartened to find that something I’ve instinctively done – or not done – is supported by Orwell elsewhere. That is, that I prefer, mostly in the name of cringe-avoidance, not to use slang that post-dates my own youth. Even terms that have become part of normal mainstream usage (the most recent one is probably “woke”) tend to appear with inverted commas if I feel like I must use them, because if it’s not something I would be happy to say out loud (I say “woke” with inverted commas too) then I’d prefer not to write it. There is no very logical reason for this and words that I do comfortably use are no less subject to the whims of fashion, but still; the language you use is part of who you are, and I think Orwell makes a very good case here (fuller version far below somewhere because, even though I have reservations about parts of it, it ends very well):

“Each generation imagines itself to be more intelligent than the one that went before it, and wiser than the one that comes after it. This is an illusion, and one should recognise it as such, but one ought also to stick to one’s world-view, even at the price of seeming old-fashioned: for that world-view springs out of experiences that the younger generation has not had, and to abandon it is to kill one’s intellectual roots.”

Review of A Coat of Many Colours: Occasional Essays by Herbert Read. (1945) The Collected Essays, Journalism and Letters of George Orwell Volume 4. Penguin 1968, p.72 

the fold-out map in The Silmarillion is a thing of beauty

Back to those two kinds* of people: I am the kind of person that likes and reads forewords, introductions, prefaces, author’s notes, footnotes, appendices, bibliographies, notes on the text, maps and all of those extras that make a book more interesting/informative/tedious.

 

*I know.

 

In one of my favourite films, Whit Stillman’s Metropolitan (1990), the protagonist Tom Townsend (Edward Clements), says “I don’t read novels. I prefer good literary criticism. That way you get both the novelists’ ideas as well as the critics’ thinking. With fiction I can never forget that none of it really happened, that it’s all just made up by the author.” Well, that is not me; but I do love a good bit of criticism and analysis as well as a good novel. One of my favourite ever pieces of writing of any kind, which I could, but choose not to, recite parts of by heart, is the late Anne Barton’s introduction to the 1980 New Penguin Shakespeare edition of Hamlet*. I love Hamlet, but I’ve read Barton’s introduction many more times than I’ve read the play itself, to the point where phrases and passages have become part of my mind’s furniture. It’s a fascinating piece of writing, because Professor Barton had a fascinating range and depth of knowledge, as well as a passion for her subject; but also and most importantly because she was an excellent writer. If someone is a good enough writer**, you don’t even have to be especially interested in the subject to enjoy what they write. Beyond the introduction/footnote but related in a way are the review and essay. Another of my favourite books – mentioned elsewhere I’m sure, as it’s one of the reasons that I have been working as a music writer for the past decade and a half – is Charles Shaar Murray’s Shots from the Hip, a collection of articles and reviews. The relevant point here is that more than half of its articles – including some of my favourites – are about musicians whose work I’m quite keen never to hear under any circumstances, if humanly possible. Similarly, though I find it harder to read Martin Amis’s novels than I used to (just changing taste, not because I think they are less good), I love the collections of his articles, especially The War Against Cliché and Visiting Mrs Nabokov. I already go on about Orwell too much, but as I must have said somewhere, though I am a fan of his novels, it’s the journalism and criticism that he probably thought of as ephemeral that appeals to me the most.

*All of the New Penguin Shakespeare introductions that I’ve read have been good, but that one is in a different league. John Dover Wilson’s What Happens in Hamlet (1935, though the edition I have mentions WW2 in the introduction, as I remember; I like the introduction) is sometimes easy to disagree with but it has a similar excitement-of-discovery tone to Anne Barton’s essay

** Good enough, schmood enough; what I really mean is if you like their writing enough. The world has always been full of good writers whose work leaves me cold

a scholarly approach to comics

All this may have started, as I now realise lots of things in my writing did, with Tolkien. From the first time I read his books myself, I loved that whatever part of Middle-Earth and its people you were interested in, there was always more to find out. Appendices, maps, whole books like The Silmarillion which extended the enjoyment and deepened the immersion in Tolkien’s imaginary world. And they were central to that world – for Tolkien, mapping Middle-Earth was less making stuff up than it was a detailed exploration of something he had already at least half imagined. Maybe because I always wanted to be a writer myself – and here I am, writing – whenever I’ve really connected with a book, I’ve always wanted to know more. I’ve always been curious about the writer, the background, the process. I’ve mentioned Tintin lots of times in the past too and my favourite Tintin books were, inevitably, the expanded editions which included Hergé’s sketches and ideas, the pictures and objects and texts that inspired him. I first got one of those Tintin books when I was 9 or so, but as recently as the last few years I bought an in many ways similar expanded edition of one of my favourite books as an adult, JG Ballard’s Crash. It mirrors the Tintins pretty closely; explanatory essays, sketches, notes, ephemera, all kinds of related material. Now just imagine how amazing a graphic novel of Crash in the Belgian ligne claire style would be.*

*a bit like Frank Miller and Geof Darrow’s fantastic-looking but not all that memorable Hard Boiled (1990-92) I guess, only with fewer robots-with-guns shenanigans and more Elizabeth Taylor

a scholarly approach to cautionary 1970s semi-pornography/horror: the expanded Crash

A good introduction or foreword is (I think) important for a collection of poems or a historical text of whatever kind. Background and context and, to a lesser extent, analysis expand the understanding and enjoyment of those kinds of things. An introduction for a modern novel though is a slightly different thing and different also from explanatory notes, appendices and footnotes and it’s probably not by chance that they mainly appear in translations or reprints of books that already enjoyed some kind of zeitgeisty success. When I first read Anne Barton’s introduction to Hamlet, I already knew what Hamlet was about, more or less. And while I don’t think “spoilers” are too much of an issue with fiction (except for whodunnits, which I have so far not managed to enjoy), do you really want to be told what to think of a book before you read it? But a really good introduction will never tell you that. If in doubt, read them afterwards!

Some authors, and many readers, see all of these extraneous things as excess baggage, surplus to requirements – which obviously they really are – and that’s fair enough. If the main text of a novel, a play or whatever can’t stand on its own, then no amount of post-production scaffolding will make it satisfactory.* And presumably many readers pass their entire lives without finding out or caring why the author wrote what they wrote, or what a book’s place in the pantheon of literature (or just “books”) is. Even as unassailably best-selling an author as Stephen King tends to be a little apologetic about the author’s notes that end so many of his books, despite the fact that nobody who doesn’t read them will ever know he’s apologetic. Still, I for one would like to assure his publisher that should they ever decide to collect all of those notes, introductions and prefaces in book form, I’ll buy it. But would Stephen King be tempted to write an introduction for it?


* though of course it could still be interesting, like Kafka’s Amerika, Jane Austen’s Sanditon or Tolkien and Hergé (them again) with Unfinished Tales or Tintin and Alph-Art


That Orwell passage in full(er):

“Clearly the young and middle aged ought to try to appreciate one another. But one ought also to recognise that one’s aesthetic judgement is only fully valid between fairly well-defined dates. Not to admit this is to throw away the advantage that one derives from being born into one’s own particular time. Among people now alive there are two very sharp dividing lines. One is between those who can and can’t remember the period before 1914; the other is between those who were adults before 1933 and those who were not.* Other things being equal, who is likely to have a truer vision at the moment, a person of twenty or a person of fifty? One can’t say, though on some points posterity may decide. Each generation imagines itself to be more intelligent than the one that went before it, and wiser than the one that comes after it. This is an illusion, and one should recognise it as such, but one ought also to stick to one’s world-view, even at the price of seeming old-fashioned: for that world-view springs out of experiences that the younger generation has not had, and to abandon it is to kill one’s intellectual roots.”

*nowadays, perhaps the lines fall between those who can and can’t remember life before the internet, and those who were adults before 9/11? Or the Trump presidency? Something like that seems right