With apologies to Marcel Proust – but not very vehement apologies, because it’s true – the taste of honey on toast is as powerfully evocative and intensely transporting to me as anything that I can think of. The lips and tongue that made that association happen don’t exist anymore and neither does the face, neither do the eyes, and neither does one of the two brains and/or hearts* that I suppose really made it happen (mine are still there, though). In 21st century Britain, it’s more likely than not that even her bones don’t exist anymore, which makes the traditional preoccupation with returning to dust feel apt and more immediate and (thankfully?) reduces the kind of corpse-fetishising morbidity that seems to have appealed so much to playgoers in the Elizabethan/Jacobean era.
Thou shell of death, / Once the bright face of my betrothed lady, / When life and beauty naturally fill’d out / These ragged imperfections, / When two heaven-pointed diamonds were set / In those unsightly rings: then ’twas a face / So far beyond the artificial shine / Of any woman’s bought complexion (The Revenger’s Tragedy (1606/7) by Thomas Middleton and/or Cyril Tourneur, Act 1, Scene 1)
*is the heart in the brain? In one sense obviously not, in another maybe, but the sensations associated with the heart seem often to happen somewhere around the stomach; or is that just me?
More to the point, “here hung those lips that I have kissed I know not how oft”, etc. All of which is beautiful; but for better or worse, a pile of ash isn’t likely to engender the same kind of thoughts or words as Yorick’s – or anybody’s – skull. But anyway, the non-existence of a person – or, even more abstractly, the non-existence of skin that has touched your skin (though technically of course all of the skin involved in those kisses has long since disappeared into dust and been replaced anyway) – is an absence that’s strange and dismal to think about. But then most things don’t exist.
But honey does exist of course; and the association between human beings and sugary bee vomit goes back probably as long as human beings themselves. There are Mesolithic cave paintings, 8000 years old or more, made by people who don’t exist, depicting people who may never have existed except as drawings, or may have once existed but don’t anymore, plundering beehives for honey. Honey was used by the ancient Egyptians, who no longer exist, in some of their most solemn rites; it had sacred significance for the ancient Greeks, who no longer exist; it was used in medicine in India and China, which do exist now but technically didn’t then, by people who don’t, now. Mohammed recommended it for its healing properties; it’s a symbol of abundance in the Bible and it’s special enough to be kosher despite being the product of unclean insects. It’s one of the five elixirs of Hinduism; Buddha was brought honey by a monkey that no longer exists. The Vikings ate it and used it for medicine too. Honey was the basis of mead, the drink of the Celts, who sometimes referred to the island of Britain as the Isle of Honey.
And so on and on, into modern times. But also (those Elizabethan-Jacobeans again) “The sweetest honey is loathsome in its own deliciousness. And in the taste destroys the appetite.” (William Shakespeare, Romeo and Juliet (c.1595) Act 2, scene 6) “Your comfortable words are like honey. They relish well in your mouth that’s whole; but in mine that’s wounded they go down as if the sting of the bee were in them.”(John Webster, The White Devil (1612), Act 3. Sc.ene 3). See also “honey trap”. “Man produces evil as a bee produces honey.”You catch more flies with honey.
But on the whole, the sweetness of honey is not and has never been sinister. A Taste of Honey, Tupelo Honey, “Wild Honey,” “Honey Pie”, “Just like Honey,” “Me in Honey,” “Put some sugar on it honey,” Pablo Honey, “Honey I Sure Miss You.” Honey to the B. “Honey” is one of the sweetest (yep) of endearments that people use with each other. Winnie-the-Pooh and Bamse covet it. Honey and toast tasted in a kiss at the age of 14 is, in the history of the world, a tiny and trivial thing, but it’s enough to resonate throughout a life, just as honey has resonated through the world’s human cultures. Honey’s Dead. But the mouth that tasted so sweetly of honey doesn’t exist anymore. Which is sad, because loss is sad. But how sad? Most things never exist and even most things that have existed don’t exist now, so maybe the fact that it has existed is enough.
“Most things don’t exist” seems patently untrue: for a thing to be ‘a thing’ it must have some kind of existence, surely? And yet, even leaving aside things and people that no longer exist, we are vastly outnumbered by the things that have never existed, from the profound to the trivial. Profound: well, even avoiding offending people and their beliefs, probably few people would now say that Zeus and his extended family are really living in a real Olympus. Trivially: 70-plus years on from the great age of the automobile, flying cars as imagined by generations of children, as depicted in books and films, are still stubbornly absent from the skies above our roads. The idea of them exists, but even if – headache-inducing notion – it exists as a specific idea (“the idea of a flying car”), rather than just within the general realm of “ideas,” an idea is an idea, a thing perhaps but not the thing that it is about. Is a specific person’s memory of another person a particular thing because it relates to a particular person, or does it exist only under the larger and more various banner of “memories”? Either way, it’s immaterial, because even though the human imagination is a thing that definitely exists, the idea of a flying car is no more a flying car than Leonardo da Vinci’s drawing of a flying machine was a flying machine, or than my memory of honey-and-toast kisses is a honey-and-toast kiss.
If you or I picture a human being with electric blue skin, we can imagine it and if we have the talent we can draw it, someone could depict it in a film, but it wouldn’t be the thing itself, because human beings with electric blue skin, like space dolphins, personal teleportation devices, seas of blood, winged horses, articulate sentient faeces and successful alchemical experiments, don’t exist. And depending on the range of your imagination (looking at that list mine seems a bit limited), you could think of infinite numbers of things that don’t exist. There are also, presumably, untold numbers of things that do exist but that we personally don’t know about or that we as a species don’t know about yet. But even if it was possible to make a complete list of all of the things in existence (or things in existence to date; new things are invented or develop or evolve all the time), it would always be possible to think of even more things that don’t exist – simply, in the least imaginative way, by naming variations on, or parodies of, everything that does exist. So supermassive black holes exist? Okay, but what about supertiny pink holes? What about supermedium beige holes? This June, a new snake (disappointingly named Ovophis jenkinsi) was discovered. But what about a version of Ovophis jenkinsi that sings in Spanish or has paper bones or smells like Madonna? They don’t exist.
Kind of a creepy segue if you think about it (so please don’t), but like those beautifully-shaped lips that tasted of honey, my mother no longer exists, except as a memory, or lots of different memories, belonging to lots of different people. Presumably she exists in lots of memories as lots of different people who happen to have the same name. But unlike supermedium beige holes, the non-existence of previously-existing things and people is complex, because of the different perspectives they are remembered from. But regardless, they are still fundamentally not things anymore. But even with the ever-growing, almost-infinite number of things, there are, demonstrably, more things that don’t exist. And, without wishing to be horribly negative or to repeat things I’ve written before, one of the surprises with the death of a close relative was to find that death does exist. Well, obviously, everyone knows that – but not just as an ending or as the absence of life, as was always known, but as an active, grim-reaper-like force of its own. For me, the evidence for that – which I’m sure could be explained scientifically by a medical professional – is the cold that I mentioned in the previous article. Holding a hand that gets cold seems pretty normal; warmth ebbing away as life ebbs away; that’s logical and natural. But this wasn’t the expected (to me) cooling down of a warm thing to room temperature, like the un-drunk cups of tea which day after day were brought and cooled down because the person they were brought for didn’t really want them anymore, just the idea of them. That cooling felt natural, as did the warming of the glass of water that sat un-drunk at the bedside because the person it was for could no longer hold or see it. That water had been cold but had warmed up to room temperature, but the cold in the hand wasn’t just a settling in line with ambient conditions. It was active cold; hands chilling and then radiating cold in quite an intense way, a coldness that dropped far below room temperature. I mentioned it to a doctor during a brief, unbelievably welcome break to get some air, and she said “Yes, she doesn’t have long left.” Within a few days I wished I’d asked for an explanation of where that cold was coming from; where is it generated? Which organ in the human body can generate cold so quickly and intensely? Does it do it in any other situations? And if not, why not? So, although death can seem abstract, in the same sense that ‘life’ seems abstract, being big and pervasive, death definitely exists. But as what? Don’t know; not a single entity, since it’s incipient in everyone, coded into our DNA: but that coding has nothing to do with getting hit by cars or drowning or being shot, does it? So, a big question mark to that. Keats would say not to question it, just to enjoy the mystery. Well alright then.
But since most things *don’t* exist, while death definitely does, existence is, in universal terms, rare enough to be something like winning the lottery. But like winning the lottery, existence in itself is not any kind of guarantee of happiness or satisfaction or even honey-and-toast kisses; but it at least offers the possibility of those things, whereas non-existence doesn’t offer anything, not even peace, which has to be experienced to exist. We have all not existed before and we will all not exist again; but honey will still be here, for as long as bees are at least. I don’t know if that’s comforting or not. But if you’re reading this – and I’m definitely writing it – we do currently exist, so try to enjoy your lottery win, innit.
There are two kinds of people* – those who like forewords, introductions, prefaces, author’s notes, footnotes, appendices, bibliographies, notes on the text, maps etc, and those who don’t. But we’ll get back to that shortly.
* there are more than two kinds of people. Possibly infinite kinds of people. Or maybe there’s only one kind; I’m never sure
A few times recently, I’ve come across the idea (which I think is mainly an American academic one, but I might be completely mistaken about that) that parentheses should only be used when you really have to (but when do you really have to?) because anything that is surplus to the requirements of the main thrust of one’s text is surplus to requirements full stop, and should be left out. But that’s wrong. The criticism can be and is extended to anything that interrupts the flow* of the writing. But that is also wrong. Unless you happen to be writing a manual or a set of directions or instructions, writing isn’t (or needn’t be) a purely utilitarian pursuit and the joy of reading (or of writing) isn’t in how quickly or efficiently (whatever that means in this context) you can do it. Aside from technical writing, the obvious example where economy just may be valuable is poetry – which however is different and should probably have been included in a footnote, because footnotes are useful for interrupting text without separating the point you’re making (in a minute) from the point you’re commenting on or adding to (a few sentences ago), without other, different stuff getting in the way.
*like this¹
¹but bear in mind that people don’t write footnotes by accident – the interruption is deliberate²
²and sometimes funny
I would argue (though the evidence of a lot of poetry itself perhaps argues against me – especially the Spenser’s Faerie Queene, Michael Drayton’s Poly-Olbion kind of poetry that I’m quite fond of) that a poem should be** the most economical or at least the most effective way of saying what you have to say – but who’s to say that economical and effective are the same thing anyway?
**poets, ignore this; there is no should be
Clearly (yep), the above is a needlessly convoluted way of writing, and can be soul-achingly annoying to read; but – not that this is an effective defence – I do it on purpose. As anyone who’s read much here before will know, George Orwell is one of my all-time favourite writers, and people love to quote his six rules for writing, but while I would certainly follow them if writing a news story or article where brevity is crucial, otherwise I think it’s more sensible to pick and choose. So;
Never use a metaphor, simile, or other figure of speech which you are used to seeing in print. Absolutely; although sometimes you would use them because they are familiar, if making a specific point, or being amusing. Most people, myself included, just do it by accident; because where does the dividing line fall? In this paragraph I have used “by accident” and “dividing line”, which seem close to being commonly used figures of speech (but then so does “figure of speech”). But would “accidentally” or something like “do it without thinking” be better than “by accident”? Maybe.
Never use a long word where a short one will do. The key point here is will do. In any instance where a writer uses (for example) the word “minuscule” then “small” or “tiny” would probably “do”. But depending on what it is they are writing about, minuscule or microscopic might “do” even better. Go with the best word, not necessarily the shortest.
If it is possible to cut a word out, always cut it out. Note that Orwell wrote ‘always’ here where he could just have said If it is possible to cut a word out, cut it out. Not everything is a haiku, George.
Never use the passive where you can use the active. Surely it depends what you’re writing? If you are trying, for instance, to pass the blame for an assault from a criminal on to their victim, you might want a headline that says “X stabbed after drug and alcohol binge” rather than “Celebrity kills X.” You kind of see Orwell’s point though.
Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent. Both agree and disagree; as a mostly monolingual person I agree, but some words and phrases (ironically, usually ones in French, a language I have never learned and feel uncomfortable trying to pronounce; raison d’être or enfant terrible for example) just say things more quickly and easily (I can be utilitarian too) than having to really consider and take the time to say what you mean. They are a shorthand that people in general understand. Plus, in the age of smartphones, it really doesn’t do native English speakers any harm to have to look up the meanings of foreign words occasionally (I do this a lot). The other side of the coin (a phrase I’m used to seeing in print) is that it’s funny to say foreign phrases in bad translations like “the Tour of France” (which I guess must be correct) or “piece of resistance” (which I am pretty sure isn’t), so as long as you are understood (assuming that you want to be understood) use them any way you like.
Break any of these rules sooner than say anything outright barbarous. It’s hard to guess what George Orwell would have considered outright barbarous (and anyway, couldn’t he have cut “outright”?) but anyone reading books from even 30, 50 or a hundred years ago quickly sees that language evolves along with culture, so that rules – even useful ones – rarely have the permanence of commandments.
So much for Orwell’s rules; I was more heartened to find that something I’ve instinctively done – or not done – is supported by Orwell elsewhere. That is, that I prefer, mostly in the name of cringe-avoidance, not to use slang that post-dates my own youth. Even terms that have become part of normal mainstream usage (the most recent one is probably “woke”) tend to appear with inverted commas if I feel like I must use them, because if it’s not something I would be happy to say out loud (I say “woke” with inverted commas too) then I’d prefer not to write it. There is no very logical reason for this and words that I do comfortably use are no less subject to the whims of fashion, but still; the language you use is part of who you are, and I think Orwell makes a very good case here (fuller version far below somewhere because even though I have reservations about parts of it, it ends very well):
“Each generation imagines itself to be more intelligent than the one that went before it, and wiser than the one that comes after it. This is an illusion, and one should recognise it as such, but one ought also to stick to one’s world-view, even at the price of seeming old-fashioned: for that world-view springs out of experiences that the younger generation has not had, and to abandon it is to kill one’s intellectual roots.”
Review of A Coat of Many Colours: Occasional Essays by Herbert Read. (1945) The Collected Essays, Journalism and Letters of George Orwell Volume 4. Penguin 1968, p.72
Back to those two kinds* of people: I am the kind of person that likes and reads forewords, introductions, prefaces, author’s notes, footnotes, appendices, bibliographies, notes on the text, maps and all of those extras that make a book more interesting/informative/tedious.
*I know.
In one of my favourite films, Whit Stillman’s Metropolitan (1990), the protagonist Tom Townsend (Edward Clements) says “I don’t read novels. I prefer good literary criticism. That way you get both the novelists’ ideas as well as the critics’ thinking. With fiction I can never forget that none of it really happened, that it’s all just made up by the author.” Well, that is not me; but I do love a good bit of criticism and analysis as well as a good novel. One of my favourite ever pieces of writing of any kind, which I could, but choose not to, recite parts of by heart, is the late Anne Barton’s introduction to the 1980 New Penguin Shakespeare edition of Hamlet*. I love Hamlet, but I’ve read Barton’s introduction many more times than I’ve read the play itself, to the point where phrases and passages have become part of my mind’s furniture. It’s a fascinating piece of writing, because Professor Barton had a fascinating range and depth of knowledge, as well as a passion for her subject; but also and most importantly because she was an excellent writer. If someone is a good enough writer**, you don’t even have to be especially interested in the subject to enjoy what they write. Beyond the introduction/footnote but related in a way are the review and essay. Another of my favourite books – mentioned elsewhere I’m sure, as it’s one of the reasons that I have been working as a music writer for the past decade and a half – is Charles Shaar Murray’s Shots from the Hip, a collection of articles and reviews. The relevant point here is that more than half of its articles – including some of my favourites – are about musicians whose work I’m quite keen never to hear under any circumstances, if humanly possible. Similarly, though I find it harder to read Martin Amis’s novels than I used to (just changing taste, not because I think they are less good), I love the collections of his articles, especially The War Against Cliché and Visiting Mrs Nabokov. I already go on about Orwell too much, but as I must have said somewhere, though I am a fan of his novels, it’s the journalism and criticism that he probably thought of as ephemeral that appeals to me the most.
*All of the New Penguin Shakespeare introductions that I’ve read have been good, but that is in a different league. John Dover Wilson’s What Happens in Hamlet (1935, though the edition I have mentions WW2 in the introduction, as I remember; I like the introduction) is sometimes easy to disagree with but it has a similar excitement-of-discovery tone to Anne Barton’s essay.
**Good enough, schmood enough; what I really mean is if you like their writing enough. The world has always been full of good writers whose work leaves me cold
All this may have started, as I now realise lots of things in my writing seem to have done, with Tolkien. From the first time I read his books myself, I loved that whatever part of Middle-Earth and its people you were interested in, there was always more to find out. Appendices, maps, whole books like The Silmarillion which extended the enjoyment and deepened the immersion in Tolkien’s imaginary world. And they were central to that world – for Tolkien, mapping Middle-Earth was less making stuff up than it was a detailed exploration of something he had already at least half imagined. Maybe because I always wanted to be a writer myself – and here I am, writing – whenever I’ve really connected with a book, I’ve always wanted to know more. I’ve always been curious about the writer, the background, the process. I’ve mentioned Tintin lots of times in the past too and my favourite Tintin books were, inevitably, the expanded editions which included Hergé’s sketches and ideas, the pictures and objects and texts that inspired him. I first got one of those Tintin books when I was 9 or so, but as recently as the last few years I bought an in many ways similar expanded edition of one of my favourite books as an adult, JG Ballard’s Crash. It mirrors the Tintins pretty closely; explanatory essays, sketches, notes, ephemera, all kinds of related material. Now just imagine how amazing a graphic novel of Crash in the Belgian ligne claire style would be.*
*a bit like Frank Miller and Geof Darrow’s fantastic-looking but not all that memorable Hard Boiled (1990-92) I guess, only with fewer robots-with-guns shenanigans and more Elizabeth Taylor
A good introduction or foreword is (I think) important for a collection of poems or a historical text of whatever kind. Background and context and, to a lesser extent, analysis expand the understanding and enjoyment of those kinds of things. An introduction for a modern novel though is a slightly different thing, and different also from explanatory notes, appendices and footnotes, and it’s probably not by chance that they mainly appear in translations or reprints of books that already enjoyed some kind of zeitgeisty success. When I first read Anne Barton’s introduction to Hamlet, I already knew what Hamlet was about, more or less. And while I don’t think “spoilers” are too much of an issue with fiction (except for whodunnits, which I have so far not managed to enjoy), do you really want to be told what to think of a book before you read it? But a really good introduction will never do that. If in doubt, read them afterwards!
Some authors, and many readers, see all of these extraneous things as excess baggage, surplus to requirements, which obviously they really are, and that’s fair enough. If the main text of a novel, a play or whatever, can’t stand on its own then no amount of post-production scaffolding will make it satisfactory.* And presumably, many readers pass their entire lives without finding out or caring why the author wrote what they wrote, or what a book’s place in the pantheon of literature (or just “books”) is. Even as unassailably best-selling an author as Stephen King tends to be a little apologetic about the author’s notes that end so many of his books, despite the fact that nobody who doesn’t read them will ever know that he’s apologetic. Still; I for one would like to assure his publisher that should they ever decide to put together all of those notes, introductions and prefaces in book form, I’ll buy it. But would Stephen King be tempted to write an introduction for it?
* though of course it could still be interesting, like Kafka’s Amerika, Jane Austen’s Sanditon or Tolkien and Hergé (them again) with Unfinished Tales or Tintin and Alph-Art
That Orwell passage in full(er):
“Clearly the young and middle aged ought to try to appreciate one another. But one ought also to recognise that one’s aesthetic judgement is only fully valid between fairly well-defined dates. Not to admit this is to throw away the advantage that one derives from being born into one’s own particular time. Among people now alive there are two very sharp dividing lines. One is between those who can and can’t remember the period before 1914; the other is between those who were adults before 1933 and those who were not.* Other things being equal, who is likely to have a truer vision at the moment, a person of twenty or a person of fifty? One can’t say, though on some points posterity may decide. Each generation imagines itself to be more intelligent than the one that went before it, and wiser than the one that comes after it. This is an illusion, and one should recognise it as such, but one ought also to stick to one’s world-view, even at the price of seeming old-fashioned: for that world-view springs out of experiences that the younger generation has not had, and to abandon it is to kill one’s intellectual roots.”
*nowadays, the people who can or can’t remember life before the internet and those who were adults before 9/11? Or the Trump presidency? Something like that seems right
A wise woman once sang “It’s not real if you don’t feel it”* and as far as the arts are concerned it’s as good a measure of quality as anything. But what is “it” that you are feeling? Is everyone feeling the same thing? Clearly not. Even the opinions of people who do like the same song, the same book, the same film, the same painting, are likely to diverge when it comes to the detail of what they like and how it feels.
*The Goonies “R” Good Enough (Cyndi Lauper, Stephen Broughton Lunt, Arthur Stead, 1985)
Part of the mission of modernism in the early 20th century was to free art from associations – from sentimentality, from tradition, culture, religion, politics – and to define it for itself. That was necessary, in order to break the endless repetitive staleness of academicism and/or lowest-common-denominator entertainment, and because photography and recorded sound and near-universal literacy had all become significant factors in western society. Looking at the visual arts: if all that art does is to repeat what is already popular, to record and represent and recreate the visual and the actual, then how can it compare or compete with something like the camera, which captures that external reality? And if that external reality, in the form of contemporary society, is something the artist rejects or objects to, then why use its tools and its language at all?
It’s hard to imagine, a century after the modernist explosion (say 1900-1939), the extent to which the arts were in thrall to academicism, presumably because, having fought first for freedom from the world of manual labour and craftsmanship, artists were keen to stress their respectability, their links to nobility, aristocracy and wealth. But access to that world came, not surprisingly, with rules, manners and forms of behaviour which settled, over the course of a couple of centuries, into its own rigid traditions. Therefore, the artists of the modernist era were, like any revolutionaries, especially concerned with making their own manifestos and statements. ‘Art for art’s sake’ is a nineteenth-century, essentially romantic/bohemian idea which feels remote from the milieu of modernism, but at the same time a theory of pure art is found even more clearly in something like Kazimir Malevich’s The Non-Objective World (1926) than in anything written by Théophile Gautier or Edgar Allan Poe:
“Art no longer cares to serve the state and religion, it no longer wishes to illustrate the history of manners, it wants to have nothing further to do with the object, as such, and believes that it can exist, in and for itself, without ‘things’.”
Though formulated later, this is the kind of theorising that helps partially to explain works like Malevich’s Black Square (1st version 1915). Uncontroversially considered a masterpiece – and one that I myself like a lot – it nevertheless seems to me a work that gains enormously from some kind of context, even if all that context is, is the knowledge that it is in fact a painting by an artist. ‘Left to itself’, without any associations, if encountered ‘cold’, especially outside of a gallery, it might just as easily not be ‘art’ at all. And while that isn’t a bad thing, a random black square encountered in one’s daily life doesn’t – depending of course on the individual who encounters it – have the intensity or pregnant quality that one can (repeat of previous caveat) feel standing in front of Malevich’s ‘Black Square’. But what Malevich does in his statement is to take the artist out of the art and anthropomorphise the art itself (“…it wants to have…”). This seems to me to negate – not unintentionally – what is meant by art at all. For myself, I prefer the German Expressionist Karl Schmidt-Rottluff’s statement which, while it doesn’t even slightly contradict the idea of purely abstract art, puts the artist at its centre, rather than treating art as a kind of self-creating phenomenon:
“I know of no new ‘programme’…. Only that art is forever manifesting itself in new forms, since there are forever new personalities – its essence can never alter, I believe. Perhaps I am wrong. But, speaking for myself, I know that I have no programme, only the unaccountable longing to grasp what I see and feel, and to find the purest means of expression for it.”
Karl Schmidt-Rottluff in Kunst und Künstler (1914), quoted in Wolf-Dieter Dube, The Expressionists, p.21 (T&H 1972, transl. Mary Whittall)
If a painting hangs in a forest…
The three key factors here (for me) then are creator-work-recipient. If the artists (Schmidt-Rottluff’s ‘personalities’) are trying to communicate something specific to the recipient with their work, then they either succeed or they don’t. If the artist doesn’t succeed in communicating what they intended to communicate – or if they aren’t thinking of the ‘end user’ at all, and are expressing their own feelings/ideas purely for their own reasons – they may (and probably will) still transmit something of themselves; a personality, an emotion or group of emotions, a mood or idea. But although in either case the work may be imbued with that power, it only becomes power when someone is there to experience and/or interact with it. In material terms, the great masterpieces of painting, be it the Mona Lisa (oil paint on wood), or the Black Square (oil paint on linen) have little more intrinsic ‘value’ than a few tubes of oil paint or a piece of wood or linen; after the lights go out and the visitors go home, they basically cease to exist as art. The alchemy that takes place when art finds an audience is what makes it art; at least, so it seems to me.
So can there be ‘good’ or ‘bad’ art? Short answer; intuition says yes, but experience says no. Alongside the disintegration of traditional academic rules, there has been the growth and persistence of the myth that, in order to break the rules of art, you must first understand and adhere to the rules. This idea has been strengthened by the fact that some of the iconic figures of modern art, like Picasso and Dalí, were immensely talented by the traditional, Renaissance standards of art and could easily have made careers in academic painting; but so what? Would Guernica, looking exactly as it does, be a lesser work if it was the only painting Picasso had ever done, or if his immature works had been unimpressive?
Separating personal, aesthetic judgements of good and bad from objective judgements is almost impossible; a strong argument could be made for either of the above images being ‘better’, especially since the emotional impact is as subjective as anything else. And separating these kinds of aesthetic judgements from moral ones can become even more complicated – can a work of art that is an expression of something ‘bad’ be good? If, for example, we discovered that Picasso was celebrating rather than mourning the slaughter and destruction at Guernica, would the painting be as good? And what does good even mean in that sentence anyway? The idea that (for instance) a painting or a song is “bad” is essentially meaningless, despite the fact that millions of paintings and songs are clearly very bad. They can never be demonstrably bad because, as Hamlet says, and even the relatively short history of pop music proves, “there is nothing either good or bad, but thinking makes it so.” Even the most derivative, tuneless, unimaginative, moronic or amateurish song can and will be loved by someone, or many someones. And beyond people liking it, how can the quality of something like art truly be gauged? Yes, ‘liking’ can be a complex thing and is not the same as ‘admiring’, and yes, there are people with knowledge and expertise and highly developed critical faculties and so forth; but their opinion can no more prove a work of art is good than a restaurant critic can prove that a Michelin-starred chef’s finest creation tastes better than a Big Mac.
Despite the ‘golden ratio’ of the ancients, Hogarth’s ‘line of beauty’ and the Turner Prize, despite Grammys and Brits and Eurovision Song Contests, there is no logical ‘2 + 2 = 4’ type equation which can prove that “4” = a good work of art. In architecture at least, a building either works as a building (ie stands up and people can go inside) or it doesn’t, but even then, it would probably be easier to ‘prove’ that your local supermarket is logically ‘better’ as a building than Chartres Cathedral, rather than vice versa. But it obviously isn’t (unless you are very lucky) better than Chartres Cathedral. It feels too trite and easy to say ‘art is only as good or bad as an individual’s opinion of it’, but I can’t really do any better than that. You can’t make someone like something by telling them it’s good, however convincing your argument may be to you.
I also don’t think (though I am less convinced about this) there are good or bad reasons for liking a work of art, a song or a book, although there are certainly different levels of engagement, which are still however subjective; I like Citizen Kane but I love Robocop. Do I think Robocop is therefore the better film? Absolutely not. In the western world there is a kind of agreed pantheon of ‘great art’, encapsulated in the ‘high art’ end of the scale by the way in which art history, English literature, cinema et al are taught in institutions and, at the lower end of the scale in books and websites of the ‘1000 albums/films you must hear/see before you die’ type, but in practice everyone constructs their own pantheon, with the importance of the ‘official’ ones being little more than a guide. I know Robocop wouldn’t exist in the same form as it does without the innovations of Citizen Kane, but that doesn’t change the way I feel about either film. In reality, the only way to gauge (for example) the “greatest album ever recorded” is to have a public vote without offering a list of previously selected albums to choose from and then see who ‘wins’ – and I am sure I still wouldn’t agree with it.
Over the years, it has often been considered that the correct critical attitude is to remove sentimentality from judgements on the arts, and although it is one way – judging pictures on their composition, harmony etc, ignoring subject altogether, evaluating music on its structure, technical skill etc – it is sometimes almost impossible to do, and really, thinking again of both the emotional satisfaction people get from songs, films, pictures they love, and the example of Malevich’s Black Square, is it even desirable? Thinking of Black Square, to judge a work which has so much context – theoretical, spiritual, cultural and emotional – by the sum of its basic physical attributes is reductive, as well as boring. Likewise, a great portrait in no way relies on the viewer knowing anything about the sitter, but – is Holbein’s great Henry VIII (1537) more interesting/engaging as flat masses of colour laid out in a particular, intricate design on a two-dimensional surface, or as the impression and interpretation of one human being through the eyes, mind and skill of another? The answer for me is the latter, which is really both, since the technical aspects of the first option are anyway incorporated in the second.
Pogo and the Black Square
A debate that rears its head fairly often – and I guess will increasingly do so as information about everything becomes more readily available – is whether ‘bad’ people (or just bad people) can make good art. Unlike art, and despite the murkiness of morality (influenced as it is by essentially amoral and anyway changeable concepts like tradition, religion and culture), there are some people that we can agree are bad, or at the very least, ‘not good’. Here’s an uncontroversial opinion; John Wayne Gacy, the ‘killer clown’, rapist and murderer of around 33 young people, was – even if he was at the mercy of his own personality disorder – a bad person. He also made something that is as close to being ‘bad art’ as anything I can think of. The fact that his paintings are collected by people and have sold for serious sums of money has nothing to do with their quality and everything to do with their associations. You could of course say much the same about the Black Square. And if the imaginary passerby who unpreparedly encountered the Black Square also encountered one of Gacy’s paintings, how would the experience differ?
Firstly, they would know immediately that it was a painting made by a human being, and, if from a western background, they would probably recognise the subject matter. Because of this, Gacy is both at an advantage and a disadvantage; advantage because, no matter how the viewer feels about clowns, they have immediate ‘access’ to the painting – ‘I know what that is’. Disadvantage, because while the black square is a black square and therefore looks like a black square, Gacy’s clowns, portraits, skulls etc are – by the standards that most people judge art by – pretty amateurish. He wasn’t an accomplished enough artist (I don’t mean just in a technical way) to communicate anything very deliberately (he wanted his paintings to bring joy into people’s lives; which seems unlikely, unless said people are serial killer fetishists), so what the viewer is left with are his obsessions – or at least the ones he could express to his own satisfaction through his paintings.
Going back to my highly dubious creator-work-recipient idea of art: as the creator, Gacy was (or said he was) trying to do something specific – to create bright and happy pictures to bring joy to the recipient. Whether he succeeded in this aim, regardless of who he was, depends on how one responds to childlike but sometimes enigmatic pictures of clowns. What he definitely did do was to transmit something of himself; a clear-cut but deeply alienated/alienating vision of the world; actually, without a world. Not, as one might expect, a simplified Norman Rockwell America, with the sun in the sky and a clown in the garden, but essentially just the clown; mostly in fact Pogo the clown, Gacy’s own alter ego, sometimes with an extremely cursory, but telling hint of a setting. Not a circus, or the suburbia of the children’s parties he haunted, but a hint of a dark, fairytale (the seven dwarfs appear in a particularly odd picture) forest. These are clowns in the wild. The term ‘outsider art’ could have been coined for Gacy’s paintings. The other often-used term, ‘naïve art’, seems fleetingly appropriate, until one considers pictures like his paintings of Charles Manson, or even more so, of Tim Curry’s Pennywise from the TV adaptation of Stephen King’s IT. Gacy may not have been a good painter, he may have been to all intents and purposes insane, but he was not naïve; he knew that he belonged to a pantheon of famous murderers, that he was the original killer clown, and he was flattered by the association.
But Gacy was chosen as an intentionally extreme example; even more extreme would be Hitler, whose serviceable but bland and slightly lifeless paintings are also highly collectable, despite lacking even the visceral ‘disturbed’ quality of Gacy’s. Whereas the innocent buyer might just be attracted to Gacy’s clowns for their kitsch, weird, outsider quality, Hitler’s works are best suited for what they were meant to be – postcards, unambitious souvenirs, illustrations. The lack of frisson they have as images is an indicator that the reasons people have for buying them have little to do with the pictures themselves. For, hopefully, a variety of reasons, these people are not buying ‘art’ at all, they are buying history.
The art didn’t abuse…
The world of actual art also has its fair share of murderers, rapists and so forth, and the question of whether their lives and actions invalidate their work is never really answerable. Apart from anything else, what about the legions of artists, musicians, writers whose private lives and opinions we know little or nothing about? Or artists like Andrea del Castagno, known for centuries as a murderer because of a mistake (whether malicious or not we cannot know) in Giorgio Vasari’s biography of him? At this distance of time it isn’t really an issue, even when talking about a definite murderer like Caravaggio. We don’t expect historical figures to have views, opinions and beliefs that we would find acceptable in the 21st century, although people of the 16th century certainly felt at least as strongly about murder as we do now. When we get closer to our own time, things become more complicated. For me, it’s easy to disregard the achievements of, say, Eric Gill, because even without the knowledge of his child (and animal) abuse, his work is not really my cup of tea; graceful and stylish yes, but, given that he was a contemporary of people like Jacob Epstein and Constantin Brâncuși, also a bit un-dynamic, insipidly faux-modern and backwards-looking. And then, adding the context, knowing about Gill’s religious beliefs, a bit churchy, and then, knowing about his abuse of his daughters, hypocritically pious too; it leaves a bad taste. Which doesn’t stop people from loving it, and nor should it; the art didn’t abuse anyone. (This short article by Waldemar Januszczak is very good on Gill I think).
But one of the points about Gill is that even his apologists probably wouldn’t, these days, hold an exhibition of Gill the artist without at least acknowledging the problems with Gill the man. More my cup of tea, and more relevant to now, the Scottish National Gallery of Modern Art will be hosting an exhibition of Emil Nolde’s work this summer. German Expressionism (or in Nolde’s case, German-Danish Expressionism) is one of the areas of art I love the most and, although Nolde is not one of my favourite artists I will be excited to see his work. But. Emil Nolde was a member of the Nazi Party. That of course doesn’t change his paintings, but it makes them – and the exhibition – problematic for several reasons. The main reason for me, is that, in its pre-exhibition publicity at least, the NGS makes no mention of his Nazism whatsoever. That might still be okay, I suppose, if they didn’t include this little snippet in their bio:
“This exhibition…covers Nolde’s complete career, from his early atmospheric paintings of his homeland right through to the intensely coloured, so-called ‘unpainted paintings’, works done on small pieces of paper during the Third Reich, when Nolde was branded a ‘degenerate’ artist and forbidden to work as an artist.”
There is a certain amount of schadenfreude in this detail. But there are also the ghosts of fellow Expressionist Elfriede Lohse-Wächtler, murdered at Sonnenstein castle in 1940 as part of a government programme to eliminate the mentally ill, and of German-Jewish painters like Charlotte Salomon and the surrealist Felix Nussbaum, murdered in Auschwitz in 1943 and 1944 respectively. As a member of the Nazi Party, Nolde was to an extent complicit in their deaths; for him, ‘entartete Kunst’, a policy he didn’t necessarily oppose in general, meant he had to paint unobtrusively, in private, and couldn’t exhibit his work until after the war. For those artists it meant a death sentence; for many others it meant harassment or exile. A more wide-ranging exhibition in which Nolde’s paintings bridge the gap between the work of his fellow ‘degenerates’, including perhaps some of Nussbaum’s Auschwitz paintings, and the art of Nazi-approved painters like Adolf Ziegler or Conrad Hommel would be a strange and indigestible (and chronologically back to front) thing perhaps, but I think that, failing that kind of overview, we at the very least shouldn’t be encouraged to feel sorry for Nolde that he had to work in secret because of the actions of the government he supported.
Is Nolde’s art then ‘Nazi art’? No, or at least not in the same way that state-sponsored art under Hitler was. It isn’t didactic, realist or heroic. Nolde saw expressionism, and therefore his own painting, as definitively German, and was deeply moved by colour, which he equated with emotion. The works of his which I like best (which, by coincidence perhaps, long pre-date even the idea of the Third Reich and belong to the period when he had recently been in contact with the younger artists of Die Brücke) translate that emotion into intense and visionary land and seascapes. These pictures feel utterly free of the ideology of Nazism – but that said, even under Nazi rule, the German ideal of the nude Freikörperkultur (Free Body Culture) and ‘oneness with nature’ was respectable in a way that was unthinkable in the UK, so the apparent freedom of the painting need not be reflected in the kind of egalitarian ideals that artists like Ernst Ludwig Kirchner expressed in their art. If expressionism can be seen as the ultimate kind of subjective painting, where the aim is ultimately to make the viewer feel what the artist feels by filtering a subject through the distorting lens of their individual perception, then Nolde’s paintings show the world as it was felt by someone who could write, in 1938;
As to the question of how easy it is to like Nolde’s ‘unpainted pictures’, I’ll have to wait for the exhibition.
How do you solve a problem like Morrissey (it solves itself)
The Nolde exhibition is only one reason that these issues have been on my mind recently; the other, more personal one is Morrissey. Morrissey is clearly not John Wayne Gacy, or Adolf Hitler, or even Emil Nolde. Nor is he, unlike Varg Vikernes, whose music I also like, a murderer. But I never felt let down by any of those people; with Varg I knew about him before I ever heard his music and have no emotional investment in it, whereas Morrissey’s recent utterances seem completely at odds with the worldview of his earlier music; which is not his problem, or his fault, I simply interpreted what I wanted to from the art he created, just as it’s possible to look at Emil Nolde’s work and see beauty and freedom there, even if that freedom and beauty is diametrically opposed to the views he professed in his non-artistic life.
I first listened to The Smiths and Morrissey when I was 17, although I was aware of them/him years before. Of all the music I loved as a teenager I think Morrissey’s was the music I identified with the most. I liked The Cure and Joy Division and The Fall probably as much, but their music was – I suppose because it’s less lyrically straightforward – less personal to me. To this day, Morrissey’s lyrics (up to the mid 90s at least) are engraved on my memory and I certainly know more of his lyrics by heart than any other band or artist’s. It’s been very clear for a while now (and murkily apparent for much, much longer) what kind of person, politically, Morrissey is. And that’s fair enough; he is entitled to his views, even if I think he’s wrong and don’t feel inclined to fund him any further (I still think he is more complex than his worst detractors would say, but so what?)
It’s no use, really, to say, as some people do, that there are artists out there making great work who don’t have extreme right wing views. Obviously that’s true; but unless their art speaks to you why would you care? And most of the time, one has no idea what an artist’s opinions or beliefs are anyway, unless they specifically say so. And (to me) art that is explicitly political/religious or politically/religiously-motivated rarely connects on a very deep level; and to paraphrase Cyndi again, it’s not real unless I feel it.
And I always felt The Smiths’ music, deeply, and much of Morrissey’s solo stuff too, though it is less critically acclaimed. His recent/latest statements in the press don’t seem like the words of someone who could write “It’s so easy to laugh/It’s so easy to hate/It takes strength to be gentle and kind”, but that’s people for you.
Initially, several controversies ago, I decided that although I wouldn’t actively avoid Morrissey and his works, I would just no longer buy them in a way which would benefit him directly; mean and possibly unfair I know, but that’s people for you too. I am not someone who is going to burn records, CDs and books, or even throw/give them away in disgust, if they have ever meant anything to me. But then came the latest and most crass Morrissey interview (so far) and I got to the point where I’d be kind of embarrassed to buy anything Morrissey-related at all. It’s not so much (as one example out of many) the factual inaccuracy of statements like “Hitler was left wing” – people have been saying moronic things like that (Hitler was a Zionist etc etc etc) for many years. It’s the fact that, as with those who claim the death toll in the Holocaust has been exaggerated, people like Morrissey seem to think that this amazing revelation about Hitler is in any way relevant to the things his regime did and how one should feel about them. As with (ironically) people who taunt vegetarians with ‘Hitler was a vegetarian’, it spectacularly misses the point; Hitler is not famous because he’s a vegetarian, any more than he’s famous for his ‘left wing’ views. And you know that, so don’t be so stupid.
But anyway, in the end my fears that the soundtrack to my youth/life would be tainted only came half true. When Morrissey songs popped up in a shuffle I found that, without any feeling of revulsion, drama or anguish, I just didn’t want to hear them anymore. The connection seems to be gone, without regret and possibly with the relief that I was never – despite the fact that I even, unrepentantly, like his autobiography – one of those Morrissey obsessives. Maybe one day my love of his music will come back, maybe not. It’s not real if you don’t feel it and, right now I just don’t, so it isn’t. Ho hum.
“Time, time, time, see what’s become of me…” When The Bangles covered Simon & Garfunkel’s A Hazy Shade of Winter in 1987, the song was 21 years and one month old; now The Bangles’ version (from the underrated – according to me – movie of Bret Easton Ellis’ Less Than Zero) is 30 years and one month old; time flies, another year draws to an end etc etc etc. It took until the early 90s for 60s nostalgia to really take hold and, true to form, 30 years on from the 1980s, 80s nostalgia is everywhere; in music, in fashion, (especially) in film and television. Even the tired, terrifying old tropes of the cold war are back; excellent stuff.
It’s approximately 90 years since HP Lovecraft wrote, “The oldest and strongest emotion of mankind is fear, and the oldest and strongest kind of fear is the fear of the unknown.” (in the essay Supernatural Horror in Literature (1926-7)), and it’s got to be something like 25 years or so since I first read those words (in the HP Lovecraft Omnibus Vol 2, Dagon and other Macabre Tales, Grafton Books, 1985, p.423). So what about it?
Lovecraft might well be right about fear; but more pertinent to my intro is that perhaps the oldest emotion preserved in literature – at least (major, major caveat, based on my ignorance) in the literature of Europe – is nostalgia, and the feeling that things were better in the past. The literature of the ancient Greeks makes clear that the age of heroes already lay in the distant past; the pride and arrogance of Imperial Rome was tempered – formally, at least – by the belief that it was a pale imitation of the Republic which the Empire supplanted. The earliest literature in (old) English makes it clear that the inhabitants of what was one day to become England were a) not entirely sure of what had come before, but b) knew that it was in many ways ‘better’ and certainly more impressive than the present day of the 8th century:
“The work of the Giants, the stonesmiths,/ mouldereth…
And the wielders and wrights?/Earthgrip holds them – gone, long gone”
The Ruin (translated by Michael Alexander, The Earliest English Poems, Penguin Classics, 3rd edition, 1991, p. 2)
Even closer to home (for me) is the earliest literature of Scotland, the Gododdin of the poet Aneirin, dating from anywhere from the 7th to the 10th century and originally, it is presumed, written – or at least passed down – in the ancient British language now called Old Welsh (which it is of course, but it is also, geographically, old English and old Scots, since it seems to have been spoken in a far wider area than modern Wales). The Gododdin is a series of elegies mourning the loss of the warriors of the eponymous ancient kingdom (which spread roughly over what are now the modern Scottish regions of Lothian and Borders) in battle, and with them the heroic culture of the era.*
To say that nostalgia as opposed to fear may be mankind’s oldest emotion is problematic, both logically (chicken/egg innit), and because for all of its obviously dominant ingredients – sadness/regret and happiness – a large component of nostalgia can be fear, and, specifically, Lovecraft’s ‘fear of the unknown’ (in this case the always unknowable future). In the examples noted above, the glamour (not intended to have its old, magical meaning, but actually that is probably even more appropriate) attached to the past exists partly because it can’t come again. If the people of “now” are as noble, heroic etc as the people of “then”, then somehow the past – and the ancestors, a vital component of the values of most non-Christian and pre-Christian cultures – is not receiving its due reverence.
*this theme even crops up in a very similar form in the Fortinbras subplot of Shakespeare’s Hamlet, preserved at one remove from the earliest known version of the story, Saxo Grammaticus’ elemental/mythological 13th century version from his Gesta Danorum. But even this is assumed to be derived from an earlier, lost source, probably Icelandic.
Although it seems almost incomprehensible to someone of my generation, there seems to be a similar, ‘don’t disrespect the ancestors’ unease nowadays in the unwillingness in some circles to condemn wholesale the expansion/existence of the British Empire. And really, it’s not complicated – it is entirely possible to be impressed by and/or grateful for the innovations of the Victorian era – flushing toilets, railways and whatnot – while seeing the culture and times for what they were; repressive, oppressive, misogynistic, racist, ignorant. It shouldn’t be difficult, because it’s happened before, more or less. Christianity made it easy for previous ages to condemn the pagan empires of Rome, Greece, Egypt and co (and indeed the ancient Arabic civilisations) without abandoning the inventions and innovations of those civilisations. Indeed, even at the height of Christian belief in Europe, interest in the cultures of the pagan empires remained high, even if Christian scholars felt the need to inflict a version of their own value system onto their researches. There’s no reason that people now shouldn’t be able to do the same with the ages we have left behind, or are hopefully in the process of leaving behind. Yes, good things come from bad – not because of the bad, but because (most) human beings are extraordinary.
In 2017 there seemed to be – as I suppose there always must be – an ever-increasing number of warring nostalgias and counter-nostalgias, the latest being for the Russian Revolution of 1917: a violent event with vast and oppressive consequences, and therefore definitely negative, but, like most revolutions, born of aspirations and ideals which are hard to dismiss. In fact, Dickens’ famous opening to A Tale Of Two Cities seems uncannily prophetic, because Dickens – as he explicitly realised – could see that human nature and human actions remain fairly constant:
“It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to Heaven, we were all going direct the other way – in short, the period was so far like the present period, that some of its noisiest authorities insisted on its being received, for good or for evil, in the superlative degree of comparison only”
I think it’s probably true that it’s always the best of times for somebody, in some respect; it’s certainly always the worst of times for others. That sounds complacent, or at least fatalistic, but only if one doesn’t try in some way to improve things. This kind of impersonal nostalgia – for ‘better’ times – is necessarily selective (in fact, all nostalgia is, because perception is selective – hmm, it seems like this just started copying the thing about realism I wrote recently, but bear with me) and relies to a large degree on ignorance and/or self-deception in order to be nostalgia at all.
History isn’t a subject; history is everything – people, peoples, cultures, societies. But, necessarily, “history” as taught, or absorbed through popular culture, filters and simplifies, to the point where some people in Britain still talk nostalgically about ‘Victorian values’ without (usually) intending any reference to the exploitation and subjugation of untold millions of people, child prostitution and child labour, the life expectancy of the average Victorian person, etc etc etc. And, as always, history is more complex than its popular image. The era may be symbolised for British people by the building of railways or the expansion of the Empire, or by Jack the Ripper, or Queen Victoria being unamused, or by the establishment’s treatment of Oscar Wilde; but it was also the era that produced and shaped Jack the Ripper, Queen Victoria and, of course, Wilde himself, as well as the whole decadent movement. Interestingly, Sigmund Freud was only two years younger than Wilde; an apparently value-free but perhaps significant observation…
This kind of complexity is what makes history more interesting than it’s sometimes given credit for; the Scottish Enlightenment was a wonderful, positive, outward-looking movement, but it coexisted in Scotland with a joyless, moralising and oppressive Calvinist culture. Time and nostalgia have a way of homogenising peoples and cultures. The popular idea of ancient Rome is probably one of conquest, grandeur and decadence, but what is the popular idea, if there is one, of ‘an ancient Roman’? Someone, probably a man, probably from Italy, in a toga or armour; quite likely an emperor, a soldier or a gladiator, rather than, say, a merchant, clerk or farmer. Even within this fairly narrow image, a complex figure like the emperor Elagabalus (Syrian, teenage, possibly transgender) defeats the obvious school textbook perceptions of ‘Romanness’ (as, perhaps, it did for the Romans themselves). Even in our own time, the fact that older generations from the 60s/70s to the present could lament the passing of times when ‘men were men & women were women’ etc is, to say the least, extremely disingenuous: presumably what they mean is a time when non-‘manly’ men could be openly discriminated against and/or abused and women could be expected to be quiet and submissive.* Similarly, throughout my life I have heard people – and not exclusively right-wing people – talk about the economic success that Hitler brought to Germany; but you don’t have to be the chairperson of a financial think tank to see that a programme of accelerated militarism that requires war in order to function isn’t really a viable economic model for anyone who doesn’t also espouse the ideology of Nazism. But a strange kind of nostalgia dictates that if it wasn’t for all those pesky Nazi faults he could have been a great leader. He couldn’t, though, because he was a real person; he did the things he did, and therefore he wasn’t a great leader.
*Throughout this article I have been referring to ‘people’ and ‘humankind’ in what is intended to be an inclusive kind of way, referring to people of all races, genders or indeed lack of gender. I admit I have probably referred to gender in a binary sense, partly no doubt through laziness. However, I do have a tendency not to use the term ‘cis’ unless necessary – for me personally, the word ‘women’ includes trans women and the word ‘men’ includes trans men. I don’t intend any offence by this, but I also don’t really mind if anyone is offended. I think it’s a shame that something as basic (if not simple) as a person’s gender should be a matter of opinion, but so it seems to be. My own view is that the contents of someone’s underwear are none of my business unless they explicitly make it so.
As I’ve said at least one too many times, history is complex, but nostalgia, despite being impossible to sum up in a single word other than itself,* has a simplifying quality. Nostalgia is safety – political reactionaries always look to the past for ideas of stability – but that is only because the past itself is stable, in the sense of being unchangeable. As we see daily, though, while the past is unchangeable (pending the invention of the time machine), history – through endless re-interpretations and re-evaluations and new points of view – isn’t really ‘stable’ at all; and I think it’s fair to assume that (as Dickens implied) every ‘golden age’ masks a dark age. And although it mainly seems otherwise, people are, by and large, fairly positive; they want to look back with fondness, even if it’s a melancholy fondness. There’s a quote from the great Scottish singer/songwriter Alex Harvey that strips away the soft-focus effect that the distorting lens of nostalgia imposes on history:
“Nobody ever won a war. A hundred thousand dead at Waterloo. No glory in that. Nobody needs that.” (quoted in Charles Shaar Murray’s Shots From The Hip, Penguin Books, 1991, p.71)
This is, I think, indisputably true; but evidently I am wrong – people are entirely capable of being nostalgic about almost any negative event. ‘The Blitz Spirit’ is remembered fondly in Britain because the Blitz ended years ago, all of its bombs have long since fallen, and lots of people survived it. It’s hard to make a film about the past without an element of nostalgia, especially when the film is played out as a thriller or adventure of some kind. But even leaving aside war movies and the old-fashioned western film, there is and has been in recent(ish) times a whole sub-genre of ‘elegiac’ Western movies which, by and large, focus on the dying days of the ‘old west’ while barely acknowledging the genocide and horror that is the historical backdrop of the period. In a way, that’s fair enough – those stories are not about that subject – but when there are no (or very few) films about that subject, and it is barely even acknowledged by ‘official’ narratives of taught history, it’s a stark and telling omission.
*Though, interestingly, its original Greek meaning – ‘homecoming pain’ – is more specific than the word has come to be in English, and most European languages tend to use variations of the word ‘nostalgia’ rather than having their own word with the same meaning.
It’s my personal feeling that nothing good is produced by adversity; which is not to deny that people are amazing, resourceful, resilient and inspiring; they are. When I said before that every golden age masks a dark age, it’s probably true too that every dark age is shot through with some elements of positivity, although I won’t scrutinise that statement too closely. Countries which were colonised by the British Empire (or indeed any empire) manage to grow and assert and define their own cultures; but we can never know what was lost. I love blues music (and indeed the whole phenomenon of western popular music which mostly grew from it), but again: we can never know what would have been, had those energies not been re-directed by a couple of hundred years of slavery and exploitation. Individuals achieve almost superhuman feats of bravery and resourcefulness when facing adversity – escaping from abusers, kidnappers and so on – but no-one in their right mind would, I hope, recommend that all young people undergo these kinds of ordeals in order to fully achieve their potential. I don’t think it’s particularly useful for individuals (although governments and institutions are a different thing) to feel guilty about the deeds of the people of the past (or proud of the achievements of the past, really); but equally, I see no need to pretend that, because India has a big railway network, the British Empire did something positive by oppressing the country’s people and culture and stealing its resources. Nothing good came of the British in India. India survived anyway, just as people survive catastrophes everywhere and achieve amazing things in doing so.
So much for impersonal nostalgia – the personal kind is in many ways very similar, if less destructive. I’ve always been a nostalgic person, both for things I don’t remember, or that were long before ‘my time’ (you name it: silent movies, the 1960s, the Weimar Republic, Hong Kong cinema of the 70s, the Northern Renaissance, the Scottish Enlightenment, 80s teen movies) and, more naturally perhaps, within my own experiences. One of the things that initially made me write this was a reference in Anthony DeCurtis’ biography Lou Reed – A Life (John Murray, 2017)* to Reed’s 70s partner/muse Rachel, a fascinating figure who seems to have vanished into history. In googling her I discovered various sites about vanishing/vanished aspects of New York and, because old photographs are endlessly fascinating, somehow segued from that to the vanished Jewish East End of London and the vanished and vanishing everything of everywhere. But as irretrievable as Jewish East London of the 60s and the underbelly of 70s New York are, one’s own childhood is just as irretrievable – not that one wants to retrieve it, exactly.
* An excellent book, but one which illustrates some of my points; while Lou Reed spent most of his adult life complaining about his conservative 1950s childhood, DeCurtis himself has a more rose-tinted view of the period, saying “In stark contrast to the identity politics of today, assimilation was the order of the day…and none of Reed’s friends, Jewish or not, recall incidents of anti-Semitism or bias” (p.14) – fair enough, except that he also says, “Richard Mishkin was a fraternity brother of Allan Hyman’s in Sigma Alpha Mu, a so-called Jewish fraternity because at the time Jews were not permitted in many other fraternities.” (p.36)
Most of the polaroids etc that make up the ever-browsable Internet K-hole appear to be American, but any child of the 80s will recognise the texture and aura of the era we grew up in. When George Orwell wrote, in The Lion and the Unicorn, “What have you in common with the child of five whose photograph your mother keeps on the mantelpiece? Nothing, except that you happen to be the same person”, he was putting his finger on one of the strange paradoxes of culture, heritage and nostalgia. The memories I have of the 80s are made up of a distorted, child’s-eye view of events and culture which is truly mine, plus things I know now that I didn’t then, other people’s memories, TV, films. The most potent sources of nostalgia seem to be – as the makers of shows like Stranger Things and Dark, and films like Super 8 and others (too many to list), are very aware – the things you didn’t notice that you had noticed, the most ephemeral details: jingles from adverts, fonts, packaging, slang.
And this is right, I think. The fleetingness of things remembered has nothing to do with their power as memories. I have no idea what the first horror film I saw was, but I do know that a scene on some TV show where skinheads (or possibly a single skinhead) glued a man’s hands to the wall of a lift/elevator scared me as a child and stayed with me for a long time; maybe because I used to see skinheads around on the streets (you had to watch the colour of the laces in their Doc Martens to see if they were ‘bad’ skinheads or not – though they were probably kids too, I now realise). I also know now (but didn’t then) that these were the second wave of skinheads, which is why I also saw Oi! written on various walls around the town; at the time I don’t think I ever made the connection. Again, the impact of very small occurrences like these shows how impossible a really objective view of history is. I no longer bear any high school grudges, but without really thinking about it, many small and/or random sneers and insults from my youth have stayed with me in vivid detail, along with the people and places involved. Similarly (but nicer), I will eternally feel grateful to two beautiful black girls in Camden in (I think) 1990 or 91 who made remarks to me which, even at the time, were at best ‘not politically correct’ but which pleased me immensely; it is among the very few teenage memories that boosted rather than eroded my confidence; a tiny thing, barely even an ‘incident’, but a big deal to a painfully shy adolescent. What to make of such a minor, slightly embarrassing episode (especially embarrassing at the time: I can still vividly remember – although it was not a rarity – my whole face burning when I blushed; people often remarked on the redness of my blushes, and I remember – not even slightly nostalgically – being compared to a tomato, being told I looked like I would ‘burst’, etc)? Nothing, except that real nostalgia, unlike the nostalgia industry (“it was the 70s; Buckaroo!”, to quote Alan Partridge) is particular, not general. The Camden episode may include references to youth, gender, race etc, but it has nothing to do with those factors really, and I doubt if the two girls remembered it even days later. These are not the kinds of details which are worthy of a biographer’s attention; but they define my youth every bit as much as the music I listened to, the sweets I remember that no longer exist, or the clothes I wore.
To me, 80s nostalgia has less to do with “the 80s” in the sense that it appears in TV shows and films than with a litany of gloomy-sounding things: the urban decay of 60s and 70s council estates, indoor markets, army stores, arcades, brutalist churches that harmonised with the concrete towers that the fire brigade used for practice. This is a kind of eeriness as nostalgia, reflected in my liking for empty streets and art that represents empty streets: Algernon Newton, Maurice Utrillo, Takanori Oguiss, the photography of Masataka Nakano and, taken to its extreme, Giorgio de Chirico, where the emptiness isn’t empty so much as it is pregnant; reminding me always of – nostalgia again – the ruined city of Charn in CS Lewis’ The Magician’s Nephew (by far my favourite Narnia book), which made a huge impression on me as a child, and may be where my liking for such things (including ‘urbex’ photography, like that of Andre Govia, and of course The Ruin, quoted earlier) comes from.
“The passing of time and all of its crimes, is making me sad again” – sadly, one of those crimes is that when I first heard that line (from Rubber Ring by The Smiths) in 1989 or thereabouts, Morrissey seemed to be on the side of the downtrodden and marginalised, whereas now he seems to be one of that increasing number of people who pretend that the mainstream of British culture is itself somehow being marginalised; which is patently ridiculous. And nostalgic, of course. And there’s a whole culture industry, with its own cultural shorthand, to bolster the standardised view of any given period; especially now, when a decade can be summed up by a b-list cultural commentator or celebrity who clearly isn’t old enough to remember some of what they are talking about, saying “‘e were mad, weren’t ‘e?” about some figurehead of the era. Not so great, of course, when said figurehead turns out to be Jimmy Savile or Rolf Harris, at which point even nostalgia, like history, has to be revised. But, as endlessly mentioned above, the beauty of all nostalgia is that it’s selective. The 70s that Morrissey seems to feel nostalgic about (in the true, mixed-feelings sense; witness the whole of Viva Hate, which I love) wasn’t ‘better’ than nowadays, but the writer of its songs was young then. He isn’t now. There are younger people who are also nostalgic about the 70s, or the 80s, because they see the partial versions of the era(s) preserved by those who were there then, or who pretend to have been. The people who mourned the loss of the Blitz spirit mourned it because a) they were younger then, and b) they survived it, and told people about its spirit. The people who are nostalgic for the Empire will (hopefully) never have to deal with being in charge of a mass of powerless, subject people whose resources they are stealing (or be the subject of the same), but they can enjoy the things it brought to all of our lives; the wealth of the Empire which, like the mythical ages of Greece and Rome, and the giants that the Anglo-Saxon poet pondered over, only exists now as the faded, distorted memory of a faded, distorted memory. Like the 70s, like the 80s, like 2017, like yesterday, they are wonderful and terrible because they can never come again.