an alan smithee war

an annoying but perhaps necessary note; “Alan (or Allan, or Allen) Smithee” is a pseudonym used by Hollywood film directors when they wish to disown a project

Watch out, this starts off being insultingly elementary, but then gets complicated and probably contradictory, quite quickly.

Countries, States and religions are not monoliths and nor are they sentient. They don’t have feelings, aims, motivations or opinions. So whatever is happening in the Middle East isn’t ‘Judaism versus Islam’ or even ‘Israel versus Palestine’, any more than “the Troubles”* were/are ‘Protestantism versus Catholicism’ or ‘Britain versus Ireland’.

* a euphemism which, like most names for these things, is partly a method of avoiding blame – as we’ll see

Places and atrocities aren’t monoliths either; Srebrenica didn’t massacre anybody**, the Falkland Islands didn’t have a conflict, ‘the Gulf’ didn’t have any wars and neither did Vietnam or Korea. But somebody did. As with Kiefer Sutherland and Woman Wanted in 1999, or Michael Gottlieb and The Shrimp on the Barbie in 1990, or whoever it was that directed Gypsy Angels in 1980, nobody wants to claim these wars afterwards. But while those directors had the handy pseudonym Allan Smithee to use, there is no warmongering equivalent, and so what we get is geography, or flatly descriptive terms like ‘World War One’, which divert the focus from the aggressor(s), with only the occasional exception (the American War of Independence) even referencing the real point of the war. But, whether interfered with by the studio or not, Kevin Yagher did direct Hellraiser: Bloodline, just as certain individuals really are responsible for actions which are killing human beings as you read this. Language and the academic study of history will probably help to keep their names quiet as events turn from current affairs into history. Often this evasion happens for purely utilitarian reasons, perhaps even unknowingly, but sometimes it is more sinister.

** see?

As the 60s drew to their messy end, the great Terry “Geezer” Butler wrote lines which, despite the unfortunate repeat/rhyme of the first couplet, have a Blakean power and truth:

Generals gathered in their masses
Just like witches at black masses

Black Sabbath, War Pigs, 1970

There is something sinister and even uncanny in the workings of power, in the distance between the avowed and the underlying motivations behind military action. Power politics feels like it is – possibly because intuitively it seems like it should be – cold and logical, rather than human and emotional. It doesn’t take much consideration, though, to realise that even beneath the chilly, calculated actions of power blocs there are weird and strangely random human desires and opinions, often tied in with personal prestige, which somehow seems to that person to be more important than not killing people, or not having people killed.

Anyway, Geezer went on to say:

Politicians hide themselves away
They only started the war
Why should they go out to fight?
They leave that role to the poor

Still Black Sabbath, War Pigs (1970)

And that’s right too; but does that mean Butler’s ‘poor’ should take no responsibility at all for their actions? In the largest sense they are not to blame for war, or at least for the outbreak of war; and conscripts and draftees are clearly a different class again from those who choose to “go out to fight.” But. As so often, WW2 is perhaps the most extreme and therefore the easiest place to find examples; whatever his orders or reasons, the Nazi soldier (and there were lots of them) who shot a child and threw them in a pit, actually did shoot a child and throw them in a pit. His immediate superior may have done so too, but not to that particular child. And neither did Himmler or Adolf Hitler. Personal responsibility is an important thing, but responsibility, especially in war, isn’t just one act and one person. Between the originator and the architects of The Final Solution and the shooter of that one individual child there is a chain of people, any one of whom could have disrupted that chain and, even if only to a tiny degree, affected the outcome. And that tiny degree may have meant that that child, that human being, lived or died. A small thing in a death toll of something over 6 million people; unless you happen to be that person, or related to that person.

As with the naming of wars and atrocities, terms like “genocide” and “the Holocaust” are useful, especially if we want – as we clearly do – to have some kind of coherent, understandable narrative that can be taught and remembered as history. But in their grim way, these are still euphemisms. The term ‘the Holocaust’ memorialises the countless – actually not countless, but still, nearly 80 years later, being counted – victims of the Nazis’ programme of extermination. But the term also makes the Holocaust sound like an event, rather than a process spread out over the best part of a decade, requiring the participation of probably thousands of people who exercised – not without some form of coercion perhaps, but still – their free will in that participation. The Jewish scholar Hillel the Elder’s famous saying, ‘whosoever saves a life, it is as though he had saved the entire world’, is hard to argue with, insofar as the world only exists for us within our perceptions. Even the knowledge that it is a spinning lump of inorganic and organic matter in space, and that other people populate it who might see it differently, only exists in our perceptions. Or at least try to prove otherwise. And so the converse of Hillel’s saying – which is actually included in it but far less often quoted – is ‘Whosoever destroys one soul, it is as though he had destroyed the entire world.’ Which sounds like an argument for pacifism; but while pacifism is entirely viable and valuable on an individual basis as an exercise of one’s free will* – and on occasion has a real positive effect – one-sided pacifism relies on its opponents not taking a cynically Darwinian approach, which is hopeful at best. Pacifism can only really work if everyone is a pacifist, and everyone isn’t a pacifist.

*the lone pacifist can at least say, ‘these terrible things happened, but I took no part in them’, which is something, especially if they used what peaceful means they could to prevent those terrible things and didn’t unwittingly contribute to the sum total of suffering; but those are murky waters to wade in.

But complicated though it all is, people are to blame for things that happen. Just who to blame is more complicated – more complicated at least than the workable study of history can afford to admit. While countries and religions are useful as misleading, straw-man scapegoats, even the more manageable unit of a government is, on close examination, surprisingly hard to pin down. Whereas (the eternally handy example of) Hitler’s Nazi Party or Stalin’s Council of People’s Commissars routinely purged heretics, non-believers and dissidents, thus acting as a genuine, effective focus for their ideologies and therefore for blame and responsibility, most political parties allow for a certain amount of debate and flexibility, and therefore blame-deniability. Regardless, when a party delivers a policy, every member of that party is responsible for it, or should publicly dissociate themselves from it if they aren’t.

The great (indeed Sensational) Scottish singer Alex Harvey said a lot of perceptive things, not least: “[Something] I learned from studying history. Nobody ever won a war. A hundred thousand dead at Waterloo*. No glory in that. Nobody needs that.” Nobody ever won a war; but plenty of people, on both sides of every conflict, have lost one – and, as the simple existence of a second world war attests, many, many people have lost a peace too.

*Modern estimates put it at ‘only’ 11,000 dead, plus another 40,000 or so casualties; but his point stands

But the “causes” of war are at once easily traced and extremely slippery. Actions like the 1939 invasion of Poland by the armies of Germany and the USSR were, as military actions still are, the will of certain individuals, agreed to by other individuals and then acted upon accordingly. You may or may not agree with the actions of your government or the leaders of your faith. You may even have had some say in them, but in most cases you probably haven’t. Some of those dead on the fields of Waterloo were no doubt enthusiastic about their cause, some probably less so. But very few would have had much say in the decisions which took them to Belgium in the first place.

The buck should stop with every person responsible for wars, crimes, atrocities; but just because that’s obviously impossible to record – and even if it wasn’t, too complex to write in a simple narrative – that doesn’t mean the buck should simply not stop anywhere. History being written by the winners often means that guilt is assigned to the losers, but even when that seems fair enough (there really wouldn’t have been a World War Two without Hitler) it’s a simplification (there wouldn’t have been an effective Hitler without the assistance of German industrialists) and a one-sided one (it was a World War because most of the leading participants had already waged unprovoked wars of conquest). That was a long sentence. But does the disgusting history of Western colonialism, the arguably shameful treatment of Germany by the Allies after WW1 and the dubious nature of the allies and some of their actions make Hitler himself any less personally responsible for the war? And does Hitler’s own guilt make the soldier who shoots a child or unarmed adult civilians, or the airman who drops bombs on them, any less responsible for their own actions?

Again; only human beings do these things, so the least we can do is not act like they are some kind of unfathomable act of nature when we discuss them or name them. Here’s Alex Harvey again; “Whether you like it or not, anybody who’s involved in rock and roll is involved in politics. Anything that involves a big crowd of people listening to what you say is politics.” If rock and roll is politics, then actual politics is politics squared; and for as long as we settle, however grudgingly or complacently, for pyramidal power structures for our societies then the person at the top of that pyramid, enjoying its vistas and rarefied air should be the one to bear its most sombre responsibilities. But all who enable the pyramid to remain standing should accept their share of it too.

So when you’re helplessly watching something that seems like an unbelievable waste of people’s lives and abilities, pay close attention to who’s doing and saying what, even if you don’t want to, because the credits at the end probably won’t tell you who’s really responsible.

 

 

 

passive-digressive

There are two kinds of people* – those who like forewords, introductions, prefaces, author’s notes, footnotes, appendices, bibliographies, notes on the text, maps etc, and those who don’t. But we’ll get back to that shortly.

* there are more than two kinds of people. Possibly infinite kinds of people. Or maybe there’s only one kind; I’m never sure

A few times recently, I’ve come across the idea (which I think is mainly an American academic one, but I might be completely mistaken about that) that parentheses should only be used when you really have to (but when do you really have to?) because anything that is surplus to the requirements of the main thrust of one’s text is surplus to requirements full stop, and should be left out. But that’s wrong. The criticism can be and is extended to anything that interrupts the flow* of the writing. But that is also wrong. Unless you happen to be writing a manual or a set of directions or instructions, writing isn’t (or needn’t be) a purely utilitarian pursuit and the joy of reading (or of writing) isn’t in how quickly or efficiently (whatever that means in this context) you can do it. Aside from technical writing, the obvious example where economy just may be valuable is poetry – which however is different and should probably have been included in a footnote, because footnotes are useful for interrupting text without separating the point you’re making (in a minute) from the point you’re commenting on or adding to (a few sentences ago), without other, different stuff getting in the way.

*like this¹
¹but bear in mind that people don’t write footnotes by accident – the interruption is deliberate²
²and sometimes funny

Poly-Olbion – that’s how you write a title page to pull in the readers

I would argue (though the evidence of a lot of poetry itself perhaps argues against me – especially the Spenser’s Faerie Queene, Michael Drayton’s Poly-Olbion kind of poetry that I’m quite fond of) that a poem should be** the most economical or at least the most effective way of saying what you have to say – but who’s to say that economical and effective are the same thing anyway?

** poets, ignore this; there is no should be

 

 

 

Clearly (yep), the above is a needlessly convoluted way of writing, and can be soul-achingly annoying to read; but – not that this is an effective defence – I do it on purpose. As anyone who’s read much here before will know, George Orwell is one of my all-time favourite writers, and people love to quote his six rules for writing, but while I would certainly follow them if writing a news story or article where brevity is crucial, otherwise I think it’s more sensible to pick and choose. So;

Never use a metaphor, simile, or other figure of speech which you are used to seeing in print. Absolutely; although sometimes you would use them because they are familiar, if making a specific point, or being amusing. Most people, myself included, just do it by accident; because where does the dividing line fall? In this paragraph I have used “by accident” and “dividing line” which seem close to being commonly used figures of speech (but then so does “figure of speech”). But would “accidentally” or something like “do it without thinking” be better than “by accident?” Maybe.

Never use a long word where a short one will do. The key point here is will do. In any instance where a writer uses (for example) the word “minuscule” then “small” or “tiny” would probably “do”. But depending on what it is they are writing about, minuscule or microscopic might “do” even better. Go with the best word, not necessarily the shortest.

If it is possible to cut a word out, always cut it out. Note that Orwell wrote ‘always’ here where he could just have said If it is possible to cut a word out, cut it out. Not everything is a haiku, George.

Never use the passive where you can use the active. Surely it depends what you’re writing? If you are trying, for instance, to pass the blame for an assault from a criminal on to their victim, you might want a headline that says “X stabbed after drug and alcohol binge” rather than “Celebrity kills X.” You kind of see Orwell’s point though.

Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent. Both agree and disagree; as a mostly monolingual person I agree, but some words and phrases (ironically, usually ones in French, a language I have never learned and feel uncomfortable trying to pronounce; raison d’être or enfant terrible for example) just say things more quickly and easily (I can be utilitarian too) than having to really consider and take the time to say what you mean. They are a shorthand that people in general understand. Plus, in the age of smartphones, it really doesn’t do native English speakers any harm to have to look up the meanings of foreign words occasionally (I do this a lot). The other side of the coin (a phrase I’m used to seeing in print) is that with foreign phrases it’s funny to say them in bad translations like “the Tour of France” (which I guess must be correct) or “piece of resistance” (which I am pretty sure isn’t), so as long as you are understood (assuming that you want to be understood) use them any way you like.

Break any of these rules sooner than say anything outright barbarous. It’s hard to guess what George Orwell would have considered outright barbarous (and anyway, couldn’t he have cut “outright”?) but anyone reading books from even 30, 50 or a hundred years ago quickly sees that language evolves along with culture, so that rules – even useful ones – rarely have the permanence of commandments.

So much for Orwell’s rules; I was more heartened to find that something I’ve instinctively done – or not done – is supported by Orwell elsewhere. That is, that I prefer, mostly in the name of cringe-avoidance, not to use slang that post-dates my own youth. Even terms that have become part of normal mainstream usage (the most recent one is probably “woke”) tend to appear with inverted commas if I feel like I must use them, because if it’s not something I would be happy to say out loud (I say “woke” with inverted commas too) then I’d prefer not to write it. There is no very logical reason for this, and words that I do comfortably use are no less subject to the whims of fashion, but still; the language you use is part of who you are, and I think Orwell makes a very good case here (fuller version far below somewhere, because even though I have reservations about parts of it, it ends very well):

“Each generation imagines itself to be more intelligent than the one that went before it, and wiser than the one that comes after it. This is an illusion, and one should recognise it as such, but one ought also to stick to one’s world-view, even at the price of seeming old-fashioned: for that world-view springs out of experiences that the younger generation has not had, and to abandon it is to kill one’s intellectual roots.”

Review of A Coat of Many Colours: Occasional Essays by Herbert Read (1945), in The Collected Essays, Journalism and Letters of George Orwell, Volume 4, Penguin, 1968, p. 72

the fold-out map in The Silmarillion is a thing of beauty

Back to those two kinds* of people: I am the kind of person that likes and reads forewords, introductions, prefaces, author’s notes, footnotes, appendices, bibliographies, notes on the text, maps and all of those extras that make a book more interesting/informative/tedious.

 

*I know.

 

In one of my favourite films, Whit Stillman’s Metropolitan (1990), the protagonist Tom Townsend (Edward Clements) says “I don’t read novels. I prefer good literary criticism. That way you get both the novelists’ ideas as well as the critics’ thinking. With fiction I can never forget that none of it really happened, that it’s all just made up by the author.” Well, that is not me; but I do love a good bit of criticism and analysis as well as a good novel. One of my favourite ever pieces of writing of any kind, which I could, but choose not to, recite parts of by heart, is the late Anne Barton’s introduction to the 1980 New Penguin Shakespeare edition of Hamlet*. I love Hamlet, but I’ve read Barton’s introduction many more times than I’ve read the play itself, to the point where phrases and passages have become part of my mind’s furniture. It’s a fascinating piece of writing, because Professor Barton had a fascinating range and depth of knowledge, as well as a passion for her subject; but also and most importantly because she was an excellent writer. If someone is a good enough writer**, you don’t even have to be especially interested in the subject to enjoy what they write. Beyond the introduction/footnote, but related in a way, are the review and the essay. Another of my favourite books – mentioned elsewhere I’m sure, as it’s one of the reasons that I have been working as a music writer for the past decade and a half – is Charles Shaar Murray’s Shots from the Hip, a collection of articles and reviews. The relevant point here is that more than half of its articles – including some of my favourites – are about musicians whose work I’m quite keen never to hear under any circumstances, if humanly possible. Similarly, though I find it harder to read Martin Amis’s novels than I used to (just changing taste, not because I think they are less good), I love the collections of his articles, especially The War Against Cliché and Visiting Mrs Nabokov.
I already go on about Orwell too much, but as I must have said somewhere, though I am a fan of his novels, it’s the journalism and criticism that he probably thought of as ephemeral that appeals to me the most.

*All of the New Penguin Shakespeare introductions that I’ve read have been good, but that one is in a different league. John Dover Wilson’s What Happens in Hamlet (1935, though the edition I have mentions WW2 in the introduction, as I remember; I like the introduction) is sometimes easy to disagree with, but it has a similar excitement-of-discovery tone to Anne Barton’s essay

** Good enough, schmood enough; what I really mean is if you like their writing enough. The world has always been full of good writers whose work leaves me cold

a scholarly approach to comics

All this may have started, as I now realise lots of things in my writing seem to have done, with Tolkien. From the first time I read his books myself, I loved that whatever part of Middle-Earth and its people you were interested in, there was always more to find out. Appendices, maps, whole books like The Silmarillion which extended the enjoyment and deepened the immersion in Tolkien’s imaginary world. And they were central to that world – for Tolkien, mapping Middle-Earth was less making stuff up than it was a detailed exploration of something he had already at least half imagined. Maybe because I always wanted to be a writer myself – and here I am, writing – whenever I’ve really connected with a book, I’ve always wanted to know more. I’ve always been curious about the writer, the background, the process. I’ve mentioned Tintin lots of times in the past too, and my favourite Tintin books were, inevitably, the expanded editions which included Hergé’s sketches and ideas, the pictures and objects and texts that inspired him. I first got one of those Tintin books when I was 9 or so, but as recently as the last few years I bought an in many ways similar expanded edition of one of my favourite books as an adult, JG Ballard’s Crash. It mirrors the Tintins pretty closely; explanatory essays, sketches, notes, ephemera, all kinds of related material. Now just imagine how amazing a graphic novel of Crash in the Belgian ligne claire style would be.*

*a bit like Frank Miller and Geof Darrow’s fantastic-looking but not all that memorable Hard Boiled (1990-92) I guess, only with fewer robots-with-guns shenanigans and more Elizabeth Taylor

a scholarly approach to cautionary 1970s semi-pornography/horror: the expanded Crash

A good introduction or foreword is (I think) important for a collection of poems or a historical text of whatever kind. Background and context and, to a lesser extent, analysis expand the understanding and enjoyment of those kinds of things. An introduction for a modern novel though is a slightly different thing and different also from explanatory notes, appendices and footnotes and it’s probably not by chance that they mainly appear in translations or reprints of books that already enjoyed some kind of zeitgeisty success. When I first read Anne Barton’s introduction to Hamlet, I already knew what Hamlet was about, more or less. And while I don’t think “spoilers” are too much of an issue with fiction (except for whodunnits, which I have so far not managed to enjoy), do you really want to be told what to think of a book before you read it? But a really good introduction will never tell you that. If in doubt, read them afterwards!

Some authors, and many readers, see all of these extraneous things as excess baggage, surplus to requirements, which obviously they really are, and that’s fair enough. If the main text of a novel, a play or whatever, can’t stand on its own then no amount of post-production scaffolding will make it satisfactory.* And presumably, many readers pass their entire lives without finding out or caring why the author wrote what they wrote, or what a book’s place in the pantheon of literature (or just “books”) is. Even as unassailably best-selling an author as Stephen King tends to be a little apologetic about the author’s notes that end so many of his books, despite the fact that nobody who doesn’t read them will ever know that he’s apologetic. Still; I for one would like to assure his publisher that should they ever decide to put together all of those notes, introductions and prefaces in book form, I’ll buy it. But would Stephen King be tempted to write an introduction for it?

 

* though of course it could still be interesting, like Kafka’s Amerika, Jane Austen’s Sanditon or Tolkien and Hergé (them again) with Unfinished Tales or Tintin and Alph-Art

 

That Orwell passage in full(er):

“Clearly the young and middle aged ought to try to appreciate one another. But one ought also to recognise that one’s aesthetic judgement is only fully valid between fairly well-defined dates. Not to admit this is to throw away the advantage that one derives from being born into one’s own particular time. Among people now alive there are two very sharp dividing lines. One is between those who can and can’t remember the period before 1914; the other is between those who were adults before 1933 and those who were not.* Other things being equal, who is likely to have a truer vision at the moment, a person of twenty or a person of fifty? One can’t say, though on some points posterity may decide. Each generation imagines itself to be more intelligent than the one that went before it, and wiser than the one that comes after it. This is an illusion, and one should recognise it as such, but one ought also to stick to one’s world-view, even at the price of seeming old-fashioned: for that world-view springs out of experiences that the younger generation has not had, and to abandon it is to kill one’s intellectual roots.”

*nowadays, the people who can or can’t remember life before the internet and those who were adults before 9/11? Or the Trump presidency? Something like that seems right

 

 

a conflict of ghosts

 

2019 is (to me at least) one of those times when the zeitgeist feels like an actual entity, less the ‘spirit of the age’ and more an actual ‘time ghost’, a baleful Lovecraftian presence whose unseen influence poisons the atmosphere of the era, insidiously affecting the minds of influential people.

A silly conceit perhaps (although few ancient civilisations would have thought so), but a handy one; great swathes of history can be explained by it; ages of empire and revolution and war and faith and enlightenment and (ambiguous word) “progress” of various kinds.

Looked at as a succession of identifiable ages, the idea of zeitgeist (as entity, or in the usual usage) has pluses and minuses. On the one hand it gives us history in a usefully linear, easy-to-summarise/teach/learn kind of way, (too) neatly summarising otherwise amorphous stretches of time. On the other, it removes to an extent the sense of individual and group responsibility at the heart of all human activity and ventures.

This is almost fair, insofar as asking people to act other than as products of their time and environment is pointless; mostly it’s unfair though, since, whatever time people come from, ideas of good/bad (extreme ones anyway) remain somewhat static: people generally do know when they are acting badly. But then again, one has to admit that even rational and enlightened human beings can be counted on to do irrational things like firing missiles at people who they don’t know and have no personal disagreement with, or voting for political parties which it is not in their own interest to have in power, or protesting by destroying the neighbourhoods they live in, when logic would dictate that they should attack those of the people who cause their woes etc etc. Being swept up in the zeitgeist is a thing, and in a way the proof that it is, is that it can be hard to justify afterwards.

Currently, being drunk on bigotry and self-interest seems to be what the zeitgeist desires. The hangover from this kind of a binge we already know; bulldozing piles of bodies into pits and swearing it’ll never happen again. Only the next time, we (or they, depending on how events play out) may have to dirty ourselves (or themselves) by doing the ‘bulldozing’ by hand, since ignoring ecological disaster in favour of increased profit (as I write, commercial whaling has been resumed after a thirty-year cessation) is part of the whole bigotry/self-interest worldview.

In the UK, the two main political parties – theoretically irreconcilably different in almost every respect – are facing what, however it works out, is one of the biggest political challenges since World War Two (I mean Brexit; I suppose I’d better name it for reasons of clarity, much as I hate to) in exactly the same way. Not – as might be expected (or reasonably, demanded) – by taking steps to prevent the problems that will inevitably arise, or even (as might be reassuring, if perhaps comical) by plotting some utopian alternative Britain which will blossom in the aftermath of the upheaval, but instead by wringing their hands over the future of the parties themselves in the aftermath of the divisiveness they have helped to fuel, or at best not tried to heal. Oh well.

In 1826, William Hazlitt wrote (not in The Spirit of the Age, though that would have been neater):

…hating, like a poisonous mineral, eats into the heart of religion, and turns it to rankling spleen and bigotry; it makes patriotism an excuse for carrying fire, pestilence and famine into other lands: it leaves to virtue nothing but the spirit of censoriousness, and a narrow, jealous, inquisitorial watchfulness over the actions and motives of others.

On The Pleasure of Hating, from The Plain Speaker (1826), in Selected Writings, Penguin Classics, 1982, pp. 400-1

The extent to which this is still a demonstrably true and relevant statement is depressing, suggesting that while ages may each have their own spirit, the ghost at the heart of them is humanity itself. Like businessmen (and they usually are men) polluting their own land and rioters destroying their own neighbourhoods, it suggests that, if catastrophe comes, it will be human nature that facilitates it, while at every stage offering apparently valid reasons for doing so; as Hazlitt also noted, ‘Reason, with most people, means their own opinion’ (ibid., p. 439)*

*he wrote ‘It is always easier to quote an authority than to carry on a chain of reasoning’ (ibid., p. 449) too, which is perhaps even more relevant here, as I do it

Having said all that, although “the” zeitgeist is talked and written about, there never is only one spirit of any age. Against Adam Smith’s definitive statements of the Scottish Enlightenment like ‘Science is the great antidote to the poison of enthusiasm and superstition’ (The Wealth of Nations, 1776), you have to set Byron’s memories of childhood in Aberdeenshire at the end of that same century: “I remember a Methodist preacher who on perceiving a profane grin on the faces of part of his congregation – exclaimed ‘no hopes for them as laughs.’”
(Lord Byron: Selected Letters & Journals, ed. Leslie A. Marchand, Pimlico, 1993, p. 352)

British life in the 1930s

Two of my favourite books, Cyril Connolly’s The Rock Pool and George Orwell’s The Road To Wigan Pier, were published a year apart from each other (in 1936 and 1937 respectively; more about the former here), by people who were not only contemporaries, but who knew each other and went to school together; a narrow focus you’d think, but they perfectly exemplify very different currents in European society of the time. Which brings up the question (because I’m bringing it up) of hierarchies of zeitgeist. The Great Depression and the conditions of working-class people (Orwell), and the dying years of ‘jazz age’ decadence and the ennui of the moneyed class (Connolly) are almost opposites, but both were to fuel the coming war; are these two zeitgeists or one? The mass of unemployed or poverty-stricken working classes for whom the Depression meant starvation and the need for change in order to survive, and the differently disaffected upper class, products of and heirs to decaying empires, but with little desire to deal with the running of them in the aftermath of the seemingly hollow victory (or disastrous defeat) of World War One, are the yin and yang of interwar Europe; but are yin and yang one entity, or two? (both, inevitably)

Closer to our own time, what could be more 80s than yuppie culture, racism, Thatcherism and Reaganomics? But also, what could be more 80s than “alternative comedy”, Rock Against Racism and the miners’ strike? In the early 90s, rave culture peaked around the same time as Guns ‘n’ Roses (a disappointingly sturdy beast, as it turned out); zeitgeist lore would have you believe that a pincer movement of dance music and Nirvana’s Nevermind swept away cheesy trad rock and its stylings, but in fact “Slash” was miming a solo on an unplugged Les Paul in the desert in the video to a hit single just months after Smells Like Teen Spirit had apparently rendered such things obsolete. So it goes; Mull Of Kintyre was the song that topped the charts as the year of punk came to an end, for Christ’s sake. As with empires and revolutions, eras of whatever kind are rarely as neat as we’d like them to be retrospectively; and I say that as someone who owned, without any feeling of incongruity, albums by Nirvana and Guns ‘n’ Roses and The Shamen.

in 2019, 80s nostalgia is at an all-time (or time to date) high; but, even in the western world, there was more than one 1980s

But away (partly) from music, the ways in which apparently opposed forces come together to define an era are always fascinating to look at. When they are violently opposed, as in the case of something like the hippies putting flowers in guns and then being shot at Kent State in 1970, it’s pretty black and white. Whether or not you think the hippies were ‘the good guys’, shooting unarmed protesters will always make you ‘the bad guys’. The two sides of the conflict were clear. On the other hand, once you remove the life-and-death struggle, things become more ambiguous. To cite a trivial example: the founding of the extremely successful label Earache Records in 1985 as part of a government-sponsored enterprise scheme (essentially rebranding unemployed teenagers as entrepreneurs) is often celebrated as a kind of ironic victory of the anarcho-punk-crusty underground over nasty old Thatcherism – label founder Digby Pearson:

“… in the 80s, when you were unemployed in the UK, you had to go to visit the unemployment office every two weeks, and I didn’t fancy doing that. If you start a company, you get the same amount of money and you don’t have to visit the unemployment office every two weeks. You’re not unemployed anymore, so it’s a method for the government to reduce the unemployment figures…They didn’t care what business you did, as long as you did something… it was an excuse to say ‘Wow! I’m a record company!’ But the truth is I had no plans, nothing really.”
quoted in Albert Mudrian, Choosing Death – The Improbable History of Death Metal and Grindcore, Feral House, 2004, p.121

Much as one applauds any victory over Thatcherism, isn’t the success of Earache Records (going strong over 30 years later, with offices in London and New York), for all its rebellious, anti-Thatcher stance, just what the government wanted to happen? Doesn’t it kind of prove that, in this one specific instance, Thatcherism kind of worked? Bleh. A silly segue, but it makes me think of this achingly ironic note from Breaking Free (1989) by “J. Daniels” – a very entertaining revolutionary socialist (or perhaps more precisely, anarcho-syndicalist or some such thing) Tintin book in which Tintin and Captain Haddock help to bring down western capitalism.

Breaking Free: “we have copyrighted Tintin” – good luck with that

Apologies for abruptly bringing optimism into what has so far been apocalyptically downbeat, but the point here, if there is one, is that people can, and retrospectively do, choose the zeitgeist they prefer (the changing critical fortunes of pop stars are always very interesting to observe – the world is full of “the kind of people who had to wait until 1968, when it became chic to say that Brian Wilson was a genius, before they could admit that they liked The Beach Boys”*) – so why not do it now, and in doing so strengthen the spirit itself? Against Trump, Farage, rigid political ideology and religious dogma you have to set Greta Thunberg, Katrín Jakobsdóttir, David Attenborough, Bonnie Greer, Alexandria Ocasio-Cortez, David Lammy, Stormzy, Carole Cadwalladr and really, so many more; this is a random, pulled-out-of-the-air list, in no way meant as definitive or even representative.

*Charles Shaar Murray in Cream magazine, 1972, from Shots From The Hip, Penguin Books, 1991, p.16

revolutionary Tintin

The current, sunnily optimistic issue of the alumni magazine of my alma mater (well, why not? I’ve never written that phrase before!) pleased me – because if populism and intolerance are ‘the zeitgeist’, then so is this – and what’s more, it is the future too. It’s hard to think of a more conservative (in the tradition-bound sense) institution than the University of St Andrews, but even aside from the cover story (Internationally Scottish; an exhibition celebrating diversity), the magazine regularly celebrates its award-winning graduates from all over the world, the globally important research undertaken at the university and, on a more intimate level, has a news column recording marriages and civil partnerships of its alumni; that is, a hugely diverse mix of people from a multiplicity of backgrounds, doing a range of things. It celebrates diversity (I have to admit that phrase is irksome, though) – just like movies and TV shows and commercials and shops and organisations now do – not because such things as internationalism and civil partnerships are either ‘politically correct’ or daringly edgy, or because it’s somehow forced on them (by whom, anyway?), but because it’s good business; because it’s society, it’s people, and what people do, how they live and what they want. When people stop being diverse, this will stop happening. And the point is that people always have been diverse, but the people in charge have not. But they are starting to be.

15th century university in the 21st century

Looking at the bigger picture, it quickly becomes clear that all this apparently endless Brexit/Trump reactionary nonsense is just the foamy-mouthed dying throes of old ways of life, ways which, despite the constant yammering about elites and freedom, were established by people with an inflated sense of their own importance and exceptionalism (and/or that of whatever they identify with; nation, gender, ethnicity; the usual suspects) and an interest in a version of freedom which only means their own freedom to do whatever they want to do without interference.

That’s not to say that the dying throes of outworn cultures are harmless (see WW1, for instance), and I’m not naive enough to say all will be well; but the wave of reactionary negativity is doomed, because ultimately people don’t want authoritarianism unless they happen to be the ones in positions of authority, and because people who have grown up and lived in relative freedom will not have it easily taken away; I hope, anyway. In history there are very few analogues to the present time, which is probably why the geist of the Weimar Republic hovers so ominously.

Despite the current state of world and British politics, in most important ways, more things are probably better for more people – certainly in the western world (not, I realise, a minor caveat) – in 2019 than they were in, say, 1989 – and the bits that are worse are fixable, given the political will to fix them (always a problem, admittedly; and more and more I feel that will will have to be forced upwards from ‘ordinary’ people).

But while looking forward, it’s instructive to look at what it actually is that people are nostalgic about. Yes, there are those who yearn for times when they could do whatever they wanted because of the class/country/whatever they came from, but there are also things like the wartime spirit, or the solidarity of the mining communities before Thatcher destroyed them. No-one wants to be bombed, and few if any people actually enjoyed working in coal mines – what people generally miss is the sense of community that arises in adversity.* The thing to do, then, is to try to create the missing sense of community without having to experience the adversity. And people are doing exactly those kinds of things: community projects, ecological movements, local groups, international organisations. Imagine the progress – in the sense of good things for the future of the world – that could be made if people tried to humanise entities like the EU, rather than breaking them apart or divorcing from them or viewing them as first and foremost business ventures; if hate groups are on the rise (and they always seem to be), then more positive movements are flourishing too. Personally, although I think it’s great, I don’t really feel comfortable belonging to things, but I’m glad other people want to. But like the ever more arcane (and ever more necessary) rules about recycling and plastic usage, I’ll get used to it. We can still be okay in the end, if we want to. This wasn’t what I started out to write, but it’s a nice note to end on.


*Side note: it can be shocking for someone of my generation to realise the extent to which shared experience – already very much in decline in the 70s and 80s – has changed and all but disappeared. To take a very trivial example: if you were at school in the UK in the 80s, and if your family was the sort where the TV was on in the evening, you could pretty much guarantee that you and almost everyone you knew would be watching one of 4 (or even 3) shows at any given time. Not only did you as a child know what was in the top 10 (possibly most kids still know that) but, thanks to Top of the Pops, your parents did too, and possibly even your grandparents, if you had such things. I’m not saying it was better, but it was substantially different, and it seems (to me) that what we have in place of that kind of boring, take-it-for-granted shared experience now is similar but utterly different: instant familiarity – ‘re-imaginings’, reboots, remakes, new songs that sound like old songs (I recently heard a hit song that blatantly “borrows” the melody of its verses from Dolly Parton’s Jolene, and another which lifts the chord sequence of Every Breath You Take by The Police; these are not obscure reference points, but nor are they acknowledged as pastiches or homages, or credited as samples are). Familiarity, however much contempt it’s supposed to breed, is apparently comforting, or at least saleable.