a dream itself is but a shadow

Sigmund Freud c.1885, with Freudian facial hair

Whether or not you agree with Sigmund Freud that “the dream proves to be the first of a series of abnormal psychic formations” or that “one who cannot explain the origin of the dream pictures will strive in vain to understand [the] phobias, the obsessive and delusional ideas and likewise their therapeutic importance” (The Interpretation of Dreams, 1913, translated by A.A. Brill), dreams are a regular, if not daily/nightly, part of human life regardless of culture, language, age etc, and so not without significance. I could go on about dreams like I did about honey in a previous post but I won’t – they are just too pervasive in popular culture, everywhere in books and plays and art and films and songs (Dreams they complicate/complement my life, as Michael Stipe wrote.) That’s enough of that.

But what about daydreams? If dreams arrive uninvited from the unconscious or subconscious mind, then surely the things we think about, or dwell on, deliberately are even more important. “Dwell on” is an interesting phrase – to dwell is “to live in a place or in a particular way” BUT ‘dwell’ has a fascinating history that makes it seem like exactly the right word in this situation – from the Old English dwellan “to lead into error, deceive, mislead,” related to dwelian “to be led into error, go wrong in belief or judgment” etc, etc, according to etymonline.com: I’ll put the whole of this in a footnote* because I think it’s fascinating, but the key point is that at some time in the medieval period its largely negative connotations, “to delay”, became modified to mean “to stay.” But I like to think the old meaning of the word lingers in the subtext like dreams in the subconscious.

But I could say something similar about my own use of the word “deliberately” above (“Things we dwell on deliberately”) and even more so the phrase I nearly used instead, which was “on purpose” – but then this would become a ridiculously long and convoluted piece of writing, so that’s enough etymology for now.

The human mind is a powerful thing, even for those of us who don’t believe in telekinesis or remote viewing or ‘psychic powers’ in the explicitly paranormal sense. After all, your mind controls everything you think and nearly everything you do, to the point where separating the mind from the body, as western culture tends to do, becomes almost untenable. If the euphemism “unresponsive wakefulness syndrome” has gained some traction in recent years, that’s because the dysphemism (I had to look that word up) “persistent vegetative state” is something we fear; that loss of self, or of humanity, offends us. It’s preferable for most of us, as fiction frequently demonstrates, to believe that even in that state, dreams of some kind continue in the mind; because as human beings we are fully our mind in a way that we are only occasionally fully our body. One of the fears connected to the loss of self is that we lose the ability to choose what to think about, which is intriguing because that takes us again into the (might as well use the pompous word) realm of dreams.

The Danish actress Asta Nielsen as Hamlet in 1921

My favourite Shakespeare quote is the last line from this scene in Hamlet (Act 2 Scene 2):

Hamlet: Denmark’s a prison.
Rosencrantz: Then is the world one.
Hamlet: A goodly one, in which there are many confines, wards, and
dungeons, Denmark being one o’ th’ worst.
Rosencrantz: We think not so, my lord.
Hamlet: Why then ’tis none to you; for there is nothing either good or
bad, but thinking makes it so. To me it is a prison.

“There is nothing either good or bad, but thinking makes it so” seems to deny any possibility of objective morality, but its logic is undeniable. After all, you or I may think that [insert one of thousands of examples from current politics and world events] is ‘wrong’, but if [individual in position of power] doesn’t think so, and does the wrong thing, even if all of the worst possible outcomes stem from it, the most you can say is that you, and people who agree with you, think it was wrong. Hitler almost certainly believed, as he went to the grave, that he was a martyr who had failed in his grand plan only because of the betrayal and duplicity of others. I think that’s wrong, you hopefully think that’s wrong, even “history” thinks that’s wrong, but none of that matters to Hitler in his bunker in 1945, any more than Rosencrantz & Guildenstern finding Denmark to be a nice place if only their old friend Hamlet could regain his usual good humour makes any difference to Hamlet.

Anyway, daydreams or reveries (a nice word that feels a bit pretentious to say); the dictionary definitions are mostly very positive – a series of pleasant thoughts about something you would prefer to be doing or something you would like to achieve in the future.
A state of abstracted musing.
A loose or irregular train of thought occurring in musing or meditation; deep musing – and there’s a school of thought that has been around for a long time but seems even more prevalent today, which values daydreams not just as idle thoughts, but as affirmations. Anyone who has tried to change their life through hypnosis or various kinds of therapy will find that daydreaming and visualising are supposed to be important aspects of your journey to a better you. In a way all of these self-help gurus, lifestyle coaches and therapists are saying the same thing; as Oscar Hammerstein put it, “You got to have a dream,
If you don’t have a dream, How you gonna have a dream come true?” But is making your dreams, even your daydreams, come true necessarily a good thing?

patriotic vapour trails spotted this winter

I seem to remember once reading that if you can focus all of your attention on something for 15 seconds you’ll remember it forever (not sure about the duration; if you Google stuff like this you find there are millions of people offering strategies to improve your memory, which isn’t quite what I was looking for). Whether or not that’s true, every time I see a vapour trail in an otherwise blue sky, I have the same thought/image – actually two thoughts, but “first you look so strong/then you fade away” came later and failed to replace the earlier thought, which must date from the age of 9 or 10 or so. I realise that people telling you their dreams is boring (or so people say; I never find it to be so), but you don’t have to keep reading. I can see the fluffy, white trail against the hot pale blue sky (it’s summer, the sun is incandescent and there are no clouds) and as my eye follows it from its fraying, fading tail to its source, I can see the nose cone of the plane glinting in the sun, black or red and metallic. It looks slow, leisurely even, but the object is travelling at hundreds of miles an hour. I know there’s no pilot inside that warm, clean shell (I can imagine feeling its heat, like putting your hand on the bonnet or roof of a car parked in direct sunshine; only there are rivets studding the surface of this machine). I’m shading my eyes with my hand, watching its somehow benign-looking progress, but I know that it’s on its way to the nearby air force base and that others are simultaneously flying towards other bases and major cities and soon, everything I can see and feel will be vaporised and cease to exist. I had this daydream many times as a child; I have no idea how long it lasted, but I can remember the clarity and metallic taste of it incredibly clearly. Did I want it to happen? Definitely not. Was I scared? No, although I remember an almost physical sense of shaking it off afterwards. Did I think it would happen? It’s hard to remember, maybe – but I wouldn’t have been alone in that if so. But anyway, the interesting point to me is that this wasn’t a dream that required sleep or the surrender of the conscious mind to the unconscious – I was presumably doing it “on purpose”, although what that purpose was I have no idea; nothing very nice anyway.

Childhood hero: Charles M Schulz’s Charlie Brown

Probably most of us carry a few daydreams around with us, most I’m sure far more pleasant than that one. I can remember a few from my adolescence that were almost tangible then and still feel that way now (I would swear that I can remember what a particular person’s cheek felt like against my fingertips, though I definitely didn’t ever touch it. As my childhood role model Charlie Brown would say, “Augh!”). Charles M. Schulz clearly knew about these things and still felt them vividly as an adult (as, more problematically, did Egon Schiele, subject of my previous article; but let’s not go into that). Most of the daydreams we keep with us into adulthood (or create in adulthood) are probably nicer baggage to carry around than the vapour trail one, unless you’re one of those people who fantasises about smashing people’s heads in with an iron bar (who has such a thing as an iron bar? Why iron? Wouldn’t brass do the job just as well and lead even better?) beyond the teenage years when violent daydreams are almost inevitable, but hopefully fleeting.

But thinking about your daydreams is odd; they are, like your thoughts and dreams, yours and nobody else’s, but where they come from, in their detail, seems almost as obscure as dream-dreams. Perhaps Freud would know. I have a couple of daydreams that have been lurking around for decades, but while I don’t believe in telekinesis or even the current obsession with affirmations and ‘manifesting,’ apparently I must be a bit superstitious; because if I wrote them down they might not come true, innit?


*Old English dwellan “to lead into error, deceive, mislead,” related to dwelian “to be led into error, go wrong in belief or judgment,” from Proto-Germanic *dwaljana “to delay, hesitate,” *dwelana “go astray” (source also of Old Norse dvelja “to retard, delay,” Danish dvæle “to linger, dwell,” Swedish dväljas “to dwell, reside;” Middle Dutch dwellen “to stun, perplex;” Old High German twellen “to hinder, delay”) from PIE *dhwel-, extended form of root *dheu- (1) “dust, cloud, vapor, smoke” (also forming words with the related notions of “defective perception or wits”).

The apparent sense evolution in Middle English was through “to procrastinate, delay, be tardy in coming” (late 12c.), to “linger, remain, stay, sojourn,” to “make a home, abide as a permanent resident” (mid-14c.). From late 14c. as “remain (in a certain condition or status),” as in phrase dwell upon “keep the attention fixed on.” Related: Dwelled; dwelt (for which see went); dwells.

It had a noun form in Old English, gedweola “error, heresy, madness.” Also compare Middle English dwale “deception, trickery,” from Old English dwala or from a Scandinavian cognate (such as Danish dvale “trance, stupor, stupefaction”); dwale survived into late Middle English as “a sleeping potion, narcotic drink, deadly nightshade.”

passive-digressive

There are two kinds of people* – those who like forewords, introductions, prefaces, author’s notes, footnotes, appendices, bibliographies, notes on the text, maps etc, and those who don’t. But we’ll get back to that shortly.

* there are more than two kinds of people. Possibly infinite kinds of people. Or maybe there’s only one kind; I’m never sure

A few times recently, I’ve come across the idea (which I think is mainly an American academic one, but I might be completely mistaken about that) that parentheses should only be used when you really have to (but when do you really have to?) because anything that is surplus to the requirements of the main thrust of one’s text is surplus to requirements full stop, and should be left out. But that’s wrong. The criticism can be and is extended to anything that interrupts the flow* of the writing. But that is also wrong. Unless you happen to be writing a manual or a set of directions or instructions, writing isn’t (or needn’t be) a purely utilitarian pursuit and the joy of reading (or of writing) isn’t in how quickly or efficiently (whatever that means in this context) you can do it. Aside from technical writing, the obvious example where economy just may be valuable is poetry – which however is different and should probably have been included in a footnote, because footnotes are useful for interrupting text without separating the point you’re making (in a minute) from the point you’re commenting on or adding to (a few sentences ago), without other, different stuff getting in the way.

*like this¹
¹but bear in mind that people don’t write footnotes by accident – the interruption is deliberate²
²and sometimes funny

Poly-Olbion – that’s how you write a title page to pull in the readers

I would argue (though the evidence of a lot of poetry itself perhaps argues against me – especially the Spenser’s Faerie Queene, Michael Drayton’s Poly-Olbion kind of poetry that I’m quite fond of) that a poem should be** the most economical or at least the most effective way of saying what you have to say – but who’s to say that economical and effective are the same thing anyway?

** poets, ignore this; there is no should be


Clearly (yep), the above is a needlessly convoluted way of writing, and can be soul-achingly annoying to read; but – not that this is an effective defence – I do it on purpose. As anyone who’s read much here before will know, George Orwell is one of my all-time favourite writers, and people love to quote his six rules for writing. While I would certainly follow them if writing a news story or article where brevity is crucial, otherwise I think it’s more sensible to pick and choose. So:

Never use a metaphor, simile, or other figure of speech which you are used to seeing in print. Absolutely; although sometimes you would use them because they are familiar, if making a specific point, or being amusing. Most people, myself included, just do it by accident; because where does the dividing line fall? In this paragraph I have used “by accident” and “dividing line” which seem close to being commonly used figures of speech (but then so does “figure of speech”). But would “accidentally” or something like “do it without thinking” be better than “by accident?” Maybe.

Never use a long word where a short one will do. The key point here is will do. In any instance where a writer uses (for example) the word “minuscule” then “small” or “tiny” would probably “do”. But depending on what it is they are writing about, minuscule or microscopic might “do” even better. Go with the best word, not necessarily the shortest.

If it is possible to cut a word out, always cut it out. Note that Orwell wrote ‘always’ here where he could just have said If it is possible to cut a word out, cut it out. Not everything is a haiku, George.

Never use the passive where you can use the active. Surely it depends what you’re writing? If you are trying, for instance, to pass the blame for an assault from a criminal on to their victim, you might want a headline that says “X stabbed after drug and alcohol binge” rather than “Celebrity kills X.” You kind of see Orwell’s point though.

Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent. I both agree and disagree; as a mostly monolingual person I agree, but some words and phrases (ironically, usually ones in French, a language I have never learned and feel uncomfortable trying to pronounce; raison d’être or enfant terrible for example) just say things more quickly and easily (I can be utilitarian too) than having to really consider and take the time to say what you mean. They are a shorthand that people in general understand. Plus, in the age of smartphones, it really doesn’t do native English speakers any harm to have to look up the meanings of foreign words occasionally (I do this a lot). The other side of the coin (a phrase I’m used to seeing in print) is that with foreign phrases it’s funny to say them in bad translations like “the Tour of France” (which I guess must be correct) or “piece of resistance” (which I am pretty sure isn’t), so as long as you are understood (assuming that you want to be understood) use them any way you like.

Break any of these rules sooner than say anything outright barbarous. It’s hard to guess what George Orwell would have considered outright barbarous (and anyway, couldn’t he have cut “outright”?) but anyone reading books from even 30, 50 or a hundred years ago quickly sees that language evolves along with culture, so that rules – even useful ones – rarely have the permanence of commandments.

So much for Orwell’s rules; I was more heartened to find that something I’ve instinctively done – or not done – is supported by Orwell elsewhere. That is, that I prefer, mostly in the name of cringe-avoidance, not to use slang that post-dates my own youth. Even terms that have become part of normal mainstream usage (the most recent one is probably “woke”) tend to appear with inverted commas if I feel like I must use them, because if it’s not something I would be happy to say out loud (I say “woke” with inverted commas too) then I’d prefer not to write it. There is no very logical reason for this and words that I do comfortably use are no less subject to the whims of fashion, but still; the language you use is part of who you are, and I think Orwell makes a very good case here (fuller version far below somewhere, because even though I have reservations about parts of it, it ends very well):

“Each generation imagines itself to be more intelligent than the one that went before it, and wiser than the one that comes after it. This is an illusion, and one should recognise it as such, but one ought also to stick to one’s world-view, even at the price of seeming old-fashioned: for that world-view springs out of experiences that the younger generation has not had, and to abandon it is to kill one’s intellectual roots.”

Review of A Coat of Many Colours: Occasional Essays by Herbert Read. (1945) The Collected Essays, Journalism and Letters of George Orwell Volume 4. Penguin 1968, p.72 

the fold-out map in The Silmarillion is a thing of beauty

Back to those two kinds* of people: I am the kind of person that likes and reads forewords, introductions, prefaces, author’s notes, footnotes, appendices, bibliographies, notes on the text, maps and all of those extras that make a book more interesting/informative/tedious.


*I know.


In one of my favourite films, Whit Stillman’s Metropolitan (1990), the protagonist Tom Townsend (Edward Clements) says “I don’t read novels. I prefer good literary criticism. That way you get both the novelists’ ideas as well as the critics’ thinking. With fiction I can never forget that none of it really happened, that it’s all just made up by the author.” Well, that is not me; but I do love a good bit of criticism and analysis as well as a good novel. One of my favourite ever pieces of writing of any kind, which I could, but choose not to, recite parts of by heart, is the late Anne Barton’s introduction to the 1980 New Penguin Shakespeare edition of Hamlet*. I love Hamlet, but I’ve read Barton’s introduction many more times than I’ve read the play itself, to the point where phrases and passages have become part of my mind’s furniture. It’s a fascinating piece of writing, because Professor Barton had a fascinating range and depth of knowledge, as well as a passion for her subject; but also and most importantly because she was an excellent writer. If someone is a good enough writer**, you don’t even have to be especially interested in the subject to enjoy what they write. Beyond the introduction/footnote but related in a way are the review and essay. Another of my favourite books – mentioned elsewhere I’m sure, as it’s one of the reasons that I have been working as a music writer for the past decade and a half – is Charles Shaar Murray’s Shots from the Hip, a collection of articles and reviews. The relevant point here is that more than half of its articles – including some of my favourites – are about musicians whose work I’m quite keen never to hear under any circumstances, if humanly possible. Similarly, though I find it harder to read Martin Amis’s novels than I used to (just changing taste, not because I think they are less good), I love the collections of his articles, especially The War Against Cliché and Visiting Mrs Nabokov. I already go on about Orwell too much, but as I must have said somewhere, though I am a fan of his novels, it’s the journalism and criticism that he probably thought of as ephemeral that appeals to me the most.

*All of the New Penguin Shakespeare introductions that I’ve read have been good, but that is in a different league. John Dover Wilson’s What Happens in Hamlet (1935, though the edition I have mentions WW2 in the introduction, as I remember; I like the introduction) is sometimes easy to disagree with but it has a similar excitement-of-discovery tone to Anne Barton’s essay

** Good enough, schmood enough; what I really mean is if you like their writing enough. The world has always been full of good writers whose work leaves me cold

a scholarly approach to comics

All this may have started, as I now realise lots of things in my writing did, with Tolkien. From the first time I read his books myself, I loved that whatever part of Middle-Earth and its people you were interested in, there was always more to find out. Appendices, maps, whole books like The Silmarillion which extended the enjoyment and deepened the immersion in Tolkien’s imaginary world. And they were central to that world – for Tolkien, mapping Middle-Earth was less making stuff up than it was a detailed exploration of something he had already at least half imagined. Maybe because I always wanted to be a writer myself – and here I am, writing – whenever I’ve really connected with a book, I’ve always wanted to know more. I’ve always been curious about the writer, the background, the process. I’ve mentioned Tintin lots of times in the past too and my favourite Tintin books were, inevitably, the expanded editions which included Hergé’s sketches and ideas, the pictures and objects and texts that inspired him. I first got one of those Tintin books when I was 9 or so, but as recently as the last few years I bought an in many ways similar expanded edition of one of my favourite books as an adult, JG Ballard’s Crash. It mirrors the Tintins pretty closely: explanatory essays, sketches, notes, ephemera, all kinds of related material. Now just imagine how amazing a graphic novel of Crash in the Belgian ligne claire style would be.*

*a bit like Frank Miller and Geof Darrow’s fantastic-looking but not all that memorable Hard Boiled (1990-92) I guess, only with fewer robots-with-guns shenanigans and more Elizabeth Taylor

a scholarly approach to cautionary 1970s semi-pornography/horror: the expanded Crash

A good introduction or foreword is (I think) important for a collection of poems or a historical text of whatever kind. Background and context and, to a lesser extent, analysis expand the understanding and enjoyment of those kinds of things. An introduction for a modern novel, though, is a slightly different thing, and different also from explanatory notes, appendices and footnotes; it’s probably not by chance that they mainly appear in translations or reprints of books that have already enjoyed some kind of zeitgeisty success. When I first read Anne Barton’s introduction to Hamlet, I already knew what Hamlet was about, more or less. And while I don’t think “spoilers” are too much of an issue with fiction (except for whodunnits, which I have so far not managed to enjoy), do you really want to be told what to think of a book before you read it? But a really good introduction will never tell you that. If in doubt, read them afterwards!

Some authors, and many readers, see all of these extraneous things as excess baggage, surplus to requirements, which obviously they really are, and that’s fair enough. If the main text of a novel, a play or whatever, can’t stand on its own then no amount of post-production scaffolding will make it satisfactory.* And presumably, many readers pass their entire lives without finding out or caring why the author wrote what they wrote, or what a book’s place in the pantheon of literature (or just “books”) is. Even as unassailably best-selling an author as Stephen King tends to be a little apologetic about the author’s notes that end so many of his books, despite the fact that nobody who doesn’t read them will ever know that he’s apologetic. Still; I for one would like to assure his publisher that should they ever decide to put together all of those notes, introductions and prefaces in book form, I’ll buy it. But would Stephen King be tempted to write an introduction for it?


* though of course it could still be interesting, like Kafka’s Amerika, Jane Austen’s Sanditon or Tolkien and Hergé (them again) with Unfinished Tales or Tintin and Alph-Art


That Orwell passage in full(er):

“Clearly the young and middle aged ought to try to appreciate one another. But one ought also to recognise that one’s aesthetic judgement is only fully valid between fairly well-defined dates. Not to admit this is to throw away the advantage that one derives from being born into one’s own particular time. Among people now alive there are two very sharp dividing lines. One is between those who can and can’t remember the period before 1914; the other is between those who were adults before 1933 and those who were not.* Other things being equal, who is likely to have a truer vision at the moment, a person of twenty or a person of fifty? One can’t say, though on some points posterity may decide. Each generation imagines itself to be more intelligent than the one that went before it, and wiser than the one that comes after it. This is an illusion, and one should recognise it as such, but one ought also to stick to one’s world-view, even at the price of seeming old-fashioned: for that world-view springs out of experiences that the younger generation has not had, and to abandon it is to kill one’s intellectual roots.”

*nowadays, the people who can or can’t remember life before the internet and those who were adults before 9/11? Or the Trump presidency? Something like that seems right