Tell me now, I beg you, where Flora is, that fair Roman; Archippa, and Thaïs rare, Who the fairer of the twain? Echo too, whose voice each plain, River, lake and valley bore; Lovely these as springtime lane, But where are they, the snows of yore?
François Villon, Ballade des dames du temps jadis (1461)¹
My uncle died two years ago now, but his Instagram account is still there. How many dead people live on in their abandoned social media accounts? The future never seems to arrive, never really exists, but history never ends. For over a quarter of a century, social media has mirrored and shaped lives, always evolving, but leaving behind its detritus just like every other phase of civilisation. Where are the people we were sociable with on the forgotten single-community (bands, hobbies, comedy, whatever) forums and message boards of the 90s and 2000s², or the friends we made on MySpace in 2005? Some live on, ageing at an only slightly faster rate than their profile pictures (Dorian Gray would now age privately at home, his picture migrating untouched from MySpace to Facebook to Twitter to Instagram to TikTok etc), others lost, vanished, dead? But still partially living on, like those sunlit American families in the home movies of the 50s and 60s.
Twenty-five years is a long, generation-spanning time, but, just as abstract expressionism essentially still lives on, in almost unaltered forms but no longer radical, long past the lifetimes of Rothko, Jackson Pollock and de Kooning, so the (just) pre-internet countercultural modernity of the late 80s and early 90s, the shock-monster-gender-fluid-glam of Michael Alig and the Club Kids, still prevalent back in the MySpace era³ (captured brilliantly in the 1998 ‘shockumentary’ Party Monster and less brilliantly in the somewhat unsatisfactory 2003 movie Party Monster), lives on and still feels current on Instagram and TikTok and reality TV and in whatever is left of the top 40. Bulimic pop culture eats reconstituted chunks of itself, and just as the 60s haunted the early 90s, bringing genuine creativity (Andrew Weatherall, to pick a name at random) and feeble dayglo pastiche (Candy Flip, to deliberately target a heinous offender), a weird (if you were there) amalgam of the 1980s and 90s haunts the 2020s, informing both the shallow dreck that proliferates everywhere and some of the genuine creativity of today.
‘I’m ready now,’ Piper Hill said, eyes closed, seated on the carpet in a loose approximation of the lotus position. ‘Touch the spread with your left hand.’ Eight slender leads trailed from the sockets behind Piper’s ears to the instrument that lay across her tanned thighs.
– entering cyberspace in William Gibson’s Mona Lisa Overdrive (1988), Grafton Books, p.105
Cyberspace, like any landscape which people have inhabited, has its lost cultures and ruins, becoming ever more remote and unknowable with the passing of years, but, like Machu Picchu or the Broch of Gurness, retaining a sense that it all meant something significant once. The not-quite barren wastelands of Geocities and Xanga, the ruined palace of MySpace, a Rosetta stone partly effaced with dead links and half-forgotten languages; Photobucket, ImageShack, Tripod – what do these mean if you’re 15? Would the old, usable interface of MySpace seem as charmingly quaint and remote now as the penpal columns in the pages of ’80s music magazines do?
But there was a time when Lycos, Alta Vista and Ask Jeeves were peers of Google, and Bebo rivalled Facebook and Twitter, both now seemingly in senile phases of their development. Until very recently Facebook (Meta) and Twitter were brands that appeared unassailable, but empires do fall, albeit more slowly than bubbles burst.⁴ And meanwhile, the users of social networks age and die and give way to generations who remember them, just as the Incas and the Iron Age Orcadians are remembered by their monuments, if nothing else. Depressing, when you think about it; probably won’t write about history next time.
It’s funny. Don’t ever tell anybody anything. If you do, you start missing everybody. – JD Salinger, The Catcher In The Rye, Penguin, 1958, p.220
¹ translated by Lewis Wharton in The Poems of François Villon, JM Dent & Sons, 1935, p.54. Not reading French – I seem to go on about that a lot – this is my favourite translation I’ve come across, although apparently it’s a pretty free one, judging by the literal – but still quite nice – one here
² the continuing success of Reddit suggests that people never really grew discontented with the interface of the Kiss online fanclub c. 2005 (etc etc)
³ It’s weird to note that the Club Kids would be considered – even without the murder etc – just as outrageous today as in the late 80s, even though their aesthetic was itself put together from a mix of Bowie, gore movies, Japanese pop culture etc etc. But then, as I think I recently noted, there are people who still find the word fuck outrageous, after something like a millennium.
⁴ Online and mainstream culture, even after this quarter century, remain mysteriously separate. Online news unfolds as it happens, but meanwhile in the daytime world, mainstream culture hangs on to husks even older than Geocities; publicly owned TV news shows don’t look to what’s happening now, but pore over the front pages of newspapers – yesterday’s news… today! – simultaneously being redundant and ensuring that newspaper owners’ views get publicity beyond their dwindling readership and therefore giving them an artificial sense of relevance. Which is really just about money, just as Google and Facebook are; the crumbling aristocracy of print media, its tendrils still entwined with the establishment, versus the new money, steadily buying its way in.
The dying man glows with sickness in his mildewy-looking bed, the light seeming to emanate from where he sits, crammed into the airless, box-like room. He signs his will while his friend looks on intently with concern and restrained grief.
The artist who painted Thomas Braithwaite of Ambleside making his will in 1607 may not have been considered important enough as an artist (still a person of relatively low social status in northern Europe, though this was starting to change with painters like Rubens and his pupil Anthony Van Dyck) to warrant signing the picture or having their name recorded at all, except perhaps in the household accounts – but they were important as a witness, and the painting is itself a kind of legal document, although it’s more than that too. The great enemy of the Elizabethan and Jacobean ages wasn’t death, with which most adults would have been on very familiar terms, but disorder and chaos*; and this, despite its tragic appearance, is a painting devoted to the age’s great virtue: order. Both the dying lord (an inscription records that Thomas Braithwaite, of gentry stock, died 22 December 1607, aged 31) and his friend George Preston of Holker are identifiable to those who knew them by their likenesses, and to those who didn’t, by their coats-of-arms. Biblical texts tell us that Thomas Braithwaite was a virtuous man, but so does the painting itself; this is a man who, even while he lay dying, took care of his business. His passing is tragic, but, he reassures us, it will cause only grief and not inconvenience.
*see EMW Tillyard, The Elizabethan World Picture, Pelican Books, 1972, p.24
We talk about religious faith now as a kind of choice as much as a belief system, but for all its paranoia about atheism – and all the subsequent romanticism about that era’s new spirit of humanism – the Tudor and Stuart ages had inherited a world view in which the existence, not only of God and Heaven and Hell, but of the essential hierarchy of existence, was more or less taken for granted. We may differentiate arbitrarily now between religion and superstition, but for the people in these cramped and airless paintings there was no real contradiction between, say, Christianity and astrology, because in accepting without exception the primacy of God the creator, it all works out in the end – everything that has ever existed and everything that will ever exist, already exists. Perhaps human beings aren’t supposed to divine the future, but God has written it and the signs – comets, unseasonal weather, the movement of the stars and the behaviour of animals – are there to be read and interpreted by anyone with the nerve to do so.
In an off-kilter, vertigo-inducing room that seems almost to unfurl outwards from the skull at its centre, an illogical space hung with black velvet, a man and his son, looking outwards, but not at us, stand by the deathbed of their wife and mother, while a glamorous young woman meets our gaze from where she sits, apparently on the floor at the foot of the bed.
There’s virtue in this painting too, but mostly this one really is about death. It’s there at the centre, where the lord’s hand sits on a skull, recalling the kind of drama which was then passing out of fashion, just as this kind of painting was. The skull, like the black-draped cradle (with its inscription that reads He who sows in flesh reaps bones), acts as a vanitas motif, focussing the viewer’s attention on the shortness of life, but also recalls the enthusiastically morbid writing of men like John Webster and Thomas Middleton. Sir Thomas and his wife had grown up in an England where plays like Middleton’s Revenger’s Tragedy often featured soliloquies over the remains of loved ones. Sir Thomas Aston is not being consumed by a desire for revenge, but his hand on the skull can’t help recalling Hamlet, or even more so, anti-heroes like Middleton’s Vindice, who opens The Revenger’s Tragedy contemplating the skull of his fiancée:
My study’s ornament, thou shell of death/once the bright face of my betrothed lady/When life and beauty naturally fill’d out/these ragged imperfections,/when two heaven-pointed diamonds were set/in those unsightly rings – then ’twas a face/so far beyond the artificial shine/of any woman’s bought complexion – The Revenger’s Tragedy, Act 1, Sc. 1, in Thomas Middleton, Five Plays, ed. Bryan Loughrey & Neil Taylor, Penguin Books, 1988, p.73
Sir Thomas, unlike Vindice, displays the correct behaviour for a grieving man with an orphaned young son – not the deadpan ‘stiff upper lip’ restraint of later generations of British gentlemen (though he is a dignified figure), but the kind of behaviour noted in books of etiquette like the anonymous Bachelor’s Banquet of 1603, which states that if
in the midst of this their mutual love and solace, it chanceth she dies, whereat he grieves so extremely, that he is almost beside himself with sorrow: he mourns, not only in his apparel for a show, but unfeignedly, in his very heart, and that so much, that he shuns all places of pleasure, and all company, lives solitary, and spends the time in daily complaints and moans, and bitterly bewailing the loss of so good a wife, wherein no man can justly blame him, for it is a loss worthy to be lamented.
The Bachelor’s Banquet in The Laurel Masterpieces of World Literature – Elizabethan Age, ed. Harry T. Moore, Dell Books, 1965, p.324
It is perhaps this behaviour we should read in Sir Thomas’s sideways glance, not the hauteur of the nobleman but the remoteness of the recently bereaved. His black sash is adorned with a death’s head brooch; he and his young son (also Thomas) are to be considered men of the world; to their left a globe sits on a tapestry decorated with elephants. But all their worldly knowledge and faith is no help here; the two Astons grasp a cross staff bearing the inscription, The seas can be defined, the earth can be measured, grief is immeasurable. Given this display of intense, but restrained grief, the smiling girl – the only person who makes eye contact with us – is a strange figure, despite her beautiful mourning clothes, and it may be that she is the lady in the bed, as she looked in happier times, there to show us, and remind father and son, of what they are missing.
On what looks like a shallow stage opening onto a bed in a cupboard, a strangely-scaled set of figures pose stiffly, only the older child meeting our eye with a knowing smirk, although the strangely capsule-like baby seems aware of us too.
As in the Souch painting, the father figure dominates, just as such men dominated their households; the household being a microcosm of the state, the state itself a microcosm of the universe.* Mr Saltonstall, despite being at the apex of a pyramid of hierarchy that allowed him absolute power, does not look devoid of compassion or warmth – indeed, he has had himself depicted holding the hand of his son, who himself mirrors (in, it has to be said, a less benign-looking way) this gesture of casual mastery, holding his little sister’s wrist, demonstrating just how the links in this chain of family work. And the family is inside the kind of house familiar nowadays to the heritage tourist as a mirror of the world that produced it; mansions like overgrown doll’s houses, big on the outside, but strangely cramped and illogical inside, with peculiar little wood-panelled rooms and an ancient smell of damp.
The nakedness of the power structure here isn’t subtle; and it isn’t supposed to be, because it wasn’t there to be questioned but accepted. Virtue lies in following God’s system of organisation; any suggestion to the contrary would make it an entirely different kind of painting. And indeed when painting – and painters – achieved a higher social standing in the century that followed, the messages became more subtle, only reappearing in something like this blatant form in western art in the post-Freudian era, with a painting like Dorothea Tanning’s 1954 A Family Portrait. But Tanning’s painting is a knowing representation of a reality she was aware of but which by then had the force of tradition alone. Its appearance in the mid-17th century reflects the reality of the age; the truth, if not the only truth.
*EMW Tillyard, The Elizabethan World Picture, p.98-9
The first impression, looking at these kinds of paintings, is something like looking at fairyland through the distorting lens of Richard Dadd’s insanity centuries later; comical and disturbing, familiar and illogical. These painters of the Elizabethan and Jacobean tradition (their art died out at around the same time as Charles I did, in the middle of the seventeenth century) – Souch, Des Granges, William Larkin and their many nameless contemporaries – were at the tail end of a dying tradition that would be replaced by something more spacious, gracious, modern and ‘realistic’; but ‘realistic’ is a loaded word and it’s entirely likely that this older tradition captures their world more accurately. We don’t need a time machine (though it would be nice) – a visit to almost any castle, palace or stately home is enough to confirm that the velvet curtains and classical paraphernalia of a Rubens or Van Dyck portrait do not tell the whole story of their era, even among the tiny demographic whom their art served. It is a world that we would probably find dark and claustrophobic; witness the smallness of the furniture, the lowness of the doorways and the dark paintings of dead ancestors, and this – regardless of the fact that it is partly due to what would later be seen as incompetence* – is what is preserved in this tradition of painting, as well as in the homes these people left behind.
* it’s a matter of fact that the average artist drawing a superhero comic in the 20th/21st century has a better grasp of mathematical perspective – and the idea of perspective at all – than even the more accomplished Elizabethan or Jacobean portrait painter
This is the kind of art that the Renaissance and its aftermath is supposed to have made obsolete – but though the word ‘art’ may owe its origin to its nature as something artificial, it also tells the truth, or a truth, regardless of its creators’ intentions. But if I’m implying that it’s realistic rather than idealistic, what does ‘realistic’ mean? Often when deriding ‘modern art’ (a meaningless term, since the art it usually refers to is often post-dated by art – like Jack Vettriano’s, for instance – that is not considered to be ‘modern’) the assumption is that modern art is a kind of aberration, a straying from a realistic norm*. But when looked at as a whole (or as much of a whole as is possible from a particular cultural viewpoint) it becomes quickly apparent that art that is ‘realistic’ in the narrowly photographic sense is a tiny island in the vast ocean of art history – and what is more, relies on ideas – such as the opposition of ‘abstract’ and ‘realistic’ – that may have no currency whatsoever outside of the Western tradition.
Even within Western cultures, the idea that photographic equates to experiential is debatable; despite the persistence (outside of academia) of the idea that Picasso was primarily an artist who painted noses on the wrong side of heads etc, a painting like his Guernica clearly has more in common with images of war as it was experienced in the 20th century – even vicariously through cinema and TV – than the kind of ‘war art’ that my granddad had on his walls: beautiful paintings in a tradition that lives on through artists like Robert Taylor, visions of war where the fear and panic become excitement and drama, an altogether easier thing to be entertained by.
*A classic example of this attitude came from Philip Larkin, who, when writing about modernism in jazz, digressed to cover all of the arts, noting
All that I am saying is that the term ‘modern’ when applied to art, has a more than chronological meaning: it denotes a quality of irresponsibility peculiar to this [ie the 20th] century… the artist has become over-concerned with his material (hence an age of technical experiment) and, in isolation, has busied himself with the two principal themes of modernism, mystification and outrage. Philip Larkin, All What Jazz, Faber & Faber, 1970, p.23
Picasso was trying to capture the feel of his century – but most of the great courtly artists of the sixteenth and seventeenth centuries – the Renaissance masters who became household names – were trying to capture something loftier, to escape the more earthy, earthly aspects of theirs, not least because they were the first generation to attain something like the status that Picasso would later attain: artists as creators and inventors, not craftsmen and recorders. And therefore that feeling of the life of the times shines through more vividly in the work of artists like John Souch and David Des Granges. The 17th century was a time when the world – even the world inhabited by the aristocracy – was in one sense far smaller than it is today, but the wider world seemed correspondingly bigger and more dangerous, and also perhaps richer or deeper, just as these people – often married by 12 or 14, learned (if they were allowed to learn) by 20, old by 40 – were both smaller and bigger than we are.
This kind of painting, part portrait, part narrative, was uniquely suited to the lives it recorded, and in one late example its strengths can be contrasted with those of the baroque style that swept it away. In 1613, Nicholas Lanier was a rising star in the English court, composer of a masque for the marriage of the Earl of Somerset. Around this time he was painted by an unknown artist, in the semi-emblematic tradition of artists like John Souch. There are references – the classical statue, the pen and paper with its mysterious inscription (RE/MI/SOL/LA) – that highlight that this man is more than just a lutenist, but at the same time he is most definitely that, and the artist has taken care to render realistically Lanier’s muscles as he holds the instrument; an artist, yes, but a workman of sorts too. By 1632, Lanier was the Master of the King’s Music and a trusted envoy of King Charles, who even sent him on picture-buying missions. And it is this gentleman that Van Dyck captures: aloof, authoritative, not someone we can picture sweating over a difficult piece of music.
With the art of Van Dyck, the courts of Britain were to discover an ideal of aristocratic indifference which would partly define the project of British imperialism and which is, unfortunately, still with us today. But the truth of Van Dyck’s age, and those which preceded him was stranger, darker and more human. And it’s there still, in those damp-smelling big-small houses, and in the art that died with King Charles.
On the rare occasions that anyone asks me anything about my writing, it’s usually about music reviews. The consensus seems to be that a good review (I don’t mean a positive one) should either be ‘listen to the music and say if it’s good or bad’, or ‘listen to the music and describe it so that other people can decide whether it’s their cup of tea, but keep your opinion out of it’. As it happens, I’ve given this subject a lot of thought, not only because I write a lot of reviews, but also because I read a lot of them, and some of my favourite writers (Charles Shaar Murray is the classic example) manage to make me enjoy reading about music even when it’s music that I either already know I don’t like, or that I can be fairly certain from reading about it that I won’t like. Because reading a good article about music is first and foremost ‘reading a good article’.
Anyway, over the course of pondering music reviews I have come to several (possibly erroneous) conclusions:
* “star ratings” HAVE TO BE relative and all stars don’t have the same value. For instance, one might give a lesser album by a great artist 3 stars, but those are not the same 3 stars one would give a surprisingly okay album by a generally crappy artist.
* Musical taste is, as everyone knows, entirely subjective, but reviewing (for me at least) has to try to be a balance between objective and subjective; just listening to something and saying what you think of it is also valid of course.
* Objective factors alone (see fun pie chart below) can never make an otherwise bad album good, but subjective factors can.
* ‘Classic’ albums make a nonsense of all other rules.
Let’s examine in more detail, with graphs! (are pie charts graphs?):
Objective factors:
Objective factors (see fun pie chart) are really only very important when the reviewer doesn’t like the music: when you love a song, whether or not the people performing it are technically talented musicians/pitch-perfect singers etc is entirely irrelevant.
But, when an album or song (or movie, book etc) is dull or just blatantly abysmal, some comfort (or conversely, some outrage and annoyance) can be gained from the knowledge that at least the participants were good at the technical aspects of what they were doing, even if they are ultimately using those skills for evil.
Subjective Factors:
Although there are many subjective factors that may be relevant – nostalgia for the artist/period, personal associations – all of these really amount to either you like it or you don’t; simple, but not necessarily straightforward.
The positive subjective feeling ‘I like it!’ can override all else, so that an album which is badly played, unoriginal, poorly recorded and awful even by the artist’s own standards can receive a favourable review (though the reviewer will hopefully want to point out those things).
Meanwhile the negative subjective feeling ‘I don’t like it’ can’t help but affect a review, but should hopefully be tempered by technical concerns if (an important point) the reviewer feels like being charitable. They may not.
Ideally, to me a review should be something like 50% objective / 50% subjective (as in the examples somewhere below) but in practice it rarely happens.
“Classic” status:
The reviewing of reissued classics can be awkward, as ‘classic’ status in a sense negates reviewing altogether; it is completely separate from all other concerns, and can affect ratings just because the album is iconic and everyone knows it. Reviews of new editions of acknowledged classics usually become either a review of what’s new (remastered sound, extra tracks etc) or a debunking of the classic status itself; which, as far as I know, has never yet toppled a classic album from its pedestal.
Classic album status is normally determined by popularity as much as any critical factors, but popularity itself shouldn’t play a part in the reviewer’s verdict; just because 30,000,000 people are cloth-eared faeces-consumers, it doesn’t mean the reviewer should respect their opinion, but they should probably acknowledge it, even if incredulously. Sometimes or often, classic status is attained for cultural, rather than (or as well as) musical reasons*, and it should be remembered that albums (is this still true in 2020? I don’t know) are as much a ‘cultural artefact’ (in the sense of being a mirror and/or record of their times) as cinema, TV, magazines or any other zeitgeist-capturing phenomenon.
* in their very different ways, Sgt Pepper’s Lonely Hearts Club Band, Thriller and The Spice Girls’ Spice were all as much ‘cultural phenomena’ as collections of songs
SO ANYWAY; how does this all work? Some examples:
I once offended a Tina Turner fan with an ambivalent review of the 30th anniversary edition of Ms Turner’s 1984 opus Private Dancer.
As a breakdown (of ‘out of 10’s, for simplicity) it would look something like this:
TINA TURNER: PRIVATE DANCER (30TH ANNIVERSARY EDITION)
Objective factors
* musicianship – 9/10 – hard to fault the adaptability or technical skill of her band
* songwriting – 6/10 – in terms of catchy, verse-chorus-verse efficiency & memorableness these are perfectly good songs, if a bit cheesy & shallow & therefore a waste of Tina Turner
* production – 9/10 – no expense was spared in making the album sound good in its extremely shiny, 80s way
* originality – 0/10 – as an album designed to make TT into a successful 80s artist, it wasn’t really supposed to be original, so hard to actually fault it in that respect
* by the standards of the artist – 2/10 – in the 60s/70s Tina Turner made some great, emotionally forceful, musically adventurous and just great records. In 1984 she didn’t.
Overall: 26/50 = 5.2/10
Subjective Factors
* I don’t like it: 1/10 (but not 0, because Tina Turner is a legend and it would be wrong to deny that somehow)
Overall 5.2/10 + 1/10 = 6.2/20 = 3.1/10 = 1.55/5 (round up rather than down, out of respect for Tina) = 2 stars
and in fact I did give the album two stars, though I didn’t actually do any of the calculations above; but it’s pleasing to find out that the instinctive two stars is justified by fake science.
By way of contrast, a favourite that seems to be an acquired taste at best:
VENUSIAN DEATH CELL: HONEY GIRL (2014)
Objective factors
* musicianship – 1/10 – David Vora’s guitar playing is not very good, plus the guitar is out of tune anyway, and his drumming is oddly rhythm-free
* songwriting – 2/10 – the songs on Honey Girl are not really songs, they may be improvised, they don’t have actual tunes as such
* production – 0/10 – David pressed ‘record’ on his tape recorder
* originality – 10/10 – Vora doesn’t sound like anyone else, his songs are mostly not about things other people sing about
* by the standards of the artist – 9/10 – I like all of Venusian Death Cell’s albums, they are mostly kind of interchangeable, but Honey Girl is one of the better ones (chosen here over the equally great Abandonned Race only because of the uncanny similarities between the cover art of Honey Girl and Private Dancer).
Overall: 22/50 = 4.4/10
Subjective Factors
* I like it: 9/10 (but not 10, because if encouraged too much David Vora might give up and rest on his laurels. Though if he did that I’d like to “curate” a box set of his works)
Overall 4.4/10 + 9/10 = 13.4/20 = 6.7/10 = 3.35/5 (round up rather than down, out of sheer fandom) = 4 stars
And in fact I did give Honey Girl four stars, but I’ve yet to hear of anyone else who likes it. Which is of course fuel for the reviewer’s elitist snobbery; win/win
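For anyone who wants to check the fake science, the whole system reduces to a few lines of Python – a minimal sketch, assuming the scoring scheme really is the one back-formed from the two breakdowns above (mean of the objective scores, averaged 50/50 with the subjective score, rescaled to five and always rounded up); the function and dictionary names are mine, not an actual reviewing tool:

    import math

    # A sketch of the 'fake science' above. The 50/50 objective/subjective
    # split and the always-round-up rule are back-formed from the two
    # breakdowns; nobody actually reviews like this.
    def star_rating(objective_scores, subjective_score):
        objective = sum(objective_scores.values()) / len(objective_scores)  # e.g. 26/5 = 5.2
        combined = (objective + subjective_score) / 2                       # out of 10
        return math.ceil(combined / 2)                                      # out of 5, rounded up

    private_dancer = {"musicianship": 9, "songwriting": 6, "production": 9,
                      "originality": 0, "artist's standards": 2}
    honey_girl = {"musicianship": 1, "songwriting": 2, "production": 0,
                  "originality": 10, "artist's standards": 9}

    print(star_rating(private_dancer, 1))  # 2 stars
    print(star_rating(honey_girl, 9))      # 4 stars

Reassuringly, the always-round-up clause does all the ethical work: respect for Tina and sheer fandom turn out to be the same line of code.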
Star Ratings
I’ve used scoring systems above, but the writers I like best rarely use scores or ‘star ratings’. I don’t think anybody (artists least of all) really likes star ratings or scores because they immediately cause problems; if, for instance, I give the Beach Boys’ Pet Sounds four stars (and the critical consensus says you have to; also, I do love it), then what do I give Wild Honey or Sunflower, two Beach Boys albums that are probably demonstrably ‘less good’, but which I still like more? But at the same time, I suppose scores are handy, especially for people who want to know if something is worth buying but don’t want an essay about it – and who trust the reviewer. The best ‘score’ system I’ve ever seen is in the early 2000s (but may still be going?) fanzine Kentucky Fried Afterbirth, in which the genius who writes the whole thing, Grey, gives albums ratings out of ten ‘cups of tea’ for how much they are or aren’t his cup of tea. This may be the fairest way of grading a subjective art form that there can possibly be.
Critical Consensus
I mentioned the critical consensus above, and there are times when music critics seem to all think the same thing, which is how come there’s so much crossover between books like 1000 Albums You Must Hear Before You Die (I always feel like there’s an implied threat in those titles) and The Top 100 Albums of the Sixties etc. I’m not sure exactly how this works, because like most people I know who love music, my favourite albums and songs aren’t always (or even usually) the most highly regarded ones. My favourite Beatles album isn’t the ‘best’ one (Revolver seems to be the consensus now); Songs in the Key of Life is the Stevie Wonder album, but it’s probably my third or fourth favourite Stevie Wonder album; I agree that Bruce Dickinson is a metal icon but I kind of prefer Iron Maiden with Paul Di’anno (granted PD wouldn’t be as good as Bruce at things like Rime of the Ancient Mariner, but it’s less often mentioned that Bruce is definitely not as good at singing Wrathchild etc as Paul was). Much as I genuinely love The Velvet Underground and Nico, I genuinely love the critically un-acclaimed Loaded just as much; there are so many examples of this that the idea of an actual critical consensus that means anything seems like nonsense.
I’ve been writing music reviews for many years now, but my own involvement with ‘the consensus’ is rare, and the only solid example I can think of is a negative one. I thought – and I still think – that Land, the fourth album by Faroese progressive metal band Týr, is the best thing they’ve ever done. I gave it a good review, not realising that the critical tide was turning against the band, and, for whatever reason (fun to speculate, but lack of space is as likely as anything), my positive review never appeared in print. It wouldn’t have made any real difference to the band or to the album’s reception in general, but it did make me feel differently about albums that are notoriously bad (or good). Who is deciding these things? I’m a music critic and I’m not. And although I – like, I think, everyone – take reviews with a pinch of salt anyway (someone else liking something is a strange criterion for getting it, when you think about it), I have to admit that if I hadn’t had to listen to Land (which I still listen to every now & then, over a decade later), I wouldn’t have been in a hurry to check out the album after reading again and again that it was dull and boring.
Throughout this whole article the elephant in the room is that, at this point, the whole system of reviewing is out of date. You can almost always just listen to pretty much anything for free and decide for yourself whether you like it, rather than acting on someone else’s opinion of it. But in a way that makes the writing more important; again, like most people, I often check things out and stop listening at the intro, or halfway through the first song if I just don’t like it – except when I’m reviewing. Reviewers have to listen to the whole thing, they have to think about it and say something relevant or contextual or entertaining.* And if the reviewer is a good writer (Lester Bangs is the most famous example, though I prefer Jon Savage or the aforementioned CSM and various nowadays people), their thoughts will entertain you even if the music ultimately doesn’t.
*worth a footnote as an exception which proves the rule is a notorious Charles Shaar Murray one-word review for the Lee Hazlewood album Poet, Fool or Bum: “Bum.”
There are relatively few times in life when it’s possible to switch off your mind and enter a trance-like state without going out of your way to do so; but sitting in a classroom for a period (or better yet, a double period) of whatever subject it is that engages you least is one of those times. When the conditions are right – a sleepy winter afternoon in an overly warm room maybe, with darkness and heavy rain or snow outside and the classroom lights yellow and warm, the smell of damp coats hung over radiators and a particularly boring teacher – the effect can be very little short of hypnotic. The subject will be a matter of taste; for me the obvious candidate, Maths, was one I detested, but I think that something like Geography or ‘Modern Studies’ (strangely vague subject name), where I wasn’t concerned so much with not understanding and/or hating it, would be the optimum ‘trance class’.
There’s nothing like school for making you examine the apparently stable nature of time; if, as logic (and the clock) states, the 60 or so minutes of hearing about ‘scarp-and-vale topography’ really was about the same length of time as our always-too-short lunch hour, or even as the hour spent running around the rugby pitch, then clearly logic isn’t everything, as far as the perception of human experience is concerned.
But it would not be true to say that I did nothing during these long, barren stretches of unleavened non-learning. Mostly, I doodled on my school books. Sometimes this was a conscious act, like the altering of maps with tippex to create fun new supercontinents, or the inevitable (in fact, almost ritualistic, after 7 years of Primary school) amending of the fire safety rules that were printed on the back of every jotter produced by The Fife Regional Council Education Committee. Often these were just nonsensical, but even so, favourite patterns emerged. I had a soft spot for “ire! ire! ire! anger! anger! anger!” (in the interests of transparency I should probably point out that I was almost certainly unaware at the time that ire means anger), and the more abstract “fir! fir fir! Dang! Dang! Dang!” (see?), but some things like ‘Remember Eire hunts – Eire kills’ were fairly universal. But also, there was the whiling (or willing) away of time by just doodling, in margins, on covers, or, if the books didn’t have to be handed in at the end of the class, just anywhere; band logos and Eddies* and cartoon characters. Later, towards the end of my high school career, there’s a particularly detailed and baroque drawing of a train going over a bridge (something I wouldn’t have had much patience for drawing in an actual art class) which immediately summons up the vivid memory of a particularly long Geography class, and even of the pen – a fine felt tip I liked but couldn’t write neatly with** – that I drew it with.
*Eddie = ‘Eddie the head’, Iron Maiden’s beloved zombie mascot, created – and painted best – by Derek Riggs
**i.e. ‘I wrote even less neatly than usual with’
If I could recall the things I was supposed to learn in classes this well I would have done much better at school. But the point of doodling is that it’s whatever it is your hand draws when your brain isn’t engaged; or, as André Breton put it, drawings that are ‘dictated by thought, in the absence of any control exercised by reason, exempt from any aesthetic or moral concern.’*
This is in fact from his definition of what surrealism is: ‘psychic automatism in its pure state’. Later, in The Automatic Message (1933), Breton went further, influenced by his reading of Freud, specifically referencing what would later become known as art brut or ‘outsider art’ – drawings by the mentally ill, visionaries, mediums and children – as ‘surrealist automatism’. Although it might seem to – well, it definitely does – give too much dignity and importance to the time-wasting scrawls of teenagers to consider them anything but ephemeral, the strange faces, swords, cubes, eyes, tornadoes and goats that littered my school books aged 12-14 or so do seem to preserve, not just the kind of pantheon almost every child/teenager has – made up of favourite bands, TV shows, cartoon characters etc – but a kind of landscape of enigmatic symbolism that comes from who-knows-where and perhaps represents nothing more than the imagination crying for help from the heart of a particularly stimulus-free desert. But in the end, that’s still something.
*André Breton, Manifesto of Surrealism 1924, published in Manifestoes of Surrealism, Ann Arbor paperbacks, tr. Richard Seaver and Helen R. Lane, 1972, p.26
To start with, this was mostly about books, and I think it will end that way too. But it begins with a not terribly controversial statement: hero worship is not good. And the greatest figures in the fight for human rights or human progress of one kind or another – Martin Luther King, Jr, Emmeline Pankhurst, Gandhi – would not, without wishing in any way to diminish their achievements, have achieved them alone. Rosa Parks is a genuine heroine, but if she had been the only person who believed it was wrong for African-American people to be forced to give up seats for white people, the practice would still be happening. These individuals are crucial because they are catalysts for and agents of change – but the change itself happens because people – movements of people – demand it.
This is obviously very elementary and news to nobody, but it’s still worth remembering in times like these, when people seem to be drawn to somewhat messianic figures (or to elevate people who have no such pretensions themselves to quasi-messianic status). One of the problems with messiahs is that when they don’t fulfil the hopes of their followers, their various failures or defeats (of whatever kind) take on a cataclysmic significance beyond the usual, human kind of setback and re-evaluation. It’s only natural to feel discouraged if your political or spiritual dreams and hopes are shattered, but it’s also important to remember that the views and opinions that you were drawn to and which you agree with are yours too. They are likely to be shared by millions of people and the fact that they are also apparently not shared by a greater number in no way invalidates them or renders them pointless.
The history of human progress is, mostly, the history of people fighting against entrenched conservative views in order to improve the lives of all people, including, incidentally, the lives of those people they are fighting against. This obviously isn’t the case in ultimately ideological revolutions like those in France or Russia, which quickly abandoned their theoretically egalitarian positions in order to remove undesirable elements altogether, or the Nazi revolution in Germany, which never pretended to be inclusive in the first place. Hopelessness, whether cynical or Kierkegaard-ishly defiant, is a natural response, but the biggest successes of human rights movements – from the abolition of slavery to the enfranchisement of women to the end of apartheid in South Africa to the legalisation of abortion or gay marriage – have often taken place during eras which retrospectively do not seem especially enlightened; if you believe in something, there is hope.
But when change is largely driven by mass opinion or pressure – and when we know that it is – why is it the individual – Rameses II, Julius Caesar, Genghis Khan, Napoleon, Garibaldi, Lenin, Hitler, the Dalai Lama, queens, kings, political leaders – that looms so large in the way we see events historically? Anywhere from three to six million people died in the “Napoleonic Wars” – Napoleon wasn’t one of them, and his armies didn’t even win them – but they are, to posterity, his wars. The short answer is, I think, because as individuals, it is individuals we identify with. We have a sense of other people’s lives, we live among other people (sounds a bit Invasion of the Bodysnatchers), but we only know our own life, and we only see the world through the window of our own perceptions.
The artist Sara Shamma – who, significantly, has undertaken many humanitarian art projects, but has also done much of her most profound work in self-portraiture – said “I think understanding a human being is like understanding the whole of humanity, and the whole universe”, and the more I’ve thought about that statement the more true it seems. If we truly understand any human being, it is first, foremost and perhaps only, ourselves. And, unless you are a psychopath, in which case you have my condolences, you will recognise the traits you have – perhaps every trait you have – in other people, people who may seem otherwise almost entirely different from you. When you look at the classifications humankind has made for itself – good/bad, deadly sins, cardinal virtues – these are things we know to exist because, in varying degrees, we feel them in ourselves, and therefore recognise them in others. Even that most valued human tool, objectivity, is a human tool, just as logic, which certainly seems to explain to our understanding the way the world works, is a human idea and also an ideal. Interestingly but perhaps significantly, unlike nature, mathematics or gravity, human behaviour itself routinely defies logic. When we say – to whatever extent – we understand the universe, what I think we mean is that we understand our own conception of it. It’s easy to talk about the universe being boundless, but not limitless, or limitless, or connected to other universes as part of a multiverse (though not easy to talk about intelligently, for me), but regardless of what is ‘out there’, what we are actually talking about is all ‘in here’, in our own brain; the universe that you talk about and think about is whatever you think it is, however you perceive it. If what you believe dictates the way you live your life, it may as well be, to all intents and purposes, ‘the truth’. For Stephen Hawking there were black holes in space/time, and whether or not there actually are, for a creationist there probably aren’t.
This is not to say that there are no actual solid facts about (for example) the nature of the universe; but nonetheless to even prove – to us personally while alive – that anything at all continues to exist after our own death is impossible. We can of course see that it goes on after other people’s deaths, but then I can say with what I believe to be complete conviction that there is no God and that human beings are just (well I wouldn’t say “just”) a kind of sentient hourglass with the added fun that you never know how much sand it holds to start with – but that doesn’t change the fact that a whole range of Gods have made and continue to make a decisive difference to the lives of other people and therefore to the world.
But while that might sound like the background for some kind of Ayn Rand-ish radical individualism, I think the opposite is true; because if any of what I have written is correct, the key part is that it applies equally to everyone. The phrase ‘we’re all in the same boat’ is being bandied about a lot lately for pandemic-related reasons, and it’s only vaguely true as regards that particular situation. We aren’t in the same boat, or even necessarily in the same kind of body exactly, but what we do all share – if broadly – is the same kind of brain. We are all individuals, and if we are conscious, we are probably self-conscious. And given that we live our – as far as we can safely tell – single earthly life as an individual human being, the idea that any of us is powerless during that lifetime is nonsense. When asked to name someone who has made a difference to the world, the first person you think of should be yourself. There would be no world as you know it without you in it, and that is not a small thing; by existing, you are changing the world. Whether for better or worse, only you can say.
Having faith in other people (or even just getting along with them) makes both your and their lives better, but the belief that one particular individual outside of yourself may be the solution to the world’s (or the country’s, etc) ills is worse than feeling powerless yourself; not only because it can reinforce that sense of powerlessness, but because it’s blatantly untrue and (I hate to use this completely devalued word, but never mind) elitist. And it reduces every issue, however complex, to a finite, succeed-or-fail one, which is rarely how the world works. The idea of the hero as saviour probably has about as much validity as the idea of the lone villain as the cause of whatever ills need to be cured. Hero-worship is both logical (because we see the world from the viewpoint of “I”) and also an oddly counter-intuitive ideal to have created, since in reality as we know it, the lone individual may be us, but is largely not how we live or how things work. We have structured our societies, whether on the smaller level of family or tribe, or the larger ones like political parties or nations, in terms of groups of people. But I suppose it is the same humanity that makes us aware of and empathetic to the feelings of others that makes us want to reduce ideas to their black and white, bad vs good essentials and then dress those ideas up in human clothes.
And so, to books! Reading fiction and watching films and TV, it’s amazing how the larger-than-life (but also simpler and therefore ironically smaller-than-life) hero/ine vs villain, protagonist vs antagonist and – most hackneyed of all (a speciality of genre fiction since such a thing existed, and the preserve of religion and mythology before that) – the ‘chosen one’ vs ‘dark lord’ narrative continues to be employed by writers and enjoyed by generations of people (myself included*), long past the age at which one becomes aware of the formulaic simplification of it.
*for people of my generation, the mention of a ‘dark lord’ immediately conjures up Star Wars and Darth Vader/The Emperor, though the ‘chosen one’ theme is thankfully underplayed in the original trilogy. George Lucas doesn’t get much credit for the prequels, but making the chosen one become the dark lord is an interesting twist, even if Lucifer got there first.
Whatever its origins, it seems that people do want these kinds of figures in their lives and will settle for celebrities, athletes, even politicians in lieu of the real thing. Hitler was aware of it and cast himself in the lead heroic role, ironically becoming, to posterity, the antithesis of the character he adopted; Lenin, who by any logical reading of The Communist Manifesto should have been immune to the lure of hero worship, also cast himself in the lead role, as did most of his successors to the present day (and really: to enthusiastically read Marx and then approve a monumental statue of oneself displays, at best, a lack of self-awareness). The Judeo-Christian god with his demand, not only to be acknowledged as the creator of everything, but also to be actually worshipped by his creations, even in his Christian, fallible, just-like-us human form, is something of a special case, but clearly these are primordial waters to be paddling in.
Still, entertainment-wise, it took a kind of humbling to get even to the stage we’re at. Heroes were once demi-gods; Gilgamesh had many adventures, overcame many enemies, but when trying to conquer death found that he could not even conquer sleep. Fallible yes, but hardly someone to identify with. And Cain killed Abel, David killed Goliath, Hercules succeeded in his twelve tasks but was eventually poisoned by the blood of a hydra, Sun Wukong the Monkey King attained immortality by mistake while drunk, Beowulf was mortally wounded in his last battle against a dragon. Cúchulainn transformed into a monstrous creature and single-handedly defeated the armies of Queen Medb. King Arthur and/or the Fisher King sleep still, to be awoken when the need for them is finally great enough. These are heroes we still recognise today and would accept in the context of a blockbuster movie or doorstop-like fantasy novel, but less so in say, a soap opera or (hopefully) on Question Time. I knew some (but not all) of these stories when I was a child, but all of them would have made sense to me because, despite the differences between the settings and the societies that produced them and that which produced me, they are not really so vastly different from most of my favourite childhood stories.
Partly that’s because some of those favourite stories were those same ancient stories. But even when not reading infantilised versions of the Greek myths (I loved the Ladybird book Famous Legends Vol. 1 with its versions of Theseus and the Minotaur and Perseus and Andromeda*) it was noticeable that, although there still were heroes of the unambiguous superhuman type (in comics most obviously; like, um, Superman), in most of the books I read, the hero who conquers all through his or her (usually his) all-round superiority was rarely the lone, or even the main, protagonist. I don’t know if it’s a consequence of Christianity (or just of literacy?) but presumably at some point people decided they preferred to identify with a hero rather than to venerate them. Perhaps stories became private rather than public when people began to read for themselves, rather than listening to stories as passed down by bards or whatever? Someone will know.
*I remember being disappointed by the Clash of the Titans film version of Medusa, too monstrous, less human, somehow undermining the horror
The first real stories that I remember (this would initially be hearing rather than reading) are probably The Hobbit, The Lion, The Witch and The Wardrobe and Charlie and the Chocolate Factory – all of which have children or quasi-children as the main characters. Narnia is a special case in that there is a ‘chosen one’ – Aslan the lion – but mostly he isn’t the main focus of the narrative. Far more shadowy are the books that I never went back to and read by myself, like Pippi Longstocking; my memory of those tends to be a few images rather than an actual story. As a very little kid I know I liked The Very Hungry Caterpillar and its ilk (also, vastly less well known, The Hungry Thing by Jan Slepian and Ann Seidler, in which, as I recall, ‘some rice would be nice’ said a baby sucking ice). Later, I loved Tintin and Asterix and Peanuts and Garfield as well as the usual UK comics; Beano, Dandy, Oor Wullie, The Broons, Victor and Warlord etc. The first fiction not reliant on pictures that I remember reading for myself (probably around the Beano era) would be the Narnia series (which I already knew), Richmal Crompton’s William books, then Biggles (already by then an antique of a very different era), some Enid Blyton (I liked the less-famous Five Find-Outers best), Lloyd Alexander’s Chronicles of Prydain, and Willard Price’s Adventure series. Mostly these were all a bit old-fashioned for the 80s now that I look at them, but I tended then as now to accumulate second-hand books.
There was also a small group of classics that I had that must have been condensed and re-written for kids – a little brick-like paperback of Moby-Dick (Christmas present) and old hardbacks of Robinson Crusoe, Treasure Island and Kidnapped with illustrations by Broons/Oor Wullie genius Dudley D. Watkins (bought at ‘bring and buy’ sales at Primary School). Watkins’s versions of Crusoe, Long John Silver etc are still the ones I see in my head. More up to date, I also had a particular fondness for Robert Westall (The Machine Gunners, The Scarecrows, The Watch House etc) and the somewhat trashy Race Against Time adventure series by JJ Fortune; a very 80s concoction in which a young boy from New York called Stephen is picked up by his (this was the initial appeal) Indiana Jones-like Uncle Richard and, unbeknownst to his parents, hauled off around the world for various implausible adventures. I liked these books so much (especially the first two that I read, The Search for Mad Jack’s Crown – bought via the Chip Book Club which our school took part in – and Duel For The Samurai Sword) that I actually, for the first and last time in my life, joined a fan club. I still have the letter somewhere, warning me as a “RAT adventurer” to be prepared to be whisked away myself. Didn’t happen yet though. And then there were gamebooks (a LOT of them), which have a special place here because they fundamentally shift the focus of the narrative back to the direct hero-conquers-all themes of ancient mythology, while also recasting the reader themselves as that hero.
There were also books I wouldn’t necessarily have chosen but was given at Christmas etc, books by people like Leon Garfield (adventures set in a vividly grotty evocation of 18th and early 19th century London), the aforementioned Moby-Dick, a comic strip version of The Mutiny on the Bounty, a Dracula annual. Also authors who I read and loved one book by, but never got around to reading more of; Ann Pilling’s Henry’s Leg, Jan Mark (Thunder and Lightnings; there’s a moving article about this beautifully subtle book here), Robert Leeson (The Third Class Genie). And there were also things we had to read at school, which mostly didn’t make a huge impression and are just evocative titles to me now – The Boy with the Bronze Axe by Kathleen Fidler and The Kelpie’s Pearls by Molly Hunter, Ian Serraillier’s The Silver Sword, Children on the Oregon Trail by Anna Rutgers van der Loeff and The Diddakoi by Rumer Godden. What did I do as a kid apart from reading?
Anyway; that’s a lot of books. And in the vast majority of them, the conclusion of the plot relies on the main character (or main character and sidekick, or team) taking some kind of decisive action to solve whatever problem they have. Heroism as the ancient Greeks would have understood it may largely have vanished, but even without superhuman strength or vastly superior cunning (even the fantasy novels mentioned, like Lloyd Alexander’s, which do still have the chosen one/dark lord idea at their heart, tend to have a fallible, doubt-filled human type of hero rather than a demigod) there is still the idea that individual character is what matters.
And this makes sense – something like the ‘battle of five armies’ towards the end of The Hobbit is dull enough even with the inclusion of characters that the reader has come to care about. A battle between armies of nameless ciphers (think the ‘Napoleonic Wars’ sans Napoleon) would be hard to get too involved in (cue image of generals with their model battlefields, moving blocks of troops about with little or no danger to themselves). Which is fair enough; after all, being in a battle may feel impersonal, but reading about one can’t be, if the reader is to feel any kind of drama. And maybe this is the key point – reading is, albeit at one remove, a one-on-one activity. Stephen King likens it to telepathy between the writer and reader, and that is the case – they think it, we read it and it transfers from their minds to ours. And since reading is something that people seem to think children have to be made to do, often against their will, children’s authors in particular are understandably keen to engage the reader by making them identify with one character or another. I don’t think it’s a coincidence that the most successful writers for children, from CS Lewis to Enid Blyton to JK Rowling (to name just notable British ones), have tended to make children the protagonists of their books and surround their main characters with a variety of girls and boys of varying personality types. And children’s books about children are (I find) far easier to re-read as an adult than children’s books about adults are. As an adult, even JJ Fortune’s “Stephen” rings more or less true as a mostly bored tweenager of the 80s, while his Uncle Richard seems both ridiculous and vaguely creepy. “Grown up” heroes like Biggles, very vivid when encountered as a child, seem hopelessly two-dimensional as an adult; what do they DO all day, when not flying planes and shooting at the enemy?
I mentioned gamebooks above and they – essentially single-player role-playing games, often inspired by Dungeons and Dragons – deserve special mention, partly just because in the 80s there were so many of them. There were series I followed and was a completist about (up to a point) – first and best being Puffin’s Fighting Fantasy (which, when I finally lost interest, consisted of around 30 books), there was its spin-off Steve Jackson’s Sorcery (four books), Joe Dever and Gary Chalk’s Lone Wolf (seven or eight books), Grey Star (four books), Grailquest (I think I lost interest around vol 5 or 6); then series I quite liked but didn’t follow religiously – Way of the Tiger (six books), Golden Dragon (six books), Cretan Chronicles (three books); and series I dipped into if I came across them: Choose Your Own Adventure (essentially the first gamebook series, but they mostly weren’t in the swords & sorcery genre and felt like they were aimed at a younger readership), Demonspawn (by JH Brennan, the author of Grailquest, but much, much more difficult), Falcon (time travel) and Sagard the Barbarian (four books; the selling point being that they were by D&D co-creator Gary Gygax – they were a bit clunky compared to the UK books). Sudden memory: even before encountering my first Fighting Fantasy book, which was Steve Jackson’s Citadel of Chaos, actually the second in the series, I had bought (the Chip club again) Edward Packard’s Exploration Infinity, which was one of the Choose Your Own Adventure series, repackaged for the UK I guess, or maybe a separate book that was later absorbed into the CYOA series? Either way, there’s a particular dreamlike atmosphere that gives me a pang of complicated melancholy nostalgia when I think of the book now.
Putting a real person – the reader – at the centre of the action ironically dispenses with the need for “character” at all, and even in books like the Lone Wolf and Grailquest series, where YOU are a specific person (Lone Wolf in the former, Pip in the latter), there is very little sense of (or point in) character building. You are the hero, this is what you need to do, and that’s all you need to know. In many cases, the protagonists of the heroic fantasy novels I devoured in my early teens – when I was drawn to any fat book with foil lettering and a landscape on the cover (the standard fantasy novel look in the 80s) – were not much more rounded than their lightly sketched gamebook counterparts. These books often achieved their epic length through plot alone; the truly complex epic fantasy novel is a rare thing.
Thanks, presumably, to Tolkien, these plots generally revolved around main characters who were rarely heroes in the ancient mould (though Conan and his imitators were), but were mainly inexperienced, rural quasi-children, thrust into adventures they initially had no knowledge of (Terry Brooks’s Shannara series being the classic Tolkien-lite example). But even when, as in Stephen Donaldson’s also very Tolkien-influenced Chronicles of Thomas Covenant, the hero was a cynical, modern human being, or in Michael Moorcock’s deliberately anti-Tolkienesque Eternal Champion series, where s/he was a series of interlinked beings inhabiting the same role within different dimensions of the multiverse, the theme of a ‘chosen one’ versus some kind of implacable ‘dark lord’-ish enemy remained pretty constant. But this underlying core or skeleton is merely most explicit in self-consciously fantastical fiction; whether or not there’s an actual dark lord or a quest, in most fiction of any kind there’s a ‘chosen one’, even if they have only been chosen by the author as the focus of the story she or he wants to tell. Holden Caulfield and Sylvia Plath’s Esther Greenwood have this in common with Bilbo Baggins, Conan the Barbarian and William Brown. But really, what’s the alternative to books about people anyway? Even novels in which people (or surrogate people like Richard Adams’s rabbits or William Horwood’s moles) are not the main focus (or are half of the focus, like Alan Moore’s peculiar Voice of the Fire, where Northampton is essentially the ‘hero’) rely on us engaging with the writer as a writer, a human voice that becomes a kind of stand-in for a character.
But books are not life; one of the things that unites the most undemanding pulp novelette and the greatest works of literature is that they are to some extent – like human beings – discrete, enclosed worlds; they have their beginning, middle and end. And yet, however much all of our experience relies on our perception of these key moments, that’s not necessarily how the world feels. Even complicated books are simple in that they reveal – we can see their length before we even read them – the sense of design that is hidden from us, or absent, in our own lives. Even something seemingly random or illogical (the giant helmet that falls from nowhere, crushing Conrad to death in Horace Walpole’s proto-gothic novel The Castle of Otranto (1764), for example) is deliberate; recognisably something dreamlike, from the human imagination, rather than truly random as the world can be.
What we call history (“things that have happened”) usually can’t quite manage the neatness of even the most bizarre or surreal fiction. There have been genuine, almost superhuman hero/antihero/demigod figures, but how often – even when we can see their entirety – do their lives have the satisfying shape of a story? Granted, Caesar, stabbed twenty-three times by his peers in the Senate chamber, has the cause-and-effect narrative of myth; but it’s an ambiguous story where the hero is the villain, depending on your point of view. Whatever one’s point of view in The Lord of the Rings or Harry Potter, to have sympathy with someone referred to (or calling themselves) a ‘dark lord’ is to consciously choose to be on the side of ‘bad’, in a way that defending a republic as a republic, or an empire as an empire, isn’t.
Or take Genghis Khan – ‘he’ conquered (the temptation is to also write ‘conquered’, but where do you stop with that?) – obviously not alone, but as sole leader – as much of the world as anyone has. And then? He remained successful, had issues with his succession, and died in his mid-60s in uncertain, rather than dramatic or tragic, circumstances. The heroes of the Greek myths often have surprisingly downbeat endings (which I didn’t know about from the children’s versions I read), but they are usually significant in some way, and stem from the behaviour of the hero himself. Napoleon, old at 51, dying of stomach cancer or poisoning, a broken man, is not exactly a classic punishment from the Gods for hubris, or an end that anyone would have seen coming, let alone would have written for him. As ‘chosen ones’ go, Jesus is a pretty definitive example, and whether accepted as history or as fiction, he has an ending which, appropriately for god-made-man, manages both to fit with the stuff of myth (rising from the dead and ascending to heaven) and to be mundane in a way we can easily recognise; he isn’t defeated by the Antichrist or by some supreme force of supernatural evil, but essentially killed by a committee, on the orders of someone acting against their own better judgement. More than anything else in the New Testament, that has the ring of truth to it. A significant detail too, for those who want to stress the factual basis of the gospels, is that the name of the murderer himself*, unlike the nemeses of the ancient heroes, wasn’t even recorded.
* I guess either the guy nailing him to the cross, or the soldier spearing him in the side (much later named as Longinus, presumably for narrative purposes)
And if Jesus’s nemesis was disappointingly mundane, when on occasion the universe does throw up something approximating a “dark lord”, it doesn’t counter them with ‘chosen ones’ to defeat them either, as one might hope or expect. We still live in the shadow of WW2, and Hitler’s messy and furtive end – suicide, beleaguered and already beaten – somehow isn’t good enough, so there are a variety of rival theories about what ‘really’ happened, all of which fit more pleasingly with the kind of fiction we all grow up with. Mussolini was strung up by an angry, faceless mob and his corpse was defiled. Hirohito, meanwhile, survived defeat as his troops were not supposed to do, and presided over Japan’s post-war boom to become one of the world’s longest-reigning monarchs. The moral of the story is that there is rarely a moral to the story. And did the ‘heroes’ fare much better? The victors of Yalta lived on to die of a haemorrhage just months later, on the eve of the unveiling of the UN (FDR); to be voted out of office, dying twenty years later a divisive figure with an ambiguous legacy (Churchill); and to become himself one of the great villains of the century, with a reputation rivalling Hitler’s (Stalin).
Entertainment programs us to view history as the adventures of a series of important ‘main characters’ and how they shaped the world. It’s perhaps as good a ‘way in’ as any – like Frodo taking the ring to Mordor when no human can, or Biggles (almost) single-handedly defeating the Luftwaffe, it makes a kind of sense to us. But the distorted version of history it gives us is something to consider; think of your life and that of (name any current world leader or influential figure; apologies if you are one). If the people of the future are reading about that person, what will that tell them about your life? And what is ‘history’ telling you about, really? Things that happened, yes, but prioritised by whom, and for what purpose? This is an argument for reading more history, not less, I think. Other people may be the protagonists in books, but in our own history we have to take that role.
Artists (and historians too, in a different way) share their humanity with us, and there are great artists – you’ll have your own ideas, but William Shakespeare, Sue Townsend, Albrecht Dürer, Mickalene Thomas, Steven Spielberg and James Baldwin seem like a random but fair enough selection – who somehow have the capacity or empathy to give us insights into human beings other than (and very different from) themselves, somehow created entirely from their own minds and their own perceptions of the world. But just like them, however aware we are of everyone else and of existence in all its variety, we can only be ourselves, and, however many boxes we seem to fit into, we can only experience the world through our own single consciousness. If there’s a chosen one, it’s you. If there’s a dark lady or a dark lord, it’s also you.
This review may not be fair to writer/filmmaker Faith A. Pennick and her excellent book – not because I didn’t like it (it’s great) but because, since I was sent the book (by now on sale), events that don’t need mentioning here have overtaken it a bit. On the plus side, probably more people have more time to read and listen to music than they have had in living memory, so maybe it’s not all bad. And Pennick’s book, among other things, is an extended argument for really listening to an album, as opposed to just letting it play while you do other things.
If you read my review of Glenn Hendler’s Diamond Dogs book you will probably have realised that I have quite a lot to say about Bowie (in fact, one of the few moments of pride in my writing career, such as it is, is that I got to write an obituary for Bowie in an actual print magazine – and that, on reading it now, I still agree with myself, which is not always the case!), whereas with D’Angelo’s Voodoo the opposite is true; Hendler was adding to my knowledge of an artist I love, while Pennick is telling me about someone I previously knew almost nothing about. As I mentioned in that previous review, as a music journalist you find people are never shy about telling you that what they essentially want is the music, not the writing; but for me, most good writing has an element of Thomas Hardy’s dictum about poetry – “The ultimate aim of the poet should be to touch our hearts by showing his own” – and in the case of the music writer that means engaging you (or rather me) whether or not one has an interest in the music itself. Here Pennick scores very highly; the narrative of how she came to know and love Voodoo manages to remain direct and personal while also bringing in all of the cultural/historical and musical context necessary to be more than a kind of diary entry.
I came to the book thinking that I didn’t know anything by D’Angelo at all*, and, while setting the scene, Pennick invokes a list of artists that is – to my taste in music – both encouraging (Erykah Badu, De La Soul, Angie Stone) and, though admittedly important, off-putting (Michael Jackson, Lauryn Hill). But as it turns out, the fact that I didn’t know D’Angelo’s ‘greatest hits’ is not all that surprising; a key point in Pennick’s book is about how D’Angelo’s career was defined, for better or worse, by the video for Untitled (2000) – but that single didn’t chart in the UK, and if I was aware of him via osmosis at all, it would have been from the trio of singles from his previous album Brown Sugar that made the Top 40 here five years earlier.
*in fact, I should have known that his vocals (and sometimes his musicianship) appear on records by people like Q-Tip and The Roots that are more my cup of tea than his own music.
But by 2000, even if Untitled had been a hit here, the chances are I would never have seen that video. Like many people of my generation, I had a pretty good grip on what was in the Top 40, whether I liked it or not (and usually I liked it not), up until the mid-90s, when Top of the Pops (TOTP), the UK’s Top 40 music TV show, was moved from its classic Thursday night slot to a Friday. This may seem a little thing, but for background: during my childhood there were only four (and pre-1982 only three) TV channels, which meant that, if a family watched TV at all, there was a pretty good chance they were watching the same things as you; and most people I knew watched TOTP – so all through school, what was at number one was common knowledge (to be fair, it probably still is for school-age kids). By the mid-90s (actually, any time after one’s own taste had formed), watching the show was largely a kind of empty ritual or habit, but still: it did give, pre social media, a general sense of where pop music and pop culture were at, at any given time.
In 2000, when Voodoo was released (I am surprised now to find that TOTP was still on at that time, albeit not in the classic slot and beginning the slow decline that ended in cancellation in 2006), aside from odd bits of experimental hip hop heard through my brother, like Kid Koala’s Carpal Tunnel Syndrome, classics like The Wu-Tang Clan’s The W and occasional forays into UK indie like Badly Drawn Boy, I was rarely listening to any music recorded after around 1975; Bowie, Funkadelic, Lou Reed, John Cale, early 70s funk, old blues and early Black Sabbath were probably what I listened to the most. So D’Angelo passed me by; not that I think I would have liked Voodoo much at that time anyway.
But Faith A. Pennick is persuasive; I listened to Voodoo. And she is not wrong; despite lyrics that veer from great to obnoxious (just a personal preference, but I don’t think I’ve ever heard a song I liked for more than one listen whose theme is how great the performer of that song is), the album is meticulously put together, perfectly played with skill and heart – and, to my surprise, with a beautifully organic sound – and in the end the only thing that puts me off it, while in no way reducing its stature, is D’Angelo’s voice(s). It’s not that he isn’t a great singer – he clearly, demonstrably is – but the album coincides with/crystallises that period when R’n’B vocals tended to consist of multi-layered murmuring and crooning. I didn’t really like it then and it’s still not for me now – although the immediate and noticeable lack of autotune is incredibly refreshing. I used to love robot voices as a kid, but now that the slight whine of autotuned vocals is ubiquitous whenever you turn on the radio, it’s nice to hear someone who can sing, singing. In fact, for me, if you pared the vocals on Voodoo down to one main, direct voice and gave it the clarity of the drums and bass, I’d like the album a lot more; but it wouldn’t be the same album, and that’s my deficiency, not Voodoo’s.
For me, the main strength of D’Angelo’s Voodoo (the book) is the way that Pennick weaves together her own personal relationship with album and artist and the album’s cultural and socio-political background. Voodoo wouldn’t sound the way it does without Prince or 60s and 70s funk and soul; but neither could it have come from someone without D’Angelo’s own personal background in gospel and the African-American church, and Pennick, as an African-American woman, responds to the album in ways that would be inaccessible to a white, male writer in Scotland if not for her book. Why an album sounds the way it does is always personal to the artist, but also specific to the era and culture they come from, and how an audience – on a mass or individual level – responds to that album adds depth to the work and determines its stature. Pennick brings these strands together seamlessly; concise, informal and yet powerful, the book is, in its own quiet way, a virtuoso performance, just as Voodoo is.
At some point in the late fifteenth century, the poet Robert Henryson (who lived in Dunfermline, not too far from where I’m writing now), began his Testament of Cresseid with one of my favourite openings of any poem:
Ane doolie sessoun to ane cairfull dyte
Suld correspond and be equivalent.
Robert Henryson – The Testament of Cresseid and Other Poems, my edition Penguin Books, 1988, p. 19
I don’t think I knew, word for word, what he was saying when I first read it, but I did get the meaning: essentially that miserable/sad times (‘doolie’, which I guess would be ‘doleful’ a few hundred years later; not sure what it would be now) call for tragic/sad/grim (“cairfull”, literally ‘full-of-care’) poetry, and the words, with their mixture of strangeness and familiarity (people in Scotland have not talked like that for many centuries, but I think that being attuned to the accents and patterns of speech here still makes it easier to understand), stayed with me. The poet goes on to talk about the weather; apparently it was an unseasonable Lent in Fife that year, when “schouris of hail can fra the north discend/that scantlie fra the cauld I micht defend.” Despite impending climate disaster, Fife weather hasn’t changed beyond all recognition, it seems; it was only two weeks ago – though it seems far longer now – that I was caught in a hailstorm myself.
The season is still doolie however; not because of the weather, but because of the pandemic sweeping the world, one unlike any that Henryson would have known, but which probably wouldn’t have surprised him; one of the key elements he brought to the Troilus and Cressida story in The Testament of Cresseid is its heroine being struck down by leprosy and joining a leper colony.
The cover of my copy of his poems has a drawing from a medieval manuscript of a figure which would have been familiar to most readers at the time: a leper with a bell, begging for alms.
In fact, with dependable cosmic irony (or, if you are less fatalistic, normal seasonal progress), the weather, since ‘stay home’ began trending online and quarantine was officially recommended, has been beautiful here. The streets are fairly, but not yet eerily, quiet. So this particular dyte (the old word that Henryson used referred to his poem, but I think it stems ultimately from the Latin dictum and can apply to any piece of writing) may not seem especially gloomy (and may in fact be quite sloppy), but it is certainly careful in the sense that Henryson intended. It’s quite easy – and I think reasonable – to be optimistic about the state of the world in April 2020, but not, I hope, possible for anybody with any sense of empathy to be unconcerned about it.
There are some silver linings to the current situation (major caveat: so far); as well as, inevitably, bringing out the worst in some people, a crisis also brings out the best in many more. And a whole range of major and minor plus points, from a measure of environmental recovery to time to catch up with reading, have emerged. For me, one of the nicest things to come out of the crisis so far is – thanks to social media – the way that arts institutions, while physically almost empty, have begun to engage online with a wider range of people than those who are likely to, or physically able to visit the galleries themselves.
It has been said that Edward Hopper is the artist who has captured this particular moment, and it’s true that his vision of loneliness in the metropolis particularly mirrors our own age of social media and reality TV, in that it is voyeuristic* – we are not looking at ourselves, or at an absence of people, we are looking at other people whose isolation mirrors our own. If there’s something about this particular pandemic that sets it apart from the Spanish flu of 1918-19 or the great plague of 1665 or the Black Death of 1348-9, or any of the devastating outbreaks of disease that sweep the earth from time to time, it’s that online we are (a ridiculous generalisation perhaps, but if you’re reading this chances are you have internet access at least) sharing the experience of isolation; surely in itself a relatively new phenomenon, at least on this kind of a scale. When Daniel Defoe wrote in his fictional memoir of the 1665 plague (and it’s worth remembering that, although he was only five when the plague swept London, he would have had the testimony of many who had survived as adults as well as whatever shadowy memories he himself had of the period)
Passing through a Token-house Yard, in Lothbury, of a sudden a casement violently opened just over my head, and a woman gave three violent screeches, and then cried “Oh! Death, Death, Death!” in a most inimitable tone, and which struck me with horror and a chilness in my very blood. There was nobody to be seen in the whole street, neither did any other window open; for people had no curiosity now in any case; nor could any body help one another
Daniel Defoe, A Journal of the Plague Year, 1722, my copy published by Paul Elek Ltd, 1958, pp. 79-80
he was depicting a situation which many people could no doubt relate to – after the fact. What we have now is a sense of shared helplessness in real time; this has never existed, quite in this way, before. Assuming some kind of return to normality, we (I’m not entirely sure who I mean exactly by ‘we’) will know each other better than we ever have; something to have mixed feelings about, no doubt.
*not a criticism; visual art is voyeurism
The current appeal of Edward Hopper’s paintings of lonely figures is humanistic and easy to explain. His art, with its depiction of strangers quietly sitting in anonymous places – people who, paradoxically, we can never know and never know much about, but who we can easily relate to – is profoundly empathetic. It belongs to a long tradition of quiet loneliness, or at least alone-ness, that stretches back in Western art to the seventeenth century and the art of Vermeer (it’s easy to forget, as its children, but the idea of art reflecting the individual for reasons other than wealth and status is an essentially Protestant one*), through artists like Arthur Devis (though I’m not sure he intended the quiet melancholy in his paintings) and Vilhelm Hammershoi (who did). In fact, Hammershoi’s beautiful turn-of-the (19th-20th)-century paintings are, if anything, even more relevant to stay-at-home culture than Hopper’s diner, bar and hotel-dwelling urbanites. With Hopper, we are often watching – spying on – his characters from the outside, as if through a pair of binoculars; with Hammershoi, we are shut in with them, like ghosts haunting their silent rooms.
*really the only ‘lonely’ figures in pre-Protestant European art are Christ himself (think of the utter solitary misery of the crucified Jesus in Grunewald’s Isenheim altarpiece) and of course Judas, or those who, like him, have separated themselves from Christianity. There is a terrifying solitary quality in some depictions of saints during martyrdom, but for their contemporary audience it was essential to bear in mind that they were not spiritually alone (note: this may be a completely false assertion)
But Hopper’s most discussed and shared works now are those where we seem to catch, as we do from a train window, a momentary glimpse of a life that is utterly separate from our own. It’s a feeling I associate with childhood and (very) specifically with travelling through Edinburgh in the winter and seeing glimpses of people at windows, and the high ceilings of Georgian houses in the New Town when Christmas decorations were up. Who were all these people?
But there are Edward Hopper paintings too – including some of my favourites, like Early Sunday Morning (see below) – where the only human presence is the artist, or the viewer; where Hopper could claim (though I have no idea if he would have), like Christopher Isherwood, “I am a camera with its shutter open, quite passive, recording not thinking.”* But recording, for a human being, is thinking. And the picture of a place-without-people is rarely as simple as it seems, even in the case of an actual photograph: someone had to be there to photograph it, and had their human reasons for doing so. The tradition of landscape painting exemplifies this; landscapes may be mythical, romantic, realistic, but they have been recorded or edited or invented for a variety of complex human reasons. The landscape painting of earlier eras was often self-consciously beautiful, or psychologically charged (Caspar David Friedrich is the classic example: landscape as a personal, spiritual vision; in some ways, in fact, his work, with its isolated or dwarfed human figures, is kind of like a romantic-era Hopper), but the fact that the urban landscape is itself an artificial, human-constructed environment gives it a different, poignant (if you are me) dimension.
*Christopher Isherwood, Goodbye To Berlin, in The Berlin Novels, Minerva 1992, p. 243.
The appeal of the empty urban landscape in art is perhaps hard to explain to those who don’t see it, but I think it’s worth examining. There is a utopian tradition beginning with (or at least exemplified by) the ‘ideal cities’ produced in Italy in the late 15th century that is in a strange way misanthropic (or at least anthro-indifferent) in that the tranquil geometric perfection of the imaginary cities can only be made less harmonious by the introduction of human figures. But it’s also important to note that these cityscapes actually pre-date landscape painting for its own sake in western art by a few centuries. I don’t think it’s much of an exaggeration to say that in the medieval and renaissance period, the urban landscape had a far greater claim to represent paradise than the natural one. The garden of Eden was a garden after all, not a wilderness, and even the word paradise denotes a walled enclosure in its original Persian meaning. We might think now of paradise existing beyond the realms of human habitation, but in ages where the landscape was mainly something perilous to be passed through as quickly as possible on your way to safety, the controlled human landscape had a lot to be said for it.
Like the Renaissance ‘ideal city’, the beautiful post-cubist-realist paintings of Charles Demuth have a sense of perfection, where the severe but harmonious geometry of his industrial buildings seems to preclude more organic shapes altogether.
But if Demuth shows an ideal world where human beings seem to have designed themselves out of their own environment, the ideal cities of the renaissance, with their impossibly perfect perspectives, are something more primal and dreamlike; prototypes, in fact, for the examinations of the inner landscape of the subconscious as practised by proto-surrealist Giorgio de Chirico and his actual-surrealist successors. De Chirico’s eerie ‘metaphysical’ cityscapes are essentially the ideal renaissance cities by twilight, and artists like Paul Delvaux used the extreme, telescoped perspectives of the early renaissance to create their own prescient sense of urban displacement. Why the kind of linear perspective that sucks the eye into the distance should so often be, or feel like, the geometry of dreams is mysterious – one plausible explanation is that it’s the point of view that first forms our perception of the world, the low, child’s-eye view that renders distances longer and verticals taller; we may be the hero (or at least the main protagonist) in our dreams, but that definitely doesn’t mean we dominate them.
The use of isolated human figures, as in Delvaux and Hopper’s work, gives us a ‘way in’ to a picture, something human either to relate or respond to (although Delvaux – like Magritte in Not To Be Reproduced (1937) – emphasises the loneliness, and again the ultimately unknowable nature of human beings, in Isolation by showing the figure only from behind), but the cityscape that is devoid of life, or which reduces the figures to ciphers, has a very different appeal.
Whereas the unpopulated landscape may suggest a prelapsarian, primordial or mythical past, or an entirely alien realm altogether, empty streets are just that: empty. These are utilitarian environments designed specifically for human beings, and their patterns reflect our needs. A meadow or hillside or mountain with no visible sign of human life may be ‘unspoiled’; towns and cities, by this definition, come ‘pre-spoiled’, and the absence of people raises questions where a natural landscape usually doesn’t: where are the people? What has happened?
That said, nothing about Hopper’s Early Sunday Morning, Algernon Newton’s paintings of Kensington (or Oguiss’s Paris, or indeed the beautiful photographs of the city in Masataka Nakano’s Tokyo Nobody (2000)) really suggests anything ominous or post-apocalyptic, but even so, the absence of life is the most noticeable thing about them. Whether intended or not, this gives a picture a psychological depth beyond that of a simple topographical study. As with the use of musical instruments within a still life painting (whether there to express the fleetingness of time, or the lute with a broken string to denote discord etc) the inclusion of something with a specific purpose (roads, paths, buildings) apparently not fulfilling that purpose, creates a response as complex as – though very different from – the feeling of looking at those lonely figures in Hopper and Hammershoi’s paintings. Not so different in fact, from the feeling of leaving your home in the spring of 2020 and walking down the deserted street outside.
These paintings can have a slightly uncanny quality reminiscent of (or vice versa) the eerie opening scenes (the best parts) of movies like The Omega Man (1971) and 28 Days Later (2002) or John Carpenter’s classic Escape From New York (1981) where, emptied of people, any sign of life in the city becomes, not a sign of hope, but threatening and full of sinister power. Things will hopefully never reach that point in the current crisis, but as it is, avoiding people in the street is for now the new norm; for the first time I can remember, my natural reserve feels almost like a plus.
Those 15th century ‘ideal cities’ were part of the flowering of the renaissance and, as with every other aspect of it, they were the product of people looking backwards as much as forwards. The actual, non-ideal cities that were lived in by the artists who painted the pictures were largely organic, messy, medieval conglomerations, regularly visited by outbreaks of disease. The ideal city’s emptiness is not only harmonious and logical, it’s clean. And, like the classical sculptures, bleached white by time and weather, which were to prove so influential on that generation of artists, the aspiration is towards a kind of sterile perfection which never really existed until long after the culture that created the buildings and the art had disappeared, leaving a ghostly husk of its former self.
The deserted city or townscape more or less disappears from art from the 15th century until the later years of the industrial revolution, when urban life itself became the subject for modern art. And it makes sense; the reversal in European culture which saw city life become perilous and the countryside become a means of escape was a slow one, and the solution (never more than a partial one) lay in building programmes, urban renewal and harmonious town planning. Empire building and colonial expansion fuelled the growth of urbanisation and were fuelled by it; to imagine an empty city at the height of Empire was to imagine extinction. If there was any remaining collective memory of empty streets in the late 19th century, it was probably an echo of the kind of scenario that Defoe had written about*; less graced by the muses of harmony than haunted by the dead.
*or of natural disasters like drowned villages, or man-made catastrophes like the Highland Clearances.
But by the late 19th century, in Europe, plague was less a current concern than it was gothic horror, the memory of a memory, and industrialisation had – for those with a measure of financial security – rendered the city (now with drains and public transport) and the country (now sans dangerous animals and medieval lawlessness) on something of an equal footing. For the generation of the impressionists, both city and country could be celebrated, and both (as has been true ever since) could mean escape. But when the streets are empty, that impressionist cliché, the ‘bustling metropolis’ – defined by Baudelaire’s “fleeting, ephemeral experience of life in an urban metropolis”, the hub of modernity, the engine of culture and progress – becomes something else; it can never just be a collection of buildings.
Not surprisingly, perhaps, it seems that to some degree the art of the deserted street is a kind of outsider art: Maurice Utrillo was an alcoholic with mental health issues and, although literally at the centre of the Parisian art scene in Montmartre – he was born there to an artist mother – he was nevertheless a marginal figure, and his paintings of his home district are heavy with melancholy and isolation.
Similarly, although far less gloomy, the Montmartre paintings of Maria Slavona – a foreigner, a German Impressionist living in Paris – are depictions of an urban landscape that, while not hostile, is enclosed and other, and (to me) brings to mind the close of Philip Larkin’s Here: “Facing the sun, untalkative, out of reach.” Whether that mood is inherent in the paintings, or only in the mind of the person looking at them, is not something I can answer.
The German artists of a later generation found a similar sense of alienation at home. The Neue Sachlichkeit (‘new objectivity’) movement of the Weimar Republic may have been a rejection of the extremes of Expressionism and romanticism, but in its embracing of modernity it was a specifically urban movement too. The teeming street scenes of George Grosz and Otto Dix reflected the sometimes chaotic street life of Germany’s big cities amid the social and economic upheaval that followed World War One, much as Alfred Döblin’s Berlin Alexanderplatz (1929) was to do in literature; but there were other views of the city too.
It was an era of political unrest, but if one thing united the political left and right it was the understanding that they were living in an essentially transitional period; that change would, and must come.
Hans Grundig was the epitome of the kind of artist hated by the Nazi party; politically a communist, he used his art to oppose the creeping rise of fascism, but also to capture working-class life in the city (in his case Dresden). But in Thunderstorm (Cold Night), 1928, it is the environment itself that condemns the society of the declining republic: the streets are empty and ghostly pale; the buildings, run down and near-derelict, offer little shelter and no comfort; and the people, whose fate looked uncertain, are nowhere to be seen. Meanwhile, a storm approaches.
Carl Theodor Protzen was, by contrast, an establishment figure; a member of the Association of Fine Artists and the German Society for Christian Art, he was to become a pillar of the Nazi art community. Urban landscapes were his speciality and his depictions of Nazi building projects were to make his name, but just prior to the NSDAP’s rise to power in 1933, he was painting pictures like Lonely Street (1932) that show those same urban landscapes, but without the excitement of progress. Less bleak and doom-laden than Grundig’s city, this is nevertheless an environment which does not embrace or protect humankind; the title reflects the child’s exclusion from the harshly geometric scene in which he finds himself and, although there is no sense of exaggeration, the perspective, as in surrealism, pushes the end of the road ever further into the distance.
This perspective is seen too in Volker Böhringer’s The Road to Waiblingen, painted in the year that the Nazis came to power. Böhringer, an anti-fascist painter, was later to become a surrealist, and the ominous (blood-stained?) road, stormy clouds and sinister trees suggest that this is (with apologies to Waiblingen) not a road that he saw leading anywhere very pleasant.
Ever since I was a child, I’ve always loved to visualise (usually at night) a real place, say a nearby hilltop or field, as it is at that moment, with nobody except animals and birds there to see or experience it. It’s a strange kind of excitement that depends on not being able to experience the thing you’re excited about: psychology probably has a term for it – but at a time when people have never been more inescapable (not that one necessarily wants to escape them) there is something appealing about the complex landscapes we have created for our needs, but without the most complex element of all – ourselves – in them. Whether we enjoy the empty streets or not (and hopefully we don’t have to get too used to them), we should probably take the time to look at what is all around us; it’s a rare chance to see our world without us getting in the way.
I don’t often post book reviews here, but I was lucky enough to be sent review copies of the two newest additions to Bloomsbury’s always-interesting 33⅓ series of books: David Bowie’s Diamond Dogs by Glenn Hendler (hopefully the spelling of his name will be consistent on the cover of the non-advance edition) and D’Angelo’s Voodoo by Faith A. Pennick, which I’ll cover in a different post.
Hendler’s book was of immediate interest; I’ve been listening to David Bowie’s Diamond Dogs (1974) for literally (though not continuously) half of my life. When I first started this blog, names for it that I rejected included ‘The Glass Asylum’ (from the song Big Brother) and ‘Crossroads and Hamburgers’ (actually based on a mishearing of a line in the perhaps-best-ever Bowie song (or group of songs) Sweet Thing/Candidate/Sweet Thing (reprise), which is really ‘the crossroads of hamburgers and boys’ – arguably a better name for a website, but perhaps overly misleading. The Glass Asylum already exists and is anyway not especially relevant. But I’ll name this site properly one day).
For years, Diamond Dogs was my favourite Bowie album, only pushed into second or third place (it changes quite often; currently #1 is Station to Station and #2 is Young Americans) because I listened to it so much that it had become hard to listen to without skipping bits.
But despite listening to it to the point where I felt like I knew every second of the album, and reading a lot about Bowie over the years (though not the lyrics apparently – I presume I just thought I knew them), Glenn Hendler’s little (150 page) book taught me a lot that I didn’t know and hadn’t considered – and, even better – sent me back to the album with fresh ears, and made me fall in love with it all over again.
As a semi-professional music journalist myself (Hendler, incidentally, isn’t one; he’s a Professor of English, though he writes on a variety of cultural & political topics), I’m very aware that there are many people who believe that music writers should focus solely on the music at hand and leave themselves out of it. This is, thankfully, not how the 33⅓ series works, and in fact none of my own favourite music writers – Charles Shaar Murray, Jon Savage, Caitlin Moran, Lester Bangs etc etc – write from any kind of neutral position. And really, anything about music beyond the biographical and technical information is subjective anyway, so better to be in the hands of someone whose writing engages you. For me, the test of good music journalism (not relevant here, but it will be for the Voodoo review) is whether the writer can make you enjoy reading about music you don’t already know, or maybe don’t even like – something which all of the aforementioned writers do.
33⅓ books always begin with something about the writer’s history with the music that they are talking about – and it’s surprising the difference this makes to a book. For me, reading the opening chapter of Mike McGonigal’s My Bloody Valentine’s Loveless (Loveless came out when I was at high school, and I was very much a fan of the scene that had grown up in the long gaps between MBV’s releases: Ride, Lush, Slowdive, Curve etc etc etc) was such a strange experience – he describes encountering the band’s music in what comes across very much as a grunge, ‘alt-rock’ milieu – that, although I liked the book very much, it felt so far removed from how I saw the band that it was oddly dislocating, as it would be to read a sentence that began “Wings frontman Paul McCartney” or, more pertinently to this article, “David Bowie, vocalist of Tin Machine.”
Anyway; in this case, the author’s relationship with his subject stretches all the way back to his first real encounter with the music – and strangeness – of Bowie, when, as a 12-year-old, he saw The 1980 Floor Show on NBC’s Midnight Special, filmed in 1973, which acted as a kind of fanfare for the as-yet-unreleased Diamond Dogs. This setting is important, because anyone coming to Bowie now has grown up with all of his incarnations – and the fact that he had various different personae – as background. I first knew him as the barely-weird-at-all Bowie of Let’s Dance, a pop star who was not noticeably stranger or even (stylistically/musically at least) obviously older-looking than the other acts in the charts at the time (also in the top ten during Let’s Dance’s reign at number one were the Eurythmics (Sweet Dreams (Are Made of This)), Bonnie Tyler (Total Eclipse of the Heart) and Duran Duran (Is There Something I Should Know)). The fact (not in itself so unusual in the UK) that Bowie had an earlier existence as some kind of glam rock alien of indeterminate gender was almost invariably commented upon by DJs and TV presenters in the 80s, and that is a very different thing from becoming aware of him when he was a glam rock alien of indeterminate gender – especially since, in the USA at least, he was yet to really break, and in ’74 was a cult figure with a surprisingly high profile rather than one of the major stars of the previous two years.
In his book, rather than making a chronological, song-by-song examination of the album (though he does dissect every song at some point), Hendler examines the array of different inspirations (musical, literary, cultural, political, technical) that informed the writing and recording of the album, as well as looking at where it lies in relation to Bowie’s work up to that point. Those inspirations – Orwell’s Nineteen Eighty-Four (Bowie’s original intention was to write a musical based on the book, but after that was vetoed by Sonia Orwell he incorporated the material he’d written into Diamond Dogs), Andy Warhol and the superstars of his Factory, some of whom were then in the UK production of his play Pork, the gay subculture of London and the post-apocalyptic gay subculture of William Burroughs’s novels, Burroughs & Brion Gysin’s ‘cut-up’ technique, Josephine Baker, A Clockwork Orange, the soul and funk that was to take centre stage on Young Americans, the Rolling Stones, the post-industrial decay and unrest of Britain in the mid-70s – are all audible to varying degrees on Diamond Dogs, a kind of linguistic stratigraphy* that mirrors the album’s layers of sounds and instruments and makes it both aurally and figuratively one of Bowie’s most richly dense albums.
*thankfully, Glenn Hendler never writes as pretentiously as this
When reading the book, two passages that other writers wrote about the Diamond Dogs era came to mind, both of which I think reinforce Hendler’s own conclusions about the album:
“it […] single-handedly brought the glam rock era to a close. After Diamond Dogs there was nothing more to do, no way forward which would not result in self-parody or crass repetition”
David Buckley – The Complete Guide To The Music of David Bowie*, Omnibus Press, 1996, p. 37
*incidentally, an intriguing detail reported by Buckley but sadly not mentioned in Hendler’s book is that the territory of ‘Halloween Jack’ (the only named member of the Diamond Dogs), who ‘lives on top of Manhattan Chase’, was inspired by stories told by Bowie’s father (who at one point worked for Barnardo’s) of homeless children living on the rooftops in London.
And, even more to the point:
The last time I’d seen him [Bowie] had been the last day of 1973, and he’d been drunk and snooty and vaguely unpleasant, a game player supreme, a robot amuck and careening into people with a grin, not caring because after all they were only robots too; can trash be expected to care about the welfare of other trash?
Since then there’d been Diamond Dogs, the final nightmare of glitter apocalypse.
Charles Shaar Murray, ‘David Bowie: Who was that (un)masked man?’ (1977) in Shots From The Hip, Penguin Books, 1991, p. 228
This sense of Diamond Dogs’ apocalyptic extremism is addressed throughout Hendler’s book; the record may not be a concept album in any clear, narrative sense (indeed, the Diamond Dogs, seemingly some kind of gang, are introduced early on but only mentioned once thereafter), but its fractured, non-linear progression and its musical maximalism (which should be a thing if it isn’t) actually imbue the album with a far stronger overall identity than Ziggy Stardust or Aladdin Sane had before it. In fact it works more like a kind of collage than a conventional story. Related to this, an important point that the author brings up early on concerns the role of the Burroughs/Gysin cut-up technique. Although this is often used to explain (or rather, not explain) the more lyrically opaque moments in Bowie’s 70s work, Hendler stresses that it was a creative tool rather than a kind of random lyric generator. As with the use of Eno’s Oblique Strategies cards on Low a few years later, the cut-up was used as a way of stimulating the imagination, not bypassing it. The lyrics to songs like Sweet Thing clearly benefit from the use of randomised elements, but these were then used to create lyrics which have an internal sense but which, crucially, also scan and rhyme when needed – something that would be fairly unlikely in a purely random process. The result is something like the experimental fiction that JG Ballard had pioneered earlier in the decade (most famously in The Atrocity Exhibition), which comes across as sometimes-gnomic bulletins from the unconscious, filtered through a harsh, post-industrial geography, but never as random gibberish. What Hendler draws attention to (and what I had never consciously noticed in all my years of listening) is the strangely dislocated perspectives of the album’s songs, where the relationships between narrator, subject and listener are rarely clear-cut and often change within the course of a single song.
The most obvious example is in one of the book’s best parts, the exploration of Sweet Thing/Candidate/Sweet Thing (reprise) (the crossroads and hamburgers song). Although, lyrically, the song’s focus is all over the place, it never feels disjointed, and until reading about it, I’d never really considered how ambiguous it all is. Although seen through a kind of futuristic lens, thanks to the album’s loose concept (established by the album’s sinister and slightly silly intro, Future Legend), when I listen to it now, it feels very much like a condensed/compressed 70s version of Hubert Selby Jr’s notorious Last Exit To Brooklyn (1964), with its shifting viewpoints and voices and its pitiless depiction of what was – for all the novel’s controversy – normal life for many people in the underclass of any big city. Like Selby, Bowie doesn’t help the audience by indicating who is speaking or when, but places us in the centre of the action (essentially violent gangs and male prostitutes), making the listener, in fact, (at times) the ‘sweet thing’ of the title (though at other times Bowie adopts that role too) – not that that had ever occurred to me before. It’s a mixture of menace, sleaze and impending violence, the ‘glam’ sheen of glam rock rendering it all at once romantic and dangerous – and full of unexpected details. I had obviously always heard the line ‘Someone scrawled on the wall “I smell the blood of Les Tricoteuses”’, but I had never bothered to find out what ‘Les Tricoteuses’ were (the old ladies who reportedly/supposedly knitted at the foot of the guillotine during the Reign of Terror that followed the French Revolution, it turns out) and therefore didn’t pick up on the way the percussion becomes a military marching snare drum. Bowie was always about theatre, but this song absorbs the theatrical elements so seamlessly into its overall structure that drama/melodrama, sincerity/artifice, truth/deceit, seduction/threat become one vivid and affecting whole. I would say the song is bigger than the sum of its parts, but there are so many parts, going in (and coming from) so many different directions, that I don’t think that’s true – yet it somehow holds together as a song or suite of songs; almost a kind of microcosm of the album itself.
Elsewhere, my other favourite song, We Are The Dead (directly inspired by Nineteen Eighty-Four), is dissected brilliantly, highlighting the way (again, I hadn’t noticed) that Bowie absorbs the key ideas of the novel into his own framework; this is one of the few songs aside from the title track that mentions the Diamond Dogs and, without being jarring (or at least no more than intended), it sets the originally very 1940s characters of Winston Smith and Julia (not that they are named) and Orwell’s timeless themes of power, sex (and the relationship between the two) and totalitarianism in a 70s post-apocalyptic dystopia that owes more to Burroughs and the street-life milieu of Lou Reed’s lyrics than it does to Orwell himself. Like the use of cut-up techniques to stimulate his own imagination, Bowie’s absorption of these disparate elements created something new and powerful that concentrated his interests and obsessions as well as holding up a distorting mirror to the times in which it was created.
But this has gone on long enough and, rather than rewriting or paraphrasing Hendler’s book – one of the best books on Bowie I’ve read – I’ll go and read it again while listening to Diamond Dogs.
A new decade, and the year is flying past already. I intended to write something full of enthusiasm and positivity at the beginning of January, but at that point I was still clumping about in a walking boot and using crutches so it had to wait. I didn’t do my usual ‘records of the year’ for last year either (well I did, but not for this website), and the moment for that has definitely passed. For what it’s worth, my favourite album of the decade 2010-2019 was quite possibly Das Seelenbrechen by Ihsahn. But anyway, it’s Lunar New Year and I’m back in normal shoes, so Happy New Year!
I didn’t make any resolutions as such this year, my general aims though are to read more, write more and resist any of the normalisation of right wing extremism that seems to be carrying seamlessly over from last year. This week the BBC has a show where Ed Balls hangs around with various actual and quasi Nazis (maybe in the name of balance they should send Michael Portillo to hang around with some communists? On a train, if that’s what it takes*), while Channel 4 seems to think what Britain needs is more TV shows about Nigel Farage, presumably trying to get the most out of him while he still has any kind of relevance as a public figure.
* at this point, Around The World With Alan Partridge In A Bullnose On The Left barely feels like parody
So anyway, I am as always working on long, convoluted articles on various topics that aren’t yet finished, so this will be more in the nature of some brief notes and so forth.
In the holidays I re-read (for the first time since childhood) the first three books in Joan Aiken’s Wolves Chronicles, set in an alternative early 19th century Britain where the Stuart monarchy was never deposed and “Jamie III” sits on the throne. As the series starts, the country has been overrun by hungry wolves fleeing the Russian winter, which have arrived through the recently completed channel tunnel (younger readers may need to be reminded that it was in reality completed in 1994). I mention the books (which are much as I remember them; entertaining, well-written and a bit silly) mainly for this passage near the beginning of The Wolves of Willoughby Chase, which, like the young heroine, I have remembered all my life (so far) – although I didn’t know where it was from and vaguely thought it must be Leon Garfield or even CS Lewis. The book is also, it turns out, the place I remember possets (Victorian hot curdled drinks) from. I’ve still never had one – they sound revolting – but reading about them made them seem desirable again.
There was something magical about this ride which Sylvia was to remember for the rest of her life – the dark, snow-scented air blowing constantly past them, the boundless wold and forest stretching away in all directions before and behind, the tramp and jingle of the horses, the snugness and security of the carriage, and above all Bonnie’s happy welcoming presence beside her
Joan Aiken, The Wolves of Willoughby Chase, 1962, pp. 44-5
In the sadly non-alternative present, Britain has a ridiculous prime minister every bit as pantomime-villain-like as Aiken’s villains (she goes in for the kind of Dickensian villain names that seem to preclude the character from being good: “Miss Slighcarp” being the classic example), and the government is issuing, with a typical and presumably deliberate sense of bitter irony, a coin to commemorate the victory of insularity, xenophobia and – most importantly – the protection of the financial interests of a small coterie of people at the centre of power.
In non-alternative Britain, somehow accusations of child abuse do not constitute a ‘royal crisis’ while two of its members making vague gestures towards some kind of unobjectionable normal life does; and maybe this is right. The idea at the heart of monarchy and aristocracy (that is, aristokratia; ‘rule of the best’) is by definition about not being ‘normal’ so perhaps, as we get further and further from the days when the monarchy involved some kind of mystical aspect and what Monty Python (RIP Terry Jones) called ‘supreme executive power’ we should expect all kinds of by-normal-standards transgressions to appear and not be seriously acknowledged by the royals and their fans, while (admittedly approximate) attempts at living ordinary lives will be punished.
I have no intention of going into serious political discussions here, because I don’t want to, but 2020 has seen a minor shift in my own political views, insofar as, although I still regard (and I guess always will regard) nationalism of any kind as regressive and illogical, if there were to be another independence referendum in Scotland tomorrow, I would vote in favour of independence. Not without regret, as I fundamentally believe in internationalism and the principles mocked on the Brexit coin; but at some point, if the government that people vote for is not the one they get – and despite the apparent landslide won by Johnson and co, their support in Scotland is minimal – then something is fundamentally wrong with the system. That said, I’d be wary of writing off the Tories’ 25% of Scotland’s vote as insignificant; 690,000 people is a lot, even in a country of over 5 million. Overall, in fact, the Scottish election results echo those of Britain as a whole, the most noticeable feature being, depressingly, the collapse of anything resembling a left-wing movement. But anyway; in the unlikely event that a referendum is granted by the current parliament, I hope the lessons of Brexit will be learned and that an independence campaign can be well-informed and practical, but also optimistic and aspirational, rather than overwhelmingly negative and defined by the things people don’t want/like/believe in. Too much to ask, perhaps.
Onto more positive things; my friend Paul, who introduced me to the Nouveau Roman, has written a nice introduction to the movement here, which means I have more things I need to read; luckily, I have rejoined a library for the first time in over a decade. And the experimental string group Collectress have finally followed up my favourite album of 2014 (Mondegreen) with Different Geographies, out on 6 March via Peeler Records. It’s a beautiful, mysterious, allusive and elusive record; I’ve not really absorbed it yet, but here’s a nice video –
So, to sum up; it’s all a bit of a mess, but it’s a new year and a new decade, so one might as well be positive and try to do good things. Will write more soon.
This piece of writing was originally supposed to be posted in September, then at Halloween, but now that it’s finally finished maybe November is the right time after all. It’s about those nameless places that are nowhere, or even the ‘middle of nowhere’, and maybe places feel most like nowhere – or, nowhere feels most itself – in November, when as Ted Hughes wrote:
“… After long rain the land
Was sodden as the bed of an ancient lake,
Treed with iron and birdless”
Ted Hughes, ‘November’, from Lupercal (1960), Faber & Faber, p. 49 (my copy is from 1985)
This was, pompously, to be a ‘photo essay’, but the photos are – necessarily, I think, and not unintentionally – a bit drab and nothingy, so I wrote this too. Firstly, I should explain what I mean by ‘nowhere’, and concede straightaway that by now there probably isn’t a place in the world truly deserving that non-name, let alone in a land mass as small and populated as Britain, where, if nothing else, the places I have photographed could be described as being part of Fife, part of Scotland, etc, etc. But still; these are places that have no name that I know of (not the same as having no name, I realise), that are no longer maintained or used for anything (by human beings at least) and that don’t have any special landmarks or signs to say what they are, were, or who, if anyone, owns them.
So, for instance; this is nowhere, there’s not much to see. This particular nowhere has clearly not always existed; it’s the evidence of people having once been here that makes it feel like nowhere, an abandoned place, a place that perhaps used to be somewhere, but isn’t anymore; absence rather than simple emptiness. Unique in its details and at the same time interchangeable with other nowheres, like the nowheres of your childhood; places that writers (especially horror writers) call ‘vacant lots’ or ‘disused yards’, although if you’re there to see them they can’t be all that vacant and if kids play there they aren’t actually disused, so much as re-used.
What was this place? It would probably be relatively easy to find out, but finding out would make it somewhere, even if the name that denoted the place was a dead, ghost name. I remember playing in ‘the factory’ as a child, but ‘the factory’ was just cracked concrete floors and the crumbled remains of walls; which means that it wasn’t a factory. Pedantic, yes (always), but while the names of places like the factory are often just words – ‘gates’ or ‘ports’ that once existed, or nominally ‘new’ places that are actually very old (“The New Forest”) – there are other names we use for places that are in themselves an admission that we don’t know what they are, or were.
Maps mark places of significance with both of these kinds of words; the ones that mean they are somewhere we know something about (tumulus, castle, church), but also the ones that fill gaps in communal memory with blunt, easy-to-understand descriptions designed to keep ‘nowhere’ at bay, like ruin or, better yet, standing stone. These substitute names can themselves become names through the lack of anything better; like Stonehenge, a name that literally means something like ‘stone prehistoric structure’ but, more broadly, means ‘this place was important to people once’.
The fragment of path leading nowhere (see picture) doesn’t have a lot in common with Stonehenge, except that human beings made it, presumably used it, and then abandoned it*. Usually, I don’t have much time for Keats’s “negative capability”, whatever way you describe it (he famously wrote “that is, when a man is capable of being in uncertainties, mysteries, doubts, without any irritable reaching after fact and reason”), because it amounts at times to ‘ignorance is bliss’, and personally I find the poetry of the rainbow in no way reduced by knowledge of how it ‘works’ – quite the opposite, when you consider that human beings apparently see brighter, more colourful rainbows than other creatures do. Just the idea that reality is that subjective, that the number of actual colours depends on who is seeing them, feels like a metaphor waiting to happen, as well as raising the logical idea of other ‘prime colours’ that are beyond the human eye’s ability to see. I remember as a child trying to picture another colour as unrelated to blue, red and yellow as they are to each other, but mainly ‘seeing’ purple or brown; another metaphor-in-waiting, maybe.
* or, more poetically, Wrætlic is þes wealstan, wyrde gebræcon/burgstede burston, brosnað enta geweorc (“Wondrous is this wall-stone; fates broke it, the city crumbled; the work of giants decays”, from the Old English poem ‘The Ruin’).
The appeal of nowhere, when it is noticed enough to have an appeal, can lie in the determination to see the beauty in ordinary things, as in Edward Thomas’s beautifully understated/drab ‘Tall Nettles’:
Tall nettles cover up, as they have done
These many springs, the rusty harrow, the plough
Long worn out, and the roller made of stone:
Only the elm-butt tops the nettles now.
Edward Thomas, ‘Tall Nettles’ (c.1916), Selected Poems of Edward Thomas, Faber & Faber, 1964, p.35
Nowhere also has the appeal of escape, not just the escape from familiar surroundings into somewhere unknown, but maybe the actual evasion of people and consequences, as in Tom Waits’s songs about hair-raising characters dwelling on the margins of society, of which the classic example may be ’16 Shells From A Thirty-Ought-Six’ from Swordfishtrombones (1983):
Plugged sixteen shells from a thirty-ought six
And a black crow snuck through a hole in the sky
And I spent all my buttons on an old pack mule
And I made me a ladder from a pawn shop marimba
I leaned it all up against a dandelion tree…
…Now I slept in the holler of a dry creek bed
And I tore out the buckets from a red corvette
A more gothic, elaborate version of this kind of nowhere appears in Nick Cave’s early work with The Birthday Party, and is taken to a poetic extreme in his first novel And The Ass Saw The Angel (1989), set in a fantasy version of America’s Deep South. At the opposite end of the spectrum is Thomas Hardy’s projection of how he hoped to be remembered in the anthology favourite ‘Afterwards’, with its accumulation of beautifully-observed everyday minutiae (“when, like an eyelid’s soundless blink/The dewfall-hawk comes crossing the shades to alight/Upon the wind-warped upland thorn”) and its near-refrain “He was a man who used to notice such things.”
Although indebted to the poetry-is-everywhere writing of Thomas Hardy, and far removed from the dramatic, lawless nowheres of Tom Waits and Nick Cave, Philip Larkin takes ‘nowhere as escape’ to its logical conclusion in poems like ‘High Windows’ (1967), with its ambivalently yearning ending:
Rather than words comes the thought of high windows:
The sun-comprehending glass,
And beyond it, the deep blue air, that shows
Nothing, and is nowhere, and is endless.
Even on a far less drastic level than Larkin’s biophobia, ‘not knowing’ is a key part of the enjoyment of being in the middle of nowhere. I write ‘not knowing’ rather than ‘mystery’ because mystery suggests a sense of excitement entirely alien to Edward Thomas’s nameless place of nettles, or this blocked-off stairway (left). The pleasure of not knowing (and not wanting to know) needn’t be exciting enough to warrant being called a mystery. There’s an odd building in the local area, on a path that connects a small town with a nearby village, a couple of miles of muddy track over a hill, through woodlands and alongside some fields. The building is one room, the size of a small shed, the side walls close enough to touch with (my) outstretched hands when standing inside. It has a mangled, rusted metal door in the front; so far, so twentieth century. It’s made of (I think) concrete but, crucially, it’s shaped like a pointed arch; that seems odd. What is it? Why is it where it is, on a hill, in some woods, outside a market town? It doesn’t seem like a useful situation for anything or, anyway, not a useful building beyond the sense that any shed is useful. It doesn’t seem to be connected with the farmland that surrounds it, though it could be part of an estate that no longer exists. It’s not eerie exactly (concrete, no windows; it feels more like a portaloo than a cell). But still, that odd, ecclesiastical shape. It was new once, and used for something. But now it’s in the middle of nowhere, and its abandonment creates an odd pang of feeling for people and things long since lost to time; a feeling all the stronger for not being known. So in this case, maybe mystery after all.
I don’t feel like that (not so much, anyway) about just any building with a ‘to let’ sign on it, so why should it be easier to feel some kind of human kinship with the unknown builders of unused paths, or the erectors of giant stones whose meaning is lost? Well, firstly and obviously, because those humans are absent and therefore not annoying; ‘human beings’, yes, but not ones with agendas, attitudes or personalities that we can know about.
And also perhaps because they aren’t around to tell us about their buildings and constructions and more importantly, to mind us looking at them.
Because the ridiculous fact remains that while this place (right) is nowhere, it probably isn’t nobody’s – but ownership of places is a strange and slippery thing. When King Lear finds himself on the heath, a place between places; not a palace, not a hovel, not even a grave, which is at least something:
“Thou wert better in a grave than to answer with thy uncovered body this extremity of the skies… Unaccommodated man is no more but such a poor, bare, forked animal as thou art”
William Shakespeare, King Lear, Act III, Scene IV, Penguin Books, 1972, p.125
he is reduced (I think that’s the right word for what Shakespeare does, though not a concept one necessarily agrees with) to the condition of an animal, albeit a more anguished one than, say, a rabbit seems to be. But crucially, up until the earlier events of the play, the King presumably owned this same bleak and inhospitable heath: whatever that ownership means. If a person can own a place (and clearly they can), what they can’t own is what Shakespeare describes: someone’s experience of a place. The piece of land owned by this developer or that corporation isn’t *the same* as this piece of land, with its enigmatic fragments of structures and their allusive, suggestive qualities.
Self-aggrandising perhaps, but if your life is an adventure, or at least a sequence of events in which (as Ian Livingstone and Steve Jackson would have it) YOU are the hero, then the fact remains that, whether you have deep roots in an area and a family tree stretching back to the dark ages, or you don’t even know who your own parents are, the experience of standing here, in the middle of nowhere, perceiving things with your senses and processing them with your brain, is something no-one else has ever done, and no-one else will ever do, even if everybody knows this is nowhere.