In the 2020s, the police may feel beleaguered by the pressure to account for their actions and to act within the boundaries of the laws they are supposed to be upholding, but despite the usual complaints from conservative nostalgists about declining standards of respect, the question of ‘who watches the watchmen’ (or ‘who will guard the guards’, or however Quis custodiet ipsos custodes? is best translated) is hardly new, and probably wasn’t new even when that line appeared in Juvenal’s Satires in the 2nd century AD.
In the UK (since I’m here), modern police forces (and quasi-police forces like the Bow Street Runners) have almost always been controversial, from their foundations in the 18th century onwards – and not surprisingly so.
It’s probably true that the majority of people have always wanted to live their lives in peace, but ‘law and order’ is not the same thing as peace. The ‘order’ comes from the enforcement of the law, and ‘the law’ has never been a democratically agreed set of rules. So law and order is always somebody’s law and order, but not everybody’s. As is often pointed out, most of the things we regard as barbaric in the 21st century, from slavery and torture to child labour and the denial of universal suffrage, were once technically legal. ‘Respect for the law’ may not just be a different thing from respect for your fellow human beings, it might be (and often has been) the opposite of it; so it’s no wonder that the position of the gatekeepers of the law should often be ambiguous at best.
the Keystone Cops in the nineteen-teens
Popular culture, as it tends to do – whether consciously or not – reflects this uneasy situation. Since the advent of film and television, themes of law enforcement and policing have been at the centre of some of both mediums’ key genres, but the venerable Dixon of Dock Green notwithstanding, the focus is only very rarely on orthodox police officers faithfully following the rules. Drama almost invariably favours the maverick individualist who ‘gets the job done’* over the methodical, ‘by the book’ police officer, who usually becomes a comic foil or worse. And from the Keystone Cops (or sometimes Keystone Kops) in 1912 to the present day, the police in comedies are almost invariably either inept or crooked (or both; but more of that later).
*typically, the writers of Alan Partridge manage to encapsulate this kind of stereotype while also acknowledging the ambiguity of its appeal to a conservatively-minded public. Partridge pitches ‘A detective series based in Norwich called “Swallow”. Swallow is a detective who tackles vandalism. Bit of a maverick, not afraid to break the law if he thinks it’s necessary. He’s not a criminal, you know, but he will, perhaps, travel 80mph on the motorway if, for example, he wants to get somewhere quickly.’ i.e. he is in fact a criminal, but one that fits in with the Partridgean world view
But perhaps the police of the 2020s should think themselves lucky; they are currently enduring one of their periodic crisis points with public opinion, but they aren’t yet (again) a general laughing stock; perhaps because it’s too dangerous for their opponents to laugh at them, for now. But almost everyone used to do it. For the generations growing up in the 70s and 80s, whatever their private views, the actual police force as depicted by mainstream (that is, mostly American) popular culture was almost exclusively either comical or the bad guys, or both.
redneck police: Clifton James as JW Pepper (Live and Let Die), Jackie Gleason as Buford T Justice (Smokey and the Bandit), Ernest Borgnine as ‘Dirty Lyle’ Wallace (Convoy), James Best as Rosco P Coltrane (Dukes of Hazzard); the same but different: Brian Dennehy as Teasle in First Blood
The idiot/yokel/corrupt/redneck cop has an interesting cinematic bloodline, coming into its own in the 1960s with ambivalent exploitation films like The Wild Angels (1966) and genuine Vietnam-war-era countercultural artefacts like Easy Rider, but modulating into the mainstream – and the mainstream of kids’ entertainment at that – with the emergence of Roger Moore’s more comedic James Bond in Live and Let Die in 1973. This seems to have tonally influenced similar movies like Moonrunners (1975; which itself gave birth to the iconic TV show The Dukes of Hazzard, 1979-85), Smokey and the Bandit (1977), Any Which Way You Can (1980) and The Cannonball Run (1981) among others. Variations of these characters – police officers concerned more with the relentless pursuit of personal vendettas than with actual law enforcement – appeared (sometimes sans the redneck accoutrements) in both dramas (Convoy, 1978) and comedies (The Blues Brothers, 1980), while the more sinister, corrupt but not necessarily inept police that pushed John Rambo to breaking point in First Blood (1982) could also be spotted harassing The A-Team (equally, if differently, dysfunctional Vietnam vets) from 1983 to ’85.
iconic movie; iconic poster
In fact, the whole culture of the police force was so obviously beyond redemption as far as the makers of kids’ and teens’ entertainment were concerned, that the only cops who could be the good guys were the aforementioned ‘mavericks.’ These were borderline vigilantes who bent or broke or ignored the rules as they saw fit, but who were inevitably guided by a rigid sense of justice that was generally unappreciated by their superiors. This kind of cop reaches some kind of peak in Paul Verhoeven’s masterly Robocop (1987). Here, just beneath the surface of a straightforwardly fun, violent sci-fi/action movie, the director examines serious questions of ‘law’ vs ‘justice’ and the role of human judgement and morality in negotiating between those two hopefully-related things. Robocop himself is, as the tagline says, ‘part man, part machine; all cop’, but the movie also gives us the pure machine-cop in the comical/horrific ED-209, which removes the pesky human element that makes everything so complicated and gives us instead an amoral killing machine. The film also gives us the good and bad human faces of law enforcement, in the persons of Officer Lewis and OCP executive Dick Jones. Lewis (the always-great Nancy Allen) has a sense of justice no less keen than that of her robot counterpart, but her power is limited by the machinations of the corrupt hierarchy of the organisation she works for, and she’s vulnerable to physical injury. Jones (the brilliant Ronny Cox) is very aware of both the practical and moral problems with law enforcement, but he’s more than happy to benefit personally from them.
Part Man, Part Blue Jeans; All Cop
The following year, Peter Weller (Robocop himself) returned in the vastly inferior Shakedown, worthy of mention because it too features unorthodox/mismatched law enforcers (a classic 80s trope; here it’s Weller’s clean-cut lawyer and Sam Elliott’s scruffy, long-haired cop) teaming up to combat a corrupt police force; indeed the movie’s original tagline was Whatever you do… don’t call the cops. And it’s also worthy of mention because its UK (and other territories) title was Blue Jean Cop, though it sadly lacked the ‘part man, part blue jean; all cop’ tagline one would have hoped for. Into the 90s, this kind of thing seemed hopelessly unsophisticated, but even a ‘crooked cops’ masterpiece like James Mangold’s Cop Land (1997) relies, like Robocop, on the police – this time in the only mildly unconventional form of a good, simple-minded cop (Sylvester Stallone) – to police the bad, corrupt, too-clever police, enforcing the rules that they have broken so cavalierly. The film even ends with the explicit statement (via a voiceover) that crime doesn’t pay; despite having just shown the viewer that if you are the police, it mostly seems to, for years, unless someone else on the inside doesn’t like it.
There’s always an ironic focus on ‘the rules’ – ironic because the TV and movie police tend to be bending them à la Starsky and Hutch (and the rest), or ineffectually wringing their hands over that rule-bending, like the strait-laced half of almost every mismatched partnership (classic examples being Judge Reinhold in 1984’s Beverly Hills Cop and Danny Glover in Lethal Weapon, another famous ‘unorthodox cop’ movie from the same year as Robocop), or even disregarding them altogether like Clint Eastwood’s Dirty Harry. So it’s no surprise that the training of the police and the learning of those rules should become the focus of at least one story. Which brings us to Police Academy.
the spiritual children of the Keystone Cops
Obviously any serious claim one makes for Police Academy is a claim too far. It’s not, nor was it supposed to be, a serious film, or even, possibly, a good film, and certainly not one with much of a serious message. But its theme is a time-honoured one; going back to the medieval Feast of Fools and even further to the Roman festival of Saturnalia, it’s the world turned upside down, the Lords of Misrule. And in honouring this tradition, the film tells us a lot about the age that spawned it. Police Academy purports to represent the opposite of the approved behaviour of the police in 1984 and yet, despite its (not entirely unfounded) reputation for sexism and crass stereotypes, it remains largely watchable where many similar films do not. But, more surprisingly, it also feels significantly less reactionary than, say, the previous year’s Dirty Harry opus, Sudden Impact.
While it’s a trivial piece of fluff, Police Academy is notable for – unlike many more enlightened films before and since – passing the Bechdel test. Don’t expect anything too deep, from the female characters or anyone else, but it also has noticeably more diversity among its ensemble cast than the Caddyshack/National Lampoon type of films that were in its comedy DNA. Three prominent African-American characters with more than cameo roles in a mainstream Hollywood movie may not seem like much – and it definitely isn’t – but looking at the era it feels almost radical. At this point in Hollywood history, let’s not forget, the idea for a film where a rich white kid finds that the easiest way to get into college is by disguising himself as a black kid not only got picked up by a major studio, but actually made it to the screen.
In that context, these three actors – Marion Ramsey, Michael Winslow and the late Bubba Smith – could look back on a series of movies which may not have been* cinematic masterpieces, but which allowed them to use their formidable comedic talents in a non-token way. More to the point, their race is neither overlooked in a ‘colourblind’ way (they are definitely Black characters rather than just Black actors playing indeterminate characters) nor portrayed in a negative sense. Police Academy is not an enlightened franchise by any means; the whole series essentially runs on stereotypes and bad taste and therefore has the capacity to offend pretty much everyone. But although there are almost certainly racial slurs to be found there, alongside (for sure) gross sexism, homophobia etc, the series is so determined to make fun of every possible point of view that it ends up leaving a far less bad smell behind it than many of its peers did; perhaps most of all the previously alluded-to Soul Man (1986). *ie they definitely aren’t
Despite its essential good nature though, there is a genuine, if mild, kind of subversion to be found in the Police Academy films. With the Dickensian, broadly-drawn characters comes a mildly rebellious agenda (laughing at authority), but it also subverts, in a more subtle (and therefore unintentional? who knows) way, the established pattern of how the police were depicted. Yes, they are a gang, and as such they are stupid and corrupt and vicious and inept, just like the police of Easy Rider, Smokey and the Bandit, The Dukes of Hazzard etc. Unlike all of those films and franchises though, Police Academy offers a simple solution in line with its dorky, good-natured approach; if you don’t want the police to suck, it implies, what you need to do is recruit people who are not ‘police material.’ In the 1980s those who were not considered traditional ‘police material’ seemingly included ethnic minorities, women, smartasses, nerds, and at least one dangerous gun-worshipper, albeit one with a sense of right and wrong that was less morally dubious than Dirty Harry’s. So ultimately, like its spiritual ancestors, Saturnalia and the Feast of Fools, Police Academy is more of a safety valve that ensures the survival of the status quo than a wrecking ball that ushers in a new society. Indeed, as with Dickens and his poorhouses and brutal mill owners, the message is not – as you might justifiably expect it to be – ‘we need urgent reform’, but instead ‘people should be nicer’. It’s hard to argue with, as far as it goes, but as always seems to be the case*, the police get off lightly in the end.
The Boys in Blue (1982). Christ
*one brutal exception to this rule is the risible 1982 Cannon & Ball vehicle The Boys In Blue, roughly the UK equivalent of Police Academy. After sitting through an impossibly long hour and a half of Tommy and Bobby, the average viewer will want not only to dismantle the police force, but also to set fire to the entire western culture that produced it.
Thomas Braithwaite of Ambleside making his will (1607, artist unknown)
The dying man glows with sickness in his mildewy-looking bed, the light seeming to emanate from where he sits, crammed into the airless, box-like room. He signs his will while his friend looks on intently with concern and restrained grief.
The artist who painted Thomas Braithwaite of Ambleside making his will in 1607 may not have been considered important enough as an artist (still a person of relatively low social status in northern Europe, though this was starting to change with painters like Rubens and his pupil Anthony Van Dyck) to warrant signing the picture or having their name recorded at all, except perhaps in the household accounts – but they were important as a witness, and the painting is itself a kind of legal document, although it’s more than that too. The great enemy of the Elizabethan and Jacobean ages wasn’t death, with which most adults would have been on very familiar terms, but disorder and chaos*; and this, despite its tragic appearance, is a painting devoted to the age’s great virtue: order. Both the dying lord (an inscription records that Thomas Braithwaite, of gentry stock, died on 22 December 1607, aged 31) and his friend George Preston of Holker are identifiable – to those who knew them by their likenesses, and to those who didn’t, by their coats-of-arms. Biblical texts tell us that Thomas Braithwaite was a virtuous man, but so does the painting itself; this is a man who, even while he lay dying, took care of his business. His passing is tragic, but, he reassures us, it will cause only grief and not inconvenience.
*see EMW Tillyard, The Elizabethan World Picture, Pelican Books, 1972, p.24
We talk about religious faith now as a kind of choice as much as a belief system, but for all its paranoia about atheism – and all the subsequent romanticism about that era’s new spirit of humanism – the Tudor and Stewart ages had inherited a world view in which the existence, not only of God and Heaven and Hell, but of the essential hierarchy of existence, was more or less taken for granted. We may differentiate arbitrarily now between religion and superstition, but for the people in these cramped and airless paintings there was no real contradiction between, say, Christianity and astrology, because in accepting without exception the primacy of God the creator, it all works out in the end – everything that has ever existed and everything that will ever exist, already exists. Perhaps human beings aren’t supposed to divine the future, but God has written it and the signs – comets, unseasonal weather, the movement of the stars and the behaviour of animals – are there to be read and interpreted by anyone with the nerve to do so.
John Souch – Sir Thomas Aston at his Wife’s Deathbed (1635)
In an off-kilter, vertigo-inducing room that seems almost to unfurl outwards from the skull at its centre, an illogical space hung with black velvet, a man and his son, looking outwards, but not at us, stand by the deathbed of their wife and mother, while a glamorous young woman meets our gaze from where she sits, apparently on the floor at the foot of the bed.
There’s virtue in this painting too, but mostly this one really is about death. It’s there at the centre, where the lord’s hand sits on a skull, recalling the kind of drama which was then passing out of fashion, just as this kind of painting was. The skull, like the black-draped cradle (with its inscription that reads He who sows in flesh reaps bones), acts as a vanitas motif, focussing the viewer’s attention on the shortness of life, but also recalls the enthusiastically morbid writing of men like John Webster and Thomas Middleton. Sir Thomas and his wife had grown up in an England where plays like Middleton’s Revenger’s Tragedy often featured soliloquies over the remains of loved ones. Sir Thomas Aston is not being consumed by a desire for revenge, but his hand on the skull can’t help recalling Hamlet, or even more so, anti-heroes like Middleton’s Vindice, who opens The Revenger’s Tragedy contemplating the skull of his fiancée;
My study’s ornament, thou shell of death/once the bright face of my betrothed lady/When life and beauty naturally fill’d out/these ragged imperfections,/when two heaven-pointed diamonds were set/in those unsightly rings – then ’twas a face/so far beyond the artificial shine/of any woman’s bought complexion (The Revenger’s Tragedy, Act 1 Sc. 1, in Thomas Middleton, Five Plays, ed. Bryan Loughrey & Neil Taylor, Penguin Books, 1988, p.73)
Sir Thomas, unlike Vindice, displays the correct behaviour for a grieving man with an orphaned young son – not the deadpan ‘stiff upper lip’ restraint of later generations of British gentlemen (though he is a dignified figure), but the kind of behaviour noted in books of etiquette like the anonymous Bachelor’s Banquet of 1603, which states that if
in the midst of this their mutual love and solace, it chanceth she dies, whereat he grieves so extremely, that he is almost beside himself with sorrow: he mourns, not only in his apparel for a show, but unfeignedly, in his very heart, and that so much, that he shuns all places of pleasure, and all company, lives solitary, and spends the time in daily complaints and moans, and bitterly bewailing the loss of so good a wife, wherein no man can justly blame him, for it is a loss worthy to be lamented.
(The Bachelor’s Banquet, in The Laurel Masterpieces of World Literature – Elizabethan Age, ed. Harry T. Moore, Dell Books, 1965, p.324)
It is perhaps this behaviour we should read in Sir Thomas’s sideways glance, not the hauteur of the nobleman but the remoteness of the recently bereaved. His black sash is adorned with a death’s head brooch; he and his young son (also Thomas) are to be considered men of the world; to their left a globe sits on a tapestry decorated with elephants. But all their worldly knowledge and faith is no help here; the two Astons grasp a cross staff bearing the inscription, The seas can be defined, the earth can be measured, grief is immeasurable. Given this display of intense, but restrained grief, the smiling girl – the only person who makes eye contact with us – is a strange figure, despite her beautiful mourning clothes, and it may be that she is the lady in the bed, as she looked in happier times, there to show us, and remind father and son, of what they are missing.
David Des Granges – The Saltonstall Family c.1636-7
On what looks like a shallow stage opening onto a bed in a cupboard, a strangely-scaled set of figures pose stiffly, only the older child meeting our eye with a knowing smirk, although the strangely capsule-like baby seems aware of us too.
As in the Souch painting, the father figure dominates, just as such men dominated their households; the household being a microcosm of the state, the state itself a microcosm of the universe.* Mr Saltonstall, despite being at the apex of a pyramid of hierarchy that allowed him absolute power, does not look devoid of compassion or warmth – indeed, he has had himself depicted holding the hand of his son, who himself mirrors (in, it has to be said, a less benign-looking way) this gesture of casual mastery, holding his little sister’s wrist, demonstrating just how the links in this chain of family work. And the family is inside the kind of house familiar nowadays to the heritage tourist as a mirror of the world that produced it; mansions like overgrown doll’s houses, big on the outside, but strangely cramped and illogical inside, with peculiar little wood-panelled rooms and an ancient smell of damp.
Dorothea Tanning – A Family Portrait (1954)
The nakedness of the power structure here isn’t subtle; and it isn’t supposed to be, because it wasn’t there to be questioned but accepted. Virtue lies in following God’s system of organisation; any suggestion to the contrary would make it an entirely different kind of painting. And indeed when painting – and painters – achieved a higher social standing in the century that followed, the messages became more subtle, only reappearing in anything like this blatant form in western art in the post-Freudian era, with a painting like Dorothea Tanning’s 1954 A Family Portrait. But Tanning’s painting is a knowing representation of a reality she was aware of, one which by then had the force of tradition alone. Its appearance in the mid-17th century reflects the reality of the age; the truth, if not the only truth.
*EMW Tillyard, The Elizabethan World Picture, p.98-9
Richard Dadd – The Fairy Feller’s Master Stroke (1855-64)
The first impression, looking at these kinds of paintings, is something like looking at fairyland through the distorting lens of Richard Dadd’s insanity centuries later: comical and disturbing, familiar and illogical. These painters of the Elizabethan and Jacobean tradition (their art died out at around the same time as Charles I did, in the middle of the seventeenth century) – Souch, Des Granges, William Larkin and their many nameless contemporaries – were at the tail end of a dying tradition that would be replaced by something more spacious, gracious, modern and ‘realistic’; but ‘realistic’ is a loaded word and it’s entirely likely that this older tradition captures their world more accurately. We don’t need a time machine (though it would be nice) – a visit to almost any castle, palace or stately home is enough to confirm that the velvet curtains and classical paraphernalia of a Rubens or Van Dyck portrait do not tell the whole story of their era, even among the tiny demographic whom their art served. It is a world that we would probably find dark and claustrophobic; witness the smallness of the furniture, the lowness of the doorways and the dark paintings of dead ancestors, and this – regardless of the fact that it is partly due to what would later be seen as incompetence* – is what is preserved in this tradition of painting, as well as in the homes these people left behind.
* it’s a matter of fact that the average artist drawing a superhero comic in the 20th/21st century has a better grasp of mathematical perspective – and the idea of perspective at all – than even the more accomplished Elizabethan or Jacobean portrait painter
William Larkin: a great painter who could have learned something from John Buscema & Stan Lee’s ‘How to Draw Comics the Marvel Way’ (1978)
This is the kind of art that the Renaissance and its aftermath is supposed to have made obsolete – but though the word ‘art’ may owe its origin to its nature as something artificial, it also tells the truth, or a truth, regardless of its creators’ intentions. But if I’m implying that it’s realistic rather than idealistic, what does ‘realistic’ mean? Those who deride ‘modern art’ (a meaningless term, since the art it usually refers to is often post-dated by art – like Jack Vettriano’s, for instance – that is not considered to be ‘modern’) usually assume that modern art is a kind of aberration, a straying from a realistic norm*. But when looked at as a whole (or as much of a whole as is possible from a particular cultural viewpoint) it becomes quickly apparent that art that is ‘realistic’ in the narrowly photographic sense is a tiny island in the vast ocean of art history – and what is more, relies on ideas – such as the opposition of ‘abstract’ and ‘realistic’ – that may have no currency whatsoever outside of the Western tradition.
visions of war: Picasso’s Guernica (1937) and Robert Taylor’s Struggle For Supremacy (2001)
Even within Western cultures, the idea that photographic equates to experiential is debatable; despite the persistence (outside of academia) of the idea that Picasso was primarily an artist who painted noses on the wrong side of heads etc, a painting like his Guernica clearly has more in common with images of war as it was experienced in the 20th century – even vicariously through cinema and TV – than the kind of ‘war art’ that my granddad had on his walls: beautiful paintings in a tradition that lives on through artists like Robert Taylor, visions of war where the fear and panic become excitement and drama, an altogether easier thing to be entertained by.
*A classic example of this attitude came from Philip Larkin, who, when writing about modernism in jazz, digressed to cover all of the arts, noting
All that I am saying is that the term ‘modern’ when applied to art, has a more than chronological meaning: it denotes a quality of irresponsibility peculiar to this [ie the 20th] century… the artist has become over-concerned with his material (hence an age of technical experiment) and, in isolation, has busied himself with the two principal themes of modernism, mystification and outrage. (Philip Larkin, All What Jazz, Faber & Faber, 1970, p.23)
Picasso was trying to capture the feel of his century – but most of the great courtly artists of the sixteenth and seventeenth centuries – the Renaissance masters who became household names – were trying to capture something loftier, to escape the more earthy, earthly aspects of theirs, not least because they were the first generation to attain something like the status that Picasso would later enjoy: artists as creators and inventors, not craftsmen and recorders. And therefore that feeling of the life of the times shines through more vividly in the work of artists like John Souch and David Des Granges. The 17th century was a time when the world – even the world inhabited by the aristocracy – was in one sense far smaller than it is today, while the wider world seemed correspondingly bigger and more dangerous, but also perhaps richer or deeper; just as these people – often married by 12 or 14, learned (if they were allowed to learn) by 20, old by 40 – were both smaller and bigger than we are.
This kind of painting, part portrait, part narrative, was uniquely suited to the lives it recorded, and in one late example its strengths can be contrasted with those of the baroque style that swept it away. In 1613, Nicholas Lanier was a rising star in the English court, composer of a masque for the marriage of the Earl of Somerset. Around this time he was painted by an unknown artist in the semi-emblematic tradition of artists like John Souch. There are references – the classical statue, the pen and paper with its mysterious inscription (RE/MI/SOL/LA) – that highlight that this man is more than just a lutenist, but at the same time he is most definitely that, and the artist has taken care to render realistically Lanier’s muscles as he holds the instrument; an artist yes, but a workman of sorts too. By 1632, Lanier was the Master of the King’s Music and a trusted envoy of King Charles, who even sent him on picture-buying missions. And it is this gentleman that Van Dyck captures: aloof, authoritative, not someone we can picture sweating over a difficult piece of music.
Nicholas Lanier (1613) by an unknown artist (left) and Nicholas Lanier (1632) by Anthony van Dyck (right)
With the art of Van Dyck, the courts of Britain were to discover an ideal of aristocratic indifference which would partly define the project of British imperialism and which is, unfortunately, still with us today. But the truth of Van Dyck’s age, and those which preceded him was stranger, darker and more human. And it’s there still, in those damp-smelling big-small houses, and in the art that died with King Charles.
a cat? a cry for help from the depths of the classroom
There are relatively few times in life when it’s possible to switch off your mind and enter a trance-like state without going out of your way to do so; but sitting in a classroom for a period (or better yet, a double period) of whatever subject it is that engages you least is one of those times. When the conditions are right – a sleepy winter afternoon in an overly warm room maybe, with darkness and heavy rain or snow outside and the classroom lights yellow and warm, the smell of damp coats hung over radiators and a particularly boring teacher – the effect can be very little short of hypnotic. The subject will be a matter of taste; for me the obvious candidate, Maths, I actively detested, but I think that something like Geography or ‘Modern Studies’ (strangely vague subject name), where I wasn’t so much concerned with not understanding and/or hating it, would be the optimum ‘trance class’.
I think every school jotter I had between the ages of 5 and 18 had this on the back, and it never went un-altered; fragments of the Metallica logo, and ???
There’s nothing like school for making you examine the apparently stable nature of time; if, as logic (and the clock) states, the 60 or so minutes of hearing about ‘scarp-and-vale topography’ really was about the same length of time as our always-too-short lunch hour, or even as was spent running around the rugby pitch, then clearly logic isn’t everything, as far as the perception of human experience is concerned.
Darth Vader, axes, spears…
But it would not be true to say that I did nothing during these long, barren stretches of unleavened non-learning. Mostly, I doodled on my school books. Sometimes this was a conscious act, like the altering of maps with tippex to create fun new supercontinents, or the inevitable (in fact, almost ritualistic, after 7 years of Primary school) amending of the fire safety rules that were printed on the back of every jotter produced by The Fife Regional Council Education Committee. Often these were just nonsensical, but even so, favourite patterns emerged. I had a soft spot for “ire! ire! ire! anger! anger! anger!” (in the interests of transparency I should probably point out that I was almost certainly unaware at the time that ire means anger), and the more abstract “fir! fir fir! Dang! Dang! Dang!” (see?), but some things like ‘Remember Eire hunts – Eire kills’ were fairly universal. But also, there was the whiling (or willing) away of time by just doodling, in margins, on covers, or, if the books didn’t have to be handed in at the end of the class, just anywhere; band logos and Eddies* and cartoon characters. Later, towards the end of my high school career, there’s a particularly detailed and baroque drawing of a train going over a bridge (something I wouldn’t have had much patience for drawing in an actual art class) which immediately summons up the vivid memory of a particularly long Geography class, and even of which pen – a fine felt tip I liked but couldn’t write neatly with** – I drew it with.
possibly not fully engaged with learning – but I do remember that this was a Geography lesson
*Eddie = ‘Eddie the head’, Iron Maiden’s beloved zombie mascot, created – and painted best – by Derek Riggs
**i.e. ‘I wrote even less neatly than usual with’
adventures in abstract art: a scowling Eddie face, a strange man and some kind of tornadoes
If I could recall the things I was supposed to learn in classes this well I would have done much better at school. But the point of doodling is that it’s whatever it is your hand draws when your brain isn’t engaged; or, as André Breton put it, drawings that are ‘dictated by thought, in the absence of any control exercised by reason, exempt from any aesthetic or moral concern.’*
This is in fact from his definition of what surrealism is; ‘psychic automatism in its pure state’ and later, in The Automatic Message (1933) Breton went further, influenced by his reading of Freud, specifically referencing what would later become known as art brut or ‘outsider art’ – drawings by the mentally ill, visionaries, mediums and children – as ‘surrealist automatism’. Although it might seem to – well, it definitely does – give too much dignity and importance to the time-wasting scrawls of teenagers to consider them anything but ephemeral, the strange faces, swords, cubes, eyes, tornadoes and goats that littered my school books aged 12-14 or so do seem to preserve, not just the kind of pantheon almost every child/teenager has – made up of favourite bands, TV shows, cartoon characters etc – but a kind of landscape of enigmatic symbolism that comes from who-knows-where and perhaps represents nothing more than the imagination crying for help from the heart of a particularly stimulus-free desert. But in the end, that’s still something.
boredom made flesh(y)
*André Breton, Manifesto of Surrealism 1924, published in Manifestoes of Surrealism, Ann Arbor paperbacks, tr. Richard Seaver and Helen R. Lane, 1972, p.26
Aged 20/1586 James 6/By Grace of God King of Scotland
Was it a cold morning in Edinburgh in 1586 when James VI, only twenty years old, very aware of his status as a divinely-appointed monarch, but already with a lifetime’s experience of human nature and earthly politics, sat in front of Adrian Vanson to be painted? Was he nervous? His watchful eyes suggest not, but his position, though finally secure, probably didn’t feel very stable; just three years earlier he had been imprisoned by those ruling in his name, and this year, although he signed a treaty of mutual defence with England against the possibility of a Catholic invasion, his mother, whom he had succeeded, remained in England, alive and imprisoned. Was Vanson nervous? Or was it just another job? The King wasn’t always noted for his good temper, but the artist, who had come to Scotland from the Netherlands via London (where he had an uncle), already knew James, and had first painted some pictures for the young King in 1581, before his imprisonment, and, in happier circumstances, the year before this portrait, had painted a more glamorous and light-hearted portrait of the King to be taken abroad and shown to prospective suitors. But this picture, sombre, stern even, is about power; James 6th, by the grace of God King of Scotland. When we look at this painting, at this sulky-looking young man, we are making some kind of connection, looking through the eyes, albeit via the hand, of a Dutch man who died around 420 years ago. The painting – even if, by the standards by which art is usually judged, it’s ‘not great’ – has a personal value, one human being, recorded by another, as well as a cultural one. It tells us something about fashions, lifestyles, the way a king could be depicted in that country, in that period (for all his divinity he is not an iconic figure), class structures, religion – but what is it “worth”? What is any work of art worth?
James again, when both he and the artist were a long 9 years older
Leaving aside metaphorical, metaphysical or aphoristic answers, or going into a much more long-winded but possibly worthwhile conversation about what art is (I’m going to say it’s a deliberate act of creation, but even that is arguable), let’s assume we know what art is. Googling ‘art definition’ initially brings up five presumably definitive and certainly iconic pictures: the Mona Lisa, The Starry Night (both as famous as their creators, pretty much), Les Demoiselles d’Avignon (whose creator – Picasso – is more famous than the painting), The (or rather Leonardo’s) Last Supper and A Sunday Afternoon on the Island of La Grande Jatte, which I think is probably more famous as an image than a title, and the image is more famous than its creator, Seurat.
What are these paintings worth? I’m sure facts and figures are available, but this is not – despite the age of some of the paintings – about intrinsic worth; I imagine there is a basic going rate for an early 16th century Italian renaissance portrait on panel (and so forth), but that has little to do at this point with the price of the Mona Lisa. The painting would be just as good (or just as whatever you think it is) if the artist was unknown, but the value has – and always has had – a lot to do with Leonardo da Vinci and the perception of him as more than just someone who painted good portraits.
a (but not “the”) Mona Lisa, an early copy probably by one of Leonardo’s apprentices
Separating the art from the artist is always a difficult and controversial subject, but it should really be easier in the visual arts than in almost any other field. Yes, artists have their own ‘voice’ or visual language, but that is not the same as reading their actual words, or hearing their actual voice; and yet – because, I guess, of market forces – artists are routinely known and valued above and beyond their works, and those works – even their doodles and fragments – are valued accordingly. A scrawled caricature in a margin by Leonardo (or Picasso) can be “worth” many times what a highly finished, technically brilliant oil painting by an unknown artist is. This disconnect happens because although art history is human history, “the art world” as it has existed since at least the 19th century is more like horse racing – take away the money and what you have is a far smaller number of people who are genuinely interested in how fast a horse can run.
Which is fine – but the question of what a painting (for instance) is “worth” has become the way art is engaged with popularly; somehow art, unlike sport, has never earned its own daily segment on the news, and really it only appears there when the sums it raises are enormous (Leonardo’s Salvator Mundi), the sums lost are enormous (theft, fires, vandalism), or it’s part of a story that’s interesting in itself (Nazi art hoards, previously undiscovered ‘masterpieces’ etc). But the veneration of artists above art – now at the very peculiar stage at which a painting “after” (that is, not by, and possibly not even from the same era as) a famous ‘old master’ can be worth a far higher sum than a genuine painting by a lesser known ‘old master’ – masks the true value of art, which may be cultural, but is ultimately always personal.

Even without any knowledge of King James or his life, we are able, if we can see – just by being human – to make certain assumptions about the kind of person he was, and what he may have been thinking or feeling on that day in 1586. This kind of empathy is an act of the imagination; if we are mind-reading it is ultimately our own mind we are reading – but no more so than when we meet eyes with a stranger on the street or on a train. And if looking at Vanson’s King James is – because we can find out these facts – a connection with both an immigrant living in what must have in many ways been an unfamiliar country, and with a young man who had recently attained some kind of power, not only over his own life, but over a country, at the cost of his mother, then what of a painting like the Mona Lisa? It is, regardless of how compromised it has become by fame, monetary value and endless theorising, a link with the mind and ideas – and hand – of Leonardo and a kind of communication with the sitter herself. She was probably Lisa del Giocondo, she may have already been dead; but although I stand by all of the above, what I seem to have suggested is that a painting is a kind of code to be broken or a museum to be explored and unpacked. These things enrich our understanding of, or connection with, a painting, but they don’t make it.

What makes art so fascinating – but also why it doesn’t have five minutes on the news every night – is that it’s so individual. It’s (VERY) possible to not care in the slightest about the outcome of, say, a rugby or football match, but the final score is the final score, regardless of how anyone feels about the quality of the game or the skill of the players. It would not be satisfactory somehow to have a football match where no points were awarded and the outcome of the game depended on how you feel about it. But in art it is completely respectable – and I don’t think wrong – to say (to paraphrase the great surrealist painter Leonora Carrington): if you really want to know what the Mona Lisa’s smile means, think about how it makes you feel.
Composition in White, Black, Red and Grey (1932) by Marlow Moss
This might seem like reducing art to the level of ‘human interest’, but what else is there? The choice of figurative paintings with a possible narrative element is a matter of taste and makes the human element unavoidable. But if we feel intense emotion when looking at a Mark Rothko painting, a sense of peace and calm from a Mondrian, Marlow Moss or Hans Arp picture, or exhilaration in front of a Peter Lanyon work, the fact remains that ‘we feel’ (or ‘we don’t feel’) is the common denominator. Viewers through the ages who have detected echoes of divine order and harmony in the works of Piero della Francesca or Fra Angelico have only detected them with any certainty within their own perceptions, which is not to say that they aren’t feeling something the artist himself felt. There’s a philosophical, ‘tree falling in the woods’ point here; is Van Gogh’s ‘Sunflowers’ a work of emotional and artistic intensity after the gallery lights go out? Or is it more like a kind of magic spell or booby trap, triggered only when a spectator is there to observe it?
That said, figurative art, especially portraiture, is – however many layers of information are contained in it – relatively easy to ‘understand’ on a basic level; ie if we can see, we can see what it is. It is the understanding and appreciation that remains entirely individual and subjective. Conceptual art – shockingly still around in much the same forms as it has been since the 60s – is, despite its apparently interpretation-inviting name, less transparent. This means that, unlike something we instantly recognise, it’s – initially at least – only as powerful as its visual impact. And in fact, whereas familiarity invites interpretation in traditional art, it tends to – on a popular level at least – repel it in conceptual art. The controversy surrounding classic media-frenzy conceptual pieces like Carl Andre’s pile of bricks, or Tracey Emin’s unmade bed, arises because everyone knows exactly what a pile of bricks, or a sleeping bag or a bed is, and they don’t feel the need or desire to think further about it; and if they do, they feel – no doubt wrongly – that they are putting more thought into it than the artist did.
Comedian (2019) by Maurizio Cattelan; Carl Andre – Equivalent V (1966-69)
That is the ‘philistine’ response and it’s easy to have sympathy with; personally, I don’t mind wondering what a conceptual work means, but if I get no kind of emotional or cerebral response from looking at it in the first place then I’d rather the artist had just written their ideas down. This is me and my deficiency though – if Maurizio Cattelan put his heart and soul into taping that banana to the wall – or even if he just enjoyed doing it – who am I or anyone else to devalue that? And if whoever paid that much money for it is getting some similar experience, or just the satisfaction of being the owner of the most expensive banana in the world – then that’s hard to argue with too.
Portrait of an unknown woman by an unknown artist c.1725
I don’t think it devalues art – quite the opposite – to think of it as a form of communication between individuals, even if, as mentioned above, it is really communication with the one person you will ever know with any certainty – yourself. What I seem to be saying (which I may not entirely agree with) is that art is a mirror. Take this beautiful painting from around 1725, by an unknown artist, of an unknown lady. To me, this is a real connection with this unknowable person – but again, only as unknowable as any face that passes you in the street never to be seen again – she was a real person, sitting in a room, around 300 years ago, probably wearing something she liked or that told the world how she wanted to be seen, being painted by someone – and by 1725 it could have been a man or a woman – who may have been engaging, impatient, chatty… We can only guess and extrapolate from the picture. That extrapolation will be different every time depending on the viewer and their own knowledge, not just of history, but of people and experience. If 7.6 billion people look at the picture it becomes in essence 7.6 billion pictures, 7.6 billion mirrors.
That is not to say that the picture is ‘better’ than Cattelan’s banana. If I came across the banana taped to a wall anywhere except an art fair, would I see it as art? In a way yes, in the sense that it is literally artificial – not the fruit itself, but its location would clearly be a deliberate, human act and not – as a nail in a wall might be – something that could feasibly have a purely utilitarian meaning. It would be puzzling – far more so, in fact, than in an art fair, where the (surely expected by the artist) first reaction of most non-art world people would surely be the eye-rolling ‘so this is ‘art’ is it?’ Whether it would be intriguing or thought-provoking seems less likely, except insofar as provoking thoughts like ‘who put that banana there and why?’ Which I guess is perfectly valid – and in its own way a genuine connection between the viewer’s and the artist’s minds, though not something that would probably take up much brain space after the initial wondering. But then many – even most – people (whether or not they would approve of it as art vs the banana) might just as well look at the woman in her fine dress 300 years ago, or the young King James, and pass on without wondering anything at all.
Between the ages of 14 and 16 or thereabouts, the things I probably loved the most – or at least the most consistently – were horror (books and movies) and heavy metal.
These loves changed (and ended, for a long time) at around the same time as each other in a way that I’m sure is typical of adolescence, but which also seemed to reflect bigger changes in the world. Reading this excellent article that references the end of the 80s horror boom made me think: are these apparent beginnings and endings really mainly internal ones that we only perceive as seismic shifts because of how they relate to us? After all, Stephen King, Clive Barker, James Herbert & co continued to have extremely successful careers after I stopped buying their books, and it’s not like horror movies or heavy metal ground to a halt either. But still; looking back, the turn of the 80s into the 90s still feels like a change of era and of culture in a way that not every decade does (unless you’re a teenager when it happens, perhaps?). But why should 1989/90 be more of a dividing line than, say, 85/86? Although time is ‘organised’ in what feels like an arbitrary manner (the time it takes the earth to travel around the sun is something which I don’t think many of us experience instinctively or empirically, as we do night and day), decades do seem to develop their own identifiable ‘personalities’ somehow, or perhaps we simply sort/filter our memories of the period until they do so.
“The 80s” is a thing that means many different things to different people; but in the western world its iconography and soundtrack have been agreed on and packaged in a way that, if it doesn’t necessarily reflect your own experience, it at least feels familiar if you were there. What the 2010s will look like to posterity is hard to say; but the 2020s seem to have established themselves as something different almost from the start; whether they will end up as homogeneous to future generations as the 1920s seem to us now is impossible to say at this point; based on 2020 so far, hopefully not.
I sometimes feel like my adolescence began at around the age of 11 and ended some time around 25, but still, my taste in music, books, films etc went through a major change in the second half of my teens which was surely not coincidental. But even trying to look at it objectively, it really does seem like everything else was changing too. From the point of view of a teenager, the 80s came to a close in a way that few decades since have done; in world terms, the cold war – something that had always been in the background for my generation – came to an end. Though that was undoubtedly a euphoric moment, 80s pop culture – which had helped to define what ‘the west’ meant during the latter period of that war – seemed simultaneously to be running out of steam.
“The 80s” (I actually owned this poster as a kid, which seems extremely bizarre now)
My generation grew up with a background of brainless action movies starring people like Arnold Schwarzenegger and Sylvester Stallone, who suddenly seemed laughable and obsolete, and teen comedies starring ‘teens’ like Andrew McCarthy and Robert Downey Jr, who were now uneasily in their 20s. We had old-fashioned ‘family entertainment’ like Little & Large and Cannon & Ball, which was, on TV at least, in its dying throes; but then so was the ‘alternative comedy’ boom initiated by The Young Ones, as its stars became the new mainstream. The era-defining franchises we had grown up with – Star Wars, Indiana Jones, Ghostbusters, Back to the Future, Police Academy – seemed to be either finished or on their last legs. Comics were (it seemed) suddenly¹ semi-respectable and re-branded as graphic novels, even if many of the comics themselves remained the same old pulpy nonsense in new, often painted covers. The international success of Katsuhiro Otomo’s Akira in 1988 opened the gates for the manga and anime that would become part of international pop culture from the 90s onwards.
the 80s: book covers as faux movie posters – black/red/metallic; extremely non-psychedelic
Those aforementioned things I loved the most in the late 80s, aged 14-15 – horror fiction and heavy metal music – were changing too. The age of the blockbuster horror novel wasn’t quite over, but its key figures – Stephen King, James Herbert, Clive Barker², Shaun Hutson – all seemed to be losing interest in the straightforward horror-as-horror novel³, diversifying into more fantastical or subtle, atmospheric or ironic kinds of stories. In movies too, the classic 80s Nightmare on Elm Street and Friday the 13th franchises – as definitively 80s as anything else the decade produced – began to flag in terms of both creativity and popularity. Somewhere between these two models of evolution and stagnation were the metal bands I liked best. These seemed to be either going through a particularly dull patch, with personnel issues (Iron Maiden, Anthrax), or morphing into something softer (Metallica) or funkier (Suicidal Tendencies). As with the influence of Clive Barker in horror, so bands who were only partly connected with metal (Faith No More, Red Hot Chili Peppers) began to shape the genre. All of which occurred as I began to be obsessed with music that had nothing to do with metal at all, whether contemporary (Pixies, Ride, Lush, the Stone Roses, Happy Mondays, Jesus Jones – jesus, the Shamen etc) or older (The Smiths, Jesus and Mary Chain, The Doors⁴, the Velvet Underground).
Revolver #1, July 1990: very not 80s
Still; not many people are into the same things at 18 as they were at 14; and it’s tempting to think that my feelings about the end of the decade had more to do with my age than the times themselves; but they were indeed a-changing, and a certain aspect of the new decade is reflected in editor Peter K. Hogan’s ‘Outro’ to the debut issue of the somewhat psychedelically-inclined comic Revolver (published July 1990):
Why Revolver?
Because what goes around comes around, and looking out my window it appears to be 1966 again (which means – with any luck – we should be in for a couple of good years ahead of us). Because maybe – just maybe – comics might now occupy the slot that rock music used to. Because everything is cyclical and nothing lasts forever (goodbye, Maggie). Because the 90s are the 60s upside down (and let’s do it right, this time). Because love is all and love is everything and this is not dying. Any more stupid questions?
This euphoric vision of the 90s was understandable (when Margaret Thatcher finally resigned in 1990 there was a generation of by now young adults who couldn’t remember any other Prime Minister) but it aged quickly. The ambiguity of the statement ‘the 90s are the 60s upside down’ is embodied in that disclaimer (and let’s do it right, this time) and turned out to be prophetic; within a month of the publication of Revolver issue 1 the Gulf War had begun. Aspects of that lost version of the 90s lived on in rave culture, just as aspects of the summer of love lived on through the 70s in the work of Hawkwind and Gong, but to posterity the 90s definitely did not end up being the 60s vol.2. In the end, like the 80s, the 90s (like every decade?) are defined, depending on your age and point of view, by a series of apparently incompatible things: rave and grunge, Jurassic Park and Trainspotting, Riot Grrrl and the Spice Girls, New Labour and Saddam Hussein.
That tiny oasis of positivity in 1990 – between the Poll Tax Riots on 31st March and the start of the first Gulf War on 2nd August – is, looking back, even shorter than I remember, and some of the things I loved in that strange interregnum between adolescence and adulthood (which lasted much longer than those few months) – perhaps because they seemed grown up then – are in some ways more remote now than childhood itself. So… conclusions? I don’t know; the times change as we change and they change us as we change them – a bit too Revolver, a lot too neat. And just as we are something other than the sum of our parents, there’s some part of us too that seems to be independent of the times we happen to exist in. I’ll leave the last words to me, aged 18, not entirely basking in the spirit of peace and love that seemed to be ushered in by the new decade.
¹ in reality this was the result of a decade of quiet progress led by writers like Alan Moore, Neil Gaiman and Frank Miller
² although 100% part of the 80s horror boom, Barker is perhaps more responsible than any other writer for the end of its pure horror phase
³ Stephen King’s Dark Tower series, though dating from earlier in the 80s, appeared in print with much fanfare in the UK in the late 80s and, along with the more sci-fi inflected The Tommyknockers and the somewhat postmodern The Dark Half, seemed to signal a move away from the big, cinematic horror novels like Pet Sematary, Christine, Cujo et al. In fact, looking at his bibliography, there really doesn’t appear to be the big shift around the turn of the 90s that I remember, except that a couple of his new books around that time (Dark Tower III, Needful Things, Gerald’s Game) for one reason or another didn’t have half the impact that It had on me. (That’s probably the age thing.) James Herbert, more clearly, abandoned the explicit gore of his earlier work for the more or less traditional ghost story Haunted (1988) and the semi-comic horror/thriller Creed (1990) – a misleadingly portentous title which always makes me think of that Peanuts cartoon where Snoopy types This is a story about Greed. Joe Greed lived in a small town in Colorado… Clive Barker, who had already diverged into dark fantasy with Weaveworld, veered further away from straightforward horror with The Great & Secret Show, while reliably fun goremeister Shaun Hutson published the genuinely dark Nemesis, a book with little of the black humour – and only a fraction of the bodycount – of his earlier work.

⁴ the release of Oliver Stone’s The Doors in 1991 is as 90s as the 50s of La Bamba (1987) and Great Balls of Fire (1989) was 80s. Quite a statement.
Gustave Courbet was born 201 years ago today, but although he remains one of the key figures in nineteenth century art and the roots of modernism, this isn’t about his painting.
During the Franco-Prussian war, Courbet, by then in his 50s and an elder statesman of French art, proposed that the column erected in the Place Vendôme by Napoleon to commemorate his military victories be pulled down as a symbol of aggressive imperialism, and moved to a location that both neutralised and cast light on its true meaning; he suggested the military hospital, the Hôtel des Invalides.
Place Vendôme, 1871
He also suggested that a new monument be made from melted down cannons and dedicated to the people, both French and German, and the peaceful federation of the two nations.
During the brief period of the Paris Commune the next year, the revolutionary government followed half of his advice and issued a decree that the Vendôme column should be demolished – and replaced by a figure representing the Commune itself.
It was duly pulled down, but the Commune was too short-lived for its replacement to be built, and the suggestion that the Place Vendôme column be moved elsewhere was ignored. Instead, when the Commune was overthrown and the government reinstated, Courbet was imprisoned (ironically, he had by then fallen out with the leaders of the Commune too, disagreeing with their more repressive measures) and after his release was charged with the expense of rebuilding the column (he fled France to avoid paying), which remains in the Place Vendôme to this day. Which is a shame. Had Courbet’s original suggestion been followed, the column would have been not only a memorial to Napoleon and the might of his armies, as it is now, but also a memorial to the real meaning of military glory; death, pain and horror.
Courbet and the Communards (not THE Communards; Jimmy Somerville was not present) with the ruins of the Vendôme column, 1871
In the past week, statues have been toppling (notably the statues of the slave traders Edward Colston in Bristol and Robert Milligan in London), and it’s good, I think; Britain is full of statues and memorials and it’s only right that, rather than seeing them simply as decoration, we should see them as history, and ask who they are and why they are there. And their removal is history too; and I hope that in removing the layers of time and dust and whitewash between us and the past, we can take into account that removing and ignoring parts of history that – for whatever reason – we don’t like, is and always has been part of the problem. A statue that glorifies one man while ignoring the countless, now unfortunately mostly nameless, people he exploited and whose lives he destroyed is an abomination and a symbol of so many things that are wrong with this country; so it should be used to educate and illuminate that sordid corner of history, and to ensure it isn’t forgotten.
Edward Colston’s statue – Ben Birchall/PA Wire/PA Images
I don’t know the best way to do that, but as a matter of course I think that – at the very least – the monuments that litter the country should be looked at and evaluated, with explanations added that tell people what history really means. History is the lives of people, not something abstract, and not just those people who pleased the authorities or the populace enough to be celebrated and commemorated – what was the context? Why are we supposed to still care? Where does that part of history fit in with where we are now? In a post-modern age it’s not too much to ask that our landscape becomes post-modern too. If statues and monuments of individuals are to mean anything more than personal glory for their subject, it doesn’t seem too much to ask to have a basic overview in whatever form (plaque/recording/who knows?) – who is it/what did they do/why are they here* – and the latter two things may only be tenuously linked. In the case of (since he’s in the news) Edward Colston, a few lines can tell a story that I think is worth telling; Edward Colston (1636-1721)/businessman responsible for the slavery of an estimated 84,000 African men, women and children, 19,000 of whom died in transit to the West Indies, many of whom were sadistically branded with his company logo/statue erected as a reward for investing his fortune in British charities, churches and hospitals. The wording would be important and require more thought than I’ve given it here, though. I don’t think this would condone anything, but it explains something about history, what the empire was, how it worked and why things are as they are now, in a way that a name and birth/death dates don’t.
*immediately you have to admit that this could become absurd; but it needn’t
A statue isn’t a museum, but I don’t see any reason why they shouldn’t do the same job as one; not just preserving, but educating. There’s a parallel argument here too about museums and the repatriation of items stolen from different peoples; and it’s hard to see a good argument against repatriation in an age where the contents of a museum a thousand miles away are as easily accessible to most people as the contents of one a hundred, or fifty, miles away. But that’s another discussion.
Importantly, this isn’t – to me at least – an argument for less public art, but for more. Heroines and heroes are not necessarily those people whose fame was great enough to warrant erecting statues of them within living memory. The heroes, as they were then considered, of the Napoleonic wars, or the British Empire, or of World War Two, may not be – and mostly shouldn’t be – our heroes now – but it’s never too late to remember other figures, who exemplify what we retrospectively see as the virtues of their age (deciding who you would memorialise is irresistible; I was very glad to see Sylvia Pankhurst memorialised myself). And though some argue (such as Rachel Holmes in this article that I mostly agree with) that there are too many statues in the UK, I don’t think so. The more our history is clear to see and to question, the healthier it is. Hiding it, or limiting public memorials to people we all approve of (impossible), seems the worst kind of self-censorship. That said, it gives me some kind of patriotic pride to note that, despite the number of memorials to forgotten military people and monarchs in my own capital city, the best-known statues there are to a writer (albeit one whose role in Scottish history is both illustrious and ambiguous, depending on your political point of view) and a dog (ideologically pretty okay).
Probably the Emperor Claudius, 1st century AD
But anyway; time and memory and history are complex, fluid things. There’s a life-size bronze head, probably of the Emperor Claudius, in the British Museum which, for whatever reason, was removed from its statue and thrown into the river Alde nearly 2000 years ago. The most attractive theory is that the statue was destroyed during Boudicca’s rebellion of native British tribes in AD 61 – and while we can never know if this is true, knowing that the statue existed and that it was dismembered tells us more about imperialism, resistance and human history than if it had simply been melted down and erased from the world.
The year before Courbet’s birth, Shelley, like Courbet a socialist of sorts, published Ozymandias.
I met a traveller from an antique land,
Who said—“Two vast and trunkless legs of stone
Stand in the desert. . . . Near them, on the sand,
Half sunk a shattered visage lies, whose frown,
And wrinkled lip, and sneer of cold command,
Tell that its sculptor well those passions read
Which yet survive, stamped on these lifeless things,
The hand that mocked them, and the heart that fed;
And on the pedestal, these words appear:
My name is Ozymandias, King of Kings;
Look on my Works, ye Mighty, and despair!
Nothing beside remains. Round the decay
Of that colossal Wreck, boundless and bare
The lone and level sands stretch far away.”
To start with, this was mostly about books, and I think it will end that way too. But it begins with a not terribly controversial statement; hero worship is not good. And the greatest figures in the fight for human rights or human progress of one kind or another – Martin Luther King, Jr, Emmeline Pankhurst, Gandhi – without wishing to in any way diminish their achievements – would not have achieved them alone. Rosa Parks is a genuine heroine, but if she had been the only person who believed it was wrong for African-American people to be forced to give up seats for white people, the practice would still be happening. These individuals are crucial because they are catalysts for and agents of change – but the change itself happens because people – movements of people – demand it.
a bunch of lonesome and very quarrelsome heroes
This is obviously very elementary and news to nobody, but it’s still worth remembering in times like these, when people seem to be drawn to messianic figures, or to elevate people with no such pretensions to quasi-messianic status. One of the problems with messiahs is that when they don’t fulfil the hopes of their followers, their various failures or defeats (of whatever kind) take on a cataclysmic significance far beyond the usual, human kind of setback and re-evaluation. It’s only natural to feel discouraged if your political or spiritual dreams and hopes are shattered, but it’s also important to remember that the views and opinions that you were drawn to and which you agree with belong to you too. They are likely to be shared by millions of people, and the fact that they are also apparently not shared by a greater number in no way invalidates them or renders them pointless.
The history of human progress is, mostly, the history of people fighting against entrenched conservative views in order to improve the lives of all people, including, incidentally, the lives of those people they are fighting against. This obviously isn’t the case in ultimately ideological revolutions like those in France or Russia, which quickly abandoned their theoretically egalitarian positions in order to remove undesirable elements altogether, or the Nazi revolution in Germany, which never pretended to be inclusive in the first place. Hopelessness, whether cynical or Kierkegaard-ishly defiant, is a natural response to depressing times, but the biggest successes of human rights movements – from the abolition of slavery to the enfranchisement of women to the end of apartheid in South Africa to the legalisation in various countries of abortion or gay marriage – have often taken place during eras which retrospectively do not seem especially enlightened; if you believe in something, there is hope.
Rome is a place, but this is mostly about people
But if change is largely driven by mass opinion and group pressure – and it demonstrably is – why is it the individual – Rameses II, Julius Caesar, Genghis Khan, Napoleon, Garibaldi, Lenin, Hitler, the Dalai Lama, Queens, Kings, political leaders – that looms so large in the way we see events historically? Anywhere from three to six million people died in the “Napoleonic Wars” – Napoleon wasn’t one of them, and his armies didn’t even win them, in the end; but they are, to posterity, his wars. There is more than one answer, and one has to do with blame, but the short answer, I think, is that as individuals, it is individuals that we identify with. We have a sense of other people’s lives, we live among other people (sounds a bit Invasion of the Bodysnatchers), but we only know our own life, and we only see the world through the window of our own perceptions.
Sara Shamma self portrait
The artist Sara Shamma – who, significantly, has undertaken many humanitarian art projects, but has also done much of her most profound work in self-portraiture – said “I think understanding a human being is like understanding the whole of humanity, and the whole universe” and the more I’ve thought about that statement the more true it seems. If we truly understand any human being, it is first, foremost and perhaps only, ourselves. And, unless you are a psychopath, in which case you have my condolences, you will recognise the traits you have – perhaps every trait you have – in other people, people who may seem otherwise almost entirely different from you. When you look at the classifications humankind has made for itself – good/bad, deadly sins, cardinal virtues – these are things we know to exist because, in varying degrees, we feel them in ourselves, and therefore recognise them in others. Even that most valued human tool, objectivity, is a human tool, just as logic, which certainly seems to explain, to our understanding at least, the way the world works, is a human idea and also an ideal. Interestingly but significantly, unlike nature, mathematics or gravity, human behaviour itself routinely defies logic. When we say – to whatever extent – that we understand the universe, what I think we mean is that we understand our own conception of it. It’s easy to talk about the universe being boundless, but not limitless, or limitless, or connected to other universes as part of a multiverse (though not easy to talk about intelligently, for me), but regardless of what is ‘out there’, what we are actually talking about is all ‘in here’, in our own brain; the universe that you talk about and think about is whatever you think it is, however you perceive it. If what you believe dictates the way you live your life it might as well be, to all intents and purposes ‘the truth’. For Stephen Hawking there were black holes in space/time, and whether or not there actually are, for a creationist there really aren’t, until the day when they impinge on our lives in anything other than a theoretical way.
This is not to say that there are no actual solid facts about (for example) the nature of the universe; but nonetheless to even prove – to us personally while alive – that anything at all continues to exist after our own death is impossible. We can of course see that existence goes on after other people’s deaths, but then I can say with what I believe to be complete conviction that there is no God and that human beings are just (well I wouldn’t say “just”) a kind of sentient hourglass with the added fun that you never know how much sand it holds to start with – but that doesn’t change the fact that a whole range of Gods have made and continue to make a decisive difference to the lives of other people and therefore to the world. In that way, whether or not I believe in them, they exist.
self-empowerment
But whereas the above might sound like the background for some kind of Ayn Rand-ish radical individualism, I think the opposite is true; because if any of what I have written is correct, the key part is that it applies equally to everyone. The phrase ‘we’re all in the same boat’ is being bandied about a lot lately for pandemic-related reasons, and it’s only vaguely true as regards that particular situation. We aren’t in the same boat, or even necessarily in the same kind of body exactly, but what we as human beings do all share – broadly – is the same kind of brain. We are all individuals, and if we are conscious, we are probably self-conscious. And given that we live our – as far as we can safely tell – single earthly life as an individual human being, the idea that any of us is powerless during that lifetime is nonsense. When asked to name someone who has made a difference to the world, the first person you think of should be yourself. There would be no world as you know it without you in it, and that is not a small thing; by existing, you are changing the world. Whether for better or worse, only you can say.
Having faith in other people (or even just getting along with them) makes both your and their lives better, but the belief that one particular individual outside of yourself may be the solution to the world’s (or the country’s, etc) problems is worse than feeling powerless yourself. Not only because it can reinforce that sense of powerlessness, but because it’s blatantly untrue and (I hate to use this completely devalued word, but never mind) elitist. Also, it reduces every issue, however complex, to a finite, success-or-failure one, which is rarely how the world works. The idea of the lone hero as saviour probably has about as much validity as the idea of the lone villain as the cause of whatever ills need to be cured. Hero-worship is both logical (because we see the world from the viewpoint of “I”) and also an oddly counter-intuitive ideal to have created, since in reality as we know it, the lone individual may be us, but is largely not how we live or how things work. Human beings have structured their societies, whether on the smaller level of family or tribe, to the larger ones like political parties or nations, in terms of groups of people. But I suppose it is the same humanity that makes us aware of and empathetic to the feelings of others that makes us want to reduce ideas to their black and white, bad vs good essentials and then dress those ideas up in human clothes.
childhood favourites
And so, to books! Reading fiction and watching films and TV, it’s amazing how the larger-than-life (but also simpler and therefore ironically smaller-than-life) hero/ine vs villain, protagonist vs antagonist and – most hackneyed of all (a speciality of genre fiction since such a thing existed, and the preserve of religion and mythology before that) – the ‘chosen one’ vs ‘dark lord’ narrative continues to be employed by writers and enjoyed by generations of people (myself included*), long past the age that one becomes aware of the formulaic simplification of it.
*for people of my generation, the mention of a ‘dark lord’ immediately conjures up Star Wars and Darth Vader/The Emperor, though the ‘chosen one’ theme is thankfully underplayed in the original Star Wars trilogy. George Lucas doesn’t get much credit for the prequels, but making the chosen one become the dark lord is an interesting twist, even if Lucifer got there first.
Whatever its origins, it seems that people do want these kinds of figures in their lives and will settle for celebrities, athletes, even politicians in lieu of the real thing. Hitler was aware of it and cast himself in the lead heroic role, ironically becoming, to posterity, the antithesis of the character he adopted; Lenin, who by any logical reading of The Communist Manifesto should have been immune to the lure of hero worship, also cast himself in the lead role, as did most of his successors to the present day; and really, to enthusiastically espouse Marxism and then approve a monumental statue of oneself displays, at best, a shocking lack of self-awareness. The Judeo-Christian god with its demand, not only to be acknowledged as the creator of everything, but also to be actually worshipped by his creations, even in his Christian, fallible, supposedly just-like-us human form, is something of a special case, but clearly these are primordial waters to be paddling in.
Still, entertainment-wise, it took a kind of epic humbling to get even to the stage we’re at now. Heroes were once demi-gods; Gilgamesh had many adventures, overcame many enemies, but when trying to conquer death found that he could not even conquer sleep. Fallible yes, but hardly someone to identify with. And Cain killed Abel, David killed Goliath, Hercules succeeded in his twelve tasks but was eventually poisoned by the blood of a hydra, Sun Wukong the Monkey King attained immortality by mistake while drunk, Beowulf was mortally wounded in his last battle against a dragon. Cúchulainn transformed into a monstrous creature and single-handedly defeated the armies of Queen Medb. King Arthur and/or the Fisher King sleep still, to be awoken when the need for them is finally great enough. These are heroes we still recognise today and would accept in the context of a blockbuster movie or doorstop-like fantasy novel, but less so in say, a soap opera or (hopefully) on Question Time. I knew some (but not all) of these stories when I was a child, but all of them would have made sense to me because, despite the differences between the settings and the societies that produced them and that which produced me, they are not really so vastly different from most of my favourite childhood stories.
Partly that’s because some of my favourite childhood stories were those same ancient stories. But even when not reading infantilised retellings of the Greek myths (I loved the Ladybird book Famous Legends Vol. 1 with its versions of Theseus and the Minotaur and Perseus and Andromeda*) it was noticeable that not all heroes were created equal. There still were heroes of the unambiguously superhuman type (in comics most obviously; like um, Superman), but in most of the books I read, the hero who conquers all through his or her (usually his) all-round superiority was rarely the lone, or sometimes not even the main protagonist. I don’t know if it’s a consequence of Christianity (or just of literacy?) but presumably at some point people decided they preferred to identify with a hero rather than to venerate them. Perhaps stories became private rather than public when people began to read for themselves, rather than listening to stories as passed down by bards or whatever? Someone will know.
*I remember being disappointed by the Clash of the Titans film version of Medusa; too monstrous, less human, somehow undermining the horror for little me
not the original set of Narnia books I had; never quite as good without Pauline Baynes’s cover art
The first real stories that I remember (this would initially be hearing rather than reading) are probably The Hobbit, The Lion, The Witch and The Wardrobe and Charlie and the Chocolate Factory – all of which have children or quasi-children as the main characters. Narnia is a special case in that there is a ‘chosen one’ – Aslan the lion – but mostly he isn’t the main focus of the narrative. Far more shadowy are the books I was read but never went back to and read by myself, like Pippi Longstocking; my memory of those tends to be a few images rather than an actual story. As a very little kid I know I liked The Very Hungry Caterpillar and its ilk (also, vastly less well known, The Hungry Thing by Jan Slepian and Ann Seidler in which, as I recall, ‘some rice would be nice’ said a baby sucking ice). Later, I loved Tintin and Asterix and Peanuts and Garfield as well as the usual UK comics; Beano, Dandy, Oor Wullie, The Broons, Victor and Warlord etc.
The first fiction not reliant on pictures that I remember reading for myself (probably around the Beano era) would be the Narnia series (which I already knew), Richmal Crompton’s William books and then Biggles (already by then an antique from a very different era), some Enid Blyton (I liked the less-famous Five Find-Outers best), Lloyd Alexander’s Chronicles of Prydain, and Willard Price’s Adventure series. Most of these were a bit old-fashioned even in the 80s, now that I look at them, but I tended then as now to accumulate second-hand books.
Lloyd Alexander’s Chronicles of Prydain; perfect marriage of author and cover art (Brian Froud and Ken Thompson)
Biggles Flies Undone! Very old even when I was young, I bought this book from a jumble sale when I was 8 or 9
There was also a small group of classics that I had that must have been condensed and re-written for kids – a little brick-like paperback of Moby-Dick (Christmas present) and old hardbacks of Robinson Crusoe, Treasure Island and Kidnapped with illustrations by Broons/Oor Wullie genius Dudley D. Watkins (bought at ‘bring and buy’ sales at Primary School). Watkins’s versions of Crusoe, Long John Silver etc are still the ones I see in my head if I think of those characters. More up to date, I also had a particular fondness for Robert Westall (The Machine Gunners, The Scarecrows, The Watch House etc) and the somewhat trashy Race Against Time adventure series by JJ Fortune. This was a very 80s concoction in which a young boy from New York called Stephen is picked up by his (this was the initial appeal) Indiana Jones-like Uncle Richard and, unbeknownst to his parents, hauled off around the world for various implausible adventures. I liked these books so much (especially the first two that I read, The Search for Mad Jack’s Crown – bought via the Chip Book Club which our school took part in – and Duel For The Samurai Sword) that I actually, for the first and last time in my life, joined a fan club. I still have the letter somewhere, warning me as a “RAT adventurer” to be prepared to be whisked away myself. Didn’t happen yet though. And then there were gamebooks (a LOT of them), which have a special place here because they fundamentally shift the focus of the narrative back to the direct hero-conquers-all themes of ancient mythology, while adding the twist that the reader themselves is that hero.
80s Hollywood blockbuster design comes to children’s fiction
There were also books I wouldn’t necessarily have chosen but was given at Christmas etc, books by people like Leon Garfield (adventures set in a vividly grotty evocation of 18th and early 19th century London), the aforementioned Moby-Dick, a comic strip version of The Mutiny on the Bounty, a Dracula annual. Also authors who I read and loved one book by, but never got around to reading more of; Anne Pilling’s Henry’s Leg, Jan Mark’s Thunder and Lightnings (there’s a moving article about this beautifully subtle book here), Robert Leeson’s The Third Class Genie. And then there were also things we had to read at school, which mostly didn’t make a huge impression and are just evocative titles to me now – The Boy with the Bronze Axe by Kathleen Fidler and The Kelpie’s Pearls by Molly Hunter, Ian Serraillier’s The Silver Sword, Children on the Oregon Trail by Anna Rutgers van der Loeff and The Diddakoi by Rumer Godden. What did I do as a kid apart from reading?
Anyway; that’s a lot of books. And in the vast majority of them, the conclusion of the plot relies on the main character, or main character and sidekick or team, to take some kind of decisive action to solve whatever problem they have. Heroism as the ancient Greeks would have understood it may largely have vanished, but even without superhuman strength or vastly superior cunning (even the fantasy novels mentioned, like Lloyd Alexander’s, which do still have the chosen one/dark lord idea at their heart, tend to have a fallible, doubt-filled human type of hero rather than a demigod) there is still the idea that the individual character is what matters.
it’s hard to remember a time I didn’t know these stories
And that makes sense – something like the ‘battle of five armies’ towards the end of The Hobbit is dull enough even with the inclusion of characters that the reader has come to care about. A battle between armies of nameless ciphers (think the ‘Napoleonic Wars’ sans Napoleon) would be hard to get too involved in (cue image of generals with their model battlefields moving blocks of troops about, with little or no danger to themselves). Which is fair enough – being in a battle might well feel impersonal, but reading about one can’t be, if the reader is to feel any kind of drama. And maybe this is the key point – reading is – albeit at one remove – a one-on-one activity. Stephen King likens it to telepathy between the writer and reader, and that is the case – they think it, we read it and it transfers from their minds to ours. And since reading is something that people seem to think children have to be made to do, often against their will, children’s authors in particular are understandably keen to engage the reader by making them identify with one character or another.
I don’t think it’s a coincidence that the most successful writers for children from CS Lewis to Enid Blyton to JK Rowling (to name just notable British ones) have tended to make children the protagonists of their books and surround their main characters with a variety of girls and boys of varying personality types. Children’s books about children are (I find) far easier to re-read as an adult than children’s books about adults are. As an adult, even JJ Fortune’s “Stephen” rings more or less true as a mostly bored tweenager of the 80s, while his Uncle Richard seems both ridiculous and vaguely creepy. “Grown up” heroes like Biggles, very vivid when encountered as a child, seem hopelessly two-dimensional and childish as an adult; what do they DO all day, when not flying planes and shooting at the enemy?
the unasked-for Christmas present that began a few years of obsessive game-playing
I mentioned gamebooks above and they – essentially single-player role-playing games, often inspired by Dungeons and Dragons – deserve special mention, partly just because in the 80s, there were so many of them. There were series I followed and was a completist about (up to a point) – first and best being Puffin’s Fighting Fantasy (which, when I finally lost interest in them, consisted of around 30 books), then its spin-off Steve Jackson’s Sorcery (four books), Joe Dever and Gary Chalk’s Lone Wolf (seven or eight books), Grey Star (four books), Grailquest (I think I lost interest around vol 5 or 6), then quite a few series that I quite liked but didn’t follow religiously – Way of the Tiger (six books), Golden Dragon (six books), Cretan Chronicles (three books) and series I dipped into if I happened to come across them: Choose Your Own Adventure (essentially the first gamebook series, but they mostly weren’t in the swords & sorcery genre and felt like they were aimed at a younger readership), Demonspawn (by JH Brennan, the author of Grailquest, but much, much more difficult), Falcon (time travel) and Sagard the Barbarian (four books; the selling point being that they were by D&D co-creator Gary Gygax. They were a bit clunky compared to the UK books).
Sudden memory; even before encountering my first Fighting Fantasy book, which was Steve Jackson’s Citadel of Chaos, actually the second in the series, I had bought (the Chip club again) Edward Packard’s Exploration Infinity, which was one of the Choose Your Own Adventure series, repackaged for the UK I guess, or maybe a separate book that was later absorbed into the CYOA series? Either way, there’s a particular dreamlike atmosphere that gives me a pang of complicated melancholy nostalgia when I think of the book now.
lots of books; one hero
Putting a real person – the reader – at the centre of the action ironically dispenses with the need for “character” at all, and even in books like the Lone Wolf and Grailquest series where YOU are a specific person (“Lone Wolf” in the former, “Pip” in the latter), there is very little sense of (or point in) character building. You are the hero, this is what you need to do, and that’s all you need to know. In many cases, the protagonists of the heroic fantasy novels I devoured in my early teens – when I was drawn to any fat book with foil lettering and a landscape on the cover (the standard fantasy novel look in the 80s) – were not much more rounded than their lightly sketched gamebook counterparts. These books often achieved their epic length through plot only; the truly complex epic fantasy novel is a rare thing.
Thanks, presumably, to Tolkien, these plots generally revolved around main characters who were rarely ‘heroes’ in the ancient mould (though Conan and his imitators were), but were mainly inexperienced, rural quasi-children, thrust into adventures they initially had no knowledge of (Terry Brooks’s Shannara series being the classic Tolkien-lite example). But even when, as in Stephen Donaldson’s also very Tolkien-influenced Chronicles of Thomas Covenant, the hero was a cynical, unpleasant modern human being, or in Michael Moorcock’s deliberately anti-Tolkienesque Eternal Champion series, where s/he was a series of interlinked beings inhabiting the same role within different dimensions of the multiverse, the ‘chosen one’ vs some kind of implacable ‘dark lord’-ish enemy theme remains pretty constant. But this underlying core or skeleton is only most explicit in self-consciously fantastical fiction; whether or not there’s an actual dark lord or a quest, in most fiction of any kind there’s a ‘chosen one’, even if they have only been chosen by the author as the focus of the story she or he wants to tell.
Holden Caulfield and Sylvia Plath’s Esther Greenwood have this in common with Bilbo Baggins, Conan the Barbarian and William Brown. But really, what’s the alternative to books about people anyway? Even novels in which people (or surrogate people like Richard Adams’s rabbits or William Horwood’s moles) are not the main focus (or are half of the focus, like Alan Moore’s peculiar Voice of the Fire, where Northampton is essentially the ‘hero’) rely on us engaging with the writer as a writer, a human voice that becomes a kind of stand-in for a character.
classic 80s fantasy cover design
But books are not life; one of the things that unites the most undemanding pulp novelette and the greatest works of literature is that they are to some extent – like human beings – discrete, enclosed worlds; they have their beginning, middle and end. And yet, however much all of our experience relies on our perception of these key moments, that’s not necessarily how the world feels. Even complicated books are simple in that they reveal – just by seeing their length before we read them – the sense of design that is hidden from us or absent in our own lives. Even something seemingly random or illogical (the giant helmet that falls from nowhere, crushing Conrad to death in Horace Walpole’s proto-gothic novel The Castle of Otranto (1764) for example) is deliberate; recognisably something dreamlike, from the human imagination, rather than truly random as the world can be.
What we call history (“things that have happened”) usually can’t quite manage the neatness of even the most bizarre or surreal fiction. There have been genuine, almost superhuman hero/antihero/demigod figures, but how often – even when we can see their entirety – do their lives have the satisfying shape of a story? Granted, Caesar, stabbed twenty-three times by his peers in the Senate chamber, has the cause-and-effect narrative of myth; but it’s an ambiguous story where the hero is the villain, depending on your point of view. Whatever one’s point of view in The Lord of the Rings or Harry Potter, to have sympathy with someone referred to (or calling themselves) a ‘dark lord’ is to consciously choose to be on the side of ‘bad’, in a way that defending a republic as a republic, or an empire as an empire, isn’t.
Take Genghis Khan – ‘he’ conquered (the temptation is to also write ‘conquered’, but where do you stop with that?) – obviously not alone, but as sole leader – as much of the world as anyone has. And then? He remained successful, had issues with his succession and died in his mid-60s, in uncertain, rather than dramatic or tragic, circumstances. The heroes of the Greek myths often have surprisingly downbeat endings (which I didn’t know about from the children’s versions I read) but they are usually significant in some way, and stem from the behaviour of the hero himself. Napoleon, old at 51, dying of stomach cancer or poisoning, a broken man, is not exactly a classic punishment from the Gods for hubris, or an end that anyone would have seen coming, let alone would have written for him. As ‘chosen ones’ go, Jesus is a pretty definitive example, and whether accepted as history or as fiction, he has an ending which, appropriately for god-made-man, manages to fit with the stuff of myth (he rises from the dead and ascends to heaven) but is also mundane in a way we can easily recognise; he wasn’t defeated by the Antichrist or by some supreme force of supernatural evil, but essentially killed by a committee, on the orders of someone acting against their own better judgement. More than anything else in the New Testament, that has the ring of truth to it. A significant detail too, for those who want to stress the factual basis of the gospels, is that the name of the murderer himself*, unlike the nemeses of the ancient heroes, wasn’t even recorded.
* I guess either the guy nailing him to the cross, or the soldier spearing him in the side (much later named as Longinus, presumably for narrative purposes)
And if Jesus’s nemesis was disappointingly mundane, when on occasion the universe does throw up something approximating a “dark lord”, it doesn’t counter them with ‘chosen ones’ to defeat them either, as one might hope or expect. We still live in the shadow of WW2, and Hitler’s messy and furtive end – suicide when beleaguered and already beaten – somehow isn’t good enough, so there are a variety of rival theories about what ‘really’ happened, all of which more pleasingly fit with the kind of fiction we all grow up with. Mussolini was strung up by an angry faceless mob and his corpse was defiled. Hirohito, meanwhile, survived defeat as his troops were not supposed to do, and presided over Japan’s post-war boom to become one of the world’s longest reigning monarchs. The moral of the story is that there is rarely a moral to the story. For proof of that, did the ‘heroes’ fare much better? The victors of Yalta lived on to die of a haemorrhage just months later on the eve of the unveiling of the UN (FDR), to be voted out of office, dying twenty years later a divisive figure with an ambiguous legacy (Churchill), and to become himself one of the great villains of the century with a reputation rivalling Hitler’s (Stalin).
Entertainment programs us to view history as the adventures of a series of important ‘main characters’ and how they shaped the world. It’s perhaps as good a ‘way in’ as any – like Frodo taking the ring to Mordor when no human can, or Biggles (almost) single-handedly defeating the Luftwaffe, it makes a kind of sense to us. But the distorted version of history it gives us is something to consider; think of your life and that of (name any current world leader or influential figure; apologies if you are one). If the people of the future are reading about that person, what will that tell them about your life? And what is ‘history’ telling you about really? Things that happened, yes, but prioritised by who, and for what purpose? This is an argument for reading more history, and not less I think. Other people may be the protagonists in books, but in our own personal history we have to take that role.
Artists (and historians too, in a different way) share their humanity with us, and there are great artists – you’ll have your own ideas, but William Shakespeare, Sue Townsend, Albrecht Dürer, Mickalene Thomas, Steven Spielberg and James Baldwin seem like a random but fair enough selection – who somehow have the capacity or empathy to give us insights into human beings other than (and very different from) themselves, but somehow created entirely from their own minds and their own perceptions of the world. But just like them, however aware we are of everyone else and of existence in all its variety, we can only be ourselves, and, however many boxes we seem to fit into, we can only experience the world through our own single consciousness. If there’s a chosen one, it’s you. If there’s a dark lady or a dark lord, it’s also you.
Prelude: Getting older while time stands still
Apparently today, when I’m writing this, is five years to the day since the first (and even then, belated) UK lockdown of the Covid-19 pandemic. A strange thing to have experienced then, it almost feels like a dream now. There was much discussion, online and on TV during that unusually warm and pleasant spring and summer, about how life and the world would be changed by it. Some of that discussion was oddly hopeful, even among the shocking daily mortality figures and news reports about medical facilities in the UK running out of bodybags. I remember those reports about the canals of Venice having fish in them for the first time in decades; I remember the wildlife around here (being in a quiet rural area meant that a daily walk was permitted without being aggressively policed) becoming unusually bold and visible. And though of course business was suffering and various goods becoming ridiculously expensive or hard to find, the world didn’t come to an end.
The thinking was that, since we had proved that the roads needn’t always be choked with cars or the skies busy with air traffic, and that nature bounced back far more quickly than could have been expected, perhaps the key to ecological recovery was within our grasp. But that’s not how things worked out; the second that it was possible to do so, the roads filled up, the airports were busy again and yet somehow it worked out so that all the negativity associated with capitalism went back to normal but prices didn’t.
And now I think everyone who lived through that time is discovering why we grow up learning plenty about the First World War but not about the Flu pandemic that killed more people in its aftermath. Not I think because it was too horrible to talk about or too difficult to put into words, but because when it’s done you just get on with other stuff and before you know it those events have an undramatic sense of unreality hanging over them; too tedious to talk about. I had the feeling during that time that the only way it would have been taken genuinely seriously (lockdown was serious, but people mostly whined about it rather than cowering in their homes) would be if it had been a disease as dramatic and visible and fast-moving as the Black Death. Maybe if people were rotting before our eyes, the dead lying in the streets, there wouldn’t have been all the debate and denial and conspiracy theories; I’m glad we didn’t find out.
Oh well; the pandemic was an experience and, aside from the death and horror, I have to admit I quite enjoyed the strangeness, the empty streets, the quietness, the masks. Not so much queueing outside of shops in single file with 6-foot gaps. But anyway; this was written during that first (by the sound of it, quite relaxed) lockdown, five years ago but feels like it could have been twenty….
“Ane doolie sessoun”: Covid-19 and the art of isolation
At some point in the late fifteenth century, the poet Robert Henryson (who lived in Dunfermline, not too far from where I’m writing this now), began his Testament of Cresseid with one of my favourite openings of any poem:
Ane doolie sessoun to ane cairfull dyte
Suld correspond and be equivalent.
Robert Henryson – The Testament of Cresseid and Other Poems, my edition Penguin Books, 1988, p. 19
I don’t think I knew, word for word, what Henryson was saying when I first read those lines, but I did get the meaning: essentially that miserable/sad/grim times call for miserable/sad/grim poetry. I guess ‘doolie’ would be ‘doleful’ or ‘dolorous’ a few hundred years later; not sure what it would be now. ‘Cairfull’ sounds far more familiar, but in this case means literally ‘full-of-care’ as in the more woeful sense of caring about things than the casual one we would usually use. The words, with their mixture of strangeness and familiarity (people in Scotland have not talked like that for many centuries, but I think that being attuned to the accents and patterns of speech here probably still makes it easier to understand), stayed with me.
The poet goes on to talk about the weather; apparently it was an unseasonable Lent in Fife that year, when “schouris of hail can fra the north discend/that scantlie fra the cauld I micht defend.” Despite impending climate disaster, Fife weather hasn’t changed beyond all recognition, it seems; it was only two weeks ago – though it seems far longer now – that I was caught in a hailstorm myself.
my own photograph from April 2006
The season is still doolie however; not because of the weather, but because of the pandemic sweeping the world, one unlike any that Henryson would have known, but which probably wouldn’t have surprised him too much. One of the key elements he brought to his version of the Troilus and Cressida story in The Testament of Cresseid is its heroine being struck down by leprosy and joining a leper colony. The cover of my copy of his poems (above somewhere) has a drawing from a medieval manuscript, of a figure which would have been familiar to most readers at the time; a leper with a bell begging for alms.
Maurice Utrillo
In fact, with dependable cosmic irony (or if you are less fatalistic, normal seasonal progress), the weather, since #stayhome has been trending online and quarantine officially recommended, has been beautiful here. The streets are fairly, but not yet eerily, quiet. So this particular dyte (the old word, as Henryson used it, referred to his poem, but I think it stems ultimately from the Latin dictum, which can apply to any piece of writing) may not seem especially gloomy (and may in fact be quite sloppy), but it is certainly careful in the sense that Henryson intended. It’s quite easy – and I think reasonable – to be optimistic about the state of the world in April 2020, but not, I hope, possible for anybody with any sense of empathy to not be concerned about it.
There are some silver linings to the current situation (major caveat: so far). As well as – inevitably – bringing out the worst in some people, a crisis also brings out the best in many more. And a whole range of major and minor plus points, from a measure of environmental recovery to time to catch up with reading, have emerged. For me, one of the nicest things to come out of the crisis so far is – thanks to social media – the way that arts institutions, while physically almost empty, have begun to engage online with a wider range of people than those who are likely to, or physically able to, visit the galleries themselves.
Algernon Newton – The Outskirts of Kensington
It has been said that Edward Hopper is the artist who has captured this particular moment best, and it’s true that his vision of loneliness in the metropolis particularly mirrors our own age of social media and reality TV, in that it is voyeuristic (not a criticism, visual art is by definition voyeuristic). Online, we are not looking at ourselves, or at an absence of people, we are looking at other people whose isolation mirrors our own. If there’s something about this particular pandemic that sets it apart from the Spanish flu of 1918-19 or the great plague of 1665 or the Black Death of 1348-9, or any of the devastating outbreaks of disease that sweep the earth from time to time, it’s that online we are (a ridiculous generalisation perhaps, but if you’re reading this chances are you have internet access at least) sharing the experience of isolation; surely in itself a relatively new phenomenon, at least on this kind of a scale. When Daniel Defoe wrote in his fictional memoir of the 1665 plague (and it’s worth remembering that, although he was only five when the plague swept London, he would have had the testimony of many who had survived as adults as well as whatever shadowy memories he himself had of the period) –
Passing through Token-house Yard, in Lothbury, of a sudden a casement violently opened just over my head, and a woman gave three violent screeches, and then cried “Oh! Death, Death, Death!” in a most inimitable tone, and which struck me with horror and a chilness in my very blood. There was nobody to be seen in the whole street, neither did any other window open; for people had no curiosity now in any case; nor could any body help one another
Daniel Defoe, A Journal of the Plague Year, 1722, my copy published by Paul Elek Ltd, 1958, pp. 79-80
– he was depicting a situation which many people could no doubt relate to; when they read it, long after the fact. What we have now is a sense of shared helplessness in real time; this has never existed, quite in this way before. Assuming some kind of return to normality, we (not entirely sure who I mean exactly by ‘we’) will know each other better than we ever have; which is something to have mixed feelings about no doubt.
Edward Hopper capturing the 2020 zeitgeist with 11 am (1926)
The current appeal of Edward Hopper’s paintings of lonely figures is humanistic and easy to explain. His art, with its depiction of strangers quietly sitting in anonymous places, people who paradoxically we can never know and never know much about, but who we can easily relate to, is profoundly empathetic. It belongs to a long tradition of quiet loneliness or at least alone-ness that stretches back, in Western art, to the seventeenth century and the art of Vermeer (it’s easy to forget, as the children of it, but the idea of art reflecting the individual for reasons other than wealth and status is an essentially Protestant one*) through artists like Arthur Devis (though I’m not sure he intended the quiet melancholy of his paintings) and Vilhelm Hammershoi (he did). In fact, Hammershoi’s beautiful turn-of-the (19th-to-20th)-century paintings are if anything even more relevant to stay-at-home culture than Hopper’s diner, bar and hotel-dwelling urbanites. With Hopper, we are often watching – spying on – his characters from the outside, as if through a pair of binoculars; with Hammershoi we are shut in with them, like ghosts haunting their silent rooms.
Really, the only ‘lonely’ figures in pre-Protestant European art are Christ himself (think of the utter solitary misery of the crucified Jesus in Grunewald’s Isenheim altarpiece) and of course Judas, or those who, like him, have separated themselves from Christianity. There is a terrifying solitary quality in some depictions of saints during martyrdom, but for their contemporary audience it was essential to bear in mind that they were not spiritually alone (note: this may be a completely false assertion)
Vilhelm Hammershoi – A room at home with the artist’s wife (1902)
voyeuristic Hopper: Night Windows (1928)
Hopper’s most discussed and shared works now are those where we seem to catch, as we do from a train window, a momentary glimpse of a life that is utterly separate from our own. It’s a feeling I strongly associate with childhood and (very) specifically, with travelling through Edinburgh in the winter and seeing glimpses of people at windows and the high ceilings in the big Georgian townhouses in the New Town when their Christmas decorations were up. Who were all these people? What were their lives like? Why was this such a melancholy experience? Who knows.
But there are other kinds of Edward Hopper paintings too – including some of my favourites, like Early Sunday Morning (see below) – where the only human presence is the artist, or the viewer, where Hopper could claim (though I have no idea if he would have), like Christopher Isherwood, “I am a camera with its shutter open, quite passive, recording, not thinking.”* But recording, for a human being, is thinking. And the picture of a place-without-people is rarely as simple as it seems; even in the case of an actual photograph. Someone had to be there to photograph it, and had their human reasons for doing so. The tradition of landscape painting exemplifies this; landscapes may be mythical, romantic, epic, realistic, but they have been recorded or edited or invented for a variety of complex human reasons. The landscape painting of earlier eras was often self-consciously beautiful, or psychologically charged (Caspar David Friedrich is the classic example; landscape as a personal, spiritual vision; in some ways his work, with its isolated or dwarfed human figures, is kind of like a romantic-era Hopper), but the fact that the urban landscape is itself an artificial, human-constructed environment gives it a different, poignant (if you are me) dimension.
*Christopher Isherwood, Goodbye To Berlin, in The Berlin Novels, Minerva 1992, p. 243.
Edward Hopper – Early Sunday Morning (1930)
The appeal of the empty urban landscape in art is perhaps hard to explain to those who don’t see it, but I think it’s worth examining. There is a utopian tradition beginning with (or at least exemplified by) the ‘ideal cities’ produced in Italy in the late 15th century that is in a strange way misanthropic (or at least anthro-indifferent) in that the tranquil geometric perfection of the imaginary cities can only be made less harmonious by the introduction of human figures. But it’s also important to note that these cityscapes actually pre-date landscape painting for its own sake in western art by a few centuries. I don’t think it’s much of an exaggeration to say that in the medieval and renaissance period, the urban landscape had a far greater claim to represent paradise than the natural one. The garden of Eden was a garden after all, not a wilderness, and even the word paradise denotes a walled enclosure in its original Persian meaning. We might think now of paradise existing beyond the realms of human habitation, but in ages where the landscape was mainly something perilous to be passed through as quickly as possible on your way to safety, the controlled human landscape had a lot to be said for it.
Ideal City c.1480s, previously attributed to Piero della Francesca
Like the Renaissance ‘ideal city’, the beautiful post-cubist-realist paintings of Charles Demuth have a sense of perfection, where the severe but harmonious geometry of his industrial buildings seems to preclude more organic shapes altogether.
Charles Demuth – My Egypt (1927)
But if Demuth shows an ideal world where human beings seem to have designed themselves out of their own environment, the ideal cities of the Renaissance, with their impossibly perfect perspectives are something more primal and dreamlike; prototypes in fact for the examinations of the inner landscape of the subconscious as practised by proto-surrealist Giorgio de Chirico and his actual-surrealist successors.
De Chirico’s eerie ‘metaphysical’ cityscapes are essentially those ideal Renaissance cities by twilight, and artists like Paul Delvaux used the extreme, telescoped perspectives of the early Renaissance to create their own prescient sense of urban displacement. Why the kind of linear perspective that sucks the eye into the distance should so often be, or feel like, the geometry of dreams is mysterious – one plausible possibility is that it’s the point of view that first forms our perception of the world, the low child’s eye view that renders distances longer and verticals taller; we may be the hero (or at least main protagonist) in our dreams, but that definitely doesn’t mean we dominate them.
Paul Delvaux – Isolation (1955)
The use of isolated human figures, as in Delvaux and Hopper’s work, gives us a ‘way in’ to a picture, something human either to relate or respond to (although Delvaux – like Magritte in Not To Be Reproduced (1937) – emphasises the loneliness and again the ultimate unknowable nature of human beings in Isolation by showing the figure only from behind), but the cityscape that is devoid of life, or which reduces the figures to ciphers, has a very different appeal.
Rene Magritte – Not To Be Reproduced (1937)
Whereas the unpopulated landscape may suggest a prelapsarian, primordial or mythical past, or an entirely alien realm altogether, empty streets are just that; empty. These are utilitarian environments designed specifically for human beings and their patterns reflect our needs. A meadow or hillside or mountain with no visible sign of human life may be ‘unspoiled’; towns and cities, by this definition, come ‘pre-spoiled’, and the absence of people raises questions that a natural landscape usually doesn’t; Where are the people? What has happened?
That said, nothing about Hopper’s Early Sunday Morning, Algernon Newton’s paintings of Kensington (or Takanori Oguiss’s Paris, or indeed the beautiful photographs of the city in Masataka Nakano’s Tokyo Nobody (2000)) really suggests anything ominous or post-apocalyptic – but even so, the absence of life is the most noticeable thing about them. Whether intended or not, this gives a picture a psychological depth beyond that of a simple topographical study. In still life paintings from the Renaissance onwards, the use of objects with a purpose, for example musical instruments, was always more than just something pretty to paint. Whether the instrument in question was there to express the fleetingness of time (music fades away quickly) or discord (a lute with a broken string etc), it was never just an object. And so in the urban landscape, the sight of objects with a specific purpose (roads, paths, buildings) apparently not fulfilling that purpose creates a response as complex as – though very different from – the feeling of looking at those lonely figures in Hopper and Hammershoi’s paintings. Not so different, in fact, from the feeling of leaving your home in the spring of 2020 and walking down the deserted street outside.
Takanori Oguiss
These paintings can have a slightly uncanny quality reminiscent of the eerie opening scenes (the best parts) of movies like The Omega Man (1971) and 28 Days Later (2002) or John Carpenter’s classic Escape From New York (1981) where, emptied of people, any sign of life in a city becomes, not a sign of hope, but threatening and full of sinister power. Things will hopefully never reach that point in the current crisis, but as it is, avoiding people in the street is for now the new norm; for the first time I can remember, my natural reserve feels like a plus.
Algernon Newton – In Kensington (1922-3)
Those 15th century ‘ideal cities’ were part of the flowering of the Renaissance and, as with every other aspect of it, they were the product of people looking backwards as much as forwards. The actual, non-ideal cities that were lived in by the artists who painted those pictures were largely organic, messy, medieval conglomerations, regularly visited by outbreaks of disease. The ideal city’s emptiness is not only harmonious and logical, it’s clean. And like the classical sculptures, bleached white by time and weather, which were to prove so influential on that generation of artists, the aspiration is towards a kind of sterile perfection which never really existed until long after the culture that created the buildings and the art had disappeared, leaving just a ghostly husk of its former self.
Algernon Newton – Spring Morning Camden Hill, 1940
The deserted city or townscape more or less disappears from art from the 15th century until the later years of the industrial revolution, when urban life itself became a subject for modern art. And it makes sense; the reversal in European culture which saw city life become perilous and the countryside become a means of escape was a slow one, and the solution (never more than a partial one) lay in building programmes, urban renewal and harmonious town planning. Empire building and colonial expansion fuelled the growth of urbanisation and were fuelled by it; to imagine an empty city at the height of Empire was to imagine extinction. If there was any remaining collective memory of empty streets in the late 19th century, it was probably an echo of the kind of scenario that Defoe had written about*; less graced by the muses of harmony than haunted by the dead.
*or of natural disasters like drowned villages, or man-made catastrophes like the Highland Clearances.
But by the late 19th century, in Europe, plague was less a current concern than a subject for gothic horror, the memory of a memory, and industrialisation had – for those with a measure of financial security – put the city (now with drains and public transport) and the country (now sans dangerous animals and medieval lawlessness) on something of an equal footing. For the generation of the impressionists, both city and country could be celebrated, and both (as has been true ever since) could mean escape. But that impressionist cliché, the ‘bustling metropolis’, defined by Baudelaire’s “fleeting, ephemeral experience of life in an urban metropolis” – the hub of modernity, the engine of culture and progress – becomes something else when the streets are empty; it can never just be a collection of buildings.
Maurice Utrillo
Not surprisingly, perhaps, the art of the deserted street seems to some degree to be a kind of outsider art; Maurice Utrillo was an alcoholic with mental health issues, and although literally based at the centre of the Parisian art scene in Montmartre – because he was born there to an artist mother – he was nevertheless a marginal figure, and his paintings of his home town are heavy with melancholy and isolation.
Similarly, although far less gloomy, the Montmartre paintings of Maria Slavona – a foreigner, a German Impressionist painter living in Paris – are depictions of an urban landscape that, while not hostile, is enclosed and other and (to me) brings to mind the closing line of Philip Larkin’s Here: “Facing the sun, untalkative, out of reach.” Whether that mood is inherent in the paintings, or only in the mind of the person looking at them, is not something I can answer.
Maria Slavona – Houses in Montmartre (1898)
The German artists of a later generation found a similar sense of alienation at home. The Neue Sachlichkeit (‘new objectivity’) movement of the Weimar Republic may have been a rejection of the extremes of Expressionism and romanticism, but in its embracing of modernity it was a specifically urban movement too. The teeming street scenes of George Grosz and Otto Dix reflected the often chaotic street life of Germany’s big cities in the social and economic upheaval that followed World War One, much as Alfred Döblin’s Berlin Alexanderplatz (1929) was to do in literature, but there were other views of the city too. It was an era of political unrest, but if one thing united the political left and right it was the understanding that they were living in an essentially transitional period; that change would, and must, come.
Hans Grundig was the epitome of the kind of artist hated by the Nazi party; politically a communist, he used his art to oppose the creeping rise of fascism but also to capture working class life in the city (in his case Dresden). But in Thunderstorm (Cold Night), 1928, it is the environment itself that condemns the society of the declining republic: the streets are empty and ghostly pale and the buildings, run down and near-derelict, offer little shelter and no comfort. The people, whose fate looked uncertain, are nowhere to be seen; meanwhile, a storm approaches.
Hans Grundig – Thunderstorm (Cold Night), 1928
Carl Theodor Protzen – Lonely Street (1932)
Carl Theodor Protzen was, by contrast, an establishment figure; a member of the Association of Fine Artists and the German Society for Christian Art, he was to become a pillar of the Nazi art community. Urban landscapes were his speciality and his depictions of Nazi building projects were to make his name, but just prior to the NSDAP’s rise to power in 1933, he was painting pictures like Lonely Street (1932) that show those same urban landscapes, but without the excitement of progress. Less bleak and doom-laden than Grundig’s city, this is nevertheless an environment which does not embrace or protect humankind; the title reflects the child’s exclusion from the harshly geometric scene in which he finds himself and, although there is no sense of exaggeration, the perspective, as in surrealism, pushes the end of the road ever further into the distance.
This perspective is seen, too, in Volker Böhringer’s The Road to Waiblingen, painted in the year that the Nazis came to power. Böhringer, an anti-fascist painter, was later to become a surrealist, and the ominous (blood-stained?) road, stormy clouds and sinister trees suggest that this is (with apologies to Waiblingen) not a road that he saw leading anywhere very pleasant.
Volker Böhringer – The Road to Waiblingen (1933)
Ever since I was a child, I’ve always loved to visualise (usually at night) a real place, say a nearby hilltop or field, as it is at that moment, with nobody except animals and birds there to see or experience it. It’s a strange kind of excitement that depends on not being able to experience the thing you’re excited about. Psychology probably has a term for it, but at a time when people have never been more inescapable (not that one necessarily wants to escape them) there is something appealing about the complex landscapes we have created for our needs, but without the most complex element of all – ourselves – in them.
Whether we enjoy the empty streets or not (and hopefully we don’t have to get too used to them), we should probably take the time to have a good look at what is all around us; it’s a rare chance to see our world without us getting in the way.
Surrealist social distancing: Rue de la sante (1925) by Yves Tanguy
The correct response to the title here is of course it depends who you are and what you did. But anyway; in a February when the big news story was the alarming spread of coronavirus/COVID-19, which history will tell us is either (a) a pandemic like none seen since the 1918 flu outbreak, which killed between 20 and 50 million people (quite a big ‘between’, that), or (b) an unfortunate but quite normal kind of illness which is causing inconvenience and a certain amount of tragedy but is mainly a media frenzy like SARS or Bird Flu, and will blow over soon – it seems a bit like fiddling while Rome burns to talk about music and books etc. But as everyone knows, Nero didn’t really fiddle while Rome burned, and anyway, the big and relatively thoughtful thing I was writing during the Christmas holidays is no further forward and I mainly spent February writing things for other places than my own website, so there it is.
I just finished reading the newest edition* of Jon Savage’s brilliant England’s Dreaming, which is as good as any music-related book I’ve ever read and made me realise how many parallels there are between now and the political situation in mid-70s Britain. Up to a point, that is. It would be hard, I think, even for a conservative person, to see the victory of Johnson’s Tories as a return to some kind of sensible order in the way that deluded right wingers saw Thatcher’s victory – which did, it has to be said, render somewhat pointless the extreme right-wing groups like the National Front & British Movement that had been growing in strength and influence throughout the decade. As with Johnson/the ERG and their wooing of the UKIP/nazi fanbase, though, the reassurance that comes from seeing extremist groups losing popularity is soured (to put it mildly) by having people in charge who appeal to that demographic.
*the latest revised edition is from 2005, and is the one to get – the excellent introduction, which addresses the ‘Englishness’ of punk within the wider UK setting, is itself quite dated, though more relevant than ever, and this version also contains a brief summary of that most surprising part of the whole Sex Pistols story – the band’s 1996 reunion.
Reading about punk – especially remembering the very tail end of it in the early 80s (i.e. seeing the stereotypical 80s fashion punks and skinheads and reading THE EXPLOITED/OI!/PUNKS NOT DEAD etc spray-painted all over the place) – it’s hard to imagine the force the movement had in ’76-7. In my own era, Acid House/rave culture/etc has had an even bigger impact on music and arguably a comparable one culturally, but although it annoyed grownups and upset politicians it was never as deliberately confrontational or as alien and ugly as punk. Its figureheads, insofar as it had any, could certainly be ‘outrageous’ in a way, but Shaun Ryder and Bez swearing on TV was worlds away from the omnipresence of the Sex Pistols in the UK media of the 70s; not least because the Sex Pistols and punk had already happened. Pop stars being obnoxious in the 90s was not a phenomenon – and the Pistols, despite everything, were a recognisable thing: a pop group or rock band.
The public and the tabloids knew about the existence of acid house, and might be alarmed by the ‘acid’ aspect in particular – but as far as signing record contracts, being on TV or playing concerts went, there wasn’t much to report on. An interesting thing about the acid house/rave phenomenon was that, although it was a musical movement, the music and its makers barely featured in the moral panics that ensued; it was all about the audience. Whether this made it more frightening to the older generation, I don’t know. In the 60s, the Woodstock kids might have been seen as outrageous, dirty, drug-taking hippies, but maybe the fact that they were being ‘incited’ by Jimi Hendrix, The Who, Country Joe etc in a field (much like the teenagers in the 50s were under the influence of Bill Haley/Elvis etc and punk kids in the streets were being led astray by Rotten & co on TV) gave a clear them/us or leader/followers divide and made it easier to condemn/contain/control them? This is an interesting thing that I should think about more – except that I’m almost certain that there will be a book out there by someone who has thought about it more and knows a lot more than I do about the 90s (the most I can say is ‘I was there’; I mostly wasn’t very interested in acid house etc at the time).
Anyway; certainly the punks were heirs to the hippies (not that they would have welcomed the comparison) in that the visibility of the punk audience (who, whatever their claims of individuality, were clearly – especially by 1977 – dressing in emulation of other punks, of whom Johnny Rotten was the most visible example) marked them out as ‘other’. And made them a target of the authorities, as well as a flag for disaffected kids to rally to. The subtitle of England’s Dreaming – “The Sex Pistols and Punk Rock” – is important. The Sex Pistols may not have the strongest claim to have invented punk, but in a sense that isn’t as true for other foundational bands, they were punk; their career trajectory – form band, play shows, cause outrage, record demos, cause outrage, sign contracts, appear on TV, cause outrage, get dumped by label, cause outrage, get banned from venues, release singles, cause outrage, release album, cause outrage, split up, have a member die – all within the space of around two years, is a microcosm of UK punk. The British punk scene was born with them and it essentially died with Sid Vicious; everything thereafter is either post-punk, second-wave punk or pastiche. Whether embodying a movement is an achievement as such is hard to say, and in a way it doesn’t matter, but what Savage documents is the way in which a youth movement – one with many and varied influences and antecedents – absorbed and expressed the anxieties of its time and in turn embodied and shaped them.
Away from that book, here’s my pick of the most interesting things to be sent my way in February.
Out in April is a reissue of a noise-rock classic from 1995:
Caspar Brötzmann Massaker
Home
Southern Lord Recordings
Sounding something like The Birthday Party playing noisy free jazz, the Massaker are a brutal guitar-bass-drums (with minimalist vocals) trio; heavy on feedback, tense dynamics and churning distortion, but sometimes almost groovy and (very) occasionally kind of pretty. Home was their fifth album and it’s pretty similar to the only other one of their albums that I know, The Tribe, from 1987. Squally, angular and dark but with insistent percussion, it’s a great palate-cleanser for your ears after too much pop music.
I could say the same about this very different but equally eccentric record:
Zhenya Strigalev (saxophone), Jamie Murray (drums) and Tim Lefebvre (bass) have made a frankly insane-sounding but weirdly addictive record that at different times reminds me of the John Zorn/Bill Laswell/Mick Harris jazz/grind band Painkiller, Ornette Coleman and King Tubby. But it also has the odd moment of funk, breakbeat and drum-n-bass. Nevertheless it’s amazingly coherent and although at times I thought Murray, Strigalev or Lefebvre was what made it so great, subtracting any one element would make it all collapse. Recording something at once as familiar and peculiar as any song here (‘Guilty Look3’ is a great example) is a special skill. Disrespectful borrows from everywhere and yet somehow sounds like nothing else – and really that’s just what jazz is all about.
Perchta
Ufång
Prophecy Productions
This Austrian black metal project has a very specific local (Tyrolean) focus but, judging by its Facebook page, is the brainchild of Italian ex-pat Fabio D’Amore of symphonic power metal band Serenity; which makes sense – for all its atmospheric/folkish elements (there are some very nice jangly clean parts), this is a theatrical, musicianly album which feels epic and polished rather than dark and brutal. The band’s name refers to a pagan goddess, and throughout the album an odd, witchy narrator pops up declaiming or whispering; I assume she is the woman in the artwork, whom the promotional material refers to as “the front woman [who] will sermonize, face-painted in historical black garb with embroidered belt and cast-iron broom …”
Not really my cup of tea overall, which is a shame because I really like the idea of the Tyrolean folklore etc, but it’s extremely well done, has some very good tunes and, with the usual excellent Prophecy treatment, will no doubt find its audience.
For as long as I can remember, Christmas has always been commercialised, and there have always been people who complain about it, but I’m not one of them. That commercialisation might have noticeably accelerated, even in the five years since I wrote the article below, but so has the commercialisation of everything else. Whereas probably my favourite traditional British celebration, Bonfire Night, has withered away thanks mainly to the fact that its most marketable feature – fireworks – has not unreasonably become harder to buy and sell, Halloween, which is infinitely marketable, has grown exponentially. But although it’s a far bigger deal – what advertisers have started to refer to as “Halloween Season” – it feels oddly diminished in spirit. That said, I doubt if children now would prefer it to go back to being just a fun night for kids to carve turnips (which now seems hilarious, even though I still think turnips are more entertaining and creepy as lanterns), wear crappy home-made costumes and “entertain” strangers to get sweets or (if you were very lucky) money. Far more aggravating is the purely commercial event ‘Black Friday’, which has no business existing outside of the USA, where it’s tied to Thanksgiving, but which inevitably arrived here and this year, for the first time (that I’ve noticed), was referred to in TV adverts as “Black Friday season.” That doesn’t even make sense. Ah well. On to Christmas, and the year that I had broken my leg in November.
It’s mid-December as I write this and the farmyards, the stately homes and recently converted stable blocks and steadings of rural Fife are aglow with rustic Christmas ‘Fayres’. Leaving aside the fact that I’m clattering around the cobblestones on crutches, what could be more festive and redolent of Christmas past than the twinkling lights, the smell of trees, the mulled apple juice and stalls selling home baking, handmade decorations, crafts of all varieties? Sitting on a low straw bale – you forget how hard straw is – in the corner of a huge barn, next to an old cable reel (now a makeshift table), it’s easy to forget that this is – to me at least – all relatively new. It feels both very archaic and very now.
Ancient barn walls, lumpy and whitewashed, and high above, a new corrugated iron roof, halogen strip lights glowing far more brightly than the dreich, pewtery daylight that gleams in the small inset windows overhead. In the barn, a kind of fantasy village square has been erected in four aisles; stalls with red and white striped awnings and, in the main thoroughfare, a huge Christmas tree, wide, dark and feathery in its own little low-fenced enclosure, pulling the huge space in towards itself with outstretched, maypole-like tendrils of tinsel, braided with fairy lights, a benign still centre around which a river of human traffic flows noisily.
Woodcrafts, papercrafts, ceramics, chocolate, textiles – this is a handmade version of Christmas that feels traditional and olde worlde (cinnamon and orange slices floating in the apple juice; carol singers rising above the sound of the generators) while also being extremely fashionable and 21st century. It’s a secular Christmas, albeit one that is characterised by both pagan and Christian accoutrements. This Christmas is genuinely welcoming to all, which is, I suppose, the ‘real meaning’ if there has to be one; really, sharing Christmas with everyone else is the least the little baby Jesus can do for hijacking a pagan winter festival.
Technically there is commerce here as everywhere else, but the tupperware tubs of coins and the brown paper bags and recycled, hand-stamped business cards hardly feel ‘commercialised’ in the He-Man and Transformers sense of the childhood Christmases that I remember (or indeed in the sense of the relentless “Christmas adverts” which, unbelievably but cleverly, seem to be a talking point in themselves now. With typical irony, the Christmas industry (I think it can safely be called that) favours the same pre-industrial image of the festival that these self-consciously rustic fayres try to embody). It’s all very twee and, to reinforce that tweeness, I’ll say it’s kind of an upcycled (eye roll) memory of Christmas that almost certainly doesn’t bear much resemblance to anything that the stallholders and organisers at the fayres remember, let alone the executives from John Lewis and Tesco and Lidl. In fact, if anything, it feels like an attempt to create Christmas as it has been depicted in cards and calendars since time immemorial.*
But, as willed and artificial as all this seems, it will one day be the (or at least a) genuine memory of childhood Christmases for all of these kids who are roaming around looking at their phones in the draughty barns and muddy farmyards, looking bored (teens), excited (toddlers) or vaguely bemused and drinking their hot chocolate – and that’s kind of nice.