I’d like to open with a kind of history. This history takes many forms and surfaces in many different places with the names of the actors sometimes replaced. Occasionally, the role of the nation-state is assumed by religion and at other times it is the gods of classical antiquity who take the lead. Regardless of which iteration of this history you have heard, its narrative will be familiar to you for it is a narrative of loss.
Once upon a time, people lived in tribes. These tribes were small social entities made up of a number of different family groups that pooled their resources. Members of tribes lived together, worked together and died together, and this permanent state of communion with others made their lives meaningful. Of course, human nature being what it is, tribes could not peacefully co-exist and they soon began conquering each other until their dominion extended over millions of people and thousands of miles of territory. Because these abstract tribal groupings were a lot harder to manage than a couple of families that had been living and working together for generations, tribal elders reinvented themselves as governments ruling over abstract political entities known first as kingdoms and principalities and then as nations and states. Of course, nation-states were never anything more than a way of referring to the territory under the control of one particular government, but they stuck around for long enough that people began to forget their tribal loyalties and began to see their nationality as a fundamental fact about themselves, a fact no different to their sex, their gender, their sexuality or their race, a fact that took the form of a noun.
These nouns soon became so important to people that they started to identify with the interests of their local state to the point where they would agree to fight wars in order to further the influence of that state. People even went so far as to start seeing people with different nouns as inherently inferior to them. These nouns may well have been invented to replace older ones that lost their meaning over centuries of tribal conflict, but they also allowed people to enter a state of permanent communion that imbued their lives with meaning. Suddenly, a man in a toga could set foot on an alien shore and know that he stood there as one of the few, the cives Romani, those destined to rule the world. The same can be said of the citizens of the Reich, the subjects of the British Empire and the people of the United States of America. All of these states are nothing but collections of nouns but they are nouns big enough to blot out the stars and lend human lives the appearance of meaning. As Ernest Gellner puts it in Nations and Nationalism (1983):
Far from revelling in the defiant individual will, nationalists delight in feelings of submission or incorporation in a continuous entity greater, more persistent and more legitimate than the isolated self.
However, times change and so do nouns and some nouns are now much smaller than they once were. There was a time when ‘Christian’ or ‘American’ were large enough to fill the world but now they seem far too small to fill the void of meaning in our lives. So people began casting around for other nouns to replace the nouns of their parents but this task has been made almost impossible by the fact that our culture now trains us to value individualism, and individualism ensures that just as the old nouns begin to shrink, the new selves expand outwards towards the stars.
Regardless of whether or not you believe this history to be true, the reality is that today’s humans look at themselves and conclude that they are too unique, too complex and too special to ever be summed up with such simple phrases as ‘British’ or ‘Jewish’. People need new nouns… nouns that will both complete them and complete the world.
The problem is that no matter how many nouns we collect and staple to ourselves, none of them ever manage to completely encompass the wonder that is us, and so we collect more nouns with more and more precise meanings:
Why simply be British when you can be English? Why simply be English when you can be from the South? Why simply be from the South when you can be from London? Why simply be from London when you can be from north of the river? Why simply be from north of the river when you can be from Hoxton? Why simply be from Hoxton when you can point out that you were from Hoxton before it got cool? But then, there’s so much more to us than where we are from… It never stops. There are never enough nouns.
The more nouns we collect, the less certain we become of who we are. Nouns were once vast things made of iron and brick; now they are small and fragile things that slide between our fingers like rancid oysters. We are lost, we are alone and we are completely incapable of articulating who and what we are. Clearly, we have come a long way from the man in the toga. This process of definitional inflation is brilliantly demonstrated in Charlie Kaufman’s Synecdoche, New York (2008). The film revolves around the character of Caden, who attempts to write and stage an autobiographical play of such brutal and uncompromising honesty and realism that it completely encapsulates the very essence of his being. Decades later, the play’s set has expanded to the size of a city and the play itself contains not just all the characters in Caden’s life but multiple meta-versions of them, arising from his attempt to capture the essence not just of his life but of the process of writing a play about his life, and then the essence of writing a play about writing a play. No matter how many actors Caden recruits and how much time, energy and money he pours into the project, the essence of his being remains impossible to encompass.
One solution to this problem would be to simply stop trying to pin ourselves down, but to do this would be to acknowledge the inherent meaninglessness of existence and that is simply too much to bear. So, like the artists of yore, we have abandoned the language of nouns in favour of the more ambiguous and evocative language of symbols.
When early Christian theologians began to apply philosophical techniques to Christian texts, they rapidly ran into difficulties. The product of Jerusalem did not gel with the product of Athens. While the problem of applying conceptual analysis to religious texts might be taken as a sign that the religious texts made little sense to begin with, these early Christian thinkers took the disconnect between theology and philosophy to be an expression of some intimate facet of the divine essence. God, they claimed, was profoundly ineffable and no human concepts or nouns could ever hope to encompass his infinite inscrutability. Because of this intellectual impasse, the theologians developed a set of techniques known as apophatic – or negative – theology. Negative theology functions by accepting that God is indefinable and then attempting to describe him in terms of that which he is not. As the influential apophatic theologian Pseudo-Dionysius the Areopagite put it in his Mystical Theology:
For the higher we soar in contemplation the more limited become our expressions of that which is purely intelligible; even as now, when plunging into the Darkness that is above the intellect, we pass not merely into brevity of speech, but even into absolute silence of thoughts and of words.
Obviously, there is something quite amusing about the fact that while humanity once struggled to define God, it now struggles to define itself instead. There is a delicious quaintness about our species’ ever-expanding self-esteem but there are also portentous speeches to be made about the dangers of placing man on the same pedestal as his Creator. O Scientist! O Hubris! Oh bullshit… the reason why apophatic theologians opted to talk about what-God-is-not rather than what-God-is is that it is a lot easier to pick something out when it is placed before a uniform backdrop. This is why people choose to openly proclaim their rebellion rather than simply doing things a bit differently.
In the years following World War II, American military might was second to none. Yes, the Russians could probably have fielded a larger army but the Russians did not have nuclear weapons. America’s vast military might was built on the same industrial foundation as the vast standard of living enjoyed by many middle-class Americans. All across the country, thousands of factories turned out the kinds of mass-produced consumer goods that made America seem like the land of tomorrow. However, because the provision of these consumer goods relied upon economies of scale, American consumer capitalism did not allow for much variation from the norm in terms of lifestyle choice. The truth is that if you lived in the suburbs with a husband, two kids, a dog and a huge car then your dreams were being satisfied on a truly industrial scale. However, if your dreams failed to fit within the templates of American industry then you were shit out of luck.
Having grown up in a culture where substantial generational improvements in the standard of living were normal, many young Americans found themselves unimpressed by the lack of control they had over their lives. They felt entitled to control their destiny and to choose their lifestyle, and when American capitalism failed to pander to their unique lifestyle choices, they opted to rebel, to grow their hair long, to smoke marijuana, to write poetry, to date outside their ethnicity and to live in apartments in the city rather than houses in the suburbs. At the time, these deviations from the norm seemed intolerable because the entire fabric of Western civilization was built upon the assumption that everyone wanted the same things and that, so long as governments provided those things at a reasonable cost, people would pay their taxes and keep their noses clean.
Right from the start, the decision to rebel was all about purchasing decisions. Because members of the counter-culture did not need to wear suits to work, they opted instead to dress in unfashionable but much cheaper second-hand clothes or to make their own outfits. Rather than drive cars into work and buy houses in the suburbs, members of the counter-culture decided to live and travel collectively. While many of these iconic purchasing decisions originated in the practicalities of living on very little money and with very little choice, many of these choices came as a result of a deeper desire for self-definition. After all, if you are not normal… then what are you? Abnormal. And how do abnormal people dress? Well… if you don’t know, you turn on your TV or pick up a magazine and you look at the abnormal people there and you model yourself on them, but in order to model yourself on those people, you need to be able to buy those kinds of clothes. Thus the desire to reject the lifestyle demanded by American industry became a set of needs that could be satisfied by American industry. Today, businesses pander to so many ‘alternative lifestyles’ that the baseline against which the original counter-culture tried to define itself no longer exists outside of TV’s Mad Men. Today, we all live in our own little tribes and we broadcast our membership of those tribes through purchasing decisions that really do not amount to rebellion at all. As Thomas Frank put it in his immortal essay “Why Johnny Can’t Dissent” (1995):
What we understand as “dissent” does not subvert, does not challenge, does not even question the cultural faiths of western business.
Just as our hunger for self-definition drives us towards more and more specific lists of nouns, our attempt to define ourselves in physical terms through the adoption of certain styles and the purchasing of certain products pushes us to more and more specific exercises in consumerism that businesses are only too happy to respond to.
When one thinks of consumerism, one generally thinks of expensive cars and designer clothes. Indeed, designer goods are expensive and come only in small sizes precisely because they would cease to be desirable if everyone could both afford and fit into them. Exclusivity is part of the allure because to access exclusive goods and services means that, by definition, you are part of a minority and in an age of individualism, everyone wants to be in a minority. While this type of consumerism is particularly obvious in the field of clothing where people willingly pay over the odds for remarkably mundane objects such as bras and trousers, consumerism is actually present in every aspect of our lives. For example, open up a fitness magazine and you will find that, while these magazines frequently do give practical work-out and dieting tips, what they really want to do is sell you stuff. The truth is that all you need in order to be a runner is a pair of feet and two working legs but look at a running magazine and you’ll wind up thinking that you also need new trainers, new socks, new shorts and a new Wi-Fi-enabled pedometer that automatically uploads your running data to a website that ranks you by country, by town and by street and allows you to download an app that’ll honk obnoxiously every time you walk past someone who does not perform as well as you. Even wannabe writers have the option of purchasing an array of writing guides and distraction-free writing environments when all they ever actually need is a pen, a piece of paper, an idea and some time.
While the above paragraph may come across as sneering, the truth is that I do not live in the woods and eat bark. I am a human living in a capitalist society and so I have internalised the language of consumerism in much the same way as I internalised English and French while growing up. Indeed, when I want to know the way to the tube station, I express myself in English. When I want to know the way to the metro station, I express myself in French. When I want people to be aware of my fondness for the films of John Carpenter, I express myself using a Pork Chop Express t-shirt. Like everyone else, I feel the hollowness at the heart of existence and I find it much easier to fill that hole with stuff than I do to come to terms with my own ultimate insignificance.
This is not an essay denouncing the evils of capitalism and consumerism. I am not Jack’s smirking revenge and I will not be putting a gun in my mouth in an attempt to kill my charismatic alter ego. This is an essay that attempts to place consumerism in a particular psychological context that can help us to understand certain odd facts about the world, including why people are so eager to jump on social media bandwagons.
Humans have always existed as little islands of subjectivity imprisoned in their skulls and all we ever experience is our selves and our reactions to the world. We never see what other people see, we never feel what other people feel and we never know what other people know. The skull that protects our fragile pink goo from hungry birds also serves as a barrier between us and other subjectivities.
Because we live and die without ever directly experiencing the subjectivity of others, we spend a lot of time trying to imagine how other people feel. In fact, it would not be possible to interact with other people without generating some kind of model of human psychology. This model, referred to by psychologists and philosophers as Folk Psychology, throws together everything we intuitively ‘know’ about human behaviour and uses that knowledge to make inferences about what people are thinking and feeling based upon the physical manifestations of those feelings and thoughts. For example, while we can never know how someone feels, we can notice when someone bursts into tears as a result of something we did. Then, using our folk-psychological models, we can make a series of inferences that allow us to guess at what just happened:
A) We see that someone started crying after they learned about a particular event and so we can infer that there is a causal connection between that event taking place and this person breaking into tears.
B) We know from our own experience that we only ever break into tears when we feel really bad and so we can infer that this person only started crying because they felt really bad.
C) We can infer from (A) and (B) that this person is feeling really bad because they learned about a particular event.
Unfortunately, given that much of our interaction with other humans takes place through the lens of Folk Psychology, it is easy to become obsessed with folk-psychological data to the point where we forget that the data is only really there as the basis for inferences. This is why people who want to become a particular noun frequently wind up obsessing over the means of signalling that noun’s dominion over them. This is why people who have worked all their lives to escape the working class tend to avoid anything that might suggest that they are working-class. This is why people who want to become writers devote themselves to finding the best possible pen, chair or writing environment. Clearly, if people could instantly access our souls and know that we were over-educated film fans then we would not feel the need to wear glasses with chunky black frames and clever T-shirts that refer to John Carpenter movies. Our alienation from other subjectivities has turned us into a race of inverted Sherlock Holmeses. Indeed, while Sherlock looked at the ink stains on someone’s fingers, the hunch of their spine and the fraying of their right sleeve and inferred that they were a clerk, we reason that if we buy the right kind of pen, the right kind of glasses, the right kind of writing guides and the right kind of distraction-free writing environment, we will magically become authors. This is why it is a lot easier to buy books (and post pictures of them on the internet) than it is to actually read them and think about them.
Far from being a problem or a spiritual sickness, as some would have us believe, consumerism is a solution to a problem, and that problem is the need to inform the people around us of who and what we are. We could just sit them down and tell them, or we could become their friends and spend twenty years of our lives allowing them to get to know us, but these are fast-moving times and we need to express our individuality as quickly and as effectively as possible, and that means communicating using the non-verbal symbolic cues of consumer capitalism.
Consumerism worked brilliantly as a form of non-verbal communication as long as people conducted their social interaction in a physical space that allowed them to display their possessions. However, when people started conducting large amounts of their social interaction online, consumerism began to break down as a form of self-expression. Indeed, though Facebook may well be full of people posing in front of their cars, the truth is that the trappings of consumerism really do not translate all that well into the virtual world. This withering of traditional consumerist language is one of the driving forces behind the ever-accelerating move away from physical media and towards live streaming, electronic formats and cloud computing. Indeed, at a very simplistic level, expensive leather-bound books have little value as status objects when nobody ever gets to see them. Similarly, the market for evening wear declined when people stopped dressing up to go out for dinner just as the price of antique dining tables collapsed when people realised that they no longer had dinner parties that required twenty matching dining room chairs.
Back in the early days of the Internet, cultural anthropologists made a big deal about the potential of cyberspace for self-expression. The idea was that, by allowing people to be socially present in a cultural space where they were not physically present, cyberspace allowed people to try out various identities that had very little to do with their physical beings. In other words, cyberspace allowed men to be women, straight people to be gay and humans to be anthropomorphic wolves with large sets of genitalia. As Peter Steiner’s famous New Yorker cartoon so wonderfully put it, “On the Internet, nobody knows you’re a dog”. However, while noms-de-guerre, fictional personae and distinctive visual avatars do indeed exist in various corners of the Internet, the current trend is very much towards owning your online identity and acknowledging its connection with a real meat-space individual. For example, many bloggers now write under their real names and even those who do not still try to maintain a coherent online identity, so that they stand by their words and their words can be traced back to them. Even people who change their name on Facebook in order to ward off snooping parents, teachers and employers ensure that their real friends know who they are. Far from being an area of playful experimentation, online identity is a very serious matter and this is why people were so happy to hand over vast amounts of information about themselves to Facebook and Google.
While this essay is written very much with social media in mind, I’d like to consider a few other forms of online interaction as a demonstration of certain principles governing online self-expression.
One can express one’s individuality by joining a forum devoted to a niche interest and engaging with that particular community. While this process is most evident in the case of forums that advertise their domain of interest, the same is also true of platforms such as Twitter, where important figures in certain communities regularly acquire followings far in excess of the numbers allowing for meaningful interaction in either direction. One imagines that the ‘mentions’ column of Neil Gaiman’s Twitter feed is an incomprehensible blur of obsequious praise and re-tweets and yet people continue to follow him and attempt to interact with him. Why? Because they are the type of people who are interested in Neil Gaiman, and the type of people who are interested in Neil Gaiman constitutes a community with a good deal of overlap with communities interested in comics, films and/or fantastic literature.
Of course, while simple membership of a niche community will allow you to differentiate yourself from the bulk of humanity, it does not guarantee individuality, particularly when the niche community is quite large. One means of establishing individuality whilst remaining a member of a particular cultural space is to engage in a form of de-materialised materialism in which one signals one’s particular fondness for some works over others. This tendency is evident in the lists of people’s likes on Facebook as well as in the popularity in online gaming circles of things called Gamertags that broadcast not only a person’s preferred games, but also their level of proficiency with those games.
Blogs are another form of de-materialised consumerism in that they frequently express (at great length) not only a person’s individuality in terms of the media they consume, but also how they feel about particular works and the more nuanced types of work that exist within a given medium. For example, as I say on the ‘About’ page of my blog:
Ruthless Culture is the blog of Jonathan McCalmont, a freelance critic living in the UK. Aside from writing about films, books, comics and games for a number of magazines and websites he also serves as a festival scout for film distribution companies. When not hiking he maintains this blog which, though mostly given over to film-writing is also a venue for endless complaints about the wretchedness of the human condition.
Even if you never meet me, this paragraph allows you to distinguish me from other people. The look of the blog is also part of that distinguishing process, as is the choice of what I write about. Based on what is currently on the front page of my blog, I am the kind of guy who writes about films, contemporary culture and (of course) myself. Read a little further and you’ll learn that I’m the guy who hates the work of Connie Willis but loves the work of Michael Bay. Each of these judgements, issued in the form of reviews and articles, allows me to express my feelings about certain works, but it also allows me to express myself in a way that carves me out from everyone who isn’t me. I could invite you to my house and show you my DVD and book collections, but my blog allows me to be an individual without ever having to interact with other people in meat-space. In fact, my blog is a far more effective form of self-expression than conspicuous consumption could ever hope to be.
It is interesting to note that most blog posts elicit little or no reaction at all. People spend hours at a time expressing their opinions about works of art and these remarks seldom prompt any form of discussion. They are posted and then promptly lost to the waves in a constantly churning digital sea. The pathetic futility of such an undertaking is difficult to understand until one realises that maintaining a blog really does not require any form of audience support or participation. If people respond and want to discuss a particular opinion then that is great, and if that opinion earns its creator a much wider audience and a level of online celebrity then that is even better, but most bloggers work in almost complete obscurity because the point of the exercise is not to become famous or respected; it is to express oneself, and one can express oneself quite happily even when nobody cares about that expression.
See my blog.
See my nouns.
See me.
To describe blogging and social media as self-indulgent may well be both high-handed and dismissive, but it is also true: social media, blogging and online social interaction are all about indulging the very human need to express one’s individuality. In fact, most human activities boil down to more or less circuitous exercises in pointless self-expression. I’d even go so far as to suggest that seeing political decisions as a form of self-expression is far more fruitful than traditional models of international relations that assume both rationality and strategic forethought. For example, what were the Afghan and Iraq wars if not political actes gratuits designed to signal America’s power and willingness to fight in the aftermath of a humiliating failure to defend American citizens? Similarly, if you examine the history of nuclear proliferation you will find that the major driving force behind the development of nuclear weapons by countries such as France, Britain and Russia was the need to not be left behind and to retain a place at the ‘top table’ of international affairs. Mutually Assured Destruction came later. Much later.
An interesting explanation of this need for perpetual self-expression can be found in the works of the French philosopher Jean-Paul Sartre. Sartre argued that human beings are fundamentally empty. Unlike inanimate objects such as tables and chairs, we cannot be encapsulated by a simple noun; no single word or sentence can hope to contain everything that we are and everything that we will be in the way that ‘chair’ encompasses the complete essence of a chair. We cannot live safe under the protection of a single noun because we lack the type of fundamental essence that every inanimate object possesses at its creation. As Sartre himself put it, we are beings-for-themselves trying to pass for beings-in-themselves; we are nothings trying to pass ourselves off as somethings. However, rather than recognising this hollowness at the centre of our being, we try to squeeze ourselves into an array of culturally sanctioned and industrially pre-rendered identities. Sitting in a Parisian café, Sartre recoiled in horror at our tendency to display what he thought of as ‘inauthenticity’:
All his behaviour seems to us a game. He applies himself to chaining his movements as if they were mechanisms, the one regulating the other; his gestures and even his voice seem to be mechanisms; he gives himself the quickness and pitiless rapidity of things. He is playing, he is amusing himself. But what is he playing? We need not watch long before we can explain it: he is playing at being a waiter in a café.
The alternative to acting out these pre-rendered identities is to be authentic, but existentialist philosophers such as Heidegger and Sartre tended to be rather sketchy when it came to defining precisely what they meant by that term. As with the apophatic theologians, this tendency towards hand-waving could be chalked up to intellectual dishonesty, but it could just as easily be a symptom of the wider problem of encompassing the self in language. Indeed, in his book In Search of Authenticity (1995), Jacob Golomb points out that authenticity is a concept that exists at the very limits of human language and so it is much easier to express authenticity in terms of its opposition to inauthenticity because we all know when someone is being inauthentic. When we act out pre-rendered roles (like the waiter that Sartre so despised) we are being inauthentic and acting in bad faith; we are surrendering our freedom to external influences and allowing those influences to determine who and what we are.
Like most existentialists, Sartre took a very negative view of inauthenticity. In his essay Existentialism is a Humanism (1946), Sartre likened inauthenticity to a form of cowardice whereby people glimpse the nothingness at the heart of the human self, realise the freedom this grants them and flee towards the safety of socially-sanctioned pre-rendered identities such as that of a waiter, a writer or a runner. One of the many unfortunate side-effects of the under-representation of women in the history of philosophical thought is a tendency towards what can only be called ‘macho posturing’.
Provide a man with a yardstick by which he can measure himself and that yardstick will soon be broken upon the back of someone that that man deems to be inferior to him. Authenticity might well be a complex and nebulously defined concept but that has not stopped male readers of philosophy from using it as a symbol of personal integrity. To this day, when men read Nietzsche’s Thus Spoke Zarathustra (1885), they do not identify (as Nietzsche suggests they should) with the rope that stretches from beast to superman, they identify with the superman and look down their noses at the pitiful mortals who exist within the safety of boundaries created by bourgeois society. Clearly, Jamie Lee Curtis’s character in A Fish Called Wanda (1988) was quite correct when she pointed out that apes can indeed read philosophy; they just tend not to understand it.
Inauthenticity is not a weakness, a sickness or an example of cowardice. It is humanity’s natural state… it is what we do. When confronted by the meaninglessness of existence and the gaping void that exists within every one of us, we try our best to fill the void by any means necessary. Even Necrophilia.
Every now and again, my Twitter and RSS feeds erupt with news of the death of some more-or-less beloved celebrity. The closer I am to a particular community, the more hysterical the pronouncements of grief become. Particularly gruelling was the weekend following the death of the polemical journalist Christopher Hitchens. Some claimed that ‘Hitch’ had changed their lives, others claimed that he was one of the finest ever writers of the English language. Frankly, I didn’t get it. Despite being an atheist I thought that God is Not Great (2007) was facile rabble-rousing devoid of any fresh insight and my interest in foreign affairs and international relations in no way allowed me to distinguish Hitchens’ views on the War on Terror from those of any other demented, ill-informed right-winger with racist tendencies. I do not think that Hitchens was much of a writer or much of a thinker and yet I completely understand why people felt the need to praise him in the way that they did.
In an article that does little to conceal its author’s frustrations with social media, Glenn Greenwald argues that the praise heaped upon Hitchens is the result of a failure to differentiate the natural decorum that follows someone’s death from the highly politicised process of cultural beatification. After going over Hitchens’ failures of political judgement at some length, Greenwald concludes with an anguished howl:
Nobody should have to silently watch someone with this history be converted into some sort of universally beloved literary saint. To enshrine him as worthy of unalloyed admiration is to insist that these actions were either themselves commendable or, at worst, insignificant. Nobody who writes about politics for decades will be entirely free of serious error, but how serious the error is, whether it reflects on their character, and whether they came to regret it, are all vital parts of honestly describing and assessing their work. To demand its exclusion is an act of dishonesty.
Nor should anyone be deterred by the manipulative, somewhat tyrannical use of sympathy: designed to render any post-death criticisms gauche and forbidden. Those hailing Hitchens’ greatness are engaged in a very public, affirmative, politically consequential effort to depict him as someone worthy of homage. That’s fine: Hitchens, like most people, did have admirable traits, impressive accomplishments, genuine talents and a periodic willingness to expose himself to danger to report on issues about which he was writing. But demanding in the name of politeness or civility that none of that be balanced or refuted by other facts is to demand a monopoly on how a consequential figure is remembered, to demand a license to propagandize.
In truth, I suspect that Greenwald is just as guilty of conflation as the people he criticises. Indeed, while Reagan’s on-air canonisation was indeed quite aggressively policed in the same manner as I expect Margaret Thatcher’s to be, I have yet to encounter anyone who says that we should not be critical of Hitchens because he is dead. Greenwald is conflating the canonisation of Reagan (which was policed for political reasons) and the canonisation of Hitchens (which was initially universal but in no way policed). Needless to say, I completely agree with Greenwald that sympathy for the dead, their family and the people emotionally attached to that figure should in no way prevent us from judging their legacy with either dispassionate objectivity or full-throated political agency. However, I do not see this process of grief inflation as in any way recent or negative.
Back in 2004, the British think tank Civitas published “Conspicuous Compassion”, a report into the unprecedented public outpouring of grief that followed the death of Princess Diana. In his somewhat ill-tempered introduction, the report’s author Patrick West claims that:
Such displays are sheer opportunism. They do not reflect, as some contend, that Britain has thankfully cast off its collective ‘stiff upper lip’. They are the symptoms of a cynical nation. To judge by the ‘outpourings of grief’ over Diana in August 1997, one would have thought her memory would have remained firmly imprinted on the public’s consciousness. Yet, on the fifth anniversary of her death in August 2002, there were no crowds, tears or teddies. Diana had served her purpose. The public had moved on. These recreational grievers were now emoting about Jill Dando, Linda McCartney or the Soham girls.
West argues that, far from expressing genuine empathy, these public displays of concern are selfish forms of self-expression. By wearing the coloured bracelets of charitable organisations, we are not showing our concern for these causes so much as we are broadcasting the fact that we are the kinds of people who care about AIDS, famine, land-mines, homeless people and war veterans. Similarly, public outpourings of grief are all about displaying our capacity for empathy.
The performative and self-expressive nature of public displays of grief and outrage is particularly evident in social media where people aggressively voice their feelings in a way that they would never do in real life. People wail and gnash their teeth over the death of celebrities despite the fact that said celebrities were frequently quite elderly and/or ill. A salient example of this type of thing took place at a science fiction-related talk I attended a little while ago. During the conversation, someone announced in passing that the veteran science fiction author Philip Jose Farmer had recently died. One man in the front row reacted to this news by literally shouting “Oh No!” and then muttering about how sad it was. At the time of his death, Philip Jose Farmer was 91 years of age. Was the man in the front row genuinely upset about the death of a decidedly ancient science fiction author he had never met? Possibly. Did he want everyone in the room to know that he was the kind of person who cared deeply about Philip Jose Farmer? Absolutely.
The performative aspect of online displays of emotion is also evident in those moments when someone manages to get on the wrong side of a hot-button issue. What usually happens is that people will encounter this differing (and possibly offensive) opinion and begin linking to it for other people to go and look at. Once the traffic to this opinion reaches a certain point, people begin reacting to it by angrily sniping at both the opinion and its author regardless of whether they know the author or the context in which the remarks were made. Forwarding these kinds of links and expressing your disgust at the person who made the remarks serves an important social function: it allows you to broadcast your ideological and emotional compatibility with a certain online community (‘I’m with you guys’), but it also allows you to emphasise the distance in cultural space between you and the bullied individual (‘I’m closer to you guys than that guy is’) as well as to derive kudos from the power of your attack upon the individual (‘I deserve to be one of you guys because I really got him good’).
Another reason why people get so very publicly upset about the death of certain celebrities is that humanity has a long tradition of using the dead as a means of identifying the living. Indeed, one of the most basic ways in which humans express themselves is through the public exclamation of their ancestry. To this day, people from Iceland bear patronymic (and occasionally matronymic) surnames. When Magnus Magnusson became a TV celebrity in the UK, his name not only identified him but also exclaimed the fact that he was the son of a man who was also named Magnus. Similarly, when I go out into the world under the name ‘Jonathan McCalmont’, I do so knowing full well that this name locates me as part of a wider family, a family whose nature and history could help to define my identity if I moved in circles where my family was well known. What these public displays of grief demonstrate is that ancestors need not necessarily be linked to us by blood. Robert Pogue Harrison reflects upon the phenomenon of ancestor adoption in his book The Dominion of the Dead (2003). According to Harrison:
To some of our ancestors we are condemned by blood, race, and cultural history. We can hardly avoid their claims on us, or make as if such claims didn’t exist, for we are always – whoever we are – thrown into our biocultural ancestry. While it does not dispense us from the claims of our tribe, authenticity opens up the freedom of ancestral adoption, as it were, in and through which we turn ourselves into the elective heirs of non-consanguine predecessors. An adopted ancestor is more than a hero whom one aspires to emulate, more than a forerunner whom one has read, studied, admired, or been influenced by. He or she is someone who fosters the becoming of who one already is, someone whom one “makes one’s own” through elective affiliation and unconditional allegiance.
The example Harrison gives is of Baudelaire’s decision to adopt Edgar Allan Poe by means of offering up prayers to him every night alongside Baudelaire’s father and governess. Of this strange act of adoption Harrison concludes:
Personal choice, spiritual kinship, and the fate of circumstance all conspire to bring about unpredictable lines of descent. In effect, one never knows who one’s descendants will turn out to be once one is dead, for to be dead means to give oneself up to the possibility of ancestral adoption. Could anyone have imagined, before it happened, that the elective affiliation of a poet like Baudelaire would turn Poe into the father of French modernism and give him an afterlife in France?
When people go online and express rage and sadness at the death of celebrities and the state of certain political causes, they are not merely being selfish or monstrously pious as Patrick West suggests, they are expressing their individuality by adopting certain media figures. Indeed, when people expressed sadness about the death of Christopher Hitchens, they were not simply being sentimental or even broadcasting the fact that they are the kinds of people who are upset by the death of controversial literary figures, they were expressing their deep kinship with the ideas and values associated with that figure.
In his experimental hybridisation of novel and short-story collection The Atrocity Exhibition (1969), J.G. Ballard explored the way in which the process of becoming a celebrity transforms real-life people into immaterial collections of symbols that bear very little resemblance to their physical source material. The infamous short story “Why I Want to Fuck Ronald Reagan” features elements of a bizarre scientific study in which images of Ronald Reagan are juxtaposed with images of death and machinery for the purposes of measuring sexual stimulation:
Multiple-track cine-films were constructed of ‘Reagan’ in intercourse during (a) campaign speeches, (b) rear-end auto-collisions with one- and three-year-old model changes, (c) with rear-exhaust assemblies, (d) with Vietnamese child-atrocity victims.
What inspired Ballard to write this piece was the fact that, while Reagan’s public persona was that of a charmingly old-fashioned but ultimately harmless old duffer, he was in fact the mouthpiece for some of the 20th Century’s most viciously right-wing policies, policies that made poor people poorer and nearly bankrupted the country as part of a demented game of financial chicken with the Soviet Union. The disconnection between Reagan’s public persona and the reality of his political legacy demonstrates that ‘Reagan’ is a media construct that is quite clearly distinct from Ronald Reagan the actor and politician.
When people such as Glenn Greenwald complain about people’s failure to adopt a nuanced attitude towards famous people with decidedly ambiguous legacies, they are missing the point at a very basic level. Christopher Hitchens was a jobbing writer who championed atheism at a time when it was barely tolerated in American public life. A creature of emotional immediacy rather than sustained thought, he frequently changed his political spots right up until he made the terrible error of championing a war that has long since been shown to be a colossal waste of life, resources and political capital. Christopher Hitchens’ decision to champion the War on Terror and argue for the destruction of “Islamofascism” did him no credit and yet, it in no way tarnishes the value of ‘Christopher Hitchens’ as a totemic figure of intellectual contrarianism, robust secularism and literary panache. People may have thought they were celebrating the life of Christopher Hitchens, but in truth what they were doing was adopting ‘Christopher Hitchens’ as a spiritual ancestor, as a collection of disembodied nouns that lend meaning to life and depth to personal identity.
Having taken you from the rise of civilisation to the cutting edge of human cultural flourishing (and blogging), I’d like to end this piece by returning us to the place where it all started: the long-shadowed world of early humanity.
One of the most fascinating books I have read in recent years is Nicholas Ruddick’s study of prehistoric fiction, The Fire in the Stone (2009). While much of the book is given over to quite dry analyses of the tropes that comprise prehistoric fiction and more detailed analyses of works by Jean M. Auel and Jules Verne, the book’s discussion of William Golding’s The Inheritors (1955) contains an idea that cuts to the very bone of what it means to be human at the start of the 21st Century.
The Inheritors opens on a group of primitive individuals living lives of absolute simplicity; they eat, they love, they bury their dead and all of these things are done in complete harmony with the natural world. We sympathise with these people because they seem to embody everything that humanity wishes it could be. Sadly, the lives of these primitive people are first disrupted and then brutally ended when a different group of technologically advanced people venture onto their territory. Intelligent, individualistic, cruel and brutal, these New People embody everything that humanity is because they are in fact early humans while the first group of people are revealed at the end of the book to be Neanderthals.
Golding’s depiction of early humans as a race of violent and paranoid savages obviously draws upon the same vision of human nature explored in his most famous work Lord of the Flies (1954). However, while Lord of the Flies remained quite content to show us what happens when humanity divests itself of its hypocritical veneer of civilisation, The Inheritors engages in a far more subtle and complex play of sympathies. Reading the book, we are trapped between a desire to sympathise with the Neanderthals and the tendency to recognise ourselves in the New People’s decision to attack the Neanderthals out of fear of their Otherness. On the one hand, we do not want to sympathise with the Humans because they are the ‘Bad Guys’ but, on the other hand, we cannot completely sympathise with the Neanderthals because their moral perfection is ultimately quite inhuman. To read The Inheritors is to be trapped between acknowledging what we are and grieving for that which we can never be.
As Ruddick explains, The Inheritors was written in the aftermath of the Second World War, a war that made humanity’s true nature absolutely clear. By playing around with our sympathies, Golding is suggesting that while we must acknowledge our kinship with the Nazis and the genetic legacy of the savagely intelligent New People, we can also empathise with the victims of humanity and work towards becoming that which we are not. The goal of prehistoric fiction, according to Ruddick, is to ‘Hominize’ that which is factually not human. What he means by this is that prehistoric fiction allows us to speculate about human origins in such a way as to reclaim traits possessed by people previously deemed inhuman for reasons that were entirely cultural and entirely of their time:
By telling ourselves stories about people who never existed and events that never happened we can generate empathy that is capable of embracing both sexes, all peoples, even all living things, and express via the fiction the truths of a higher order that are obscured by local or contingent circumstances.
Ruddick’s Hominization is actually a variation on the idea of ancestor adoption described by Robert Harrison. The facts of human nature are evident from our collective history of guilt and bloodshed and yet, while we can never completely escape that burden, we can lighten it by choosing to adopt non-genetic ancestors whose totemic power makes them ideal vehicles for broadcasting what we want other people to know about us.
By having us sympathise with Neanderthals, Golding was not making some point about the true genetic origins of contemporary humanity. Rather, he was saying that these Neanderthals embodied so many of the things we wished were true about ourselves that they were, in some sense, our spiritual (if not genetic) ancestors. By adopting the Neanderthals and thereby Hominizing them, Golding was saying that the values of the Neanderthals do in fact live on in those Humans who did not side with the Nazis and who do not go about murdering people.
Much like the process of Hominization, the process of ancestor adoption allows people to broadcast basic facts about themselves without having to resort to detailed lists of nouns and adjectives. When Baudelaire claimed Edgar Allan Poe as an ancestor, he was saying that there was something about Edgar Allan Poe that captured what it meant to be a French modernist writer. When people publicly proclaim their love of dead celebrities, they are doing precisely the same thing. They are saying that there is something about Christopher Hitchens or Michael Jackson or Oliver Postgate or Ray Bradbury that captures what it means to be them.
Like Vikings bearing their family legacy in their names, people broadcast facts about themselves through a process of adoption. However, unlike the Vikings, these adopted ancestors need not be the real people behind the mediatised façade. Of course Christopher Hitchens was no saint, but then neither were the Neanderthals and the actual writings of Edgar Allan Poe had very little to do with the precepts of French modernism. However, like the figure of ‘Ronald Reagan’ in Ballard’s short story, each of these figures possesses a totemic power that people can use to broadcast both their individuality and their essence as they see it at a given time.
In some cases, this process of adoption can take on a spiritual dimension as people who describe themselves as Other- or Otakukin adopt animals, fantastical creatures and fictional characters as totemic markers for their individuality. These people do not see themselves as merely being like the characters in Tolkien’s Lord of the Rings or the popular video game Final Fantasy VII; they see those fictional entities as part of their spiritual nature. Some legacies are memetic and not genetic. I would even go so far as to argue that this process of adoption lies at the very heart of Christianity as millions of people look to the historically dubious story of a man on a cross to express some fundamental fact about themselves and how they see the world. It simply does not matter that Otherkin are not actually werewolves, or that Ronald Reagan did not cause people to spontaneously ejaculate, or that Jesus did not get resurrected from the dead or that Christopher Hitchens’ views on atheism were actually quite asinine. The point is that all of these figures are symbols and symbols are how we choose to embody ourselves in social space.
Last year, I attended my father’s 65th birthday party. My parents got divorced when I was quite young and while I have a good deal of affection for my father, our relationship has never been all that close. In fact, when I attended my father’s birthday party it was the first time I had set foot in his home as an adult. As I wandered around the house, I was assailed by the trappings of a heritage which, though mine by blood, means very little to me. At the top of the stairs stood a large oil painting of my great-grandfather. The man is holding a severed fox’s head in his hands. My father surrounds himself with family history because he feels that it communicates some very basic facts about who and what he is. My father lives as a McCalmont and he looks to the history of the McCalmonts as a means of making his life meaningful; they give him a sense of place.
As I stood before the image of a man proudly holding aloft a severed head, it occurred to me that family heritage is but one of many techniques used by humans to express their individual identities. This technique is really no different to people expressing themselves through their purchasing decisions, their writing, their adoption of dead celebrities or their exaggerated outrage at the moral panic of the day. Humans are and have always been hollow and this is why they spend so much time and energy externalising what they hope is really inside. It is one thing to think of oneself as a writer, it is quite another to be acknowledged as a writer by others because that which is inside our heads is fluid, abstract and barely real whereas that which is out in the world is concrete, fixed and true.
We are fragile creatures and inauthenticity is our only true birthright. This, I suspect, has always been true, but we notice it more today because we have both the education and the spare time in which to realise it. As hollow beings in a hollow world, we have grown adept at telling stories that paper over the gaps and lend life the touch and smell of meaningfulness. The opening story of this essay, in which a fictional humanity lost its way and began to cast about for new nouns, is one such story, as it is a good deal easier to cope with the idea that we are hermeneutic orphans than it is to deal with the idea that all of our meanings are and always have been fiction. Because we are lost and have no idea of how to move forward, we find comfort in the origin myths that we create for ourselves: myths of identity, myths of belonging, myths of spiritual and psychological kinship. With every fibre of our beings, we hurl our identities into the world like messages in bottles, hopeful that someday and somehow that metaphorical bottle will reach another island of subjectivity and that the inhabitant of that island will recognise us for who we would like to be. It is only in the eyes of others that we acquire some semblance of form.
I just wanted to add that in terms of your opening on nationalism and identity, an excellent source would be Benedict Anderson’s Imagined Communities, a rather insightful and amazing work that articulates how and why a particular 20th century version of nationalism has come about.
Secondly, another book to look at would be Eric Hobsbawm’s The Invention of Tradition, which is more political than sociology but is still a fascinating and important work.
Hi Matthew :-) I’ve had both of those books on my want list for ages but have been shying away from getting them until I had cleared some of the initial ground. I expect I shall be reading both of them quite soon. Thanks for the tip though!
Excellent piece. Ties in with a lot of my thoughts on authenticity-as-false – admittedly much of which is influenced by a speech in the Warren Ellis comic book Doktor Sleepless!
Have you read the sociologist Adam Possamai on the subject of ‘hyper-real religion’? A lot of parallels there, especially in his consideration of such beliefs as Jediism & Otherkin. A good intro to his ideas in this interview:
http://www.theofantastique.com/2007/10/31/adam-possamai-jediism-matrixism-and-hyper-real-spiritualities/
Jonathan,
You covered a lot of ground here, so much ground, in fact, that I’m not sure that the beginning and the end meet in the middle. This is not necessarily a shortcoming, it just leaves a lot of room for discussion.
If the “narrative of loss” at the beginning has any truth to it, wouldn’t that mean that a sort of prehistoric authenticity is our birthright? If early humans used to make meaning by being more connected to one another (not to mention being connected with their habitats), perhaps modern humans have the potential to do the same.
Maybe Sartre was wrong about chairs. I have a fair bit in common with the old, wooden, adjective-laden chair I am sitting in right now. At the very least, the chair and I are similar enough states of matter that we do not immediately pass through one another the way billions of neutrinos are allegedly passing through us both at this very instant. The chair is made out of wood, and the wood came from a tree, and I have quite a lot in common with a tree. I have even more in common with the craftsman who constructed the chair, and with the other people who have sat in this chair. I can assume that this chair has a more-or-less detailed and particular history, just as I do. It would be shortsighted and presumptuous of me to claim that the mere word “chair” somehow encapsulates its “essence”. It might be convenient for me to think of it as nothing but a “chair” inasmuch as I want for something to sit on, but that is only a tiny fraction of the possible information contained in this real, instantiated, embodied object. So I don’t think that Sartre’s distinction holds up when considered in that light.
Now I suppose he was getting at the idea that the chair has a well-defined purpose (indeed is defined by its purpose) while the human does not. I don’t know what the chair considers its purpose to be, but I can consider it to have the purpose of being sat upon, and thereby imbue it with purpose and “essence”. Likewise another person who does not know me well might consider me, from his perspective, to be little different from an object, to be used to achieve some end. Were I employed in the lower rungs of a large and hierarchical business organization, my boss’s boss’s boss would almost be forced to think of me as such. My essence, to that executive, would be similar in nature to that of my chair. I would be encapsulated by my job title, my salary, and perhaps some rudimentary performance metrics. From this remote perspective, I would be quite purpose-full, hardly empty and hollow at all. Only from the inside looking in can I see the hollowness (if any).
This leads to the main issue, that of self-awareness and its connection to inauthenticity. I’ll grant that the chair is not aware, and is thoroughly incapable of being inauthentic. My cats are aware, but they fail the mirror test, so I presume they are not self-aware in the same way I am. They can manage only the crudest sort of inauthenticity, displaying “affection” in proportion to the length of time since their last feeding. They can put on a performance, but it’s merely one that reminds me that they are present and that we are on good terms, not one calculated to inform me that “I am this cat and not the other.”
I, on the other hand, have a relatively advanced capacity for inauthenticity. I can put on a performance calculated to persuade someone else that I am someone other than who I am. This requires a degree of self-awareness, because I have to perceive myself in order to infer how the other person will perceive me. Unlike the cats, who learned their food-requesting performance through simple operant conditioning, I have a theory of mind, and use folk-psychology to predict how others will receive my performances. I create an identity that is performative, and when I have no one else for whom to perform, I create an identity by performing for myself. One does not climb the mountain “because it’s there”, one climbs the mountain to demonstrate to oneself “what’s in here”. Research in personality psychology suggests that much of the conscious information people have about their own traits comes not from introspection, but from observing the self in action and making inferences about the actor. If we perform inauthentically, we’re deceiving ourselves, and Sartre didn’t like that at all.
I gather that this is why you say that inauthenticity is our birthright as humans. Self-awareness, one of our more distinctly human characteristics, is necessary for performative identity-creation, and the identity-creation might be necessarily inauthentic if we’re really ultimately hollow, just trying to paper over the inherent meaninglessness of our lives. It’s terribly difficult, if not impossible, to create authentic identities; we’re always trying to create identities (as our individualistic culture demands), and this leads to lots of inauthenticity.
However, you have already acknowledged that this isn’t the way it has to be. That it is this way is an artifact of our historical and cultural context. It’s not built into the fabric of the universe. Why can’t we have self-awareness without obsessing about individual identity? Why can’t we acknowledge that we are not fully distinct and discrete individuals, but deeply interdependent phenomena? Why can’t our awareness that we exist as persons be overlaid with an awareness that we are inseparable from the world? It’s not always easy (ok, it’s really hard; if I REALLY understood this, I might be a Bodhisattva) but I don’t think it’s entirely unattainable.
If the source of the inauthenticity is the striving to demarcate ourselves, couldn’t we achieve greater authenticity through greater fluidity? Why do we have to say “this is me and that is not me, and I need your acknowledgement of that fact”?
We’ve got this idea that if we don’t define ourselves, someone else will define us instead. That’s the “weakness” Sartre’s talking about. It’s a false dichotomy. Why does there have to be so much definition? Doesn’t it take strength to refuse to be defined? Wouldn’t it be more authentic to admit that we don’t know where the boundaries are and just start living?
Cat — (apologies for belated response, I thought I had actually replied) Many thanks for the link… excellent stuff. I’m currently playing around with some further refinements to my thinking on these issues and I may well draw on some of the stuff in that interview.
Geoff — I think that there’s a tension at the heart of the existential tradition and that tension is between, on the one hand, the desire to define ourselves and ‘be ourselves’ and, on the other hand, the realisation that definition is a fundamentally social act. The reason why existentialism is such a psychologically tricky creed is that the means of self-definition are very well understood and yet none of these definitions stick unless you open the door to the possibility of other people defining you in the same way as they define an object. In effect, existentialism traps you between the horror of a meaningless existence and the horror of objectification (which existentialists consider to be a form of non-being).
I think you’re correct that a willingness to accept the slipperiness of definition is a way out of this impasse. I think this is why so many people turn to escapism as a form of spiritual activity… by identifying with the characters in Final Fantasy 7 you are discovering truths about yourself and yet, because you can never be those characters, those truths are gossamer thin and easily revised. What’s more, I think that people are more postmodern about meaning than they let on… they wear different hats in different situations and assume different identities in different social contexts. People ‘are’ students, British and spiritually kin to wolves and yet all of these labels wax and wane depending upon context and mood.
Fine comments, however exhausting. Agree with your assessment of Hitchens, too; however I note that most every adjective is merely an expression of opinion, and Hitchens bored me to tears with his. Try not to make the same mistakes. Am not sure that “inauthenticity” really best describes our essential condition: we think, therefore we are, and perhaps what we think is what we are. How can one be more authentic? Facing the essential meaninglessness of life is our real challenge, as you touch here; but then again, “living well” (i.e. as comfortable as possible under the circumstances, while trying to avoid knowingly burdening others who are merely trying to do the same) seems, indeed, “the best revenge”.