Late Modernity and Teleology

Philosophy, Politics

Erentsen Erentsenov

“If we’ve learned anything from psychoanalysis, it is that we humans are very creative in sabotaging our own attempts at happiness… The worst thing is for us to get what we officially desire.”

– Slavoj Zizek, 2019

In terms of the right-left dichotomy, neoliberalism is a right-wing economic system with left-wing social tendencies. Teleology, going by the Aristotelian definition, is a purpose or end: a higher order or meaning to life. This meaning is assumed to be desirable, if not achievable, or at least worth looking for.

Some are worried about the resurgence of ‘the left’, that it may pose a threat to civilizational order, or bring chaos to our society.

I will try to counter this notion by presenting three points: humanism, capitalism, and the liberal left movement and its relationship with teleology. This rather informal essay will show that the modern liberal left is a puppet of capital – it lacks its own ideas and coherence. Moreover, this geist stems from utilitarian ethics: materialism and positivism are infantile ideas because the basis for humanism and liberalism is absent. This further shows that the real threat to “freedom” is the hegemony of capital and its tendency towards authority.

Ontological clarification and methodology of identity

With my basic understanding of Deleuze and Hegel I can conclude that modern-day capitalism has resulted in de-territorialisation and the creation of new meanings. According to Deleuze, the definition of a category is influenced not only by its static meaning, but by movement in and of itself. For example, the role of the church, its tradition, and its symbolism have changed throughout the course of the 20th century, partially due to the movements around said institution: thus, the very definition of the category (in this case, the church) is dependent on its motion – basically, “becoming” is an essential part of “being.”

A counter to Deleuze’s de-territorialisation is the idea that identity is a stable concept that precedes the existence of phenomena. Of course, his own counter – that identity is originally created through the difference between objects (this difference is the identity) – is much cleverer than that. However, identities change, move and mould themselves depending on geist, historical context, current social conditions, and so on.

Moreover, as a condition or geist changes, the definition of the objects changes as well. In the famous Dewey v Lippmann debate, Lippmann stated that the founding fathers assumed the role and expertise of their public in creating an active liberal democracy. However, the expertise of the common man was assumed by the standards of their local community. If they had a competent understanding of their local area, they were considered typical of the common man. Currently, the modern ‘common man’ is required to have a much wider understanding of affairs that extend well beyond the confines of his own nation. The understanding of the ‘common man’ has changed, and therefore, the label of “liberal democracy” has changed as well, even if its fundamental characteristics remain. More importantly, Dewey held that a liberal democracy’s reaction to world affairs should be an emotional one, which was the original development of the liberal democratic character. Therefore, a single identity exists in motion; it is inseparable from its motion and history, and more importantly, the condition of the object in context influences its identity.

This classification of identity is important as it reflects on the current nature of global capitalism and its identity and effect on the global cultural sphere. This was how Deleuze originally argued that capitalism took over from the preceding pre-modern western condition, and it de-territorialised meanings of masculinity, power, state, Christianity and many other social systems and customary attitudes in the western world. However, as capitalism became a more global phenomenon, it began re-territorialising these concepts: witness the growing authoritarianism of private companies and the imposition of capitalism on culture as merely two of many features of this growing cultural hegemon. Consider also the imposition of the idea of productivity as an end goal, and the widespread conception that normal human behaviour is behaviour that makes one a suitable employee – i.e. behaviours and attitudes that restrict freedom in favour of capital. This loss of freedom is natural as global capitalism matures and assumes cultural hegemony.

What kind of meaning could neoliberal capitalism create? In terms of teleology of the human life, there is nothing. Capitalism is in itself a materialist category (much the same as Marxism or any other materialist ideology). It rejects things outside of the empirical, the physical. Therefore, the meaning of this materialist category cannot in itself be something “higher than human.” The vain accumulation of resources or increasing material life conditions may suffice to an extent, but it is difficult to argue that this is the ‘meaning of life’, and more importantly, it entails lazy thinking.

To counter my assertion, one might claim that capitalism is only an economic tool, a system of resource allocation. However, the dichotomy of material vs ideal creates and imposes its reality on the mind. If capitalist materialism functions as a Nietzschean ‘master morality’, people lack the agency to withstand and overcome its subjective impositions on culture.

Moreover, the current zeitgeist is purely modern. The most pervasive and apparent ideology of the modern day is capitalism – specifically, neoliberal capitalism. The idea of a good human matches the idea of a good worker.

Furthermore, the ideas of equality and inclusivity are capitalist as well, and neoliberalism is obviously the product of free-trade-capitalist-globalists.

Humanism and Teleology

Taking the works of postmodernists such as Zizek and Dugin into consideration, we can assume that idealism is essential to human existence. One of its manifestations exists through storytelling – framing events for oneself and others, thus creating subjective reality. Subjective reality is much more reliable than any supposedly objective one (an idea stemming from Nietzsche). Therefore, one cannot discard idealism and the power of narrative from the human experience.

Clearly, neoliberalism as a whole has not even made an attempt to create any kind of meaning. By its own definition it can’t – it lacks any notion of idealism, of the notion that telling stories has any benefit.

Some thinkers say that modernity is an ateleological condition, where people’s only purpose is to destroy any higher purpose. But this notion of ateleology needs further examination. Modernism’s humanist ideas and reaches (literature, scientific advancement) are examples of the human-oriented end-goal of understanding being and stoking individualism.

One may counter that secular democracy in itself creates meaning in achieving a state of perfect secular democracy. One can also propose that meaning and teleology for humans is synonymous with progress. However, a real progressive idea only exists when coherently explained. The most coherent and furthest-reaching ideas of progress were conceptualised in the modernist era which tackled teleology (and materialism as its compass) with cold-blooded rationality, rejecting it and embracing the excellence of humanity instead (classical liberalism of Locke and Marxism).

The whole idea of a secular democracy is that people have the capacity to develop their own values, and they are encouraged to search for personal meaning.

However, techno-capital is infringing on this notion by creating and encouraging authoritarian practices and by assuming hegemony over social discourse. Techno-capital is a force that stops de-territorialising identities and starts to create its own, and the resulting authority and hegemony are part of that process (from schizophrenia back to authority). Moreover, Marcuse has shown that modernity always possessed authoritarian characteristics: not only was the population during the height of the movement unable to create its own personal values, but modernity itself summarily asserted dominance and propagated its values in people’s minds. Late modernity is eating its own tail, and it is openly aggressive towards the perceived freedoms of the west.

I find it very easy to disprove the humanist’s idea of a human being. At modernism’s birth, it retained the idea of the “human” from the pre-existing humanist tradition: the human as the height of creation – an ever-developing, rational being, constantly striving to find the truth.

In a period where psychoanalysis and psychology were undeveloped, this conception of humanity may have sounded reasonable. But even then, this assertion was a positivist one. The underlying assumption was that this is what a ‘human’ should and can be, if possible. And this ideal ‘human’ should be liberated from the shackles of material hardship.

However, psychologists then discovered the myriad ways we lack self-control, and Zizek and Deleuze found that humans are complicated and intricate machines of desire, and that we require mythology and storytelling to function. Levi-Strauss believed that we could only assess things through binary oppositions, and Roland Barthes posited that mythology in media and everyday life was more prevalent in the 20th century than it had ever been before. It is also important to remember Kierkegaard’s assessment of media as a new church – occupying the same mythological spot in a person’s mind. To summarise, postmodern assessments of human nature under our current circumstances concluded largely that we are still the same mystical, tribal, mythically-inspired, irrational, imperfect, and very interesting species. Brushing all this aside to label humans as generally rational, even if we are rid of material hardship, is an incorrect assessment of human nature.

Therefore, the idea of the human posited by humanism and liberalism does not really exist. This breaks the fundaments of humanism and liberalism. More importantly, it leaves liberalism’s ideas as forever positivist, forever aspirational. This is not a terrible thing – most idealist concepts are unreachable or at least not fully materialistic. Active faith is involved in fleshing out concepts such as ‘being’, love, freedom, etc. However, the fundament on which these concepts are built should be coherent and correct, and that is where modernism lost its ideological basis.

There are many ways in which this fundamental ideological core affects our cultural life. For example, the idea of equality in the modernist era also assumed that discourse between two rational people would be able to arrive at the truth. However, under the postmodern condition, one has to spend a lot more time studying and engaging with an object to achieve expertise in any given field. This shows that the most effective empirical or ‘better-assessed’ truth is only accessible to a qualified subject. There is no longer a level field that could unify us, no basis of equal human universality looking for a rational truth through discourse. This is a clear example of how a fundamental misunderstanding of the human condition is affecting the cultural field: miscommunication is a fundamental feature of human experience.

To summarize: the humanist and liberal idea of the human does not exist, and liberal democracies of the present day lack an overriding coherent meaning, despite the fact that people function through stories, myth, and an explanation of the world on a subjective level. On the objective level the narrative fails because the ideal human of the humanist tradition doesn’t exist. More importantly, this absence of higher meaning (or teleology) is most prominent in late-stage capitalism.

Capitalism and its lack of teleology

The destruction of old hierarchical structures in cultural life created a window of freedom in western civilization. However, as capitalism gained more power in the world, it created its own hierarchical structures and began influencing cultural life. I will take the notion that global capitalism became stronger throughout the 20th and 21st centuries as a given: it has won.

Moreover, global capital’s hegemony squeezes and assumes authority over the spheres of cultural life in which it wants to assert itself. The most obvious examples show the ease with which traditionalism is sidelined by capitalism: Gillette ads, Starbucks’ support for LGBTQI causes, and a number of other global companies’ incorporation of socially leftist movements. Workers’ and consumers’ lives and their social opinions are of little importance to these companies – as long as they consume the product and work hard, people’s personal lives have no influence on the company. ‘If there are more consumers and workers to participate in capital accumulation, then why should society be organised along traditional family lines (father works, mother homesteads, children are educated), when one could have two members of the home actively participating in the economy?’

Consider also fourth wave feminism, with its ideas of responsibility and independence, which greatly resemble the positive attributes of masculinity. These ideas perfectly fit the idea of a good employee and are widely accepted and promoted by private institutions.

Another hot topic of the day is free speech – private institutions are asserting speech restrictions on their employees. It is ironic that a value brought about through modernity is now being shut down by another modernist creation. The existence of these contradictions is not terrible in itself, but when human nature is taken into account, these contradictions become inevitable.

These examples are not new – they have existed since Bernays’ time, as shown in The Century of the Self by Adam Curtis.

Capitalism is becoming naturally and visibly more authoritarian, to the point where it now restricts freedom of expression. The workplace is an authoritarian environment where you are told what to do, what to wear, how to act to be successful, how to engage with your emotions, how to present yourself, etc.

The natural extension of capital’s control to everyday life will result in further losses of freedom of expression. According to Deleuze, capitalism has created the window of schizophrenia by dismantling old structures. But it will also create its own new hierarchical structures.

“The Liberal Left”

The postmodern condition births hedonist drones because we are supposed to find a fulfilling life and create meaning ourselves, and this is a difficult thing to do. Modernism, and its child capitalism, lack inherent meaning, leaving us to fill in the gap ourselves. Rather, the meaning that was created by modernist societies is one that we are unhappy with because it is unfit for our condition.

The overriding ethical system of the modern day is utilitarianism. Utilitarianism and materialism in the postmodern condition have created a ‘left’ that seeks material fulfillment, and utilitarianism is best exercised under capitalism, the system best at creating material goods. We should examine the ‘left’ as a generalised populace in the 21st century: it has definitely been more socially accepting than most movements before its time, but that is because of its historical condition, in which ethical values such as pleasure and happiness are embraced.

Combine vulgar utilitarianism with the left’s infantilism (its natural striving for change) and with positivism, and the result is the average modern leftist movement. These people are trying to defend an idea of humanity that does not exist while simultaneously exercising utilitarian ethics stemming from capitalism. The left continues to push for material improvement so that its idea of a human – an idea which does not exist – can flourish.

The main thrust of the left is the spread of social liberalism, headed by somewhat vague ideas about universal equality and diversity. It asks for representation through quotas, both among capital owners and in the workplace itself. Therefore, it is comfortable with capitalism.

The modern-day left movement is the last gasp of a dying animal – a leashed dog, all bark and no bite. Its moralising is hollow.

This condition is widely recognized by most of their intellectuals: they lack a coherent left-wing theory describing and providing a framework for the modern condition and lack a prescriptive methodology, which is especially apparent when they are compared to their 19th century counterparts.

Their hollow reaction to the modern condition, and their subsequent moralising, is made manifest through hysteria: that is why neoliberal leftist supporters are so often incoherent. Take as the Ur-example the political engagements of students on American college campuses, and as a further example the reaction of Hollywood stars to global affairs.


Late modernity has assumed an unprecedented level of control over social life through re-territorialisation, which has manifested itself through capitalism, the most prominent social hegemon. Private institutions, under the current system, will continue asserting inclusivity, equality and diversity in our cultural sphere.

But the nature of this leftist movement is false and incoherent, based on disproven ideas of what it means to be a human being. And the modern leftist movement lacks substance and is subverted by private institutions.

Which comes first, the chicken or the egg? Is there a genuine desire for change in the left wing that has been exploited by the authoritarian practices of capitalism, or is the current state of affairs the result of the left’s impressive control over capitalism?

People worried about the left bringing ‘chaos’ and a threat to civilisation ought to examine the ideas of the left a little more closely. I hope I have made it clear that the left lacks the coherency to drive large-scale movements, and is currently just subject to the power of capital.


The methodology and causation in this essay imply a movement: the dismantling of pre-modern categories by modernism. But as late modernity creeps in, it establishes its own authority and hierarchy.

This is also a descriptive rather than a normative essay; there are no reaches or value judgements: I am not saying what is right, I am simply describing the changes in western culture. Moreover, I have used a very assertive tone, as this is a very raw representation of my current thoughts.

This analysis is based on half-baked ontology, and there is no examination of aesthetics and their changes. Moreover, most of this is just my limited interpretation of Zizek, Dugin, Deleuze, Nietzsche and Hegel. There are many assumptions, such as the hegemony of capitalism and liberalism’s Kantian ethics losing to capitalist utilitarianism, and there is no examination of the personnel in control of capital. Moreover, this work assumed that capital and government are much the same. My main assumption is that teleology is real and important, and that idealism suffices in targeting it.

Moreover, there are few examples – both because this is an expression of subjective truth, and more obviously because this point has been made throughout postmodern philosophy before.

Making Peace on Australian Campuses

Australia, Brisbane, Politics

Drew Pavlou

In recent weeks, menacing Chinese ultra-nationalist rallies have convulsed Australian university campuses and city streets. Peaceful pro-democracy Hong Kong demonstrators have been bullied, assaulted, abused, and subjected to doxing and death threats online. Against this backdrop of tumult and unrest, in part encouraged and spurred on by Chinese diplomatic officials in Australia, it is easy to lose faith in the ability of peaceful debate and discussion, those hallmarks of liberal democracy, to help us find a way out of this mess. My personal experience as an Australian student involved in the pro-Hong Kong democracy movement at UQ tells me that we must not be so quick to despair.

The other night, an improbable meeting of minds occurred beneath our university’s Lennon Wall that reinforced to me the enduring power of free deliberation and dialogue. Hubert introduced himself to me as a twenty-year-old International Relations student from Southern China. He already knew who I was, having seen the posts circulating about me on Chinese social media. He was taking a courageous risk in reaching out across the divide. We soon got to chatting, speaking for hours as the brilliant purple sky above Saint Lucia burnt to night.

Over the course of our friendly exchange, we discussed a wide range of topics frankly and openly. Hubert described his family’s experience of the horrors of the Cultural Revolution while speaking with great pride of China’s stunning economic miracle. I shared his sense of awe at China having lifted some eight hundred million people out of poverty in a single generation, an achievement of world-historic magnitude. We bonded over a shared respect for Chinese literature, culture and civilisation. Throughout our conversation, I was struck by both his erudition and his kindly and thoughtful demeanour.

We also talked about the pro-democracy rally at UQ I helped organise on July 24th. During that rally, I was assaulted by a co-ordinated group of masked pro-CCP heavies. If I am honest, I was surprised he was present among those Chinese nationalist demonstrators that sought to disrupt Hong Kong students peacefully expressing their concerns on campus. Hubert did not participate in the violence and I know he did not condone it. Still, what led him to support the nationalist rally?

He explained his concerns and I tried to understand and respond to them. As the discussion broadened, he helped me see that Chinese students share the same anxieties and fears as Australian and Hong Kong students. Where we feared persecution for our political beliefs and views, they did too. Alone in a foreign country, Chinese students could rationally fear Australian protests critical of the CCP would contribute to the creation of a McCarthy-like atmosphere of paranoia and mutual distrust. Who could blame them, given Australia’s long history of anti-Chinese racism?

Profound contrasts in the political culture of our nations served to stoke misunderstandings that inflamed passions on both sides. Hubert explained to me how China’s nationalist education system encouraged citizens to conflate the CCP-led state with the very nation itself, so that Chinese students at UQ would interpret criticism of the state as criticism of the Chinese people. This definition of nationhood is obviously radically different to how we conceive of the relationship between a state and its people in the West. Where I intuitively draw a distinction between criticism of Prime Minister Scott Morrison and criticism of my identity as an Australian, Chinese students at UQ interpreted our opposition to the policies of the Chinese state in Xinjiang, Tibet and Hong Kong as opposition to the Chinese nation and people.

I fundamentally disagree with this vision of political life as it seems to underpin a blood-and-soil authoritarianism that brooks no criticism of human rights abuses. I think this is an idea we must rationally challenge and break down. Were it not for my productive discussion with Hubert, I would not have understood the need to reformulate future protest messaging to clearly respond to this distinction. He helped me see how vital it is that we show Chinese students that our opposition to the current policies of the Chinese government does not entail an objection to them or their presence in this country.

Ultimately, Hubert and I still came away from our dialogue with fundamental disagreements. But leaving aside those differences, it was a productive, educational experience. And that is the power of free debate and discussion as hallmarks of effective liberal democracy. Through peaceful dialogue, we overcame differences, clarified misunderstandings, and tried to bridge the divide between ourselves. That night at UQ, two twenty-year-old kids from vastly different worlds and cultures got together to try to understand each other a little better, and I think they came away slightly better people. That is the beauty of discussion and peaceful attempts at mutual understanding, and it can underpin a new peace on Australian campuses and city streets.

On Anti-Intellectualism And The Cult Of Self-Proclaimed Millionaire Jack Bloomfield


Ben Wilson

Jack Bloomfield is seventeen years old and a self-reported millionaire. If we trust the account delivered by himself and his parents in a television interview, he has earned all of the money himself and has had no assistance from his parents. In an opinion piece published by Rupert Murdoch’s NewsCorp, he wrote:

‘Is there a profession called “arts” outside of being an artist? Not that I’ve ever heard of.’

His derision of the liberal arts is cliché and uninteresting in and of itself. Equally uninteresting to me are his businesses and the existence of his e-Commerce courses, for which he charges thousands of dollars to be instructed by him through a series of online modules. What is more interesting to me is that this wealthy seventeen-year-old has been the subject of numerous articles as well as interviews and photo ops with politicians and public figures including Prime Minister Malcolm Turnbull and LNP Leader Deb Frecklington. If the comments sections discussing and reporting on Mr Bloomfield are any indication, reaction to his success consists of people’s admiration, dismissal, derision or a combination of the three. What I find interesting is the question of why Jack Bloomfield is newsworthy in the first place.

A scan of headlines suggests one answer immediately – his age. Most people aspire to be wealthy and here is someone who has apparently achieved that goal while still in high school. But I think our society’s obsession with someone like Mr Bloomfield is tied more to how he has made his money, and that he has made money at all, than anything else. His dismissal of tertiary education that is not obviously tied to a profession, and some people’s admiration for that dismissal, is similarly telling.

College With A Capital C

Mr Bloomfield made me think of the film It’s A Wonderful Life, made in 1946. For those of you unfamiliar with it, it tells the story of a man named George Bailey who, while trying to kill himself on Christmas Eve in the face of personal hardships and a crisis of self-worth, is saved by an angel who shows him his value by showing how much worse the world would be had he never been born. The film is saccharine sweet and utterly lovely. But the specific scene I thought of was when George gives up going to college to help the family business, using his college tuition on his younger brother instead. We are never told in the film exactly what George intended to study. We are told only that George was going to go to college and that by giving up that opportunity he was making a great personal sacrifice of his potential future prosperity. This is in spite of the fact that, notwithstanding his lack of tertiary education, he is managing a building and loan bank which appears to do a decent trade supported by the goodwill of the town. The bank is so well-loved and relatively prosperous that we are told it survives the Great Depression in a time when most small-town banks folded. The movie does not explain how college would have made George Bailey wealthier. It is knowledge assumed of the audience. The word college is spoken with a capital C – a magical word implying social mobility and return on investment.

The Death Of Old Narratives

I believe firmly in the value of tertiary education, perhaps especially the liberal arts, for their own sake. People much smarter than I have pointed to a growing disdain for parts of history and civics as being a possible reason for the declining support of democratic institutions and liberal ideals (whatever you think those things mean). It is difficult to see the benefit of such systems without a context for why they exist and the reasons for their forms. Where I think I differ from some who decry the corporatisation of our universities is in the idea that there has been a period at all in the past century where the majority of people did not attend university primarily for the sake of personal prosperity. Even if historically universities and the academics who resided within viewed their role primarily as the advancing of intellectual inquiry, I think that the majority of their students viewed the expansion of their minds as an ancillary benefit. People attended university when they could because they were told that a degree was how a person becomes better off. And to be fair, on average that is still true. People with university educations tend to make more money over their lifetimes than those who do not. But the degree is no longer a guarantee of future employment and students find themselves seeking endless internships and work experience opportunities to flesh out their resumes for the job market. There are people leaving universities with tens of thousands of dollars of debt who earnestly believed that their degree alone would guarantee them future prosperity. Some of these graduates are finding that is not the case. The narrative that existed of university as a guaranteed pathway to future comfort is dying.

And it is not only university graduates. Young people are being told they may be the first generation to be worse off than the generations which preceded them. The welfare state is struggling to adequately support the elderly whose life savings are proving insufficient to support them in their ever-lengthening age and who are considered less desirable in a modern workforce. Those in the middle are being squeezed to pay the costs of the latter while facing the justified outrage of the former. Narratives which I think we were all raised to believe were true of our society (in Australia and western-liberal democracies generally) are beginning to fail.

The New Narrative Of Entrepreneurship

My theory is that people want an understanding of how they can succeed. That is why people pay money to self-styled entrepreneurs who not only provide an answer but offer themselves as a living, breathing example of that answer’s truth. That is why people pay hundreds of dollars for a man named Gary to tell them that they need to ‘eat shit’ and suffer the hard times if they eventually want to make it. That is why people will pay thousands of dollars for an e-Commerce course taught by a seventeen-year-old. The old truths that a university degree, or a trade, will let you buy a home and support a family seem to the eyes of many to have expired. And these entrepreneurs understand their market. They have identified the backlash against so-called intelligentsia and elites. The liberal arts are the subject of particular ire because there is truth to the assertion that for many graduates it can be difficult to find well-paying work in the field they studied. People want a path to prosperity. Whether or not the path offered is easy is less relevant than that it seems real. Our society is increasingly unequal and even our culture’s idea of what it means to be prosperous has shifted for many from a home and a family to the lifestyle of the wealthiest people we read about and see online.

What I think is terrifying to some people when confronted with a lack of clear paths to wealth and social mobility is that there are only two possible answers for that lack. Either they have not discovered the path yet, or the path does not exist. Humanity is always going to want to be successful but we also crave the familiar. Forging a new system, a new path, is intimidating. So instead we seek out those who seem to have found the path. That is why I think people are as interested in Mr Bloomfield as they are. Not merely because of his age, but because he gives the impression of having found a path to wealth that, while perhaps being difficult, is nevertheless achievable. Increasingly, that does not seem to be true for some university graduates. That’s why, consciously or not, he writes a line deriding the liberal arts in our universities. He’s not trying to speak to the people who went to university and succeeded. He’s trying to rally and profit from those who did not.

Nicholas Comino

I would like to begin this piece by stating unequivocally that I am a capitalist who supports free markets, trade and enterprise. But never before have I been more struck with despair than when reading seventeen-year-old entrepreneur Jack Bloomfield’s opinion piece. While it seems that Mr Bloomfield is successful in business, many of his statements reflect a trend in social opinion that I believe is becoming more prevalent than ever: the view that higher learning and tertiary education are outdated, a waste of time and purely about the end result of getting a job.

‘Is there a profession called “arts” outside of being an artist? Not that I’ve ever heard of.’

It’s a shocking line of thought, one that is gaining ever more purchase. But with the current state of our universities, who could blame people for feeling this way? These days, universities are spending more money than ever on external advertising and administration. At the cost of genuine academic inquiry and scholarship, armies of administrative staff have been deployed to our universities to focus on their “student experience” ratings. Thanks to the uncapping of places by the Rudd/Gillard/Rudd Governments, universities have become corporatised machines, determined to get butts into seats without any actual care for the creation of academic community or the production of anything of social value.

The public does not care for our institutions. Bloomfield summarises their attitude perfectly, however coarsely. He notes that contemporary Australian university educations produce: “A mountain of debt and a piece of paper that carries absolutely no weight in the working world”. This is the perspective that those who run university administrations across the country want us to hold.

Why fund the humanities, social sciences, and the language departments of the world when we can pour endless money into marketing our business schools to overseas fee-paying students? This is the remorseless, cold logic of those who run Australian academia. Scholarship isn’t the only field being gutted. All the things that make our universities matter, student activism and social engagement among them, are increasingly hollowed out in order to transform students from active learners into consumers. Across Australia, we see university Student Experience departments trying to crush student unions, student-run clubs and student-run societies in order to replace them with a sanitised, administration-approved PR mono-culture that costs obscene amounts of money to maintain while engaging precisely no one.

Take the University of Queensland, for example, where the failed “UQ Mates” initiative promised a social media platform that would advertise university-sanctioned and approved events. While seemingly innocent in nature, the initiative was seen by students actively engaged in the social life of their university as a malign attempt to control all aspects of student life and kill whatever student culture, communities or activities were deemed hostile to the corporatised administration’s interests. Against this alienating backdrop, students suffer an epidemic of loneliness and isolation that threatens the mental health of young people across the country.

How do our university administrations respond to this criticism? We hardly hear a peep from anyone outside our student unions and student-run publications. You would think the self-proclaimed freedom fighters over at the Institute of Public Affairs would be up in arms about such a monstrous move to bureaucracy and the strangling of academic and student freedom. But no, instead we are subjected to the absurd beating of the culture war drums as they desperately seek to import American political controversies to Australia. Think the whole “Campus free speech is under attack! Conservative students are being overrun by social justice warriors!” line of attack that should have been left back in 2015 with the irrelevant Milo Yiannopouloses of the world.

If you are like me, upset by the comments made by public figures like Mr Bloomfield, don’t direct your anger at him. Direct it at the university administrators and lawmakers who have created the world that allowed him to take this perspective. If we are to make universities socially relevant and seen as places worth attending, we must dismantle greedy administrations and start holding them to account.

Maddy Taylor

‘Is there a profession called “arts” outside of being an artist? Not that I’ve ever heard of.’

Having opened five eCommerce businesses by the tender age of seventeen, Jack Bloomfield offers this as his expert commentary on the state of tertiary education in this country.

First, let me explain the concept of drop shipping, the model through which Jack supposedly made his millions.

Drop shipping, in essence, consists of buying products from one website and selling them on your own website at a marked-up price. Sounds pretty simple, right?

Yes, it largely is.

This model, as Jack has proven, can be picked up with a bit of time and dabbling and a few YouTube tutorials. Or, if you’ve got an extra $3,500 to spare for a one-hour phone call with Bloomfield, you can pick his brains as an expert entrepreneur, joining the hive-mind of e-businessmen selling the millionaire dream. According to this cult, those not motivated by money are losers and failures, part of the “mediocre” hogwash that comprises the slave-like masses.

I don’t cast doubt on the work Jack has put into developing his image. Credit where it is due, he has mastered Brand Management 101 by deleting comments criticising him on his Instagram page and blocking critics from commenting on his Facebook page.

All in all, however, it is disingenuous and pig-headed for him to make inflammatory statements about the worth, or lack thereof, he perceives in higher education. It is also disingenuous of Bloomfield to claim he was not in a privileged position to begin with; Bloomfield attends an extremely expensive elite Brisbane private school and his father is a CEO. Due to his position at birth, Jack enjoys tremendous social and economic capital on which to draw. The vast majority of teenagers don’t possess such advantages by virtue of family connections, and thus cannot become “self-made” millionaires. Not everyone can afford the minimum $10,000 investment required to take part in Bloomfield’s 12-month mentoring program, where he deigns to impart his divine knowledge to us mere mortals.

It is sadly ironic that a year of Bloomfield’s mentoring program costs thousands of dollars more than a year of university education at a high-ranking Australian tertiary institution. In light of his claims that Arts degrees are worth nothing more than the pieces of paper they are printed on, I would like to see the results Bloomfield’s vapid mentoring program has produced. I suspect he could not demonstrate said results, for his program is fundamentally a meaningless exercise in rent-seeking, exploiting the ignorant and gullible alike.

We must start to think critically as a society about ‘self-made millionaires’ and other ‘entrepreneurs’ giving motivational seminars, selling online courses for thousands of dollars and telling us there is no value in the pursuit of intellect and knowledge. How did a seventeen-year-old who has not yet completed high school develop a full online university course in eCommerce all by himself? (Hint: he probably didn’t). What qualifies him to deliver an absurdly expensive mentoring program as some kind of Tony Robbins-lite business guru? (Hint: he probably isn’t qualified). Why is a one-hour phone call with Bloomfield apparently worth $3,500? (Hint: it isn’t worth anything).

Where did we go so wrong as a society? Drop shipping is essentially a parasitic economic activity, for it produces nothing of productive economic value. It is, for all intents and purposes, simply an exercise in rent-seeking. Why then is Bloomfield being splashed across the Murdoch papers and celebrated on Australian network television as some kind of latter-day Adam Smith, champion of capitalism and free enterprise? What exactly is respectable or admirable about reselling the cheap products of sweatshop labour online?

Hint again: there’s nothing worthy of respect here. Bloomfield and the hive-mind of influencers, entrepreneurs and snake-oil salesmen want you to believe their illusion so as to keep selling you their online education courses and ‘motivational’ seminars, where people like him and Gary Vaynerchuk will tell you you’re failing if you haven’t broken out of your comfortable 9-5 to embrace your entrepreneurial spirit and resell products manufactured through forced labour, like Jack.

This is a dangerous narrative that we must begin to resist, both individually and collectively. It is unfortunately easy to be sucked in by the charisma of huckster businessmen giving their well-scripted sales pitches about the worthlessness of life outside their harebrained programs. It will be tempting for impressionable teenagers and high school graduates, unsure of their futures, to buy into this anti-intellectual narrative and abandon the pursuit of tertiary education altogether.

This is not to say that there are no alternative avenues to a university degree. I am a fierce advocate for vocational training and trade apprenticeships. But following in the footsteps of a wealthy, privileged 17-year-old with the advantage of existing business connections through his affluent parents is sorely misguided and a dangerous path to embark upon.

I admire the entrepreneurial spirit of those who genuinely build themselves from nothing by spotting gaps in the market and innovating. Drop shipping is entirely alien to authentic entrepreneurial activity. It simply exploits cheap overseas labour to peddle more cheap, worthless, low-quality products to consumers as the world burns and stares down ecological catastrophe. Bloomfield, despite his insistence on his status as an expert, would not know entrepreneurial spirit if it hit him in the face. He prices his advice at around seven times the hourly rate of your average Queen’s Counsel barrister and around thirty-five times the average professional hourly consultancy fee. What a joke.

Bloomfield has seemingly constructed a magnificent artifice of lies and deceit. While breakfast talk show hosts fawn over him, it is worth noting that Jack’s ABN was only registered in December last year, despite his claim to have made millions in business ventures stretching back five years. Open-source domain information also shows his domain is registered under his father’s name, with his parents’ tennis equipment business listed as the organisation. It is curious that none of his feted apps or businesses, including Next Gift, Best Bargain Club and Blue Health, are available on the App Store or searchable on Google.

It seems as though Jack Bloomfield is a con-artist and a snake oil salesman, and the apparent widespread support for him and his work is a sad indictment of our inability to look critically at self-made millionaires and entrepreneurs. I deplore Jack’s message about university education, and encourage all who seek to improve themselves through the pursuit of knowledge to do so. After all, gaining a deeper insight into our world and what makes it turn might allow us to see through the lies of those making money off forced labour and ripping off aspiring businesspeople for overpriced ‘consultancy.’

Do not buy into the influencer narrative. Do not buy into Jack Bloomfield.

Against Hollywood Cinema: An Anti-Capitalist Rebuttal of Riordan’s ‘In Defence of Hollywood Cinema’

Culture, Film, Society

Tom Harrison

In this article I do not aim to offend Liam Riordan, nor any other person. Rather, I write this piece to express my feelings towards the debate around the ‘quality of Hollywood cinema’, as I feel many have been bogged down in the question ‘is Hollywood cinema good?’ rather than the far more important question ‘why does Hollywood cinema exist, and what does it do?’.

Liam Riordan’s article In Defence of Hollywood Cinema is not ill-conceived, as I too feel shame and despair when confronted by the plethora of cultural critics who construct monoliths, such as ‘the category of Hollywood cinema’, merely to tear them down with rhetoric. Such ‘intellectuals’, often disciples of Jordan Peterson or of a thoughtless ‘popular’ feminism, invariably decry ‘mainstream media’ simply as bad. On this point I agree with Riordan: these uncomplicated critics have nothing to offer in terms of nuanced critique.

But such arguments, loosely described within Riordan’s article, must not be engaged with uncritically. At the risk of betraying myself as a reader of Foucault – a charge I may indeed be guilty of – the very notion of a Hollywood cannot remain unchallenged if it is to be honestly and critically discussed. Rather unfortunately, Riordan met his opponent on their own terms: he defended ‘Hollywood’. He defended, as I shall argue, the indefensible. He defended a system of production that is designed to subjugate the worker and prolong work itself, all in order to expand and enforce capitalism.

The ‘Hollywood’ that Riordan engages with does not exist. There is no unified force, no table of executives, no board of directors that creates films. But to say there is no ‘Hollywood’ is not to say there is no ‘culture industry’. Rather, the term Hollywood does not adequately or accurately describe the late capitalist emergence of manufactured culture; manufactured by and for those under capitalism in order to remedy the cultural chaos caused by, among other factors, the death of God, the dissolution of any precapitalistic restraints, and social and technological differentiation and specialisation. Such a system, as described by Theodor Adorno and Max Horkheimer, differs from the liberal notion of ‘Hollywood’: the ‘culture industry’ is not a collection of production studios but the production of culture itself. It is not conscious, nor reducible to the individual. ‘Hollywood’ is controlled by men; the ‘culture industry’ is controlled by the leviathan of late capitalism, reducing all in its attempt to expand production.

This system produces films for the purpose of ‘mass deception’ and uniform indoctrination, according to Adorno. Riordan describes a similar system himself:

“Hollywood is an industry, just like any other. It works on supply and demand: the smaller “sub-­studios” like Focus Features cater to an audience that wants smaller, more emotional, perhaps more specific experiences, where the bigger studios cater to a wider audience, as well as to those who want to see something that necessitates a huge budget.”

All tastes are catered for within the ‘culture industry’, but not because of supply and demand. To quote Adorno, “Something is provided for all so none may escape” (Adorno and Horkheimer, 2016, p.123). Riordan is correct in claiming that the totality of Hollywood is due to its capitalist structure, but wrong in assuming there is ‘demand’. The ‘culture industry’ is not driven by demand: there is no demand for the specific entertainment it offers. No worker needs Toy Story, John Wick, or Love Actually like they need food, medicine, shelter, etc. The worker seeks amusement, a distraction from the hell of late capitalism, and the culture industry provides it. “Amusement under late capitalism is the prolongation of work” (Adorno and Horkheimer, 2016, p.137). The worker seeks distraction from his existence and turns to amusement rather than confronting his radical freedom and thus his ability to change. He seeks a tranquiliser and the culture industry provides.

In doing so, the ‘culture industry’ no longer pretends to make art: “the people at the top are no longer concerned with concealing their monopoly … They call themselves industries; and when their directors’ incomes are published, any doubt of the social utility of their product is removed” (Adorno and Horkheimer, 2016, p.121). The ‘culture industry’ is merely an extension of the mechanics of late capitalism, which perpetuates its workforce and sustains it, as painkillers sustain a crippled man.

This claim may appear to be a wild Marxist conspiracy, written by a wild Marxist. Such an accusation is deeply offensive to me, as I am not a wretched Marxist but an anti-capitalist.

I couldn’t possibly hope to achieve a total description and deconstruction of the ‘culture industry’ within this piece, so instead I shall turn the reader’s attention to Chapter 5 of Dialectic of Enlightenment, The Culture Industry: Enlightenment as Mass Deception.

Riordan’s piece came to the wrong conclusions, due to its being a response to the most pathetic and thoughtless cultural criticism imaginable. To say ‘the culture industry is incapable of producing art due to the profit motive eclipsing all intentions of meaningful expression and artistic creation’ is a fairly compelling argument (expanded on by Adorno) but to reduce this to the level of ‘Hollywood = Bad!’, well, on that I do sympathise with Riordan. It is a distraction from real analysis and inquiry.

This piece did not fully address whether Hollywood can produce good films or ‘art’, simply because such nebulous questions are almost impossible to answer and doing so would be tedious. Any definition of art is likely inadequate, as is any category of ‘good’. Rather, I expressed my feelings towards the ‘culture industry’ as an answer to the unspoken question, “what does ‘Hollywood’ do?”, hopefully correcting the otherwise pointless course of the dialogues surrounding ‘art and Hollywood’. The quality of the cinema the ‘culture industry’ produces should be as irrelevant to the consumer as it is to the producer, for ‘quality’ is merely a method of differentiation within a totalising system, which attempts to momentarily unify the schizophrenia of late capitalist signs within a product.

Scott Morrison Hopes The Crocodile Will Eat Him (And Us) Last


Drew Pavlou

In the lead-up to the Second World War, the elite British foreign policy establishment hoped to contain the rise of fascism with a policy of appeasement. History records how that worked out for us all. Nazi Germany’s furious assault on humanity and civilisation tore Europe’s moral universe apart, leaving tens of millions dead. It was a disaster made all the more morally catastrophic by its foreseeability. Winston Churchill famously mocked the strategy of those early capitulationists: “Each one hopes that if he feeds the crocodile enough, the crocodile will eat him last.” In the end, the crocodile still gets its feed.

Distressingly, Prime Minister Scott Morrison seems to be employing a similar strategy of appeasement in the face of an increasingly aggressive, genocidal and tyrannical China. He has refused to be drawn into expressing anything approximating support for the people of Hong Kong as democratic protests there enter a fifth month. Recent footage coming out of the city shows armed riot police bashing protesting school children, slamming their skulls into the pavement. Against this imagery, Scott has called on the protesters to be “peaceful” and urged a “de-escalation of the situation”. Such fighting words!

As Chinese state media threaten military intervention and tanks mass at the border in Shenzhen, he has rejected Richard Di Natale’s call that the government offer permanent shelter to the 18,839 Hong Kong residents of Australia, calling such a measure “premature”. It is lucky that Hong Kong Australians have nothing to fear. They can be comforted by the fact Morrison will issue a bland statement expressing “concerns” when the People’s Liberation Army inevitably rolls into the city to slaughter their families and loved ones.

Scott’s response to the mounting humanitarian catastrophe in Xinjiang is perhaps even more damning. All the conditions for a coming genocide exist in Xinjiang. The Chinese Communist Party operates a vast Orwellian system of surveillance in the province, where at least one million Uyghur Muslims have been interned in concentration camps on the basis of their ethnicity. There, they are forced into slave labour and subjected to physical and psychological torture. Recent reports suggest the Chinese state has begun imposing forced abortions and sterilisations on the Uyghur people in order to prevent future births within the group. This is a clear, concerted effort to eradicate an entire people from existence. These “re-education” camps operated by the Chinese state could be transformed into death camps at any time. We could be staring down the barrel of a new Holocaust.

Asked about the ongoing persecution of the Uyghur Muslim people in an interview with Neil Mitchell, Scott would say: “It’s not for us to go around and tell every country how they run their show … we don’t run China.” It isn’t surprising that Morrison would display such callous disregard for the human rights of vulnerable ethnic minorities given his track record presiding over refugee torture camps on Manus and Nauru. But such moral turpitude in the face of genocide surely represents a new low and the utter abrogation of Australia’s moral leadership in the world.

Ultimately, the outlines of Scott’s strategy (or lack thereof) confronting an increasingly authoritarian and bloodthirsty China are becoming clear. He will allow Hong Kong to be crushed and he will stay silent as blood flows through its streets. He will allow the Uyghur Muslim people of Xinjiang to be exterminated in a chilling genocide the likes of which the world has not seen in decades. He will keep his mouth shut through this all and kowtow to Beijing to ensure trade flows don’t let up at a time when we face recession (can’t risk the beautiful surplus!).

He hopes the crocodile will be satiated by such bloodletting. He hopes it won’t come for him, and us, next. He is deeply mistaken. Peace was not secured by the Munich Betrayal on the eve of the Second World War. It won’t be secured now by betraying the people of Hong Kong and Xinjiang in their most desperate hour. We can only hope Morrison wakes up before it is too late.

The Spectacular Success of Seleucus, the Great Opportunist

Culture, History

Spiridon Raikos

After the death of Alexander the Great in 323 BCE, his empire, which covered land all the way from what is now Albania to the edge of the Indian subcontinent, was split among his generals. These generals, known as the Diadochi, would wage wars over Alexander’s territories that characterised the next fifty years and marked the beginning of what is known as the ‘Hellenistic period’, the time when Greek influence over the Mediterranean and West/Central Asian world was at its peak. One of those generals in particular, and his rise to power, is worth our attention, if only because it demonstrates the enormous role of opportunism throughout history.

Seleucus I Nicator arguably had the most illustrious career of the Diadochi. Having initially lost his satrapy and army to Antigonus Monophthalmus (a rival king and one of the Diadochi) in 315 BCE, he would through the years of war come to regain his seat in Babylon and then rise to be one of the most powerful rulers in the ancient world. At its peak, the Seleucid Empire spanned territory from the western coast of Asia Minor to the Indus River, a testament to the monumental victories Seleucus had achieved. Given his unexpected ascent to power and prominence in the history of the Wars of the Diadochi, the degree of Seleucus’ agency and fortune is a topic of great debate. Yet his career clearly indicates a tendency toward opportunism.

I will argue that Seleucus was an opportunist, not a long-term schemer, drawing upon key milestones in his rise to prominence: his role in the murder of the regent Perdiccas and his subsequent promotion to satrap of Babylonia at Triparadeisus, his conflict with Antigonus, and his flight from Babylon. Seleucus seized opportunities out of both fortunes and misfortunes, and never had any sort of long-term plan.

The impressive rise of Seleucus I Nicator is made all the more so when considering his rather less prominent role during Alexander’s campaign in Asia. He is first mentioned in antiquity for his actions in the Indian campaign of Alexander’s conquests, as commander of the Hypaspists, an elite infantry unit. Unlike the other successors, Seleucus was not one of Alexander’s primary generals. His rise to prominence occurred only after Alexander’s death, over years of the emerging power struggle among the successors. His first major position was as Perdiccas’ chiliarch, second-in-command to the appointed regent, up until his role in Perdiccas’ murder on an ill-fated Egyptian campaign. Having profited from the assassination, he was appointed satrap of Babylon by the treaty of Triparadeisus. In 316 BCE, however, he lost his position due to a dispute with Antigonus, forcing him to flee for his life to Egypt under the protection of Ptolemy, another of Alexander’s successors. While Seleucus’ power was limited during this period, his influence could be felt in his use as a propaganda tool by the other successors opposed to Antigonus. Over the course of the struggle against Antigonus, Seleucus would regain his seat in Babylon and from there ascend to be one of the most powerful, if not the most powerful, kings of Alexander’s successor kingdoms, his new ‘Seleucid Empire’ stretching at its height from the western coast of Asia Minor to the north-west of India. This spectacular rise from complete nobody to indomitable leader raises the question: was Seleucus’ ascension part of a long-running scheme, or did he simply grab any opportunity he could reach, and get lucky?

The key events of Seleucus’ adult life involve the murder of the regent Perdiccas and the resulting treaty of Triparadeisus, which gave him his position as satrap of Babylonia. It would be easy to claim his prominent role in Perdiccas’ murder proves that Seleucus had a constant plan from the beginning, but the regent’s death was the result of a mutiny by dissatisfied troops, and Seleucus joined a pre-existing conspiracy against the regent’s life along with two other officers, Peithon and Antigenes. The apparent motivation for the assassination of Perdiccas (according to extant, if incomplete, historical evidence) was the regent’s ill-fated campaign in Egypt against Ptolemy, who had challenged Perdiccas’ authority, which subsequently led to war. Ptolemy’s dissent against Perdiccas’ regency was not a unique position, however, as the latter sought absolute control; his claims as guardian of Alexander’s interests conflicted with the fact that Alexander’s widow, Roxana, was pregnant and could potentially give birth to a male heir.

In historian David Braund’s analysis of these events, Perdiccas also rejected the claims of another possible heir, Herakles, son of Barsine and said to be Alexander’s illegitimate son; given the importance of legitimate Alexandrian blood, Perdiccas’ denial of this claim was reasonable. Ptolemy also suggested the empire be ruled by the successor generals as a joint council, which would undermine Perdiccas’ authority as regent. Perdiccas had thus made enemies, and in 320 BCE he would face his downfall when the campaign against Ptolemy resulted in disaster. According to Diodorus Siculus, more than two thousand of his men, including some prominent commanders, were killed, prompting the dissatisfaction that led to mutiny and his assassination. The details of the plot are not fully known; Peithon’s involvement is recorded by Diodorus Siculus, with the implication that Antigenes led the coup, while Seleucus is mentioned only in passing.

Given that Seleucus receives less attention in accounts of this event and was not its leader, it is hard to imagine it being somehow part of a grand power play. Perdiccas’ actions, both with regard to the other Diadochi and in his failure to lead his troops to victory, were unpredictable, and thus it is safer to assume Seleucus was acting on opportunity. The following regent, Antipater, nearly met the same fate as Perdiccas at Triparadeisus, and would have succumbed had Seleucus not joined with Antigonus to quell the mutiny. It would be strange to think that Seleucus’ supposed plot involved an alliance with someone so unsuccessful, whose views actively contradicted those of the other Diadochi (especially Ptolemy). It also makes little sense to believe that Seleucus was merely continuing his role in service of Alexander’s military, with Perdiccas at first, then acting as the representative of Alexander’s will until the appearance of a suitable heir. Seleucus’ role in the assassination seems more driven by dissatisfaction with Perdiccas’ leadership than by adherence to some grand plan for ascendancy.

The other key event precipitating Seleucus’ rise was his loss of the satrapy of Babylon in 315 BCE, leading to his reconquest of the same satrapy and the subsequent establishment of the Seleucid Dynasty. Seleucus’ first period as satrap began in 321 BCE, but due to conflict with Antigonus he was forced to abandon his posting in 315 BCE. The roots of Seleucus’ conflict with Antigonus lie in Antigonus’ own rise to power, particularly his symbolic acts and his possession of the royal army, a significant military force that enabled him to take on the rival satrap Eumenes with Seleucus’ aid. Contemporary historians believe the principal source of conflict between the two satraps was the power vacuum left in Alexander’s wake, the account most supported by ancient evidence. According to Appian, conflict between Antigonus and Seleucus began when Seleucus insulted one of his own officers in Antigonus’ presence without consulting Antigonus first, a great public offence. Antigonus responded by demanding to see Seleucus’ accounts. Appian makes no mention of exactly how Seleucus responded here, but describes Seleucus fleeing Babylon, knowing that he could not openly fight Antigonus.

Diodorus Siculus, our other primary source, describes Seleucus’ response in more detail: Seleucus chided Antigonus, stating that the satrap had no authority to investigate Seleucus’ administration, and that his position was his by right, a reward from Macedonia for his loyalty to Alexander. Seleucus subsequently travelled to Egypt and allied himself with Ptolemy. His new role as a propaganda tool, though it afforded him little real agency, helped bring about a coalition of the remaining Diadochi against Antigonus, consisting of Ptolemy, Cassander, and Lysimachus.

If Seleucus had planned his ascension, it would be difficult to square that careful plot, by him or any other actor of the time, with the fact that Ptolemy allegedly gave him only ‘eight hundred foot-soldiers and about two hundred horses’, according to Diodorus, though Appian cites a thousand foot soldiers and three hundred cavalry. The details of his available military forces greatly strengthen Seleucus’ characterisation as an opportunist. No matter which primary source is consulted, Seleucus lacked the requisite forces for a recapture of Babylon. It may also be the case that Ptolemy rewarded him only in passing, keeping larger forces for his own aims and sparing a little once Seleucus was no longer of any use to him.

Standing up to Antigonus in 315 BCE on account of personal pride and objections to Antigonus’ authority, being forced to run to Ptolemy, and then being given practically nothing in the way of military might as a token reward for years of assistance is not the best plan one could conceive if one were plotting long-term to become the next king of Alexander’s empire.

To say the least.

With this puny military might, however, Seleucus did manage to retake Babylon from Antigonid control in 311 BCE. Diodorus explains that Seleucus hoped for local Babylonian support, given the warm reception he had enjoyed in his time as satrap; yet, because his initial force was so small, Seleucus first had to raise the locals’ morale. He allegedly accomplished this by attributing Alexander’s success to experience and skill – attributes also associated with Seleucus, given his extensive military record. Seleucus the opportunist appears to be the best explanation of his actions here: he understood that his military experience was his most expedient tool, and that it was quicker to convince the locals of that fact than to carefully raise an army somewhere else.

Seleucus saw victory after victory (barring a single defeat against the Indian king Chandragupta Maurya), and his empire was established in 312 BCE, not to fall until the Roman conquest in 63 BCE. This impressive rise to power is a testament to his skill as a commander, to his ability to take advantage of the misfortunes that befell him, and perhaps to his personal fortune as well – but it is not evidence of skill at long-term planning. His role in Perdiccas’ assassination was not that of a leader: he merely latched onto a plot that existed before him, and came out on top. He was given Babylon without ever having planned for it, and his undiplomatic refusal to acquiesce to Antigonus’ demand to investigate his administration escalated into outright hostility. His service under Ptolemy gave him nearly nothing in return, yet Seleucus made use of his few resources anyway. His is an empire that began with the grasping of opportunity – with chance – like so many others.

Britpop: The Story of British Politics and Society in the 1990s

Music, Society

Otis Platt

Much has been written about the British music phenomenon Britpop over the past 20 years. I argue that while Britpop started as a reaction against the US grunge scene and as a way to promote a ‘British’ national identity based around working-class grit, it became co-opted by capitalist marketing and the neoliberal agenda of New Labour. Using Benedict Anderson’s theory of “imagined communities”, along with E.P. Thompson’s concept of fluid social class, I will discuss how the Britpop phenomenon of mid-1990s Britain represented a nostalgic British nationalism, looking at three of the most well-known acts of the era – Oasis, Blur and Pulp – at a time when their ideas were being called into question as problematic. I will then detail how this popular alternative music subgenre was co-opted by capitalist marketing and nascent political forces (represented by Labour leader Tony Blair’s New Labour). To show this historical development and its socio-political contours, I will begin by outlining the origins of Britpop in the early 1990s as a reaction against the influence of US grunge.

Emphasis will be placed on how Britpop differed from grunge and attempted to define itself in opposition to it. I will chart the general history of Britpop through its decline in 1997 and the movement’s legacy for British music. The structural analysis will begin by outlining the representations of British nationalism inherent in the lyrics and music of Britpop bands such as Oasis, Blur and Pulp. This will include an analysis of the potentially problematic aspects of Britpop’s nationalism, and of whether it was simply nostalgic at a time when the reality of British identity was more complex. My final section will cover the attempted co-option of the Britpop genre by economic and political interests in the mid-to-late 1990s, showing how the counter-cultural aspects of Britpop became just another facet of neoliberal commodification and part of the new political establishment.

The History of Britpop

Britpop emerged as a musical phenomenon in Britain in the early 1990s. Historians point to the genre emerging after Blur embarked on a lengthy American tour in May 1992, which prompted them to write and record an album about the Americanisation of Britain that November. Lead singer Damon Albarn described British culture as being “under siege”, declaring that “We [Britons] should be proud of being British.” This change for Blur was the product of two factors. Firstly, Albarn repeatedly listened to English alternative records by the Specials, Madness, XTC, The Jam and The Fall with his girlfriend Justine Frischmann, culminating in the band writing new songs on tour that told stories of English life. Secondly, when the band returned home, the UK music press – much like the American press – was filled with American guitar bands like Pearl Jam, Alice in Chains and the Red Hot Chili Peppers. This only made Blur more determined to make Modern Life is Rubbish stridently anti-American rock and pro-British pop. The band insisted that the resulting record favoured British nostalgia over American importation. Albarn, according to Mike Smith, wanted to emphasise Britain’s rich Victorian-era musical tradition combined with the music of the 1960s and “come up with an articulate response to what had come from America.” Modern Life is Rubbish was released in May 1993 as Blur’s statement; the sixties influence of The Kinks was the ‘rubbish’ from the past that the band used to create its music. Thus, Britpop was born as a reaction to American grunge with a distinctly British twist.

In 1994 Blur released their second Britpop album, Parklife, with many of its 14 songs reflecting “Albarn’s claims to a bittersweet take on the UK’s human patchwork.” Parklife reached number one on the album charts. The same year saw Oasis release their first album, Definitely Maybe, a record with a profound Beatles influence. The group, led by Noel Gallagher, saw themselves as “proletarian sons of the soil, come to avenge music’s dependency on intellect and artifice.” This “proletarian swagger” of the Gallagher brothers would be something Damon Albarn and Blur would attempt to emulate, if unconvincingly. By the end of 1994, Noel urged British bands to stick together in order to “go and break America.” However, 1995 saw the battle of the singles between Oasis’ “Roll With It” and Blur’s “Country House” (both released on 14 August) in the bid to reach number one on the charts. Musicians and the press often portrayed this battle as the middle-class southerners (Blur) versus the working-class northerners (Oasis). While Blur won this battle (reaching number one), Oasis won the war, as songs from their second album (What’s The Story) Morning Glory? achieved sustained sales in the US. The year 1995 also saw the release of Pulp’s acclaimed album Different Class, whose lead single “Common People” was a strident response to the “voyeurism on the part of the middle classes” towards “a certain romanticism of working-class culture”, in the words of singer Jarvis Cocker.

The year 1997 is considered the year Britpop declined in prominence. Many point to Blur’s self-titled album as a point of departure, where the English influences were traded in for American lo-fi ones (particularly Pavement), following a stylistic change spurred on in 1996 by Albarn and Blur’s guitarist Graham Coxon. Ironically, the second single off Blur, “Song 2”, would become their biggest and most enduring hit. Additionally, Oasis’ third album Be Here Now has been identified as the moment the Britpop movement ended.

Britpop and British Nationalism

The Britpop movement had a strong emphasis on British nationalism. Ida Hølgo argued that “one of the reasons why British music and Britpop came into vogue was the change of focus from America and its rather depressive and self-pitying grunge, to music of a more positive character and lyrics about topics and concerns that were uniquely British and that British people could relate to.” Sociologist Andy Bennett argued that “music can work in a variety of ways to inform particular notions of nation and national identity.” Against a backdrop of industrial decline, high youth unemployment and an “increasing anxiety concerning the fate of national identity”, Britpop assisted with “the ‘magical recovery’ of British national identity.” In the 1990s there was much debate “about the kind of Britain that will emerge in the twenty-first century”, and about the problems of putting forward any coherent notion of “Britishness”. Britpop’s re-exploration of the themes and imagery contained in the songs of The Beatles, The Kinks and The Small Faces was postulated to be seen “as a nostalgic return of the past – harking back to a Britain that has been lost.”

Certainly, several of the songs of key Britpop acts fitted this trope. Blur were described in 1994 by journalist Cliff Jones as defining “a New Englishness”: “an attitude based not on a nostalgic Carry On Mr Kipling Britain, but a Britain you will recognise – the one you live in”. However, Bennett makes the point that the group’s video-clip to their 1994 song “Parklife” revisited “some of the themes and ideas concerning British life explored in the lyrics and music of 60s British bands.” The video explored the regional identity symbolised by Phil Daniels’ cockney accent, along with the virtues of simple working-class life (terraced streets and trips to the ice-cream man) that were so prevalent in the 1960s. By the 1990s, though, this was all done with a heavy dose of irony, with Bennett suggesting that the video harked back to “a ‘golden age’ of British life.” These concepts of British identity were also explored by Pulp.

Pulp’s video-clip to their song “Common People” was also replete with this new British nationalism of the 1990s. The Kitchen Sink social realism of the video evoked nostalgia for the 1950s and 1960s in a romanticist tone, reviving a class identity that appeared to be disappearing by the 1990s. That earlier era was also a time when regional and national identities were more strongly felt, and the “Common People” video-clip went to as great lengths to recapture this as the Blur video did. It is argued that these representations form part of what Anderson calls an ‘imagined community’ of nationhood, as they are constructed to reinforce an archetypal version of national identity – something imagined as “a deep, horizontal comradeship.”

At the time these representations of British identity were criticised as inauthentic. It was suggested that Britpop was glossing over the recent British social history of the racially turbulent 1970s and 1980s. It was seen as promoting a traditionalist view of British cultural identity through “flag-waving.” Jones argues that “[as] Britpop became the music that was seen as the preferred representation of British cultural identity, its ethos came to reflect and reinforce a nostalgic and chauvinist cultural turn which privileged whiteness and to a lesser extent maleness.” Retrospectives of the 1990s focusing on Britpop often ignore the success of socially, racially and sexually diverse and politically charged genres such as jungle and bhangra (electronic rather than guitar-based) that “engaged with and reflected contemporary concerns and anxieties rather than attempting to mould a self consciously nostalgic national identity that excluded large swathes of the country.”

However, this author takes a view along the lines of E.P. Thompson’s: we should not judge the actions of the key players of Britpop, as their “aspirations were valid in terms of their own experience.” Additionally, writers at the time argued that Britpop was not necessarily the Britain one recognised and lived in, but rather one resource among many that young people chose to associate themselves with, thanks to the cultural fragmentation of the period. It was part of a pluralism of identity politics in which one could take solace in the “magical recovery” of traditionalism or embrace a new national identity more encompassing of multiculturalism. Both Britpop and South Asian Dance Music included all Britons regardless of race. Britpop was simply one version of national identity that coexisted “unproblematically alongside a range of other possible versions of national identity facilitated by an increasing musical and stylistic diversity within British youth culture.” Therefore, the criticism that Britpop was too white is unfounded.

Britpop’s Decline and Co-option by New Labour

The 1990s in Britain was a time of great consumerism. Britpop became one of “the most significant creative industries of the decade and became the shibboleth of cultural consumerism in Britain.” This was enhanced during the “Cool Britannia” period (1996-1998), when Tony Blair’s Labour Party used Britpop and its commercial consumption to great advantage, winning the 1997 General Election. In the lead-up to the election, Blair was portrayed as hip and cool, in contrast to the Conservatives. He even invited Noel Gallagher to Downing Street in an attempt to make the government likeable in a popular (or populist) sense.

Once in government, Blair used Britpop as a means to advance his socio-economic policies. Scholars have pointed out that Britpop “became increasingly acquiescent with, and deferential towards, the specific expressions of neoliberal triumphalism of the Blair years.” Navarro argues that New Labour successfully marketed Britpop through the politicisation of consumerism and the marketisation of popular culture, generating a cultural movement between 1996 and 1998 (“Cool Britannia”) by connecting politics with music, art and fashion. In endorsing “Cool Britannia”, Blair and his government “fostered a new conception of a young, dynamic and multicultural Britain by bringing pop culture to the foreground and activating a market-driven economy.” Once Blair won office, Noel Gallagher was appointed to his Creative Industries Task Force. This shows the close link between politics and the music industry in this period, emphasising Blair’s “favouring of a dynamic economy stimulated by culture industry.” Thus Britpop began to serve the neoliberal economics of Blair’s New Labour.


Britpop represented the cultural zeitgeist of 1990s Britain. While it began as a reaction against grunge, affirming a sense of British nationalism and working-class identity in the process, the movement became co-opted by political interests. Its use by the Labour Party ensured Tony Blair’s ascension to Number 10 and emboldened a neoliberal consensus across both sides of British politics.

While it came to reflect the neoliberalisation of British politics, I argue that Britpop did not represent an exclusionary and nostalgic British national identity – it did quite the opposite. It was one cog in a pluralistic conception of national identity, one section of the imagined community of 1990s Britain. Britpop’s co-option by commerce coincided with its use in New Labour’s election strategy, along with the movement’s musical decline. By the end of “Cool Britannia” the party was over.

The Fetish: A History, A Future, and Why You Can’t Survive Without One

History, Philosophy

Peter Calos

On the first of October, in 2013, a man revealed anonymously to the internet that he had a deeply-held and obsessive fetish for the state of Ohio.

His story was as follows:

He lived in Ohio, and was intensely interested in hunting and in the geography and history of the state; enough to read an obscure book by a longhunter (a hunter who embarks on extended excursions, some lasting as long as six months). The book concerned the fault lines around the Ohio River Valley, and its author was a local, whom he quickly managed to contact and befriend. This author happened to lead a hobbyist group that often embarked on hunting trips near the valley. Our protagonist was encouraged to come along on one of these trips, where he was led into an obscure forest near the Ohio River. Once there, the members of the hiking party encouraged him to take several unidentifiable pills, and he experienced audio-visual hallucinations while the leader of the troop spoke about the history of the land and performed sexual acts in front of him.

Emerging from the experience, the man found he had acquired a…shall we say, a ‘certain taste’ for the fault-lines and specific geological features of Ohio.

This story, posted on 4chan slightly under six years ago, is almost certainly not true.

Yet it raises the question of how our interactions with and ideas of the fetish have changed over time. What is the origin of this idea, of attraction to a physical object? The etymology? Its conception over time? How should we consider the ‘fetish’ in modern society?

Tracing terms

The term ‘fetish’ first rears its head in 16th and 17th century travelogues, written by European traders journeying as far as their supplies and superstitions allowed into West Africa. Common parlance has it that the term ‘fetish’ refers to a small wooden idol, worshiped as a god by African tribesmen; Portuguese traders from the 15th century, however, originally distinguished between a feitico (an object worn on the body and used in rituals) and an idolo (a medium of worship). The conflation of the two terms was a European generalisation.

As the concept became more widespread, the colonialists’ impression of the fetish developed into that of an object obstructing the natural path of commerce: a piece typically composed of wood, stone, or bone which captivated its owner despite its complete lack of monetary value.

Introductions of the fetish to Western intellectuals were fraught with travellers’ preconceptions of fetishes as primitive African misunderstandings of the universe. The belief that devotion to a physical object could change the natural state of the world, bring prosperity to the unfortunate, or curse a particularly offensive person seemed ridiculous to a culture that had long since accepted monotheism and the idea of an incorporeal god.

And this was also a culture that had only relatively recently recovered from the cultural shock of Protestantism, which had raised the following questions:

  1. What to do with physical representations of divinity
  2. How to accept that the Catholic power structure was both necessary to uphold social values, and also irredeemably corrupted by centuries of selling relics and indulgences.

Discrimination from the Christian world followed fetishists (if I may be allowed to use the term in an anachronistic context) in West Africa and in the Haitian colonies, on account of their association with witchcraft, sorcerous acts, and the deception of others through perceived tribal fakery. These tensions were compounded by the efforts of the West Africans to resist their oppressors with highly effective poisons. The use of fetishes became yet another aspect of the ‘barbarian’ image the West had of her colonies.

In 1757, the French philosopher Charles de Brosses coined the term ‘fétichisme’ to describe “the religious delusion that blocks recognition of rational self-interest and social order.” This description signposts two aspects of the European mindset: it is an evolution of the Portuguese traders’ conception of the fetish as a useless trinket clogging up the market with inflated, non-existent value, and it is also an evolution of the idea that the fetish is an obsession – something that cannot be disregarded on a whim, or bought or sold in the first place.

Fetishism abstracted: 19th century Europe

The backdrop of the late 19th century: European imperialism, contrasting the supposed civility of a Europe being torn apart by politics and economic depression (witness the Austro-Hungarian Empire) with the brutal reality of colonial oppression. What Belgium had done in the Congo, and Germany in Southwest Africa, was widespread knowledge, to say nothing of the position of African-Americans in the United States.

Fetishism as a concept had become abstracted from its roots in totem worship, and had veered off wildly into two directions:

Commodity: Karl makes his Mark(s)

Commodity fetishism finds its roots in Karl Marx’s writings. In creating his labour theory of value, Marx conspicuously neglects a quite obvious source thereof: black slaves. In Marx, the slave is not a commodity, or a productive entity, or even a person: he is a ‘pedestal’, an object lacking agency, whose work is absorbed into Marx’s equations as part of the socially autonomous white labour force. In Marx’s work, the African slave is a demonstrative example of the plight of the European wage-worker, not an object in himself.

Yet the great irony of Marx’s attitude towards the fetishists is that his labour theory of value was itself a framework to be applied to the world in the absence of real evidence – something to be taken on faith and acted upon by a unified proletariat. In other words, his work was written to create a framework by which to judge the world and act accordingly, which is the same principle as that of the African priests with their fetishes. Marx, being Jewish and downwardly mobile in class terms, was in a decidedly poor position within his own framework, and so must have considered the subjection of African culture to Europe to be in his personal interest (adding the Africans to the ‘ladder’ of European class would have lifted him, relatively, one rung higher).

Commodity fetishism: when producers and consumers perceive one another in value terms, as mere creators or purchasers of value rather than as people. Economic relations abstract the reality of a given situation and hide the cruelty of the capitalist towards the worker behind market-oriented language. In other words, in Marx, the ‘fetish’ is an obscurant: an obsessive, religious framework that conceals the truth of the world.

Sexual Fetishism:

The first person to coin the term ‘sexual fetishism’ was the French psychologist Alfred Binet (1857-1911), also the progenitor of the first IQ test. Binet established the belief, popular among contemporary psychoanalysts, that a fetish arose through an associative process – as the lasting after-effect of a sexually charged first impression. Following in his wake was the Austrian Richard von Krafft-Ebing (1840-1902), whose exhaustive book of sexual pathology, the Psychopathia Sexualis, challenged many pre-existing ideas on the formation and classification of perversions. The book was considered an essential resource for 19th century psychologists.

Krafft-Ebing retained the idea of perversions as functional sexual deviations which arose during puberty and declined after 40. He also wrote about individual fetishes in a distinctly gendered manner, referring to sadism and lustful murder as excessively manly, while masochism was excessively effeminate. Masturbation, in Krafft-Ebing’s view, was a key component in causing a fetish to appear.

He differed from his predecessors, however, in asserting that fetishes were mainly brought into being by hereditary tendencies – a ‘taintedness’ in the family line which led to imbalances between inhibition and sexual instinct. This instinct was aggravated by stimulation, but not caused by it. Like Binet, Krafft-Ebing believed that a specific fetish was formed by the association of an object with inborn sexuality.

His most important deviation from the contemporary consensus, however, was the new conception of a fetish as not the result of degeneracy, a weak anatomy, or weak will, but as an intrinsic part of a person’s psychological composition, inseparable from that person. This more liberal perspective allowed him to then separate actions from psychological states – ‘perversions’ from ‘perversities’. “In order to differentiate between disease (perversion) and vice (perversity) one must investigate the whole personality of the individual and the original motive leading to the perverse act. Therein will be found the key to the diagnosis.” Perversion was now separated in the popular consciousness from immorality and crime, and thoroughly individualised.

One cannot avoid Freud.

In his first three essays on the subject, Freud simply summarised and restated the views of his predecessors. Then came the departure: according to Freud, the sexual norm (attraction to a mature member of the opposite sex) was a perversion in itself, in the sense that the disposition towards perversion was common enough to overlap with sexual norms and thus formed a part of sexual normalcy. In simpler terms, perversions existed, and were defined in much the same terms as by Krafft-Ebing and his adherents, but they were universal. Childhood sexual proclivity was perversity to Freud because it always had the potential to veer off into any fetish as a consequence of a formative sexual experience.

In 1927, after having delved deeper into his psychological studies, Freud returned to the concept of fetishism, and redefined it as a result of traumatic childhood experience. Such a radical idea was a point of contention between Freud and other psychologists and contemporary sexologists, but this and other differences largely rose from a difference in objectives: Freud was sceptical about the possibility of ‘curing’ the perverted, while the main body of European psychoanalysts considered themselves medical workers. This was what distinguished psychoanalysts of the 20th century from sexologists: a focus on treatment versus research.

Fetish as Universal Phenomenon

‘Sexology’ became an accepted and well-defined intellectual discipline around the turn of the 20th century. The discipline was politicised in the sense that it dealt with power relations and the representation of deviants – the founder of the first sexological journal, Magnus Hirschfeld, defined sexology as a ‘progressive science’. His findings supported this definition: sexual deviation was not pathological or dangerous to society on a wider scale. A second founder of the discipline, the Englishman Henry Havelock Ellis, claimed that sexology should serve a primary role in the politics of sex reform, and tried to garner sympathy in particular for sexual inversion.

Historical and anthropological contexts were added to the study of sexology, to divorce it from the exclusive domain of psychoanalysis and to potentially gain a deeper understanding of sexual proclivities. The question for Iwan Bloch, a major figure in the field and the creator of the term ‘sexology’, was not the origin and treatment of fetishes, but the reason they had been repressed throughout most of human history, and why they continued to be repressed. Activism from these circles mainly focused on the legal reformation of anti-homosexual laws, even from those sexologists who believed homosexuality to be a mental disorder.

Ellis argued that the phenomena central to perverted desire were closely related to socially accepted sexual norms, implying that fetishists were in fact closer to the sphere of ‘normality’ than people had previously believed. According to him, sexual desire existed on a bell curve, with the majority of society close to the mean and a relative handful of individuals located at the extreme ends. Yet not all of the standards upheld since Krafft-Ebing’s time were lost: Ellis considered exhibitionism a perversion of the courtship instinct, and believed excessive self-stimulation to have harmful side-effects.

Ellis found a receptive audience among 20th century Americans, in part because his writing was less obsessed with theory than the typical German tract, and in part because it was filled with examples of deviation. These details, released publicly in popular science books, were seen as a form of social amelioration for those who had previously been pathologised. Many American sexology books became bestsellers.

Statistical and biological arguments were both used to reduce the stigma of the fetish: because much of the population had a fetish of some kind or another, according to sexological research, practices that were ostensibly deviations were in fact secretive norms. Many fetishes were also practised by animals, implying a biological – or natural – basis for the fetish. The idea was radical at the time, and by no means an accepted perspective – but an extant one.

Well, what do you think it means?

Try the following experiment: run the gamut of 20th century anthropological attitudes on the fetish, as I have just outlined them, through the gauntlet of 1960s and 70s progressivism. The result is our contemporary conception of the fetish as something slightly scandalous, but mostly harmless, and usually privately admissible.

(One will forgive the lack of relative detail in this section; my assumption is that the layman is familiar with the liberalising tendency of the second half of the 20th century on various fields of the social sciences, and the reader is also aware of the liberating effect its socio-cultural movements had on the public consideration and expression of sexual deviance. There is little I could add to that understanding within the scope of this work. To return to our topic, the modern-day fetish…)

According to the ever-reliable Oxford dictionary, a fetish is now “a form of sexual desire in which gratification is linked to an abnormal degree to a particular object, item of clothing, part of the body, etc…an excessive and irrational devotion or commitment to a particular thing.” The original meaning of the word as an object of African worship is now apologetically retained as an outdated secondary meaning.

Since it has become to a large degree secularised, in the sense that it no longer refers strictly to either the sexual meaning or the original religious meaning, the term is somewhat less charged in modern Western society.

With that in mind, and given that we appear to have run out of history to analyse, please allow me to delve into the realm of speculative philosophy, to create a prospective definition for the term ‘fetish.’


The term may be increasingly divorced from its overtly psychological meaning, and come to refer to something like the following:

“A specialisation undertaken for its own sake, a private interest (not reliant on anyone else sharing it for it to interest you) that serves as a framework through which you interpret the world. Or a ‘motivating framework’. Not something that exists in isolation, and not something that completely dominates the mind, but exists in tandem with a whole host of other specialities and interests. We might tentatively call a ‘worldview’ a ‘collection of fetishes.’”

This covers both the original divine spectrum of the word and the modern secular use.

Under this definition, the fetish can be considered a ‘god’ in the sense that it provides an underlying meaning and reason to act in the world, and as an artistic/creative endeavour.

  1. Fetish in divine terms

There is no reason to discount the original definition, since a fetish and a god are the same. There is a tie here between the idea of ‘god as fetish’ and ‘fetish as god’. Both contain the key to meaning: a solid bedrock, an unquestionable foundation through which to interpret the world. Unquestionable in the traditional sense due to superstition, certainly, but now also unquestionable from a secular viewpoint as a result of the time put into it, the hours of understanding gained from a lifetime of experience, and from the inability of even the most ardent postmodernist to discount that experience.

A fetish contains its own minuscule yet gargantuan world: within the area of a single art-form or of a profession is a universe-full of specialist terms, of ideas, a personal history relative to the history of everything outside of it, which may actually belie or contradict another fetish’s history in tone, if not in content. A fetish defines things.

An example. Ask a 21st-century atheist what God is and he’ll likely reply with some variation on ‘a psychologically driven superstition’. None of the terms of religion or the practices thereof have any meaning for him. But a 12th-century Frenchman believes that God is the underlying reason for everything, and the existence of his God – his fetish – is what allows him to define everything and understand abstract events. He can look at an assortment of religious tools, symbols and icons, and understand how they all fit into the overarching tapestry of Christian faith – what each piece means, what it’s used for and why. In other words, for the man with the belief, the man with the fetish, somewhat arbitrary practices have their own meaning and map onto the world in a very specific, specialist way.

To use a secular example, a chess player understands the reason each piece moves the way it does – because it’s an abstraction of a certain type of warrior on a battlefield.

The understanding of shared world history differs between a fetishist and a non-fetishist. The aforementioned 12th-century Frenchman considers the crucifixion of Jesus to be the supreme moment of salvation, the event around which the world turns, while the atheist would view it as merely another Jewish rebel being given a typical Roman punishment. To use a less dramatic example, consider ‘history’ as viewed by an art historian, versus someone whose chief focus is political history. A peaceful exchange of techniques, driven by outside factors which are not in themselves important – versus the history of those same factors.

Whoever lacks a fetish lacks meaning: every aspect of life takes on an equally weighted, agnostic, characterless quality, and the observer becomes a post-modern believer in nothing, immobilised, with only a casual interest in everything, unable to decide that anything is more important than anything else. This is the fate of the fetish-less.

2. Fetish in artistic terms – what makes good art?

The fetish in regard to art is a combination of the excessive focus outlined in my modern definition (specialisation for its own sake) and the Marxist idea of commodity fetishism, or the uniquely sellable product. A fetish is what distinguishes one piece of art – one product – from another. What people typically consider ‘good’ art is usually a piece with a strong, unique focus, where a specific theme is explored in unusual and interesting depth, as opposed to a poor work, where a theme is ignored, treated in a shallow manner, or used as window dressing for the sake of some irrelevant aspect of the work. In other words, good art is good because it caters to a specific fetish.


Whether my prediction for the future of the idea of ‘fetish’ is deluded or prophetic, out-of-order or the order of the day; whether the fetish will become an entirely secular, non-sexual concept again or whether the European psychologists have associated it with the carnal instinct beyond all recovery is not something I can tell you.

Take, as a final example, the excessive, fervent devotion of a frenzied, sexually frustrated acolyte for a religious icon, which promises to cure him of his shameful impurities: an icon which he is intent on purchasing. Here is the archetypal frustrated, embarrassed 15th-century Christian, convinced by the Catholic church that he can indulge in his forbidden fantasies if he purchases the requisite volume of indulgences. In this man, every aspect of the term ‘fetish’ is combined into a focal point. It is at once an obsession, a commodity, a locus for his desires, and a thing that grounds his worldview and allows him to define his world relative to it. So, perhaps, the term will never lose its potency, as long as we have that example to draw upon.

Within the history of the fetish we encounter a history of cultural exchange, unwilling and purposeful, of rebellion, discrimination, of mental deficiency, of degeneracy, of religions lost to time. The fetish as object and the fetish as symbol of the mind have been collated in our modern understanding as a slavish fixation, something to obsess over and fascinate us forever, no matter the future.

The Ageism Epidemic

Australia, Philosophy, Politics, Society

Lachlan Green

Disclaimer: Names have been changed at the request of those involved for the sake of privacy and dignity.

Jodie is an incredibly talented artist. Her rural landscapes dot the walls of her villa and her bedroom. The intricate details of the country settings are made all the more remarkable by the discovery that every location is painted from Jodie’s own memory. Her modesty means she adamantly refuses to take any compliments about the works, preferring instead to criticise the tiny imperfections that only an artist can see.

Jodie is also in her late 70s and living in a residential aged care facility. In fact, her talent for painting wasn’t uncovered until her first art therapy session in the facility, where she revealed a gift she had never explored over the years. A major part of Jodie’s life has been her involvement in her local rugby league club, where she was heavily involved in team management and business operations for many decades. She maintains an undying love for her footy team, and frequent visits to the club show that she’s still a familiar face whose significant contributions are often celebrated.

On one standard Tuesday, Jodie had to catch a taxi into Brisbane city for a regular hospital appointment. When the taxi arrived, Jodie saw that it was a larger, van-style Maxi Taxi. Due to age-related complications, Jodie has limited mobility. On this particular day, the taxi driver refused to call a smaller taxi and refused to assist her into the back of the vehicle. After some time, Jodie hauled herself into the back of the cab and they were off.

On arrival at the hospital, the taxi driver remarked that she was slower than most people and asked her to get out of his vehicle. Jodie sat for a moment and asked once again for assistance, this time clearly upset with the time all of this was taking – and the driver finally agreed. He briskly opened the side door, grabbed Jodie roughly by both arms, and applied enough pressure that she felt she was being dragged out onto the road. Jodie later showed me the lingering grab marks on her bruised arm. Jodie made one comment to the driver before leaving: “When you’re my age, I hope that no-one treats you the way you have treated me.”

While this is a more extreme example, it is just one way that ageism (age-related discrimination) manifests in our society. Ageism is surprisingly rife in western society, and although many people would claim it does not affect their daily thinking, it pervades culture in interesting and unrecognised ways. Simple, “harmless” generalisations about older generations and assumptions about older people’s abilities are basic ways in which discrimination manifests and paves the way for more sinister forms of ageism. For instance, the taxi driver’s intolerance of Jodie’s impaired mobility could most likely be attributed to a subconscious belief that, as a younger person, he was superior to her. Ideas about ageing implicitly perpetuated in the West centre on people becoming less capable and less valuable as they grow older. This is demonstrated by the countless stories of people unable to find work once they reach the later stage of the “middle-aged” bracket.

Of course, the most extreme manifestation of ageism is elder abuse, one of the most globally prevalent forms of abuse. Elder abuse merits an article to itself, but it is safe to say that stories of physical, sexual, financial and psychological abuse and neglect are not hard to find. Evidently, the mindset of many perpetrators of elder abuse is based in ageist stereotypes and judgements of the older victims.

As with other forms of discrimination, ageism can be questioned philosophically. The big question is: what part of human reasoning causes ageism? Many people have provided a range of answers, all of which probably hold some degree of truth. Some would say that the common perception that older people are a ‘drain’ on society and on public resources causes people to treat them with disregard. Yet this perception itself is inherently ageist. Many older people do contribute economically, and the generalisation that older people serve no purpose in society is central to the problem of ageism (and is nothing more than a gross stereotype). In fact, in many instances where older people are not contributing, it is because younger people in the workforce refuse to work with them.

Another interesting, and perhaps more philosophical answer would be that older people are treated poorly because of a human fear of growing old. While I am not in any place to deny or confirm that answer, I do not see fear as a valid reason to treat someone as less than oneself.

The story of Jodie is not unique; it’s not even especially remarkable. The taxi driver was reported and is pending disciplinary action. Jodie claims that she’s okay, but she has stated, “I don’t feel like I’m comfortable going out on my own anymore.” Yet it’s the words that follow that floor me, and in the many instances of ageism I’ve been informed of over the last four years of working in, or close to, the aged care industry, these words always seem to follow: “Don’t worry, it’s just how it is.”


UQ Campus Heated as Anti-CCP and Chinese-Counter Protesters Clash

Australia, Brisbane, Politics

Nilsson Jones

UQ’s Market Day at the St Lucia campus was hijacked by mid-afternoon as tensions boiled over between Chinese and Hong Kong protesters.

Market Day was an opportune time for anti-Chinese Communist Party (CCP) and pro-Hong Kong protesters to join together and get their message out to students. 

Drew Pavlou, an organiser of the anti-CCP protest, promoted the event on Facebook in the days leading up to Market Day.

Mr Pavlou has been a vocal student critic of the persecution of Uighur Muslims in mainland China, as well as of the controversial relationship between UQ and the Chinese Government – namely, international students bankrolling the university and the presence of the Confucius Institute on campus.

On Wednesday morning, Chinese students contacted Mr Pavlou and encouraged him to cancel the event, as it would cause unnecessary and unwanted damage to the broader Chinese-Australia relationship and implicate students who wish not to become involved. Some sent Mr Pavlou death threats.

The event went ahead at midday, beginning as a peaceful sit-in with students holding signs that criticised outgoing UQ Vice-Chancellor Peter Hoj over the Chinese Government’s supposed influence at UQ.

Pro-CCP students then gathered directly across from the protesters at the entrance to the Great Court near Merlos, as a form of counter-protest.

This is when the day got interesting. 

The counter-protesters arrived with speakers loudly playing the Chinese national anthem on repeat, as students joined in the singing and chanting that continued endlessly throughout the afternoon’s ‘festivities’.

Both groups screamed and chanted at each other until tensions boiled over when Chinese students began tearing up pro-Hong Kong signs and pushing the protesters away. 

The violence then escalated when Mr Pavlou, amongst others, was assaulted – repeatedly punched, then pushed into a group of seated students.

Another student was hit over the head with an energy drink by a pro-CCP student; it was then that a UQ security officer stepped in and was bitten by the Chinese student.

It is important to note that there were UQ security officers present throughout these events, however, they remained largely uninvolved to avoid the appearance of stifling debate and free speech on campus. 

It was after signs were stolen and students were punched that police arrived on campus and remained there until the groups calmed down and later moved on. 

There was a brief forty-minute period when things looked to have fizzled out; the groups had dispersed from the Great Court. Then a group of students led by Mr Pavlou occupied the Confucius Institute on campus and live-streamed from the location.

Following the Confucius Institute protest, approximately fifty Hong Kong students sat down near the Grassy Knoll. 

Domestic students who were in support of the Hong Kongers, as well as those in opposition to the CCP, huddled behind the students in a showing of support. 

Over the next thirty minutes, pro-Chinese students re-grouped opposite the Grassy Knoll, their presence growing to over 350 as tensions began to rise once more.

The chanting and singing returned as both sides grew increasingly agitated by one another until the second round of violence ensued. 

Police and media presence was far more apparent at the revived protest as key figures from both sides were interviewed and questioned about the day’s events. 

This violence and motivation to protest did not exist in a vacuum; there were several peaceful Hong Kong student protests at the end of last semester showing solidarity with family and friends back home.

Similarly, tensions and debate have been present on UQ platforms such as Stalkerspace following the events and protests in Hong Kong in previous months. 

Wednesday’s events will most likely not be a one-off, with a larger protest already proposed on campus for next Wednesday.