
by Daniel A. Kaufman

_____

My discussion with Milton Lawson, comic writer and cultural critic, on the Stranger Things phenomenon, Generation X, and popular culture.  Originally aired on MeaningofLife.TV, July 17, 2019.

"Stranger Things" and Generation X | Daniel Kaufman & Milton Lawson [Sophia] - YouTube

 


by Miroslav Imbrišević

___

Justice is an important human good. I will review one ancient and one modern principle of justice and discuss their relevance for the political sphere, paying particular attention to the Trump presidency.

We have inherited two maxims from Roman law that are commonly known as representing principles of justice: nemo iudex in sua causa (let no one be a judge in their own cause) and audi alteram partem (listen to the other side). They are also called “principles of natural justice” and have for a long time been part of the common law. Apart from the legal realm, these principles can be applied in the private sphere, as well as in political life.

The role of these principles is to put formal (i.e., procedural) constraints on the exercise of justice. With respect to the first, in the case of the law, a judge should not rule on a case involving his or her spouse, other relatives, friends, or enemies. For an example outside of the law, we might say that the chair of a board should not decide his or her own remuneration. By barring people from judging their own cases, we avoid dangerous conflicts of interest.

This principle recently came to the fore, when President Trump claimed the constitutional power to pardon himself, in the case of impeachment and conviction by the Senate. There may be ambiguity in how the relevant passage in the US constitution should be interpreted, but I would submit that the framers of the constitution had nemo iudex in sua causa in mind. Regardless, this particular principle of justice helps us to decide the question before we address any issues of constitutional interpretation: as a basic ethical matter, Trump should not be the judge of his own case.

The military junta in Argentina attempted something similar in 1982. Five weeks before elections were to return the country to civilian rule, the generals passed a law which would give the military and the police immunity from prosecution for their deeds during the “dirty war” era. Thousands of people disappeared and many were tortured or even killed. The generals justified their atrocities by claiming that they had “fought for the dignity of man.” It took a long time, but the Argentine Supreme Court finally struck down the immunity law in 2005.

Only once did a sitting President – Gerald Ford – pardon another President – Richard Nixon – and we can construe Ford as having acted on behalf of the citizenry. A President who pardons himself for impeachable offences while in office usurps the right to judge a wrong-doer, which normally rests with the people or their representatives. Sitting Presidents are special in that their malfeasances are tried by the Senate, presided over by the Chief Justice of the United States. Trump’s belief that he can pardon himself reveals that he is convinced that others cannot judge him, an attitude we associate with absolute rulers, not U.S. Presidents.

I will now turn to another principle of justice, advanced by the philosopher John Rawls and according to which “positions of authority and responsibility must be accessible to all,” which, among other things, would seem to forbid nepotism. In politics, this principle was most famously invoked in the wake of John F. Kennedy’s appointment of his brother, Robert, to the post of Attorney General, in 1961. Though it turned out that the young and inexperienced lawyer was a capable official, six years later, in 1967, Congress passed an anti-nepotism law – also known as the “Bobby Kennedy Law.” In the American imagination anyone can become President of the United States, so surely, anyone should have the opportunity to take up other government positions, regardless of parentage. The sole consideration should be merit.

We find similar ideas in ancient China. The Chinese imperial examination for selecting government officials goes back more than 2000 years. Having competitive exams would ensure that government appointments would be based on merit rather than on social background and connections. In the 19th century, several European states as well as the U.S. saw the benefits of the Chinese system and adapted it for their own countries. Not only would this prevent the concentration of power in the hands of a few families, but it would also ensure that the best candidates would be appointed to government positions.

The U.S. anti-nepotism law explicitly includes the President. However, under the current administration, Deputy Assistant Attorney General Daniel Koffsky overturned longstanding legal advice against the hiring of relatives, insisting that “In choosing his personal staff, the President enjoys an unusual degree of freedom, which Congress found suitable to the demands of his office.” It would seem, then, that it is now acceptable for the President to give jobs to his daughter and her husband.

It doesn’t matter that the positions are unpaid, because Trump’s relatives occupy positions that were not open to others, who might have been more qualified. Being close to power also often results in a pay-off later, something we observe when former politicians gain plush jobs in business or benefit in other ways from their former positions. But even if the family members were well qualified for the job, the slightest hint of favoritism should be avoided, or else citizens may lose trust in their government, which is crucial to the functioning of a democracy.

This principle can be extended to apply to the buying of offices. Prior to the Reformation, one could purchase and sell church offices (a practice known as “simony”) and use them to enrich oneself. In the 16th century, judicial offices, known as “venal offices,” were for sale in France. Today, in the U.S., large donations to a political party can buy you an ambassadorship, and in the UK such generosity to a party may lead to a Lordship.

President Obama appointed Colleen Bell, a soap-opera producer (‘The Bold and the Beautiful’) to be the ambassador to Hungary. She had made large donations to his election campaign. Her bachelor’s degree may have been in economics and political science, but her performance during her confirmation hearings was dismal. When questioned by John McCain, she couldn’t name a single strategic interest the U.S. has in Hungary.

It is common for 30% of ambassadorships in the U.S. to be political appointments. The rest are filled by career diplomats. During the Obama administration, the share of amateur ambassadors rose to 37%, and in the first two years of the Trump administration, it went up to 42%. In other democratic countries, this would be unthinkable, because regional knowledge and political sensitivity are minimum requirements for these posts.

Although Britain traditionally only sends career diplomats abroad, there have been exceptions. In 1977, Peter Jay, a journalist and broadcaster, was appointed ambassador to the U.S. His unique qualification, as one commentator observed, was being the husband of the Prime Minister’s daughter.

Also under the umbrella of this principle is the idea that you should not profit from your office. This is why it is customary for politicians either to sell their business interests or hand over the running of them to others (preferably not family members or close associates), prior to taking office. President Trump has chosen to relinquish his position as CEO of the Trump Organization to his sons, and because of these family ties, he retains a close connection to his former businesses. Consider the following: on 1st January 2017, shortly before he took office, the initiation fee for his luxury resort in Florida, Mar-A-Lago, doubled, going from $100,000 to $200,000. One could say that this was a shrewd business move, but when you take political office you are supposed to serve the people rather than your own business interests.

The President is a frequent visitor to Mar-A-Lago and likes to use it for official visits by foreign dignitaries. This opens up the opportunity to buy political or commercial influence by becoming a member. His private clubs also serve as recruiting grounds, and a number of longstanding members of the President’s various clubs have been elevated to ambassadorships.

One of President Trump’s election promises was that he would “drain the swamp.” Judging from his actions, it appears that he is hydrating the swamp rather than draining it.

Miroslav Imbrisevic is a philosopher, formerly of Heythrop College/University of London. His background is in legal and political philosophy, but recently he has started working in philosophy of sport as well.


By Daniel A. Kaufman

___

Bernard Williams is one of the five greatest post-WWII philosophers.  Here is a new website devoted to collecting everything available by and about him.

https://sites.google.com/site/bernardwilliamsphilosopher/home

Rare and very high quality live footage of Def Leppard, from 1983, when they were still a hard-rocking NWOBHM band. (Concert split into two parts)

DEF LEPPARD - Live In Germany: Part 2 (Rockpop In Concert, 18.12.1983) OFFICIAL - YouTube

A classic paper by Hilary Putnam and one of several important efforts within analytic philosophy to show that linguistic meaning should not be understood in terms of mental representation.

https://conservancy.umn.edu/handle/11299/185225

Pat Buchanan may be horrible, but he is politically savvy.  A keen take on how Trump has played the Democrats again.  Will my party ever learn?

https://www.theamericanconservative.com/buchanan/trump-fuels-a-tribal-war-in-nancys-house/

Eddie Murphy was a comedy genius.  One of his greatest SNL sketches.

James Brown's Celebrity Hot Tub Party - SNL - YouTube

Israeli-Style Hummus Masabacha (Hummus in which the chick peas are left whole, rather than blended with the tehina) and Israeli Tehina

https://www.seriouseats.com/recipes/2017/11/hummus-msabbaha-masabacha-musabbaha-chickpea-recipe.html

https://www.seriouseats.com/recipes/2016/03/israeli-style-tahini-sauce-recipe.html


by Heather Brunskell-Evans

___

Thirty years ago, ‘the transgender child’ would not have made sense to the general public, nor would it have made sense to young people. Today, children and adolescents declare themselves transgender, the NHS refers some children for ‘gender-affirming’ therapy, and laws and policies are invented which uphold young people’s ‘choice’ to transition and authorize the stages at which medical intervention is permissible and desirable.

The current narrative of the transgender child has numerous attendant strands: although previously unrecognized, children born in ‘the wrong body’ are alleged to have always existed; parents are ‘brave’ when they accept that their daughter is ‘really’ a boy (and vice versa); active cultural support of children’s gender self-identification helps revolutionize hide-bound, sexist and outdated ideas about gender; and medical intervention is a sign of a tolerant, liberal and humane society. What is the provenance of such a narrative? On what scientific, medical, psychological, or philosophical bases are these composite ‘truths’ founded?

Liberal Wisdom 

A small event serves to illustrate the ubiquity of the current liberal wisdom that the transgender child is real and that transitioning children is progressive. A young woman in her early twenties does odd jobs for me in the garden. She is fascinated by the fact that I am an academic who analyzes sex, gender and sexuality but am at the same time skeptical about the current knowledges, politics and ethics of transitioning children.

One day, during moments when I took her cups of tea and chatted, she questioned me about my critical views. She informed me that in ancient times shamans revered sex-indeterminate individuals as nearer to gods than other mortals. In the present day, men dressing as women is an expression of their ‘feminine side’, thus demonstrating that feminine men have existed throughout time. I assured her that I could understand a spiritual perspective that honors gender indeterminacy. It seems to me that freezing masculinity and femininity is restrictive of the range of human expressions open to men and women. I quipped, truthfully, that as a heterosexual woman I always find men who are uncomfortable with masculinity far more attractive at every level of human connection than their less self-reflexive peers.

My friend was perplexed by what she judged to be my contradictory perspective – my acknowledgement that masculinity is restrictive for men and yet my avowed critical analysis of the theory and practice of transitioning children. My replies were unsatisfactory to her, but rather than deterring her, they provoked further questions. Surely, she probed, to be opposed to transitioning children is tantamount to resurrecting the patriarchy? The moment her assessment was out of her mouth, I knew I had fallen into the (by now) familiar rabbit hole that opens whenever casual discussion about transgenderism occurs. A dichotomy is erected, blocking any other view or thoughtful exploration: progressive people are supportive of transitioning children; critics are bigots wedded to traditional gender roles. Experience has taught me that once ethics are framed in this way – a direction of travel usually arrived at with great alacrity and in this instance in about four minutes – there is no gainsaying my interlocutor’s belief in her own alleged moral and political high ground. In contrast, I become immediately aligned with transphobia and likened to the kind of person who, ‘back in the day’, would have opposed lesbian and gay rights.

What I would have liked to have replied on this occasion is that whilst I share my friend’s aspiration for freedom from gender (indeed I have worked to this end all my private and professional life), we have a moral obligation, particularly with regard to children and adolescents, to open transgender doctrine to critical scrutiny. To truly defend progressivism, we would need to examine the following: Are the sex-indeterminate men of ancient times the same kinds of person as 21st century men who identify as women? What relation does transgender doctrine propose exists between biological sex and gender? What kinds of persons do lobby groups, such as Mermaids and Gendered Intelligence, assert transgender children are? What are the long-term consequences for children of social and/or medical transition? Do the doctors and the lobbyists fulfill their shared declared aim of releasing children from gender oppression, and if not, why not?  What are the consequences for all children of the narrative that it is possible to be ‘born in the wrong body’?

In order to examine the component parts that make up the narrative of the transgender child as a real, ahistorical, naturally occurring figure, I use the genealogical method of the philosopher Michel Foucault, who traces histories of the present power/knowledge/ethics relations out of which sex and gender identities emerge.

The Making of The Transgender Adult

The identity of the transgender child cannot be fully understood outside of the history of the making of the transgender adult, since the transgender child is its off-shoot.

Transgender adults repeatedly claim that their gender was not aligned with their ‘assigned’ sex at birth. The concept of assigned sex suggests that when babies are born, an evaluative judgement is made, one that can mis-recognize the sex of the child. However, the term ‘assigned’ is only relevant to intersex people, about 0.05% of the population, whose genitalia at birth are ambiguous (or approximately 1.7% if the percentage includes later discovery, for example, of intersex chromosome composition, gonadal structure, hormone levels, and/or the structure of the internal genital duct systems). The fact that a tiny percentage of people are born intersex – the category of person referred to by my gardening friend – does not negate the fact that, with very few exceptions, babies occupy a sex-category, male or female, based on objective, observed reality. Since male and female are discernible biological categories, Robert Jensen, professor of journalism, asks: “What does it mean for someone unambiguously female to claim as an adult she is in fact male (or vice versa)?” I ask: What is the political and social context out of which such a claim, bizarre to the ears of the general public 30 years ago, apparently now makes sense?

In the 1970’s and 1980’s a paradigm shift in thought about sex and gender occurred, primarily brought about by feminist activists, theorists and philosophers,  which drove a wedge between biological sex (the division between male and female based on reproductive capacity) and social gender (‘masculinity’ and ‘femininity’). At the same time,  Foucault’s idea that homosexuality has been historically pathologized by medicine as a means of heteronormative social control was used by the lesbian and gay movement to critique the cultural designation of heterosexual men and women as psychologically healthy in contrast to homosexuals designated deviant.  The ‘pathological homosexual’ was found not to be an objective naturally occurring type of person, but a socially constructed identity reified as if natural.

A small transgender movement had also sprung up alongside the lesbian and gay movement in the 1970’s and 1980’s, and although connected, nevertheless retained different aims, aspirations and politics. Janice Raymond, professor of women’s studies and medical ethics, describes the possibilities for transgender self-identification during this period. Although the number of individuals identifying as transgender was minimal, it began to grow in line with the development of new medical technologies – hormone treatment, breast implants and the construction of artificial cavities for vaginas – that attempted to simulate the opposite anatomical sex.  During the 1980’s and 1990’s, queer theory was developed which built upon the lesbian and gay examination of gender identities and the pathologisation of same-sex attraction. The sociologist Sheila Jeffreys points out that queer politics very quickly became less about the original gay and lesbian movement’s analysis of sex and gender and structures of oppression, and more about the rights to play with and transgress gender norms.

By the end of the 1990’s, partly because of the potential for networking created by the internet, the transgender movement became firmly established. It began to make the following claims, overturning the epistemological insights and political possibilities of the sex/gender distinction: gender is not socially constructed but inherent; the biological division of human beings into two-sexed categories is socially constructed;  medical ‘sex transition’ is a human right; transgender people are marginalized and oppressed by the same heterosexism that had discriminated against lesbian and gay people; transgenderism is transgressive and thus axiomatically progressive.

The philosopher Terri Murray argues the current transgender movement gives the appearance of progressivism but is not a natural sequel to feminist and gay liberation.  Rather, in reifying gender it gives credence to the very gender myths that lesbian and gay activists originally spurned. LGBTQ+ is divisive of the once-powerful countercultural movement, reinforcing the myth that men and women are “different species of human being, not just reproductively, but mentally – with different desires, different needs, different aptitudes, and different minds.”

The Making of the Transgender Child 

During the same period that the transgender movement was gathering political traction, the transgender child was beginning to make its debut. Since the 1990’s, organizations such as the Gender Identity Research and Education Society (GIRES) and Mermaids have spearheaded demands for early medical intervention, on the grounds that it would spare gender nonconforming or gender defiant children the future trauma of reaching adulthood in ‘the wrong body’. These organizations were joined in 2008 by Gendered Intelligence, a lobby group that queers childhood.

In a newly published book, Inventing Transgender Children and Young People, edited by myself and my colleague Michele Moore, Professor of Disability Studies, I use the method of genealogy to trace the complex interrelationship between these lobby groups and the UK’s national health service clinic for children, the Gender Identity Development Service (GIDS) at the Tavistock. At the beginning of 2019, the GIDS Multi-disciplinary Team responded to public concern about, amongst other issues, the clinic’s administration of puberty blockers and cross-sex hormones to children and young people by explaining the basis for its decision making. Senior members of the GIDS team tell us: transgender identity can be a born property, and transgender identities have existed throughout history; transgender identities have been suppressed historically; and it is an example of today’s more progressive society that these identities can now be expressed and the suffering of those who hold them alleviated.

The GIDS team demonstrates a shocking ignorance of the history of ideas that informs its own affirmative practices. Its allegedly multi-factorial approach is in effect driven by a single theoretical construct: it has at its core the issue of ‘identity’, as defined by transgender theory and lobbying. The idea that transgenderism is an internal, pre-social phenomenon that has existed throughout history is not an evidenced fact but a proposition. The clinic has no credible scientific basis for the theory it applies in a radical and experimental way to children, referring some physically healthy and phenotypically normal children and young people for dangerous, off-label drug treatment, with life-long deleterious consequences, including sterility.

The past thirty years have been witness to the invention of two identities for the transgender child: the first is that of the unfortunate victim ‘born in the wrong body’, i.e. whose gender self-identification requires medical diagnosis and hormone treatment (GIRES and Mermaids); the second is that of the revolutionary adolescent who bravely sensitizes the older generation, including trained clinicians, to the subtleties, complexities and politics of gender (Gendered Intelligence). These seemingly contrasting identities are still evolving and taking shape, but are increasingly synthesized into the one figure that we know today, ‘the transgender child’, who is invested by the GIDS with the capacity to consent to hormone therapy and for whom any dissent on the part of a clinician would be classified as conversion or reparative therapy.

Elsewhere, I demonstrate the relationship between the law and medicine in discursively producing the figure of the transgender child. Considerable social, political and legal changes have occurred in response to transgender lobbying, and there is increasing acquiescence by governments to demands for transgender rights. Not only does the UK legal system now enshrine the legal fiction that transgenderism exists and that adolescents can be competent to consent to life-changing medical intervention, but by 2020 all children will be taught in schools that transgender identity is inherent and that they and their brothers, sisters and friends may have been born in the wrong body. Shelley Charlesworth, a former BBC journalist, analyses the materials that convey these messages in programs already taught within some primary schools.

In conclusion, the consequences of ‘the transgender child’ are felt not only by the children and young people who access the clinic’s services but by the nation’s children from primary school upwards. As a society, parents need to be alert to this phenomenon and to reject the charge that any critical reflection aligns them with bigotry and homophobia.

The Confused Ethics of Transitioning Children 

The transgender child is not a naturally occurring, pre-discursive figure but a newly constructed category of person forged out of the following: psychoanalysis, psychology and queer theory; lobby groups and transactivism;  and misdirected liberal values. These combined relations of knowledge, power, and ethics construct the composite picture of the transgender child. This identity is no more objective and no less political than the ‘pathological homosexual’ that conventional liberal wisdom is now happy to consign to the dustbin of history.

I suggest that ‘the transgender child’ should be equally rejected and consigned. By subsuming the multifactorial sociological, psychological and familial context within which a child identifies as transgender under an overall model of affirmation (a model that it is allegedly transphobic or anti-trans for sociologists, philosophers and psychologists to question), the gendered intelligence offered by transgender doctrine to children, parents, doctors and society not only endorses the very gendered norms of masculinity and femininity it purports to revolutionize but exposes children to lasting physical harm. Medical procedures are carried out based on a child’s subjective feeling, for which there is no scientific test and for which clinical diagnosis rests on the child’s self-report. In contrast to the idea that transitioning children is progressive and humane, I conclude that it is politically reactionary and an egregious abrogation of adults’ responsibility to fulfill their duty of care, played out on and through the bodies of children.


by Daniel A. Kaufman

____

My discussion with Jane Clare Jones on sex, gender, feminism, progressivism, and liberalism. Originally aired on MeaningofLife.TV, Monday, July 8, 2019.

Gender Noncomformity and Feminism | Daniel Kaufman & Jane Clare Jones [Sophia] - YouTube

by E. John Winner

___

Born to die

Some German theorist came up with the idea that primitive peoples approach death as always a mysterious, even incomprehensible, visitation to the tribal community, requiring avoidance and always met with fear.  He must have been thinking of the faculty at his university.

Actually, the archaeological and anthropological records are quite clear.  Primitive peoples know what death is, prepare for it, and deal with it with ritualized burial routines.  It is simply another event in a universe filled with events, and eventually, as the individual well knows, every individual of the community will die.  It is the community’s survival that gives meaning to the individual’s life; performance of one’s duties for the good of the community prepares a person for a death worthy of the community.

In our culture, however, some are so convinced that death will never touch them that one suspects the fear of death is the primary motivation behind their choice of what is essentially a  hedonistic lifestyle.  One quickly discovers  their fierce opposition to any mention of the possibility of anyone’s demise: “I don’t want to think about that!”  “It’s morbid!” One wonders why we don’t simply outlaw any mention of death, and to some extent we do: it’s why so many Hollywood movies have happy endings, even when the logic of the narrative points inexorably towards the hero’s death.  People don’t die, they “pass away,” “go to a better place,” and “are no longer with us.”  At the very worst they “breathe their last breath.”

Of course, in such narratives, one’s enemies are allowed to perish in the most gruesome ways imaginable: shot, hanged, stabbed, beaten to a pulp, strangled, immolated, crucified, blown to smithereens, beheaded, disemboweled, impaled, drawn-and-quartered, whacked, gassed, nuked, wiped from the face of the earth, and killed, killed, killed. Grammatically, such narratives cannot speak of an enemy without casting him or her as (at least potentially) the direct object of the verb ‘kill’. It is here that we see the denial of the selfhood of other human beings most graphically displayed; when thinking along these lines, one cannot imagine oneself dying, but it is easy to imagine the death of the enemy, who is perceived as merely and only a thing to be destroyed.

This trenchant opposition to the possibility of one’s own death structures the worldview of many, like the operating system of a computer always running in the background, regardless of what other programs are in use. It determines how we meet every moment of our existence, how we raise children, and how we prepare to leave the world to those children, once they are grown. The terrible truth, of course, is that we never really prepare the world for the next generation, since we deny the possibility of our own death. We tacitly assume that we will outlive our children, who, after all, are only brought into the world as a source of entertainment. No wonder we so cheerfully send our children off to war: even the death of a child is but the performance of a theatrical tragedy that we learn to live with; even, perhaps, secretly enjoy, as the catharsis of our own fear of death.

Such people live each moment not as if it were the last (although they frequently assert as much), but as if there would never be a last moment; as if this were the only one that ever existed and the only one that ever could exist, and as if it could not end. This is the “eternal now” that we often hear described as the time for our greatest enjoyment in life, something frequently claimed to be derived from “Oriental” philosophies such as Buddhism. But a Buddhist doesn’t deny the passage of time or its culmination in death. Rather, he learns to live with it. The “eternal now” is the wet dream of one who is in love not with life, but with the fear of death. A reflective, reasoning person learns to live life on its own terms; the hedonic recidivist refuses to learn to live. One can imagine someone replying, “We don’t need to learn how to live, we just do it,” but this can only be true of animals, incapable of rational self-awareness. Human beings must always learn how to live. Our infancy is too prolonged, our self-sufficiency outside the womb too inadequate, to assure our survival. If we become individuals, it is thanks to the community that raises us and to which we are consequently obligated. We are not and cannot be, by nature, individualists, except within a social domain. The supposedly “free,” isolated individual is doomed to an early death.

The claim to self-centeredness is thus a denial of one’s own childhood and of the process of maturation. Maturation is a temporal process, and its inevitable stopping point is death. No wonder some deny it. Their existence would be perceived as wholly atemporal, if it weren’t for the inevitable frustrations and crises of survival. Life thus reduces to a meaningless sequence of immeasurable enjoyments, randomly sectioned by annoying events of purposeless frustration and disappointment. Too many believe that such a life – with as few frustrating events as possible – constitutes “happiness.” Yet, paradoxically, this is not really living; it is the mere passage of time. Every period of unbothered enjoyment is but an intermission between crises. Those not attending to their own history may not remember a spouse dying, or may even deny that spouses ever die (one psychological benefit of belief in an afterlife). But they can no longer cherish the remembered pleasures of the espousal. These pleasures were expected as their due and consequently were as meaningless as the pleasures of defecation (a comparison I think a Warhol acolyte once actually asserted). Although they yearn for atemporality, the inevitable crises of life assure that their existence is the mere counting of empty time. And yet, despite this, the final moment of the last count remains denied. But for some, better the meaningless count of “eternal nows” than the complex, and tentative, living of time as the making of meaning, which we always find among those who acknowledge and accept their own mortality.

To finde Deeth, turne up this croked wey. (Chaucer)

Death is a certainty.  To this we all agree.  But what do we mean by that?

Death happens to all living entities, but what is our thinking in this?  How do we see it?  We stick a needle in the back of an ant.  It struggles in vain to escape or, somehow, to counterattack.  Its legs and mandibles writhe erratically.  Something in us is moved to pity.  For now, it would seem that the ant has an awareness of its own imminent demise. Perhaps it also feels something we would recognize as pain, were we able to communicate with it properly.  Whether a one-celled creature responds in like manner to a threat, we cannot tell by way of microscopes. Neither could we see a tree shudder in pain as the ax-blade carves into it, if in fact it does.  So, we can think of the ant as having an awareness of its own life only because it responds to a threat in a manner similar to the way we know we would, were we to be impaled.  And that, of course, is why we feel something akin to pity when we watch the ant die, pierced by our needle.

Yet this still remains only an assumption on our part.  We don’t really know what the ant feels.  We don’t have the slightest idea of its experience of this threat and inevitable demise.  After all, we do know that the ant cannot know that its demise is inevitable.  We can ascribe pain to it, and we can characterize its behavior as responding to a threat, but this threat could only be “perceived” – if one could even call it that – in the most vague and general way. In order to “recognize” a threat as potentially fatal, an intelligence has to be able to make a reflexive connection between three significations:  ‘living’; ‘intervening threat’; and ‘not living’. Even if the ant could connect the first two significations, it could never make any connection with the third.  What could “not living” possibly “mean” to an ant?  And how would this happen?

It is not enough to say here that “not living” is the signification of an abstract idea.  In fact, it is no idea at all, but rather, a negation.  Within its temporal and spatial domain, living does not happen; there is no possible intelligence there capable of having any ideas, and there are no knowable entities to have any ideas about.  While its own occurrence is possible (indeed, inevitable), once it occurs, possibility comes to an end.  Its potential is not realized; it cancels out all potential.  To die is to enter the realm of the impossible.

We have a very difficult time thinking about this, our inescapable future as suddenly becoming impossible. What could it possibly mean?  “A hundred years from now, whatever else happens, I will be impossible by then,” a sentence the grammar of which appears silly, even absurd. Yet, it happens to be as true a sentence as any making claim on a future reality, for any person speaking it.  The individual human being dies and thus becomes impossible.  No wonder so many believe in an afterlife!  If my future is impossible, what status has my past? how could I ever have been born?

Accidents happen, of course.  It always strikes me as comical, the enormous amount of ideological babble spewed across eons and continents, from innumerable people,  asserting that human life and intelligence could not occur by accident.  Why not?   Those saying such things seem to think that the assertion itself answers the question, but that’s not even begging the question – it’s simply ignoring it.

Our deaths are a function of our animal nature, and consequently they must be of a kind with the deaths of other animals. We can narrow this classification a little, by remembering that we are a certain species of primate, all of whom are mammals.   Thus, like it or not, we will die as all primates die.   Our blood pressure soars or drops; our heart beats rapidly as though escaping the confines of our chest (or it ceases altogether); our breathing becomes an irregular, choking gasp; sometimes our mouths fill with blood; sometimes their mucous membranes dry out and crack.  We know that it will become difficult to focus our vision at the last, because the dying person’s gaze becomes fixed and non-responsive to light.   We will either feel so much pain that life becomes intolerable, or we will feel nothing at all. And then, “the rest is silence.”

I do not for a moment accept the veracity of those reports of so-called “near-death experiences,” concerning those who have “died” – in a medically technical sense – and yet have survived to talk about it, usually claiming to have seen some light at the end of a tunnel, or to have heard the voice of a long-dead loved one, etc.  The problem with these reports is actually grammatical: there is no such thing as a “near-death experience”; there is life and then there is death, the undiscovered country from which no traveler returns.  No mortal entity, once dead, can ever live again, and thus, whatever can be reported by the living to the living must have been lived; must have occurred as an experience of life.  The “near-death experience” is thus revealed as the experience of one particular mode or moment of living.   “Oh, but that is just semantics!” (I.e., it is just a grammatical remark.)   But grammar determines the knowable.  If it cannot be said, it cannot be known, in any communicable manner.  “Near-death” is a term of art for referring to some moment which is obviously and undeniably not-death.  “Well, but I am trying to communicate the incommunicable.”  Trying to square a circle will meet with as much success.

Are there any experiences beyond the range of the communicable, beyond the range of language?   Actually, no.  Every human experience can be communicated to another human, by reference to analogues embedded in shared experience.  We should remember that this is one of the important missions for the poet and the storyteller; that they help to generate the language with which we communicate our experiences to others, even and perhaps especially, when the experience has never been communicated before.

(Well, but is there anything at all we can know about that is beyond the range of the communicable?  Yes, indeed; too much, perhaps.  Unfortunately, there’s no way to refer to any of it except to say:  “Whereof one cannot speak thereof one must be silent.”  Or something like.  So, whatever could be “known” of whatever follows death itself, must remain moot to us, and we mute to it.)

Every human death is the same.  Male or female, old or young, decrepit or fit, by way of accident or intent or mere process of aging, death comes to each and every one of us. Many people find this intolerable. How many monarchs have forced their subjects to build for them magnificent mausoleums, like the pyramids of ancient Egypt, enduring evidence of human hubris and stupidity?  Open any of these and discover the same decayed remains one finds in the grave of a peasant.  “But we are not to be identified with our bodies,” so begins the argument that the living experience of each individual being is unique, so that the experience of death must be different for each of us. But this simply evades the real issue: it is not the final moment of individual, living consciousness that defines the “experience of death,” it is the first moment afterward, which, I’ve already noted, no one can experience and then report to the living.

In Being and Time (1927), Martin Heidegger attempted to make a distinction between “authentic death,” i.e., death as experienced by an individual of courage and intellect, perhaps the very culmination of life, the event granting it real meaning, and “inauthentic death,” just the run-of-the-mill, household variety of death happening while no one pays attention to it.   Reading back through Heidegger to the ancient Greeks, we understand that an “authentic death” would be that confronted by a civilized person, while an “inauthentic death” would represent the curse of cowardice.  But while I respect Heidegger’s evidently civilized desire for a civilized confrontation with one’s own mortality, I must reject the distinction as an attempted poesis of myth.  Death’s arrival is inevitable, but always surprising.  Preparing for it, as the culmination of one’s individual living, is simply impossible.  The most we can do is try to affect what others might say of us after our demise.  But that preparation (which is really the preparation Heidegger is discussing, although he is unaware of it) can never be finished, because death itself always cuts it short.  That can only mean that such preparation is irrelevant to death.  And, indeed, such is the case; what could it possibly matter to one what others say of one, after one has become impossible?  And although Heidegger doesn’t want the “authentic” individual to confront death as does the otherwise nameless “one” of generic subjectivity, I’m afraid this is unavoidable.  Any individual human dies in just the same manner as dies any “one” (the “Man” or “They” of Heidegger’s text).

Whatever it is (which we living can never know), death is the same for all human beings.   Someone who has attained any awareness of this accepts it and devotes his or her life to doing what can be done for others, and finding personal satisfactions in community with them.  There’s no real point in doing anything on one’s own just for oneself, which is doomed in the first instance.  Even an effort to contribute to a future – which is always closed off to the individual – must be undertaken as an effort to contribute to the future presence of others.

The end comes at last

We close the door on death (temporarily), but perhaps open the door to a new way of looking at life.  I suppose one could get all evolutionary here and remark upon the necessity of continuing the species, but this is a weak argument for those who care little whether their species continues or not. But simply living with others generates obligations, which can be taught, enforced by social behaviors (or by law when necessary), and which demand of us contributions to the good of others. We owe it to them.  We were nurtured by their forebears, and we nurture them in turn.

The denial of death that leads to the pursuit of momentary pleasures, especially those that cause others harm (as well as oneself, eventually), neither generates an “eternal now,” nor sustains itself within it. This “now” repeats itself incessantly, becoming a predictable, and rather boring, pattern of seeking pleasure, enjoying it, losing it, feeling dissatisfaction, forgetting, and seeking all over again.

To accept our own mortality is neither morbid nor despairing.  It means that we have a moment to recognize that what we each do in life will always have consequences, but it also means that the future, in which such consequences are realized, is not ours individually.  It belongs to our progeny and our community.  Those who have children owe it to them, having burdened them with the possibilities, risks, and responsibilities of life, to provide for their future and not simply their passing fancies.  And those without children share precisely this same debt to the children of others.  We enjoy the community’s benefits, and we share its responsibilities.   There can be no greater responsibility to the community, than to work for the continuance of that community.

The end of life is death, which means that while we know where we are going, we won’t know anything once we’ve gotten there.  The great challenge, then, is to forge our learning into a purpose for the journey.  Our lives cannot but be shared, and it would seem the broadest and deepest purpose that we can build for ourselves involves always reaching out to others, not to get something from them, but to contribute to their own efforts to create purpose.  We do not learn for ourselves, but for others; what we know is always contribution to the shared experience of community.  The search for meaning finds its goal within itself, as the continuing search for the meaning we share with others.


by Daniel A. Kaufman

My dialogue with Robert Gressis on the “morality everywhere” problem, and my discussion with Nathan Eckstrand on his essay suggesting that humanities professors should go on strike.

Morality's Limit | Daniel Kaufman & Robert Gressis [Sophia] - YouTube
Should the Humanities Go On Strike? | Daniel Kaufman & Nathan Eckstrand [Sophia] - YouTube

by David Ottlinger

I have always been a person out of step with his own, native culture. Most of the time I think the received wisdom on most subjects — be they political, social or artistic — ranges from misguided to catastrophically wrong. I open the newspaper to find it makes as much sense read backwards or forwards and that I have virtually no comprehension of the people with whom I share a country. Such opposition is perfectly comfortable to me and causes no distress.

But every now and then some conflict seems to me special. I find myself so unable to comprehend why my fellows act and believe as they do that I begin in some measure to doubt my own sanity. I find myself returning to the issue again and again. In moments of exhaustion, I resolve to leave it and live with the mystery. But I always break this promise to myself. Something tells me to be reasonable, that I’ve not yet thought of everything, that there must be something which I am missing, something obvious which would explain how everyone around me has come to such an opposite conclusion. So I resume the struggle, running over old arguments and inventing new ones, trying to understand just how we got so far apart.

This is how it feels in today’s America to hate Game of Thrones. When the show first aired, I remember remarking to a friend that I had reached a point in my life at which I was no longer interested in questing, and I would have been happy to leave it there. But over time I found the series increasingly difficult to ignore. People were not just watching the show; people were taking it seriously. It provided a vocabulary for political arguments. [1] It was ransacked for metaphors that showed up in the most rarefied atmospheres of respectable opinion journals and op-ed pages. Every looming threat became a White Walker. Every deferred conflict a “war to come.” Winter was always coming. Even as I was writing this piece, David French, Ben Domenech and Sohrab Ahmari could not manage to argue over the future of conservatism without a few choice references to the series. [2] Domenech argued that cultural liberalism was like the dreaded White Walkers, an unstoppable force which cannot be politely argued with but must be forcefully opposed. French countered that “the Valyrian steel that stops the cultural white walker is pluralism buttressed by classical liberalism…” These are grown men.

When I finally broke down and watched the show, I thought there must have been some mistake. The show everyone was talking about must be coming on right after the one I had watched: the one with the ponderous dialogue, dreadful plotting and bad cinematography. This could not be the series all the country’s best – or at least best paid – intellectuals were scribbling about. This could not be the most popular series in America.

Could it?

The sense of perplexity has never left me. But I want to make a promise to the reader. I am not vain enough to believe that my tastes and distastes are interesting to other people, and I would not waste anyone’s time with a mere rant. So I will not merely describe my dislike of Game of Thrones. In today’s world, there are far too many critics who do far too much of that already. Instead I will offer something which may be of some actual use: a definite value judgment supported by arguments. To paraphrase Kant, I do not expect every man’s assent, but I do demand it. Or in other words, you may have thought Game of Thrones was a good show, but you are wrong.

And where, really, do I begin? There are a thousand things large and small. Take, for instance, the White Walkers. Has there ever been a cinematic monster less satisfying, as profoundly un-intimidating, as the ponderous, plodding, fragile White Walkers? When Domenech argued that cultural liberalism was like the White Walkers, he must have meant that liberalism is over-hyped and easily defeated. We are told these White Walkers are devastatingly powerful — unless they are poked with certain stones or tapped with certain swords, in which case they simply shatter like cheap china. You have to really marvel at the writers’ decision, on the first meeting between a White Walker and one of our characters, to have a Walker not only completely destroyed, but inadvertently destroyed by the least physically impressive character in the show. It was as if they were actively trying to remove all tension from their looming threat. Then, as though to solidify this point, in the first real fight between a character and a Walker, Jon Snow once again breaks a Walker into ice chips, totally by accident. (For good measure the same fight also established that all White Walkers move like they have advanced rheumatoid arthritis.) After that, White Walkers inspired no feeling in me but boredom. I just can’t manage any dread at a creature that will trudge slowly towards you, only to fly to pieces as soon as it pricks its finger. There is a possibility that if you suddenly found yourself ten yards away from a rampaging White Walker, and you stood very still for several consecutive minutes, it might actually be able to do you harm, but it would probably slip and hurt itself in the process. In the case of an invasion, a small division of well-disciplined cub scouts, armed with feather dusters, should have been able to drive them all out of Westeros.

White Walkers

But complaints such as these, of which there are many, are not essential to understanding the series’ failure and harping too much on them would be unfair. The real problems of Game of Thrones concern its very structure and influence not just an isolated element of the story, like the White Walkers, but its whole design. Taken together they doomed the show’s chances of achieving any artistic success.

Some of these more pervasive problems concern the show’s style and genre. In particular I always detected a certain tension between the story of Game of Thrones and the conventions of the fantasy genre. Watching the show, I could never shake a certain vague sense of disbelief. The costumes were richly designed and the set design, apart from one notorious coffee cup, was meticulous, but I could never shake a sense that none of it was real. I felt that familiar and unsatisfying feeling of watching actors playing characters rather than watching characters.

Over time I was able to articulate the source of this unease by comparing Westeros to an equally fantastic place that I had accepted as real both on the page and on the screen, namely Tolkien’s Middle Earth. Tolkien’s world, whether in his novels or in Peter Jackson’s highly successful adaptations, felt more convincing, paradoxically, because it was more completely alien. Men like Tolkien and C. S. Lewis were highly trained classicists and medievalists, and they brought a sense of deep historical time to their work. All the societies of Middle Earth are thoroughly pre-modern. Frodo and the other hobbits naturally defer to the great lords and ladies that they meet in their travels as their superiors. They accept their station and harbor no ambitions of reaching a higher one. They are totally at home in an aristocratic world, and define themselves, comfortably, in aristocratic terms.

The denizens of Westeros are in this respect completely different. The problem is that all of our characters are really, just below the surface, modern Americans. I would stop short of saying that Game of Thrones was really and fundamentally an American show. That would invalidate the contributions of the largely British cast who were overwhelmingly excellent and generally the best part of the show. For that matter it would invalidate the contributions of the European landscapes, which are so much a part of the show as to become characters in their own right. But the source material is obviously deeply American. And the source material inevitably calls a lot of the shots.

The Americanness of our characters manifests itself in their values, their preoccupations, and the way they understand society. Middle Earthers are, as authentic pre-moderns, largely preoccupied with honor, hierarchical status, reputation, and the fate of their clan or nation. The inhabitants of Westeros are concerned with wealth, power, sex, consumption, getting ahead in their careers, and other largely individual pursuits. In case anyone failed to notice this, Littlefinger, himself a character who perspicuously embodies these values, bluntly drives the point home in a long speech at the end of the episode “The Climb.” Society, for Littlefinger, is a ladder. “Only the ladder is real,” he intones, making Horatio Alger blush, “The climb is all there is.” He claims that the worth or admirability of a person consists totally in how far they can climb and dismisses all those who do not attempt it as suckers and dupes to a false morality. Like Edmund, he takes nature, red in tooth and claw, as his goddess, and he will not stand in the plague of custom. Most of the show validates this Algerian worldview. Westeros is superficially an aristocracy but more deeply a ruthless meritocracy. Its inhabitants compete for individual worldly success and status. This makes the fantasy element feel painted on and shatters the illusion of Westeros’s exotic otherness far better than any misplaced latte ever could.

Littlefinger

Then there is the matter of Game of Thrones’ controversial use of explicit sex and violence. In principle I have no objection to such things and many outstanding shows have involved a great deal of both. (Vide The Sopranos.) But critics are right to point to the show’s sex and violence as weaknesses. The problem is not that they exist at all but how they always seem to be trapped between different sensibilities.

The uses of sex and violence in film are many, but we can make a broad distinction between two general kinds. On one side are depictions of sex and violence that are meant to shock and disconcert an audience. Consider the violence in, say, Saving Private Ryan. Graphically depicting the horrors of war, the violence of this film is not meant to amuse or entertain. Rather, the violent scenes are meant to make us feel, as the soldiers did, that we would rather be anywhere else. In this category also belong such memorable moments as Cornwall’s blinding of Gloucester (“out, vile jelly!”) and the riot at the end of Peter Brook’s Marat/Sade. On the other side of the distinction belong the kinds of violence with which we are familiar from action movies and many westerns and which are a large part of the appeal of all the Marvel franchises. Such depictions of violence are meant to be voluptuously enjoyed. Watching Daredevil or Captain America wail on a group of faceless bad guys is just good, visceral fun.

The two sensibilities can be blended. It is part of the genius of Tarantino, for instance, to invite us to enjoy graphic violence and then morally confront us with the fact that we do enjoy it. But what we have in Game of Thrones is sex and violence that attempt to live on both sides of the line in ways that are far less coherent and foil the audience’s ability to relate emotionally to the events depicted. Never is this ambivalence clearer than when the show focuses on Littlefinger’s prostitutes. Are these exploited women, whose lives reflect the harsh realities of the sex trade? Or are they sexy, fashionable people whose promiscuity and lives of easy luxury we should envy? The show can never quite decide. One might wish to say that they can be both. No one, after all, is entirely a victim or entirely responsible, and no life consists only of harshness or pleasure. But the problem does not dissolve so easily, because the show adopts these different attitudes toward the same aspects of these women’s lives. Sometimes prostitution is depicted as demeaning and an affront to the dignity of those who work in it. At other times it is depicted as a good deal for those that can get it, and one that the women enjoy and possibly even find liberating. This makes it impossible for the audience to adopt a coherent attitude toward these characters. We are asked to toggle back and forth between empathy and jealousy until, ultimately, we can’t really feel anything at all.

Ultimately, a similar dissonance arises between the audience and Westeros as a whole. Is Westeros a bleak world of horrors and bitter injustices or a colorful world full of adventure and opulent pleasures that we look forward to visiting on Sunday nights? Again, the show runners seem loath to make a decision. Game of Thrones could have been a pulpy adventure story full of nudity and violence, one that maintained a glib tone and had generally low moral stakes. Or it could have been a somber morality play with a serious tone and high moral stakes. But its attempt to be both makes it neither.

One moment particularly stands out to me and makes clear what the show might have been. I found in general that the violence in the show was unsatisfying for the reasons I have been describing. Unsure of the effect it wanted to make, it neither shocked nor enthralled. But when Jaime Lannister had his sword hand unceremoniously hacked off, I sat straight up. It was off-handedly brutal (no pun intended), gratuitous and completely plausible. You can easily imagine Jaime’s surly captors engaging in such casual violence. Everything changed for that character in the space of a moment, and his shock was our shock.

Jaime Lannister losing a hand

Then a most remarkable thing happened. The show cut almost immediately to credits, which were accompanied by a ribald ballad heard earlier in the episode, only now performed in an anachronistically modern rock arrangement, complete with electric guitars. I felt as if I had been poked in the eye. For once the show had actually managed to elicit a clear emotional response, and it promptly set about foiling it. The comical and off-puttingly anachronistic song swept away the impact left by the previous scene. This was more than ordinary artistic failure – it was sabotage; cowardice. The show runners knew they had created something shocking and disturbing, and instead of trusting the audience to sit with that emotion, they immediately provided relief from it. Such decisions make the show easier viewing, but they rob it of any lasting impact as well.

But the most serious issues with Game of Thrones concern its structure. I was amused when I realized that if there was one television series to which Game of Thrones could be compared, it would probably have to be the old science fiction serial Babylon 5. On the surface they may not seem to have much in common. Game of Thrones is the cool kids’ table of television’s lunch cafeteria, filled with a lot of sexy people with perfect hair obsessing over their problems and taking themselves very seriously. Babylon 5 is more at home with the nerds in the corner. It was self-consciously intellectual and idealistic, in some ways sentimental, and bashful in the way it dealt with sexuality, all of which is in sharp contrast with Game of Thrones. Furthermore, the combined budgets of all five seasons of Babylon 5 probably couldn’t cover what it takes to animate one CGI flying dragon.

But beneath the surface the two shows have a startling amount in common. Both tell a long story stretching across multiple seasons, one that concerns a looming threat and the different political factions that will have to unify to meet it. Game of Thrones focuses on different kingdoms which will have to unify to face the threat of the White Walkers and possibly Daenerys Targaryen. The story of Babylon 5 concerns different species with their own empires on different worlds who must unify to defeat the Shadows, an encroaching race of powerful aliens. In both shows the various factions are more concerned with old scores and long-standing rivalries than with their own collective long-term good. Accordingly, the two series tell strikingly similar stories in strikingly similar ways despite the strong differences in style and presentation. This makes comparison between the two illuminating and useful.

One aspect of Game of Thrones which is highlighted by the comparison is its inability to play by its own rules. Babylon 5 is masterful in efficiently setting up the chessboard. The first episode establishes all the major players. The warring factions are the humans, the Psi-Corps (who exist within the human government), the Narn, the Minbari and the Centauri. The Shadows provide the looming threat, and the Vorlons, a powerful force for good and a counterbalance to the Shadows, have no analogue in Game of Thrones. What stands out is that across five seasons these remain, with a few exceptions, the major players. Each has an arc appropriate to its civilization, and what happens at the end of the story is never inconsistent with what we were told at the beginning.

If only Game of Thrones had been so careful. What strikes me about the main plot of Game of Thrones is how often major players are introduced and eliminated, and how often they experience violent and implausible shifts in fortune. Before season three, you have to listen very hard to notice the existence of the Tyrells. In season three they are suddenly absolutely central: the Lannisters, but more so. Meanwhile we are told constantly that the defining quality of the Lannisters is that they are incredibly wealthy, until, very suddenly, they aren’t anymore. The Dothraki and the Starks simply collapse. But most remarkable is the sense one gets that one can’t step ten feet in Westeros without tripping over some new army. Someone has always carelessly left a major fighting force lying around. Sometimes it’s in Dorne, sometimes in Meereen, sometimes north of the Wall. Never has a land been so thick with troops.


by Daniel A. Kaufman

If someone who looks like a man and has XY chromosomes tells me he feels female – I cannot tell her she is ‘wrong’. Would you?

–Prof. Alice Roberts, University of Birmingham (1)

____

The “thought” behind the idea of gender self-identification is about as confused as any in contemporary public discourse, which explains why the conversation on the topic is so fraught. Those in the vanguard fighting on behalf of the rights of people who fall under the “trans” umbrella are convinced that the concept of gender self-identification is absolutely essential to their success, which means not only that they are incapable of recognizing its (ultimately disqualifying) problems, but that they are inclined to double, triple, and quadruple down when confronted with them, lending a desperate, shrill aura to any discussion of the issue and inducing aggressive protests, histrionic public displays, no-platformings, attacks on peoples’ livelihoods, and even outright violence. (2)

Elsewhere, I addressed the issue of gender self-ID from the perspective of the “self” portion of the concept and suggested that social identities, of which gender is one, are not self-made, but publicly negotiated. (3)  Along this vector, the problem with gender identity activism is that it misunderstands what kind of identity a gender identity is, construing it as personal, when it is, in fact, social. My interest here, however, is in a somewhat different problem, having to do with the “identify” portion of gender self-ID.  Sexes are not things one can “identify with,” and identifying with genders is tantamount to embracing sexist stereotypes, something that any genuinely feminist – and more generally, liberal – philosophy and politics must oppose.

The use of ‘identify’ in the context of sex and gender is odd.  A judge can order that a plaintiff not be identified, meaning that the person’s identity should be kept a secret. One can identify oneself with a political movement, by which one means that one should be associated with it.  One can identify with the plight of a people, meaning that one has sympathy for – or even empathizes with – them.  A doctor can identify the cause of a cough, meaning that he has found the bacterial or viral or other thing that is responsible for it.  A suspect can be identified by the police, meaning that they have determined who he is.

But what could it mean to “identify as a man/woman”?  From what I can discern from gender self-ID theorists and activists, it could mean one of two things, both of which strike me as untenable.

The first is reflected in the Alice Roberts quote, above: for me to identify as a man/woman is to feel like I am one.  Now, I am a man, but if you asked me what it feels like to be one, I couldn’t answer; while I am a man, there is no sense in which I feel like one.  Being a man is a matter of belonging to a certain sex-category, specifically, the male one, but there is nothing that it feels like to be male. Certainly there are things that only males can feel: in my middle age, I know what it feels like to have an enlarged prostate – a distinctive sort of discomfort that only males experience.  But to feel something that only males can feel is not the same as feeling male, and certainly, it is not something that makes you male.  After all, there are any number of males who don’t feel it, because their prostate glands are not enlarged or because, perhaps, their prostates have been surgically removed as part of a cancer treatment. (4)

So “feeling” male or female is not going to help us make sense of “identifying as a man or woman,” because there is no such thing: one is male or female, but one doesn’t feel male or female, just as one is a mammal, but one doesn’t “feel like a mammal.”

The second pursues the line of gender: to feel like a man/woman is to feel masculine or feminine; manly or womanly.  And certainly, there is something that that feels like.  I might feel manly after an especially tough workout or while moshing at a Slayer concert or upon realizing that women are checking me out (I haven’t felt the latter since middle-aged decrepitude set in), but such feelings of manliness are little more than the products of sexist expectations. It is because males are expected to be macho and muscular and aggressive and “players” that the experiences associated with these things are deemed “manly” experiences. And because the opposite sorts of expectations are held of females, males who are not into working out or thrash metal or strutting in front of women are deemed unmanly and even effeminate or womanly.

It would be regressive, then, to take this tack in trying to make sense of “identifying as” a man/woman and even worse to suggest that meeting these sexist expectations makes a person one or the other. For decades, feminists and other forward-thinking people have been fighting against precisely these sorts of expectations and rejecting the idea that such notions of manliness or womanliness should determine what one is or what one should do.  In a previous essay, I referenced Marlo Thomas’s seminal Free to Be You and Me, which my parents gave me as a young child and the entirety of which is devoted to opposing these sexist conceptions of manhood and womanhood and making the case that beyond our sexual identities as males and females, which are determined by nature, the rest should be up to the individual person and the course he or she chooses to pursue in life. Particularly effective in this regard is the wonderful opening skit, in which Thomas and Mel Brooks play infants, trying to figure out which one is the boy and which one is the girl.

Boy Meets Girl - YouTube

The notion of “identifying as” a man/woman, then, is either incoherent or retrograde. It is the farthest thing from being liberatory or progressive, and I find it hard to understand why anyone interested in advancing the cause of trans people would want to have anything to do with it, let alone plant their flag in it. As I have argued many times, everything required to make a complete and compelling case for trans civil rights is already contained within the liberal tradition. And beyond the advantage of being grounded in a stronger, more rigorous, more universal set of principles, to pursue such a course would avoid the sexist logic and tropes that have done so much to put trans activism in conflict with its feminist and gay/lesbian counterparts.

Notes

(1) Alice Roberts, Professor of Public Engagement in Science at University of Birmingham and President of Humanists UK.

https://twitter.com/theAliceRoberts/status/1141261931556286464

https://www.birmingham.ac.uk/schools/biosciences/staff/profile.aspx?ReferenceId=122726&Name=professor-alice-roberts

(2) https://www.bbc.com/news/world-us-canada-47301007

https://www.thetimes.co.uk/article/trans-goldsmiths-lecturer-natacha-kennedy-behind-smear-campaign-against-academics-f2zqbl222

https://www.standard.co.uk/news/crime/transgender-activist-tara-wolf-fined-150-for-assaulting-exclusionary-radical-feminist-in-hyde-park-a3813856.html

https://www.thetimes.co.uk/article/julie-bindel-the-man-in-a-skirt-called-me-a-nazi-then-attacked-8dfwk8jft

(3) https://theelectricagora.com/2017/05/25/self-made/

(4) Our own E.J. Winner was also quite critical of this notion of “feeling like” a man/woman, though on somewhat different grounds.

https://theelectricagora.com/2017/12/07/sex-gender-politics/


by Kevin Currie-Knight

__

Does postmodernism spell the death of reason? If you have been caught up in some recent online discussions of it, you’d think so. Postmodernism, its critics (often of the so-called Intellectual Dark Web) say, poses an existential threat to reason, a bedrock value of “the West.” An article on how French postmodernist intellectuals “ruined the West” explains that thanks to postmodernism, “the need to argue a case persuasively using reasoned argument is now often replaced with references to identity and pure rage.” Jordan Peterson, talking on the Joe Rogan Podcast, suggests that postmodernism is, among other things, a complete assault on “everything that’s been established by the Enlightenment,” namely “rationality, empiricism, [and] science.”

Wow! Any philosophy that uses reason to argue against reason must be not only awful and dangerous, but self-contradictory. I want to argue that it isn’t necessarily so. My goal isn’t to convince people that postmodernism can’t be taken in irrationalist or even dangerous directions, but that it need not be, and probably wasn’t originally intended to be. I do not see postmodernism, as Peterson does, as a “complete assault” on “the” Enlightenment (there were several Enlightenments, not just one). If anything, I see it as a potentially valuable corrective to some of the Enlightenment’s excesses.

Here is an admittedly corny, but possibly helpful, example to show how I conceive of postmodernism’s relationship to reason. Imagine an infomercial for a tool a company wants to sell. For the sake of simplicity, just imagine that the tool is a hammer. Like all infomercials, this one is pitched to put the product – the hammer – in the best possible light. And like all infomercials, that means not only showing what the product can do, but maybe exaggerating a bit about what it can do. We’ll imagine that this particular infomercial gives a long list of things this nifty hammer can help you do: “It helps pound nails…. like, really well; it helps extract nails too.” Fine so far. “But there’s more! It is an excellent mallet for cracking crabs and lobster, it is a great back-scratcher, it can help you saw wood (just smash the wood really hard until it breaks), and it has tons of other everyday uses!”

Now, certainly this infomercial is overly generous toward the hammer and its uses. The first two uses – hammering in and extracting nails – are surely on point, but the others are probably exaggerations. So, imagine that a truth-in-advertising campaign comes along geared toward potential consumers: “Our independent tests indicate that the hammer is really good for some of the things listed, but buyers should beware that hammers are not so good at the other things. The hammer itself is a good tool, but only when confined to its proper uses.”

What does this have to do with postmodernism and its relationship to reason? Well, imagine that the hammer is reason, the exaggerated infomercial is what happened to reason under (the excesses of) the Enlightenment, and the truth-in-advertising campaign is postmodernism. The way I see it, the problem is not that the Enlightenment advocated for things like reason and science. Those things were and are good things. The problem is that some of the more enthusiastic Enlightenment figures, mainly within the French and German Enlightenments, made some really extravagant claims for what reason and science could do. Postmodernism is not trying to kick reason to the curb any more than the truth-in-advertising campaign wants us not to buy hammers. Rather, like the truth-in-advertising campaign, postmodernism is just trying to check some possible excesses and exuberance about what reason and science can do.

To see this, let’s look at some of the claims that certain Enlightenment figures made about what reason could do. In his study of the Enlightenment and the reaction to it, Isaiah Berlin depicted the message sent by some of the Enlightenment’s more enthusiastic champions, the French Encyclopedists:

A wider thesis underlay this: namely, that to all true questions there must be one true answer and one only, all the other answers being false, for otherwise the questions cannot be genuine questions. There must exist a path which leads clear thinkers to the correct answers to these questions, as much in the moral, social and political worlds as in that of the natural sciences, whether it is the same method or not; and once all the correct answers to the deepest moral, social and political questions that occupy (or should occupy) mankind are put together, the result will represent the final solution to all the problems of existence.

This is certainly not a view that all Enlightenment figures held, and it doesn’t come close to describing Hume, Smith, and a host of others we readily recognize as participants in the Enlightenment. But certainly, the view Berlin depicts has left a big cultural mark. However big you think that cultural mark is – and there is room for debate there – it is that mark that I see postmodernists as intending to call into question. They are not aiming at hammers, but at our being led by overeager advertisers to expect more of hammers than they can actually provide.

The big message in postmodern thinking is that there are many ways to understand and interpret the world, and when those ways battle for supremacy, there isn’t a neutral way to adjudicate between them. Anyone who attempts to adjudicate between ways of seeing the world is doing so against the backdrop of some non-neutral way of interpreting the world. If you are arguing that God exists and is omnipresent and I am arguing that God is a fiction, anyone who hears our debate and wants to decide which side is right will be doing so with some non-neutral framework for making the decision. It may be that she is an atheist and, as such, puts more of the burden of proof on you than on me (or vice versa, if she is a theist). It may even be that she has no existing view on whether God exists, but even then, she is not appraising neutrally. She probably has some idea of how to appraise arguments: Should I give weight to personal testimony, or should I only consider evidence that can be independently verified? How do I make sense of what ‘God’ means in this debate? How much weight do I give to arguments appealing to logic, or to arguments from authority? What criteria make an argument convincing? Here’s neopragmatist-cum-postmodernist philosopher Richard Rorty’s way of explaining:

Philosophy, as a discipline, makes itself ridiculous when it steps forward at such junctures and says that it will find neutral ground on which to adjudicate the issue. It is not as if the philosophers had succeeded in finding some neutral ground on which to stand. It would be better for philosophers to admit there is no one way to break such standoffs, no single place to which it is appropriate to step back. There are, instead, as many ways of breaking the standoff as there are topics of conversation.

A more concrete description of this situation is given by the Taoist philosopher Zhuangzi, who envisions a disagreement between two interlocutors:

Whom should we have straighten out the matter? Someone who agrees with you? But since he already agrees with you, how can he straighten it out? Someone who agrees with me? But since he already agrees with me, how can he straighten it out? Someone who disagrees with both of us? But if he already disagrees with both of us, how can he straighten it out? Someone who agrees with both of us? But since he already agrees with both of us, how can he straighten it out? So neither you nor I nor any third party can ever know how it is.

Devotees of the Enlightenment might retort: “But of course, there is a neutral way. Just look at the facts and deduce from there/Just go where reason takes you/Just look at the situation objectively.” (And not coincidentally, we all think that we are the ones doing this and our interlocutors are not.) Yet, facts must be interpreted (Is this fact decisive in refuting the claim?), reason must proceed via some method (Are appeals to authority acceptable?), and the interlocutors almost certainly all believe that they, not their opponents, are looking at the world objectively. (No debate was ever resolved by a third party coming in and saying: “Hey, I got an idea; let’s just all look at the world objectively. The correct answer will stare us in the face!”)

Yet, none of this means that we must abandon reason. At best, it means that we must take an inventory of what is and isn’t realistic to expect from reason. Humans have to live and act in the world, and we all want to act intelligently (yes, even postmodernists). Insofar as reason is a great tool for thinking, we should use it! And even if some disagreements might be unresolvable – because there is no fully neutral way to adjudicate disputes – that doesn’t mean that reason can’t or shouldn’t be used in argument. I think my belief is a better one than yours, and I want to persuade you that adopting it would make you better off. Even if I can no longer claim that my view simply represents The Way Things Really Are, or complain that you would agree if only you’d listen to objective reason, persuading you will still mean providing you with reasons, and if I really want to persuade you, I should provide you with strong ones.

One can be a postmodernist and recognize all of these things. But, a postmodernist might say, reason is probably not good for the things Berlin writes about: leading all correct reasoners to the same once-and-for-all answers, and creating consensus around what those answers are. Nor is reason good at seeing the world objectively. Surely, it can be used to detect some of our own biases, but since reason is as much a part of us as our biases are, it can’t detect those biases it can’t be aware of.

I’ll close with a quote that philosopher Stephen Hicks wrongly attributes to Foucault in his book Explaining Postmodernism. Even though the author was really the philosopher Todd May, May is describing what I think is an accurate read of Foucault: “it is meaningless to speak in the name of — or against — Reason, Truth, or Knowledge.” Well, that sounds ominous, doesn’t it? Okay, we can hand it to Foucault (or May) that it is meaningless to speak against reason, but see, he wants to end all talk in defense of it, too! In context, however, Foucault (or May) is saying that “There is no Reason; there are rationalities.” It is not that we should banish all attempts to give and listen to reason. It is that when we do, we are always working within one of many possible sets of rules (rules about which types of arguments are and aren’t acceptable, and which will and won’t be deemed convincing).

Whether you agree with the postmodernists on this is beside my point. My point is that this postmodernist vision would only undermine reason if you believe that reason is either the type of thing Berlin describes or nothing at all. Either the hammer works as a back scratcher and a saw, or it is nothing. By my interpretation, postmodernism isn’t against reason any more than our fictitious truth-in-advertising campaign is against hammers. Postmodernists are simply trying to give us a better understanding of what we should and shouldn’t realistically expect reason to do. Moreover, I wonder if tempering our expectations in this way might actually help us appreciate reason more: a hammer will likely be better appreciated if one doesn’t buy it expecting it to be a good saw.

Kevin Currie-Knight is a Teaching Associate Professor in East Carolina University’s College of Education. His teaching and research focus on the philosophy and history of US education. His more popular writings – on a range of issues from self-directed education to philosophy – have appeared in venues like Tipping Points Magazine and the Foundation for Economic Education.
