Brian McHale • What Is the End of the World Good For?

About the blog: The Instrumental Narratives blog aims to popularize the insights and methods of narrative scholarship and features analyses of instrumental storytelling by high-profile narrative scholars. The analyzed cases deal with uses or abuses of the narrative form, storytelling practices or narrative sense-making in many areas of life: politics, journalism, business, identity work, artistic or literary sphere, activism, and forms of social participation. The blog texts evaluate possible societal risks or benefits of contemporary storytelling, for example through cases from the author’s own national, linguistic, or cultural sphere.

As my contribution to this guest blog, I was intending to write about the use and abuse of end-of-the-world science fiction when the end of the world caught up with me.

I had planned to denounce science fiction for its generally frivolous and unreflective approach to the end of the world – its lack of seriousness about apocalypse. Apocalyptic and post-apocalyptic scenarios, as I hardly need to tell you, are ubiquitous in contemporary science-fiction narratives across all media platforms.  But such scenarios mostly function as reset buttons, wiping clean the world’s slate, giving survivors an opportunity to rebuild civilization from scratch. Or the apocalypse can serve as the pretext for realizing erotic fantasies (the last-man-and-last-woman motif), or as a kind of playground or theme-park, a sound-stage on which adventures can be played out amid the ruins, as in the Mad Max movie franchise or in much current Young Adult dystopian fiction and film.  Think how much fun we could have if none of these other people were around!

* * *

Apocalypse has always been science fiction’s stock-in-trade, though its modalities have changed over time. Alien invasion has never lost its dark grip on the collective apocalyptic imagination since the time of H.G. Wells’s original The War of the Worlds (1898). Nuclear holocaust, once a dominant modality, has since the end of the Cold War been edged out by other end-of-the-world scenarios: cosmic disasters (asteroids, worlds in collision), ecological collapse, and of course plague outbreaks, not to mention the zombie apocalypse which is so ubiquitous in contemporary popular culture (and which is often affiliated with the outbreak narrative). The result is always drastically to reduce the human population to a struggling remnant, as in Philip K. Dick’s Dr. Bloodmoney (1965), or to a last couple, as in W.E.B. Du Bois’s “The Comet” (1920), or even to a single individual, the last man (rarely woman), as in Mary Shelley’s seminal novel by that name (1826). The last man motif has enjoyed a long run in popular culture, beginning with Richard Matheson’s novel I Am Legend (1954) and its three movie adaptations (1964, 1971, 2007, under various titles), which is often cited as the model for zombie-apocalypse stories (though technically Matheson’s walking dead are vampires, not zombies).

Susan Sontag, writing in 1965, saw that the apocalyptic science-fiction movies of the ‘50s served as an opportunity for “participat[ing] in the fantasy of living through one’s own death and more, the death of cities, the destruction of humanity” (“The Imagination of Disaster”). It is certainly the case that the mass death that is a precondition for the last man (or small-band-of-survivors) motif is rarely taken seriously. Death on a planetary scale is reduced to a premise for action-adventure entertainment, or at best for reflection on – as the subtitle of Dick’s post-apocalyptic novel Dr. Bloodmoney puts it – How We Got Along After the Bomb, that is, how survivors took advantage of the slate wiped clean by apocalypse to build the world anew. There are exceptions: in Samuel R. Delany’s Stars in My Pocket Like Grains of Sand (1984), the last man to survive an alien onslaught on his home world becomes a charismatic messiah-figure throughout the galaxy, the object of obscure hopes for ultimate survival; while in Octavia E. Butler’s Lilith’s Brood trilogy (1987-89), Margaret Atwood’s MaddAddam trilogy (2003-13), and Neal Stephenson’s Seveneves (2015), the near-total extinction of the human race by (respectively) nuclear war, engineered plague, and cosmic accident, and the plight of the handful of survivors (such as the “seven Eves” of Stephenson’s title), are accorded something like the weight and sobriety they deserve.

In general, however, the current popularity and predictability of post-apocalyptic scenarios heightens the odds against serious treatment of the motif in genre fiction or film. Serious engagement with mass death generally seems reserved for “crossover” literary fiction, such as Colson Whitehead’s literary novel of the zombie apocalypse, Zone One (2011), Russell Hoban’s linguistically inventive Riddley Walker (1980), or Cormac McCarthy’s merciless The Road (2006) and its equally unrelenting film adaptation (2009). In the end, perhaps it requires the methods of the literary avant-garde to genuinely come to terms with universal death – the methods, let’s say, of Samuel Beckett’s The Lost Ones / Le dépeupleur (1971), or of Maggie Gee’s less celebrated postmodern metafictions, Dying, in Other Words (1981) and The Burning Book (1983). Gee at least has the honesty to admit that the only truthful perspective on the end of the world is that of . . . no-one at all:

This is a city, though who is there who can tell. For miles there is nothing left standing: light falls upon miles and miles of litter and ice and ice and litter and chaos …. No speech, and no stories. The last great story was death: someone failed to tell it, or else no-one wanted to hear. (Dying, in Other Words)

That was what I was going to write about for this blog: the science fiction genre’s abject failure, apart from some rare and liminal examples, to engage seriously with the greatest of its great themes, the end of the world. But then in the spring of 2020 along came covid-19, and I was compelled to re-evaluate my position. When the pandemic lockdown reached my part of the world in mid-March, and all the libraries closed, I was forced to resort for my science-fiction reading to books that had stood unread on my home bookshelves for a long time – in some cases a very long time – supplemented by occasional online purchases of newly-published novels. Only a few of these narratives were strictly speaking contagion-themed, but most of them, new and old, were apocalyptic in one way or another, just by virtue of being science fictions; it comes with the territory.

* * *

Thus it happened that during the pandemic spring and summer I read Margaret Atwood’s MaddAddam trilogy – Oryx and Crake (2003), The Year of the Flood (2009), and MaddAddam (2013) – and Paolo Bacigalupi’s The Windup Girl (2009), in all of which epidemics or pandemics figure. I also read N.K. Jemisin’s Broken Earth trilogy (2015-17) and (digging deeper into my personal backlist) Anna Kavan’s Ice (1967), which count as end-of-the-world novels but not as plague novels. Among the brand-new books I read, there was no contagion in Neal Stephenson’s Fall; or, Dodge in Hell (2019), nor was there any in William Gibson’s Agency (2020). At least there was none in either of Agency’s time-frames, near-future and far-future, though evidently there must have been plenty of world-ending pandemic disease raging in the offstage interval between those two moments.

Stephenson’s Fall illustrates why science fiction seemed so perversely relevant in the pandemic. Enormously long (880 pages), Fall; or, Dodge in Hell is a novel with many moving parts – too many. One of those parts, early on, involves a visit by some bright, young, over-privileged East Coast college students of the near future to a region of the American Midwest that they derisively call Ameristan. In Ameristan, the zone outside the urban centers, consensual reality has disintegrated. People there are fed datastreams that cater to their own preconceptions about the world, confirming and amplifying them. Targeted by predatory algorithms, trapped in feedback loops, victims of self-fulfilling prophecies, they each live within their own tailor-made pocket-realities. Meanwhile, the elites, including these young travelers, can afford to pay for the services of online curators who edit out disinformation and “fake news” from their datastreams and keep them grounded in real reality. It is a world of reality-pluralism run amok. Reading Fall in the first half of 2020, when our world seemed to be disintegrating into warring tribes, each outfitted with its own weaponized epistemology and ontology, was like reading the least fanciful, the most mimetic of realist fictions – more faithful to the way we live now than any contemporary realist novel in the bookstore. And yet it says on the copyright page that Fall is a work of science fiction.

No doubt it sounds perverse, but reading apocalyptic science fiction like this during the pandemic turned out to be not an alarming experience that only exacerbated one’s anxiety, but a strangely comforting one. What works like Fall, Agency, the MaddAddam trilogy, the Broken Earth trilogy and all the rest offered was not so much an image of what was happening to us – none of them is very “faithful” to reality in that sense – or a plausible forecast of what might happen, or even a reassuring scenario of how we might weather the worst of it and come out the other side as survivors, and might even be able to enjoy ourselves in the adventure-playground of our future’s ruins. Rather, these science fictions were valuable because they conducted thought experiments; they staged alternatives, plausible or otherwise, and thinking about alternatives just then, even dire ones, was comforting, liberating and useful – a good use of one’s imagination. The value of science fiction to readers under lockdown was precisely its capacity for prompting us to think of, think about and think through alternatives. This helps explain, perhaps, why science fiction seemed to some of us so much more relevant and compelling than contemporary realism – seemed, indeed, to take the place of contemporary realism – in the year of covid-19.

Brian McHale, Arts and Humanities Distinguished Professor of English at the Ohio State University, is the author of four books about postmodern literature and culture, including Postmodernist Fiction (1987) and The Cambridge Introduction to Postmodernism (2015), as well as the co-editor of five other volumes.  Co-founder of Ohio State’s Project Narrative, he is a past president of the International Society for the Study of Narrative (ISSN), and was a founding member and president of the Association for the Study of the Arts of the Present (ASAP).  He was editor-in-chief of Poetics Today from July 2015 through June 2019.

Avril Tynan • Epidemic as Metaphor: Rethinking the Dementia “Epidemic” in a Time of Pandemic



In “Dementia as a Cultural Metaphor” (2013), Hannah Zeilig explores the “diachronic phenomenon” of dementia. She argues that public understanding of dementia is shaped by the cultural metaphors we use to represent it while, at the same time, dementia itself has become a metaphor for wider social and cultural ills. Among the many cultural narratives of “crisis, war, uncontrollable natural disaster, and death” in which dementia is embedded, Zeilig notes that “an anxiety about epidemics” is often linked with dementia, implying that the disease is infectious and can be “caught” (258–261). Such metaphoric devices enable the construction of imaginative and symbolic “explanatory models” for dementia that effectively communicate problems of scale and deindividuation (see Kleinman 1988, 48–49). “Metaphoric thinking” need not, as Susan Sontag has suggested in Illness as Metaphor, be unhealthy or dishonest (1990, 3), but any rhetorical device must be cognizant of the social, cultural and historical contexts in which it is embedded. In the current age of the global coronavirus crisis, the use of epidemic as metaphor needs critical re-evaluation.

Certainly, dementia-related diseases are a rising global challenge – one which will only be intensified by increasing life expectancies in developing countries – but such diseases have hardly crept up on us in the way the coronavirus pandemic suddenly overwhelmed whole communities, nor can they be caught or spread from person to person in the manner of the current epidemic. The metaphoric use of epidemic in the context of non-infectious, non-contagious and relatively predictable diseases such as dementia embeds the illness within a cultural frame of immense proportions. Evoking fears of the unassailable scale and spread of disease helps to stir up emotional and – hopefully, eventually – material responses to an imminent “collective calamity” (Sontag 1990, 58). Yet such focus on the rising prevalence and reach of disease diverts public attention away from the individual and intensifies alienating discourses of dementia patients as faceless “others” lacking agency and locked within a universalizing pathology. As Rebecca Bitenc, in Reconsidering Dementia Narratives (2020), has suggested, such “alarmist” metaphoric imagery “dehumanises people with dementia by turning them into an indistinguishable mass that will ‘swallow’ the resources of more able-bodied and able-minded sectors of society” (12–13). Examining the “epidemic” metaphor in a time of global pandemic will allow us to rethink the entangled contexts of communication – both of narrative and of infectious diseases – and to re-centre the individual as the subject of disease.

* * *

According to the 2020 edition of the Oxford Concise Medical Dictionary, the noun epidemic refers to “a sudden outbreak of infectious disease that spreads rapidly through the population, affecting a large proportion of people” (Law and Martin 2020). In mediatized political discourses, the term is now commonly used to refer to the sudden and widespread occurrence of pernicious problems – for example, gun violence in the US, loneliness, male suicide, and dementia – that, while not infectious by nature, “spread” among populations and lead to public health and social crises. However, the current age of global pandemic, characterized by virulent person-to-person infection, demands that we reflect critically upon the appropriateness of such liberal uses of epidemic as a metaphor.

Illness metaphors are interpreted in light of the ever-changing socio-historical and individual contexts in which they are enmeshed, and contemporary associations with the coronavirus epidemic will infiltrate and structure future understandings. The implications of a dementia “epidemic” therefore raise significant ethical and epistemological problems. Future attempts to understand and treat dementia-related diseases may be hampered if our language continues to intimate stigmatizing notions of contagion, economic depression and mental and physical isolation inferred from the current global context. The immediacy of the current pandemic – in which the ongoing effects of the novel coronavirus influence directly or indirectly almost every aspect of our day-to-day lives – offers a chance to re-evaluate the implications of contemporary rhetoric on the representation and understanding of non-contagious but widespread diseases. Challenging the accuracy and applicability of the epidemic metaphor encourages us to think beyond its totalizing anonymity and to consider instead the distinct individuals whose lives constitute its workings.

In Illness as Metaphor, Sontag suggests that pre-modern epidemic diseases such as cholera, bubonic plague or typhus were imbued with cultural meanings of social disorder, immorality and political decadence that displaced judgement on the disease to a judgement on the community (1990, 58). In contrast, modern diseases like cancer and AIDS reflected judgements upon poor individual moral and physical actions. To speak of a dementia epidemic, therefore, is to impose an unfavourable narrative upon a situation that, if anything, indicates global improvements in healthcare, economic stability and scientific advancement. If a societal malady can be discerned in some way, it belongs more appropriately not to those affected by dementia, but to those responsible for their care whose dialogues are replete with “concepts of loss, dependence, and passivity” (Zimmermann 2017, 73) that echo the narratives of impending doom found in medical science. In the epidemic metaphor, individual experience is a necessary but not sufficient condition of collective malady, such that subjective accounts must be subsumed under anonymizing indications of immense size and scale. In this way, and as Sontag writes, “diseases understood to be simply epidemic have become less useful as metaphors” because they incite a collective amnesia (1990, 71).¹ As we are seeing today, political and media coverage of Covid-19 has focused primarily on the rates of infection, death, hospitalisation, recovery, unemployment and economic loss that transform individual instances of sickness and loss into statistics and figures to be compared on local and international stages. Epidemics, be they literal or metaphorical, are not about individuals.

* * *

This displacement of the individual as the experiencing subject in both literal and metaphoric applications of epidemiology raises concerns over the ways we will later remember the victims of Covid-19 (see Tynan 2020).² It also exposes pressing cultural and social issues surrounding the ways we approach, represent and treat people with dementia. As countries around the world begin to emerge from an assortment of measures designed to limit the spread and consequences of the novel coronavirus while simultaneously remaining vigilant to the threat of a “second wave” of infections, we are faced with an opportunity to reassess how we see the individual victims of epidemics, and to ensure their visibility both during and after the disease. In the UK, dementia and Alzheimer’s disease are the most common pre-existing conditions for Covid-related deaths, accounting for approximately 20% of deaths in the period from March to May 2020 and rising to over 40% in care homes.³ People with dementia are therefore not only more likely to succumb to Covid-19, but are indirectly the victims of two epidemics, a fact that doubles their chances of erasure: individuals are lost first to the dementia “epidemic” and then to the coronavirus epidemic. This is not, of course, to suggest that the two diseases are of immediate public concern in the same way – coronavirus poses a far more acute threat at this moment – but it is to suggest that the extraordinary surge in the usage of the term epidemic ought to come with some warning to remember the humanity and the individuality of collective losses even as we rush to count and compare at international levels.

Gilles Legardinier’s Nous étions les hommes [We Were Men] (2011), a thriller set in modern-day Edinburgh, demonstrates such human vulnerability and loss in a time of pandemic. In this novel, Alzheimer’s disease is elevated to the status of a literal epidemic as it starts to strike in more extreme and prevalent ways that lead to violent massacres across the globe, becoming “le plus grand fléau auquel l’humanité ait été confrontée” [the greatest plague that humanity has faced] (422–23). Although the strain of Alzheimer’s disease in Legardinier’s novel is a radical form of what we know today – “une sorte d’Alzheimer foudroyant” [a sort of sudden and severe Alzheimer’s] (116) in which an individual suddenly reaches a “basculement” [tipping point] and definitively “perd toutes ses facultés cognitives” [loses all their cognitive faculties] (45) – it is characterized by familiar pathologies. The subject’s loss of communication, loss of memory, disorientation and aggression are exacerbated by the intensity and simultaneity of the onset of the disease that drains individuals of their humanity, reducing them to empty shells of their former selves in the case of older patients, and to vicious animals in younger victims. This radical dehumanization underscores how dementia patients’ loss of narrative memory is commonly perceived as a loss of self.⁴ Although the body remains intact and unchanged, the degeneration of the mind provokes a failure to recognise the person in any meaningful way: subjects are not dead, but they are no longer considered human. Yet, Legardinier’s novel ultimately aims to challenge this notion by emphasizing a common and contingent humanity that unites us even as it destroys us: “quelque chose de purement humain” [something purely human] (221).

* * *

If we are to learn anything from this extraordinary time of pandemic, we must begin by recognizing that at the centre of any epidemic – be it literal or metaphoric – are not numbers and statistics but unique subjects with individual experiences. Epidemics, as we are learning every day, rapidly overwhelm entire communities and risk obscuring subjective responses under collective hysteria, but this shared experience should compel us to rebuild our eminently human connections by recognising that we as individuals are connected to others as individuals, and not as comparable statistics. While the dementia “epidemic” appeals to an urgent need for action on a tremendous scale, it bears no comparison to the coronavirus pandemic. Continued use of such rhetoric risks detracting from the current situation and threatens future efforts to treat and care for dementia patients. Instead, recognizing that any epidemic is constituted by multiple individuals will help us to understand the significance and human consequence of widespread disease.


A version of this post has previously been published on The Polyphony: Conversations Across the Medical Humanities. I would like to thank Fiona Johnstone and Katrina Longhurst for their suggestions and contributions.



[1] Sontag cites as evidence the “near-total historical amnesia about the influenza pandemic of 1918–19, in which more people died than in the four years of World War I” (1990, 71).

[2] In the UK, The Guardian invites the public to share their stories and memories of those who have died from Covid-19, ensuring that numbers are converted back to narratives, while in the USA, The New York Times has collated an interactive memorial, encouraging the reader to visualise the individual and his or her story amongst a faceless crowd.

[3] “Deaths involving Covid-19 England and Wales: Deaths Occurring in April 2020,” Office for National Statistics, 15 May 2020.

[4] Advocates of narrative identity, such as Paul Ricoeur, Jerome Bruner and Paul John Eakin, argue that a healthy sense of self depends upon an ability to construct a meaningful life story. However, the view that narrative is synonymous with personhood is frequently challenged in disability and illness studies for its reliance upon a normative – healthy and able – matrix of lived experience. See James Overboe, “Ableist Limits on Self-Narration: The Concept of Post-personhood,” in Unfitting Stories: Narrative Approaches to Disease, Disability, and Trauma, ed. Valerie Raoul et al. (Waterloo, Ontario: Wilfrid Laurier UP, 2007), 275–84.



Bitenc, Rebecca. Reconsidering Dementia Narratives: Empathy, Identity and Care. London and New York: Routledge, 2020.

Kleinman, Arthur. The Illness Narratives: Suffering, Healing, and the Human Condition. New York: Basic, 1988.

Law, Jonathan and Elizabeth Martin, eds. Concise Medical Dictionary, 10th edition. Oxford: Oxford University Press, 2020.

Legardinier, Gilles. Nous étions les hommes. Paris: Fleuve Noir, 2011.

Sontag, Susan. Illness as Metaphor and AIDS and Its Metaphors. New York: Doubleday, 1990.

Tynan, Avril. “Death in Isolation: The Covid-19 Dead Are More Than Numbers,” Medical Humanities Blog, 23 April 2020.

Zeilig, Hannah. “Dementia as a Cultural Metaphor,” The Gerontologist 54, no. 2 (2013): 258–67.

Zimmermann, Martina. “Alzheimer’s Disease Metaphors as Mirror and Lens to the Stigma of Dementia,” Literature and Medicine 35, no. 1 (2017): 71–97.



Avril Tynan is a postdoctoral researcher in comparative literature at the Turku Institute for Advanced Studies, University of Turku, in Finland. She is a member of the SELMA Centre for the Study of Storytelling, Experientiality and Memory, and a visiting researcher at the Centre for Narrative, Memory and Histories at the University of Brighton. Her work examines the role of narrative and ethics in the representation of ageing, illness and death and her current research explores affect and empathy in Francophone novels of dementia. Her recent works include “‘Que peut la fiction?’ Storying the Unexperienced Experience in Jorge Semprun’s Fiction,” Modern Language Review 115:1 (2020): 46-62; “Play and Possibility: Olivia Rosenthal’s We’re not Here to Disappear and the Limits of Understanding Alzheimer’s Disease,” Narrative Works 9:1 (2020); “Winding Down, Living On: The Future in Old Age,” Storyworlds (2020); and “Mind the Gap: From Empathy to Erasure in Narrative Fiction,” Journal of Literary and Cultural Disability Studies, (2020).

Twitter: @avitynan

Lasse Gammelgaard • Mental Illness Costumes: Divisive Discourse and Untold Stories of Stigma




This blog post is about damaging, stigmatizing and stereotypical notions of mental illness in contemporary cultural discourse, but it is also a personal reflection on the risks and rewards of immersing oneself in these public debates as an academic. I have made ad-hoc translations of all text originally in Danish.


Opinion Piece

When I wrote the Danish high school textbook Madness in Literature with Thomas Søgaard Boström (a Danish artist, author and teacher of writing classes at The Outsider, a mental health organization based in Copenhagen), we wanted to include an image from Amazon of a costume called “Adult Skizo Costume” for a section of the book on stigma and taboo. However, we were told that the costume had been withdrawn from sale.

I was subsequently very surprised to learn that these costumes were sold in basically every online costume shop in Denmark. The signature reference to mental illness is the straitjacket. We tried to get permission to use an image from one of the Danish web shops, but none of them replied. I thought these costumes were highly problematic, so I wrote an opinion piece for Weekendavisen (a weekly newspaper in Denmark) right before Halloween in 2019, when one would imagine the costumes sell best.

The costumes had names like “Loco Straitjacket,” “Mr. Crazy,” and “Bloody Straitjacket Costume,” but some of the prose descriptions of the products were far more severe. The description of the “Crazy Babe Costume” stated: “Dress up as an insane person in a straitjacket with this sexy costume.” That of the “Female Mental Patient Costume” read: “This is the costume, if you want to trick your friends into believing that a mental patient has escaped the asylum! The straitjacket, which can be zipped on the back, is a costume with a realistic expression, but you can also use it if you just need to rest your hands.”

It goes without saying that the costumes had very little to do with realistic representation. In fact, they resemble characters in horror movies more closely than people with mental illness diagnoses (who do not look a particular way). Madness is a well-established topos in fiction, where mad characters work alongside other devices to create suspense and thrill. Even though such works unavoidably and misleadingly instill in viewers notions of psychiatric patients as dangerous and violent, I was not, of course, out to criticize movies like Alfred Hitchcock’s “Psycho” or John Carpenter’s “Halloween.” I did, however, want to criticize the drift from dressing up as fictional characters in horror movies to purportedly depicting people suffering from mental illness. There is a world of difference between dressing up as, say, Hannibal Lecter from “The Silence of the Lambs,” on the one hand, and as a person with a schizophrenia or bipolar disorder diagnosis on the other. Dressing up as Hannibal Lecter is, in my opinion, comparable to dressing up as Frankenstein’s monster or as Dracula. But in that case, the reference to a fictive character should be made as explicit as possible, and the web shops should refrain from inviting people to dress up as psychiatric patients in general. In the opinion piece, I addressed all of this and highlighted the discrepancy between the actual costumes and the branding with names, product descriptions etc.


Media Reaction

I do not normally engage in public debates in this way, and no journalist ever called or e-mailed me to get a follow-up interview on my research on poets’ use of aposiopesis in nature poetry about experiences of the sublime – or any other research topic for that matter. One reason why I was hesitant to write the opinion piece is that the debates that follow are typically exaggerated and undiscriminating. The media need people to fit into clear-cut roles: the injured party, the villains, etc. Therefore, a main goal of mine was to avoid an angry tone that might fuel a divisive discussion.

The week after the piece was published, I was contacted by numerous media outlets, which resulted in a follow-up interview in a newspaper (Berlingske), two radio interviews (with P4, a local station, and P1, a national station, of DR – Danish Radio, the Danish equivalent of the BBC), and an interview in a feature on the topic in the evening news at TV2.

The media work in such a way that they need to discuss a clear problem, and the positions in a debate are often presented in a crude way. They look for tensions, contradistinctions, inconsistencies etc. Interviewees, including scientists, are cast in accordance with that logic. Hence, you could almost plot the participants in the ensuing debate into Greimas’s actantial model (Greimas 1983). The Subject would be: anyone living with a mental illness who feels the stigmatization of society. The Object they desire: recognition and destigmatization. The Helper: scholar writing an opinion piece. The Opponent: Costume web shops and people buying the costumes. The Sender: the aforementioned scholar. The Receiver: people living with a mental illness.

After the actants have tacitly and seamlessly been assigned their structural roles, they are then challenged from that position. For instance, I was repeatedly asked if I had a psychiatric diagnosis – the subtext being: who are you to speak about this on behalf of others? Did anyone ask for your help? Can they not speak up for themselves? Et cetera.

Part of this divisive rhetoric was prepared by the editor at Weekendavisen, who changed the title of my piece from something less colorful to “Offensive Costumes” without checking with me first (so much for my attempt to avoid an aggressive tone). If I learned anything from the subsequent debate, it is this: live interviews are by far preferable to edited formats. Journalists I spoke to seemed surprised when I tried to explain that I was not trying to lecture or blame anyone: I had nothing against costume shops, nor did I think that people who might wear the costumes were bad people.

It was important for me to break the actantial script, because issues of identity are always much more complicated. Over the past few years, there have been numerous debates in Denmark about “offensiveness.” Who has the right to decide when something is offensive? What is new is that the voice of those “taking offense” is getting more airtime and is taken more seriously. Experiences that perhaps went untold previously are now being heard. This often clashes with people who have no ill intentions and who refuse to be thought of as “the offender” (nor do I necessarily think that they should be). The latter group, who view themselves as broad-minded, complain about “the readiness to take offense” (krænkelsesparathed). Costumes, in particular, have been a hot potato, so I was not at all surprised to be asked to comment on dressing up as a psychiatric patient and to compare it to other possible costumes.

The types of questions I was asked are telling of this divisive media rhetoric: What are we even allowed to do? Is there not always a minority group that might be offended by any given costume? Is wearing a sombrero or dressing up as a samurai not an atrocious appropriation of the cultural heritage of Mexicans or Japanese people? Where do you draw the line? It is strange that I was asked to be the judge of such borderline cases. Although I have not walked a mile in their shoes, I feel fairly confident in saying that werewolves and zombies would not be offended by the way we depict them when dressing up in costumes. But beyond that, all I could say was that my opinion piece was about mental illness costumes, and that I thought there was a categorical difference between dressing up as a person suffering from an illness and potentially problematic instances of cultural appropriation – no one would dress up as, say, a cancer patient.

TV2 contacted many costume shops for a reaction. Only a shop called Faraos Cigarer (Pharaoh’s Cigars), located in Odense, agreed to an interview. The sales clerk’s reply was that we should not be too rigid, and that we should be able to have fun. We should avoid “Swedish conditions.” This is a knee-jerk reaction in Denmark: when something is thought of as moralizing, preachy or excessively politically correct, it is seen as a stance more at home on the turf of our Scandinavian neighbors. We Danes, on the other hand, take pride in our liberal-mindedness. We do not easily take offense. There have been many cases in the past few years where this alleged “readiness to take offense” has clashed with the actions of people who had no intention of offending anyone.

TV2 wrote a news story on their website summarizing the evening-news feature, in which they had interviewed the sales clerk, me, and the president of “Sind” (a mental health organization in Denmark). The article included a pop-up survey that laid bare the divide in public opinion regarding the tenability of mental illness costumes. The question asked was: “Would you dress up as ‘a mentally ill patient’?” The 1,489 users who engaged with the survey answered as follows:

“Yes, I don’t see a problem with that” – 46 %

“Maybe, but it could be problematic” – 8 %

“No, I wouldn’t dream of it” – 39 %

“Don’t know” – 7 %

The 46 % who clicked “Yes, I don’t see a problem with that” probably believe that people get outraged too easily, and they most likely do not harbor any bad intentions or ill will towards people with a mental illness. They simply do not see the problem. And that leads me to the most interesting questions posed to me in the media: How big is the problem? Has it caused problems for any actual individuals out there? What do you fear will happen if people wear the costumes? Like me, the president of Sind had not even known the costumes existed. All I could say was that I did not know the extent of the problem, but that I thought it was problematic in principle, regardless of whether it actually harmed anyone in a direct way. I am sure there are people living with diagnoses, and next of kin, who do not mind the costumes. I am also sure that the costumes could be worn at specific parties without offending or harming anyone. It might also be the case, though, that those who are hurt by the costumes do not speak up. This is how stigmatization works: the stigmatized feel ashamed, and their stories may never be told.


The Aftermath

In the wake of the media reaction, I was contacted by several people. One person, let’s call her “Ann,” shared exactly such a story, and she has given me permission to recount it anonymously:

The incident took place 15 years ago. “Ann’s” husband, who was 51 years old at the time, had just been admitted to Risskov Psychiatric Hospital in Aarhus. He was diagnosed with frontotemporal dementia, an illness that can change one’s entire personality (one can become impulsive, inappropriate, or emotionally indifferent, and it affects one’s language as well). The diagnosis also meant that he would no longer be able to live in their shared home. One of her sons, who was 13 years old at the time, was of course greatly affected by this. Shortly after the diagnosis, he was supposed to go to a weekend camp with his table tennis club. He hesitated, but ended up going; he could always call if he wanted to go home. When his mother came to pick him up at the end of the camp, he fell apart crying. During the night, the adults had arranged a “night run” activity. It was a kind of role-playing game in which the teenagers were told that a dangerous criminal had escaped from the nearby psychiatric hospital (the same one where his father was presently admitted). He said he did not want to participate, but was pressured into taking part. He told his mother that they had even arranged for a police officer in a police car to be involved (“Ann” did not know if it was a disguised car or if someone had known an officer who agreed to play the role). In the end, they had to deliver the dangerous man to the police officer.

Given the circumstances, the boy could not bring himself to tell them that his father was hospitalized, and if anything, he learned that having a father with a mental illness was a bad thing that you should try to hide. This example is clearly more elaborate than simply wearing a costume to a party, but both actions perpetuate the same prejudice: that being dangerous, rambunctious, and violent are core features of people struggling with mental health issues. Whether intended or not, it contributes to what the medical historian Roy Porter has termed cognitive apartheid, where we think in them-us binaries (Porter 2002, 63): we, the sane (the majority), and them, the insane (of whom we should be afraid). It would be better to think of mental health as a continuum with lots of grey areas. Perhaps Sind had not heard complaints about the costumes because most people are not offended by them, but it might also be because speaking up comes with a risk of stigmatization.

What to make of the 46 % who see no problem with dressing up as a psychiatric patient? Sure, it might not feel like a problem if it is not a problem for oneself. My guess, though, is that few of them would maintain that position if they had had an experience even remotely akin to that of “Ann” and her son.



Greimas, Algirdas Julien. 1983 [French original 1966]. Structural Semantics: An Attempt at a Method. Transl. Daniele McDowell, Ronald Schleifer, and Alan Velie. Lincoln: University of Nebraska Press.

Porter, Roy. 2002. Madness: A Brief History. New York: Oxford University Press.



Lasse R. Gammelgaard (1983), PhD, Associate Professor at Aarhus University. Presently working on a project on “Forms of Mental Illness Representation,” funded by the Independent Research Fund Denmark. Co-director of “Health, Media and Narrative” and member of “Narrative Research Lab” and “Center for Fictionality Studies” at Aarhus University, Denmark. Co-author of Galskab i litteraturen (Madness and Literature) and editor of Madness and Literature: What Fiction Can Do for the Understanding of Mental Illness (forthcoming in the University of Exeter Press’s book series “Language, Discourse and Mental Health”).






Hanna Meretoja • Stop Narrating the Pandemic as a Story of War


Narrativisation is a central mode of communication and sense-making. As the coronavirus pandemic is unfolding and changing the world before our eyes, it makes a crucial difference which cultural narratives mediate our understanding of the crisis. A deeply problematic story of war has come to dominate the public imagination.

President Donald Trump has branded himself a “wartime president” and calls the pandemic “the worst attack” ever on the United States. “We must act like any wartime government,” declared Prime Minister Boris Johnson, while President Emmanuel Macron asserted multiple times in his recent televised speech that “We are at war.” Health organisations and the media have also adopted military vocabulary. Doctors and nurses are fighting on the “frontline” with an “army of volunteers” to help them, and we are asked to come together in a joint “war effort.”

It is easy to understand why the narrative of battle is attractive. It attributes agency to us at a time when we feel helpless, with few weapons to fight a virus that has no cure and no vaccine. Instead of positioning us as passive victims, the narrative of war turns us into courageous soldiers in a fight against a common enemy. For political leaders, the rhetoric of war is a convenient way of conveying the gravity of the situation and justifying emergency legislation and the suspension of certain fundamental rights.

But we are not soldiers, and this is not a war. Using war metaphors to ascribe agency to patients, healthcare workers and the public as a whole is profoundly problematic.

First, to talk about patients “battling for their lives” risks implying that those who survive fought so hard that they made it, whilst those who fail to survive lost their battle because their fighting spirit wasn’t strong enough.

The same problem pertains to using the language of war in depicting cancer patients as “fighters.” As I went through grueling breast cancer treatments last year, I was struck by how often I was praised for fighting so hard. I had to deal not only with the shocking prospect of never seeing my children grow up, but also with “normative optimism” – the pressure to have the kind of fighting spirit that fits the culturally preferred narrative of battling cancer.

But there is no research to suggest that a strong fighting spirit helps us survive either cancer or the coronavirus. In fact, research indicates the opposite: military metaphors harm cancer patients. Those who recover from cancer or Covid-19 are fortunate, but they should not be praised for winning a battle any more than those who die should be blamed for not fighting hard enough. No one wants to die of these illnesses. Survival depends on access to effective care and treatments – subject to structural inequalities – as well as on biological mechanisms such as the patient’s immune system, rather than on psychological traits like courage or optimism.

The language of battle may lead us to support such assumptions even when we don’t explicitly think this way. For example, when Boris Johnson was treated for Covid-19 in intensive care, President Trump declared that he would be fine because he is so “special” and such a “strong person”: “Strong. Resolute. Doesn’t quit. Doesn’t give up.” Are those who “lose the battle” weak people? Do they die because they give up?

As potential patients, we are urged to prepare for the fight by keeping ourselves fit and alert. This creates an illusion of control, as if catastrophe only affects people who fail to be strong and alert soldiers in the war against the “invisible enemy.” Most of those who become critically ill with the coronavirus have underlying health conditions, we are told, often linked to less than optimal lifestyles.

But the truth is that life is fragile and no one is invulnerable. Anyone can fall ill. I had no known risk factors and yet got cancer at a young age, out of the blue. It made me realise how much my life was governed by that control illusion. I thought that if I kept myself superfit, ate a healthy diet, had children young, breastfed them for ages and did all the other “right things,” then nothing could go this wrong. I never smoked and was never overweight, yet the cells in my breast started to divide uncontrollably. I simply had bad luck, and only time will tell whether the cancer returns. Similarly, in the current pandemic, we have to learn to live with fundamental uncertainty and lack of control.

Second, healthcare professionals are crucial agents in the effort to stop the pandemic, but they are not soldiers. Doctors exercise agency in making vital decisions about treatment and care as they try to keep patients alive. Researchers around the world are key agents in the joint endeavor to develop tests, drugs and vaccines. But what healthcare professionals practice is care, not war. Universal access to healthcare is essential to the prospects for peace.

The narrative of war is used as a legitimizing discourse. Wars inevitably have casualties. Wars require sacrifice. The narrative of war heroes is used to justify putting health workers at risk. It distracts us from structural inequalities, including the high exposure of low-paid women to the virus. Health workers are offered military flypasts and medals, even though they would rather get a proper salary and Personal Protective Equipment.

Third, the pandemic affects everyone, but the fact that stopping the spread of the virus has to be a collective effort doesn’t make it a war. The war narrative is linked to romanticized, nostalgic and false conceptions of conflict. The analogy is misplaced for numerous factual reasons, ranging from the different impacts of war and pestilence on the economy and the movement of goods and people to crucial differences between the sensory experience of armed conflict and that of the pandemic.

Resorting to the narrative of war means missing the opportunity to confront the complexity and specificity of the current crisis. It blinds us to the uniqueness, not only of how the pandemic feels but also of the social and economic challenges it engenders.

Not only is the analogy of war misplaced on factual grounds; it also misses the possibility to cultivate an imagination that builds on narratives of peace, solidarity and social justice – and to foster a more acute understanding of how we are all fundamentally dependent on one another as inhabitants of a shared planet.

This is an opportunity to embrace our shared vulnerability and destructibility. We tend to idealize agency when it is linked to autonomy, control and independence. But agency is also about the ability to respond to others and to their touch, thoughts, needs and affection; the ability to share experiences, anxieties and hopes and to be attached to, and care about, beings beyond ourselves.

Instead of seeing the pandemic in terms of destructive and divisive narratives like the “survival of the fittest” or nations competing in the war against the virus, shouldn’t we see it as a lesson on the fragility of life? The Queen asks us “to take pride” in the British response to the crisis, but isn’t this a time when humility takes us further? If we turn away from the narrative of war, we can envision how a new global awareness of mutual dependency could give rise to a stronger sense of solidarity, which may help us build a more socially and environmentally just world for future generations.

Instead of narrating the pandemic as a story of war, we could narrate it as an open-ended story of a point in history at which humankind faces the opportunity to choose between different routes to different futures. We stand at a historical crossroads in which political decisions will save or cost millions of lives. While many leaders are resorting to the rhetoric of war, others are emphasizing the power of people in a democracy to work for a better and more peaceful future. In this moment we can practice our narrative agency by cultivating our sense of the possible, our sense of how things could be.

The future of humankind depends on the path we decide to take, and that path largely depends on how we narrate the pandemic and the lessons to be drawn from it as we move forward. Let’s make sure these narratives hold open the possibility we now have to leave behind an unsustainable way of life and to imagine a world based on solidarity and care.


Originally published in openDemocracy/Transformation.

Hanna Meretoja is Professor of Comparative Literature and Director of SELMA: Centre for the Study of Storytelling, Experientiality and Memory at the University of Turku (Finland) and in 2019-2020 a visiting professor at Exeter College (University of Oxford) and Oxford Centre for Life-Writing. Her research is mainly in the fields of narrative theory, narrative ethics, and cultural memory studies. Her monographs include The Ethics of Storytelling: Narrative Hermeneutics, History, and the Possible (2018, Oxford University Press) and The Narrative Turn in Fiction and Theory: The Crisis and Return of Storytelling from Robbe-Grillet to Tournier (2014, Palgrave Macmillan) and she has co-edited, with Colin Davis, The Routledge Companion to Literature and Trauma (2020, Routledge) and Storytelling and Ethics: Literature, Visual Arts and the Power of Narrative (2018, Routledge).


Robert Appelbaum • Pleasant and Useful? A Tale from the Middle Ages


In the Middle Ages and the Renaissance, it was taken for granted that the purpose of telling a story was to provide a moral example. That meant showing something that readers ought to emulate, or else something that they ought to avoid. There was no idea, in other words, that fiction was anything other than instrumental. That fiction ought to entertain was also taken for granted, though. The ancient authority Horace said that art ought to be dulce et utile, pleasant and useful. So it would have surprised most storytellers to hear that there was anything to worry about in fiction being meant to be useful, and the very idea of “instrumentality,” as in Adorno and Horkheimer’s famous critique of “instrumental reason,” would have seemed odd to them.

But that didn’t mean that instrumentality in fiction was straightforward. The perennially popular Aesop’s Fables apart, early European fiction writers knew that a story was one thing and the interpretation of a story another. Moreover, although it was important to learn, along with the grasshopper, that it was best to live like the ant and always prepare for the future, the stories that attracted the most interest could be morally ambiguous. In fact, it was common for storytellers to say that a story meant one thing when it actually meant another, or else to put out a story, salacious though it might be, and pretend that it expressed a strait-laced moral truth which it couldn’t possibly be expressing.

* * *

Here is an example from a medieval French fabliau (a comic short story) by one Hues Piaucele, collected (and translated) in what is called the British Library Manuscript.[1] It is about a loving couple named John and Yfame, who had recently fallen on hard times. Knowing this, three monks from the local monastery had each offered Yfame a substantial amount of money if she would have sex with him. She indignantly told John about the monks, and John came up with a plan for revenge, as well as a plan for getting hold of the monks’ money. He told Yfame to accept their offers and have each of them come to their home at a different time on an evening the next week, saying that her husband would be away from the house that night.

In came the first monk – the narrator relishes adding that he was old and fat. He put his money on the table and went after Yfame in the middle of the main room of the house, bringing the two of them down to the floor. John, who had been hiding in the loft, came down and whacked the monk on the head with a club, killing him instantly. John took the corpse outside, dumping it beside a tree where he thought he would later bury it, and collected the cash. In came the second monk, putting down his money and trying to take Yfame on her bed. Again, John whacked and killed him, and dragged the corpse out of the house for burial. In came the third monk, and the same thing happened.

The narrator doesn’t spare his readers or listeners the gory details. On the first occasion, for example, we are told of John,

who threw himself upon them,
Very hard with the club:
He hit his head so hard
That the blood and brains flowed out.
The man fell dead speechless.

But now it was getting late, and John had the hard work of burying the three men ahead of him. He was exhausted, though delighted at all the money he had suddenly accumulated from the monks’ down payments on sex they would never enjoy. So he asked his wife to go fetch her nephew Estormi, a simpleminded young man much addicted to gambling. She found him in a tavern, losing at cards, and offered to pay his debts if he came right away and helped her in a private affair.

When Estormi arrived at John and Yfame’s home, John told him that the body of a demonic monk had come back from the dead, and that it needed to be buried as soon as possible to prevent him from coming back to life altogether and causing who knows what kind of harm. So Estormi buried him. But when he came back to the house, John showed him the second monk, pretending that it was the same monk, again arisen from death. “There he is again,” John said; go bury him, lest he return another time. Estormi did what he was told. But then, coming back to the house, he was shown the third monk, the same monk again, it is alleged, arisen one more time. With much effort, worn out by the previous two burials, Estormi went to work again and succeeded in ridding the world of the ghostly priest a third time. But as he was finishing the job, he saw a fourth priest walking past him down the street. “Look,” Estormi said, “this priest is getting away from me! By God’s ass, he’s going back! What is this, Sir Priest?”

Estormi took his shovel and attacked the fourth priest. As he reports the incident later on,

“And I gave him one with the pick
So hard that I made
His brains flow out on the street.
Then I took him, and I went back
Down there by the back door.
And I threw him down;
I stuck him into a mud pit.”

When the husband heard this he was flabbergasted. And he said aloud, though in a low voice,

“In faith, now things are going worse,
Because this man [the fourth monk] hadn’t done anything wrong here.
Someone is paying the penalty
Who has not deserved death for it.
Very unjustly did the priest
Whom Estormi killed lose his life.
The devil has a great talent
For tricking and trapping people.”

And that is just about the end of the story. It is modelled in typical folktale fashion on what is often called “the rule of three,” and it exploits besides what might be called “the supplement of the fourth.” The rule of three provides an economical balance: readers will probably be most familiar with “Goldilocks and the Three Bears,” where the third alternative supplies a golden mean that solves a narrative problem. The device is common in all sorts of tales. There are three little pigs. In the classic Greek myth of the Judgement of Paris, the title character has to decide which of three goddesses is the loveliest. When a magic being is freed from a bottle or other impediment, his rescuer is entitled to three wishes. But there are cases when a folktale cannot end on the resolution of the rule of three, and so brings in a fourth phenomenon, which upsets the balance altogether. I call it the supplement of the fourth.

In any case, after the fourth monk is killed and John bewails the injustice of it, the supplement of the fourth having overturned the story’s balance, there is still more to come, because after John finishes speaking, the narrator comes in to tell us the moral of the story:

Through these priests,
I would like to teach you
That it is folly to covet
Or woo somebody else’s wife.
The reason for this is very clear.
Do you think that because of any poverty
A decent woman would forget her duties?
No! She would rather let her throat
Be cut with a sharp razor
Than ever to do for money
A thing that would bring shame to her lord.

Note how different the responses of John and the narrator are: John ends with alarm at the evil circulating in the world, the narrator with a caution against trying to seduce another man’s wife by paying her money. Meanwhile, four monks lie dead.

* * *

For John and the narrator alike the story is an example of something, a higher moral truth which can come either in the form of a maxim (John) or in the form of a caution (the narrator). It is precisely because of its exemplarity that the story is understood to be “instrumental,” which is to say “useful” as well as pleasant. Watch out for the devil! Don’t try to bribe a married woman! But the author of the story, who uncharacteristically identifies himself at the end of the tale, seems disposed rather to point out that morals of this kind are arbitrary and beside the point. For it should be clear to any discerning reader that the story does not prove that the devil is everywhere, or that all wives would resist a bribe for sex, or that bribing a woman for sex is bound to get the perpetrator in trouble. In other words, whatever John or the narrator may think, the story does not mean what they say it means. Both John and the narrator seem to be committing the fallacy of the excluded middle, where a particular becomes a universal without a middle term to tie the universal to the particular. To put it another way, both John and the narrator assume an either/or state of affairs, although the story itself puts forward any number of alternatives. For example, instead of blaming the devil, John might have blamed himself for entrusting serious work to a fool. Instead of cautioning men to stay away from other men’s wives, the narrator might have said that the would-be adulterer should proceed by steps and make sure of a woman’s affection before he tries to seduce her with gifts.

The question may then arise as to the author’s intention. Why did he tell this story? One obvious answer is that he told it in order to mock exemplary fables. Stories like this, he implies, seem to have a moral but really don’t. There is nothing instrumental about them. Or he may be anticipating the Renaissance tradition which begins with Boccaccio’s Decameron, where time and again it is shown that different people may interpret the same story differently, given their own prejudices, needs and struggles. Or again, he may be anticipating the naturalistic tradition of storytelling, which becomes especially prominent in the nineteenth century, according to which the purpose of fiction is to illuminate the (ugly) truth of how the world really works. The story of John and Yfame casts light on a world where poverty can lead people to extremes, where religious people may ride roughshod over their vows of austerity, continence and charity, where violence and vengeance may seem to satisfy our inner needs coupled with our sense of justice … and so forth.

I was myself raised in this last tradition. I was taught, in high school, that Pride and Prejudice was about the vexing rise of materialism in early nineteenth-century Britain, that Macbeth was about the dangers of ambition, and that “Bartleby the Scrivener” was about the power of irrationality in the face of modern-day bureaucratic capitalism. Or at least, that is what I remember having been taught. But what if the stories are not about these things? What if they have no “use,” no propositional and ethical uptake? Or what if, as in the case of the fabliau, the instrumentality of fiction indefinitely recedes into the abyss of excluded middles?

I don’t have a ready answer to the question, but I hope I have at least pointed out that the problem of the instrumentality of fiction is nothing new; it was already a vital concern among storytellers in the thirteenth century.



[1] The French Fabliau B.N. MS. 837, ed. and trans. Raymond Eichmann and John Duval, Two Volumes, Vol. I (Abingdon: Routledge, 2018). Kindle Edition, position 951-1472.




Robert Appelbaum, Professor Emeritus of English Literature, Uppsala University, and Senior Professor in Arts and Communications at Malmö University, is the author, most recently, of The Aesthetics of Violence: Art, Fiction, Drama and Film (2017). He is currently working on a book entitled The Renaissance Discovery of Violence, from Boccaccio to Shakespeare.