Narrative and Number


‘Staging an Exit (Pursued by a Bear)’, Aubrey Whitlock (2018)

‘Exit, pursued by a bear’

Arguably the most famous stage direction, the pursuit by the bear in Shakespeare’s The Winter’s Tale is expedient in doing away with a villain without needing to implicate the hero. The comic quality seems to endow Antigonus with idiocy; he is dispatched as a hapless man. In thinking here on the relationship between narrative and numbers (or computation), the bear serves as a reminder of the sheer complexity (and frequently surreal nature) of language. Aubrey Whitlock (2018), for example, describes how a group of actors can workshop just this one line to serious effect, exploring it as naturalistic, comic, or abstract, and even with regard to its timing (Shakespeare declares the exit, yet ‘provides no firm indication of an entrance’). The stage direction comes midway, when the play shifts from the tragedy of Sicilia to the comedy of Bohemia. To modern sensibilities the bear can seem somewhat random, typically promoting a comic effect. Yet in the Elizabethan period bears were a common sight, and the barbarism of bear-baiting inevitably leads to a more tragic reading. Hence, as Whitlock notes: ‘Participants come away from the workshop with a deeper understanding of this moment’s context in The Winter’s Tale, but will barely have scratched the surface of how many playable options exist in performance’.

Given the numerous ‘playable options’, what might AI natural language processing make of this line? Using the custom adventure mode in AI Dungeon, which asks for an initial scene-setting prompt (e.g. who you are and some opening sentences of where you start from), the following opening line prompts quite different outcomes on several attempts: 

You are Antigonus, in Sicilia. You are making a swift exit pursued by a bear…
(1) Your goal is to get off the island, preferably with your head intact. You start running, and the bear starts running too, but then he stops. He stares at you.
(2) ‘What are you fleeing from?’ / ‘Oedipus.’ / ‘Who?’ / ‘Oedipus, King of Thebes.’ / ‘That's an odd name.’
(3) You are in a small house, made of wood. You are hiding behind a table, next to a woman. You are Antigonus, in Sicilia.
(4) You have a short sword, a javelin, and an old-fashioned heavy clay pot filled with a handful of olives. You hear the bear growling and smell its hot breath.

The first attempt does a good job of ‘running’ with the scenario and even injects a sense of suspense (‘He stares at you’). The second attempt clearly taps into the Shakespearean genre, but throws a small curveball (‘That’s an odd name’). The third attempt suggests a rustic scene (‘made of wood’). It offers some minor suspense and introduces a new character, ‘a woman’ (although the repetition of your own name and where you come from jars a little with the flow of a narrative. Why do we need reminding of this? Perhaps our character is a little insecure!). The final attempt suggests the role-playing game genre and offers some nice description (not only a clay pot, but one filled with a ‘handful’ of olives; and we ‘sense’ the bear ever closer; we can hear and smell it).

It is worth noting that these examples will seemingly never appear again, or at least not in these exact formulations. Every attempt brings up yet another version, a seemingly infinite parade of potential narratives. (I try again as I write this, just to check myself, and it throws up: ‘It was a bear! I saw its haunches!’, ‘It was a bear, I say.’; and then a further attempt, just in case: ‘You are hiding in a little hamlet, and the bear, a huge beast, is hot on your trail. A peasant boy points out a little cave to you, just big enough for you to take cover.’) Like the piano, with its finite number of keys, which can nonetheless yield seemingly endless melodies, AI Dungeon would appear to offer a limitless number of narratives, each seeming to make sense in itself, and with a fair variety of trajectories. Here, then, is an auto-generated system that delivers on (and only adds to!) Roland Barthes’ suggestion that the ‘narratives of the world are numberless’ – this being the opening line of Barthes’ well-known essay, ‘Introduction to the Structural Analysis of Narratives’.
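The piano analogy can itself be put in (illustrative) numbers. The count below is a deliberately crude sketch of my own, ignoring rhythm, dynamics and any musical rules: even a ‘melody’ reduced to just eight successive notes drawn from a standard piano’s 88 keys already yields a combinatorial space no human could exhaust.

```python
# Illustrative combinatorics only: count bare eight-note sequences on a piano,
# ignoring rhythm, dynamics and repetition rules.
keys = 88          # keys on a standard piano
length = 8         # notes in a (very short) 'melody'
melodies = keys ** length
print(f"{melodies:,} eight-note sequences")  # over 3.5 quadrillion
```

Finite, then, but already far beyond what any listener could ever hear out.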

In an interview conducted just at the time Barthes was drafting this essay on narrative (and due to be published for the first time in English in Theory, Culture & Society later this year), we can hear more of what felt ‘necessary’ at the time in undertaking the work. For example, Barthes refers explicitly to the need to ‘focus on the anthropological level’ of narrative (references to Propp and Lévi-Strauss are given in the published essay, but in the interview there is perhaps a clearer sense of the debt owed). Barthes says at the start of the talk:

I want to emphasise that such an analysis [of narrative] is necessary. That is because narratives are a sort of human material, a class of thing which humans produce. At first glance, it’s just impossible to impose any sort of order on them [absolutely uncontrollable]. There are millions and millions of narratives, developed over an indefinite period of time, the origins of which are unknown. … Narrative is everywhere.

Roland Barthes (draft trans. Jon Templeman).

Here Barthes refers to ‘millions and millions’ of narratives, which is in fact a finite number. Yet elsewhere in his talk (and in his essay) he refers to the ‘infinite number of narratives’. Perhaps Barthes’ equivocation is right: the narratives of the world are perhaps not numberless, although human cognition is such that it is hard to ‘compute’ it otherwise. To the human faculty, narratives (and melodies) might as well be infinite. The maths is too big to contain. In Words and Rules, Steven Pinker provides us with some of the numbers. Building on the work of Chomsky, Pinker is interested in the combinatorial and recursive nature of language; regarding combinatorial rules he writes:

They don’t just have a single slot, like a fill-in-the-blank exam question; every position in the sentence offers a choice of words from a lengthy menu. Say everyday English has four determiners (a, any, one, and the) and ten thousand nouns. Then the rule for a noun phrase allows four choices for the determiner, followed by ten thousand choices for the head noun, yielding 4 x 10,000 = 40,000 ways to utter a noun phrase. The rule for a sentence allows these forty thousand subjects to be followed by any of four thousand verbs, providing 40,000 x 4,000 = 160,000,000 ways to utter the first three words of a sentence. Then there are four choices for the determiner of the object (640 million four-word beginnings) followed by ten thousand choices for the head noun of the object, or 640,000,000 × 10,000 = 6,400,000,000,000 (6.4 trillion) five-word sentences. Suppose it takes five seconds to produce one of these sentences. To crank them all out, from The abandonment abased the abbey and The abandonment abased the abbot, through The abandonment abased the zoologist, all the way to The zoologist zoned the zoo, would take a million years.

Steven Pinker, Words and Rules, 2015 [1999]
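Pinker’s arithmetic is easy to check. A short Python sketch, using his own illustrative counts (four determiners, ten thousand nouns, four thousand verbs), reproduces each figure in the passage:

```python
# Reproduce Pinker's combinatorial arithmetic for English sentence beginnings.
determiners = 4        # a, any, one, the
nouns = 10_000         # head nouns
verbs = 4_000          # verbs

noun_phrases = determiners * nouns        # 40,000 noun phrases
three_word = noun_phrases * verbs         # 160,000,000 three-word beginnings
four_word = three_word * determiners      # 640,000,000 four-word beginnings
five_word = four_word * nouns             # 6,400,000,000,000 (6.4 trillion)

# At five seconds per sentence, how long to crank them all out?
seconds = five_word * 5
years = seconds / (60 * 60 * 24 * 365)
print(f"{five_word:,} five-word sentences, about {years:,.0f} years to produce")
```

The final line confirms Pinker’s ‘million years’: roughly a million years of continuous utterance for five-word sentences alone.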

Pinker evokes Jorge Luis Borges’s story ‘The Library of Babel’ (as ‘[p]erhaps the most vivid description of the staggering power of a combinatorial system’). As the story goes, ‘somewhere in the library is a book that contains the true history of the future (including the story of your death), a book of prophecy that vindicates the acts of every man in the universe, and a book containing the clarification of the mysteries of humanity’. Of course, even after the human species is made extinct, the library (and its combinatorial possibilities) remains. Yet, technically, Pinker explains, ‘Borges needn’t have described the library as “infinite.” At eighty characters a line, forty lines a page, and 410 pages a book, the number of books is around 10^1,800,000, or 1 followed by 1.8 million zeroes. That is, to be sure, a very large number—there are only 10^70 particles in the visible universe—but it is a finite number’ (Pinker, 2015).
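This estimate is also checkable. Taking Borges’s alphabet of 25 orthographic symbols (as in the story), the number of distinct books is 25 raised to the number of character positions per book, and its order of magnitude falls out of a base-10 logarithm:

```python
import math

# Dimensions of a book in Borges's Library of Babel, as cited by Pinker.
chars_per_line, lines_per_page, pages = 80, 40, 410
positions = chars_per_line * lines_per_page * pages   # 1,312,000 character slots
symbols = 25                                          # Borges's 25 orthographic symbols

# Number of books = 25 ** 1,312,000; log10 gives the count of decimal digits.
digits = positions * math.log10(symbols)
print(f"about 10^{digits:,.0f} books")   # roughly 1 followed by 1.8 million zeroes
```

The logarithm comes out at just over 1.8 million, matching Pinker’s ‘around 10^1,800,000’: vast beyond imagining, but finite.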

While seemingly infinite, it is the (albeit massively) finite nature of language that draws us back to an understanding of the statistical turn in AI development. While at a human level we can have no sense of the magnitude of words, sentences and narratives, for high-performance computing (and potentially more so with quantum computing) the sums are within range. An important caveat, however, is the finding of diminishing returns in the probabilistic modelling of natural language. Researchers at DeepMind, for example, have investigated the effects of scale across six language models, ranging from 44 million to 280 billion parameters:

‘…while there were significant gains from scale in some areas, others, such as logical and mathematical reasoning, saw much less benefit. The company now says that scale alone isn’t how it intends to reach its goal of creating a realistic language model that can understand complex logical statements, and has released a model called Retrieval-Enhanced Transformer (RETRO) that researches information rather than memorising it. […] RETRO has 7 billion parameters, 25 times fewer than GPT-3, but can access an external database of around 2 trillion pieces of information. DeepMind claims that the smaller model takes less time, energy and computing power to train but can still rival the performance of GPT-3.’

Matthew Sparkes, 2021
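The underlying idea of retrieval over memorisation can be caricatured in a few lines. The sketch below is a toy of my own devising, not DeepMind’s RETRO (which retrieves neighbouring text chunks via learned embeddings over a 2-trillion-token store): the point is simply that a small ‘model’ can answer about facts it never stored internally by consulting an external database at query time.

```python
# Toy illustration of retrieval-augmentation: consult an external store at
# query time rather than holding every fact in the model's own parameters.
# This is NOT DeepMind's RETRO architecture; keys and passages are invented.
database = {
    "antigonus": "Antigonus exits, pursued by a bear, in The Winter's Tale.",
    "retro": "RETRO pairs a 7-billion-parameter model with a 2-trillion-token store.",
}

def retrieve(query: str) -> str:
    """Return the stored passage whose key appears in the query, if any."""
    for key, passage in database.items():
        if key in query.lower():
            return passage
    return "no relevant passage found"

print(retrieve("What happens to Antigonus?"))
```

The design point carries over: the retriever can be small and cheap because knowledge lives in the database, which can be updated without retraining anything.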

More to follow… [Exit, pursued by a Teams meeting]

References

Aubrey Whitlock (2018) ‘Staging an Exit (Pursued by a Bear)’, American Shakespeare Center Blog [available online]

Steven Pinker (2015 [1999]) Words and Rules: The Ingredients of Language. Basic Books.

Matthew Sparkes (2021) ‘DeepMind experiment shows AI must grow smarter, not just bigger’, New Scientist [available online]