Our Tangled Web

Narrative and the future of artificial intelligence

Originally published by H+ Magazine (http://bit.ly/WV0mD2)

Narrative in the modern world

Human culture is dominated by the concept of narrative. Everywhere from the boardroom to the playground, from casual conversation to complex political discourse, the idea of the story is key. How we process and build these narratives is vital to our understanding of intelligence and human civilization, and yet, it seems, we are increasingly at a loss to understand this web of stories that surrounds us.

The last twenty years have seen an ever greater shift toward a knowledge-based society, and with it a corresponding increase in the number of man-hours spent focusing on, manipulating and refining the basic building blocks of narrative.

From the business executive who needs to create a compelling narrative to the blogger reviewing the latest blockbuster, at the core of many jobs today is creating a good story. The vast majority of skill sets in the modern Western world now rely on communication skills and, by implication, the need to tell a story. As Wikipedia defines it, ‘a narrative (or story) is any account that presents a connected series of events’ – in essence, the ability to discuss and interpret ideas and turn them into a coherent whole.

Figure 1: ISBN output for fiction titles, 2002–2010

Figure 1 above shows the increase in the output of fiction titles over recent years: a doubling in the space of eight years. This is impressive enough on its own, but taken alongside the growth in blogs and other online media over the same period it points to a far more dramatic increase in total narrative output.

Narrative as a guide to intelligence

A few hundred years ago a good proportion of mankind relied on manual dexterity and strength to generate income. Now the ability to turn out a well-structured sentence is of much more economic value than the ability to operate a loom or blow glass. Why is this growth in narrative important? Is it merely a shift in market demand, or does it make a difference to our lives? What importance should we attach to this skill to create and interpret a story?

One key reason that narrative is important is that it is a primary method by which we, as humans, gauge the intelligence of others and formulate intellectual responses to complex hypothetical situations. In his book Tell Me a Story: Narrative and Intelligence, the leading artificial intelligence theorist Roger Schank argues that human society has long considered two aspects of narrative to be key indicators of intelligence. He states: “We assess the intelligence of others on the basis of the stories they tell and on the basis of their receptivity to our own stories.” (Schank, 1995) [1]

It is not only human intelligence that we gauge through storytelling. In the earliest attempts to measure machine intelligence, such as the Turing test, the idea of narrative and discussion is essential. In his seminal paper ‘Computing Machinery and Intelligence’ (Turing, 1950) [2], Turing proposes not a quantitative measurement of intelligence but an imitation game.

The interrogation he proposes is designed to see whether a machine can fool a human into ascribing to it attributes that they would otherwise attribute to a person. Turing makes no mention of measuring intelligence, offers no empirical certainty, only a dialogue between human and machine. The human judge attempts to decide whether the respondent is human or machine based on the sum of its answers. The narrative the machine tells through its answers, combined with the judge’s understanding of that narrative, decides whether the machine is good enough at imitation. Only if it is can the human say that they believe the respondent is intelligent.
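To make the shape of the game concrete, it can be sketched as a simple dialogue protocol: the judge poses questions, collects the hidden respondent’s answers, and passes judgement only on the resulting transcript, never on any direct measurement of intelligence. The sketch below is purely illustrative and not drawn from Turing’s paper; the function names (run_imitation_game, ask_respondent, judge_verdict) and the toy judge are assumptions made for clarity.

```python
# A minimal, purely illustrative sketch of the imitation game as a dialogue
# protocol. Nothing here comes from Turing's paper; the names and structure
# are assumptions made to show where the "narrative" actually lives.

def run_imitation_game(judge_questions, ask_respondent, judge_verdict):
    """Run one round of the game and return the judge's verdict.

    judge_questions: list of questions the judge wants to ask.
    ask_respondent:  callable returning the hidden respondent's answer.
    judge_verdict:   callable mapping the transcript to 'human' or 'machine'.
    """
    transcript = []
    for question in judge_questions:
        answer = ask_respondent(question)
        transcript.append((question, answer))
    # The judge never measures intelligence directly; the decision rests
    # entirely on the narrative built up by the answers.
    return judge_verdict(transcript)


if __name__ == "__main__":
    # Toy respondent and judge standing in for a real interrogation.
    canned = {"Do you dream?": "Sometimes, about the sea."}
    respondent = lambda q: canned.get(q, "I'm not sure how to answer that.")
    verdict = lambda t: "human" if any("sea" in a for _, a in t) else "machine"

    print(run_imitation_game(["Do you dream?"], respondent, verdict))
```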

“Narrative is not a single entity, or a single tightly related set of concepts… Narrative can mean many things.”

(Mateas, 2003)

As Turing states, ‘It might be urged that when playing the “imitation game” the best strategy for the machine may possibly be something other than imitation of the behaviour of a man. This may be, but I think it is unlikely that there is any great effect of this kind. In any case there is no intention to investigate here the theory of the game, and it will be assumed that the best strategy is to try to provide answers that would naturally be given by a man.’

This imitation game, of course, is no different from the process (or processes) we use to judge the intelligence of other humans around us (K. Dautenhahn, 1998) [3]. We are constantly engaged in dialogue, constantly struggling to piece together a consistent narrative that gives us clues as to the intelligence of those around us. However, although there is evidently a link between narrative and our assessment of intelligence, we currently have little insight into what form of causal connection exists.

Underneath the narrative

Most people would agree that storytelling is an important part of how we, as humans, interpret the world around us, but how exactly does the brain process narrative? If we read and understand a text, are we making an assessment of the intelligence of the author, or of something else? Are there fundamental differences in the way that different people approach the task of understanding a narrative?

Debate rages in academia over how exactly we define ‘narrative’, and while there is some evidence for ‘narrative pathways’ in the brain, much is still unknown. Systems that can model the human capacity for narrative are clearly a step in this direction, and recent work developing semantic models for artificial intelligence has produced some evidence. However, semantic models are currently nowhere near the complexity of human behaviour (Mateas, 2003). [4]

Given the importance of semantic context for AI, what can we say about the way the human brain physically processes stories? When presented with a new narrative, do we all essentially approach it in the same way? Is the process of understanding fixed – in the same way that vision is largely fixed by the mechanical processes of the eye? Or is narrative merely a word that covers a disparate set of responses, a process that evolves and changes, such that narrative understanding depends only on context and can never be said to be intrinsic?

When we talk about vision there is a certain physical basis that defines what sight is; while people may have hallucinations or other aberrations in their vision, there is a neuro-physical basis that we generally agree underlies what we ‘see’. The mechanical processes that govern the cornea, the retina and the optic nerve are fairly well understood and provide a standard framework for talking about sight. Are there analogous processes in the brain when we understand narrative?

To date a great deal of work in AI has been devoted to finding such a standard, but the complexity of human strategies in relation to narrative suggests there is little hope of a single ‘narrative pathway’. If we accept that narrative, in its broadest sense, is ‘an account that connects a series of events’, then the scope is hard to constrain. Narrative effectively becomes the sum of our understanding of the world and, in this conception, the only limit to narrative is the constraint of our own awareness.

The role of technology in shaping narrative

While there may be no single neuro-physical basis for our understanding of narrative there is plenty of evidence to suggest that environmental factors affect our understanding. One environmental factor that has affected our sense of narrative more than any other in recent history is technology.

Most people would agree that there are clear differences in the way that humans from different cultures, geographical regions or educational backgrounds react to stories. A short parable about the prophet Muhammad will tend to have more resonance with a person raised as a Muslim than with someone who was not. A piece of feminist literature is statistically more likely to be read by a woman than by a man.

Until recently (that is, no more than a few thousand years ago), human understanding of narrative relied solely on body language and aural interpretation. Only with the first technology that allowed a scribe to chisel a narrative into stone did the course of narrative culture become closely entwined with technology.

From its earliest uses, technology has provided a crutch for our (human) understanding and, consequently, for our expectations of intelligence. Four hundred years ago knowledge could be equated with intelligence fairly simply. In 1600, the fact that someone had gained specific knowledge of a topic (say, the operation of a loom) implied they had invested significant time in understanding the problem space. Now any problem space can be accessed very easily via Google, such that knowledge on its own is generally insufficient to claim intelligence. A five-year-old can easily find and regurgitate a compelling explanation of Einstein’s relativity from the internet, but we would be right to hesitate in calling the child intelligent on this evidence alone until we had probed a little deeper.

We have all seen circumstantial evidence that technology affects our attention span and that we now take in stories in much more compressed forms. [5] Nowadays we tend to furnish our understanding by searching on Google rather than spending time puzzling over meanings. A side effect of this reduced attention span is that we cross-reference our stimuli many times more frequently than previous generations did.

If we are reading an article and a thought occurs to us, it is unlikely we will wait long before jumping to a new tab and searching for information on that point. If we are reading a novel-length work, it is highly unlikely that we will get through it in one sitting without referencing other material that will influence our ‘reading’ of the text. One could say that this interplay is simply a diversion that comes about because of the easy availability of resources, but it has a much more profound effect on the way we understand.

We assume that the way we understand narrative may be affected by our knowledge but that the way we consume it has relatively little impact. We may read The Odyssey on an iPad rather than a papyrus scroll, or hear a human voice relayed through a cinema sound system rather than in person, but we assume that the story itself is essentially the same as it has always been. After all, is the difference in consumption important? Surely it is the message that matters in any given text?

Yet if we read Homer’s Odyssey on an iPad, while sporadically checking Wikipedia, the process is radically different from anything that was possible even fifty years ago. Certain key points may remain with us, certain universal messages, but it is by no means certain that we understand them in the same way as we once did.

In reality our approach to understanding a narrative is, and always has been, evolving. The way we read today is drastically different from the way Oscar Wilde read, and that in turn is drastically different from the way Aristotle read.

When we are told a story in spoken language, most people immediately recognise the importance of the delivery. The way somebody tells a story is certainly very important to our understanding of both the narrative and the storyteller’s intelligence.

If Shakespeare’s ‘Hamlet’ is performed by idiots, not only do we judge the intelligence of the performers, but the words themselves lose their power to capture our attention. A poorly delivered Hamlet will appear little better than a poorly performed play by a third-rate playwright.

Will we recognize artificial intelligence?

One concept that has arisen recently in discussions of intelligence is the idea of an artificially intelligent singularity: the idea that computers and machines will reach a level of complexity that exceeds human capabilities, and that beyond this point they will be able to design ever superior versions of themselves, rapidly accelerating the growth of intelligence far beyond what we can imagine.

One question this discussion of artificially intelligent machines raises in the context of narrative is whether we, or any of our descendants, will ever be able to recognize them.

If we accept that narrative is a key indicator of intelligence then it is reasonable to ask what happens to our concept of intelligence when the way we understand narrative changes.

If we accept that our understanding of narrative is entwined with the technology we use to consume narrative then we face a difficult problem when it comes to recognizing any potential singularity event.

How do we recognize a super-intelligent machine? To answer this question we must surely use the same approach that we would take in deciding whether a human is intelligent: we seek to understand the narrative it creates, we ask it questions, we listen to the stories it spins, we try to gauge how it views the world.

When the subject is a human this is a fairly straightforward task, and something we do on a daily basis. Whenever we meet a new acquaintance or colleague, we subconsciously create a representation of their intelligence based on the stories they tell (and the stories others tell about them). We may check their Facebook page, read a blog post they have written, listen to them recount the joys of a recent trip to Paris. All of this we seamlessly join together to form a mental representation of the person.

When it comes to a machine the situation is far more complex. The modern machines we interact with are, in one sense, a single machine, conjoined by the internet. When we ask if a machine is intelligent we are forced to rely on tools that are, in effect, part of that machine itself. If our understanding of intelligence relies on us using technology to reach a conclusion, then in a sense the process of asking whether a machine is intelligent is inherently self-reflexive.

This is equivalent to asking a human outright if they are intelligent – whatever their answer, it tells us nothing about the real situation. A person may reply ‘Yes, I am intelligent’ or ‘No, I am not intelligent’, but they may be joking, obfuscating or simply being self-deprecating.

To put this another way, we cannot expect to see an emergent ‘artificial intelligence’ because such machines are already here, and they will only become more deeply embedded with us, not more distinct. It is not possible to recognize an intelligent machine because it will never exist on its own.

It is narrative that gives us a perspective on intelligence, and narrative is now intricately bound to technology and medium. Until relatively recently, on the scale of human civilization, the medium was human oration; now the concept of narrative is severely fragmented and likely to become more so. To understand narrative in the future we will be increasingly reliant on machines, and as a result intelligence will become less and less meaningful outside the context of this machine infrastructure.

Bibliography

[1] Schank, Roger C. Preface. Tell Me a Story: Narrative and Intelligence. Evanston, IL: Northwestern UP, 1995. xliii. Print.

[2] Turing, A. M. “Computing Machinery and Intelligence.” Mind LIX.236 (1950): 433–60. Print.

[3] Dautenhahn, K. “The Art of Designing Socially Intelligent Agents: Science, Fiction, and the Human in the Loop.” Applied Artificial Intelligence 12.7–8 (October–December 1998): 573–617.

[4] Koller, Veronika. “Michael Mateas and Phoebe Sengers (eds). 2003. Narrative Intelligence.” Studies in Language 29.1 (2005): 227–34. Print.

[5] http://www.insurance.lloydstsb.com/personal/general/mediacentre/homehazards_pr.asp
