
WELCOME TO THE ASOR BLOG
The American Schools of Oriental Research (ASOR) is the preeminent society for individuals interested in the archaeology of the eastern Mediterranean and the Biblical Lands. This blog is intended to facilitate ASOR’s mission “to initiate, encourage and support research into, and public understanding of, the cultures and history of the Near East from the earliest times.”
Augmented Reality, a New Horizon in Archaeology
By: Stuart Eve, University College London and L – P : Archaeology
Firstly, I would like to thank Jen Fitzgerald for asking me to contribute a guest post to the ASOR Blog. I am currently undertaking a doctoral thesis at the Institute of Archaeology, University College London – researching the middle ground between phenomenological, in situ landscape investigation and computer-based analysis. My area of study is the British Bronze Age, but I hope that the techniques and methodologies I discuss will be relevant to the ASOR audience as well.
Archaeology has been a forerunner in the attempt to use Geographic Information Systems (GIS) to address the challenges of recreating perception and social behaviour within a computer environment. However, these approaches have traditionally been based on the visual aspect of perception, and analysis has usually been confined to the computer laboratory. In contrast, the latest archaeological theories and methods involving phenomenological analysis of landscapes and past environments are normally carried out within the landscape itself, and computer analysis away from the landscape in question is often seen as anathema to such approaches. I would argue that the importance of the embodied experience to any discussion of past people cannot be overstated. In this post I want to introduce some of the problems that currently exist when trying to marry these two approaches, and to suggest that a possible solution may be found in Mixed Reality techniques.
What is Mixed Reality?
Most people are probably familiar with Virtual Reality, the creation of an entirely virtual world that can be explored within a computer environment. Heim, writing in 1993, notes that "… for us, technology and reality are beginning to merge" [1]. This is an important observation, as modern technology is opening up avenues of exploration that have not been available in the past. The term Virtual Reality now covers only one aspect of so-called virtuality. As technology has advanced, we have become able to merge computer-generated 'reality' with the real world, so-called Mixed Reality (MR) [2]. This has led to the creation of a scale of virtuality (the Reality–Virtuality continuum), first proposed by Milgram and Colquhoun [3]. The scale runs from the Real Environment (RE) through Augmented Reality (AR) and Augmented Virtuality (AV) to a full Virtual Environment (VE). Virtual Reality is no longer seen as the only alternative to real life; instead it is the polar opposite of the Real Environment, with many dimensions in between.
For me, the most exciting part of this scale is the Augmented Reality section: this is where we can merge the real world with virtual objects. I realise this might sound like a strange concept, but AR has been used for many years for a number of different purposes. AR delivery normally involves overlaying a live video feed, from a web-camera, a Head-Worn Display (HWD), or a mobile device, with virtual objects. Applications of the technology are wide-ranging: interactive greeting cards; product advertising, such as interactive brochures that allow you to 'drive' a car before buying it; visualisation of computer-generated GIS data overlaid onto actual locations [4]; and indoor and outdoor gaming [5]. Even the heads-up displays (HUDs) in modern aircraft are a form of augmented reality, overlaying information from the aircraft's systems onto the pilot's display. The introduction of products such as Google Glass (link – www.google.com/glass), which provides a personal HUD via a pair of special glasses, is now bringing a form of AR technology to the masses.
Essentially, I can hold up my smartphone or tablet while I am on site: its built-in GPS chip knows where I am, its built-in compass knows which way I am facing, and its built-in camera sees what I am looking at. Using this information it can overlay virtual content, such as labels or 3D models, directly onto the video feed with the correct position and perspective. I can see directions to the nearest Starbucks, or I can view a full-scale reconstruction of the Colosseum while standing in front of the ruins in Rome. A great example of this in use at the moment is the Museum of London's Streetmuseum application (link – http://www.museumoflondon.org.uk/Resources/app/you-are-here-app/home.html), which allows you to view archival photographs of London overlaid onto your smartphone screen when you visit the real location.
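To make this concrete, here is a minimal sketch (in Python, not taken from any of the apps mentioned) of the core geometry such an overlay relies on: compute the compass bearing from the device's GPS fix to a georeferenced virtual object, and draw the object only if that bearing falls within the camera's horizontal field of view. The coordinates and the 60-degree field of view are illustrative assumptions.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees from north."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def in_camera_view(device_heading, target_bearing, fov_deg=60):
    """True if the target's bearing lies within the camera's horizontal field of view."""
    # Wrap the angular difference into [-180, 180] before comparing.
    diff = (target_bearing - device_heading + 180) % 360 - 180
    return abs(diff) <= fov_deg / 2

# Standing near the Colosseum facing east (heading 90°), with a virtual
# object placed due east of us (illustrative coordinates):
b = bearing_deg(41.8902, 12.4922, 41.8902, 12.4940)
print(in_camera_view(90, b))  # True: the object is ahead, so it gets drawn
```

The same test run each video frame, plus the distance to the object (for scaling), is enough to place a label or model with roughly correct position and perspective.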

AR in Archaeology
Augmented Reality has been used in archaeology mainly to provide a form of augmented tourism, an extension of the classic audio tour. ARCHEOGUIDE (link – http://archeoguide.intranet.gr/), released in 2001, is an early example of an AR device being used to aid a tourist's experience of an archaeological site. When the tour begins, each user is asked to generate a profile outlining their interests and background; a personalised tour is then created for that user to follow. The user is given an AR Head-Worn Display (HWD), and reconstructions of the ancient buildings are overlaid directly onto the real world.

The Cultural Heritage Experiences through Socio-personal Interactions and Storytelling project (link – http://www.chessexperience.eu/) takes a similar approach: using slightly more sophisticated profiling, users are led on a personalised tour through the new Acropolis Museum, with the AR content delivered through a handheld tablet. George Papagiannakis et al. [6] produced one of the best-known cultural heritage AR applications at the site of Pompeii. Using a tracked video-see-through HWD and dynamic modelling of the real and virtual worlds, Papagiannakis and his team were able to insert virtual characters into various buildings within Pompeii and enact a real-time storytelling scenario.

Beyond the virtual tourism aspects, I am interested in using AR technology to explore archaeological sites from a phenomenological perspective, and in turn to explore geographical data held within a GIS directly in situ. AR is unique in that it allows us to subtly add small elements of virtual content to the real world. It is a step beyond the blinking red location dot of Google Maps or the entirely virtual world of VR: out of the abstraction of the flat digital map or the entirely false rendered 3D world, and into the real world. With the limited addition of data from the GIS, the landscape itself is used as a canvas, enhancing the feeling of presence and immersion. The introduction of virtual elements can be kept to a minimum, and the landscape itself provides the bulk of the experience: the way in which the sloping ground tires out your legs; the feeling of shelter gained from standing in the lee of a hill; the sense of perspective when vistas open up in front of you as you explore. As archaeologists, we are striving to get closer to what it was to be a human living in the past. These elements are virtually impossible to recreate within a traditional GIS, yet they are vital both to the way humans experience space and give it meaning, and to the experience of that specific landscape.
Using AR to Investigate an Archaeological Site
Using a combination of a gaming engine (Unity3D – link http://www.unity3d.com), GIS software (I use QGIS – link http://www.qgis.com), an AR plugin (Qualcomm's Vuforia – link http://www.vuforia.com), and some custom scripting, it is possible to create a workflow that allows geographic data to be explored in situ while I am exploring the site itself [7].
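As a rough illustration of the kind of custom scripting involved (a hypothetical sketch, not my actual Unity3D code), one recurring step is re-projecting point data exported from the GIS into the local, metre-based coordinate system a game engine expects. A simple equirectangular approximation is adequate at the scale of a single site; the origin coordinate below is an arbitrary placeholder, not a real survey point.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius

def latlon_to_local(lat, lon, origin_lat, origin_lon):
    """Project a WGS84 point to metres east (x) and north (y) of a chosen
    site origin, using an equirectangular approximation that is accurate
    enough over the few hundred metres of a single settlement."""
    x = math.radians(lon - origin_lon) * EARTH_RADIUS_M * math.cos(math.radians(origin_lat))
    y = math.radians(lat - origin_lat) * EARTH_RADIUS_M
    return x, y

# Placeholder site origin; a point exported from the GIS 0.0005° further north
# lands roughly 56 m up the y-axis of the game-engine scene.
origin = (50.5530, -4.5860)
x, y = latlon_to_local(50.5535, -4.5860, *origin)
print(round(y))
```

In a game engine these local x/y values would typically become the horizontal coordinates of a placed model, with elevation supplied separately from a terrain model.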
As touched on in the introduction, I am using AR to reinvestigate the Bronze Age settlement on Leskernick Hill, Bodmin Moor, Cornwall, UK. The settlement consists of approximately 50 roundhouse structures that can be seen only from their sparse surface remains (most survive to about one course of stone walling). Previous investigations on the site by Barbara Bender, Sue Hamilton, and Christopher Tilley featured an in-depth phenomenological exploration; in particular, they were interested in the views from the doorways of the roundhouses, but these were in some ways hampered by the lack of original physical remains to occlude the views [8]. By using AR we are able to reinsert the full-sized roundhouses into the landscape and see how their crowding would have affected the overall views across it.

As can be seen, the geographic point data in this case can be displayed either as spherical markers or as full-scale reconstructions. It is even possible to sit inside one of the houses and enjoy the view through the doorway.

From a landscape perspective, the AR approach is especially effective from far away, as can be seen below, where the Leskernick Hill settlement is augmented into a view from a nearby hill.

This allows us to play with the feelings of approaching the site and walking through the settlement, and gives us a clearer impression of what it would have been like when the settlement was fully built.
Finally, we should not forget the other senses. For example, by providing the user with headphones and creating sound zones within the GIS, it is possible to insert sounds into the landscape that play only when the user is within a certain geographic space. These sounds could be anything; I have augmented the sounds of people talking within the roundhouses or working in the valleys. As the overall AR experience is controlled via a gaming engine, the volume can be adjusted depending on how far the user is from the sound source. The same is true of 'smell zones': I have created the Dead Man's Nose (link – http://www.dead-mens-eyes.org/archaeology-gis-and-smell-and-arduinos/), a simple device that is worn around the neck and emanates a specific odour depending on your geographic location. So now, when I am sitting in a roundhouse, I have an augmented view through the doorway, I hear people working and talking around me, and I also smell the meat cooking on the roundhouse fire.
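The sound-zone logic can be sketched in a few lines (shown here in Python as a stand-in for the gaming-engine implementation; the zone radius, rolloff distance, and coordinates are invented for illustration): full volume inside a zone, a linear fade over a rolloff band, and silence beyond it.

```python
import math

def zone_volume(listener_xy, source_xy, radius_m, rolloff_m):
    """Volume in [0, 1] for a geofenced sound: full volume inside radius_m,
    fading linearly to silence over a further rolloff_m, silent beyond."""
    dist = math.hypot(listener_xy[0] - source_xy[0],
                      listener_xy[1] - source_xy[1])
    if dist <= radius_m:
        return 1.0
    if dist >= radius_m + rolloff_m:
        return 0.0
    return 1.0 - (dist - radius_m) / rolloff_m

# A listener 15 m from a roundhouse hearth, with a 10 m zone fading out
# over a further 20 m, hears the sound at three-quarters volume.
print(zone_volume((15, 0), (0, 0), radius_m=10, rolloff_m=20))  # 0.75
```

Evaluating this each frame against the listener's projected GPS position, and feeding the result to the engine's audio source, gives the impression of walking into and out of pockets of past sound.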
Throughout this post I have only been able to touch upon what might be possible with Augmented Reality, now and in the future. The technology is still in its infancy, and top-end AR devices remain prohibitively expensive, especially for archaeologists. However, with as little as a GIS and an iPad, it is already possible to walk out onto your site and augment it with the sights, sounds, and smells of past people. By linking the data to a GIS, the system goes beyond being a simple tourist or wayfinding device and opens up the possibility of exploring the results of spatial modelling, changing chronologies, and different types of building reconstruction. The key to all of this, however, is that with AR the real world becomes a canvas, and we have the ability to explore our data and our site from a body-centred perspective rather than just from an office chair in front of a computer screen.
If anyone would like any further information about using AR in Archaeology or has any questions please feel free to contact me (s.eve@ucl.ac.uk) or alternatively take a look at my blog (http://www.dead-mens-eyes.org).
Stuart Eve is a PhD candidate at University College London and a partner at L – P : Archaeology. You can find his blog at Dead Men's Eyes (http://www.dead-mens-eyes.org/) and his tweets at @stueve (https://twitter.com/stueve).
[1] Heim, M., 1993. The Metaphysics of Virtual Reality, Oxford: Oxford University Press.
[2] Ohta, Y. & Tamura, H., 1999. Mixed Reality: Merging Real and Virtual Worlds 1st ed., Springer.
[3] Milgram, P. & Colquhoun, H., 1999. A Taxonomy of Real and Virtual World Display Integration. In Y. Ohta & H. Tamura, eds. Mixed Reality: Merging Real and Virtual Worlds. Springer, pp. 5–30.
[4] Ghadirian, P. & Bishop, I.D., 2008. Integration of augmented reality and GIS: A new approach to realistic landscape visualisation. Landscape and Urban Planning, 86(3-4), pp.226–232.
[5] Bernardes, J. et al., 2008. Augmented Reality Games. In O. Leino, H. Wirman, & A. Fernandez, eds. Extending Experiences. Lapland University Press.
[6] Papagiannakis, G. et al., 2005. Mixing Virtual and Real Scenes in the Site of Ancient Pompeii. Computer Animation and Virtual Worlds, 16(1), pp.11–24.
[7] For a detailed walkthrough of the process, please visit my blog at http://www.dead-mens-eyes.org
[8] Bender, B., Hamilton, S. & Tilley, C., 2007. Stone Worlds: Narrative and Reflexivity in Landscape Archaeology (Publications of the Institute of Archaeology, University College London) illustrated edition., Left Coast Press Inc.