
History of Mobile Augmented Reality

 

This web page summarizes the major milestones in mobile Augmented Reality. Mobile Augmented Reality has evolved considerably over the last decade, as has the interpretation of what mobile Augmented Reality actually is. The first instances of mobile AR can be associated with the development of wearable AR, in the sense of experiencing AR while on the move (mobile as in motion). With the transformation and miniaturization of physical devices and displays, the concept of mobile AR shifted towards the notion of the "mobile device", i.e., AR on a mobile device. In this history of mobile AR we consider both definitions and the evolution of the term over time.

The list was initially compiled by Tobias Langlotz, Daniel Wagner, Alessandro Mulloni and Lukas Gruber for the ISMAR society in 2009. An updated version was created by Tobias Langlotz, Raphael Grasset, and Daniel Wagner in 2013.

Permission is granted to copy and modify, but please reference this webpage as:

Tobias Langlotz, Daniel Wagner, Raphael Grasset, Alessandro Mulloni, Lukas Gruber, History of Mobile Augmented Reality, http://www.icg.tugraz.at/Members/langlotz/history-of-mobile-ar, 2013.

 

Icon Legend

Research: Paper
Mobile PC: Notebook
Mobile Phone: Phone
Hardware: Gadget
Standard: Standard
Game: Game
Tool: Tool

 

Please notify Tobias Langlotz if you find errors on this page.

 

 

1968


HardwarePaperIvan Sutherland creates the first augmented reality system, which is also the first virtual reality system. It uses an optical see-through head-mounted display that is tracked by one of two different 6DOF trackers: a mechanical tracker and an ultrasonic tracker. Due to the limited processing power of computers at that time, only very simple wireframe drawings could be displayed in real time.

I. Sutherland, “A Head-Mounted Three Dimensional Display”, Proceedings of Fall Joint Computer Conference, 1968, pp. 757-764.

Sutherland


1972


HardwareThe first conceptual tablet computer, the Dynabook, is proposed in 1972 by Alan Kay. The Dynabook was conceived as a personal computer for children, with the form factor of a tablet with a mechanical keyboard (a design quite similar to that of the One Laptop per Child project started in 2005). The Dynabook is widely recognized as a precursor of tablet computers, decades before the iPad.

Kay, Alan C. "A Personal Computer for Children of All Ages”, August 1972.

Dynabook


1973


HardwareThe first handheld mobile phone was presented by Motorola and demonstrated in April 1973 by Dr. Martin Cooper. The phone, named DynaTAC (Dynamic Adaptive Total Area Coverage), supported only 35 minutes of talk time.

Radio Telephone System, U.S. Patent Application U.S. 3,906,166, September 16, 1975.

DynaTac


1982


NotebookThe first laptop, the GRiD Compass 1101, is released; it was also the first computer to use a clamshell design. The GRiD Compass had an Intel 8086 CPU, 350 Kbytes of memory and a display with a resolution of 320x240 pixels, which was extremely powerful for that time and justified the enormous cost of 10,000 USD. However, its weight of 5 kg made it hardly portable.

GRiD Compass 1101


1992


PaperTom Caudell and David Mizell coin the term "augmented reality" to refer to overlaying computer-presented material on top of the real world. Caudell and Mizell discuss advantages of AR over VR, such as requiring less processing power since fewer pixels have to be rendered. They also acknowledge the increased registration requirements needed to align the real and the virtual.

T. P. Caudell, and D. W. Mizell, “Augmented Reality: An Application of Heads-Up Display Technology to Manual Manufacturing Processes”, Proceedings of 1992 IEEE Hawaii International Conference on Systems Sciences, 1992, pp 659-669.

HUDset

PhoneAt COMDEX 1992, IBM and BellSouth introduce the first smartphone, the IBM Simon Personal Communicator, which is released in 1993. The phone has 1 Megabyte of memory and a B/W touch screen with a resolution of 160x293 pixels. The IBM Simon works as a phone, pager, calculator, address book, fax machine, and e-mail device. It weighs 500 grams and costs 900 USD.

IBM Simon Personal Communicator

IBM Simon


1993


NotebookPaperLoomis et al. develop a prototype of an outdoor navigation system for the visually impaired. They combine a notebook with a differential GPS receiver and a head-worn electronic compass. The application uses data from a GIS (Geographic Information System) database and provides navigational assistance using an "acoustic virtual display": labels are spoken using a speech synthesizer and played back at the correct locations within the auditory space of the user.

J. Loomis, R. Golledge and R. Klatzky, “Personal guidance system for the visually impaired using GPS, GIS, and VR technologies”, Proceedings of Conference on Virtual Reality and Persons with Disabilities, 1993.

HardwarePaperFitzmaurice creates Chameleon, a key example of displaying spatially situated information with a tracked hand-held device. In his setup the output device consists of a 4" screen connected to a video camera via a cable. The video camera records the content of a Silicon Graphics workstation's large display in order to show it on the small screen. Fitzmaurice uses a tethered magnetic tracker (Ascension Bird) for registration in a small working environment. Several gestures plus a single button allow the user to interact with the mobile device. Chameleon's mobility was strongly limited by the cabling, and it did not augment reality in the sense of overlaying objects on a video feed of the real world.

G. W. Fitzmaurice, "Situated information spaces and spatially aware palmtop computers", Communications of the ACM, Special issue on computer augmented environments: back to the real world, 1993, vol. 36, issue 7, pp. 39-49.

Chameleon1993

StandardIn December 1993 the Global Positioning System (GPS, official name "NAVSTAR-GPS") achieves initial operational capability. Although GPS was originally launched as a military service, nowadays millions of people use it for navigation and other tasks such as geocaching or Augmented Reality. A GPS receiver calculates its position by carefully timing the signals sent by the constellation of GPS satellites. The accuracy of civilian GPS receivers is typically in the range of 15 meters. Higher accuracy can be achieved with Differential GPS (DGPS), which uses correction signals from fixed, ground-based reference stations.

Global Positioning System

Global Positioning System
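To illustrate the positioning principle mentioned above, the following sketch solves for a receiver position and clock bias from four pseudoranges using a few Gauss-Newton iterations. The satellite positions and pseudoranges are made up for illustration; a real receiver works with broadcast ephemerides and many additional corrections.

```python
# Sketch: solving for receiver position and clock bias from pseudoranges.
# Satellite positions and ranges below are made up for illustration only.
import numpy as np

# Hypothetical satellite positions (ECEF, metres) and measured pseudoranges.
sats = np.array([
    [15600e3,  7540e3, 20140e3],
    [18760e3,  2750e3, 18610e3],
    [17610e3, 14630e3, 13480e3],
    [19170e3,   610e3, 18390e3],
])
pseudoranges = np.array([21110e3, 22010e3, 21710e3, 22080e3])

x = np.zeros(3)   # initial receiver position guess (centre of the Earth)
b = 0.0           # receiver clock bias expressed in metres (c * dt)

for _ in range(10):                                   # Gauss-Newton iterations
    ranges = np.linalg.norm(sats - x, axis=1)
    residuals = pseudoranges - (ranges + b)
    # Jacobian: unit vectors from satellites to receiver, plus a clock-bias column.
    J = np.hstack([(x - sats) / ranges[:, None], np.ones((len(sats), 1))])
    delta, *_ = np.linalg.lstsq(J, residuals, rcond=None)
    x += delta[:3]
    b += delta[3]

print("estimated position (m):", x, " clock bias (m):", b)
```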

HardwareThe Apple Newton MessagePad 100 is one of the earliest commercial personal digital assistants (PDAs). It is equipped with a stylus and handwriting recognition and features a black-and-white screen with a resolution of 336x240 pixels.

The Apple Message Pad

Apple Newton


1994


HardwareSteve Mann starts wearing a webcam, which he continues to do for almost two years. From 1994 to 1996 Mann wore a mobile camera plus display for almost every waking minute. Both devices were connected to his website, allowing online visitors to see what Steve was seeing and to send him messages that would show up on his mobile display. This hardware also allowed him to conduct experiments on mediated reality by changing the appearance of reality.

S. Mann, “Wearable Wireless Webcam,” personal WWW page.

Mann, S. (1994), "Mediated Reality. " M.I.T. M.L. Technical Report 260, Cambridge, Massachusetts.

StandardPaul Milgram and Fumio Kishino write their seminal paper "Taxonomy of Mixed Reality Visual Displays" in which they define the Reality-Virtuality Continuum. Milgram and Kishino describe a continuum that spans from the real environment to the virtual environment. In between lie Augmented Reality, which is closer to the real environment, and Augmented Virtuality, which is closer to the virtual environment. Today Milgram's continuum and Azuma's definition (1997) are commonly accepted as defining Augmented Reality.

P. Milgram and F. Kishino, "Taxonomy of Mixed Reality Visual Displays", IEICE Transactions on Information and Systems, 1994, pp. 1321-1329.

Milgrams Continuum


1995


HardwarePaperJun Rekimoto and Katashi Nagao create the NaviCam, a tethered setup, similar to Fitzmaurice's Chameleon. The NaviCam also uses a nearby powerful workstation, but has a camera mounted on the mobile screen that is used for optical tracking. The computer detects color-coded markers in the live camera image and displays context sensitive information directly on top of the video feed in a see-through manner.

J. Rekimoto and K. Nagao, “The World through the Computer: Computer Augmented Interaction with Real World Environments”, Proceedings of the 8th annual ACM symposium on User interface and software technology (UIST '95), 1995, pp. 29-36.

NaviCam

PaperBenjamin Bederson introduces the term Audio Augmented Reality, presenting a system that augments the auditory modality. The prototype, part of a museum guide, uses a MiniDisc player that plays audio information based on the tracked position of the user.

Bederson, B.B. Audio augmented reality. Conference companion on Human factors in computing systems - CHI ’95, ACM Press (1995), 210–211.


1996


PaperJun Rekimoto presents 2D matrix markers (square-shaped barcodes), one of the first marker systems to allow camera tracking with six degrees of freedom.

Rekimoto, J. (1996). Augmented Reality Using the 2D Matrix Code. In Proceedings of the Workshop on Interactive Systems and Software (WISS'96).

MatrixCode


1997


StandardRonald Azuma presents the first survey on Augmented Reality. In his publication, Azuma provides a widely acknowledged definition for AR, as identified by three characteristics:

  • it combines real and virtual
  • it is interactive in real time
  • it is registered in 3D.

R. Azuma, “A survey of augmented reality”, Presence: Teleoperators and Virtual Environments, 1997, pp. 355–385.

NotebookPaperSteve Feiner et al. present the Touring Machine, the first mobile augmented reality system (MARS). It uses a see-through head-worn display with integral orientation tracker; a backpack holding a computer, differential GPS, and digital radio for wireless web access; and a hand-held computer with stylus and touchpad interface.

S. Feiner, B. MacIntyre, T. Höllerer and A. Webster, “A touring machine: Prototyping 3D mobile augmented reality systems for exploring the urban environment”, Proceedings of First IEEE International Symposium on Wearable Computers (ISWC '97), 1997, pp 74–81. Cambridge, MA.

Touring Machine Touring Machine

NotebookPaperThad Starner et al. explore possible applications of mobile augmented reality, creating a small community of users equipped with wearable computers interconnected over a network. The explored applications include an information system for offices, people recognition and coarse localization with infrared beacons.

Starner, T., Mann, S., Rhodes, B., Levine, J., Healey, J., Kirsch, D., Picard, R.W., Pentland, A., Augmented Reality Through Wearable Computing, In Presence, Special Issue on Augmented Reality, 1997.

PhonePhilippe Kahn invents the camera phone, a mobile phone which is able to capture still photographs. Back in 1997, Kahn used his invention to share a picture of his newborn daughter with more than 2000 relatives and friends, spread around the world. Today more than half of all mobile phones in use are camera phones.

Camera Phone

HardwareSony releases the Glasstron, a series of optical HMDs (optionally see-through) for the general public. Consumer adoption remained rather small, but the affordable price made the Glasstron popular in AR research labs and for the development of wearable AR prototypes.

Sony Glasstron


1998


NotebookPaperBruce Thomas et al. present "Map-in-the-hat", a backpack-based wearable computer that includes GPS, electronic compass and a head-mounted display. At this stage the system was utilized for navigation guidance, but it later evolved into Tinmith, an AR platform used for several other AR projects.

B. H. Thomas, V. Demczuk, W. Piekarski, D. Hepworth and B. Gunther, “A wearable computer system with augmented reality to support terrestrial navigation”, Proceedings of Second IEEE International Symposium on Wearable Computers (ISWC '98), 1998, pp. 168-171.

Tinmith


1999


PaperHirokazu Kato and Mark Billinghurst present ARToolKit, a pose tracking library with six degrees of freedom, using square fiducials and a template-based approach for recognition. ARToolKit is available as open source under the GPL license and is still very popular in the AR community.

H. Kato and M. Billinghurst, Marker tracking and HMD calibration for a video-based augmented reality conferencing system, Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR 99), 1999, pp. 85-94.

ARToolKit
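The core of square-fiducial tracking as popularized by ARToolKit is to find the four corners of a known marker in the camera image and recover the 6DOF camera pose from their known physical layout. The sketch below illustrates this pipeline with OpenCV's ArUco markers as a stand-in (not ARToolKit itself); it assumes OpenCV 4.7 or newer, a calibrated camera, and hypothetical intrinsics and input image.

```python
# Sketch of square-marker 6DOF pose estimation in the spirit of ARToolKit,
# using OpenCV's ArUco markers as a stand-in (OpenCV >= 4.7 assumed).
import cv2
import numpy as np

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # assumed intrinsics
dist = np.zeros(5)                                           # assumed no distortion
MARKER_SIZE = 0.05                                           # marker edge length in metres

# 3D corner coordinates of the marker in its own coordinate system (z = 0 plane),
# ordered like the detector output: top-left, top-right, bottom-right, bottom-left.
obj_pts = np.array([[-1, 1, 0], [1, 1, 0], [1, -1, 0], [-1, -1, 0]],
                   dtype=np.float32) * MARKER_SIZE / 2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

frame = cv2.imread("camera_frame.png")            # hypothetical input image
corners, ids, _ = detector.detectMarkers(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))

for marker_corners, marker_id in zip(corners or [], ids if ids is not None else []):
    # solvePnP recovers rotation and translation of the marker relative to the camera.
    ok, rvec, tvec = cv2.solvePnP(obj_pts, marker_corners.reshape(4, 2), K, dist)
    if ok:
        print(f"marker {int(marker_id[0])}: t = {tvec.ravel()} (m)")
```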

NotebookPaperTobias Höllerer et al. develop a mobile AR system that allows the user to explore hypermedia news stories that are located at the places to which they refer and to receive a guided campus tour that overlays models of earlier buildings. This was the first mobile AR system to use RTK GPS and an inertial-magnetic orientation tracker.

T. Höllerer, S. Feiner, and J. Pavlik, Situated documentaries: Embedding multimedia presentations in the real world, Proceedings of the Third IEEE International Symposium on Wearable Computers (ISWC 99), 1999, pp. 79-86.

Mars UI

NotebookPaperTobias Höllerer et al. present a mobile augmented reality system that includes indoor user interfaces (desktop, AR tabletop, and head-worn VR) to interact with the outdoor user. While outdoor users experience a first-person spatialized multimedia presentation via a head-mounted display, indoor users can get an overview of the outdoor scene.

T. Höllerer, S. Feiner, T. Terauchi, G. Rashid and D. Hallaway, Exploring MARS: Developing indoor and outdoor user interfaces to a mobile augmented reality system. Computers and Graphics, 1999, pp. 779–785.

Mars UI

PaperJim Spohrer publishes the Worldboard concept, a scalable infrastructure to support mobile applications ranging from low-end location-based services up to high-end mobile AR. In his paper, Spohrer also envisions possible application cases for mobile AR as well as their social implications.

J. C. Spohrer, Information in Places, IBM Systems Journal, 1999, pp. 602-628.

HardwareThe first consumer LBS device was the Palm VII, which supported only zip-code-based location services. Two years later, different mobile operators provided location-based services using private network technology.

The Palm VII

PhoneThe Benefon Esc! NT2002, the first GSM phone with a built-in GPS receiver, is released in late 1999. It had a black-and-white screen with a resolution of 100x160 pixels. Due to limited storage, the phone downloaded maps on demand. The phone also included a friend finder that exchanged GPS positions with other Esc! devices via SMS.

StandardThe wireless network protocols 802.11a and 802.11b, commonly known as WiFi, are defined. The original version of the standard, now obsolete, specified bit rates of only 1 or 2 megabits per second (Mbit/s), plus a forward error correction code.


2000


NotebookGameBruce Thomas et al. present ARQuake, an extension of the popular desktop game Quake. ARQuake is a first-person perspective application based on a 6DOF tracking system using GPS, a digital compass and vision-based tracking of fiducial markers. Users are equipped with a wearable computer system in a backpack, an HMD and a simple two-button input device. The game can be played indoors or outdoors; the usual keyboard and mouse commands for movement and actions are replaced by the user's movements in the real environment and the simple input interface.

B. Thomas, B. Close, J. Donoghue, J. Squires, P. De Bondi, M. Morris and W. Piekarski, “ARQuake: An Outdoor/Indoor Augmented Reality First Person Application”, Proceedings of the 4th International Symposium on Wearable Computers, 2000, pp. 139-146.

NotebookPaperRegenbrecht and Specht present mPARD, which uses analog wireless video transmission to a host computer that takes the burden of computation off the mobile hardware platform. The rendered and augmented images are sent back to the visualization device over a separate analog channel. The system can operate within 300 m outdoors and 30 m indoors, and the batteries allow for up to 5 hours of uninterrupted operation.

H. Regenbrecht and R. Specht, “A mobile Passive Augmented Reality Device”, Proceedings of the International Symposium on Augmented Reality (ISAR 2000), 2000, pp. 81-84.

MPard

PaperFritsch et al. introduce a general architecture for large-scale AR systems as part of the NEXUS project. The NEXUS model introduces the notion of an augmented world using distributed data management and a variety of sensor systems.

D. Fritsch, D. Klinec, S. Volz, NEXUS — positioning and data management concepts for location-aware applications, Computers, Environment and Urban Systems, Volume 25, Issue 3, 1 May 2001, Pages 279-291

NotebookPaperSimon Julier et al. present BARS, the Battlefield Augmented Reality System. The system consists of a wearable computer, a wireless network system and a see-through HMD. The system targets the augmentation of a battlefield scene with additional information about environmental infrastructure, but also about possible enemy ambushes.

S. Julier, Y. Baillot, M. Lanzagorta, D. Brown and L. Rosenblum, “BARS: Battlefield Augmented Reality System”, NATO Information Systems Technology Panel Symposium on New Information Processing Techniques for Military Systems, 2000.

PhoneSharp Corporation releases the first commercial camera phone to the public, the J-SH04. The phone's camera has a resolution of 0.1 megapixels.

PaperJulier et al. describe the problem of information overload and visual clutter in mobile Augmented Reality. They propose information filtering for mobile AR based on techniques such as physically-based methods, methods using the spatial model of interaction, rule-based filtering, and combinations of these methods to reduce information overload in mobile AR scenarios.

Julier, S.; Baillot, Y.; Brown, D.; Lanzagorta, M., "Information filtering for mobile augmented reality," IEEE Computer Graphics and Applications, vol. 22, no. 5, pp. 12-15, Sep/Oct 2002.


2001


HardwarePaperJoseph Newman et al. present the BatPortal, a PDA-based, wireless AR system. Localization is performed by measuring the travel time of ultrasonic pulses between specially built devices worn by the user, so-called Bats, and fixed receivers installed in the ceilings throughout the building. The system can support an HMD-based setup, but is better known for the BatPortal variant using a handheld device. Based on a fixed configuration of the carried PDA and the personal Bat worn by the user, the direction of the user's view is estimated, and a model of the scene with additional information is rendered onto the PDA screen.

J. Newman, D. Ingram and A. Hopper, “Augmented Reality in a Wide Area Sentient Environment”, Proceedings of the 2nd IEEE and ACM International Symposium on Augmented Reality (ISAR 2001), 2001, pp. 77-86.

NotebookPaperHara et al. introduce TOWNWEAR, an outdoor system that uses a fiber-optic gyroscope for orientation tracking. The high-precision gyroscope measures the 3DOF head orientation accurately with minimal drift, and the remaining drift is compensated by tracking natural features.

Townwear

PhonePaperJürgen Fruend et al. present AR-PDA, a concept for a wireless AR system together with a prototype of palm-sized hardware. Basic design ideas include augmenting real camera images with additional virtual objects, for example to illustrate the functionality of, and interaction with, commonly used household equipment.

J. Fruend, C. Geiger, M. Grafe and B. Kleinjohann, ”The Augmented Reality Personal Digital Assistant”, Proceedings of the Second International Symposium on Mixed Reality (ISAR 2001), 2001.

NotebookPaperReitmayr and Schmalstieg present a mobile, multi-user AR system. The ideas of mobile augmented reality and collaboration between users in augmented shared space are combined and merged into a hybrid system. Communication is performed using LAN and wireless LAN, where mobile users and stationary users are acting in a common augmented space.

G. Reitmayr, and D. Schmalstieg, “Mobile Collaborative Augmented Reality”, Proceedings of the International Symposium on Augmented Reality, 2001, pp. 114-123.

NotebookPhonePaperVlahakis et al. present Archeoguide, a mobile AR system for cultural heritage sites, built around the historical site of Olympia, Greece. The system contains a navigation interface, 3D models of ancient temples and statues, and avatars competing in the historical foot race in the ancient stadium. Communication is based on WLAN, while accurate localization is performed using GPS. The system supports a scalable range of mobile units, from a notebook-sized system with an HMD down to palmtop computers and Pocket PCs.

V. Vlahakis, J. Karigiannis, M. Tsotros, M. Gounaris, L. Almeida, D. Stricker, T. Gleue, I. Christou, R. Carlucci and N. Ioannidis, “ARCHEOGUIDE: First results of an Augmented Reality, Mobile Computing System in Cultural Heritage Sites”, Proceedings of Virtual Reality, Archaeology, and Cultural Heritage International Symposium (VAST01), 2001,  pp. 131 – 140.

NotebookPaperKretschmer et al. present the GEIST system, a system for interactive storytelling within urban and/or historical environments. A complex database setup provides information on the appearance of buildings in ancient times as well as historical facts and events. Complex queries can be formulated, and stories can be told by fictional avatars or historical persons.

U. Kretschmer, V. Coors, U. Spierling, D. Grasbon, K. Schneider, I. Rojas, and R. Malaka, “Meeting the spirit of history”, Proceedings of the 2001 conference on Virtual reality, archeology, and cultural heritage, 2001, pp. 141-152.

NotebookPaperColumbia's Computer Graphics and User Interfaces Lab does an outdoor demonstration of their mobile AR restaurant guide at ISAR 2001, running on their Touring Machine. Pop-up information sheets for nearby restaurants are overlaid on the user's view, and linked to reviews, menus, photos, and restaurant URLs.

B. Bell, S. Feiner, and T. Höllerer, View Management for Virtual and Augmented Reality, In Proc. UIST '01, Orlando, FL, November 11-14 2001. pp. 101-110

NotebookPaperKooper and MacIntyre create the RWWW Browser, a mobile AR application that acts as an interface to the World Wide Web. It is the first AR browser. This early system suffers from the cumbersome AR hardware of that time, requiring a head mounted display and complicated tracking infrastructure. In 2008 Wikitude implements a similar idea on a mobile phone.

Browsing the Real-World Wide Web: Maintaining Awareness of Virtual Information in an AR Information Space, Kooper, R., MacIntyre, B., In International Journal of Human-Computer Interaction, Vol. 16, Nr. 3, pp. 425-446 December 2003


2002


NotebookPaperMichael Kalkusch et al. present a mobile augmented reality system to guide a user through an unfamiliar building to a destination room. The system presents a world-registered wire frame model of the building labeled with directional information in a see-through heads-up display, and a three-dimensional world-in-miniature (WIM) map on a wrist-worn pad that also acts as an input device. Tracking is done using a combination of wall-mounted ARToolkit markers observed by a head-mounted camera, and an inertial tracker.

M. Kalkusch, T. Lidy, M. Knapp, G. Reitmayr, H. Kaufmann and D. Schmalstieg, “Structured Visual Markers for Indoor Pathfinding”, Proceedings of the First IEEE International Workshop on ARToolKit (ART02), 2002.

Structured Visual Markers for Indoor Pathfinding

HardwarePaperLeonid Naimark and Eric Foxlin present a wearable low-power hybrid visual and inertial tracker. This tracker, later to be known as InterSense's IS-1200, can be used for tracking at large scale, such as throughout a complete building. This is achieved by tracking a newly designed 2D barcode with thousands of different codes and combining the result with an inertial sensor.

L. Naimark and E. Foxlin, “Circular Data Matrix Fiducial System and Robust Image Processing for a Wearable Vision-Inertial Self-Tracker”, Proceedings of the International Symposium on Mixed and Augmented Reality (ISMAR 2002), 2002, pp. 27-36.
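The general principle behind such hybrid visual-inertial tracking is to integrate fast but drifting inertial measurements and correct them with slower, absolute visual fixes. The sketch below illustrates this with a minimal complementary filter for a single rotation axis; it is a simplified illustration, not the IS-1200 algorithm, and all numbers are made up.

```python
# Minimal complementary filter: gyro integration corrected by occasional
# absolute orientation fixes from a visual (fiducial) tracker.
# Simplified to one rotation axis for illustration.

def complementary_filter(samples, alpha=0.98):
    """samples: list of (dt, gyro_rate, visual_angle_or_None); returns angle track."""
    angle = 0.0
    track = []
    for dt, gyro_rate, visual_angle in samples:
        # Predict: integrate the angular rate (fast, but drifts over time).
        angle += gyro_rate * dt
        # Correct: blend towards the absolute visual measurement when available.
        if visual_angle is not None:
            angle = alpha * angle + (1.0 - alpha) * visual_angle
        track.append(angle)
    return track

# Hypothetical data: 100 Hz gyro with a constant bias, visual fix every 10th sample.
true_rate, bias = 0.5, 0.05
samples = [(0.01, true_rate + bias,
            true_rate * 0.01 * (i + 1) if i % 10 == 0 else None)
           for i in range(200)]
print("final fused angle:", complementary_filter(samples)[-1])
```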

HardwarePaperMogilev et al. introduce the AR Pad, an ad-hoc mobile AR device equipped with a Spaceball controller.

D. Mogilev, K. Kiyokawa, M. Billinghurst, and J. Pair. 2002. AR Pad: an interface for face-to-face AR collaboration. In CHI '02 Extended Abstracts on Human Factors in Computing Systems (CHI EA '02). ACM, New York, NY, USA, 654-655.

ARPad


2003


NotebookGameAdrian David Cheok et al. present Human Pacman, an interactive, ubiquitous and mobile entertainment system built upon position and perspective sensing via the Global Positioning System and inertial sensors, and tangible human-computer interfacing using Bluetooth and capacitive sensors. Pacmen and Ghosts are real human players in the real world who experience a mixed computer-graphics fantasy-reality through wearable computers equipped with GPS and inertial sensors for position and perspective tracking. Virtual cookies and tangible physical objects fitted with Bluetooth devices and capacitive sensors are incorporated into the game play to provide novel experiences of seamless transitions between the real and virtual worlds.

A. D. Cheok, S. W. Fong, K. H. Goh, X. Yang, W. Liu and F. Farzbiz, “Human Pacman: a sensing-based mobile entertainment system with ubiquitous computing and tangible interaction” Proceedings of the 2nd Workshop on Network and System Support For Games (NetGames '03), 2003, pp. 71-81.

HumanPacman

NotebookPaperRamesh Raskar et al. present iLamps. This work creates a first prototype for object augmentation with a hand-held projector-camera system. An enhanced projector can determine and respond to the geometry of the display surface, and can be used in an ad-hoc cluster to create a self-configuring display. Furthermore, interaction techniques and cooperation between multiple units are discussed.

R. Raskar, J. van Baar, P. Beardsley, T. Willwacher, S. Rao and C. Forlines, "ilamps: geometrically aware and self-configuring projectors", SIGGRAPH '05: ACM SIGGRAPH 2005 Courses, 2005.

iLamps

PhonePaperDaniel Wagner and Dieter Schmalstieg present an indoor AR guidance system running autonomously on a PDA. They exploit the wide availability of consumer devices with a minimal need for infrastructure. The application provides the user with a three-dimensional augmented view of the environment by using a Windows Mobile port of ARToolKit for tracking and runs directly on the PDA.

D. Wagner and D. Schmalstieg, “First Steps Towards Handheld Augmented Reality”, Proceedings of the 7th IEEE International Symposium on Wearable Computers (ISWC 03), 2003, pp. 127-135.

First Steps Towards Handheld Augmented Reality

PhoneGameThe Siemens SX1 is released, coming with the first commercial mobile phone AR camera game called Mozzies (also known as Mosquito Hunt). The mosquitoes are superimposed on the live video feed from the camera. Aiming is done by moving the phone around so that the cross hair points at the mosquitoes. Mozzies was awarded the title of best mobile game in 2003.

Video See-Through AR on Consumer Cell Phones

PaperSinem Guven presents a mobile AR authoring system for creating and editing 3D hypermedia narratives that are interwoven with a wearable computer user's surrounding environment. The system is designed for authors who are not programmers and uses a combination of 3D drag-and-drop for positioning media and a timeline for synchronization. It allows authors to preview their results on a desktop workstation, as well as with a wearable AR or VR system.

Guven, S. and Feiner, S. Authoring 3D hypermedia for wearable augmented and virtual reality. Proc. ISWC 2003 (IEEE Int. Symp. on Wearable Computers), White Plains, NY, October 21-23, 2003, 118-126

Mobile AR authoring system


2004


PhonePaperMathias Möhring et al. present a system for tracking 3D markers on a mobile phone. This work shows the first video see-through augmented reality system on a consumer cell phone. It supports the detection and differentiation of different 3D markers, and the correct integration of rendered 3D graphics into the live video stream.

M. Möhring, C. Lessig and O. Bimber, “Video See-Through AR on Consumer Cell Phones”, Proceedings of the 3rd IEEE/ACM International Symposium on Mixed and Augmented Reality (ISMAR 04), 2004, pp. 252-253.

Video See-Through AR on Consumer Cell Phones

PhonePaperMichael Rohs and Beat Gfeller present Visual Codes, a 2D marker system for mobile phones. These codes can be attached to physical objects in order to retrieve object-related information and functionality. They are also suitable for display on electronic screens.

M. Rohs and B. Gfeller, “Using Camera-Equipped Mobile Phones for Interacting with Real-World Objects”, Advances in Pervasive Computing, 2004, pp. 265-271.

Visual Codes

PaperEnylton Machado Coelho et al. present OSGAR, a scene graph with uncertain transformations. Their work targets the problem of registration error, which is especially important for mobile scenarios where high-quality tracking is not available and overlaid graphics will not align perfectly with the real environment. OSGAR dynamically adapts the display to mitigate the effects of registration errors.

Coelho, E.M.; Julier, S.J.; MacIntyre, B., "OSGAR: A scene graph with uncertain transformations," Proceedings of the Third IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2004), 2004, pp. 6-15.

PhoneGameThe Invisible Train is shown at SIGGRAPH 2004 Emerging Technologies. The Invisible Train is the first multi-user Augmented Reality application for handheld devices.

The invisible train web page

Invisible Train


2005


PhoneGameAnders Henrysson ports ARToolKit to Symbian. Based on this technology he presents the famous AR Tennis game, the first collaborative AR application running on a mobile phone. AR Tennis was awarded the Independent Mobile Gaming best game award for 2005, as well as the technical achievement award.

A. Henrysson, M. Billinghurst, and M. Ollila, “Face to Face Collaborative AR on Mobile Phones”, Proceedings of the 4th IEEE/ACM International Symposium on Mixed and Augmented Reality (ISMAR 05), 2005, pp. 80-89.

AR Tennis

PhonePaperProject ULTRA shows how to use non-real-time natural feature tracking on PDAs to support people in multiple domains such as the maintenance and support of complex machines, construction and production, and edutainment and cultural heritage. Furthermore, an authoring environment is developed to create the AR scenes for the maintenance tasks.

A. Makri, D. Arsenijevic, J. Weidenhausen, P. Eschler, D. Stricker, O. Machui, C. Fernandes, S. Maria, G. Voss and N. Ioannidis, “ULTRA: An Augmented Reality System for Handheld Platforms, Targeting Industrial Maintenance Applications”, Proceedings of 11th International Conference on Virtual Systems and Multimedia (VSMM'05), 2005.

PhoneThe first mobile phones equipped with three-axis accelerometers are the Sharp V603SH and the Samsung SCH-S310, both sold in Asia in 2005.

Motion-sensing comes to mobile phones.


2006


NotebookPaperReitmayr and Drummond present a model-based hybrid tracking system for outdoor augmented reality in urban environments, enabling accurate, real-time overlays on a handheld device. The system combines an edge-based tracker for accurate localization, gyroscope measurements to deal with fast motions, measurements of gravity and the magnetic field to avoid drift, and a back store of reference frames with online frame selection to re-initialize automatically after dynamic occlusions or failures.

G. Reitmayr and T. Drummond, “Going Out: Robust Model-based Tracking for Outdoor Augmented Reality”, Proceedings of 5th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2006), 2006, pp. 109-118.

Reitmayr

PhonePaperNokia presents MARA, a multi-sensor AR guidance application for mobile phones. The prototype application overlays the continuous viewfinder image stream captured by the camera with graphics and text in real time, annotating the user's surroundings.

Markus Kähäri, David J. Murphy, MARA - Sensor Based Augmented Reality System for Mobile Imaging, ISMAR 2006 Demo.


2007


NotebookPaperKlein and Murray present a system capable of robust real-time tracking and mapping in parallel with a monocular camera in small workspaces. It is an adaptation of a SLAM approach that runs the tracking and mapping tasks on two separate threads.

G. Klein and D. Murray, “Parallel tracking and mapping for small ar workspaces”, Proceedings of 6th IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2007), 2007, pp. 225-234.

Klein
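The key architectural idea, a fast tracking thread that localizes every frame against the current map while a slower mapping thread refines the map from selected keyframes, can be sketched as follows. This is a structural illustration only, with placeholder vision routines, and not Klein and Murray's implementation.

```python
# Structural sketch of parallel tracking and mapping: two threads sharing a map,
# with keyframes handed over through a queue (not Klein and Murray's actual code).
import queue
import threading

keyframe_queue = queue.Queue()
map_lock = threading.Lock()
sparse_map = []                 # shared map: keyframe/point placeholders
stop = threading.Event()

# Stand-in vision routines; a real system would run pose estimation,
# keyframe selection and triangulation/bundle adjustment here.
def estimate_pose(frame, current_map): return frame          # placeholder
def is_keyframe(frame, pose): return frame % 10 == 0         # every 10th frame
def triangulate(frame, pose, current_map): return [("point", frame)]

def tracker(frames):
    for frame in frames:                        # runs at camera frame rate
        with map_lock:
            pose = estimate_pose(frame, sparse_map)
        if is_keyframe(frame, pose):
            keyframe_queue.put((frame, pose))   # hand the keyframe to the mapper
    stop.set()

def mapper():
    while not stop.is_set() or not keyframe_queue.empty():
        try:
            frame, pose = keyframe_queue.get(timeout=0.1)
        except queue.Empty:
            continue
        new_points = triangulate(frame, pose, sparse_map)    # slow, background work
        with map_lock:
            sparse_map.extend(new_points)                    # refine/extend the map

t = threading.Thread(target=tracker, args=(range(100),))
m = threading.Thread(target=mapper)
t.start(); m.start(); t.join(); m.join()
print("map size:", len(sparse_map))
```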

NotebookPaperDiVerdi and Höllerer present the GroundCam, a system combining a camera and an orientation tracker. The camera points at the ground behind the user and provides 2D tracking information. The method is similar to that of an optical desktop mouse.

S. DiVerdi and T. Höllerer, “GroundCam: A Tracking Modality for Mobile Mixed Reality”, Proceedings of the 9th international Conference on Virtual Reality, IEEE VR 2007, pp. 75-82. Best Paper Honorable Mention.

DiVerdi

PhonePaperRohs et al. compare the performance of the following navigation methods for map navigation on mobile devices: joystick navigation, the dynamic peephole method without visual context, and the magic lens paradigm using external visual context. In their user study they demonstrate the advantage of dynamic peephole and magic lens interaction over joystick interaction in terms of search time and degree of exploration of the search space.

M. Rohs, J. Schöning, M. Raubal, G. Essl and A. Krüger, “Map navigation with mobile devices: virtual versus physical movement with and without visual context”, Proceedings of the 9th international Conference on Multimodal interfaces, ICMI 2007, pp. 146-153.

Nokia

PhoneThe first multi-touch-screen mobile phone is the famous iPhone sold by Apple, enabling a new way of interacting with mobile devices.

The iPhone

iPhone

PhoneHIT Lab NZ and Saatchi and Saatchi deliver the world's first mobile phone-based AR advertising application for the Wellington Zoo.

 

Augmented Reality at Wellington Zoo, The Inspiration Room, June 20, 2007

Nokia


2008


PhonePaperWagner et al. present the first real-time 6DOF implementation of natural feature tracking on mobile phones, achieving interactive frame rates of up to 20 Hz. They heavily modify the well-known SIFT and Ferns methods in order to gain speed and reduce memory requirements.

D. Wagner, G. Reitmayr, A. Mulloni, T. Drummond and D. Schmalstieg, “Pose tracking from natural features on mobile phones”, Proceedings of the 7th IEEE/ACM International Symposium on Mixed and Augmented Reality, 2008 (ISMAR 2008), 2008, pp. 125-134.

NFT on phones
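A much simplified desktop analogue of natural feature tracking of a planar target is to detect features on a reference image, match them in the live frame, and estimate a homography with RANSAC. The sketch below uses OpenCV with ORB features and hypothetical input images; Wagner et al.'s phone implementation relies on heavily modified SIFT/Ferns rather than the pipeline shown here.

```python
# Simplified natural-feature tracking of a planar target using ORB + RANSAC
# homography (a desktop analogue; not the modified SIFT/Ferns of the paper).
import cv2
import numpy as np

target = cv2.imread("target.png", cv2.IMREAD_GRAYSCALE)   # hypothetical reference image
frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)     # hypothetical camera frame

orb = cv2.ORB_create(nfeatures=1000)
kp_t, des_t = orb.detectAndCompute(target, None)
kp_f, des_f = orb.detectAndCompute(frame, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_t, des_f), key=lambda m: m.distance)[:200]

src = np.float32([kp_t[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_f[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# The homography maps the planar target into the current frame; the inlier mask
# marks correspondences that survive RANSAC. From H (plus camera intrinsics)
# the 6DOF camera pose relative to the plane can be recovered.
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
print("inliers:", int(inliers.sum()), "of", len(matches))
```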

NotebookPaperMETAIO presents a commercial mobile AR museum guide using natural feature tracking for a six-month exhibition on Islamic art. In their paper they describe the experiences gained in this project.

T. Miyashita, P. Meier, T. Tachikawa, S. Orlic, T. Eble, V. Scholz, A. Gapel, O. Gerl, S. Arnaudov and S. Lieberknecht, “An Augmented Reality Museum Guide”, Proceedings of the 7th IEEE/ACM International Symposium on Mixed and Augmented Reality, 2008 (ISMAR 2008), 2008, pp. 103-106.

Metaio

PhonePaperWith Augmented Reality 2.0, Schmalstieg et al. present at the Dagstuhl seminar in 2008 for the first time a concept that combines Web 2.0 ideas such as social media, crowdsourcing through public participation, and an open architecture for content markup and distribution, and applies them to mobile Augmented Reality to create a scalable AR experience.

Schmalstieg, D., Langlotz, T., and Billinghurst, M. Augmented Reality 2.0. In C. Sabine, G. Brunnett and G. Welch, eds., Virtual Realities, Dagstuhl seminar series. 2011, 13–37.

PhoneToolMobilizy launches Wikitude, an application that combines GPS and compass data with Wikipedia entries. The Wikitude World Browser overlays information on the real-time camera view of an Android smartphone.

Wikitude

Wikitude
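The registration used by such sensor-based AR browsers boils down to computing the compass bearing from the user's GPS position to each point of interest and comparing it with the device heading to decide where on the screen the annotation should appear. A minimal sketch of this idea follows; the coordinates, field of view and screen width are made-up values, and vertical placement is ignored.

```python
# Minimal GPS + compass overlay placement as used by sensor-based AR browsers:
# the bearing to the POI relative to the device heading gives the horizontal
# screen position (coordinates and field of view below are made up).
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from (lat1, lon1) to (lat2, lon2) in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def screen_x(poi_bearing, device_heading, fov_deg=60.0, screen_width=480):
    """Horizontal pixel position of the annotation, or None if outside the view."""
    offset = (poi_bearing - device_heading + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
    if abs(offset) > fov_deg / 2:
        return None
    return int((offset / fov_deg + 0.5) * screen_width)

user = (47.0679, 15.4417)        # hypothetical user position
poi = (47.0707, 15.4395)         # hypothetical point of interest
b = bearing_deg(*user, *poi)
print("bearing:", round(b, 1), "screen x:", screen_x(b, device_heading=350.0))
```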


2009


PhonePaperMorrison et al. present MapLens, a mobile augmented reality (AR) map that uses a magic lens over a paper map. They conduct a broad user study in the form of an outdoor location-based game. Their main finding is that AR features facilitate place-making by creating a constant need to reference the physical map. The field trials show that the main potential of AR maps lies in their use as a collaborative tool.

A. Morrison, A. Oulasvirta, P. Peltonen, S. Lemmelä, G. Jacucci, G. Reitmayr, J. Näsänen and A. Juustila, “Like Bees Around the Hive: A Comparative Study of a Mobile Augmented Reality Map”, Proceedings of the 27th international conference on Human factors in computing systems (CHI 2009), 2009, pp. 1889-1898.

Map-lens

PhonePaperHagbi et al. present an approach for tracking the pose of a mobile device by pointing it at fiducials. Unlike existing systems, the approach can track a wide set of planar shapes, and the user can teach the system new shapes at runtime by showing them to the camera. The learned shapes are maintained in a shape library, enabling new AR application scenarios in terms of interaction with the scene as well as fiducial design.

Hagbi, N.; Bergig, O.; El-Sana, J.; Billinghurst, M., "Shape recognition and pose estimation for mobile augmented reality," Proceedings of the 8th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2009), 2009, pp. 65-71.

Map-lens

NotebookPaperSean White introduces SiteLens, a hand-held mobile AR system for urban design and urban planning site visits. SiteLens creates "situated visualizations" that are related to and displayed in their environment. For example, representations of geocoded carbon monoxide concentration data are overlaid at the sites at which the data was recorded.

White, S. and Feiner, S. SiteLens: Situated visualization techniques for urban site visits. Proc. CHI 2009, Boston, MA, April 4-9, 2009, 1117-1120

Map-lens

PhoneToolSPRXmobile launches Layar, an advanced variant of Wikitude. Layar uses the same registration mechanism as Wikitude (GPS + compass) and incorporates it into an open client-server platform. Content layers are the equivalent of web pages in normal browsers. Existing layers range from Wikipedia, Twitter and Brightkite to local services like Yelp, Trulia, store locators, nearby bus stops, mobile coupons, Mazda dealers, and tourist, nature and cultural guides. On August 17th, Layar went global, serving almost 100 content layers. By announcing Layar as an AR browser, SPRXmobile also coined the term "AR browser", which is nowadays used for many applications with a similar purpose.

Layar announcement

Layar

PhoneGameKimberly Spreen et al. develop ARhrrrr!, the first mobile AR game with high-quality content at the level of commercial games. They use an NVIDIA Tegra developer kit ("Concorde") with a fast GPU. All processing except for tracking runs on the GPU, making the whole application run at high frame rates on a mobile-phone-class device despite the highly detailed content and natural feature tracking.

ARhrrrr!

ARhrrrr!

PhonePaperGeorg Klein presents his research on SLAM-based tracking by demonstrating his SLAM system running in real time on an iPhone. Even though it is constrained in terms of working area, this is the first time a 6DOF SLAM system is known to run on a mobile phone at sufficient speed.

Klein, G. and Murray, D. Parallel Tracking and Mapping on a camera phone. 2009 8th IEEE International Symposium on Mixed and Augmented Reality 41, 1 (2009), 83–86.

Klein-iphone


2010


PhonePaperWagner et al. present a panorama-based orientation tracker that enables precise orientation tracking in real time on mobile phones while also creating a panoramic map of the environment. The proposed system can be seen as a SLAM system with 3 degrees of freedom.

Wagner, D., Mulloni, A., Langlotz, T., and Schmalstieg, D. Real-time panoramic mapping and tracking on mobile phones. 2010 IEEE Virtual Reality Conference (VR), IEEE (2010), 211–218.
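Under the pure-rotation assumption exploited by such panorama trackers, the homography between two frames factors as H = K R K^-1, so the relative 3DOF rotation can be recovered directly from matched features. Below is a small numpy sketch with assumed intrinsics and a synthetic rotation; a real tracker would estimate H from feature matches, as in the natural feature tracking sketch above.

```python
# Under a pure camera rotation the inter-frame homography is H = K R K^-1,
# so the 3DOF orientation can be read back from H (intrinsics K assumed known).
import numpy as np

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # assumed intrinsics

def rotation_from_homography(H, K):
    R = np.linalg.inv(K) @ H @ K
    # Project onto the closest true rotation matrix (H is only defined up to scale
    # and would be estimated from noisy matches in practice).
    U, _, Vt = np.linalg.svd(R)
    R = U @ Vt
    if np.linalg.det(R) < 0:
        R = -R
    return R

# Example: a synthetic 5-degree rotation about the vertical axis.
a = np.radians(5.0)
R_true = np.array([[np.cos(a), 0, np.sin(a)], [0, 1, 0], [-np.sin(a), 0, np.cos(a)]])
H = K @ R_true @ np.linalg.inv(K)
R_est = rotation_from_homography(H, K)
yaw = np.degrees(np.arctan2(R_est[0, 2], R_est[0, 0]))
print("recovered yaw (deg):", round(float(yaw), 2))
```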

PhonePaperKHARMA is a lightweight and open architecture for referencing and delivering content, explicitly aimed at mobile AR applications running on a global scale. It uses KML to describe the geospatial or relative relations of content, while utilizing HTML, JavaScript and CSS technologies for content development and delivery.

Hill, A.; MacIntyre, B.; Gandy, M.; Davidson, B.; Rouzati, H., "KHARMA: An open KML/HTML architecture for mobile augmented reality applications," Proceedings of the 9th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2010), 2010, pp. 233-234.

PhonePaperExisting mobile AR applications were used exclusively to browse and consume digital information. Langlotz et al. present a new approach for AR browsers that also supports the creation of digital information in situ. The information is registered with pixel precision by utilizing a panorama of the environment that is created in the background.

Langlotz, T.; Wagner, D.; Mulloni, A.; Schmalstieg, D., "Online Creation of Panoramic Augmented Reality Annotations on Mobile Phones," IEEE Pervasive Computing, vol. 11, no. 2, pp. 56-63, Feb. 2012.


HardwareApple releases the iPad in April 2010, which becomes the first tablet computer to be adopted by the general public. The iPad features assisted GPS, accelerometers, a magnetometer and an advanced graphics chipset (PowerVR SGX535), enabling the creation of efficient AR applications on a tablet computer.

The iPad

iPad


2012


HardwareGoogle Glass (also known as Google Project Glass) is first presented to the public. Google Glass is an optical HMD that can be controlled with an integrated touch-sensitive sensor or natural-language commands. After its public announcement, Google Glass had a major impact on research, and even more on the public perception of mixed reality technology. While Google Glass was initially not available to the public, it was released to registered developers in 2013 and is planned to be released to consumers in 2014.

Google Glass project page on Google+

Google Glass

Hardware13th Lab releases Pointcloud, the first commercial mobile SLAM (Simultaneous Localization and Mapping) system available to the public, marking a major milestone for app developers who want to integrate SLAM-based tracking into their applications.

Pointcloud homepage

Pointcloud video

HardwarePrimeSense, the company behind the 3D sensing technology in the Microsoft Kinect, introduces a smaller version of its 3D sensing device, called Capri, that is small enough to be integrated into mobile devices such as tablets or smartphones.

Capri announcement

Capri video


2013


HardwareNVIDIA demonstrates at the SIGGRAPH Emerging Technologies exhibit a prototype of a head-mounted display supporting accurate accommodation, convergence, and binocular-disparity depth cues. The prototype introduces a light-field-based approach to near-eye displays and can be seen as a next-generation wearable display technology for AR, as existing hardware cannot provide accurate accommodation cues.

Near-eye light field project page

