Introduction to How Augmented Reality Will Work

   Video games have been entertaining us for nearly 30 years, ever since Pong was introduced to arcades in the early 1970s. Computer graphics have become much more sophisticated since then, and soon, game graphics will seem all too real. In the next decade, researchers plan to pull graphics out of your television screen or computer display and integrate them into real-world environments. This new technology, called augmented reality, will further blur the line between what's real and what's computer-generated by enhancing what we see, hear, feel and smell.


Augmented-reality displays will overlay computer-generated graphics onto the real world.

   On the spectrum between virtual reality, which creates immersive, computer-generated environments, and the real world, augmented reality is closer to the real world. Augmented reality adds graphics, sounds, haptics and smell to the natural world as it exists. You can expect video games to drive the development of augmented reality, but this technology will have countless applications. Everyone from tourists to military troops will benefit from the ability to place computer-generated graphics in their field of vision.

   Augmented reality will truly change the way we view the world. Picture yourself walking or driving down the street. With augmented-reality displays, which will eventually look much like a normal pair of glasses, informative graphics will appear in your field of view, and audio will coincide with whatever you see. These enhancements will be refreshed continually to reflect the movements of your head. In this article, we will take a look at this future technology, its components and how it will be used.

      Augmenting Our World
   The basic idea of augmented reality is to superimpose graphics, audio and other sense enhancements over a real-world environment in real time. Sounds pretty simple. After all, haven't television networks been doing that with graphics for decades? Well, sure -- but all television networks do is display a static graphic that does not adjust with camera movement. Augmented reality is far more advanced than any technology you've seen in television broadcasts, although early versions of augmented reality are starting to appear in televised races and football games, such as Racef/x and the superimposed first-down line, both created by Sportvision. These systems display graphics for only one point of view. Next-generation augmented-reality systems will display graphics for each viewer's perspective.
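
   To make that real-time requirement concrete, here is a minimal, purely illustrative Python sketch of the loop such a system has to run every frame. The Pose class and the read_tracker and render_overlay functions are invented placeholders, not parts of any real augmented-reality toolkit.

      from dataclasses import dataclass
      import time

      @dataclass
      class Pose:
          """Viewer position (x, y, z) in meters and heading in degrees -- a stand-in for a full pose."""
          x: float
          y: float
          z: float
          heading: float

      def read_tracker() -> Pose:
          # Placeholder: a real system would read a head/eye tracker here.
          return Pose(0.0, 0.0, 1.7, 90.0)

      def render_overlay(pose: Pose) -> str:
          # Placeholder: a real system would draw 3-D graphics from the viewer's pose.
          return f"label anchored for heading {pose.heading:.1f} degrees"

      def frame_loop(frames: int = 3, target_hz: float = 60.0) -> None:
          """Core augmented-reality loop: re-read the pose and redraw the overlay every frame."""
          for _ in range(frames):
              pose = read_tracker()
              print(render_overlay(pose))    # stands in for compositing onto the display
              time.sleep(1.0 / target_hz)    # a high refresh rate keeps graphics tracking head motion

      if __name__ == "__main__":
          frame_loop()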

   Augmented reality is still in an early stage of research and development at various universities and high-tech companies. Eventually, possibly by the end of this decade, we will see the first mass-marketed augmented-reality system, which one researcher calls "the Walkman of the 21st century." What augmented reality attempts to do is not only superimpose graphics over a real environment in real time, but also change those graphics to accommodate a user's head and eye movements, so that the graphics always fit the user's perspective. Here are the three components needed to make an augmented-reality system work:

  • head-mounted display
  • tracking system
  • mobile computing power

Early prototype of a mobile augmented-reality system

   The goal of augmented-reality developers is to incorporate these three components into one unit, housed in a belt-worn device that wirelessly relays information to a display that resembles an ordinary pair of eyeglasses. Let's take a look at each of the components of this system.

      Head-mounted Displays
   Just as monitors allow us to see text and graphics generated by computers, head-mounted displays (HMDs) will enable us to view graphics and text created by augmented-reality systems. So far, there haven't been many HMDs created specifically with augmented reality in mind. Most of the displays, which resemble some type of skiing goggles, were originally created for virtual reality. There are two basic types of HMDs:

  • video see-through
  • optical see-through

   Video see-through displays block out the wearer's surrounding environment, using small video cameras attached to the outside of the goggles to capture images. On the inside of the display, the video image is played in real time and the graphics are superimposed on the video. One problem with the use of video cameras is the extra lag they introduce: there is a delay in image adjustment when the viewer moves his or her head.
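
   As a rough illustration of the video path, the sketch below captures frames from a camera and draws a graphic on top of each one before showing the composite. It assumes the opencv-python and numpy packages, with an ordinary webcam standing in for the goggle-mounted cameras, and it does no tracking at all, so the label simply stays fixed on the screen.

      import cv2
      import numpy as np

      cap = cv2.VideoCapture(0)                       # camera on the outside of the display
      while True:
          ok, frame = cap.read()                      # live video of the real world
          if not ok:
              break
          overlay = np.zeros_like(frame)              # computer-generated layer, same size as the video
          cv2.putText(overlay, "AR label", (50, 50),
                      cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
          composite = cv2.addWeighted(frame, 1.0, overlay, 1.0, 0)   # graphics on top of the video
          cv2.imshow("video see-through", composite)  # played back on the inside of the goggles
          if cv2.waitKey(1) & 0xFF == ord("q"):       # press q to quit
              break
      cap.release()
      cv2.destroyAllWindows()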

   Most companies that have made optical see-through displays have gone out of business. Sony makes a see-through display that some researchers use, called the Glasstron. Blair MacIntyre, director of the Augmented Environments Lab at Georgia Tech, believes that Microvision's Virtual Retinal Display holds the most promise for an augmented-reality system. This device actually uses light to paint images onto the retina by rapidly moving the light source across and down the retina. The problem with the Microvision display is that it currently costs about $10,000. MacIntyre says that the retinal-scanning display is promising because it has the potential to be small. He imagines an ordinary-looking pair of glasses with a light source on the side to project images onto the retina.

      Tracking and Orientation
   The biggest challenge facing developers of augmented reality is the need to know where the user is located in reference to his or her surroundings. There's also the problem of tracking the movement of users' eyes and heads. A tracking system has to recognize these movements and project the graphics related to the real-world environment the user is seeing at any given moment. With the tracking technologies available today, both video see-through and optical see-through displays typically have lag in the overlaid material. For augmented reality to reach its full potential, it must be usable both outdoors and indoors. Currently, the best tracking technology available for large open areas is the Global Positioning System. However, GPS receivers have an accuracy of about 10 to 30 meters, which is not bad in the grand scheme of things but isn't good enough for augmented reality, which needs accuracy measured in millimeters or smaller. An augmented-reality system would be worthless if the graphics it projected belonged to a spot 10 to 30 meters away from what you were actually looking at.
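
   A quick back-of-the-envelope calculation shows why meter-level accuracy is hopeless for overlaying graphics. The 10-meter error is the low end of the range cited above; the 50-meter viewing distance is an assumed, illustrative figure.

      import math

      position_error_m = 10.0        # low end of the uncorrected GPS error cited above
      distance_to_target_m = 50.0    # assumed distance to the object being annotated

      angular_error = math.degrees(math.atan2(position_error_m, distance_to_target_m))
      print(f"overlay lands about {angular_error:.1f} degrees away from the target")
      # ~11.3 degrees -- a large slice of a typical head-mounted display's field of view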

   There are ways to increase tracking accuracy. For instance, the military uses multiple GPS signals. There is also differential GPS, which involves using an area that has already been surveyed. The system then uses a GPS receiver with an antenna whose location is known very precisely to measure, moment by moment, exactly how inaccurate the GPS signal is, and an augmented-reality system can adjust its own position estimate accordingly. Differential GPS allows for submeter accuracy. A more accurate system being developed, known as real-time kinematic GPS, can achieve centimeter-level accuracy.
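
   The differential-GPS idea boils down to simple subtraction, sketched below with made-up numbers and a flat, meter-based coordinate system: the surveyed base station measures its own momentary GPS error, and a nearby receiver subtracts that same error from its own fix.

      surveyed_base = (1000.00, 2000.00)   # where the base antenna really is (from the survey)
      measured_base = (1007.50, 1995.80)   # where GPS says the base antenna is right now

      # The base station's error at this moment, shared by nearby receivers.
      error = (measured_base[0] - surveyed_base[0], measured_base[1] - surveyed_base[1])

      measured_user = (1043.10, 2012.40)   # raw GPS fix of the mobile augmented-reality user
      corrected_user = (measured_user[0] - error[0], measured_user[1] - error[1])

      print("correction applied:", tuple(round(v, 2) for v in error))           # (7.5, -4.2) meters
      print("corrected position:", tuple(round(v, 2) for v in corrected_user))  # (1035.6, 2016.6)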

   Tracking is easier in small spaces than in large spaces. Researchers at the University of North Carolina-Chapel Hill have developed a very precise system that works within 500 square feet. The HiBall Tracking System is an optoelectronic tracking system made of two parts:

  • six user-mounted optical sensors
  • infrared-light-emitting diodes (LEDs) embedded in special ceiling panels

The HiBall Tracking System uses an optical sensing device and LED-embedded ceiling tiles to track movements over a short range.

   The system uses the known location of the LEDs, the known geometry of the user-mounted optical sensors and a special algorithm to compute and report the user's position and orientation. The system resolves linear motion of less than 0.2 millimeters and angular motion of less than 0.03 degrees. It has an update rate of more than 1500 Hz, and latency is kept at about one millisecond.
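
   The geometric core of that kind of computation can be illustrated with the standard Kabsch/SVD rigid-alignment method: given points whose coordinates are known in both the ceiling (world) frame and the sensor frame, solve for the rotation and translation between the two. This is only a simplified stand-in for the HiBall's actual algorithm, and the LED coordinates below are invented.

      import numpy as np

      def estimate_pose(world_pts: np.ndarray, sensor_pts: np.ndarray):
          """Return rotation R and translation t such that sensor_pts ~= R @ world_pts + t."""
          cw, cs = world_pts.mean(axis=0), sensor_pts.mean(axis=0)
          H = (world_pts - cw).T @ (sensor_pts - cs)        # cross-covariance of centered points
          U, _, Vt = np.linalg.svd(H)
          d = np.sign(np.linalg.det(Vt.T @ U.T))            # guard against a reflection solution
          R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
          t = cs - R @ cw
          return R, t

      # Invented ceiling-LED positions (meters) and where the sensor head "sees" them.
      leds_world = np.array([[0.0, 0.0, 3.0], [0.6, 0.0, 3.0],
                             [0.0, 0.6, 3.0], [0.6, 0.6, 3.0]])
      leds_sensor = leds_world - np.array([0.3, 0.3, 1.3])  # pure translation: sensor origin at (0.3, 0.3, 1.3)
      R, t = estimate_pose(leds_world, leds_sensor)
      print(np.round(R, 3))   # identity matrix: no rotation in this toy case
      print(np.round(t, 3))   # [-0.3 -0.3 -1.3]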

      Mobile Computing Power
   A wearable augmented-reality system does not yet have enough computing power to create stereo 3-D graphics, so researchers are making do with what they can get out of laptops and personal computers for now. Laptops are just now starting to be equipped with graphics processing units (GPUs). Toshiba just added an NVidia GPU to its notebooks that is able to process more than 17 million triangles per second and 286 million pixels per second, enough to run graphics-intensive programs such as 3-D games. But notebooks still lag far behind -- NVidia has developed a custom 300-MHz 3-D graphics processor for Microsoft's upcoming Xbox game console that can produce 150 million polygons per second -- and polygons are more complicated than triangles. So you can see how far mobile graphics chips have to go before they can create smooth graphics like the ones you see on your home video-game system.
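
   To put those throughput figures in per-frame terms, a quick calculation gives the budget the mobile chip has to work with; the 30-frames-per-second target is an assumed figure, not from the article.

      triangles_per_second = 17_000_000     # Toshiba notebook GPU figure from the text
      pixels_per_second = 286_000_000
      frames_per_second = 30                # assumed target refresh rate

      print(triangles_per_second // frames_per_second)   # 566,666 triangles per frame
      print(pixels_per_second // frames_per_second)      # 9,533,333 pixels per frame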

   Practical portable 3-D systems won't be available until at least 2005, MacIntyre said. His research lab currently uses a ThinkPad to power its mobile augmented-reality system. The top ThinkPads use an ATI Mobility 128 graphics chip with 16 MB of memory.

      Using Augmented Reality
   Once researchers overcome the challenges that face them, augmented reality will likely pervade every corner of our lives. It has the potential to be used in almost every industry, including:

  • Maintenance and construction - This will likely be one of the first uses for augmented reality. Markers can be attached to a particular object that a person is working on, and the augmented-reality system can draw graphics on top of it. This is a simpler form of augmented reality, since the system only has to know where the user is in reference to the object he or she is looking at; it's not necessary to track the person's exact physical location. (A minimal sketch of this marker-based approach follows this list.)
  • Military - The military has been devising uses for augmented reality for decades. In fact, the Office of Naval Research has sponsored some augmented-reality research. And the Defense Advanced Research Projects Agency (DARPA) has funded an HMD project to develop a display that can be coupled with a portable information system. The idea here is that an augmented-reality system could provide troops with vital information about their surroundings, such as showing where entrances are on the opposite end of a building, somewhat like X-ray vision. Augmented-reality displays could also highlight troop movements, and give soldiers the ability to move to where the enemy can't see them.
  • Instant information - Tourists and students could use these systems to learn more about a certain historical event. Imagine walking onto a Civil War battlefield and seeing a re-creation of historical events on a head-mounted, augmented-reality display. It would immerse you in the event, and the view would be panoramic.
  • Gaming - How cool would it be to take video games outside? The game could be projected onto the real world around you, and you could, literally, be in it as one of the characters. One Australian researcher has created a prototype game that combines Quake, a popular video game, with augmented reality. He put a model of a university campus into the game's software. Now, when he uses this system, the game surrounds him as he walks across campus.
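
   Here is the marker-based idea from the maintenance example in a minimal, hypothetical form: once a marker's corners have been located in the camera image, OpenCV's solvePnP recovers the object's position and orientation relative to the camera, which is all such a system needs. The corner pixel values stand in for a real marker detector, the camera intrinsics are invented, and the opencv-python and numpy packages are assumed.

      import cv2
      import numpy as np

      marker_size = 0.10   # a 10 cm square marker attached to the object being worked on (assumed)
      object_corners = np.array([[0, 0, 0], [marker_size, 0, 0],
                                 [marker_size, marker_size, 0], [0, marker_size, 0]], dtype=np.float32)

      # Pretend these pixel coordinates came from a marker detector looking at the object.
      image_corners = np.array([[320, 240], [380, 242], [378, 300], [318, 298]], dtype=np.float32)

      camera_matrix = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=np.float32)
      dist_coeffs = np.zeros(5)

      ok, rvec, tvec = cv2.solvePnP(object_corners, image_corners, camera_matrix, dist_coeffs)
      print(ok, rvec.ravel(), tvec.ravel())   # the marker's rotation and translation in camera space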

   There are hundreds of potential applications for such a technology, gaming and entertainment being the most obvious ones. Any system that gives people instant information, requiring no research on their part, is bound to be valuable to anyone in pretty much any field. Augmented-reality systems will instantly recognize what someone is looking at, and retrieve and display the data related to that view.
