- Nov 27, 2012
A new patent has been awarded to tech giant Apple, one that describes an augmented reality technology. The patent was published by the US Patent and Trademark Office under the number 8,400,548.
The filing, titled “Synchronized, interactive augmented reality displays for multifunction devices,” outlines a system that uses different parts of an iOS device, such as the touchscreen display and camera, to “scan” the environment. The scan produces a replica of the real-time picture in which objects are labeled and tagged. Once the system identifies an object, information related to it can be displayed in a live video stream. Imagine scanning a Muse member and getting to watch the “Bliss” video by doing so.
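The patent only describes the idea at a high level, so as a rough illustrative sketch (every name here — `DetectedObject`, `annotate_frame`, the dictionary keys — is hypothetical, not from the filing), the labeling step might look something like this: detected objects in a camera frame become tags anchored at their on-screen positions, each carrying the related information to overlay on the live stream.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str   # e.g. "Golden Gate Bridge"
    x: float     # normalized horizontal position in the frame (0..1)
    y: float     # normalized vertical position in the frame (0..1)
    info: str    # related information to display to the user

def annotate_frame(detections):
    """Build overlay annotations for one frame of the live video stream.

    Each detection becomes a tag anchored at its on-screen position,
    carrying the associated information to show as an overlay.
    """
    overlay = []
    for obj in detections:
        overlay.append({
            "anchor": (obj.x, obj.y),
            "tag": obj.label,
            "info": obj.info,
        })
    return overlay

# Example: one recognized landmark in the camera view.
frame_objects = [
    DetectedObject("Golden Gate Bridge", 0.42, 0.31,
                   "Suspension bridge, opened 1937"),
]
print(annotate_frame(frame_objects))
```

In a real iOS implementation this role would be played by the camera and recognition pipeline; the sketch only shows the data flow from detection to overlay tag.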
What’s more, users can connect and help improve the system. For example, if the technology fails to identify an object correctly or to display the relevant information, the user can step in to correct and edit things.
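The filing doesn’t specify how such corrections would be applied, but as a minimal sketch of the idea (the function `correct_tag` and the tag dictionary layout are assumptions for illustration), a user edit could simply replace the wrong label or description on the tag at a given screen position:

```python
def correct_tag(overlay, anchor, new_label=None, new_info=None):
    """Apply a user's correction to the overlay tag anchored at `anchor`.

    Only the fields the user actually edited are replaced; the rest of
    the tag is left untouched.
    """
    for tag in overlay:
        if tag["anchor"] == anchor:
            if new_label is not None:
                tag["tag"] = new_label
            if new_info is not None:
                tag["info"] = new_info
    return overlay

# The system mislabels a landmark; the user steps in and fixes it.
overlay = [{"anchor": (0.42, 0.31), "tag": "Bay Bridge", "info": ""}]
correct_tag(overlay, (0.42, 0.31), new_label="Golden Gate Bridge")
print(overlay[0]["tag"])  # Golden Gate Bridge
```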
Apple’s technology would let the person view the environment either head-on as a live image or as a 3D model, with information presented as an overlay. Both views can be displayed side by side. When describing the patent, Apple gave the example of a real-world view of San Francisco alongside a computer-generated version of it. The user can then interact with various points and walk along the streets, and share the views with another iOS device. It sounds remotely similar to what Apple Maps is supposed to achieve. Could Apple be planning to somehow extend its Maps agenda?