
Augmented reality (real-time)

Subclass of:

345 - Computer graphics processing and selective visual display systems
  345418000 - COMPUTER GRAPHICS PROCESSING
    345619000 - Graphic manipulation (object processing or display attributes)
      345629000 - Merge or overlay
        345632000 - Placing generated data in real scene

Entries

Document - Title (Date) - Abstract

20130044132 - USER AUGMENTED REALITY FOR CAMERA-ENABLED MOBILE DEVICES (02-21-2013) - A user augmented reality (UAR) service for a camera-enabled mobile device obtains meta data regarding one or more images/video that are captured with such device and provides the meta data in the display of the mobile device. The meta data is interactive and allows the user to obtain additional information or specific types of information, such as information that will aid the user in making a decision regarding the identified objects, or selectable action options that can be used to initiate actions with respect to the identified objects.

20130044131 - SOFTWARE CONTROLLER FOR AUDIO MIXER EQUIPMENT (02-21-2013) - A method for revealing changes in settings of an analogue control console, the method comprising: […]

20130044128 - CONTEXT ADAPTIVE USER INTERFACE FOR AUGMENTED REALITY DISPLAY (02-21-2013) - A user interface includes a virtual object having an appearance in context with a real environment of a user using a see-through, near-eye augmented reality display device system. A virtual type of object and at least one real world object are selected based on compatibility criteria for forming a physical connection, such as attachment, supporting or integration of the virtual object with the at least one real object. Other appearance characteristics, e.g. color, size or shape, of the virtual object are selected for satisfying compatibility criteria with the selected at least one real object. Additionally, a virtual object type and appearance characteristics of the virtual object may be selected based on a social context of the user, a personal context of the user, or both.

20130044130 - PROVIDING CONTEXTUAL PERSONAL INFORMATION BY A MIXED REALITY DEVICE (02-21-2013) - The technology provides contextual personal information by a mixed reality display device system being worn by a user. A user inputs person selection criteria, and the display system sends a request for data identifying at least one person in a location of the user who satisfies the person selection criteria to a cloud based application with access to user profile data for multiple users. Upon receiving data identifying the at least one person, the display system outputs data identifying the person if he or she is within the field of view. An identifier and a position indicator of the person in the location are output if not. Directional sensors on the display device may also be used for determining a position of the person. Cloud based executing software can identify and track the positions of people based on image and non-image data from display devices in the location.

20130044129 - LOCATION BASED SKINS FOR MIXED REALITY DISPLAYS (02-21-2013) - The technology provides embodiments for providing a location-based skin for a see-through, mixed reality display device system. In many embodiments, a location-based skin includes a virtual object viewable by a see-through, mixed reality display device system which has been detected in a specific location. Some location-based skins implement an ambient effect. The see-through, mixed reality display device system is detected to be present in a location and receives and displays a skin while in the location in accordance with user settings. User data may be uploaded and displayed in a skin in accordance with user settings. A location may be a physical space at a fixed position and may also be a space defined relative to a position of a real object, for example, another see-through, mixed reality display device system. Furthermore, a location may be a location within another location.

20100164990 - SYSTEM, APPARATUS, AND METHOD FOR AUGMENTED REALITY GLASSES FOR END-USER PROGRAMMING (07-01-2010) - A system, apparatus, and method are provided for augmented reality (AR) glasses […]

20080266323 - Augmented reality user interaction system (10-30-2008) - An augmented reality user interaction system includes a wearable computer equipped with at least one camera to detect one or more fiducial markers worn by a user. A user-mounted visual display worn by the user is employed to display visual 3D information. The computer detects in an image a fiducial marker worn by the user, extracts a position and orientation of the fiducial marker in the image, and superimposes on the image a visual representation of a user interface component directly on or near the user based on the position and orientation.

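Marker-driven overlays like the one described in 20080266323 reduce to a common pattern: detect a fiducial, recover its position and orientation in the image, then place a UI element at a fixed offset in the marker's local frame. A minimal 2D sketch of the placement step (the function name and the "up"-axis offset convention are illustrative, not from the patent):

```python
import math

def overlay_anchor(marker_x, marker_y, marker_angle_deg, offset_px):
    """Return the image-space point at a fixed offset along the
    detected marker's local 'up' axis (image y grows downward)."""
    theta = math.radians(marker_angle_deg)
    dx = -offset_px * math.sin(theta)
    dy = -offset_px * math.cos(theta)
    return (marker_x + dx, marker_y + dy)

# With the marker upright, the UI element lands 50 px above it.
print(overlay_anchor(100.0, 200.0, 0.0, 50.0))  # → (100.0, 150.0)
```

Real systems obtain the marker pose from a library such as ARToolKit or OpenCV's ArUco module; the arithmetic above is only the final anchoring step.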
20120162257 - AUTHENTICATION APPARATUS AND METHOD FOR PROVIDING AUGMENTED REALITY (AR) INFORMATION (06-28-2012) - An authentication method for providing augmented reality (AR) information includes acquiring an image of a real-world environment including a target object; identifying the target object in the acquired image; requesting data related to the target object from a server; receiving encoded data related to the target object from the server; authenticating the encoded data; and outputting the authenticated data as AR information. A terminal that performs authentication to provide AR information includes a communication unit to receive and transmit a signal from and to a server; a display to output a target object and data related to the target object; and a controller to receive encoded data related to the target object from the server, to authenticate the encoded data, and to output the authenticated data on the display.

20120162256 - MACHINE-IMPLEMENTED METHOD, SYSTEM AND COMPUTER PROGRAM PRODUCT FOR ENABLING A USER TO VIRTUALLY TRY ON A SELECTED GARMENT USING AUGMENTED REALITY (06-28-2012) - In a machine-implemented method for use with a handheld device, a user is able to virtually try on a selected garment using augmented reality. The machine-implemented method includes: (A) establishing a garment database containing information corresponding to at least one garment, the information corresponding to each garment including a backside image of the garment; (B) establishing a marker database containing feature information of a backside marker; and (C) upon determining from a captured image of the user who is tagged with at least one physical marker that the physical marker corresponds to the backside marker, retrieving from the garment database the backside image of a selected garment, and superimposing the retrieved backside image onto the captured image of the user to form a composite image for display on a screen of the handheld device.

20110187745 - APPARATUS AND METHOD FOR PROVIDING AUGMENTED REALITY INFORMATION (08-04-2011) - A system and method for providing augmented reality (AR) information to a mobile communication terminal in a mobile communication system is provided. If the mobile communication terminal is determined to have entered a service cell providing AR information, the mobile communication terminal transmits an AR information request including position information to a server. Upon receiving the AR information request signal, the server determines AR information including at least one tag pattern provided in the service cell and information associated with the tag pattern, and transmits the AR information to the mobile communication terminal.

20110187744 - SYSTEM, TERMINAL, SERVER, AND METHOD FOR PROVIDING AUGMENTED REALITY (08-04-2011) - A system, a terminal, a server, and a method for providing an augmented reality are capable of providing environment information data in a direction viewed by a user from a current position. The server for providing an augmented reality manages information data to be provided to the terminal in a database according to a section. If the server receives current position information and direction information of the terminal from the terminal connected according to an execution of an augmented reality mode, the server searches information data in a direction in which the terminal faces in a section in which the terminal is currently located from the database, and transmits the searched information data to the terminal.

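Server-side lookup of information "in the direction in which the terminal faces", as in 20110187744, is commonly implemented by filtering stored points of interest to those whose bearing from the terminal's position falls inside the device's horizontal field of view. A hedged sketch using planar coordinates (function and variable names are illustrative; a real deployment would use geodetic bearings):

```python
import math

def bearing_deg(x, y, px, py):
    """Planar bearing from (x, y) to (px, py): 0° along +y, clockwise."""
    return math.degrees(math.atan2(px - x, py - y)) % 360.0

def facing(pois, x, y, heading_deg, fov_deg=60.0):
    """Keep POIs whose bearing lies within ±fov/2 of the heading."""
    visible = []
    for name, px, py in pois:
        # Wrap the bearing difference into (-180, 180].
        diff = (bearing_deg(x, y, px, py) - heading_deg + 180.0) % 360.0 - 180.0
        if abs(diff) <= fov_deg / 2.0:
            visible.append(name)
    return visible

pois = [("cafe", 0.0, 1.0), ("museum", 1.0, 0.0)]
print(facing(pois, 0.0, 0.0, 0.0))  # facing +y → ['cafe']
```

The same filter, applied per database "section", matches the patent's description of returning only the data in front of the user.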
20130208005 - IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, AND PROGRAM (08-15-2013) - There is provided an image processing device including a data acquisition unit configured to acquire a recommended angle-of-view parameter that represents a recommended angle of view for a subject in an environment that appears in an image, and a display control unit configured to overlay on the image a virtual object that guides a user so that an angle of view for capturing an image of the subject becomes closer to the recommended angle of view, using the recommended angle-of-view parameter. The recommended angle-of-view parameter is a parameter that represents a three-dimensional position and attitude of a device that captures an image of the subject at the recommended angle of view.

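Guiding a user toward a recommended camera pose, as in 20130208005, needs at minimum the offset between the current and recommended device positions and a "close enough" test to decide when to stop showing the guidance object. A simplified position-only sketch (the patent's parameter also covers attitude, which this omits; all names are illustrative):

```python
def pose_guidance(current, recommended, tol=0.05):
    """Return the translation still needed to reach the recommended
    position, and whether the current position is within tolerance."""
    delta = tuple(r - c for c, r in zip(current, recommended))
    dist = sum(d * d for d in delta) ** 0.5
    return delta, dist <= tol

delta, done = pose_guidance((0.0, 0.0, 0.0), (1.0, 0.0, 0.0))
print(delta, done)  # move +1 along x; not there yet
```

The delta vector is what a display controller would turn into an on-screen arrow or ghost frame pointing the user toward the recommended viewpoint.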
20130208004 - DISPLAY CONTROL DEVICE, DISPLAY CONTROL METHOD, AND PROGRAM (08-15-2013) - There is provided a display control device including an action information acquisition unit that acquires, at an action position of one actor, action information regarding a past action of another actor, an object generation unit that generates a virtual object for virtually indicating a position of the other actor during an action of the one actor based on the acquired action information, and a display control unit that causes a display unit displaying a surrounding scene to superimpose and display the generated virtual object during the action of the one actor.

20090021531 - Window or door showing remote scenery in real-life motion (01-22-2009) - A device and method to show scenery in a window […]

20130076788 - APPARATUS, METHOD AND SOFTWARE PRODUCTS FOR DYNAMIC CONTENT MANAGEMENT (03-28-2013) - The present invention provides systems and methods for dynamic content management, the method including generating content associated with an object, dynamically adjusting the content associated with the object according to a user profile to form a user-defined object-based content package, displaying at least one captured image of the identified object on the device, and uploading the user-defined object-based content package associated with the identified object to the device simultaneously with the displaying step to provide dynamic content to the user on the device.

20130076789 - AUGMENTED REALITY USING PROJECTOR-CAMERA ENABLED DEVICES (03-28-2013) - An augmented reality scene may be registered onto an arbitrary surface. A camera may capture an image of the arbitrary surface. The camera may analyze the surface geometry of the arbitrary surface. In some embodiments, a processing computing device may analyze data captured by the camera and an adjacent camera to reconstruct the surface geometry of the arbitrary surface. A scene may be registered to a three dimensional coordinate system corresponding to the arbitrary surface. A projector may project the scene onto the arbitrary surface according to the registration so that the scene may not display as being distorted.

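Registering projected content onto a surface so it "may not display as being distorted" (20130076789) is classically done with a planar homography: scene points are pre-warped into projector coordinates that compensate for the surface's pose relative to the projector. Applying a 3x3 homography to a point is the core operation; a self-contained sketch with an illustrative scaling homography (the matrix itself would come from calibration, e.g. OpenCV's findHomography):

```python
def apply_homography(H, x, y):
    """Map (x, y) through the 3x3 homography H using homogeneous
    coordinates, dividing out the projective scale w."""
    xs = H[0][0] * x + H[0][1] * y + H[0][2]
    ys = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xs / w, ys / w)

# A pure scaling homography: doubles both coordinates.
H = [[2.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 1.0]]
print(apply_homography(H, 3.0, 4.0))  # → (6.0, 8.0)
```

Warping every pixel (or the scene's corner quad) through the inverse of the projector-to-surface homography yields an image that appears undistorted on the surface.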
20130076791 - HEAD-UP DISPLAY SYSTEM (03-28-2013) - An independent optical unit for a motor-vehicle head-up display system, intended to display in the driver's field of view a virtual image obtained from an object image coming from a projector. The unit includes a first optical component that reflects the incident light rays emanating from the projector towards a second optical component placed in the driver's field of view for the positioning of a final virtual image, with means provided for adjusting their relative position.

20130076790 - METHOD FOR AUGMENTING A REAL SCENE (03-28-2013) - Methods, systems and devices for augmenting a real scene in a video stream are disclosed herein.

20120176410 - METHOD FOR REPRESENTING VIRTUAL INFORMATION IN A REAL ENVIRONMENT (07-12-2012) - The invention relates to a method for ergonomically representing virtual information in a real environment, comprising the following steps: providing at least one view of a real environment and of a system setup for blending in virtual information for superimposing with the real environment in at least part of the view, the system setup comprising at least one display device; ascertaining a position and orientation of at least one part of the system setup relative to at least one component of the real environment; subdividing at least part of the view of the real environment into a plurality of regions comprising a first region and a second region, with objects of the real environment within the first region being placed closer to the system setup than objects of the real environment within the second region; and blending in at least one item of virtual information on the display device in at least part of the view of the real environment, considering the position and orientation of said at least one part of the system setup, wherein the virtual information is shown differently in the first region than in the second region with respect to the type of blending-in in the view of the real environment.

20130083066 - AUGMENTED REALITY FOR TABLE GAMES (04-04-2013) - A method includes acquiring media content of a wagering game table at a wagering game establishment with a camera of a mobile device. A location of the mobile device is determined when the media content is acquired. A direction that a lens of the camera is facing when the media content is acquired is determined. The wagering game table is identified based on the location and the direction. Overlay imagery derived from wagering game activity of the wagering game table is downloaded into the mobile device from a server. The overlay imagery is composited onto the media content to create a composited media content. The composited media content is displayed on a display of the mobile device.

20130076787 - DYNAMIC INFORMATION PRESENTATION ON FULL WINDSHIELD HEAD-UP DISPLAY (03-28-2013) - A method to dynamically register a graphic representing essential vehicle information onto a driving scene of a subject vehicle utilizing a substantially transparent windscreen head up display includes monitoring subject vehicle information and identifying the essential vehicle information based on the monitored subject vehicle information. The graphic representing the essential vehicle information is determined, and a preferred location for the graphic upon the substantially transparent windscreen head up display is dynamically registered in accordance with minimizing an operator's head movement and eye saccades for viewing the graphic. The graphic is displayed upon the substantially transparent windscreen head up display based upon the preferred location.

20130083064 - PERSONAL AUDIO/VISUAL APPARATUS PROVIDING RESOURCE MANAGEMENT (04-04-2013) - Technology is described for resource management based on data including image data of a resource captured by at least one capture device of at least one personal audiovisual (A/V) apparatus including a near-eye, augmented reality (AR) display. A resource is automatically identified from image data captured by at least one capture device of at least one personal A/V apparatus and object reference data. A location in which the resource is situated and a 3D space position or volume of the resource in the location are tracked. A property of the resource is also determined from the image data and tracked. A function of a resource may also be stored for determining whether the resource is usable for a task. Responsive to notification criteria for the resource being satisfied, image data related to the resource is displayed on the near-eye AR display.

20130083063 - Service Provision Using Personal Audio/Visual System (04-04-2013) - A collaborative on-demand system allows a user of a head-mounted display device (HMDD) to obtain assistance with an activity from a qualified service provider. In a session, the user and service provider exchange camera-captured images and augmented reality images. A gaze-detection capability of the HMDD allows the user to mark areas of interest in a scene. The service provider can similarly mark areas of the scene, as well as provide camera-captured images of the service provider's hand or arm pointing to or touching an object of the scene. The service provider can also select an animation or text to be displayed on the HMDD. A server can match user requests with qualified service providers who meet parameters regarding fee, location, rating and other preferences. Alternatively, service providers can review open requests and self-select appropriate requests, initiating contact with a user.

20130083062 - PERSONAL A/V SYSTEM WITH CONTEXT RELEVANT INFORMATION (04-04-2013) - A system for generating an augmented reality environment in association with one or more attractions or exhibits is described. In some cases, a see-through head-mounted display device (HMD) may acquire one or more virtual objects from a supplemental information provider associated with a particular attraction. The one or more virtual objects may be based on whether an end user of the HMD is waiting in line for the particular attraction or is on (or in) the particular attraction. The supplemental information provider may vary the one or more virtual objects based on the end user's previous experiences with the particular attraction. The HMD may adapt the one or more virtual objects based on physiological feedback from the end user (e.g., if a child is scared). The supplemental information provider may also provide and automatically update a task list associated with the particular attraction.

20130083061 - FRONT- AND REAR-SEAT AUGMENTED REALITY VEHICLE GAME SYSTEM TO ENTERTAIN & EDUCATE PASSENGERS (04-04-2013) - In accordance with an exemplary embodiment, an augmented virtual reality game is provided for a vehicle. A method comprises receiving a real-time video image during operation of a vehicle from a camera mounted on the vehicle and merging the real-time video image with one or more virtual images to provide an augmented reality image. The augmented reality image is then transmitted to a display of a gaming device during the operation of the vehicle. A system comprises a camera providing a real-time video image and a controller coupled to the camera. Additionally, a database provides the controller with one or more virtual images so that the controller may provide an augmented reality image by merging the real-time video image with the one or more virtual images. Finally, a transmitter is included for transmitting the augmented reality image to a display of a game device during the operation of the vehicle.

20130038632 - SYSTEM AND METHOD FOR IMAGE REGISTRATION OF MULTIPLE VIDEO STREAMS (02-14-2013) - Provided herein are methods and systems for image registration from multiple sources. A method for image registration includes rendering a common field of interest that reflects a presence of a plurality of elements, wherein at least one of the elements is a remote element located remotely from another of the elements, and updating the common field of interest such that the presence of the at least one of the elements is registered relative to another of the elements.

20130038633 - ASSEMBLING METHOD, OPERATING METHOD, AUGMENTED REALITY SYSTEM AND COMPUTER PROGRAM PRODUCT (02-14-2013) - An assembling method for assembling a measurement or production set-up includes providing an augmented reality system with a processing device, an output device and a sensing device. The sensing device captures sensing data belonging to a working space. The method then includes providing first and second set-up components having first and second markers at the working space, where the second set-up component is connectable to the first set-up component. The method captures the first and second markers by the sensing device and identifies the first and second markers. The processing device retrieves respective digital information assigned to the identified first and second markers from a database and makes a decision on the compatibility of the first set-up component with the second set-up component based on the retrieved digital information. An augmented representation of at least part of the captured sensing data and the decision on the compatibility is output.

20130038631 - SYSTEMS AND METHODS FOR A VIRTUAL TERRAIN DISPLAY (02-14-2013) - Embodiments of the present invention provide improved systems and methods for providing a virtual terrain display. In one embodiment, a method comprises identifying a location within an enclosure. The location is referenced against an external environment containing the enclosure. The method also comprises identifying a portion of a structure of the enclosure. The portion of the structure exists between the location and the external environment and blocks a view of the external environment. The method also comprises generating a display depicting a view of the external environment from the location, and applying a translucent structure representation to the display. The structure representation is a visual depiction of the portion of the structure, appearing in front of the depicted view without blocking the depicted view of the external environment.

20130033522 - PREPOPULATING APPLICATION FORMS USING REAL-TIME VIDEO ANALYSIS OF IDENTIFIED OBJECTS (02-07-2013) - Embodiments of the invention are directed to methods and apparatuses for populating documents based on identification of objects in an augmented reality environment. The method includes capturing a video stream using a mobile computing device; determining, using a computing device processor, the object; identifying a document associated with the object; populating at least a portion of the document; and submitting the document. The method may also include presenting indicators associated with the user, the identified document, or a financial transaction associated with the document. The method may also include providing recommendations or suggestions to the user related to alternative offers associated with the document. Systems and computer program products for populating forms using video analysis of identified objects are also provided.

20100103196 - SYSTEM AND METHOD FOR GENERATING A MIXED REALITY ENVIRONMENT (04-29-2010) - A system and method for generating a mixed-reality environment is provided. The system and method provide a user-worn sub-system communicatively connected to a synthetic object computer module. The user-worn sub-system may utilize a plurality of user-worn sensors to capture and process data regarding a user's pose and location. The synthetic object computer module may generate and provide to the user-worn sub-system synthetic objects based on information defining a user's real world life scene or environment indicating a user's pose and location. The synthetic objects may then be rendered on a user-worn display, thereby inserting the synthetic objects into a user's field of view. Rendering the synthetic objects on the user-worn display creates the virtual effect for the user that the synthetic objects are present in the real world.

20130135348 - COMMUNICATION DEVICE, COMMUNICATION SYSTEM, COMMUNICATION METHOD, AND COMMUNICATION PROGRAM (05-30-2013) - A communication device includes a content generation unit that generates annotation generation information for generating an annotation corresponding to a physical object, a display requirement setting unit that generates display requirement information indicative of requirements for displaying the annotation, a display object setting unit that generates display object information for identifying the physical object to be displayed where the annotation is displayed, and a communication unit that transmits the annotation generation information, the display requirement information, and the display object information.

20130083065 - FIT PREDICTION ON THREE-DIMENSIONAL VIRTUAL MODEL (04-04-2013) - Systems and methods for conducting and providing a virtual shopping experience are disclosed. The disclosed systems and methods enable a customer to create a single instance of customer-specific dimension data and then securely utilize their dimension data to conduct virtual shopping sessions with a plurality of different and unrelated entities.

20130083067 - INFORMATION PROCESSING METHOD AND DEVICE FOR PRESENTING HAPTICS RECEIVED FROM A VIRTUAL OBJECT (04-04-2013) - When a haptic device is used to present to a user the haptics that a first virtual object superimposed on the haptic device receives from a second virtual object superimposed on a real object, the user is enabled to touch within the second virtual object, regardless of the real object. Accordingly, the haptics received from the second virtual object are obtained using a first haptic event model of the first virtual object and a second haptic event model of the second virtual object, and while the first haptic event model corresponds to computer graphics information of the first virtual object, the shape of the first virtual object differs from that of the haptic device, such that instructions can be made regarding the inside of the real object, using the first virtual object.

20130027429 - SYSTEM AND METHOD FOR LOCATIONAL MESSAGING (01-31-2013) - Herein is disclosed a positional content platform and related systems and methods. According to some embodiments, the platform includes a mobile processing and communication device and a service layer executing on a server. The mobile processing and communication device communicates with the service layer. By virtue of the communication, a user of the mobile device is able to locate and view digital content that has been created and stored on the platform. The aforementioned content is associated with a geographic location, and, according to some embodiments, the content is represented in a user interface by an icon that is superimposed over a field of view representative of a geographic region.

20130027430 - IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD AND PROGRAM (01-31-2013) - A method is provided for superimposing schedule data on a temporal measurement object. The method comprises receiving image data representing an input image. The method further comprises detecting the presence of a temporal measurement object in the input image based on features of the temporal measurement object detected in the image data. The method further comprises providing, in response to detection of the presence of the temporal measurement object in the input image, schedule data for superimposing on a user's view of the temporal measurement object.

20130027428 - Generating a Discussion Group in a Social Network Based on Metadata (01-31-2013) - The present invention includes a system and method for generating a discussion in a social network based on visual search results. A mixed media reality (MMR) engine receives images from a user device and identifies matching MMR objects. A social network application determines whether a discussion group exists that is related to metadata associated with the images from user devices. If the discussion group does not yet exist, the social network application generates the discussion group and provides the user devices with information about the discussion group.

20120182313 - APPARATUS AND METHOD FOR PROVIDING AUGMENTED REALITY IN WINDOW FORM (07-19-2012) - An apparatus to provide an augmented reality includes a window detector to determine a first region and a second region; an information processor to identify a first portion of a virtual-world image layer to be displayed in the first region based on a first viewing direction; and an image processor to display the first portion in the first region, and to display a real-world image layer in the second region. A method for providing an augmented reality includes determining a first region and a second region; identifying a first portion of a virtual-world image layer to be displayed in the first region based on a first viewing direction; and displaying the first portion in the first region, and displaying a real-world image layer in the second region.

20130050262 - METHOD FOR ACCESSING INFORMATION ON CHARACTER BY USING AUGMENTED REALITY, SERVER, AND COMPUTER READABLE RECORDING MEDIUM (02-28-2013) - The present invention relates to a method for accessing information of a person by using augmented reality. The method includes the steps of: (a) receiving profile information from multiple users and information on a level of sharing the profile information; (b) checking locations of the multiple users; and (c) allowing a program code to be executed for (i) acquiring information on at least one user in close proximity, if it is sensed that a surrounding image is received in a preview state through a terminal of the first user, and displaying at least one icon corresponding to the user in close proximity through the terminal of the first user in a form of AR with the surrounding image, and (ii) displaying the profile information of a specific user corresponding to a specific icon, if it is selected later among the displayed icons, through the screen of the first user.

20130050261 - INFORMATION PROCESSING APPARATUS AND METHOD, INFORMATION PROCESSING SYSTEM, AND PROVIDING MEDIUM (02-28-2013) - The invention enables users to virtually attach information to situations in the real world, and also enables users to quickly and easily find out desired information. An IR sensor receives an IR signal transmitted from an IR beacon, and supplies the received signal to a sub-notebook PC. A CCD video camera takes in a visual ID from an object, and supplies the inputted visual ID to the sub-notebook PC. A user inputs, through a microphone, a voice to be attached to situations in the real world. The sub-notebook PC transmits position data, object data and voice data, which have been supplied to it, to a server through a communication unit. The transmitted data is received by the server via a wireless LAN. The server stores the received voice data in a database in correspondence to the position data and the object data.

20130050260 - COHERENT PRESENTATION OF MULTIPLE REALITY AND INTERACTION MODELS (02-28-2013) - A method for navigating concurrently and from point-to-point through multiple reality models is described. The method includes: generating, at a processor, a first navigatable virtual view of a first location of interest, wherein the first location of interest is one of a first virtual location and a first non-virtual location; and concurrently with the generating of the first navigatable virtual view of the first location of interest, generating, at the processor, a second navigatable virtual view corresponding to a current physical position of an object, such that real-time sight at the current physical position is enabled within the second navigatable virtual view.

20130050259 - APPARATUS AND METHOD FOR SHARING DATA USING AUGMENTED REALITY (AR) (02-28-2013) - A first terminal includes: an image acquiring unit to acquire an image of a second terminal; a controller to control the first terminal and to acquire network information from the image of the second terminal; an AR configuration unit to create an AR display based on the image of the second terminal and the acquired network information; and a communication unit to communicate data between the first terminal and the second terminal via a network. A method includes: acquiring an image of a second terminal; acquiring network information of a network from the image of the second terminal; creating an AR display based on the image of the second terminal and the acquired network information; allowing a selection of data based on the AR display; and communicating the selected data between the terminal and the second terminal via the network.

20130069985Wearable Computer with Superimposed Controls and Instructions for External Device - A wearable computing device includes a head-mounted display (HMD) that provides a field of view in which at least a portion of the environment of the wearable computing device is viewable. The HMD is operable to display images superimposed over the field of view. When the wearable computing device determines that a target device is within its environment, the wearable computing device obtains target device information related to the target device. The target device information may include information that defines a virtual control interface for controlling the target device and an identification of a defined area of the target device on which the virtual control image is to be provided. The wearable computing device controls the HMD to display the virtual control image as an image superimposed over the defined area of the target device in the field of view.03-21-2013
20130069986METHODS AND ARRANGEMENTS FOR AUGMENTED REALITY - A mobile device, a method in a mobile device, a server and a method in a server for augmented reality.03-21-2013
20110001760METHOD AND SYSTEM FOR DISPLAYING AN IMAGE GENERATED BY AT LEAST ONE CAMERA - The invention is directed to a method and system for displaying an image generated by at least one camera.01-06-2011
20130088514MOBILE ELECTRONIC DEVICE, METHOD AND WEBPAGE FOR VISUALIZING LOCATION-BASED AUGMENTED REALITY CONTENT - A mobile electronic device comprises a display, a processor controlling the display, a memory for storing data and software code, and a data interface for establishing data connection to a server. It further comprises a camera configured to generate video data, a position sensor configured to generate position data, a digital compass configured to generate directional data, and an inclination sensor configured to generate inclination data. The processor is configured to retrieve video data from the camera, to generate a video stream from the retrieved video data and to display the video stream in a window representation in the display. The processor is further configured to enhance the window representation with augmented reality objects derived from the position data, the directional data and the inclination data.04-11-2013
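The placement of AR objects from position, compass, and inclination data, as described in this abstract, can be illustrated with a minimal sketch. This is not the patented implementation: the flat-earth projection, the field-of-view values, and every name below are assumptions made for illustration only.

```python
import math

def ar_screen_position(device_lat, device_lon, heading_deg, pitch_deg,
                       obj_lat, obj_lon, hfov_deg=60.0, vfov_deg=40.0,
                       width=640, height=480):
    """Project a geo-located object into the camera view using the
    device's position, compass heading and inclination (pitch).
    Returns (x, y) pixel coordinates, or None if the object lies
    outside the field of view. Flat-earth approximation."""
    # Bearing from device to object (0 deg = north, clockwise).
    d_lat = obj_lat - device_lat
    d_lon = (obj_lon - device_lon) * math.cos(math.radians(device_lat))
    bearing = math.degrees(math.atan2(d_lon, d_lat))
    # Angular offsets relative to where the camera points.
    d_yaw = (bearing - heading_deg + 180.0) % 360.0 - 180.0
    d_pitch = -pitch_deg  # assumes the object is at device altitude
    if abs(d_yaw) > hfov_deg / 2 or abs(d_pitch) > vfov_deg / 2:
        return None  # not inside the camera's field of view
    x = width / 2 + (d_yaw / (hfov_deg / 2)) * (width / 2)
    y = height / 2 + (d_pitch / (vfov_deg / 2)) * (height / 2)
    return (x, y)
```

With the camera held level and pointed due north, an object due north lands in the centre of the window representation, while an object due east falls outside the 60-degree horizontal field of view.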
20130088516OBJECT DISPLAYING APPARATUS, OBJECT DISPLAYING SYSTEM, AND OBJECT DISPLAYING METHOD - In an object displaying apparatus, when a virtual object that is displayed superimposed on an image of the real space, according to information about the layout position of the virtual object, overlaps a superimposition inhibit object, display control is performed to display the virtual object transparently so that the superimposition inhibit object is not hidden by the virtual object. It is thereby possible to preferentially display the superimposition inhibit object.04-11-2013
20130088515METHOD OF PROVIDING AUGMENTED CONTENTS AND APPARATUS FOR PERFORMING THE SAME, METHOD OF REGISTERING AUGMENTED CONTENTS AND APPARATUS FOR PERFORMING THE SAME, SYSTEM FOR PROVIDING TARGETING AUGMENTED CONTENTS - The system for providing targeting augmented contents includes: an augmented metadata generation apparatus that generates augmented metadata designating specific space and time of broadcast contents as an augmented area; a broadcast content providing apparatus that transmits the augmented metadata to a first broadcast terminal apparatus and transmits the augmented metadata and the broadcast contents to a second broadcast terminal apparatus; a first broadcast terminal apparatus that transmits augmented contents displayed in the augmented area in which the augmented metadata are designated to the augmented content providing apparatus; an augmented content providing apparatus that transmits the augmented contents to a second broadcast terminal apparatus; and a second broadcast terminal apparatus that receives the broadcast contents and the augmented metadata from the broadcast content providing apparatus and receives the augmented contents from the augmented content providing apparatus based on the augmented metadata.04-11-2013
20120218296METHOD AND APPARATUS FOR FEATURE-BASED PRESENTATION OF CONTENT - An approach is provided for location-based presentation of content. A content service platform determines one or more representations of at least one structure. The content service platform also processes and/or facilitates a processing of the one or more representations to determine one or more features of the one or more representations. The content service platform further causes, at least in part, designation of the one or more features as elements of a virtual display area, wherein the one or more representations comprise, at least in part, the virtual display area. The content service platform also causes, at least in part, presentation of one or more outputs of one or more applications, one or more services, or a combination thereof in the virtual display area.08-30-2012
20120218300IMAGE PROCESSING SYSTEM, METHOD AND APPARATUS, AND COMPUTER-READABLE MEDIUM RECORDING IMAGE PROCESSING PROGRAM - An example image processing apparatus has a captured image acquisition unit for acquiring a captured image captured by an imaging device, a feature detection unit for detecting the markers from the captured image, a reference acquisition unit for acquiring, based on each of the detected markers, a coordinate system serving as a reference indicating a position and attitude in a space, and a relative relation information acquisition unit for acquiring, based on the captured image in which a plurality of the markers are detected, relative relation information indicating a relative relation in position and attitude of a plurality of coordinate systems acquired for the respective markers.08-30-2012
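The "relative relation information" between the coordinate systems acquired for two detected markers can be sketched as a relative transform between marker poses. This is an illustrative sketch only, not the patented method; the 4x4 homogeneous-pose representation and the function names are assumptions.

```python
import numpy as np

def relative_marker_transform(T_cam_a, T_cam_b):
    """Given 4x4 homogeneous poses of markers A and B in the camera
    frame, return the pose of marker B expressed in marker A's
    coordinate system: T_a_b = inv(T_cam_a) @ T_cam_b."""
    return np.linalg.inv(T_cam_a) @ T_cam_b

def make_pose(yaw_deg, tx, ty, tz):
    """Build a 4x4 pose: rotation about Z by yaw_deg, then translation."""
    c, s = np.cos(np.radians(yaw_deg)), np.sin(np.radians(yaw_deg))
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = [tx, ty, tz]
    return T
```

For two unrotated markers at x = 1 and x = 3 in the camera frame, marker B sits at (2, 0, 0) in marker A's coordinate system, which is the kind of relative position-and-attitude relation the abstract describes.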
20120218299INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, INFORMATION PROCESSING DEVICE AND TANGIBLE RECORDING MEDIUM RECORDING INFORMATION PROCESSING PROGRAM - An example information processing system which includes a plurality of information processing devices, the respective information processing devices carrying out imaging by an imaging device, wherein the respective information processing devices include: an imaging processing unit to generate a captured image by sequentially capturing images of a real space; a virtual space setting unit to set a virtual space commonly used by another information processing device which captures an image of one of an imaging object that is included in a captured image, and an imaging object, at least a portion of external appearance of which matches the imaging object, based on at least the portion of the imaging object included in the captured image; and a transmission unit to send data relating to change in a state of the virtual space, to the other information processing device, when the change in the state of the virtual space is detected.08-30-2012
20120218298INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, INFORMATION PROCESSING DEVICE AND TANGIBLE RECORDING MEDIUM RECORDING INFORMATION PROCESSING PROGRAM - An example system includes an information processing device, which includes: an arrangement item including a first marker; an arrangement region providing object which provides an arrangement region and includes a plurality of second markers; and an information processing device including: an imaging processing unit to generate a captured image with an imaging device; a positional relationship judgment unit to judge a positional relationship between the first marker of the arrangement item and at least one of the second markers, from a captured image which includes the first marker of the arrangement item and at least one of the plurality of second markers of the arrangement region providing object; an information superimposition unit to superimpose predetermined information based on the positional relationship, onto the captured image; and a display processing unit to cause a display device to display the captured image on which the predetermined information is superimposed.08-30-2012
20120218297AUGMENTED REALITY PRESENTATIONS - Technology is generally disclosed for augmented-reality presentations. In some embodiments, the technology can receive an indication of a user's sensitivity to an aspect of a presentation, receive general content relating to the presentation, receive overlay content relating to the presentation, combine the received general content and the received overlay content to create the presentation, render the presentation. The overlay content may respond to the user's sensitivity.08-30-2012
20130057583PROVIDING INFORMATION SERVICES RELATED TO MULTIMODAL INPUTS - A system and method provides information services related to multimodal inputs. Several different types of data used as multimodal inputs are described. Also described are various methods involving the generation of contexts using multimodal inputs, synthesizing context-information service mappings and identifying and providing information services.03-07-2013
20130057585USER AUGMENTED REALITY FOR CAMERA-ENABLED MOBILE DEVICES - Apparatus and methods are described for providing a user augmented reality (UAR) service for a camera-enabled mobile device, so that a user of such mobile device can use the mobile device to obtain meta data regarding one or more images/video that are captured with such device. The meta data is interactive and allows the user to obtain additional information or specific types of information, such as information that will aid the user in making a decision regarding the identified objects or selectable action options that can be used to initiate actions with respect to the identified objects.03-07-2013
20130057582DISPLAY CONTROL APPARATUS, METHOD FOR CONTROLLING DISPLAY CONTROL APPARATUS, AND STORAGE MEDIUM - A display control apparatus capable of displaying information about an object detected from a captured image causes a display unit to display information corresponding to even an object that cannot be detected from the captured image, if the information is relevant to a query input by a user.03-07-2013
20130057584INFORMATION PROCESSING APPARATUS AND METHOD, INFORMATION PROCESSING SYSTEM, AND PROVIDING MEDIUM - The invention enables users to virtually attach information to situations in the real world, and also enables users to quickly and easily find out desired information. An IR sensor receives an IR signal transmitted from an IR beacon, and supplies the received signal to a sub-notebook PC. A CCD video camera takes in a visual ID from an object, and supplies the inputted visual ID to the sub-notebook PC. A user inputs, through a microphone, a voice to be attached to situations in the real world. The sub-notebook PC transmits position data, object data and voice data, which have been supplied to it, to a server through a communication unit. The transmitted data is received by the server via a wireless LAN. The server stores the received voice data in a database in correspondence to the position data and the object data.03-07-2013
20130057581METHOD OF DISPLAYING VIRTUAL INFORMATION IN A VIEW OF A REAL ENVIRONMENT - A method of displaying virtual information in a view of a real environment comprising the following steps: providing a system for displaying of virtual information in a view of a real environment, determining a current pose of at least one part of the system relative to at least one part of the real environment and providing accuracy information of the current pose, providing multiple pieces of virtual information, and assigning a respective one of the pieces of virtual information to one of different parameters indicative of different pose accuracy information, and displaying at least one of the pieces of virtual information in the view of the real environment according to the accuracy information of the current pose in relation to the assigned parameter of the at least one of the pieces of virtual information.03-07-2013
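The accuracy-gated selection this abstract describes, where each piece of virtual information carries a parameter indicative of required pose accuracy and is shown only when the current pose accuracy meets it, can be sketched minimally as follows. The dictionary shape, the metre units, and all names are assumptions for illustration, not the patented scheme.

```python
def select_virtual_info(pieces, current_accuracy_m):
    """Return the pieces of virtual information whose assigned pose
    accuracy requirement is met by the current pose accuracy
    (smaller value = better accuracy).
    'pieces' maps a label to its required accuracy in metres."""
    return sorted(label for label, required in pieces.items()
                  if current_accuracy_m <= required)
```

With a pose known to within 10 m, coarse items such as city or street labels would be displayed, while items needing 2 m accuracy (e.g. door numbers) would be withheld until tracking improves.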
20090091583Apparatus and method for on-field virtual reality simulation of US football and other sports - An apparatus and method are disclosed for simulating United States football and other sports that are held on a playing field. The user stands in an area that at least approximates an actual playing field, and an apparatus incorporated into a football helmet or other headgear worn by the user superimposes simulation images onto the field of view of the user, creating an illusion of simulated action taking place on the actual field where the user is standing. This makes the information and skills conveyed by the simulation directly relevant and immediately useful. Preferred embodiments track the location and orientation of the user and thereby allow the user to participate in the simulation. In another aspect, essentially the same apparatus and method are used to simulate driving or flying of vehicles without the need of an expensive mockup of the interior of the vehicle.04-09-2009
20120223969DEVICE FOR CAPTURING AND DISPLAYING IMAGES OF OBJECTS, IN PARTICULAR DIGITAL BINOCULARS, DIGITAL CAMERA OR DIGITAL VIDEO CAMERA - The invention relates to a device for capturing and displaying images of objects, in particular digital binoculars (1), a digital camera (14) or a digital video camera, the device comprising: a digital storage medium (24, 27) including a database, which includes elements including meta information (10, 20) for objects, in particular geographic data and/or names of a landscape, a mountain range or a building or names of plants or animals; a comparison device (33); and a superposition device (34) for superimposing meta information (10, 20) selected by the comparison device.09-06-2012
20120223968DISPLAY PROCESSING DEVICE, DISPLAY METHOD, AND PROGRAM - A display processing device extracts a marker image from an image (captured image) of a page in a book. Afterward, a curved plane is created according to the extracted marker image to represent the degree of curvature of the page. Then, a virtual object is distorted to match the curved plane and overlaid on the captured image for display on an HMD.09-06-2012
20130063487METHOD AND SYSTEM OF USING AUGMENTED REALITY FOR APPLICATIONS - A computerized method for superposing an image of an object onto an image of a scene, including obtaining a 2.5D representation of the object, obtaining the image of the scene, obtaining a location in the image of the scene for superposing the image of the object, producing the image of the object using the 2.5D representation of the object, superposing the image of the object onto the image of the scene, at the location. A method for online commerce via the Internet, including obtaining an image of an object for display, obtaining an image of a scene suitable for including the image of the object for display, and superposing the image of the object for display onto the image of the scene, wherein the image of the object for display is produced from a 2.5D representation of the object. Related apparatus and methods are also described.03-14-2013
20130063486Optical Display System and Method with Virtual Image Contrast Control - A method includes generating a light pattern using a display panel and forming a virtual image from the light pattern utilizing one or more optical components. The virtual image is viewable from a viewing location. The method also includes receiving external light from a real-world environment incident on an optical sensor. The real-world environment is viewable from the viewing location. Further, the method includes obtaining an image of the real-world environment from the received external light, identifying a background feature in the image of the real-world environment over which the virtual image is overlaid, and extracting one or more visual characteristics of the background feature. Additionally, the method includes comparing the one or more visual characteristics to an upper threshold value and a lower threshold value and controlling the generation of the light pattern based on the comparison.03-14-2013
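The comparison of a background feature's visual characteristic against an upper and a lower threshold, and the resulting control of the light-pattern generation, might be sketched as below. This is only an illustrative sketch: the luminance scale, threshold values, step size, and names are all assumptions, not the patented control law.

```python
def adjust_display_brightness(current_level, background_luminance,
                              lower=0.3, upper=0.7, step=0.1):
    """Compare a background-feature luminance (0..1) against lower and
    upper thresholds and nudge the light-pattern level so the virtual
    image stays visible: brighten over bright backgrounds, dim over
    dark ones, hold steady in between."""
    if background_luminance > upper:      # bright background: raise contrast
        current_level = min(1.0, current_level + step)
    elif background_luminance < lower:    # dark background: dim / save power
        current_level = max(0.0, current_level - step)
    return round(current_level, 3)
```

Keeping a dead band between the two thresholds avoids the display flickering between levels when the background luminance hovers near a single cutoff.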
20120194549AR GLASSES SPECIFIC USER INTERFACE BASED ON A CONNECTED EXTERNAL DEVICE TYPE - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes a user interface based on a connected external device type.08-02-2012
20090237419METHOD AND APPARATUS FOR EVOKING PERCEPTIONS OF AFFORDANCES IN VIRTUAL ENVIRONMENTS - Methods and apparatus are provided for evoking perceptions of affordances in a user/virtual environment interface. The method involves recognizing the absence or inadequacy of certain sensory stimuli in the user/virtual environment interface, and then creating sensory stimuli in the virtual environment to substitute for the recognized absent or inadequate sensory stimuli. The substitute sensory stimuli are typically communicated to the user (e.g., visually and/or audibly) as properties and behavior of objects in the virtual environment. Appropriately designed substitute sensory stimuli can evoke perceptions of affordances for the recognized absent or inadequate sensory stimuli in the user/virtual environment interface.09-24-2009
20120113145AUGMENTED REALITY SURVEILLANCE AND RESCUE SYSTEM - A system, method, and computer program product for automatically combining computer-generated imagery with real-world imagery in a portable electronic device by retrieving, manipulating, and sharing relevant stored videos, preferably in real time. A video is captured with a hand-held device and stored. Metadata including the camera's physical location and orientation is appended to a data stream, along with user input. The server analyzes the data stream and further annotates the metadata, producing a searchable library of videos and metadata. Later, when a camera user generates a new data stream, the linked server analyzes it, identifies relevant material from the library, retrieves the material and tagged information, adjusts it for proper orientation, then renders and superimposes it onto the current camera view so the user views an augmented reality.05-10-2012
20120113144AUGMENTED REALITY VIRTUAL GUIDE SYSTEM - A system, method, and computer program product for automatically combining computer-generated imagery with real-world imagery in a portable electronic device by retrieving, manipulating, and sharing relevant stored videos, preferably in real time. A video is captured with a hand-held device and stored. Metadata including the camera's physical location and orientation is appended to a data stream, along with user input. The server analyzes the data stream and further annotates the metadata, producing a searchable library of videos and metadata. Later, when a camera user generates a new data stream, the linked server analyzes it, identifies relevant material from the library, retrieves the material and tagged information, adjusts it for proper orientation, then renders and superimposes it onto the current camera view so the user views an augmented reality.05-10-2012
20120113143AUGMENTED REALITY SYSTEM FOR POSITION IDENTIFICATION - A system, method, and computer program product for automatically combining computer-generated imagery with real-world imagery in a portable electronic device by retrieving, manipulating, and sharing relevant stored videos, preferably in real time. A video is captured with a hand-held device and stored. Metadata including the camera's physical location and orientation is appended to a data stream, along with user input. The server analyzes the data stream and further annotates the metadata, producing a searchable library of videos and metadata. Later, when a camera user generates a new data stream, the linked server analyzes it, identifies relevant material from the library, retrieves the material and tagged information, adjusts it for proper orientation, then renders and superimposes it onto the current camera view so the user views an augmented reality.05-10-2012
20120113142AUGMENTED REALITY INTERFACE FOR VIDEO - A system, method, and computer program product for automatically combining computer-generated imagery with real-world imagery in a portable electronic device by retrieving, manipulating, and sharing relevant stored videos, preferably in real time. A video is captured with a hand-held device and stored. Metadata including the camera's physical location and orientation is appended to a data stream, along with user input. The server analyzes the data stream and further annotates the metadata, producing a searchable library of videos and metadata. Later, when a camera user generates a new data stream, the linked server analyzes it, identifies relevant material from the library, retrieves the material and tagged information, adjusts it for proper orientation, then renders and superimposes it onto the current camera view so the user views an augmented reality.05-10-2012
20120113141TECHNIQUES TO VISUALIZE PRODUCTS USING AUGMENTED REALITY - Techniques to visualize products using augmented reality are described. An apparatus may comprise an augmentation system having a pattern detector component operative to receive an image with a first virtual object representing a first real object, and determine a location parameter and a scale parameter for a second virtual object based on the first virtual object, an augmentation component operative to retrieve the second virtual object representing a second real object from a data store, and augment the first virtual object with the second virtual object based on the location parameter and the scale parameter to form an augmented object, and a rendering component operative to render the augmented object in the image with a scaled version of the second virtual object as indicated by the scale parameter at a location on the first virtual object as indicated by the location parameter. Other embodiments are described and claimed.05-10-2012
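Deriving a location parameter and a scale parameter for the second virtual object from the first can be illustrated with a simple bounding-box fit. This is a sketch under assumed conventions (pixel bounding boxes, centre placement), not the patented determination; every name below is hypothetical.

```python
def fit_product_to_anchor(anchor_bbox, product_size):
    """Derive location and scale parameters for placing a product model
    on a detected anchor object.
    anchor_bbox: (x, y, w, h) of the first virtual object, in pixels.
    product_size: (w, h) of the second virtual object, in model units.
    Scales the product to fit inside the anchor and centres it."""
    x, y, w, h = anchor_bbox
    pw, ph = product_size
    scale = min(w / pw, h / ph)            # uniform scale parameter
    location = (x + (w - pw * scale) / 2,  # location parameter (top-left)
                y + (h - ph * scale) / 2)
    return location, scale
```

A renderer would then draw the scaled product at the returned location, forming the augmented object the abstract describes.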
20120236031SYSTEM AND METHOD FOR DELIVERING CONTENT TO A GROUP OF SEE-THROUGH NEAR EYE DISPLAY EYEPIECES - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content. The optical assembly includes absorptive polarizers or anti-reflective coatings to reduce stray light.09-20-2012
20120236030SEE-THROUGH NEAR-EYE DISPLAY GLASSES INCLUDING A MODULAR IMAGE SOURCE - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content. The optical assembly comprises a reflective image display that generates and reflects image light to an optically flat film then to a curved partially reflecting mirror of the optical assembly that reflects a portion of the image light from the image source and transmits a portion of the scene light from a see-through view of the surrounding environment to the user's eye as a combined image. The optical assembly comprises a modular image source, wherein the modular image source is mounted in a frame of the eyepiece such that its position with respect to a user's eye can be adjusted.09-20-2012
20120236029SYSTEM AND METHOD FOR EMBEDDING AND VIEWING MEDIA FILES WITHIN A VIRTUAL AND AUGMENTED REALITY SCENE - A preferred method for viewing embedded media in a virtual and augmented reality (VAR) scene can include at a viewer device, defining a real orientation of the viewer device relative to a projection matrix; and orienting a VAR scene on the viewer device in response to the real orientation, in which the VAR scene includes one or both of visual data and orientation data. The preferred method can further include selecting a media file in the VAR scene, wherein the media file is selected at a media location correlated at least to the real orientation of the viewer device; and activating the media file in the VAR scene at the media location. The preferred method and variations thereof functions to allow a viewer to interact with media that is embedded, tagged, linked, and/or associated with a VAR scene viewable on the viewer device.09-20-2012
20130162674INFORMATION PROCESSING TERMINAL, INFORMATION PROCESSING METHOD, AND PROGRAM - An information processing terminal includes a recognition unit that recognizes an identifier projected over an image, an acquisition unit that acquires data of an object corresponding to the identifier, a processing unit that changes the orientation of the object according to the positional relationship between the information processing terminal itself and the identifier specified based on the image and, when it is no longer able to recognize the identifier, changes the orientation of the object according to the positional relationship between the information processing terminal itself and the identifier specified based on sensor data, and a display control unit that causes the object of which the orientation is changed according to the positional relationship between the information processing terminal itself and the identifier to be displayed over the image in a superimposed manner.06-27-2013
20130162675INFORMATION PROCESSING APPARATUS - An information processing apparatus comprising first and second display units for respectively displaying first and second composite images for the two eyes of a user, comprising: a moving unit configured to move positions of the first and second display units; a detecting unit configured to detect moving amounts of the first and second display units; first and second image capturing units configured to respectively obtain first and second captured images; an extracting unit configured to generate first and second extracted images by respectively extracting portions of the first and second captured images in extraction ranges associated with the moving amounts; and a composite image generating unit configured to generate the first and second composite images by respectively compositing first and second CG images with the first and second extracted images.06-27-2013
20130162677METHOD FOR DISPLAYING A VIRTUAL WORLD IN WHICH THE AVATAR OF A USER OF A VIRTUAL-REALITY SERVICE EVOLVES - The invention pertains to a method for displaying a virtual world in which the avatar of a user of a virtual reality service evolves, said method being operative to use a standard mode for displaying said virtual world, to identify objects visible to the avatar within the displayed virtual world, and, for at least one of said identified objects, to determine whether a relationship exists within the virtual reality service's social network between said object and the user, and if so, to determine a display mode to apply to said object depending on said relationship, the display of said object being altered by applying said determined mode.06-27-2013
20120147042ELECTRONIC PUBLICATION VIEWER, METHOD FOR VIEWING ELECTRONIC PUBLICATION, PROGRAM, AND INTEGRATED CIRCUIT - An electronic publication viewer.06-14-2012
20120092370APPARATUS AND METHOD FOR AMALGAMATING MARKERS AND MARKERLESS OBJECTS - An apparatus to provide AR includes a marker recognition unit to recognize objects in reality information, an amalgamation determining unit to determine whether the objects are amalgamated, an amalgamation processing unit to determine an attribute of each of the recognized objects and to generate an amalgamated object based on the determined attributes, and an object processing unit to map the amalgamated object to the reality information and to display the mapped amalgamated object. A method for amalgamating objects in AR includes recognizing objects in reality information, determining whether the objects are amalgamated, determining an attribute of each of the recognized objects, generating an amalgamated object based on the determined attribute, mapping the amalgamated object to the reality information, and displaying the mapped amalgamated object.04-19-2012
20110279478Virtual Tagging Method and System - A method for associating a virtual tag with a geographical location.11-17-2011
20110279479Narrowcasting From Public Displays, and Related Methods - A user with a cell phone interacts, in a personalized session, with an electronic sign system. In some embodiments, the user's location relative to the sign is discerned from camera imagery—either imagery captured by the cell phone (i.e., of the sign), or captured by the sign system (i.e., of the user). Demographic information about the user can be estimated from imagery acquired by the sign system, or can be accessed from stored profile data associated with the user. The sign system can transmit payoffs (e.g., digital coupons or other response data) to viewers—customized per user demographics. In some arrangements, the payoff data is represented by digital watermark data encoded in the signage content. The encoding can take into account the user's location relative to the sign—allowing geometrical targeting of different payoffs to differently-located viewers. Other embodiments allow a user to engage an electronic sign system for interactive game play, using the cell phone as a controller.11-17-2011
20130162676CONTENT IDENTIFICATION AND DISTRIBUTION - The invention provides an identifier system for computing identity information from image data. At least part of the image data is representative of an identifier. The identifier comprises a location element and encoded data associated with the location element. The identifier system comprises computer interpretable reference data corresponding to the identifier. The reference data is suitable for use in feature matching to determine a location and an orientation of the location element in the image data—thereby to locate the encoded data in the image data for subsequent decoding into the identity information. The invention also provides a computer implemented method of presenting an augmented reality view of a physical article using the identifier system.06-27-2013
20120218301SEE-THROUGH DISPLAY WITH AN OPTICAL ASSEMBLY INCLUDING A WEDGE-SHAPED ILLUMINATION SYSTEM - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content. The optical assembly includes a light transmissive wedge-shaped illumination system with angle selective coatings and an LED lighting system coupled to an edge of the wedge. An angled surface of the wedge directs light from the LED lighting system to uniformly irradiate a reflective image display to produce an image that is reflected through the illumination system to provide the displayed content to the user.08-30-2012
20110298825IMAGE PROCESSING METHOD AND IMAGE PROCESSING APPARATUS - A CG image having a transparency parameter is superimposed on a shot image, which is an image picked up by an image-pickup device, to obtain a combined image. The combined image is displayed in a combined-image-display region. In the combined image, a mask region of the CG image is set based on parameter information used to extract a region of a hand. The transparency parameter of the CG image is set based on a ratio of the size of the region of the CG image excluding the mask region to the size of the shot image. By checking the combined image, which is displayed in the combined-image-display region, the user can set the parameter information by a simple operation.12-08-2011
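Setting the transparency parameter from the ratio of the CG region excluding the mask to the size of the shot image, as this abstract describes, can be sketched with binary masks. The direction of the mapping (larger visible ratio meaning more opaque) and all names are assumptions; this is not the patented computation.

```python
def cg_transparency(cg_mask, hand_mask):
    """Compute a CG layer's transparency from the ratio of the CG region
    outside the hand (mask) region to the whole shot image.
    Masks are same-size 2D lists of 0/1 values."""
    total = sum(len(row) for row in cg_mask)      # shot-image size in pixels
    visible_cg = sum(c and not h                   # CG pixels not masked out
                     for crow, hrow in zip(cg_mask, hand_mask)
                     for c, h in zip(crow, hrow))
    ratio = visible_cg / total   # CG area excluding mask / shot-image area
    return 1.0 - ratio           # transparency: 0 = opaque, 1 = invisible
```

In a 2x2 image where the CG covers two pixels and the hand mask hides one of them, only a quarter of the image shows unmasked CG, giving a transparency of 0.75.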
20110298824SYSTEM AND METHOD OF VIRTUAL INTERACTION - A system for virtual interaction, comprising two or more portable electronic devices, is provided. Each device comprises, in turn, coordinate referencing means operable to define a coordinate system common to the portable electronic devices with respect to a physically defined reference position, position estimation means operable to detect the physical position of its respective portable electronic device with respect to the reference position, virtual environment generation means operable to generate a virtual environment, and communication means operable to transmit positional data using the common coordinate system from that portable electronic device to another portable electronic device. The virtual environment is shared in common between the portable electronic devices. The virtual environment uses the common coordinate system within which each portable electronic device defines a position for itself responsive to its physical position with respect to the reference position.12-08-2011
20100091036Method and System for Integrating Virtual Entities Within Live Video - The present application provides a method and system for inserting virtual entities into live video with proper depth and obscuration. The virtual entities are drawn using a model of the real terrain, animated virtual entities, and a location of the live camera and field of view. The virtual entities are then merged with the live video feed. The merging can occur in real-time so that the virtual entity is inserted into the live video feed in real-time.04-15-2010
20100271394SYSTEM AND METHOD FOR MERGING VIRTUAL REALITY AND REALITY TO PROVIDE AN ENHANCED SENSORY EXPERIENCE - A system and method of merging virtual reality sensory detail from a remote site into a room environment at a local site. The system preferably includes at least one image server; a plurality of image collection devices; a display system, comprising display devices, a control unit, digital processor and a viewer position detector. The control unit preferably receives the viewer position information and transmits instructions to the digital processor. The digital processor preferably processes source data representing an aggregated field of view from the image capturing devices in accordance with the instructions received from the control unit and outputs refined data representing a desired display view to be displayed on the one or more display devices wherein the viewer position detector dynamically determines the position of the viewer in the room environment and changes the desired display view corresponding to position changes of the viewer.10-28-2010
20110216089ALIGNMENT OF OBJECTS IN AUGMENTED REALITY - Technologies are generally described for aligning objects in augmented reality. In some examples, a processor may be adapted to receive detected image data and virtual object data. In some examples, the processor may further be adapted to generate and apply weights to log-likelihood functions at intensity and feature levels based on the virtual object data and detected image data. In some examples, the processor may further be adapted to add the weighted log-likelihood function at intensity level to the weighted log-likelihood function at feature level to produce a cost function. In some examples, the processor may further be adapted to determine transformation parameters based on the cost function that may be used to align the detected image data with virtual object data.09-08-2011
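The cost-function construction in this abstract (a weighted intensity-level log-likelihood added to a weighted feature-level log-likelihood, with transformation parameters then chosen against that cost) can be sketched numerically. The log-likelihood functions and the one-dimensional shift parameter below are illustrative assumptions, not the patented implementation:

```python
def alignment_cost(ll_intensity, ll_feature, w_intensity, w_feature):
    # Cost = weighted intensity-level log-likelihood
    #      + weighted feature-level log-likelihood, per the abstract
    return w_intensity * ll_intensity + w_feature * ll_feature

def best_shift(candidates, ll_int_fn, ll_feat_fn, w_i, w_f):
    # Choose the transformation parameter (here a simple 1-D shift)
    # that maximizes the combined cost function
    return max(candidates,
               key=lambda s: alignment_cost(ll_int_fn(s), ll_feat_fn(s),
                                            w_i, w_f))
```

With equal weights and two quadratic log-likelihoods peaking at shifts 2 and 4, the combined cost is maximized at the compromise shift 3.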
20110216090REAL-TIME INTERACTIVE AUGMENTED REALITY SYSTEM AND METHOD AND RECORDING MEDIUM STORING PROGRAM FOR IMPLEMENTING THE METHOD - The present invention relates to a real-time interactive system and method regarding an interactive technology between miniatures in real environment and digital contents in virtual environment, and a recording medium storing a program for performing the method. An exemplary embodiment of the present invention provides a real-time interactive augmented reality system including: an input information acquiring unit acquiring input information for an interaction between real environment and virtual contents in consideration of a planned story; a virtual contents determining unit determining the virtual contents according to the acquired input information; and a matching unit matching the real environment and the virtual contents by using an augmented position of the virtual contents acquired in advance. According to the present invention, an interaction between the real environment and the virtual contents can be implemented without tools, and improved immersive realization can be obtained by augmentation using natural features.09-08-2011
20110216088INTERPRETATION OF CONSTRAINED OBJECTS IN AUGMENTED REALITY - Technologies are generally described for interpretation of constrained objects in augmented reality. An example system may comprise a processor, a memory arranged in communication with the processor, and a display arranged in communication with the processor. An example system may further comprise a sensor arranged in communication with the processor. The sensor may be effective to detect measurement data regarding a constrained object. The sensor may be configured to send the measurement data to the processor. The processor may be effective to receive the measurement data, determine a model for the object, and process the measurement data to produce weighted measurement data. The processor may also be effective to apply a filter to the model and to the weighted measurement data to produce position information regarding the object, which may be utilized to generate an image based on the position information. The display may be effective to display the image.09-08-2011
20090289956VIRTUAL BILLBOARDS - Disclosed are methods and apparatus for implementing a reality overlay device. A reality overlay device captures information that is pertinent to physical surroundings with respect to a device, the information including at least one of visual information or audio information. The reality overlay device may transmit at least a portion of the captured information to a second device. For instance, the reality overlay device may transmit at least a portion of the captured information to a server via the Internet, where the server is capable of identifying an appropriate virtual billboard. The reality overlay device may then receive overlay information for use in generating a transparent overlay via the reality overlay device. The transparent overlay is then superimposed via the device using the overlay information, wherein the transparent overlay provides one or more transparent images that are pertinent to the physical surroundings. Specifically, one or more of the transparent images may operate as “virtual billboards.” Similarly, a portable device such as a cell phone may automatically receive a virtual billboard when the portable device enters an area within a specified distance from an associated establishment.11-26-2009
20090147025METHOD AND SYSTEM FOR MODIFICATION OF TURF TV PARTICIPANT DECORATIONS BASED ON MULTIPLE REAL-TIME FACTORS - A method of modifying sporting event participant decorations displayed on a fiber optic “Turf TV” playing surface based on multiple real-time factors. A decoration utility calculates a direction of movement of a player or object in proximity to the playing surface, which is configured to display images, during a live sporting event. The utility adds a graphical aura to a real-time graphical image displayed in proximity to the player on the playing surface. The utility animates the aura in response to wind and/or noise in proximity to the playing surface. The utility modifies the aura based on pre-defined custom attributes, penalties, errors, and/or player status. If the player moves, the utility adds a graphical player trail to the image. The utility also adds a graphical object trail that includes previous locations of an object. The object trail may also include spin and a visual appearance corresponding to an object height.06-11-2009
20120293548EVENT AUGMENTATION WITH REAL-TIME INFORMATION - A system and method to present a user wearing a head mounted display with supplemental information when viewing a live event. A user wearing an at least partially see-through, head mounted display views the live event while simultaneously receiving information on objects, including people, within the user's field of view, while wearing the head mounted display. The information is presented in a position in the head mounted display which does not interfere with the user's enjoyment of the live event.11-22-2012
20120293551USER INTERFACE ELEMENTS AUGMENTED WITH FORCE DETECTION - A computing device includes a touch screen display with at least one force sensor, each of which provides a signal in response to contact with the touch screen display. Using force signals from the at least one force sensor that result from contact with the touch screen, the operation of the computing device may be controlled, e.g. to select one of a plurality of overlaying interface elements, to prevent the unintended activation of suspect commands that require secondary confirmation, and to mimic the force requirements of real-world objects in augmented reality applications.11-22-2012
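One use named in this abstract, selecting among overlaying interface elements by press force, can be sketched with a simple threshold scheme. The force-per-layer step is a hypothetical simplification, not the patented behavior:

```python
def select_overlaid_element(elements, force_newtons, step=1.0):
    # Illustrative scheme: each additional `step` newtons of press force
    # reaches one layer deeper in the stack of overlaid elements;
    # clamp to the bottom element for very hard presses.
    index = min(int(force_newtons / step), len(elements) - 1)
    return elements[index]
```

A light tap would select the top element, a firmer press an element underneath it.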
20100277504METHOD AND SYSTEM FOR SERVING THREE DIMENSION WEB MAP SERVICE USING AUGMENTED REALITY - Disclosed is a method for a 3-dimensional (3D) web map service using augmented reality, the method including downloading a mapping information file where 2-dimensional (2D) marker information and 3D modeling data are mapped, receiving map data including the 2D marker information from a map data providing server, rendering a map to a frame buffer in advance using the received map data, extracting an identification (ID) of the 3D modeling data through detecting 2D marker information from the map data and searching the mapping information file, extracting the 3D modeling data corresponding to the detected 2D marker information from a 3D modeling database using the ID of the 3D modeling data, additionally rendering the 3D modeling data to the frame buffer after processing the 3D modeling data, and rendering the rendered data to a screen.11-04-2010
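The lookup chain this abstract describes, detected 2D marker information to 3D-model ID via the mapping file, then ID to 3D modeling data in the database, can be sketched with plain dictionaries. The data shapes below are illustrative assumptions:

```python
def resolve_models(detected_markers, mapping, model_db):
    # mapping:  2D marker info -> 3D modeling data ID
    #           (stands in for the downloaded mapping information file)
    # model_db: 3D modeling data ID -> modeling data
    #           (stands in for the 3D modeling database)
    models = []
    for marker in detected_markers:
        model_id = mapping.get(marker)
        if model_id is not None and model_id in model_db:
            models.append(model_db[model_id])
    return models
```

Unrecognized markers are simply skipped; resolved models would then be rendered to the frame buffer on top of the pre-rendered map.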
20120086729ENTERTAINMENT DEVICE, SYSTEM, AND METHOD - An entertainment device generates a composite image with a combiner that combines camera-captured images with a computer-generated image of an object resting on a virtual surface. The device also includes a detector that detects image movement in the captured images in one or more contact point regions corresponding to image positions at which the object contacts the virtual surface. The device further comprises an initiator for initiating movement of the object to a new position with respect to the virtual surface in response to detected motion in the contact point regions. The detector detects whether a first image area corresponding to a captured image feature is greater than a predetermined proportion of a second image area corresponding to a full field of view of the camera. If the first image area is greater than the predetermined proportion, the initiator initiates movement of the object to an avoidance position.04-12-2012
20120086727METHOD AND APPARATUS FOR GENERATING AUGMENTED REALITY CONTENT - An approach is provided for generating augmented reality content based on tracking. Information, including location information, orientation information, or a combination thereof of a device is determined. A representation of a location indicated based, at least in part, on the information is determined. One or more items are selected to associate with one or more points within the representation. Display information is determined to be generated, the display information including the one or more items overlaid on the representation based, at least in part, on the one or more points.04-12-2012
20120268491Color Channels and Optical Markers - Color channel optical marker techniques are described. In one or more implementations, a plurality of color channels obtained from a camera are examined, each of the color channels depicting an optical marker having a different scale than another optical marker depicted in another one of the color channels. At least one optical marker is identified in a respective one of the plurality of color channels and an optical basis is computed using the identified optical marker usable to describe at least a position or orientation of a part of the computing device.10-25-2012
20120293546AUGMENTED-REALITY MOBILE COMMUNICATOR WITH ORIENTATION - A mobile communication device adapted to communicate with a plurality of pre-determined sources disposed at pre-determined different locations includes a receiver adapted to receive wirelessly communicated visual information from a particular source at a pre-determined location, an orientation detector that detects the orientation of the receiver relative to the pre-determined location of the particular source to provide an orientation signal indicating that the mobile communication device is oriented toward the predetermined location of the particular source, and an interface circuit responsive to the wirelessly communicated visual information and the orientation signal to present the visual information to a user.11-22-2012
20120293547Management Of Access To And Life Cycles Of Virtual Signs - Many different methods, apparatus, and program products are disclosed for handling virtual signs over their life cycles. Potential future locations and headings of a mobile device are used to fetch virtual signs in advance of when the virtual signs might be used. Techniques are disclosed for handling timelines of virtual signs, including registering and responding to events in the timelines. Techniques are disclosed for allowing localities to license virtual signs. Techniques are disclosed to allow advertisers to bid for and win virtual sign competitions and product placement. Techniques are presented for presenting billing information to owners of virtual signs.11-22-2012
20120293550LOCALIZATION DEVICE AND LOCALIZATION METHOD WITH THE ASSISTANCE OF AUGMENTED REALITY - A localization device assisted by augmented reality and a corresponding localization method are provided. The localization device includes a subject object coordinate generating unit, a relative angle determining element and a processing unit. The subject object coordinate generating unit selects at least three subject objects outside the localization device and obtains at least three subject object coordinate values of the at least three subject objects. The relative angle determining element determines at least two viewing angle differences between any two of the at least three subject objects. The processing unit generates a location coordinate value of the localization device according to the at least two viewing angle differences and the at least three subject object coordinate values.11-22-2012
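The localization this abstract describes, recovering the device's position from three known landmark coordinates and two viewing-angle differences, is a classical resection problem. A minimal, hypothetical Python illustration (brute-force search over a small grid, not the patented implementation):

```python
import math

def subtended_angle(p, a, b):
    # Angle at observer p between the directions to landmarks a and b
    va = (a[0] - p[0], a[1] - p[1])
    vb = (b[0] - p[0], b[1] - p[1])
    dot = va[0] * vb[0] + va[1] * vb[1]
    cos = dot / (math.hypot(*va) * math.hypot(*vb))
    return math.acos(max(-1.0, min(1.0, cos)))

def localize(landmarks, measured, step=0.5, lo=0.5, hi=9.5):
    # Grid search: find the point whose two subtended angles best
    # match the measured viewing-angle differences (illustrative only;
    # the resection could also be solved in closed form)
    a, b, c = landmarks
    best, best_err = None, float("inf")
    x = lo
    while x <= hi:
        y = lo
        while y <= hi:
            p = (x, y)
            err = ((subtended_angle(p, a, b) - measured[0]) ** 2
                   + (subtended_angle(p, b, c) - measured[1]) ** 2)
            if err < best_err:
                best, best_err = p, err
            y += step
        x += step
    return best
```

Given angles measured from a true position of (3, 4) against landmarks at (0, 0), (10, 0), and (0, 10), the search recovers that position.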
20120032977APPARATUS AND METHOD FOR AUGMENTED REALITY - Disclosed is a method for augmented reality. A real world image including a marker is generated, the marker is detected from the real world image, an object image corresponding to the detected marker is combined with the real world image, and the combined image is displayed.02-09-2012
20110205243IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, PROGRAM, AND IMAGE PROCESSING SYSTEM - There is provided an image processing apparatus including: an input image acquisition unit for obtaining an input image generated by taking an image of a real space; an image recognition unit for recognizing, when a first user-input representing a start of manipulation is detected, a manipulator used for manipulating a virtual object, wherein the manipulator appears in the input image; a calculation unit for calculating, according to a result of the recognition of the manipulator provided by the image recognition unit, a position on a screen of a display device at which the virtual object is to be displayed; a display control unit for displaying the virtual object at the position of the screen of the display device calculated by the calculation unit; and a communication unit for transmitting, when the first user-input is detected, a first notification signal for notifying the start of manipulation to another apparatus displaying the same virtual object.08-25-2011
20130215147Apparatus and Method for Enhancing Human Visual Performance in a Head Worn Video System - Visual impairment, or vision impairment, refers to the vision loss of an individual to such a degree as to require additional support for one or more aspects of their life. Such a significant limitation of visual capability may result from disease, trauma, congenital, and/or degenerative conditions that cannot be corrected by conventional means such as refractive correction (eyeglasses or contact lenses), medication, or surgery. According to embodiments of the invention a method of augmenting a user's sight is provided comprising obtaining an image of a scene using a camera carried by the individual, transmitting the obtained image to a processor, selecting an algorithm of a plurality of spectral, spatial, and temporal image modification algorithms to be applied to the image by the processor, modifying the image using the algorithm substantially in real time, and displaying the modified image on a display device worn by the individual.08-22-2013
20110205242Augmented Reality Design System - An augmented reality design system is disclosed. The augmented reality design system allows a user to create a design for an article in real time using a proxy. The system can be configured using a head mounted display for displaying at least one virtual design element over a proxy located in a real-world environment. The system can also be configured using a projector that projects at least one virtual design element onto a proxy located in the real world.08-25-2011
20130120449INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD AND PROGRAM - A failure analysis apparatus obtains information associated with an operational status of a data center, determines information regarding fault repair work for the data center, based on the information associated with the operational status, and transmits the information regarding the fault repair work to an HMD. The HMD synthesizes and presents computer graphics image data for providing guidance for a method of the fault repair work, with an image of real space, based on the information regarding the fault repair work.05-16-2013
20130120450METHOD AND APPARATUS FOR PROVIDING AUGMENTED REALITY TOUR PLATFORM SERVICE INSIDE BUILDING BY USING WIRELESS COMMUNICATION DEVICE - A method of providing an augmented reality tour platform service for the inside of a building by using a wireless communication device. The method includes: acquiring an image of the building from the wireless communication device; collecting information associated with the acquired image; extracting a candidate building group from a previously established database on the basis of the acquired image and the collected information; specifying a building matching the acquired image from among the extracted candidate building group; and transmitting information regarding the inside of the specified building to the wireless communication device.05-16-2013
20130120451IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, AND PROGRAM - When combining a virtual subject with a background image, there may be a case where the hue is different between both and a feeling of difference arises. Moreover, conventionally, it is necessary to manually adjust rendering parameters etc. from the rendering result, which takes time and effort. An image processing device that combines a virtual subject with a background image to generate a combined image is characterized by including a correction coefficient deriving unit configured to derive a correction coefficient by performing rendering of a color object arranged in a position where the virtual subject is placed using an environment map indicating information of a light source around the virtual subject, a background image correcting unit configured to correct the background image based on the derived correction coefficient, and a combining unit configured to combine a corrected background image and the virtual subject using the environment map.05-16-2013
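The correction-coefficient idea in this abstract (render a known color object under the scene's environment map, compare its rendered appearance with its reference color, then correct the background image accordingly) can be sketched per channel. The per-channel gain formula and data shapes below are illustrative assumptions, not the patented derivation:

```python
def correction_coefficients(reference_rgb, rendered_rgb):
    # Per-channel gain relating the color object's known reference
    # color to how it appears when rendered under the environment map
    return [ref / max(rnd, 1e-6)
            for ref, rnd in zip(reference_rgb, rendered_rgb)]

def correct_background(pixel_rgb, gains):
    # Apply the derived coefficients to one background-image pixel,
    # clamping to the displayable range
    return [min(255.0, channel * g)
            for channel, g in zip(pixel_rgb, gains)]
```

A background corrected this way should better match the hue of the rendered virtual subject, reducing the "feeling of difference" the abstract mentions.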
20130120452WIRELESS AUGMENTED REALITY COMMUNICATION SYSTEM - The system of the present invention is a highly integrated radio communication system with a multimedia co-processor which allows true two-way multimedia (video, audio, data) access as well as real-time biomedical monitoring in a pager-sized portable access unit. The system is integrated in a network structure including one or more general purpose nodes for providing a wireless-to-wired interface. The network architecture allows video, audio and data (including biomedical data) streams to be connected directly to external users and devices. The portable access units may also be mated to various non-personal devices such as cameras or environmental sensors for providing a method for setting up wireless sensor nets from which reported data may be accessed through the portable access unit. The reported data may alternatively be automatically logged at a remote computer for access and viewing through a portable access unit, including the user's own.05-16-2013
20100149213Virtual Penetrating Mirror Device for Visualizing of Virtual Objects within an Augmented Reality Environment - A virtual penetrating mirror (06-17-2010
20090244097System and Method for Providing Augmented Reality - A system and method for providing augmented reality. A method comprises retrieving a specification of an environment of the electronic device, capturing optical information of the environment of the electronic device, and computing the starting position/orientation from the captured optical information and the specification. The use of optical information in addition to positional information from a position sensor to compute the starting position may improve a viewer's experience with a mobile augmented reality system.10-01-2009
20130215148INTERACTIVE INPUT SYSTEM HAVING A 3D INPUT SPACE - An interactive input system comprises computing structure; and an input device detecting at least one physical object carrying a recognizable pattern within a three-dimensional (3D) input space and providing output to the computing structure, wherein the computing structure processes the output of the input device to: recognize the pattern carried by the at least one physical object in the 3D input space; and modify an image presented on a display surface by applying a transition to digital content associated with the at least one physical object based on a detected state of the at least one physical object.08-22-2013
20090167787AUGMENTED REALITY AND FILTERING - A system (and corresponding method) that can enhance a user experience by augmenting real-world experiences with virtual world data is provided. The augmented reality system discloses various techniques to personalize real-world experiences by overlaying or interspersing virtual capabilities (and data) with real world situations. The innovation can also filter, rank, modify or ignore virtual-world information based upon a particular real-world class, user identity or context.07-02-2009
20110221772System And Methods For Generating Virtual Clothing Experiences - A system for generating a virtual clothing experience has a display for mounting with a wall, one or more digital cameras for capturing first images of the person standing in front of the display, an image processing module for synthesizing the first images and for generating a display image on the display that substantially appears, to the person, like a reflection of the person in a mirror positioned at the display. The cameras capture second images of a garment with the person; the module synthesizes the second images with the first images to generate the image so that, to the person, the reflection substantially appears to wear the garment. A home version of the system may be formed with a home computer and a database storing garment images in cooperation with a manufacturer.09-15-2011
20120105475Range of Focus in an Augmented Reality Application - A computer-implemented augmented reality method includes receiving one or more indications, entered on a mobile computing device by a user of the mobile computing device, of a distance range for determining items to display with an augmented reality application, the distance range representing geographic distance from a base point where the mobile computing device is located. The method also includes selecting, from items in a computer database, one or more items that are located within the distance range from the mobile computing device entered by the user, and providing data for representing labels for the selected one or more items on a visual display of the mobile computing device, the labels corresponding to the selected items, and the items corresponding to geographical features that are within the distance range as measured from the mobile computing device.05-03-2012
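The distance-range selection this abstract describes can be sketched as a simple filter. The item format and the flat-earth distance below are hypothetical simplifications, not the patented implementation:

```python
import math

def items_in_range(items, min_metres, max_metres):
    # items: (label, (east_offset_m, north_offset_m)) pairs relative to
    # the device's base point -- an illustrative, flat-earth format.
    # Keep only labels whose distance falls inside the user's range.
    selected = []
    for label, (dx, dy) in items:
        if min_metres <= math.hypot(dx, dy) <= max_metres:
            selected.append(label)
    return selected
```

Only the labels for features inside the user-entered range would then be handed to the display layer.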
20120194554INFORMATION PROCESSING DEVICE, ALARM METHOD, AND PROGRAM - An apparatus comprising a memory storing instructions is provided. The apparatus includes a control unit for executing the instructions to send signals to display, for a user, a first virtual image superimposed onto an image of real space, the image of real space comprising an image of a potential source of interest for the user. The control unit further executes instructions to send signals to analyze the image of real space to detect the potential source of interest. The control unit further executes instructions to send signals to notify the user of the potential source of interest.08-02-2012
20120194552AR GLASSES WITH PREDICTIVE CONTROL OF EXTERNAL DEVICE BASED ON EVENT INPUT - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes predictive control of external device based on an event input.08-02-2012
20120194548SYSTEM AND METHOD FOR REMOTELY SHARING AUGMENTED REALITY SERVICE - An augmented reality (AR) system and method for remotely sharing an AR service using different markers are provided. The AR system includes a plurality of client devices and a host device. The host device of the AR system may set a sharing area of different markers, and may enable sharing of information included in the sharing area among remotely located client devices. The client device of the AR system may display an AR object identified in the sharing area and may share information related to the AR object through an AR service.08-02-2012
20090315916ON-THE-FLY CREATION OF VIRTUAL PLACES IN VIRTUAL WORLDS - A specification of a set of objects associated with at least one virtual world is obtained. The objects are laid out in a three-dimensional virtual representation. An on-the-fly virtual place is created in the virtual world, based on the layout. The virtual place depicts the set of objects in the three-dimensional virtual representation and enables navigation and interaction therewith.12-24-2009
20120194551AR GLASSES WITH USER-ACTION BASED COMMAND AND CONTROL OF EXTERNAL DEVICES - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece has a user-action based command and control of external devices.08-02-2012
20100182340SYSTEMS AND METHODS FOR COMBINING VIRTUAL AND REAL-TIME PHYSICAL ENVIRONMENTS - Systems, methods and structures for combining virtual reality and real-time environments by merging captured real-time video data with real-time 3D environment renderings to create a fused, that is, combined environment. Video imagery is captured in RGB or HSV color coordinate systems and processed to determine which areas should be made transparent, or have other color modifications made, based on sensed cultural features, electromagnetic spectrum values, and/or sensor line-of-sight. The sensed features can include electromagnetic radiation characteristics such as color, infra-red, or ultra-violet light values; cultural features can include patterns of these characteristics, such as object recognition using edge detection. The processed image is then overlaid on, and fused into, a 3D environment to combine the two data sources into a single scene, creating an effect whereby a user can look through predesignated areas or "windows" in the video image to see into a 3D simulated world, and/or see other enhanced or reprocessed features of the captured image.07-22-2010
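The transparency determination this abstract describes, deciding which video areas become "windows" into the 3D world based on sensed color characteristics, can be sketched as a per-pixel tolerance test. The per-channel tolerance scheme is an illustrative assumption, not the patented method:

```python
def window_mask(pixels, key_rgb, tolerance):
    # Mark pixels whose color lies within a per-channel tolerance of a
    # designated "window" color; those areas would later be rendered
    # transparent so the 3D simulated world shows through the video.
    mask = []
    for row in pixels:
        mask.append([all(abs(c - k) <= tolerance
                         for c, k in zip(px, key_rgb))
                     for px in row])
    return mask
```

This is essentially a chroma-key pass; the abstract's pattern-based cultural features (e.g. edge-detection object recognition) would layer further tests on top of it.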
20130215149INFORMATION PRESENTATION DEVICE, DIGITAL CAMERA, HEAD MOUNT DISPLAY, PROJECTOR, INFORMATION PRESENTATION METHOD AND NON-TRANSITORY COMPUTER READABLE MEDIUM - A digital camera functioning as an information presentation device is provided with a CG superimposition unit 08-22-2013
20120194553AR GLASSES WITH SENSOR AND USER ACTION BASED CONTROL OF EXTERNAL DEVICES WITH FEEDBACK - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes sensor and user action based control of external devices with feedback.08-02-2012
20120194550SENSOR-BASED COMMAND AND CONTROL OF EXTERNAL DEVICES WITH FEEDBACK FROM THE EXTERNAL DEVICE TO THE AR GLASSES - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes sensor-based command and control of external devices with feedback from the external device to the eyepiece.08-02-2012
20100253700Real-Time 3-D Interactions Between Real And Virtual Environments - Systems and methods providing for real and virtual object interactions are presented. Images of virtual objects can be projected onto the real environment, now augmented. Images of virtual objects can also be projected to an off-stage invisible area, where the virtual objects can be perceived as holograms through a semi-reflective surface. A viewer can observe the reflected images while also viewing the augmented environment behind the pane, resulting in one perceived uniform world, all sharing the same Cartesian coordinates. One or more computer-based image processing systems can control the projected images so they appear to interact with the real-world object from the perspective of the viewer.10-07-2010
20120242699MODIFICATION OF TURF TV PARTICIPANT DECORATIONS BASED ON MULTIPLE REAL-TIME FACTORS - A method, system and compute program product for modifying sporting event participant decorations displayed on a fiber optic “Turf TV” playing surface. A utility calculates a direction of movement of a player or object in proximity to the playing surface, which is configured to display images, during a live sporting event. The utility adds a graphical aura to a real-time graphical image displayed in proximity to the player on the playing surface. The utility animates the aura in response to wind and/or noise in proximity to the playing surface. The utility modifies the aura based on pre-defined custom attributes, penalties, errors, and/or player status. If the player moves, the utility adds a graphical player trail to the image. The utility also adds a graphical object trail that includes previous locations of an object. The object trail may also include spin and a visual appearance corresponding to an object height.09-27-2012
20120242697SEE-THROUGH NEAR-EYE DISPLAY GLASSES WITH THE OPTICAL ASSEMBLY INCLUDING ABSORPTIVE POLARIZERS OR ANTI-REFLECTIVE COATINGS TO REDUCE STRAY LIGHT - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content. The optical assembly includes absorptive polarizers or anti-reflective coatings to reduce stray light.09-27-2012
20120242695Augmented Reality System for Public and Private Seminars - Described are computer-implemented augmented reality techniques for public and private seminars that include: receiving an indication of a start of a segment of a live presentation, the live presentation comprising at least one segment, with the at least one segment having at least one presentation component; receiving information related to a plurality of users; receiving rules to analyze the information and select from it private information pertaining to a particular user, with the selected information being relevant to the at least one presentation component of the at least one segment of the live presentation; generating an image that, when rendered on a display device, renders the private information pertaining to the particular user for that presentation component; and sending the image of the private information to a device associated with the particular user.09-27-2012
20120242694MONOCULAR HEAD MOUNTED DISPLAY - According to one embodiment, a monocular head mounted display comprising an information acquisition section, an image data generation section, and an image display section. The information acquisition section acquires solid body position information on a position of a solid body located on ground around a user, and indication position information on an indication position for the user. The image data generation section generates image data including an information object to provide provision information to the user. The image display section displays an image based on the image data on one eye of the user in superimposition on a real scene. The image data generation section generates the image data so as to move the information object in the image so that the information object is superimposed on the indication position after placing the information object in the image so that the information object is superimposed on the solid body.09-27-2012
20090109240Method and System for Providing and Reconstructing a Photorealistic Three-Dimensional Environment - The present invention relates to a method and system for providing and reconstructing a photorealistic environment, by integrating a virtual item into it, comprising: (a) a dedicated marker, placed in a predefined location within an environment, in which a virtual item has to be integrated, for enabling determination of the desired location of said virtual item within said environment; (b) a conventional camera for taking a picture or shooting a video clip of said environment, in which said marker was placed, and then providing a corresponding image of said environment; and (c) one or more servers for receiving said corresponding image of said environment from said camera, processing it, and outputting a photorealistic image that contains said virtual item integrated within it, comprising: (c.1.) a composer for composing a photorealistic image from said corresponding image of said environment; (c.2.) an image processing unit for processing said corresponding image and for determining the location of said marker within said environment; (c.3.) a configuration database for storing configurations and other data; and (c.4.) an image rendering unit for reconstructing the photorealistic image by integrating said virtual item into said predefined location of the photographed environment, wherein said marker is located.04-30-2009
20090109241IMAGE DISPLAY SYSTEM, IMAGE DISPLAY APPARATUS, AND CONTROL METHOD THEREOF - Upon receiving a communication switching instruction from a first wireless access point used for communication with an image processing apparatus, an image display apparatus disconnects communication with the first wireless access point. Simultaneously, the image display apparatus transmits, to a second wireless access point, a link request to establish communication with the second wireless access point of a new communication destination included in the switching instruction. The image display apparatus displays, on a display unit, a captured image continuously acquired from an image capturing unit until the switching of the communication destination from the first wireless access point to the second wireless access point finishes.04-30-2009
20100045700DEVICE FOR WATCHING REAL-TIME AUGMENTED REALITY AND METHOD FOR IMPLEMENTING SAID DEVICE - The invention relates to a real-time augmented-reality watching device (02-25-2010
20130141460METHOD AND APPARATUS FOR VIRTUAL INCIDENT REPRESENTATION - A virtual incident representation capability is disclosed. The virtual incident representation capability is configured to represent a real world incident within a virtual world representation to provide thereby a virtual incident representation of the real world incident, which may be made available to people involved in the handling of the real world incident (e.g., operators at the safety answering point to which the real world incident is reported, responders in the field who have or will respond to the site of the real world incident, and the like). The virtual incident representation approximates the actual events of the real world incident in both space and time, and also may indicate the degree of certainty of at least a portion of the information included within the virtual incident representation. The virtual incident representation may be dynamic and interactive.06-06-2013
20130141461AUGMENTED REALITY CAMERA REGISTRATION - A system and method executable by a computing device of an augmented reality system for registering a camera in a physical space is provided. The method may include identifying an origin marker in a series of images of a physical space captured by a camera of an augmented reality system, and defining a marker graph having an origin marker node. The method may further include analyzing in real-time the series of images to identify a plurality of expansion markers with locations defined relative to previously imaged markers, and defining corresponding expansion marker nodes in the marker graph. The method may further include calculating a current position of the camera of the augmented reality system in the physical space based on a location of a node in the marker graph corresponding to a most recently imaged marker, relative to the origin marker and any intermediate markers.06-06-2013
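The registration scheme in entry 20130141461 chains each expansion marker's location, defined relative to a previously imaged marker, back to the origin marker to recover the camera's position. A minimal sketch, assuming a translation-only 2-D model; the graph layout and the names `add_marker`, `world_position`, and `camera_position` are illustrative, not the patent's actual implementation:

```python
# Minimal sketch of marker-graph camera registration (translation-only,
# 2-D for brevity). Each expansion marker stores its offset relative to
# a previously imaged parent marker; the camera pose is resolved by
# chaining offsets back to the origin marker node.

ORIGIN = "origin"

def add_marker(graph, name, parent, offset):
    """Register an expansion marker with its offset relative to a parent."""
    graph[name] = (parent, offset)

def world_position(graph, name):
    """Resolve a marker's position relative to the origin by walking parents."""
    x, y = 0.0, 0.0
    while name != ORIGIN:
        parent, (dx, dy) = graph[name]
        x, y = x + dx, y + dy
        name = parent
    return (x, y)

def camera_position(graph, last_seen_marker, cam_offset):
    """Camera position = resolved position of the most recently imaged
    marker plus the camera's measured offset from that marker."""
    mx, my = world_position(graph, last_seen_marker)
    return (mx + cam_offset[0], my + cam_offset[1])

graph = {}
add_marker(graph, "m1", ORIGIN, (2.0, 0.0))   # m1 imaged relative to origin
add_marker(graph, "m2", "m1", (1.0, 3.0))     # m2 imaged relative to m1

print(world_position(graph, "m2"))                 # (3.0, 3.0)
print(camera_position(graph, "m2", (0.5, -1.0)))   # (3.5, 2.0)
```

A full implementation would store 3-D rigid transforms (rotation plus translation) at each edge and compose them, but the chaining structure is the same.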
20130147837AUGMENTED REALITY PERSONALIZATION - A method is provided for mobile augmented reality personalization. A front-facing camera of the mobile device acquires a first view of a user of the mobile device. A personal characteristic of the user of the mobile device is identified from the first view. A location of the mobile device may be determined. A back-facing camera of the mobile device may acquire a second view of a region at the location. Augmented reality information is selected as a function of the personal characteristic. The second view is displayed with the augmented reality information.06-13-2013
20130147838UPDATING PRINTED CONTENT WITH PERSONALIZED VIRTUAL DATA - The technology provides for updating printed content with personalized virtual data using a see-through, near-eye, mixed reality display device system. A printed content item, for example a book or magazine, is identified from image data captured by cameras on the display device, and user selection of a printed content selection within the printed content item is identified based on physical action user input, for example eye gaze or a gesture. Virtual data is selected from available virtual data for the printed content selection based on user profile data, and the display device system displays the selected virtual data in a position registered to the position of the printed content selection. In some examples, a task related to the printed content item is determined based on physical action user input, and personalized virtual data is displayed registered to the printed content item in accordance with the task.06-13-2013
20130147839AUGMENTED REALITY PROVIDING SYSTEM, INFORMATION PROCESSING TERMINAL, INFORMATION PROCESSING APPARATUS, AUGMENTED REALITY PROVIDING METHOD, INFORMATION PROCESSING METHOD, AND PROGRAM - An Augmented Reality (AR) providing apparatus sends to a server apparatus a request, including image information from an imaging device, for obtaining product information indicating a product that can be displayed on a shelf, and the AR providing apparatus displays product information included in a reply from the server apparatus in response to the request in an overlaid manner. The server apparatus determines a shelf from the image information included in the request, determines a size of an empty shelf space, and selects product information of products smaller than the determined size of the empty shelf space. The product information is selected from a storage device storing multiple sets of product information indicating a product and its associated size information. The server apparatus sends a reply including the selected product information to the AR providing apparatus.06-13-2013
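The server-side selection step in the entry above (products smaller than the measured empty shelf space) amounts to a size filter over the product store. A minimal sketch, assuming a width/height pair per product; the catalog contents and function name are illustrative:

```python
# Sketch of the shelf-fitting selection: keep only products whose
# stored size fits the measured empty shelf space.

def fit_products(products, empty_w, empty_h):
    """products: {name: (width, height)}; returns names that fit the space."""
    return [name for name, (w, h) in products.items()
            if w <= empty_w and h <= empty_h]

catalog = {"cereal": (30, 40), "soup": (10, 12), "jam": (8, 10)}
print(fit_products(catalog, 12, 15))   # ['soup', 'jam']
```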
20110128300Augmented reality videogame broadcast programming - There is provided a system and method for integrating a virtual rendering system and a video capture system using flexible camera control to provide an augmented reality. There is provided a method comprising receiving input data from a plurality of clients for modifying a virtual environment presented using the virtual rendering system, obtaining, from the virtual rendering system, a virtual camera configuration of a virtual camera in the virtual environment, programming the video capture system using the virtual camera configuration to correspondingly control a robotic camera in a real environment, capturing a video capture feed using the robotic camera, obtaining a virtually rendered feed using the virtual camera showing the modifying of the virtual environment, rendering the composite render by processing the feeds, and outputting the composite render to the display.06-02-2011
20110084983Systems and Methods for Interaction With a Virtual Environment - Systems and methods for interaction with a virtual environment are disclosed. In some embodiments, a method comprises generating a virtual representation of a user's non-virtual environment, determining a viewpoint of a user in a non-virtual environment relative to a display, and displaying, with the display, the virtual representation in a spatial relationship with the user's non-virtual environment based on the viewpoint of the user.04-14-2011
20110074816SYSTEMS AND METHODS FOR INTEGRATING GRAPHIC ANIMATION TECHNOLOGIES IN FANTASY SPORTS CONTEST APPLICATIONS - Systems and methods for integrating graphic animation technologies with fantasy sports contest applications are provided. This invention enables a fantasy sports contest application to depict plays in various sporting events using graphic animation. The fantasy sports contest application may combine graphical representation of real-life elements such as, for example, player facial features, with default elements such as, for example, a generic player body, to create realistic graphic video. The fantasy sports contest application may provide links to animated videos for depicting plays on contest screens in which information associated with the plays may be displayed. The fantasy sports contest application may play the animated video for a user in response to the user selecting such a link. In some embodiments of the present invention, the fantasy sports contest application may also customize animated video based on user-supplied setup information. For example, the fantasy sports contest application may provide play information and other related data to allow a user to generate animated videos using the user's own graphics processing equipment and graphics animation program.03-31-2011
20120120103ALIGNMENT CONTROL IN AN AUGMENTED REALITY HEADPIECE - This patent discloses a method for providing an augmented image in a see-through head mounted display. The method includes capturing an image of a scene containing objects and displaying the image to a viewer. The method also includes capturing one or more additional image(s) of the scene in which the viewer indicates a misalignment between the displayed image and a see-through view of the scene. The captured images are then compared to determine an image adjustment to align corresponding objects in displayed images to the objects in the see-through view of the scene. This method provides augmented image information that is displayed in correspondence to the image adjustments so the viewer sees an augmented image comprised of the augmented image information overlaid and aligned to the see-through view.05-17-2012
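The alignment step entry 20120120103 describes (comparing captured images in which the viewer indicates a misalignment, then deriving an image adjustment) can be reduced, in the simplest translation-only case, to averaging the displacement between corresponding points. A rough sketch under that assumption; the point lists and function names are illustrative:

```python
# Sketch of alignment control: derive a translation that maps displayed
# objects onto their positions in the see-through view, then apply it
# when placing augmented overlays.

def image_adjustment(displayed_pts, see_through_pts):
    """Average translation from displayed positions to see-through positions."""
    dxs = [s[0] - d[0] for d, s in zip(displayed_pts, see_through_pts)]
    dys = [s[1] - d[1] for d, s in zip(displayed_pts, see_through_pts)]
    return (sum(dxs) / len(dxs), sum(dys) / len(dys))

def align(overlay_pt, adjustment):
    """Shift an overlay point by the computed adjustment."""
    return (overlay_pt[0] + adjustment[0], overlay_pt[1] + adjustment[1])

# Corresponding object positions in the displayed image vs. the view.
adj = image_adjustment([(10, 10), (50, 20)], [(12, 13), (52, 23)])
print(adj)                    # (2.0, 3.0)
print(align((30, 15), adj))   # (32.0, 18.0)
```

A real headpiece would estimate a fuller warp (at least rotation and scale as well), but the principle of correcting overlays by a measured correspondence is the same.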
20120120102SYSTEM AND METHOD FOR CONTROLLING DEVICE - Provided is a system and method for controlling a device using Augmented Reality (AR). A system for controlling a device using Augmented Reality (AR) includes a device server, an AR server, and a portable terminal. The device server registers information about each device. The AR server generates an AR screen displaying type information and service-related information of at least one device searched in response to a request of a portable terminal by using the registered device information, and provides the generated AR screen to the portable terminal. The portable terminal connects with a device selected among devices displayed on the AR screen and performs a specific function with the connected device.05-17-2012
20130162673PIXEL OPACITY FOR AUGMENTED REALITY - In embodiments of pixel opacity for augmented reality, a display lens system includes a first display panel that displays a virtual image generated to appear as part of an environment viewed through optical lenses. A second display panel displays an environment image of the environment as viewed through the optical lenses, and the environment image includes opaque pixels that form a black silhouette of the virtual image. The display lens system also includes a beam-splitter panel to transmit light of the environment image and reflect light of the virtual image to form a composite image that appears as the virtual image displayed over the opaque pixels of the environment image.06-27-2013
20110148922APPARATUS AND METHOD FOR MIXED REALITY CONTENT OPERATION BASED ON INDOOR AND OUTDOOR CONTEXT AWARENESS - Provided are an apparatus and method for mixed reality content operation based on indoor and outdoor context awareness. The apparatus for mixed reality content operation includes a mixed reality visualization processing unit superposing at least one of a virtual object and a text on an actual image which is acquired through the camera to generate a mixed reality image; a context awareness processing unit receiving at least one of sensed data peripheral to the mobile device and a location and posture data of the camera to perceive a peripheral context of the mobile device on the basis of the received data; and a mixed reality application content driving unit adding a content in the mixed reality image to generate an application service image, the content being provided in a context linking type according to the peripheral context.06-23-2011
20120200602HAND IMAGE FEEDBACK - An image generation method and system. The method includes receiving, by a computing apparatus from a video recording device attached to a backside of a video monitor connected to the computing apparatus, a video data stream comprising a first video image of an input device connected to the computing apparatus and a second video image of a user's hands enabling switches on the input device. An input device image associated with the input device is displayed. The computing apparatus superimposes and displays a hand image associated with the user's hands over the input device image. The computing apparatus adjusts a brightness of the hand image such that the input device image is visible through the hand image.08-09-2012
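Making the input device image "visible through the hand image", as the entry above describes, is essentially a per-pixel alpha blend of the two layers. A minimal per-pixel sketch; the alpha value and pixel data are illustrative assumptions:

```python
# Sketch of the brightness/transparency adjustment: blend a hand-image
# pixel over a device-image pixel so the device stays visible underneath.

def blend(hand_px, device_px, alpha=0.6):
    """Per-channel alpha blend of an RGB hand pixel over a device pixel."""
    return tuple(round(alpha * h + (1 - alpha) * d)
                 for h, d in zip(hand_px, device_px))

print(blend((200, 180, 160), (20, 40, 60)))   # (128, 124, 120)
```

In practice this runs over every pixel of the superimposed region (or on the GPU), but the arithmetic per pixel is as shown.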
20120200600HEAD AND ARM DETECTION FOR VIRTUAL IMMERSION SYSTEMS AND METHODS - Systems and methods for detection of the head and arms of a user to interact with an immersive virtual environment are disclosed. In some embodiments, a method comprises generating a virtual representation of a non-virtual environment, determining a position of a user relative to the display using an overhead sensor when the user is within a predetermined proximity to a display, determining a position of a user's head relative to the display using the overhead sensor, and displaying the virtual representation on the display in a spatial relationship with the non-virtual environment based on the position of the user's head relative to the display.08-09-2012
20110254860MOBILE DEVICE FOR AUGMENTED REALITY APPLICATION - The mobile device includes a visual input device, for capturing external visual information having real visual background information, and a processing device. The processing device is for associating a selected application with the external visual information, and for executing the selected application based on the external visual information and on user-related input information. The processing device for generating a visual output signal related to at least one virtual visual object in response to the application is further configured to provide the visual output signal to a projector device included within the mobile device such that the projector device will be configured to project said visual output signal related to the at least one virtual visual object onto the visual background, thereby modifying said external visual information.10-20-2011
20110254859IMAGE PROCESSING SYSTEM, IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM - There is provided an image processing apparatus including: a communication unit receiving first feature amounts, which include coordinates of feature points in an image acquired by another image processing apparatus, and position data showing a position in the image of a pointer that points at a location in a real space; an input image acquisition unit acquiring an input image by image pickup of the real space; a feature amount generating unit generating second feature amounts including coordinates of feature points set in the acquired input image; a specifying unit comparing the first feature amounts and the second feature amounts and specifying, based on a comparison result and the position data, a position in the input image of the location in the real space being pointed at by the pointer; and an output image generating unit generating an output image displaying an indicator indicating the specified position.10-20-2011
20130169684POSITIONAL CONTEXT DETERMINATION WITH MULTI MARKER CONFIDENCE RANKING - A computer implemented method for augmenting a display image includes receiving image data, the image data including data representing one or more objects, and at least a first marker and a second marker. The method includes receiving a first confidence level for the first marker and a second confidence level for the second marker. The method includes determining a selected marker from the first marker and the second marker. The selected marker is determined according to a highest confidence level of the first confidence level and the second confidence level. The method includes determining a transformation and a positional offset for the selected marker. The method includes generating overlaid display data for the one or more objects in the image data, the one or more objects determined in accordance with the transformation and the positional offset.07-04-2013
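The core of the entry above is selecting, among multiple detected markers, the one with the highest confidence and using its transformation and positional offset to place the overlay. A minimal sketch, assuming a uniform-scale transform as a stand-in for a full matrix; the dictionary layout is illustrative:

```python
# Sketch of multi-marker confidence ranking: choose the marker with the
# highest confidence, then place overlay data using that marker's
# transformation and positional offset.

def select_marker(markers):
    """markers: list of dicts with 'confidence', 'transform', 'offset'."""
    return max(markers, key=lambda m: m["confidence"])

def place_overlay(obj_pos, marker):
    """Apply the selected marker's scale transform and positional offset."""
    s = marker["transform"]          # uniform scale, stand-in for a matrix
    ox, oy = marker["offset"]
    return (obj_pos[0] * s + ox, obj_pos[1] * s + oy)

markers = [
    {"confidence": 0.4, "transform": 1.0, "offset": (5, 5)},
    {"confidence": 0.9, "transform": 2.0, "offset": (1, -1)},
]
best = select_marker(markers)
print(best["confidence"])            # 0.9
print(place_overlay((3, 4), best))   # (7.0, 7.0)
```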
20110254861INFORMATION DISPLAYING APPARATUS AND INFORMATION DISPLAYING METHOD - An information displaying apparatus capable of providing an easy-to-see display of information that the user wants to know out of information related to objects seen in a captured real-world image. The information displaying apparatus (10-20-2011
20120200601AR GLASSES WITH STATE TRIGGERED EYE CONTROL INTERACTION WITH ADVERTISING FACILITY - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes a state triggered eye control interaction with advertising facility.08-09-2012
20110090252MARKERLESS AUGMENTED REALITY SYSTEM AND METHOD USING PROJECTIVE INVARIANT - Disclosed herein are a markerless augmented reality system and method for extracting feature points within an image and providing augmented reality using a projective invariant of the feature points. The feature points are tracked in two images photographed while varying the position of an image acquisition unit, a set of feature points satisfying a plane projective invariant is obtained from the feature points, and augmented reality is provided based on the set of feature points. Accordingly, since the set of feature points satisfies the plane projective invariant even when the image acquisition unit is moved and functions as a marker, a separate marker is unnecessary. In addition, since augmented reality is provided based on the set of feature points, a total computation amount is decreased and augmented reality is more efficiently provided.04-21-2011
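The plane projective invariant the entry above relies on can be computed from five coplanar points in general position: a ratio of determinants of homogeneous-coordinate triples that is unchanged by any plane projective transformation, which is what lets a tracked point set serve as a marker. A sketch of one such invariant; the point indexing convention and test points are illustrative:

```python
# Sketch of a plane projective invariant of five coplanar feature points.
# The determinant ratio below is preserved under any plane projective
# transformation of the points.

def det3(p, q, r):
    """Determinant of the 3x3 matrix whose rows are homogeneous points."""
    (a, b, c), (d, e, f), (g, h, i) = p, q, r
    return a * (e * i - f * h) - d * (b * i - c * h) + g * (b * f - c * e)

def projective_invariant(pts):
    """pts: five (x, y) points in general position; returns one invariant."""
    p1, p2, p3, p4, p5 = [(x, y, 1.0) for x, y in pts]
    return (det3(p4, p3, p1) * det3(p5, p2, p1)) / (
            det3(p4, p2, p1) * det3(p5, p3, p1))

pts = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 3)]
# Warp the points (an affine map, a special case of a projective map):
warped = [(2 * x + y + 1, x - y + 2) for x, y in pts]
print(projective_invariant(pts))                                   # 1.5
print(abs(projective_invariant(pts)
          - projective_invariant(warped)) < 1e-9)                  # True
```

Because the invariant survives camera motion, a set of feature points that reproduces the same value across frames can function as a marker without any printed pattern.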
20120147043OPTICAL COMMUNICATION APPARATUS AND OPTICAL COMMUNICATION METHOD - An optical communication apparatus is disclosed. The optical communication apparatus includes a light transmission section, a light reception section, and a control section. The light transmission section causes a light emitting portion which outputs light in a visual line direction of a user to transmit information. The light reception section causes a light receiving portion which receives light from the visual line direction of the user to receive information. The control section determines whether or not another optical communication apparatus is an information communication target based on identification information when the light reception section has optically received communication request information and the identification information from the other optical communication apparatus, and causes the light transmission section to optically transmit communication response information to the other optical communication apparatus when the control section has determined that the other optical communication apparatus is the information communication target.06-14-2012
20120147041APPARATUS AND METHOD FOR SEARCHING ACCESS POINTS IN PORTABLE TERMINAL - An apparatus and method combine an augmented reality scheme with an AP search function and visually provide positions of searched APs as well as names of the searched APs and strength of signals received from the searched APs. The apparatus includes a communication unit for receiving signals of APs around the portable terminal, an input unit for receiving input for searching the APs, a camera unit for photographing environments around the portable terminal when searching the APs, a display unit for outputting an image photographed by the camera unit on a preview picture, an AP attribute ascertaining unit for ascertaining attributes of the APs which exist around the portable terminal and ascertaining positions of the APs, and a controller for outputting the APs on the preview picture to correspond to the positions of the ascertained APs.06-14-2012
20120147040APPARATUS AND METHOD FOR PROVIDING WIRELESS NETWORK INFORMATION - In a terminal to display wireless network information by use of augmented reality, the terminal is able to be connected to various servers that provide wireless network information through a communication network. In a method for displaying wireless network information using augmented reality, a real-world image is acquired. Wireless network information for a region, corresponding to a location of the image, is acquired from the server. The acquired wireless network information is overlaid onto the real-world image and displayed.06-14-2012
20120147039TERMINAL AND METHOD FOR PROVIDING AUGMENTED REALITY - A terminal to provide augmented reality includes a camera unit to capture a real-world view having a marker comprising a first region and a second region; a memory unit to store an object corresponding to the marker, first control information to control a first part of the object, and second control information to control a second part of the object; an object control unit to control the first part of the object based on the first control information if the first region is selected, and to control the second part of the object based on the second control information if the second region is selected; an image processing unit to synthesize the object with the real-world view into a synthesized view; and a display unit to display the synthesized view.06-14-2012
20110187743TERMINAL AND METHOD FOR PROVIDING AUGMENTED REALITY - A first terminal shares a digital marker edited in a digital marker editing mode and an object corresponding to the edited digital marker with a second terminal using a wireless communication technology. If a digital marker is displayed on an image display unit of the first terminal, the second terminal photographs the digital marker using a camera, and synthesizes an object corresponding to the photographed digital marker with a real-time video image obtained through the camera to display a merged image as augmented reality. Then, the second terminal receives input information for changing the digital marker from a user, and transmits the received input information to the first terminal. The first terminal changes a digital marker using the input information received from the second terminal. The second terminal photographs the changed digital marker, and displays an object corresponding to the changed digital marker.08-04-2011
20100026714MIXED REALITY PRESENTATION SYSTEM - An image composition unit outputs a composition image of a physical space and virtual space to a display unit. The image composition unit calculates, as difference information, a half of the difference between an imaging time of the physical space and a generation completion predicted time of the virtual space. The difference information and acquired position and orientation information are transmitted to an image processing apparatus. A line-of-sight position prediction unit updates previous difference information using the received difference information, calculates, as the generation completion predicted time, a time ahead of a receiving time by the updated difference information, and predicts the position and orientation of a viewpoint at the calculated generation completion predicted time using the received position and orientation information. The virtual space based on the predicted position and orientation, and the generation completion predicted time are transmitted to a VHMD.02-04-2010
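The latency loop in the entry above reports half the gap between the physical-space imaging time and the virtual frame's predicted completion time, smooths that figure against the previous estimate, and extrapolates the viewpoint to the projected completion time. A rough sketch of that loop; the blending factor, the linear pose model, and all numeric values are illustrative assumptions:

```python
# Sketch of the latency-prediction loop: the display side reports half
# the measured time gap; the render side updates its running estimate,
# projects a completion time ahead of the receive time, and extrapolates
# the viewpoint pose to that instant.

def difference_info(imaging_time, predicted_completion_time):
    """Half the gap between imaging and predicted completion times."""
    return (predicted_completion_time - imaging_time) / 2.0

def update_difference(prev_diff, new_diff, alpha=0.5):
    """Blend the previous estimate with the newly reported one (assumed EMA)."""
    return (1 - alpha) * prev_diff + alpha * new_diff

def predict_pose(pos, vel, receive_time, diff):
    """Linearly extrapolate position to receive_time + diff."""
    dt = (receive_time + diff) - receive_time   # == diff
    return tuple(p + v * dt for p, v in zip(pos, vel))

d = difference_info(imaging_time=100.0, predicted_completion_time=140.0)
d = update_difference(prev_diff=16.0, new_diff=d)
pose = predict_pose(pos=(0.0, 0.0, 1.0), vel=(0.1, 0.0, 0.0),
                    receive_time=150.0, diff=d)
print(d)       # 18.0
print(pose)    # (1.8, 0.0, 1.0)
```

Rendering the virtual space at the extrapolated pose hides the pipeline delay, so the composited image matches the viewpoint the user will actually have when the frame appears.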
20120038668METHOD FOR DISPLAY INFORMATION AND MOBILE TERMINAL USING THE SAME - A method for controlling information on a mobile terminal may be provided. The method may include displaying an image including at least one object on a display of the mobile terminal, receiving information regarding movement of a pointer with respect to the displayed image on the display, obtaining augmented reality (AR) information regarding the at least one object based on the received information regarding movement of the pointer, and displaying the image and the augmented reality (AR) information related to the at least one object on the display of the mobile terminal.02-16-2012
20120038669USER EQUIPMENT, SERVER, AND METHOD FOR SELECTIVELY FILTERING AUGMENTED REALITY - An augmented reality filter selecting user equipment includes a display unit to display a real image including a target object and a plurality of filter icons, a user input unit to receive a user command, the user command including a selection of a target filter icon and a movement of the target filter icon, and a control unit to control the display unit to display the target object and filtered AR information corresponding to the target object. A method for selecting a filter includes displaying a real image including a target object and a plurality of filter icons, receiving a command to select a target filter icon, applying the target filter icon by moving the selected target filter icon onto a target object on the displayed image, and displaying the target object and information corresponding to the target object, in which the target filter icon is applied.02-16-2012
20120038670APPARATUS AND METHOD FOR PROVIDING AUGMENTED REALITY INFORMATION - An apparatus, system, and method for providing augmented reality (AR) information of a concealed object are disclosed. The method for providing AR information of a concealed object by a terminal connectable to a server via a wired and/or wireless communication network may include acquiring an image of a real environment; defining a reference object included in the acquired image; obtaining image capturing position information about a position of the image and reference object recognition information of the defined reference object; transmitting the obtained image capturing position information and the reference object recognition information to the server; receiving information about concealed objects from the server, the concealed objects being disposed behind the reference object along or about a direction from the image capturing position to the reference object; and outputting the received information about concealed objects.02-16-2012
20120306920AUGMENTED REALITY AND FILTERING - A system (and corresponding method) that can enhance a user experience by augmenting real-world experiences with virtual-world data is provided. The augmented reality system discloses various techniques to personalize real-world experiences by overlaying or interspersing virtual capabilities (and data) with real-world situations. The innovation can also filter, rank, modify, or ignore virtual-world information based upon a particular real-world class, user identity, or context.12-06-2012
20120306918IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM - Disclosed herein is an image processing apparatus including a display control part configured to display a human-figure virtual object image in a pose from which to extract information necessary for motion capture, the human-figure virtual object image being the object to be handled corresponding to a person targeted to be recognized.12-06-2012
20120306917COMPUTER-READABLE STORAGE MEDIUM HAVING STORED THEREIN IMAGE DISPLAY PROGRAM, IMAGE DISPLAY APPARATUS, IMAGE DISPLAY METHOD, IMAGE DISPLAY SYSTEM, AND MARKER - A content of an image corresponding to a recognition object in a captured image is identified, and first identification information and second identification information are acquired. Based on the first identification information and second identification information, one display object is determined from a plurality of virtual objects stored in advance in a predetermined storage medium. Then, an image of the determined virtual object captured by a virtual camera is displayed on a predetermined display section.12-06-2012
20120306919IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM - Disclosed herein is an image processing apparatus including: an image processing part configured such that if an image taken of a user includes an image of the clothes worn by the user and making up a clothes region, if the image of the clothes is to be replaced with an image of virtual clothes prepared beforehand and making up a virtual clothes region, and if the clothes region overlaid with the virtual clothes region has a protruded region protruding from the virtual clothes region, then the image processing part performs a process of making the virtual clothes region coincide with the clothes region.12-06-2012
20110304648MOBILE TERMINAL AND METHOD FOR OPERATING THE MOBILE TERMINAL - A mobile terminal and a method for operating the mobile terminal are provided. The method senses touch of the mobile terminal in a predetermined mode and senses movement of the mobile terminal upon determining that the mobile terminal has been gripped based on the touch and then changes the mode of the mobile terminal according to the predetermined mode and the movement of the mobile terminal. This method enhances user convenience since it is possible to change the mode of the mobile terminal through movement of the mobile terminal while the mobile terminal is gripped.12-15-2011
20110304647INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD - An information processing section of a game apparatus executes a program which includes: acquiring a real world image; setting the most recent view matrix of a virtual camera based on a detected marker.12-15-2011
20100045701AUTOMATIC MAPPING OF AUGMENTED REALITY FIDUCIALS - Systems and methods expedite and improve the process of configuring an augmented reality environment. A method of pose determination according to the invention includes the step of placing at least one synthetic fiducial in a real environment to be augmented. A camera, which may include apparatus for obtaining directly measured camera location and orientation (DLMO) information, is used to acquire an image of the environment. The natural and synthetic fiducials are detected, and the pose of the camera is determined using a combination of the natural fiducials, the synthetic fiducial if visible in the image, and the DLMO information if determined to be reliable or necessary. The invention is not limited to architectural environments, and may be used with instrumented persons, animals, vehicles, and any other augmented or mixed reality applications.02-25-2010
20120038671USER EQUIPMENT AND METHOD FOR DISPLAYING AUGMENTED REALITY WINDOW - A user equipment to display an augmented reality (AR) window includes a display unit to display an image and AR windows corresponding to objects included in the image, and a control unit to determine an arrangement pattern of the AR windows by adjusting at least one of a size, a display location, a display pattern, and a color of the AR windows and to control the display unit to display the AR windows in the determined arrangement pattern, together with the objects. A method includes detecting the object in the image, generating the AR window corresponding to the object, determining an arrangement pattern of the AR window based on an adjustment of an attribute, and displaying the AR window in the determined arrangement pattern along with the object.02-16-2012
20120098859APPARATUS AND METHOD FOR PROVIDING AUGMENTED REALITY USER INTERFACE - An apparatus and method for providing an augmented reality (AR) user interface. The method includes acquiring an image containing at least one object; recognizing the object from the acquired image; detecting AR information related to the recognized object; classifying the detected AR information into groups according to specific property information; and generating a user interface that displays the groups of AR information separately.04-26-2012
20110316880METHOD AND APPARATUS PROVIDING FOR ADAPTATION OF AN AUGMENTATIVE CONTENT FOR OUTPUT AT A LOCATION BASED ON A CONTEXTUAL CHARACTERISTIC - An apparatus may include a contextual characteristic determiner configured to determine a contextual characteristic of a location. A sensory device may collect sensed data and location information which is used to determine the contextual characteristic of the location. The sensed data, contextual characteristic and/or location information may be compiled into a database by a database compiler. Further, an ambient content package sharer may request and/or provide sensed data and/or determined contextual characteristics to other devices or the database compiler for inclusion in the database. An augmentative content adaptor may thereby provide for adaptation of an augmentative content for output at the location based on the contextual characteristic. Contextual characteristics may include audible contextual characteristics and visual contextual characteristics.12-29-2011
20120044264APPARATUS AND METHOD FOR PROVIDING AUGMENTED REALITY - An apparatus and method for providing augmented reality (AR) includes acquiring an image of a real world including a first object, setting the first object as a reference object, acquiring a photographing position, a photographing direction, and a distance value between the reference object and the photographing position, acquiring map information corresponding to the photographing position and a photographing direction, mapping the reference object to the map information by using the acquired distance value, detecting AR information of the objects from the map information, and outputting the detected AR information.02-23-2012
20120044263TERMINAL DEVICE AND METHOD FOR AUGMENTED REALITY - A terminal device and method for augmented reality (AR) are disclosed herein. The terminal device includes: a communication unit to communicate with an object server, the object server storing images of a plurality of objects and property information corresponding to levels of each object; an object recognition unit to recognize an object contained in the image; and a control unit to receive, from the object server, property information corresponding to a pixel value of the recognized object and to combine the received property information and the recognized object.02-23-2012
20120001938METHODS, APPARATUSES AND COMPUTER PROGRAM PRODUCTS FOR PROVIDING A CONSTANT LEVEL OF INFORMATION IN AUGMENTED REALITY - An apparatus for providing a constant level of information in an augmented reality environment may include a processor and memory storing executable computer program code that cause the apparatus to at least perform operations including determining a first number of points of interest associated with a first set of real world objects of a current location(s). The first set of real world objects is currently displayed. The computer program code may further cause the apparatus to determine whether the first number is below a predetermined threshold and may increase a view range of a device to display a second set of real world objects. The view range may be increased in order to increase the first number to a second number of points of interest that corresponds to the threshold, based on determining that the first number is below the threshold. Corresponding methods and computer program products are also provided.01-05-2012
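The "constant level of information" behavior in the abstract above can be sketched as a simple loop: if the number of points of interest (POIs) within the current view range falls below the threshold, widen the range until enough POIs are displayed. This is an illustrative assumption of the mechanism, not the patented implementation; the POI structure, step size, and cap are invented for the example.

```python
# Hypothetical sketch: widen the AR view range until the POI count
# meets a predetermined threshold (or a maximum range is reached).

def visible_pois(pois, view_range):
    """Return POIs whose distance from the device is within view_range."""
    return [p for p in pois if p["distance"] <= view_range]

def adjust_view_range(pois, view_range, threshold, step=100.0, max_range=5000.0):
    """Increase view_range in fixed steps until at least `threshold` POIs are visible."""
    while len(visible_pois(pois, view_range)) < threshold and view_range < max_range:
        view_range += step
    return view_range
```

With POIs at distances 50, 300 and 900 and a threshold of 2, a starting range of 100 grows until the second POI comes into view.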
20120001939METHODS, APPARATUSES AND COMPUTER PROGRAM PRODUCTS FOR AUTOMATICALLY GENERATING SUGGESTED INFORMATION LAYERS IN AUGMENTED REALITY - An apparatus for automatically suggesting information layers in augmented reality may include a processor and memory storing executable computer program code that cause the apparatus to at least perform operations including providing layers of information relating to virtual information corresponding to information indicating a current location of the apparatus. The computer program code may further cause the apparatus to determine that a layer(s) of information is enabled to provide virtual information for display. The virtual information corresponds to locations of real world objects in or proximate to the current location. The computer program code may further cause the apparatus to determine other information layers associated with content for the current location based on the number of items of virtual information for the enabled layer being below a threshold and automatically suggest one or more other layers of information for selection. Corresponding methods and computer program products are also provided.01-05-2012
20120062595METHOD AND APPARATUS FOR PROVIDING AUGMENTED REALITY - There is provided a method of providing Augmented Reality (AR) using the relationship between objects in a server that is accessible to at least one terminal through a wired/wireless communication network, including: recognizing a first object-of-interest from first object information received from the terminal; detecting identification information and AR information about related objects associated with the first object-of-interest, and storing the identification information and AR information about the related objects; recognizing, when receiving second object information from the terminal, a second object-of-interest using the identification information about the related objects; and detecting AR information corresponding to the second object-of-interest from the AR information about the related objects, and transmitting the detected AR information to the terminal.03-15-2012
20110096093IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD AND PROGRAM - There is provided an image processing device, including: a data storage unit storing feature data indicating a feature of appearance of an object; an environment map generating unit for generating an environment map representing a position of one or more objects existing in a real space based on an input image obtained by imaging the real space using an imaging device and the feature data stored in the data storage unit; and an output image generating unit for generating an output image obtained by erasing an erasing target object from the input image based on a position of the erasing target object specified out of objects present in the input image represented in the environment map and a position of the imaging device.04-28-2011
20110063324IMAGE PROJECTION SYSTEM, IMAGE PROJECTION METHOD, AND IMAGE PROJECTION PROGRAM EMBODIED ON COMPUTER READABLE MEDIUM - An image projection system includes a projector to project a projected image onto a drawing surface of a whiteboard, a drawn-image detecting portion to detect a drawn image which is drawn on the drawing surface of the whiteboard while the projector is projecting the projected image, and a modification portion which, in the case where the drawn image is detected, specifies, from the projected image which is projected onto the projection surface, a part including at least a drawn image part overlapping the detected drawn image, and modifies the specified part of the projected image, on the basis of the detected drawn image, so as to emphasize or cancel the drawn image.03-17-2011
20120206485AR GLASSES WITH EVENT AND SENSOR TRIGGERED USER MOVEMENT CONTROL OF AR EYEPIECE FACILITIES - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes event and sensor triggered user movement control.08-16-2012
20120007886INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM - Disclosed herein is an information processing apparatus configured to edit video, including: a computer graphics image generation block configured to execute realtime rendering of a computer graphics animation by use of a timeline time with a fraction permitted for a seconds value that is a minimum unit as a parameter indicative of a temporal position of the computer graphics animation; an operation input block configured to enter a user operation for specifying progression of the computer graphics animation; and a control block configured to control the computer graphics image generation block in response to the user operation entered through the operation input block.01-12-2012
20120007885System and Method for Viewing Golf Using Virtual Reality - A system and method for viewing artificial reality (AR) messages on a golf course, where the messages are geo-referenced artificial reality words or symbols to indicate distances, tips, targets or other information to the golfer. Typically, the AR messages are geo-referenced to a fixed location on the golf hole, such as a hazard or green. Using the spectator's chosen location as the viewing origin, an artificial reality message or object is inserted into the golfer's perspective view of the golf hole. Outings and contests can be held even if the matches are separated by hours or days, and outcomes and information published to select groups or individuals.01-12-2012
20120007884APPARATUS AND METHOD FOR PLAYING MUSICAL INSTRUMENT USING AUGMENTED REALITY TECHNIQUE IN MOBILE TERMINAL - An apparatus and a method related to an application of a mobile terminal using an augmented reality technique capture an image of a musical instrument directly drawn/sketched by a user to recognize the particular relevant musical instrument, and provide an effect of playing the musical instrument on the recognized image as if a real instrument were being played. The apparatus preferably includes an image recognizer and a sound source processor. The image recognizer recognizes a musical instrument on an image through a camera. The sound source processor outputs the recognized musical instrument on the image on a display unit to use the same for a play, and matches the musical instrument play on the image to a musical instrument play output on the display unit.01-12-2012
20120069051Method and System for Compositing an Augmented Reality Scene - Disclosed are systems and methods for compositing an augmented reality scene, the methods including the steps of extracting, by an extraction component into a memory of a data-processing machine, at least one object from a real-world image detected by a sensing device; geometrically reconstructing at least one virtual model from the at least one object; and compositing AR content from the at least one virtual model in order to augment the AR content on the real-world image, thereby creating an AR scene. Preferably, the method further includes: extracting at least one annotation from the real-world image into the memory of the data-processing machine for modifying the at least one virtual model according to the at least one annotation. Preferably, the method further includes: interacting with the AR scene by modifying the AR content based on modification of the at least one object and/or the at least one annotation in the real-world image.03-22-2012
20120113140Augmented Reality with Direct User Interaction - Augmented reality with direct user interaction is described. In one example, an augmented reality system comprises a user-interaction region, a camera that captures images of an object in the user-interaction region, and a partially transparent display device which combines a virtual environment with a view of the user-interaction region, so that both are visible at the same time to a user. A processor receives the images, tracks the object's movement, calculates a corresponding movement within the virtual environment, and updates the virtual environment based on the corresponding movement. In another example, a method of direct interaction in an augmented reality system comprises generating a virtual representation of the object having the corresponding movement, and updating the virtual environment so that the virtual representation interacts with virtual objects in the virtual environment. From the user's perspective, the object directly interacts with the virtual objects.05-10-2012
20120056898IMAGE PROCESSING DEVICE, PROGRAM, AND IMAGE PROCESSING METHOD - There is provided an image processing device including: a recognition unit configured to recognize a plurality of users present in an input image captured by an imaging device; an information acquisition unit configured to acquire display information to be displayed in association with each user recognized by the recognition unit; a weight determination unit configured to determine a weight of each user recognized by the recognition unit; and an output image generation unit configured to generate an output image by determining a display position of the display information associated with each user on the basis of the weight of each user determined by the weight determination unit and overlaying the display information on the input image in the determined display position.03-08-2012
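The weight-based layout step in the abstract above can be illustrated with a toy slot model: users with higher weight receive earlier (more prominent) display positions. The slot list, field names, and ranking rule are assumptions for illustration only, not the patented method.

```python
# Hypothetical sketch: assign display slots to recognized users in
# descending order of their determined weight.

def layout(users, slots):
    """Map each user's name to a slot; higher weight gets an earlier slot."""
    ranked = sorted(users, key=lambda u: u["weight"], reverse=True)
    return {u["name"]: slot for u, slot in zip(ranked, slots)}
```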
20120026191METHOD FOR DISPLAYING AUGMENTATION INFORMATION IN AN AUGMENTED REALITY SYSTEM - A method for displaying augmentation information in an augmented reality system.02-02-2012
20120026192APPARATUS AND METHOD FOR PROVIDING AUGMENTED REALITY (AR) USING USER RECOGNITION INFORMATION - An apparatus and method for providing Augmented Reality (AR) using user recognition information includes: acquiring user recognition information for a real object; selecting a virtual object that is to be mapped to the acquired user recognition information; and adding the user recognition information as mapping information for the selected virtual object.02-02-2012
20120026190METHODS AND SYSTEMS FOR ATTITUDE DIFFERENTIATION IN ENHANCED VISION IMAGES - Methods and systems are provided for displaying information on a flight deck display onboard an aircraft. An exemplary method comprises obtaining image data for an imaging region and displaying, on the display device, a graphical representation of a first portion of the image data using a first visually distinguishable characteristic and a graphical representation of a second portion of the image data using a second visually distinguishable characteristic. The first portion corresponds to a portion of the image data above an attitude reference and the second portion corresponds to a portion of the image data below the attitude reference, and the first visually distinguishable characteristic and the second visually distinguishable characteristic are different.02-02-2012
20120105473LOW-LATENCY FUSING OF VIRTUAL AND REAL CONTENT - A system that includes a head mounted display device and a processing unit connected to the head mounted display device is used to fuse virtual content into real content. In one embodiment, the processing unit is in communication with a hub computing device. The processing unit and hub may collaboratively determine a map of the mixed reality environment. Further, state data may be extrapolated to predict a field of view for a user in the future at a time when the mixed reality is to be displayed to the user. This extrapolation can remove latency from the system.05-03-2012
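The latency-removing extrapolation described above can be sketched minimally: given two timestamped orientation samples, predict the orientation at the future display time. A real head-mounted system would use quaternions and sensor fusion; the single scalar yaw angle here is a simplifying assumption.

```python
# Hypothetical sketch: linearly extrapolate head yaw (degrees) from two
# timestamped samples to the time at which the frame will be displayed.

def extrapolate_yaw(t0, yaw0, t1, yaw1, t_display):
    """Predict yaw at t_display from samples (t0, yaw0) and (t1, yaw1)."""
    rate = (yaw1 - yaw0) / (t1 - t0)          # angular velocity, deg/s
    return yaw1 + rate * (t_display - t1)     # project forward to display time
```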
20120062596PROVIDING AUGMENTED REALITY INFORMATION - A system, method and computer program product for providing augmented reality information is disclosed. The method includes capturing an image of a set of items with an image capturing component coupled to a network-enabled computing device associated with a user identifier. The captured image is processed to identify each item of the set of items while a predefined list of the user's preferences is retrieved using the user identifier. For each identified item, a check is made as to whether the item matches a condition related to the predefined list of the user's preferences. Based on the matching result, item information is conveyed to the network-enabled computing device and overlaid on the image.03-15-2012
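The preference-matching step in the abstract above reduces to a filter: each identified item is checked against the user's preference list, and only matching items have their information conveyed for overlay. The item fields and the category-membership condition are assumptions for illustration.

```python
# Hypothetical sketch: select overlay information for items whose
# category appears in the user's retrieved preference list.

def info_to_overlay(identified_items, preferences):
    """Return the info payload of each item matching the user's preferences."""
    return [item["info"] for item in identified_items
            if item["category"] in preferences]
```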
20120154440AUGMENTED 2D REPRESENTATION OF MOLECULAR STRUCTURES - A method, computing apparatus, and computer readable medium, for augmenting and displaying a 2D-representation of a molecular structure, or assemblage of molecular structures, augmented with various graphical elements. The technology further provides various functionality that permits a user to define the form and number of types of graphical elements to apply to a 2-D structure.06-21-2012
20120154441AUGMENTED REALITY DISPLAY SYSTEM AND METHOD FOR VEHICLE - A system includes a head front display device, an eye position tracking camera to track movement of a driver's irises, a front view camera to take a picture of a front view of the driver, a head front display device controller to implement at least one of an angle change, forward movement, backward movement, upward movement, downward movement, leftward movement, and rightward movement of the head front display device, an image adjuster to adjust an object displayed on the head front display device in association with an object of an actual view seen through the front window of the vehicle based on positions of the driver's irises obtained through the eye position tracking camera and an image of the front view obtained by the front view camera, and a display unit controlled by the image adjuster and configured to display information on the head front display device.06-21-2012
20120154439APPARATUS AND METHOD FOR OPERATING MULTIPLE OBJECT OF AUGMENTED REALITY SYSTEM - An apparatus for operating multiple objects in an augmented reality system converts a reference point recognized in an input image into copyable basic data, then copies the basic data to each position of a screen where the image is to be output, and then augments an object by using a copy of the basic data as the reference point.06-21-2012
20120105474METHOD AND APPARATUS FOR DETERMINING LOCATION OFFSET INFORMATION - An approach is provided for determining location offset information. A correction manager determines to present, at a device, a location-based display including one or more representations of one or more location-based features. Next, the correction manager receives an input for specifying offset information for at least one of the one or more representations with respect to the location-based display. Then, the correction manager determines to present the one or more representations in the location-based display based, at least in part, on the offset information.05-03-2012
20120105477APPARATUS AND METHOD FOR DISPLAYING DATA IN PORTABLE TERMINAL - An apparatus and method for displaying data in a portable terminal to control data displayed on a projection beam screen. The apparatus includes a beam projector unit for displaying data on a beam screen, at least one camera unit for capturing the data displayed on the beam screen, and a controller for extracting a differential region between data to be displayed on the beam screen and the displayed data captured by the camera unit and displaying the data on the beam screen according to a display screen region excluding the differential region therefrom.05-03-2012
20120105476Range of Focus in an Augmented Reality Application - A computer-implemented augmented reality method includes receiving one or more indications, entered on a mobile computing device by a user of the mobile computing device, of a distance range for determining items to display with an augmented reality application, the distance range representing geographic distance from a base point where the mobile computing device is located. The method also includes selecting, from items in a computer database, one or more items that are located within the distance range from the mobile computing device entered by the user, and providing data for representing labels for the selected one or more items on a visual display of the mobile computing device, the labels corresponding to the selected items, and the items corresponding to geographical features that are within the distance range as measured from the mobile computing device.05-03-2012
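Selecting database items within a user-entered distance range, as described above, can be sketched with the standard haversine great-circle distance. The item structure and base-point representation are assumptions; the haversine formula itself is a common choice, not necessarily the one used in the patent.

```python
# Hypothetical sketch: filter items to those within `max_m` meters of a
# base point, using the haversine great-circle distance.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0                          # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def items_in_range(base, items, max_m):
    """Select items whose distance from `base` (lat, lon) is within max_m."""
    return [i for i in items
            if haversine_m(base[0], base[1], i["lat"], i["lon"]) <= max_m]
```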
20110090253AUGMENTED REALITY LANGUAGE TRANSLATION SYSTEM AND METHOD - A real-time augmented-reality machine translation system and method are provided herein.04-21-2011
20110102459AUGMENTED REALITY GAMING VIA GEOGRAPHIC MESSAGING - Geographic gaming via a scalable, wireless geographic broadcast protocol enables multiplayer gaming between communication devices without relying on traditional network elements. Games can be fully distributed over an ad hoc network of mobile communications devices. The scalable nature of the wireless geographic broadcast protocol enables multiplayer games to function equally well in both remote areas with no or little network service and in crowded areas containing both game players and other users of mobile communications devices. Wireless geographic broadcast messages distributed among multiplayer game participants can be used to control gameplay features and/or game elements of multiplayer games. Embodiments include simulated artillery battles, simulated throw and catch games, and simulated reconnaissance elements.05-05-2011
20110102460PLATFORM FOR WIDESPREAD AUGMENTED REALITY AND 3D MAPPING - A client device sends the following data to the servers: still frames from captured video and in some embodiments other data such as GPS coordinates, compass reading, and accelerometer data. The servers break down each frame into feature points and match those feature points to existing point cloud data to determine the client device's point of view (POV). The servers send the resulting information back to the client device, which uses the POV information to render augmentation content on a video stream. Information sent by client devices to the server can be used to augment the feature-point cloud.05-05-2011
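The server-side matching step described above can be illustrated with nearest-neighbour descriptor matching under Lowe's ratio test, a standard technique for associating frame features with stored map points. The plain-tuple descriptors and brute-force search are simplifying assumptions; production systems use SIFT/ORB descriptors and approximate search structures, and this is not claimed to be the patented pipeline.

```python
# Hypothetical sketch: match frame feature descriptors against a stored
# point cloud by nearest-neighbour distance with a ratio test.

def l2(a, b):
    """Euclidean distance between two descriptor vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def match_features(frame_desc, cloud, ratio=0.8):
    """Return (frame_index, cloud_index) pairs that pass the ratio test."""
    matches = []
    for i, d in enumerate(frame_desc):
        ranked = sorted(range(len(cloud)), key=lambda j: l2(d, cloud[j]))
        # Accept only if the best match is clearly closer than the runner-up.
        if len(ranked) > 1 and l2(d, cloud[ranked[0]]) < ratio * l2(d, cloud[ranked[1]]):
            matches.append((i, ranked[0]))
    return matches
```

The resulting 2D-3D correspondences would then feed a pose solver (e.g. PnP) to recover the device's point of view.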
20120120101AUGMENTED REALITY SYSTEM FOR SUPPLEMENTING AND BLENDING DATA - A system, method, and computer program product for automatically combining computer-generated imagery with real-world imagery in a portable electronic device by retrieving, manipulating, and sharing relevant stored videos, preferably in real time. A video is captured with a hand-held device and stored. Metadata including the camera's physical location and orientation is appended to a data stream, along with user input. The server analyzes the data stream and further annotates the metadata, producing a searchable library of videos and metadata. Later, when a camera user generates a new data stream, the linked server analyzes it, identifies relevant material from the library, retrieves the material and tagged information, adjusts it for proper orientation, then renders and superimposes it onto the current camera view so the user views an augmented reality.05-17-2012
20120162254OBJECT MAPPING TECHNIQUES FOR MOBILE AUGMENTED REALITY APPLICATIONS - Techniques are disclosed that involve mobile augmented reality (MAR) applications in which users (e.g., players) may experience augmented reality (e.g., altered video or audio based on a real environment). Such augmented reality may include various alterations. For example, particular objects may be altered to appear differently. Such alterations may be based on stored profiles and/or user selections. Further features may also be employed. For example, in embodiments, characters and/or other objects may be sent (or caused to appear) to other users in other locations. Also, a user may leave a character at another location and receive an alert when another user/player encounters this character. Also, characteristics of output audio may be affected based on events of the MAR application.06-28-2012
20100245387SYSTEMS AND METHODS FOR COMBINING VIRTUAL AND REAL-TIME PHYSICAL ENVIRONMENTS - Systems, methods and structures for combining virtual reality and real-time environment by combining captured real-time video data and real-time 3D environment renderings to create a fused, that is, combined environment, including capturing video imagery in RGB or HSV/HSV color coordinate systems and processing it to determine which areas should be made transparent, or have other color modifications made, based on sensed cultural features, electromagnetic spectrum values, and/or sensor line-of-sight, wherein the sensed features can also include electromagnetic radiation characteristics such as color, infra-red, ultra-violet light values, cultural features can include patterns of these characteristics, such as object recognition using edge detection, and whereby the processed image is then overlaid on, and fused into a 3D environment to combine the two data sources into a single scene to thereby create an effect whereby a user can look through predesignated areas or “windows” in the video image to see into a 3D simulated world, and/or see other enhanced or reprocessed features of the captured image.09-30-2010
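The color-based transparency step described above amounts to keying: pixels near a predesignated "window" color are made transparent so the rendered 3D environment shows through. The RGB key color, tolerance, and flat pixel lists are assumptions for illustration; the abstract notes real systems may instead work in HSV.

```python
# Hypothetical sketch: fuse a captured video frame over a synthetic 3D
# render, keying out pixels near a designated window color.

def fuse(video, synthetic, key=(0, 255, 0), tol=30):
    """Per pixel, show the synthetic scene where video matches the key color."""
    out = []
    for v, s in zip(video, synthetic):
        near_key = all(abs(c - k) <= tol for c, k in zip(v, key))
        out.append(s if near_key else v)   # transparent -> 3D scene visible
    return out
```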
20120127203MIXED REALITY DISPLAY - An image processing device includes capture optics for capturing light-field information for a scene, and a display unit for providing a display of the scene to a viewer. A tracking unit tracks relative positions of a viewer's head and the display and the viewer's gaze to adjust the display based on the relative positions and to determine a region of interest on the display. A virtual tag location unit determines locations to place one or more virtual tags on the region of interest, by using computational photography of the captured light-field information to determine depth information of an object in the region of interest. A mixed-reality display is produced by combining display of the virtual tags with the display of objects in the scene.05-24-2012
20120127202SYSTEM AND METHOD FOR PROVIDING DELIVERY INFORMATION - Provided are a system and a method for logistics delivery using augmented reality, which are used for efficient logistics delivery through generation of an optimal delivery route and improvement of a delivery rate. The present invention is constituted by a mobile terminal which is used by a delivery man, a logistics information system, and a mobile communication company's server. When the delivery man scans an image in a delivery scheduled area by using the mobile terminal, a mobile apparatus displays detailed delivery information regarding a delivery point in an area for each set area unit and whether a customer is positioned in the vicinity of an address and generates the optimal delivery route.05-24-2012
20120127201APPARATUS AND METHOD FOR PROVIDING AUGMENTED REALITY USER INTERFACE - An apparatus and method for providing an augmented reality user interface are provided. The method may be as follows. An augmented reality image is stored. The augmented reality image is obtained by overlapping an image with augmented reality information, which is related to at least one object included in the image. The stored augmented reality image and an augmented reality image, which is captured in real time, are output through a divided display user interface at the same time.05-24-2012
20120299963METHOD AND SYSTEM FOR SELECTION OF HOME FIXTURES - A system for selecting one of a plurality of home fixtures, includes a processor, a memory including a database, an input device, and a display. The processor executes instructions to present information corresponding to the home fixtures. The memory is in communication with the processor. The database contains the information corresponding to the home fixtures, such as materials, models, category, accessories, 360° views, high definition images and specifications. The database also contains information corresponding to a context, such as a counter top surface, that may be associated with the home fixtures. The input device permits a user to select the one of the home fixtures to be presented. The display is in communication with the processor, and is configured to show the information corresponding to the home fixtures. The system may also include a camera that permits acquisition of information corresponding to a custom context for the home fixture.11-29-2012
20120162258METHOD, SYSTEM, AND COMPUTER-READABLE RECORDING MEDIUM FOR PROVIDING INFORMATION ON AN OBJECT USING VIEWING FRUSTUMS - The present invention relates to a method for providing information on an object by using viewing frustums. The method includes the steps of: (a) specifying at least two viewing frustums whose vertexes are visual points of respective user terminals; and (b) calculating a degree of interest in the object by referring to the object commonly included in both a first viewing frustum whose vertex is a visual point of a first user terminal and a second one whose vertex is a visual point of a second user terminal.06-28-2012
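The viewing-frustum idea above can be illustrated in 2D: an object's degree of interest is the number of user terminals whose view cone (with its vertex at the terminal's visual point) contains the object. The flat 2D cone standing in for a 3D frustum, and the simple count as the interest measure, are assumptions made purely for illustration.

```python
# Hypothetical sketch: count how many terminals' view cones contain an
# object, as a proxy for its degree of interest.
import math

def in_view_cone(viewpoint, heading_deg, fov_deg, obj):
    """True if obj lies within the cone of half-angle fov/2 about heading."""
    dx, dy = obj[0] - viewpoint[0], obj[1] - viewpoint[1]
    bearing = math.degrees(math.atan2(dy, dx))
    diff = (bearing - heading_deg + 180) % 360 - 180   # wrap to [-180, 180)
    return abs(diff) <= fov_deg / 2

def degree_of_interest(terminals, obj):
    """Count terminals (viewpoint, heading, fov) whose cone contains obj."""
    return sum(in_view_cone(vp, h, fov, obj) for vp, h, fov in terminals)
```

An object seen by two facing terminals scores 2, matching the abstract's notion of an object commonly included in both frustums.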
20120162255TECHNIQUES FOR MOBILE AUGMENTED REALITY APPLICATIONS - Techniques are disclosed that involve mobile augmented reality (MAR) applications in which users (e.g., players) may experience augmented reality. Further, the actual geographical position of MAR application objects (e.g., players, characters, and other objects) may be tracked, represented, and manipulated. Accordingly, MAR objects may be tracked across multiple locations (e.g., multiple geographies and player environments). Moreover, MAR content may be manipulated and provided to the user based on a current context of the user.06-28-2012
20120212509Providing an Interactive Experience Using a 3D Depth Camera and a 3D Projector - An interaction system is described which uses a depth camera to capture a depth image of a physical object placed on, or in vicinity to, an interactive surface. The interaction system also uses a video camera to capture a video image of the physical object. The interaction system can then generate a 3D virtual object based on the depth image and video image. The interaction system then uses a 3D projector to project the 3D virtual object back onto the interactive surface, e.g., in a mirrored relationship to the physical object. A user may then capture and manipulate the 3D virtual object in any manner. Further, the user may construct a composite model based on smaller component 3D virtual objects. The interaction system uses a projective texturing technique to present a realistic-looking 3D virtual object on a surface having any geometry.08-23-2012
20120212508PROVIDING A CORRECTED VIEW BASED ON THE POSITION OF A USER WITH RESPECT TO A MOBILE PLATFORM - A mobile platform displays a corrected view of an image and/or augmented reality (AR) data based on the position of the user with respect to the mobile platform. The corrected view is produced by determining a position of the user with respect to the mobile platform using an image of the user from a backward facing camera. The display information is provided in the form of an image or video frame of the environment captured with a forward facing camera or AR data. The position of the user with respect to the mobile platform is used to determine the portion of the display information to be displayed that is aligned with the line of sight between the user and the mobile platform so that the displayed information is aligned with the real world environment.08-23-2012
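The corrected view described above amounts to choosing which crop of the forward-facing frame to display, shifted according to the user's head offset measured by the backward-facing camera. The sketch below is an assumed simplification (linear shift with a made-up gain factor), not the patent's actual geometry.

```python
def display_window(user_offset_x, frame_width, window_width, gain=2.0):
    """Pick the horizontal crop of the forward-camera frame to display.

    Shifts the crop opposite to the user's head offset so the displayed
    portion stays aligned with the line of sight through the device;
    the result is clamped to the frame bounds.
    """
    center = frame_width / 2 - gain * user_offset_x
    left = min(max(center - window_width / 2, 0), frame_width - window_width)
    return int(left), int(left + window_width)
```

With the user centered the crop is centered; moving the head right slides the crop left, emulating a window onto the real scene.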
20120133676STORAGE MEDIUM HAVING STORED THEREON IMAGE PROCESSING PROGRAM, IMAGE PROCESSING APPARATUS, IMAGE PROCESSING SYSTEM, AND IMAGE PROCESSING METHOD - At least one virtual object for which a predetermined color is set is placed in a virtual world. In a captured image captured by a real camera, at least one pixel corresponding to the predetermined color is detected, using color information including at least one selected from the group including RGB values, a hue, a saturation, and a brightness of each pixel of the captured image. When the pixel corresponding to the predetermined color has been detected, a predetermined process is performed on the virtual object for which the predetermined color is set. An image of the virtual world where at least the virtual object is placed is displayed on a display device.05-31-2012
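The detection step above (find pixels matching a predetermined color, then react on the associated virtual object) can be sketched as a simple per-channel tolerance test. The function names, the tolerance value, and the dict-based virtual object are illustrative assumptions only.

```python
def detect_color_pixels(image, target_rgb, tolerance=30):
    """Return (x, y) coordinates of pixels within `tolerance` of the target RGB."""
    hits = []
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            if (abs(r - target_rgb[0]) <= tolerance
                    and abs(g - target_rgb[1]) <= tolerance
                    and abs(b - target_rgb[2]) <= tolerance):
                hits.append((x, y))
    return hits

def update_virtual_object(image, virtual_object):
    """Trigger the object's predetermined process when its color appears in the capture."""
    if detect_color_pixels(image, virtual_object["color"]):
        virtual_object["activated"] = True
    return virtual_object
```

A real implementation would more likely threshold in HSV space for lighting robustness, as the abstract's mention of hue, saturation, and brightness suggests.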
20120313969PROVIDING A SIMULATION OF WEARING ITEMS SUCH AS GARMENTS AND/OR ACCESSORIES - A user may simulate wearing real-wearable items, such as garments and accessories. A virtual-outfitting interface may be provided for presentation to the user. An item-search/selection portion within the virtual-outfitting interface may be provided. The item-search/selection portion may depict one or more virtual-wearable items corresponding to one or more real-wearable items. The user may be allowed to select at least one virtual-wearable item from the item-search/selection portion. A main display portion within the virtual-outfitting interface may be provided. The main display portion may include a composite video feed that incorporates a video feed of the user and the selected at least one virtual-wearable item such that the user appears to be wearing the selected at least one virtual-wearable item in the main display portion.12-13-2012
20120249586METHOD AND APPARATUS FOR PROVIDING COLLABORATION BETWEEN REMOTE AND ON-SITE USERS OF INDIRECT AUGMENTED REALITY - An apparatus for enabling provision of collaboration of remote and on-site users of indirect augmented reality may include at least one processor and at least one memory including computer program code. The at least one memory and the computer program code may be configured, with the processor, to cause the apparatus to perform at least selecting a stored image including a virtual representation of a real world location based on position information and orientation information of a mobile terminal, causing provision of first visual content to be displayed at the mobile terminal based on the virtual representation, causing provision of second visual content to be displayed at a remote device based on the virtual representation, and enabling collaborative interaction between a user of the mobile terminal and a user of the remote device with respect to the first visual content and the second visual content. A corresponding method and computer program product are also provided.10-04-2012
20120176409Computer-Readable Storage Medium Having Image Processing Program Stored Therein, Image Processing Apparatus, Image Processing System, and Image Processing Method - A storage medium has stored therein an image processing program that causes a computer of an image processing apparatus, which is connected to a real camera for taking an image of a real space and a display device that allows the real space to be viewed on a display area thereof, to operate as real space image obtaining means, specific object detection means, calculation means, setting means, identification means, event providing means, virtual space image generation means, and display control means.07-12-2012
20120176411GPS-Based Location and Messaging System and Method - A system and method for viewing a target in a background from a user's perspective. In one form, the views are selectable by the user on, for example, a GPS equipped cell phone, to include a view from the participant's position, zoom, pan, and tilt views, or views from another geographic location, giving increased situational awareness and identification of the target. Other information can be conveyed, such as messages or advertisements, on a billboard, which may be a geo-referenced area on or near the target. Preferably, an orientation mechanism shows when the device is correctly pointed to a target.07-12-2012
20120075345METHOD, TERMINAL AND COMPUTER-READABLE RECORDING MEDIUM FOR PERFORMING VISUAL SEARCH BASED ON MOVEMENT OR POSITION OF TERMINAL - The present invention includes a method for performing visual search based on a movement and/or an angular position of a terminal. The method includes the steps of: (a) sensing a movement and/or an angular position of a terminal by using at least one of sensors; (b) determining whether a triggering event occurs or not by referring to at least one of the sensed movement and the sensed angular position of the terminal; and (c) if the triggering event occurs, allowing visual search to be performed for at least one of objects included in an output image displayed on the terminal at the time of the occurrence of the triggering event; wherein the output image is generated in a form of augmented reality by combining an image inputted through the terminal in real time with information relevant thereto.03-29-2012
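Steps (a)-(c) above reduce to a threshold test on sensed motion that, when satisfied, runs visual search over whatever objects are on screen at that instant. The sketch below is a hedged illustration; the thresholds, the scalar motion model, and the mock search are all assumptions, not the patent's method.

```python
def triggering_event(movement, angle_deg, move_threshold=1.5, angle_threshold=60.0):
    """Decide whether the sensed motion or tilt should trigger a visual search."""
    return movement >= move_threshold or angle_deg >= angle_threshold

def maybe_visual_search(movement, angle_deg, objects_on_screen):
    """Run a (mock) visual search over on-screen objects only when the event fires."""
    if not triggering_event(movement, angle_deg):
        return None
    return [{"object": o, "result": f"search:{o}"} for o in objects_on_screen]
```

A production version would debounce the sensor stream; a single noisy accelerometer sample should not fire a search.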
20120075344INFORMATION DISPLAY DEVICE - To provide an information display device that displays at least one item of display target information in each screen element, receives, while catalog display takes place, an instruction operation made utilizing the display target information shown in the screen elements displayed as a catalog, and executes a process based on the instruction operation.03-29-2012
20120075343AUGMENTED REALITY (AR) SYSTEM AND METHOD FOR TRACKING PARTS AND VISUALLY CUEING A USER TO IDENTIFY AND LOCATE PARTS IN A SCENE - An AR system both identifies and visually tracks parts for a user by maintaining spatial awareness of the user's pose and provides instructions to the user for the use of those parts. Tracking the identified parts, both inside and outside the current Field of View (FOV), and any missing parts for use with the current instruction improves the effectiveness and efficiency of novice and experienced users alike.03-29-2012
20120075342AUGMENTING IMAGE DATA BASED ON RELATED 3D POINT CLOUD DATA - Embodiments of the invention describe processing a first image data and 3D point cloud data to extract a first planar segment from the 3D point cloud data. This first planar segment is associated with an object included in the first image data. A second image data is received, the second image data including the object captured in the first image data. A second planar segment related to the object is generated, where the second planar segment is geometrically consistent with the object as captured in the second image data. This planar segment is generated based, at least in part, on the second image data, the first image data and the first planar segment.03-29-2012
20120075341METHODS, APPARATUSES AND COMPUTER PROGRAM PRODUCTS FOR GROUPING CONTENT IN AUGMENTED REALITY - An apparatus for grouping content in an augmented reality environment may include a processor and memory storing executable computer code that cause the apparatus to at least perform operations including receiving a detection of real world objects, of a current location, that are currently displayed. The computer program code may further cause the apparatus to determine whether one or more of the real world objects are located along a line of direction and determine virtual objects that correspond to the real world objects located along the line. The computer program code may further cause the apparatus to display an item of visible indicia signifying a group, associated with the virtual objects, that is positioned so as to correspond to at least one of the real world objects located along the line. Corresponding methods and computer program products are also provided.03-29-2012
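Determining which real-world objects lie "along a line of direction" from the device, as in the abstract above, can be sketched as a bearing test against the device's heading. The bearing convention, tolerance, and names below are illustrative assumptions.

```python
import math

def bearing_deg(origin, point):
    """Bearing (degrees from the +x axis, counterclockwise) from origin to point."""
    return math.degrees(math.atan2(point[1] - origin[1], point[0] - origin[0])) % 360

def group_along_direction(origin, objects, direction_deg, tolerance_deg=10.0):
    """Collect objects whose bearing lies within `tolerance_deg` of the direction line."""
    group = []
    for name, pos in objects.items():
        diff = abs((bearing_deg(origin, pos) - direction_deg + 180) % 360 - 180)
        if diff <= tolerance_deg:
            group.append(name)
    return group
```

The resulting group would then be represented on screen by a single item of visible indicia, per the abstract, rather than one icon per object.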
20120249587KEYBOARD AVATAR FOR HEADS UP DISPLAY (HUD) - In some embodiments, the invention involves using a heads up display (HUD) or head mounted display (HMD) to view a representation of a user's fingers with an input device communicatively connected to a computing device. The keyboard/finger representation is displayed along with the application display received from a computing device. In an embodiment, the input device has an accelerometer to detect tilting movement in the input device, and send this information to the computing device. An embodiment provides visual feedback of key or control actuation in the HUD/HMD display. Other embodiments are described and claimed.10-04-2012
20120249590SELECTIVE HAND OCCLUSION OVER VIRTUAL PROJECTIONS ONTO PHYSICAL SURFACES USING SKELETAL TRACKING - A head mounted device provides an immersive virtual or augmented reality experience for viewing data and enabling collaboration among multiple users. Rendering images in a virtual or augmented reality system may include performing operations for capturing an image of a scene in which a virtual object is to be displayed, recognizing a body part present in the captured image, and adjusting a display of the virtual object based upon the recognized body part. The rendering operations may also include capturing an image with a body mounted camera, capturing spatial data with a body mounted sensor array, recognizing objects within the captured image, determining distances to the recognized objects within the captured image, and displaying the virtual object on a head mounted display.10-04-2012
20120249592Situational Awareness Components of an Enhanced Vision System - A virtual sphere provided by an enhanced vision system includes synthetic imagery filling said virtual sphere and a common view window mapped to a dedicated position within the synthetic imagery. Imagery of the line of sight of a user is displayed in the common view window. By providing the common view window, visual communication between all users may be possible. By connecting a virtual user to the enhanced vision system and by displaying the imagery for the line of sight of the virtual user in the common view window, the workload of a human operator may be reduced and the time line of actions may be shortened. The enhanced vision system of the present invention may be used, but is not limited to, in a military aircraft to enhance the situational awareness of the flight crew.10-04-2012
20120249588Augmented Reality Data Center Visualization - Datacenter datasets and other information are visually displayed in an augmented reality view using a portable device. The visual display of this information is presented along with a visual display of the actual datacenter environment. The combination of these two displays allows installers and technicians to view instructions or other data that are visually correlated to the environment in which they are working.10-04-2012
20120249589Method for the Output of Graphic Driving Indications - The method outputs graphic driving indications for assisting a motor vehicle driver in a driving maneuver. The graphic driving indications are displayed by a head-up display. A first graphic driving indication is in the form of a traffic lane change indication pointing out to the driver the direction from a traffic lane traveled at the beginning of the maneuver to a desired traffic lane. A second graphic driving indication is output in the form of a contact-analog traffic lane marking graphically emphasizing the desired traffic lane relative to other traffic lanes. A third graphic driving indication is in the form of a contact-analog maneuvering impulse including a driving funnel originating from the desired traffic lane and corresponding to the driving maneuver. A fourth graphic driving indication is a symbolic maneuvering display indication which symbolically displays the beginning driving maneuver after the vehicle enters into the driving funnel.10-04-2012
20120223967Dynamic Perspective Video Window - Systems and methods are disclosed for generating an image for a user based on an image captured by a scene-facing camera or detector. The user's position relative to a component of the system is determined, and the image captured by the scene-facing detector is modified based on the user's position. The resulting image represents the scene as seen from the perspective of the user. The resulting image may be further modified by augmenting the image with additional images, graphics, or other data.09-06-2012
20120223966TERMINAL TO PROVIDE AUGMENTED REALITY - A terminal that displays augmented reality allows tag information of various recognized objects to be chosen and displayed in a more efficient manner. When an augmented reality mode is executed, the terminal provides location information about the terminal and the recognized objects, a category selection ability to select categories of tag information, and a tag information control layout for displaying the selected tag information in different stages on a display in a more efficient manner.09-06-2012
20120256954Interference Based Augmented Reality Hosting Platforms - Interference-based augmented reality hosting platforms are presented. Hosting platforms can include networking nodes capable of analyzing a digital representation of a scene to derive interference among elements of the scene. The hosting platform utilizes the interference to adjust the presence of augmented reality objects within an augmented reality experience. Elements of a scene can constructively interfere, enhancing presence of augmented reality objects; or destructively interfere, suppressing presence of augmented reality objects.10-11-2012
20120256956DISPLAY CONTROL DEVICE, DISPLAY CONTROL METHOD, AND PROGRAM - Aspects of the present invention include a display control device comprising a determining unit configured to determine an orientation of a real object in a real space image. The device may also comprise a control unit configured to select between first and second orientations of a virtual object based on the real object orientation, one of the first or second virtual object orientations aligning the virtual object with the orientation of the real object, and output an image of the virtual object based on the selected orientation, for display on an associated display device.10-11-2012
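The selection step above (choose whichever of two candidate virtual-object orientations aligns with the real object) can be sketched as a minimal angular comparison. Representing orientations as single in-plane angles is an assumed simplification; the patent is not limited to 2D rotation.

```python
def angular_diff(a_deg, b_deg):
    """Smallest unsigned difference between two angles, in degrees."""
    return abs((a_deg - b_deg + 180) % 360 - 180)

def select_orientation(real_deg, candidate_a_deg, candidate_b_deg):
    """Pick whichever candidate orientation better aligns with the real object."""
    if angular_diff(real_deg, candidate_a_deg) <= angular_diff(real_deg, candidate_b_deg):
        return candidate_a_deg
    return candidate_b_deg
```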
20120256955SYSTEM AND METHOD FOR ENABLING AUGMENTED REALITY IN REPORTS - In accordance with various embodiments of the present invention, the electronic device checks whether a viewing application capable of rendering augmented reality (AR) is present in the electronic device. If such a viewing application is present, the viewing application identifies any augmented reality (AR) markers present in the physical document. On identifying an AR marker in the document, the viewing application (or the electronic device) fetches relevant content from a predefined source and displays the relevant content to the user. The relevant content may be any of a chart, a 3D chart, a report, a video recording, and so forth. In case the viewing application is not already present, the electronic device downloads the viewing application from a predefined location.10-11-2012
20120256953SYSTEMS AND METHODS FOR MANAGING ERRORS UTILIZING AUGMENTED REALITY - Systems for managing errors utilizing augmented reality are provided. One system includes a transceiver configured to communicate with a systems management console, capture device for capturing environmental inputs, memory storing code comprising an augmented reality module, and a processor. The processor, when executing the code comprising the augmented reality module, is configured to perform the method below. One method includes capturing an environmental input, identifying a target device in the captured environmental input, and querying the systems management console regarding a status condition for the target device. Also provided are physical computer storage mediums including a computer program product for performing the above method.10-11-2012
20120188279Multi-Sensor Proximity-Based Immersion System and Method - Systems and methods for interaction with a virtual environment are disclosed. In some embodiments, a method comprises generating a virtual representation of a user's non-virtual environment, determining a viewpoint of a user in a non-virtual environment relative to a display, and displaying, with the display, the virtual representation in a spatial relationship with the user's non-virtual environment based on the viewpoint of the user.07-26-2012
20120081394SYSTEM AND METHOD OF IMAGE AUGMENTATION - A method of image augmentation for an image of a book includes capturing an image of the book, detecting at least a portion of at least one fiduciary marker of the book within the image, estimating placement of the book's spine based upon the detected portion of the fiduciary marker, hypothesising possible positions for edges of a rigid leaf being turned in the book based upon estimated placement of the spine, processing the book image to identify edges within the image, comparing elements of the identified edges with the hypothesised positions for edges of the rigid leaf, selecting one of the hypothesised positions that best coincides with the compared elements of the processed image as representative of the position of the rigid leaf being turned in the book, and augmenting the book image with a virtual graphic element arranged in accordance with the selected representative position of the rigid leaf.04-05-2012
20120081393APPARATUS AND METHOD FOR PROVIDING AUGMENTED REALITY USING VIRTUAL OBJECTS - A method for providing AR information, in which the method includes receiving virtual object setting information by a first terminal, in which the virtual object setting information including virtual object selection information and movement setting information; and transmitting a request to a server for uploading a virtual object onto a real-world image of a target location based on the virtual object setting information. An apparatus to provide AR information, in which the apparatus includes a communication unit to process signals received from a server and to transmit signals to the server; a display unit to display a real-world image of a target location; a manipulation unit to receive a user input signal; and a control unit to receive virtual object setting information and to request the server to upload a virtual object onto the real-world image of the target location.04-05-2012
20120081392ELECTRONIC DEVICE OPERATION ADJUSTMENT BASED ON FACE DETECTION - An electronic device and methods of use thereof are described. The electronic device has at least a front facing image capture device and a front facing display device arranged to display visual content. In one embodiment, the front facing camera can capture an image that includes at least image content. The image content can be processed in such a way that an operational state of the electronic device is modified in accordance with the processed image content. In a particular embodiment, the modification of the current operating state can include aligning an orientation of visual content presented by the front facing display with a current facial orientation of a user.04-05-2012
20120262485SYSTEM AND METHOD OF INPUT PROCESSING FOR AUGMENTED REALITY - A method of input processing for augmented reality comprises the steps of capturing a video image, generating an augmented image layer for superposition over the captured video image, and for a region of the augmented image layer, detecting for each pixel in the region a property of a corresponding pixel in the captured video image, and mapping with a first mapping the property detected for each pixel of the region back to a reference two-dimensional array of pixels; and generating an input based upon the property values as mapped to the reference two-dimensional array of pixels.10-18-2012
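The "first mapping" above, which carries a per-pixel property from a region of the augmented layer back to a fixed reference array, can be sketched as nearest-neighbour resampling. The coordinate layout and property choice (the raw pixel value) are illustrative assumptions.

```python
def map_region_to_reference(video, region, ref_w, ref_h):
    """Nearest-neighbour map of a video-frame region onto a fixed reference grid.

    `region` is (x, y, w, h) in video coordinates; the returned ref_h x ref_w
    array holds the property sampled at each corresponding video pixel.
    """
    x0, y0, w, h = region
    ref = []
    for ry in range(ref_h):
        row = []
        for rx in range(ref_w):
            sx = x0 + (rx * w) // ref_w
            sy = y0 + (ry * h) // ref_h
            row.append(video[sy][sx])
        ref.append(row)
    return ref
```

Because the reference array has a fixed size, downstream input generation (e.g. detecting a hand covering a virtual button) is independent of where the region sits in the frame.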
20120229509SYSTEM AND METHOD FOR USER INTERACTION - A system is used for user interaction. When the system is in use, a signal source is configured to provide an image signal to a retina display unit. The retina display unit is configured to project the image signal provided by the signal source onto a user's retina such that the user visually senses a virtual interface. The image signal is displayed on the virtual interface. A camera unit is configured to capture the user's body motion. An identification-interaction unit is configured to determine an interactive operation command corresponding to the user's body motion and transmit the interactive operation command to the signal source.09-13-2012
20120229510STORAGE MEDIUM HAVING STORED THEREON INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD - An action of a first object placed in a virtual world is controlled on the basis of body state data output from a portable display apparatus. An action of a second object placed in the virtual world is controlled on the basis of touch position data based on a touch position on a touch panel provided on a surface of a display screen of the portable display apparatus. Then, a first image including at least a part of the first object and at least a part of the second object is generated, and the first image is displayed on the portable display apparatus.09-13-2012
20120229508THEME-BASED AUGMENTATION OF PHOTOREPRESENTATIVE VIEW - On a display configured to provide a photorepresentative view from a user's vantage point of a physical environment in which the user is located, a method is provided comprising receiving, from the user, an input selecting a theme for use in augmenting the photorepresentative view. The method further includes obtaining, optically and in real time, environment information of the physical environment and generating a spatial model of the physical environment based on the environment information. The method further includes identifying, via analysis of the spatial model, one or more features within the spatial model that each corresponds to one or more physical features in the physical environment. The method further includes based on such analysis, displaying, on the display, an augmentation of an identified feature, the augmentation being associated with the theme.09-13-2012
20100328344METHOD AND APPARATUS FOR AN AUGMENTED REALITY USER INTERFACE - An approach is provided for an augmented reality user interface. An image representing a physical environment is received. Data relating to a horizon within the physical environment is retrieved. A section of the image in which to overlay location information based on the horizon data is determined. Presentation of the location information within the determined section on a user equipment is initiated.12-30-2010
20080297535Terminal device for presenting an improved virtual environment to a user - The virtual environment terminal device comprises a user terminal device that interfaces the user to a computer controlled virtual reality system via the sense of touch by applying forces, vibrations, and/or motions to the user to emulate the sensations that the user would encounter in the environment emulated by the virtual reality system while also providing a three-dimensional image of the workspace by using a view splitting device to display the screens of two different monitors to respective ones of the user's two eyes.12-04-2008
20120320092METHOD AND APPARATUS FOR EXHIBITING MIXED REALITY BASED ON PRINT MEDIUM - An apparatus for exhibiting mixed reality based on a print medium includes a command identification module and a content reproduction module. The command identification module identifies a hand gesture of a user performed on printed matter in the print medium to recognize a user input command corresponding to the hand gesture. The content reproduction module provides a digital content corresponding to the printed matter onto a display area on the print medium.12-20-2012
20120262486SYSTEM AND METHOD OF USER INTERACTION FOR AUGMENTED REALITY - A method of user interaction in augmented reality comprises the steps of capturing a video image of a scene, and for each pixel in at least a sub-region of the captured video, classifying the pixel as either a skin or non-skin pixel responsive to whether the colour of the pixel exceeds a predetermined threshold purity of red; and generating a mask based upon the classification of the pixels of the captured video, generating an augmentation image layer to superpose on the captured video image, and limiting a mode of combination of the captured video and the augmentation image layer, responsive to the mask.10-18-2012
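The per-pixel classification above (skin if the pixel exceeds a predetermined purity of red, then a mask that limits how the augmentation layer combines with the video) can be sketched directly. The purity metric and threshold value below are assumptions for illustration; the patent does not publish its constants.

```python
def is_skin(pixel, red_purity_threshold=0.45):
    """Classify a pixel as skin when red dominates the RGB sum beyond the threshold."""
    r, g, b = pixel
    total = r + g + b
    return total > 0 and r / total > red_purity_threshold

def skin_mask(image, threshold=0.45):
    """Binary mask: 1 where the pixel is classified as skin, else 0."""
    return [[1 if is_skin(p, threshold) else 0 for p in row] for row in image]

def composite(video, overlay, mask):
    """Keep the camera pixel wherever the mask says skin, so hands occlude the overlay."""
    return [[video[y][x] if mask[y][x] else overlay[y][x]
             for x in range(len(video[0]))] for y in range(len(video))]
```

The effect is that a user's hand, reaching over an augmented book page for example, appears in front of the virtual graphics rather than being painted over.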
20110018903AUGMENTED REALITY DEVICE FOR PRESENTING VIRTUAL IMAGERY REGISTERED TO A VIEWED SURFACE - An augmented reality device for inserting virtual imagery into a user's view of their physical environment. The device comprises: a see-through display device including a wavefront modulator; a camera for imaging a surface in the physical environment; and a controller. The controller is configured for capturing an image of the surface; determining the virtual imagery to be displayed at a predetermined position relative to the surface; determining a position of the surface relative to the augmented reality device; generating an image based on the virtual imagery and on the position of the surface relative to the augmented reality device; and displaying the generated image via the display device. Based on pixel depth information, the controller modulates the wavefront curvature of light emitted for each pixel so that the user sees the virtual imagery at the predetermined position relative to the surface regardless of changes in position of the user's eyes with respect to the display device.01-27-2011
20120268493INFORMATION PROCESSING SYSTEM FOR AUGMENTED REALITY - An example of an information processing system according to the present disclosure controls a single virtual object based on a plurality of pieces of real object information including posture information that indicates respective postures of a plurality of real objects, in order to perform control of a virtual object in wider variations when controlling the virtual object using a so-called AR marker.10-25-2012
20120268494HEAD MOUNTED DISPLAY AND CONTROL METHOD THEREFOR - A viewpoint information calculation unit (…)10-25-2012
20120268492IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM - Provided is an image processing apparatus including an image acquisition unit for acquiring an input image, a selection unit for selecting a recognition method of an object shown in the input image from a plurality of recognition methods, a recognition unit for recognising the object shown in the input image using the recognition method selected by the selection unit, and a display control unit for superimposing a virtual object that is associated with the object recognised by the recognition unit onto the input image and displaying the virtual object. The display control unit changes display of the virtual object according to the recognition method selected by the selection unit.10-25-2012
20120268490AUGMENTED REALITY EXTRAPOLATION TECHNIQUES - Augmented reality extrapolation techniques are described. In one or more implementations, an augmented-reality display is rendered based at least in part on a first basis that describes a likely orientation or position of at least a part of the computing device. The rendered augmented-reality display is updated based at least in part on data that describes a likely orientation or position of the part of the computing device that was assumed during the rendering of the augmented-reality display.10-25-2012
20120327117DIGITALLY ENCODED MARKER-BASED AUGMENTED REALITY (AR) - A system and method for markers with digitally encoded geographic coordinate information for use in an augmented reality (AR) system. The method provides accurate location information for registration of digital data and real world images within an AR system. The method includes automatically matching digital data within an AR system by utilizing a digitally encoded marker (DEM) containing world coordinate information system and mathematical offset of digital data and a viewing device. The method further includes encoding geographic coordinate information into markers (e.g., DEMs) and decoding the coordinate information into an AR system. Through use of the method and corresponding system, marker technology and the basis of geo-location technology can be combined into a geo-located marker, thereby solving the problem of providing accurate registration within an augmented reality.12-27-2012
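Encoding geographic coordinates into a marker payload, as the abstract above describes, can be sketched as fixed-point packing of latitude and longitude into a single integer suitable for printing as a 2D code. The precision and packing scheme below are assumptions for illustration, not the patent's format.

```python
def encode_marker(lat, lon, precision=1e-6):
    """Pack latitude/longitude into one integer payload for a printable marker.

    Shifts both coordinates to non-negative fixed-point values, then combines
    them; 1e-6 degrees is roughly 0.1 m of positional resolution.
    """
    lat_i = round((lat + 90.0) / precision)    # 0 .. 180_000_000
    lon_i = round((lon + 180.0) / precision)   # 0 .. 360_000_000
    return lat_i * 360_000_001 + lon_i

def decode_marker(payload, precision=1e-6):
    """Recover the latitude/longitude packed by encode_marker."""
    lat_i, lon_i = divmod(payload, 360_000_001)
    return lat_i * precision - 90.0, lon_i * precision - 180.0
```

On the decoding side, an AR viewer would combine the recovered coordinates with its own pose to compute the mathematical offset between digital data and the viewing device.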
20120327119USER ADAPTIVE AUGMENTED REALITY MOBILE COMMUNICATION DEVICE, SERVER AND METHOD THEREOF - The present disclosure provides an augmented reality mobile communication device and a method and system thereof, which can provide digital content items to individual users by reflecting a user preference associated with user circumstances in the provision of augmented reality. The augmented reality mobile communication device includes: a context inference unit that receives sensory information and predicts a user context regarding a user of a mobile communication device based on the sensory information; a transmission unit that transmits user context data to a server; a receiving unit that receives a personalized content item from the server, the personalized content item being generated based on user profile data and user preference data corresponding to user context data; and an augmented reality content renderer that overlays the received personalized content item on an image captured by a camera.12-27-2012
20120327116TOTAL FIELD OF VIEW CLASSIFICATION FOR HEAD-MOUNTED DISPLAY - Virtual images are located for display in a head-mounted display (HMD) to provide an augmented reality view to an HMD wearer. Sensor data may be collected from on-board sensors provided on an HMD. Additionally, other data may be collected from external sources. Based on the collected sensor data and other data, the position and rotation of the HMD wearer's head relative to the HMD wearer's body and surrounding environment may be determined. After resolving the HMD wearer's head position, the HMD wearer's total field of view (TFOV) may be classified into regions. Virtual images may then be located in the classified TFOV regions to locate the virtual images relative to the HMD wearer's body and surrounding environment.12-27-2012
20120327118DISPLAY CONTROL APPARATUS, DISPLAY CONTROL METHOD AND PROGRAM - There is provided a display control apparatus including a display control unit that adds a virtual display to a real object containing a region associated with a time. The display control unit may add the virtual display to the region.12-27-2012
20120327120METHOD AND APPARATUS FOR CREATING VIRTUAL GRAFFITI IN A MOBILE VIRTUAL AND AUGMENTED REALITY SYSTEM - A method and system is provided for easily creating virtual graffiti that will be left for a particular device to view. During operation a device will be placed near a first point that is used to define a boundary for the virtual graffiti. The device will locate the first point, and use the point to define the boundary. The device will receive an image that is to be used as virtual graffiti, and will fit the image within or upon the boundary of the virtual graffiti. For example, the device may be consecutively placed near four points that will define a polygon to be used as the boundary for the virtual graffiti. An image will then be received, and the image will be fit within the polygon. Additionally, a device may create virtual graffiti from an image and a boundary.12-27-2012
20120327114DEVICE AND ASSOCIATED METHODOLOGY FOR PRODUCING AUGMENTED IMAGES - An augmented image producing device includes a processor programmed to receive scene imagery from an imaging device and to identify at least one marker in the scene imagery. The processor then determines whether at least one marker corresponds to a known pattern and if the marker does correspond to a known pattern, the scene imagery is augmented with computer-generated graphics dispersed from a position of the at least one marker. Once the scene imagery is augmented, the computer-generated graphics are displayed on a display screen. The augmented scene imagery can then be used, for example, to actively engage audience members during an event.12-27-2012
20120327115Signal-enhancing Beamforming in an Augmented Reality Environment - An augmented reality environment allows interaction between virtual and real objects. Beamforming techniques are applied to signals acquired by an array of microphones to allow for simultaneous spatial tracking and signal acquisition from multiple users. Localization information such as from other sensors in the environment may be used to select a particular set of beamformer coefficients and resulting beampattern focused on a signal source. Alternately, a series of beampatterns may be used iteratively to localize the signal source in a computationally efficient fashion. The beamformer coefficients may be pre-computed.12-27-2012
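The delay-and-sum beamforming that the abstract above describes (steering a microphone array toward a localized signal source) can be sketched as follows. This is a generic illustration of the technique, not the patent's implementation; the array geometry, sample rate, and function names are assumptions:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # metres per second, at room temperature

def delay_and_sum(signals, mic_positions, source_pos, fs):
    """Steer a microphone array toward source_pos by delay-and-sum.

    signals: (num_mics, num_samples) array of time-domain samples
    mic_positions: (num_mics, 3) microphone coordinates in metres
    source_pos: (3,) assumed source location in metres
    fs: sample rate in Hz
    """
    # Propagation distance from the source to each microphone
    dists = np.linalg.norm(mic_positions - source_pos, axis=1)
    # Relative delays (seconds), then integer-sample shifts
    delays = (dists - dists.min()) / SPEED_OF_SOUND
    shifts = np.round(delays * fs).astype(int)

    num_mics, n = signals.shape
    out = np.zeros(n)
    for sig, s in zip(signals, shifts):
        # Advance each channel so wavefronts from source_pos align, then sum
        out[: n - s] += sig[s:]
    return out / num_mics
```

Signals arriving from the assumed source location add coherently, while sounds from other directions are attenuated; trying a grid of candidate source positions and picking the one with the strongest output is one simple way to localize iteratively, as the abstract suggests.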
20120092373METHOD FOR PROVIDING INFORMATION ON OBJECT WHICH IS NOT INCLUDED IN VISUAL FIELD OF TERMINAL DEVICE, TERMINAL DEVICE AND COMPUTER READABLE RECORDING MEDIUM - The present invention relates to a method for providing information on an object excluded from a visual field of a terminal in a form of augmented reality (AR) by using an image inputted to the terminal and information related thereto. The method includes the steps of: (a) specifying the visual field of the terminal corresponding to the inputted image by referring to at least one piece of information on a location, a displacement and a viewing angle of the terminal; (b) searching for an object(s) excluded from the visual field of the terminal; and (c) displaying guiding information on the searched object(s) with the inputted image in a form of the augmented reality; wherein the visual field is specified by a viewing frustum whose vertex corresponds to the terminal.04-19-2012
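The frustum test the abstract above relies on (whether an object lies inside the view cone whose vertex is the terminal) can be illustrated in two dimensions. A minimal sketch; the coordinate convention, parameter names, and the restriction to a horizontal view cone are illustrative assumptions, not taken from the patent:

```python
import math

def in_field_of_view(terminal_pos, heading_deg, fov_deg, obj_pos):
    """Return True if obj_pos lies inside the terminal's horizontal view cone.

    terminal_pos, obj_pos: (x, y) map coordinates
    heading_deg: compass-style direction the camera faces (0 = +y axis)
    fov_deg: full horizontal viewing angle
    """
    dx = obj_pos[0] - terminal_pos[0]
    dy = obj_pos[1] - terminal_pos[1]
    bearing = math.degrees(math.atan2(dx, dy))  # compass bearing to the object
    # Signed angular difference wrapped into (-180, 180]
    diff = (bearing - heading_deg + 180) % 360 - 180
    return abs(diff) <= fov_deg / 2
```

Objects failing this test are the "excluded" ones for which the method displays guiding information; a full 3D implementation would also check the frustum's near/far planes and vertical angle.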
20120092372METHOD FOR PROVIDING INFORMATION ON OBJECT WITHIN VIEW OF TERMINAL DEVICE, TERMINAL DEVICE FOR SAME AND COMPUTER-READABLE RECORDING MEDIUM - The present invention relates to a method for providing information on an object included in a visual field of a terminal in a form of augmented reality (AR) by using an image inputted to the terminal and information related thereto. The method includes the steps of: (a) specifying the visual field of the terminal corresponding to the inputted image by referring to at least one piece of information on a location, a displacement and a viewing angle of the terminal; and (b) acquiring a graphic element corresponding to the object, included in the visual field of the terminal, whose identity is recognized by using a technology for matching a building image and displaying the acquired graphic element with the inputted image in the form of the augmented reality by providing the graphic element on a location of the object displayed on a screen of the terminal.04-19-2012
20120092371INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, PROGRAM, AND INFORMATION PROCESSING SYSTEM - Provided is an information processing apparatus including an image acquisition unit for acquiring a real space image including an image of another apparatus, a coordinate system generation unit for generating a spatial coordinate system of the real space image acquired by the image acquisition unit, and a transmission unit for transmitting spatial information constituting the spatial coordinate system generated by the coordinate system generation unit to the other apparatus sharing the spatial coordinate system.04-19-2012
20120092369DISPLAY APPARATUS AND DISPLAY METHOD FOR IMPROVING VISIBILITY OF AUGMENTED REALITY OBJECT - Provided are a display apparatus and a display method for improving visibility of each object by differently displaying each object from the background when providing an augmented reality (AR) service. The display apparatus and the display method may improve visibility of each object by outputting a list of overlapped objects or a map of overlapped objects. Also, the display apparatus and the display method may improve visibility of each object by enlarging a complex area, in which objects are densely disposed, to reduce overlapping of the objects.04-19-2012
20120092368APPARATUS AND METHOD FOR PROVIDING AUGMENTED REALITY (AR) INFORMATION - Provided are a server, a terminal, and a method of providing Augmented Reality (AR) using a mobile tag in the terminal which is accessible to the server through a wired/wireless communication network. The method includes writing content in response to a tag creation request to create a tag; setting location information of the tag; setting, if the tag is a mobile tag, the location information of the mobile tag to a variable state and setting movement information of the mobile tag; and transmitting information about the mobile tag including the content, the location information, and the movement setting information of the mobile tag to the server, and requesting the server to register the mobile tag.04-19-2012
20120287159VIEWING OF REAL-TIME, COMPUTER-GENERATED ENVIRONMENTS - A method of generating a view of a computer-generated environment using a location in a real-world environment, comprising receiving real-time data regarding the location of a device in the real-world environment; mapping the real-time data regarding the device into a virtual camera within a directly-correlating volume of space in the computer-generated environment; updating the virtual camera location using the real-time data, such that the virtual camera is assigned a location in the computer-generated environment which corresponds to the location of the device in the real-world environment; and using the virtual camera to generate a view of the computer-generated environment from the assigned location in the computer-generated environment.11-15-2012
20120287158DISPLAY APPARATUS, CONTROL METHOD FOR DISPLAY APPARATUS, AND STORAGE MEDIUM - To display a captured image, a display apparatus includes an acquisition unit, a display unit, and a setting unit. The acquisition unit acquires, from an external apparatus, additional information that is to be displayed in association with an object in the captured image. The display unit displays the additional information acquired by the acquisition unit in association with the object. The setting unit allows a user to set a set number of objects in association with which the additional information is to be displayed by the display unit. The acquisition unit acquires, from the external apparatus, the additional information according to the set number of objects.11-15-2012
20100194782METHOD AND APPARATUS FOR CREATING VIRTUAL GRAFFITI IN A MOBILE VIRTUAL AND AUGMENTED REALITY SYSTEM - A method and apparatus is provided for easily creating virtual graffiti that will be left for a particular device to view. During operation a device will be placed near a first point that is used to define a boundary for the virtual graffiti. The device will locate the first point, and use the point to define the boundary. The device will receive an image that is to be used as virtual graffiti, and will fit the image within the boundary of the virtual graffiti. For example, the device may be consecutively placed near four points that will define a polygon to be used as the boundary for the virtual graffiti. An image will then be received, and the image will be fit within the polygon.08-05-2010
20120139941INFORMATION DISPLAY SYSTEM, INFORMATION DISPLAY APPARATUS AND NON-TRANSITORY STORAGE MEDIUM - The matching processor acquires the key information such as position information and/or the like by the key information acquirer, and notifies an external apparatus via the communicator. The matching processor stores the acquired key information in the storer corresponding to the identification information which is acquired from the external apparatus. When the acquired identification information does not exist in an AR data management table, the matching processor acquires the AR data by notifying the identification information to the external apparatus. When there is no empty record in a key information management table, key information whose usage date and time is oldest is considered a deletion target. Moreover, when there is no empty record in the AR data management table, the AR data whose usage date and time is oldest is considered the deletion target.06-07-2012
20130009993Systems, Computer Medium and Computer-Implemented Methods for Providing Health Information to Employees Via Augmented Reality Display - Provided are embodiments of systems, computer medium and computer-implemented methods for providing feedback of health information to an employee when the employee is engaged in their work duties. The method including receiving health data output by a set of health sensors provided on or near the employee when the employee is engaged in work duties. The health sensors comprising at least one of biometric and biomechanic sensors. The health data corresponding to biometric and/or biomechanic characteristics sensed by the set of health sensors. The method including processing the health data to identify health status information for the employee, and providing for display via an augmented reality display, augmented reality content including the health status information. The augmented reality display providing the employee with an augmented reality view including a real world view of a surrounding environment having the health status information for the employee overlaid thereon.01-10-2013
20130009994METHODS AND APPARATUS TO GENERATE VIRTUAL-WORLD ENVIRONMENTS - Example methods and apparatus to generate virtual-world environments are disclosed. A disclosed example method involves receiving real-world data associated with a real-world environment in which a person is located at a particular time and receiving virtual-reality data representative of a virtual-world environment corresponding to the real-world environment in which the person was located at the particular time. The method also involves displaying the virtual-world environment based on the virtual-reality data and displaying, in connection with the virtual-world environment, a supplemental visualization based on supplemental user-created information. The supplemental user-created information is obtained based on the real-world data.01-10-2013
20130016123SYSTEMS AND METHODS FOR AN AUGMENTED REALITY PLATFORM - Systems and methods for augmenting a view of reality. In an embodiment, a first medium is superimposed over a first view of reality. One or more changes to the superimposed medium are received, such as a change in transparency, change in size, and change in position. A first marker, comprising at least a portion of the first view of reality, is generated. First metadata related to the first medium and/or the first marker are also generated. The first medium, the first marker, and the first metadata are sent to a depository. In a further embodiment, a second medium, second marker, and second metadata are received from the depository. The second marker is matched to at least a portion of a second view of reality, and the second medium is superimposed over the at least a portion of the second view of reality to generate an augmented view of reality.01-17-2013
20110157223ELECTRO-OPTIC VISION SYSTEMS - An image processing system for displaying an augmented real scene image. The system includes a device for identifying a geographic position, a processor for combining a real scene image at the position with information about the real scene to produce an augmented image and a display for displaying the augmented image.06-30-2011
20130021374Manipulating And Displaying An Image On A Wearable Computing System - Example methods and systems for manipulating and displaying a real-time image and/or photograph on a wearable computing system are disclosed. A wearable computing system may provide a view of a real-world environment of the wearable computing system. The wearable computing system may image at least a portion of the view of the real-world environment in real-time to obtain a real-time image. The wearable computing system may receive at least one input command that is associated with a desired manipulation of the real-time image. The at least one input command may be a hand gesture. Then, based on the at least one received input command, the wearable computing system may manipulate the real-time image in accordance with the desired manipulation. After manipulating the real-time image, the wearable computing system may display the manipulated real-time image in a display of the wearable computing system.01-24-2013
20130021373Automatic Text Scrolling On A Head-Mounted Display - A see-through head-mounted display (HMD) device, e.g., in the form of glasses, provides a view of an augmented reality image including text, such as in an electronic book or magazine, word processing document, email, karaoke, teleprompter or other public speaking assistance application. The presentation of text and/or graphics can be adjusted based on sensor inputs indicating a gaze direction, focal distance and/or biological metric of the user. A current state of the text can be bookmarked when the user looks away from the image and subsequently resumed from the bookmarked state. A forward-facing camera can adjust the text if a real world object passes in front of it, or adjust the appearance of the text based on a color or pattern of a real world background object. In a public speaking or karaoke application, information can be displayed regarding a level of interest of the audience and names of audience members.01-24-2013
20120242698SEE-THROUGH NEAR-EYE DISPLAY GLASSES WITH A MULTI-SEGMENT PROCESSOR-CONTROLLED OPTICAL LAYER - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content. The optical assembly includes a multi-segment optical layer that provides a display characteristic adjustment that is dependent on displayed content requirements and surrounding environmental conditions. Each segment of the multi-segment optical layer is individually controlled by the integrated processor.09-27-2012
20120242696Augmented Reality In A Virtual Tour Through A Financial Portfolio - Disclosed are techniques for providing a presentation as a virtual tour through a user's portfolio based on receiving signals that correspond to user movements in the physical world and processing the signals to select generated images associated with the user's movements to generate an image that when rendered on a display device renders a visual representation of the portfolio in a virtual world.09-27-2012
20130169679VEHICLE IMAGE DISPLAY SYSTEM AND CORRECTION METHOD THEREOF - A vehicle image display system and correction method thereof, applicable to a vehicle to perform vehicle information display, comprising the following steps: fetch a front road image; obtain positions of a lane marking and an obstacle in front based on said front road image; after correction, calculate display information of said positions of said lane markings and said obstacles in front, to be overlapped with images of an actual traffic lane; and utilize a motor control unit to adjust the focal length or inclination angle of a projector unit, or the inclination angle of a viewable panel, to produce overlap error correction values to correct said display information. Said projector unit projects large-area display information overlapped entirely with images of said actual traffic lane onto said viewable panel or windshield, so that a driver can obtain vehicle driving information, raising vehicle driving safety.07-04-2013
20130169680SOCIAL SYSTEM AND METHOD USED FOR BRINGING VIRTUAL SOCIAL NETWORK INTO REAL LIFE - The present invention is related to a social system and process for bringing a virtual social network into real life, which gathers and analyzes a social message of at least one interlocutor from the virtual social network so as to generate at least one recommended topic, allows a user to talk with the interlocutor using the recommended topic, and then captures and analyzes at least one speech and behavior and/or physiological response of the interlocutor during talking so as to generate an emotion state of the interlocutor. The user is allowed to determine whether the interlocutor is interested in the recommended topic through the emotion state of the interlocutor. Thus, it is possible to bring the social message on the virtual network into real life, so as to increase communication topics between persons in real life.07-04-2013
20130169681SYSTEMS AND METHODS FOR PRESENTING BUILDING INFORMATION - Described herein are systems and methods for presenting building information. In overview, the technologies described herein provide relationships between Building Information Modeling (BIM) data (which includes building schematics defined in terms of standardized three dimensional models) and Building Management System (BMS) data (which includes data indicative of the operation of building components such as HVAC components and the like). Some embodiments use relationships between these forms of data thereby to assist technicians in identifying the physical location of particular pieces of equipment, for example in the context of performing inspections and/or maintenance. In some cases this includes the provision of 2D and/or 3D maps to portable devices, these maps including the location of equipment defined both in BIM and BMS data. In some cases, augmented reality technology is applied thereby to provide richer access to positional information.07-04-2013
20130169682TOUCH AND SOCIAL CUES AS INPUTS INTO A COMPUTER - A system for automatically displaying virtual objects within a mixed reality environment is described. In some embodiments, a see-through head-mounted display device (HMD) identifies a real object (e.g., a person or book) within a field of view of the HMD, detects one or more interactions associated with the real object, and automatically displays virtual objects associated with the real object if the one or more interactions involve touching or satisfy one or more social rules stored in a social rules database. The one or more social rules may be used to infer a particular social relationship by considering the distance to another person, the type of environment (e.g., at home or work), and particular physical interactions (e.g., handshakes or hugs). The virtual objects displayed on the HMD may depend on the particular social relationship inferred (e.g., a friend or acquaintance).07-04-2013
20130169683HEAD MOUNTED DISPLAY WITH IRIS SCAN PROFILING - A see-through head-mounted display and method for operating the display to optimize performance of the display by referencing a user profile automatically. The identity of the user is determined by performing an iris scan and recognition of a user, enabling user profile information to be retrieved and used to enhance the user's experience with the see-through head-mounted display. The user profile may contain user preferences regarding services providing augmented reality images to the see-through head-mounted display, as well as display adjustment information optimizing the position of display elements in the see-through head-mounted display.07-04-2013
20130201217OBJECT DISPLAY DEVICE AND OBJECT DISPLAY METHOD - In an object display device, in the case that a marker is not detected at present, a display complementing unit acquires a change in an image in real space displayed on a display unit between the past when the marker was detected and the present. Since a virtual object is displayed based on the position and shape of the marker in the image in real space, the display position and display manner of the virtual object are also to be changed in accordance with a change in the image in real space. A display decision unit can therefore decide the display position and display manner of the virtual object at present from the display position and display manner of the virtual object in the past, based on the change in the image in real space between the past and the present.08-08-2013
20120249591SYSTEM FOR THE RENDERING OF SHARED DIGITAL INTERFACES RELATIVE TO EACH USER'S POINT OF VIEW - A head mounted device provides an immersive virtual or augmented reality experience for viewing data and enabling collaboration among multiple users. Rendering images in a virtual or augmented reality system may include capturing an image and spatial data with a body mounted camera and sensor array, receiving input indicating a first anchor surface, calculating parameters with respect to the body mounted camera and displaying a virtual object such that the virtual object appears anchored to the selected first anchor surface. Further rendering operations may include receiving a second input indicating a second anchor surface within the captured image that is different from the first anchor surface, calculating parameters with respect to the second anchor surface and displaying the virtual object such that the virtual object appears anchored to the selected second anchor surface and moved from the first anchor surface.10-04-2012
20130176335VEHICULAR DISPLAY DEVICE AND VEHICULAR DISPLAY SYSTEM - A vehicular display device includes a display unit that displays visible information and a light projection unit that guides light including the visible information displayed on the display unit to a predetermined projection surface, and displaying the visible information as a virtual image. The vehicular display device includes a guide display unit and a guide display control unit. The guide display unit indicates a relationship between at least positions of a first display region in which the virtual image is displayed by projection of the light projection unit and a second display region in which detailed information is displayed. The detailed information has an association with a content of particular information that is displayed on the display unit under a predetermined condition. The guide display control unit controls the guide display unit into a display state when the particular information is displayed on the display unit.07-11-2013
20130176336METHOD OF AND SYSTEM FOR OVERLAYING NBS FUNCTIONAL DATA ON A LIVE IMAGE OF A BRAIN - The present invention discloses a method of overlaying Navigated Brain Stimulation (NBS) functional data on a live image of a brain. The method comprises the steps of obtaining a live image of a brain, obtaining a functional map of the brain comprising an anatomical model of the brain and NBS functional data associated with the brain, identifying at least one anatomical landmark of the brain from the live image of the brain, identifying at least one of said identified anatomical landmarks on the anatomical model of the brain, modifying the functional map so that the identified at least one anatomical landmark of the model corresponds in size and orientation to the corresponding at least one anatomical landmark in the live image of the brain, and digitally overlaying at least said NBS functional data on said live image of the brain according to the corresponding, aligned anatomical landmarks.07-11-2013
20130176337Device and Method For Information Processing - A device and method for information processing are described. The device includes a display unit having a preset transmittance; an object determination unit configured to determine at least one object at the information processing device side; an additional information acquisition unit configured to acquire additional information corresponding to the at least one object; an additional information position determination unit configured to determine the display position of the additional information on the display unit; and a display processing unit configured to display the additional information on the display unit based on the display position.07-11-2013
20130113828IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM - There is provided an image processing apparatus including an image processing unit which combines a virtual object with a captured image. The image processing unit determines the virtual object based on a state or a type of an object shown in the captured image.05-09-2013
20130201215ACCESSING APPLICATIONS IN A MOBILE AUGMENTED REALITY ENVIRONMENT - An augmented reality system and method that allows a user to access, and more particularly, install and subsequently have access to an application on an augmented reality mobile device. The system and method enhances the augmented reality experience by minimizing or eliminating user interaction in the process of initiating the installation of the application. This is achieved, at least in part, through the use of a passively activated application program. It is passively activated in that it effects the application installation based on signals received and processed by the augmented reality mobile device, where the signals reflect the surrounding environment in which the augmented reality mobile device is operating. No direct interaction by the user of the augmented reality mobile device is required to initiate the installation of the application.08-08-2013
20130093788USER CONTROLLED REAL OBJECT DISAPPEARANCE IN A MIXED REALITY DISPLAY - The technology causes disappearance of a real object in a field of view of a see-through, mixed reality display device system based on user disappearance criteria. Image data is tracked to the real object in the field of view of the see-through display for implementing an alteration technique on the real object causing its disappearance from the display. A real object may satisfy user disappearance criteria by being associated with subject matter that the user does not wish to see or by not satisfying relevance criteria for a current subject matter of interest to the user. In some embodiments, based on a 3D model of a location of the display device system, an alteration technique may be selected for a real object based on a visibility level associated with the position within the location. Image data for alteration may be prefetched based on a location of the display device system.04-18-2013
20130113829INFORMATION PROCESSING APPARATUS, DISPLAY CONTROL METHOD, AND PROGRAM - There is provided an information processing apparatus including an operation detecting unit detecting an operation of a subject that has been captured, and a display control unit controlling at least one of wearing or removal of at least one of virtual clothing or accessories to be displayed overlaid on the subject in accordance with the operation detected by the operation detecting unit.05-09-2013
20130127907APPARATUS AND METHOD FOR PROVIDING AUGMENTED REALITY SERVICE FOR MOBILE TERMINAL - An apparatus and method for providing an augmented reality service in a mobile terminal are provided. A method for providing an augmented reality service in a mobile terminal includes generating an augmented reality service scene using surrounding information, identifying at least one recognizable marker based on the augmented reality service scene, generating a guide scene considering the at least one recognizable marker, and adding the guide scene to a part of the augmented reality service scene and displaying the added scene.05-23-2013
20130127906INFORMATION DISPLAY APPARATUS, METHOD THEREOF AND PROGRAM THEREOF - An information display apparatus creates a determination image which indicates that a reference object in a plurality of objects arranged in a row satisfies a predetermined rule or that the reference object does not satisfy the rule, creates an image to be displayed by superimposing the determination image on the acquired image, and displays the image to be displayed.05-23-2013
20110221771Merging of Grouped Markers in An Augmented Reality-Enabled Distribution Network - The present invention relates to systems and methods for merging smart markers in augmented reality. The system includes a server supporting the presentation of information within augmented reality of a plurality of participants. A communication network facilitates the transfer of information from the server to devices of the plurality of participants. A first participant in the plurality of participants is associated with a view of augmented reality and a plurality of smart markers, with each smart marker having an attribute. A merged group of smart markers includes smart markers from within the plurality of smart markers with said attribute being a first common attribute, wherein smart markers in the merged group are displayable within the augmented reality of the first participant.09-15-2011
20110221770SELECTIVE MOTOR CONTROL CLASSIFICATION - Techniques for detecting and classifying motion of a human subject are generally described. More particularly, techniques are described for detecting and classifying motion as either a broad selection or a precise selection to facilitate interaction with augmented reality (AR) systems. An example may include detecting a motion of a human and repeatedly analyzing a step response associated with the motion to determine one or more of a peak time t09-15-2011
20110221769ROBUST OBJECT RECOGNITION BY DYNAMIC MODELING IN AUGMENTED REALITY - Technologies are generally described for providing a robust object recognition scheme based on dynamic modeling. Correlations in fine scale temporal structure of cellular regions may be employed to group the regions together into higher-order entities. The entities represent a rich structure and may be used to code high level objects. Object recognition may be formatted as elastic graph matching.09-15-2011
20110227945INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM - There is provided an information processing device that includes a virtual space recognition unit for analyzing the 3D space structure of a real space to recognize a virtual space, a storage unit for storing an object to be arranged in the virtual space, a display control unit for causing a display unit to display the object arranged in the virtual space, a direction of gravitational force detection unit for detecting a direction of gravitational force of a real space, and a direction of gravitational force reflection unit for reflecting the direction of gravitational force detected by the detection unit in the virtual space.09-22-2011
20130147836MAKING STATIC PRINTED CONTENT DYNAMIC WITH VIRTUAL DATA - The technology provides embodiments for making static printed content being viewed through a see-through, mixed reality display device system more dynamic with display of virtual data. A printed content item, for example a book or magazine, is identified from image data captured by cameras on the display device, and user selection of a printed content selection within the printed content item is identified based on physical action user input, for example eye gaze or a gesture. A task in relation to the printed content selection can also be determined based on physical action user input. Virtual data for the printed content selection is displayed in accordance with the task. Additionally, virtual data can be linked to a work embodied in a printed content item. Furthermore, a virtual version of the printed material may be displayed at a more comfortable reading position and with improved visibility of the content.06-13-2013
20130147840PROJECTED REAR PASSENGER ENTERTAINMENT SYSTEM - A method for augmenting a graphic displayed on a surface inside of a vehicle using a rear seat entertainment projection (RSEP) system includes generating the graphic for display on the surface inside the vehicle. When the graphic is displayed on the surface, an input that causes a reaction to the graphic displayed upon the surface is obtained, and the graphic displayed on the surface is augmented based on the reaction to the graphic.06-13-2013
20130093790METHOD AND SYSTEM FOR IMPLEMENTING AUGMENTED REALITY APPLICATION - The present invention discloses a method and an apparatus for implementing an augmented reality application. The method includes: searching for AR applications related to a set AR application parameter; selecting at least two AR applications from multiple AR applications found through searching and integrating the at least two AR applications into one new AR application; and providing the new AR application after integration for a user.04-18-2013
20130093789TOTAL FIELD OF VIEW CLASSIFICATION FOR HEAD-MOUNTED DISPLAY - Virtual images are located for display in a head-mounted display (HMD) to provide an augmented reality view to an HMD wearer. Sensor data may be collected from on-board sensors provided on an HMD. Additionally, other data may be collected from external sources. Based on the collected sensor data and other data, the position and rotation of the HMD wearer's head relative to the HMD wearer's body and surrounding environment may be determined. After resolving the HMD wearer's head position, the HMD wearer's total field of view (TFOV) may be classified into regions. Virtual images may then be located in the classified TFOV regions to locate the virtual images relative to the HMD wearer's body and surrounding environment.04-18-2013
20100309225IMAGE MATCHING FOR MOBILE AUGMENTED REALITY - Embodiments of a system and method for mobile augmented reality are provided. In certain embodiments, a first image is acquired at a device. Information corresponding to at least one second image matched with the first image is obtained from a server. A displayed image on the device is augmented with the obtained information.12-09-2010
20130176334METHOD AND APPARATUS FOR ANALYZING CLUSTERING OF MIXED REALITY CONTENT AND COMPUTATIONS - An approach is provided for analyzing clustering of mixed reality content and computations. A mixed reality platform determines one or more clusters of one or more mixed reality digital objects, one or more computations associated with the one or more mixed reality digital objects, or a combination thereof based, at least in part, on one or more densities of one or more requests for the one or more mixed reality digital objects. The mixed reality platform also processes and/or facilitates a processing of the one or more requests, the one or more densities, or a combination thereof to determine one or more gradients with respect to one or more locations associated with the mixed reality digital objects. The one or more gradients represent inflow/outflow information associated with the one or more locations.07-11-2013
20110242134METHOD FOR AN AUGMENTED REALITY CHARACTER TO MAINTAIN AND EXHIBIT AWARENESS OF AN OBSERVER - Methods and systems for enabling an augmented reality character to maintain and exhibit awareness of an observer are provided. A portable device held by a user is utilized to capture an image stream of a real environment, and generate an augmented reality image stream which includes a virtual character. The augmented reality image stream is displayed on the portable device to the user. As the user maneuvers the portable device, its position and movement are continuously tracked. The virtual character is configured to demonstrate awareness of the user by, for example, adjusting its gaze so as to look in the direction of the portable device.10-06-2011
20110242133AUGMENTED REALITY METHODS AND APPARATUS - Augmented reality methods and apparatus are described according to some aspects of the disclosure. In one aspect, a method of experiencing augmented data includes using a source system, emitting a dynamic symbol which changes over time, using a consumption system, receiving the emission of the source system, using the consumption system, analyzing the emission which was received by the consumption system to determine whether the dynamic symbol is present in the emission, and using the consumption system, generating a representation of augmented data to be consumed by a user of the consumption system as a result of the analyzing determining that the dynamically changing symbol is present in the emission of the source system.10-06-2011
20130155105METHOD AND APPARATUS FOR PROVIDING SEAMLESS INTERACTION IN MIXED REALITY - An approach is provided for providing seamless interaction in mixed reality. A mixed reality platform processes and/or facilitates a processing of media information associated with at least one augmented reality application to determine one or more digital objects, wherein the one or more digital objects aggregate, at least in part, data for defining the one or more digital objects, one or more computation closures acting on the data, one or more results of the one or more computation closures, or a combination thereof. The mixed reality platform also causes, at least in part, a composition, a decomposition, or a combination thereof of the one or more digital objects to cause, at least in part, an enhancement, a modification, an initiation, or a combination thereof of one or more functions associated with the at least one augmented reality application.06-20-2013
20130155106METHOD AND SYSTEM FOR COORDINATING COLLISIONS BETWEEN AUGMENTED REALITY AND REAL REALITY - A method and system for coordinating placement of an augmented reality/virtual world object(s) into a scene relative to position and orientation. The object(s) can be connected to an anchor point having an absolute location relative to the marker via a connector (e.g., spring-like connector) in such a way that the behavior of the object responds to a physical force and a collision which exists in the augmented reality scene. The connection between the virtual object and location of the marker permits the object to exactly track the marker when there are no real world collisions between the markers. The virtual objects can be displaced so the objects do not pass through one another when the real world markers come into a close spatial proximity and the corresponding virtual objects begin to collide.06-20-2013
20130155107Systems and Methods for Providing an Augmented Reality Experience - Methods and systems, including computer program products, are described for providing an augmented reality experience. A Near Field Communication (NFC) enabled mobile device reads data from a data-encoded tag associated with a physical object. The mobile device captures an image. The mobile device transmits the read data to a server computing device. The mobile device retrieves, from the server computing device, data elements associated with the physical object. The mobile device generates an augmented image based on the captured image and the data elements associated with the physical object.06-20-2013
20130155108Augmented Reality User Interaction Methods, Computing Devices, And Articles Of Manufacture - Augmented reality user interaction methods, computing devices, and articles of manufacture are disclosed according to some aspects of the description. In one aspect, an augmented reality user interaction method includes executing an augmented reality browser application, displaying a camera view of a computing device wherein images generated by a camera are displayed using a touch sensitive display, during the displaying the camera view, displaying an icon interface comprising a pathway and a plurality of icons with respect to the pathway using the touch sensitive display, first detecting a user input moving in a direction of the pathway, moving the icons along the pathway in the direction of the user input as a result of the first detecting, second detecting a user input selecting one of the icons, and depicting augmented reality content with respect to at least one of the images as a result of the second detecting.06-20-2013
20130182010DEVICE FOR CAPTURING AND DISPLAYING IMAGES OF OBJECTS, IN PARTICULAR DIGITAL BINOCULARS, DIGITAL CAMERA OR DIGITAL VIDEO CAMERA - The invention relates to a device for capturing and displaying images of objects, in particular digital binoculars (07-18-2013
20130182011Method for Providing Information on Object Which Is Not Included in Visual Field of Terminal Device, Terminal Device and Computer Readable Recording Medium - The present invention relates to a method for providing information on an object excluded in a visual field of a terminal in a form of augmented reality (AR) by using an image inputted to the terminal and information related thereto. The method includes the steps of: (a) specifying the visual field of the terminal corresponding to the inputted image by referring to at least one piece of information on a location, a displacement and a viewing angle of the terminal; (b) searching an object(s) excluded in the visual field of the terminal; and (c) displaying guiding information on the searched object(s) with the inputted image in a form of the augmented reality; wherein the visual field is specified by a viewing frustum whose vertex corresponds to the terminal.07-18-2013
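The visual-field test described in the entry above (a viewing frustum whose vertex is the terminal, specified by location, displacement and viewing angle) can be sketched in two dimensions as follows. This is a minimal flat-plane illustration, not the patent's implementation; the function name, the compass-style heading, and the range cutoff are assumptions:

```python
import math

def in_field_of_view(terminal_xy, heading_deg, view_angle_deg, object_xy, max_range):
    """Return True if the object lies inside the terminal's 2D viewing frustum.

    heading_deg:    direction the terminal faces, in degrees.
    view_angle_deg: total horizontal viewing angle of the camera.
    max_range:      far plane of the frustum, in the same units as the coordinates.
    """
    dx = object_xy[0] - terminal_xy[0]
    dy = object_xy[1] - terminal_xy[1]
    dist = math.hypot(dx, dy)
    if dist == 0 or dist > max_range:
        return False
    bearing = math.degrees(math.atan2(dy, dx))
    # Smallest signed angular difference between the object bearing and the heading.
    diff = (bearing - heading_deg + 180) % 360 - 180
    return abs(diff) <= view_angle_deg / 2
```

Objects for which this test returns False are the ones the patent's step (b) would report as excluded from the visual field, to be annotated with guiding information.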
20130182012METHOD OF PROVIDING AUGMENTED REALITY AND TERMINAL SUPPORTING THE SAME - A method of providing augmented reality and a terminal supporting the same are provided. The terminal for supporting augmented reality includes: a display unit displaying a specific image during a preview image mode; and a controller recognizing at least one surface from the specific image according to a predetermined criteria, combining an image of a virtual object with the specific image so that the image of a virtual object is displayed on the recognized at least one surface, and controlling the display unit to output the combined image.07-18-2013
20110310120TECHNIQUES TO PRESENT LOCATION INFORMATION FOR SOCIAL NETWORKS USING AUGMENTED REALITY - Techniques to present location information using augmented reality are described. An apparatus may comprise an augmentation system operative to augment an image with information for an individual, the image having a virtual object representing a real object. The augmentation system may comprise a location component operative to determine location information for the real object, a virtual information component operative to retrieve location information for an individual, and a proximity component operative to determine whether location information for the real object substantially matches location information for the individual. The augmentation system may further comprise an augmentation component operative to augment the virtual object with information for the individual to form an augmented object when the location information for the real object substantially matches the location information for the individual. Other embodiments are described and claimed.12-22-2011
20130187952NETWORK-BASED REAL TIME REGISTERED AUGMENTED REALITY FOR MOBILE DEVICES - A method of operating a mobile device with a camera, a display and a position sensor to provide a display of supplementary information aligned with a view of a scene. One or more image obtained from the camera is uploaded to a remote server together with corresponding data from the position sensor. Image processing is then performed to track image motion between that image and subsequent images obtained from the camera, determining a mapping between the uploaded image and a current image. Data is then received via the network indicative of a pixel location for display of supplementary information within the reference image. The mapping is used to determine a corresponding pixel location for display of the supplementary information within the current image, and the supplementary information is displayed on the display correctly aligned with the view of the scene.07-25-2013
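The mapping step in the entry above (server sends a pixel location in the uploaded reference image; the device maps it into the current frame using tracked image motion) is commonly expressed as a planar homography. A minimal sketch under that assumption follows; the 3x3 matrix here is a stand-in for whatever transform the tracking actually estimates:

```python
import numpy as np

def map_pixel(H, pixel):
    """Map a pixel from the uploaded reference image into the current image
    using a 3x3 homography H estimated by frame-to-frame tracking."""
    x, y = pixel
    p = H @ np.array([x, y, 1.0])
    return (p[0] / p[2], p[1] / p[2])   # perspective divide

# Example transform: a pure translation of 5 px right and 3 px down.
H = np.array([[1, 0, 5],
              [0, 1, 3],
              [0, 0, 1]], dtype=float)
```

With this mapping, supplementary information anchored at a reference-image pixel can be drawn at `map_pixel(H, pixel)` in the live view so it stays aligned with the scene.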
20130187953Image Matching Apparatus and Image Matching Method - An image matching apparatus is provided. The apparatus includes a storage unit, an obtaining unit, a specification unit, and an image matching unit. The storage unit is configured to store image data of one or more devices that are connected to a local network. The obtaining unit is configured to obtain image data of device image obtained by capturing a device. The specification unit is configured to specify one or more local networks to be used for image matching. The image matching unit is configured to perform image matching of the obtained image data against the stored image data of one or more devices that are connected to the specified local network.07-25-2013
20130187951AUGMENTED REALITY APPARATUS AND METHOD - According to one embodiment, an augmented reality apparatus includes an estimation unit, a search unit, a first generation unit, a second generation unit and selection unit. The estimation unit estimates a main facility. The search unit searches for facilities to obtain target facilities. The first generation unit generates a first feature value according to each item of interest of the user. The second generation unit generates a second feature value for each target facility. The selection unit calculates a degree of association based on the first feature value and the second feature value to select data of a target facility having the degree of association not less than a first threshold as recommended facility data.07-25-2013
20130187950TRANSPARENT DISPLAY FOR MOBILE DEVICE - A projection-type display device is connectively coupled to a mobile device (such as a smartphone) where the light generated by a small projection device is directed at a relatively transparent holographic optical element (HOE) to provide a display to an operator of the mobile device or a viewer. The projector and HOE may be configured to produce and magnify a virtual image that is perceived as being displayed at a large distance from the operator who views the image through the HOE. The HOE may comprise a volume grating effective at only the narrow wavelengths of the projection device to maximize transparency while also maximizing the light reflected from the display projector to the eyes of the operator.07-25-2013
20130113827HANDS-FREE AUGMENTED REALITY FOR WIRELESS COMMUNICATION DEVICES - This disclosure relates to techniques for providing hands-free augmented reality on a wireless communication device (WCD). According to the techniques, an application processor within the WCD executes an augmented reality (AR) application to receive a plurality of image frames and convert the plurality of image frames into a single picture comprising the plurality of image frames stitched together to represent a scene. The WCD executing the AR application then requests AR content for the scene represented in the single picture from an AR database server, receives AR content for the scene from the AR database server, and processes the AR content to overlay the single picture for display to a user on the WCD. In this way, the user may comfortably look at the single picture with the overlaid AR content on a display of the WCD to learn more about the scene represented in the single picture.05-09-2013
20120019558APPARATUS AND METHOD FOR PROVIDING AUGMENTED REALITY SERVICE USING SOUND - An apparatus and method for providing an AR service using a sound. When a user starts an AR service providing function in a mobile communication terminal, a sound signal is received and analyzed to determine whether it carries additional information; if so, the additional information is extracted, data associated with the extracted additional information is acquired, and the AR service is provided using the acquired data. Accordingly, various kinds of additional information may be provided through the AR service.01-26-2012
20120019557DISPLAYING AUGMENTED REALITY INFORMATION - A device may obtain location information of an AR display device and obtain identifiers associated with objects that are within a field of view of the AR display device based on the location information. In addition, the device may obtain, for each of the objects, AR information based on the identifiers and determine, for each of the objects, a distance of the object from the AR display device. Furthermore, the device may generate, for each of the objects, images of the AR information at a virtual distance from the AR display device, the virtual distance corresponding to the determined distance. The device may display the generated images at the AR display device.01-26-2012
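The distance steps in the entry above (determine each object's distance from the AR display device, then render its information at a matching virtual distance) can be sketched as below. This is an illustrative flat-plane simplification with assumed names; one plausible rendering choice, shown here, is to shrink a label inversely with distance so it reads as farther away:

```python
import math

def object_distance(device_xy, object_xy):
    """Straight-line distance between the AR display device and an object,
    both given as (x, y) coordinates in metres (flat-plane simplification)."""
    return math.hypot(object_xy[0] - device_xy[0], object_xy[1] - device_xy[1])

def label_scale(distance, base_distance=1.0):
    """Scale factor for rendering an object's AR information so it appears
    at a virtual distance matching the real one: size falls off inversely
    with distance beyond base_distance, and is clamped to 1.0 inside it."""
    return base_distance / max(distance, base_distance)
```

An object 5 m away would then be drawn at one fifth of the size used for an object at the base distance, approximating the perspective cue the patent describes.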
20130194304COORDINATE-SYSTEM SHARING FOR AUGMENTED REALITY - A method for presenting real and virtual images correctly positioned with respect to each other. The method includes, in a first field of view, receiving a first real image of an object and displaying a first virtual image. The method also includes, in a second field of view oriented independently relative to the first field of view, receiving a second real image of the object and displaying a second virtual image, the first and second virtual images positioned coincidently within a coordinate system.08-01-2013
20130194305MIXED REALITY DISPLAY SYSTEM, IMAGE PROVIDING SERVER, DISPLAY DEVICE AND DISPLAY PROGRAM - A Mixed Reality display system that lets each of a plurality of users experiencing a synthesized image change his or her own line of sight is provided. In the Mixed Reality display system, an image providing server and a plurality of client terminals are constructed to be capable of communicating with each other; the image providing server renders a virtual object, synthesizes the rendered object with an omnidirectional image taken by an omnidirectional camera, and then delivers the synthesized image information to the plurality of client terminals. Each client terminal extracts a partial area image from the synthesized image indicated by the synthesized image information received from the image providing server, based on the position/pose information defining the line of sight of the user observing that client terminal, and then displays the extracted image.08-01-2013
20130194306SYSTEM FOR PROVIDING TRAFFIC INFORMATION USING AUGMENTED REALITY - The present invention relates to a system for providing traffic information using augmented reality, and comprises: an AR server for providing a virtual image which is obtained by processing transfer information on transfer locations at stops for transportation means and information according to the transportation means, using characters and graphics; and a personal portable communication device which displays the virtual image received from the AR server while overlapping the virtual image on a real-captured image of a transfer location obtained through a camera. According to the present invention, a virtual image which displays transfer information on transfer locations such as bus platforms, subway stations, airports and the like and information according to transportation means is displayed while being overlapped on an actual real-captured image, thereby making it possible for a user to easily and simply obtain information on the detailed traffic flow at transfer locations or their surroundings.08-01-2013
20120293549COMPUTER-READABLE STORAGE MEDIUM HAVING INFORMATION PROCESSING PROGRAM STORED THEREIN, INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD - In a game apparatus, a plurality of images of a real object are taken from a plurality of directions, and the plurality of images are previously stored in a storage device so as to be associated with imaging directions. The game apparatus causes an outer imaging section to take an image including a marker positioned in a real space, and detects the marker included in the taken image. The game apparatus calculates, based on the detected marker, a position of the outer imaging section in a marker coordinate system based on the marker. The game apparatus calculates a vector indicating a direction from the position of the outer imaging section toward the marker, selects, based on the vector, an image from among the plurality of images stored in the storage device, and displays the selected image on the upper LCD.11-22-2012
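The selection step in the entry above (compute the vector from the camera toward the marker, then pick the stored image whose imaging direction best matches it) can be sketched as a nearest-direction lookup. A minimal 2D illustration with assumed names; the real system works in a 3D marker coordinate system:

```python
import math

def best_view(camera_pos, marker_pos, stored_views):
    """Pick the pre-captured image whose stored imaging direction is closest
    to the current camera-to-marker direction (largest dot product of unit
    vectors)."""
    vx = marker_pos[0] - camera_pos[0]
    vy = marker_pos[1] - camera_pos[1]
    n = math.hypot(vx, vy) or 1.0
    v = (vx / n, vy / n)

    def score(view):
        dx, dy = view["direction"]   # unit vector stored with each image
        return v[0] * dx + v[1] * dy

    return max(stored_views, key=score)

# Two stored views of the real object, each tagged with its imaging direction.
views = [{"name": "front", "direction": (0.0, -1.0)},
         {"name": "side",  "direction": (-1.0, 0.0)}]
```

Displaying `best_view(...)` then shows the photograph taken from roughly the same side of the object as the user is currently viewing the marker.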
20130201214Methods, Apparatuses, and Computer-Readable Storage Media for Providing Interactive Navigational Assistance Using Movable Guidance Markers - Methods, apparatuses and computer-readable storage media for displaying at a first user equipment a first marker at a location, wherein the location is defined remotely at a second user equipment; displaying at the second user equipment a current geographic location and a current vantage point for the first user equipment; displaying at the second user equipment a second marker; accepting at the second user equipment an input for adjusting the second marker from a first position to a second position, wherein the second position is indicative of a target geographic location in a first virtual view of the current geographic location of the first user equipment as displayed on the second user equipment; and in response to the adjusting, displaying at the first user equipment the first marker at the target geographic location in a second virtual view of the current geographic location.08-08-2013
20130201216SERVER, CLIENT TERMINAL, SYSTEM, AND PROGRAM - There is provided a server including a reception unit configured to receive, from a client terminal, position information indicating a position of the client terminal, and direction information indicating a direction in which the client terminal is directed, and a search unit configured to search for image data provided with position information indicating an opposite position across a target object present in the direction indicated by the direction information with respect to the position of the client terminal based on the position information.08-08-2013
20120069052METHOD AND TERMINAL FOR PROVIDING DIFFERENT IMAGE INFORMATION IN ACCORDANCE WITH THE ANGLE OF A TERMINAL, AND COMPUTER-READABLE RECORDING MEDIUM - The present invention includes a method for providing different images by referring to angles of a terminal. The method includes the steps of: (a) if the angle falls under an angular range which includes 0 degree with respect to the horizontal plane, setting a content provided through the terminal to be map information, if the angle falls under an angular range which includes 90 degrees with respect to the horizontal plane, setting a content provided through the terminal to be preview image, and if the angle falls under an angular range which includes 180 degrees with respect to the horizontal plane, setting a content provided through the terminal to be weather; (b) acquiring information on the angle by using a sensor; (c) creating information on an image based on the set content; and (d) providing a user with the created information.03-22-2012
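The angle-to-content mapping in the entry above (roughly 0 degrees yields map information, 90 degrees a preview image, 180 degrees weather) can be sketched as a simple band lookup. The 30-degree band width here is an assumed value, not one from the patent:

```python
def content_for_angle(angle_deg):
    """Map the terminal's tilt angle (degrees from the horizontal plane) to a
    content mode, per the scheme described above. Angles outside every band
    fall through to "none"; the band half-width of 30 degrees is assumed."""
    if abs(angle_deg - 0) <= 30:
        return "map"        # terminal held flat: show map information
    if abs(angle_deg - 90) <= 30:
        return "preview"    # terminal held upright: show camera preview
    if abs(angle_deg - 180) <= 30:
        return "weather"    # terminal facing upward: show weather
    return "none"
```

In practice the angle would come from the terminal's accelerometer (step (b) of the claim), and the returned mode would drive which image information is rendered.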
20120299962METHOD AND APPARATUS FOR COLLABORATIVE AUGMENTED REALITY DISPLAYS - Methods and apparatuses are provided for facilitating interaction with augmented reality devices, such as augmented reality glasses and/or the like. A method may include receiving a visual recording of a view from a first user from an imaging device. The method may also include displaying the visual recording to a display. Further, the method may include receiving an indication of a touch input to the display. In addition, the method may include determining, by a processor, a relation of the touch input to the display. The method may also include displaying, at least in part on the determined relation, an icon representative of the touch input to the imaging device. Corresponding apparatuses are also provided.11-29-2012
20120086728SYSTEM AND METHOD FOR TRANSITIONING BETWEEN INTERFACE MODES IN VIRTUAL AND AUGMENTED REALITY APPLICATIONS - One preferred embodiment of the present invention includes a method for transitioning a user interface between viewing modes. The method of the preferred embodiment can include detecting an orientation of a mobile terminal including a user interface disposed on a first side of the mobile terminal, wherein the orientation of the mobile terminal includes an imaginary vector originating at a second side of the mobile terminal and projecting in a direction substantially opposite the first side of the mobile terminal. The method of the preferred embodiment can also include transitioning between at least two viewing modes in response to the imaginary vector intersecting an imaginary sphere disposed about the mobile terminal at a first latitudinal point having a predetermined relationship to a critical latitude of the sphere.04-12-2012
20130208006SYSTEM AND METHOD OF IMAGE AUGMENTATION - A book for use in an augmented reality system includes first and second pages. The first page is on a first leaf of the book and includes a first fiduciary marker for indicating the orientation of the book to a recognition system. The second page is on a second leaf of the book and includes a second fiduciary marker for indicating the orientation of the book to the recognition system and also a page marker for indicating the page number of the second page to the recognition system. The page marker orientation is ambiguous without reference to a fiduciary marker. The page marker is positioned on the second page closer to an edge of the page than the second fiduciary marker, so as to become visible to the recognition system before the second fiduciary marker as the book is turned to the second page.08-15-2013
20130208003IMAGING STRUCTURE EMITTER CONFIGURATIONS - In embodiments of imaging structure emitter configurations, an imaging structure includes a silicon backplane with a driver pad array. The embedded light sources are formed on the driver pad array in an emitter material layer, and the embedded light sources can be individually controlled at the driver pad array to generate and emit light. The embedded light sources are configured in multiple rows for scanning by an imaging unit to generate a scanned image for display.08-15-2013
20130208007POSITION-RELATED INFORMATION REGISTRATION APPARATUS, POSITION-RELATED INFORMATION REGISTRATION SYSTEM, POSITION-RELATED INFORMATION REGISTRATION AND DISPLAY SYSTEM, AND RECORDING MEDIUM - A position-related information registration and display system can register position-related information in a position-related information management server using a first terminal, and display an air tag based on the position-related information on a second data terminal including an imaging unit. The first data terminal extracts a search keyword from content displayed on a first display unit, accesses a keyword information storage unit, acquires a corresponding position information piece that is a position information piece corresponding to the search keyword extracted from the content, and transmits a registration request in addition to the content and the corresponding position information piece to the position-related information management server. The second data terminal acquires a position-related information piece corresponding to a location where the imaging unit performs imaging, from the position-related information management server, and displays an air tag based on the acquired position-related information piece in a superimposed manner in a captured image.08-15-2013

Patent applications in class Augmented reality (real-time)