Patent application number | Description | Published |
--- | --- | --- |
20100125799 | PHYSICAL-VIRTUAL ENVIRONMENT INTERFACE - Apparatus, methods, and computer-program products are disclosed for automatically interfacing a first physical environment and a second physical environment to a virtual world model where some avatar controller piloting commands are automatically issued by a virtual world client to a virtual world server responsive to the focus of the avatar controller. | 05-20-2010 |
20110063286 | SYSTEM FOR INTERACTING WITH OBJECTS IN A VIRTUAL ENVIRONMENT - A system that facilitates interaction with a 3-dimensional (3-d) virtual environment is described. In this system, a controller device provides information associated with the 3-d virtual environment to a first display. Furthermore, the system includes at least one portable electronic device, which includes a second display and a 3-d motion sensor that detects motion of the portable electronic device, such as: linear motion, rotational motion and/or a gesture. This portable electronic device communicates the detected motion to the controller device. In response, the controller device provides a subset of the 3-d virtual environment and associated context-dependent information to the portable electronic device, which are displayed on the second display. | 03-17-2011 |
20110316845 | SPATIAL ASSOCIATION BETWEEN VIRTUAL AND AUGMENTED REALITY - One embodiment of the present invention provides a system that facilitates interaction between two entities located away from each other. The system includes a virtual reality system, an augmented reality system, and an object-state-maintaining mechanism. During operation, the virtual reality system displays an object associated with a real-world object. The augmented reality system displays the object based on a change to the state of the object. The object-state-maintaining mechanism determines the state of the object and communicates a state change to the virtual reality system, the augmented reality system, or both. A respective state change of the object can be based on one or more of: a state change of the real-world object; a user input to the virtual reality system or the augmented reality system; and an analysis of an image of the real-world object. | 12-29-2011 |
20130063560 | COMBINED STEREO CAMERA AND STEREO DISPLAY INTERACTION - One embodiment of the present invention provides a system that facilitates interaction between a stereo image-capturing device and a three-dimensional (3D) display. The system comprises a stereo image-capturing device, a plurality of trackers, an event generator, an event processor, and a 3D display. During operation, the stereo image-capturing device captures images of a user. The plurality of trackers track movements of the user based on the captured images. Next, the event generator generates an event stream associated with the user movements, before the event processor in a virtual-world client maps the event stream to state changes in the virtual world. The 3D display then presents an augmented-reality view of the virtual world. | 03-14-2013 |
20130325970 | COLLABORATIVE VIDEO APPLICATION FOR REMOTE SERVICING - One embodiment of the present invention provides a system for sharing annotated videos. During operation, the system establishes a real-time video-sharing session between a remote field computer and a local computer. During the established real-time video-sharing session, the system receives a real-time video stream from a remote field computer, forwards the real-time video stream to a local computer to allow an expert to provide an annotation to the real-time video stream, receives the annotation from the local computer, and forwards the annotation to the remote field computer, which associates the annotation with a corresponding portion of the real-time video stream and displays the annotation on top of the corresponding portion of the real-time video stream. | 12-05-2013 |
20140016820 | DISTRIBUTED OBJECT TRACKING FOR AUGMENTED REALITY APPLICATION - One embodiment of the present invention provides a system for tracking and distributing annotations for a video stream. During operation, the system receives, at an annotation server, the video stream originating from a remote field computer, extracts a number of features from the received video stream, and identifies a group of features that matches a known feature group, which is associated with an annotation. The system further associates the identified group of features with the annotation, and forwards the identified group of features and the annotation to the remote field computer, thereby facilitating the remote field computer to associate the annotation with a group of locally extracted features and display the video stream with the annotation placed in a location based at least on locations of the locally extracted features. | 01-16-2014 |
20140156389 | INTERACTIVE CONSUMER LEVEL SERVICING PORTAL WITH PER-USE USER PAYMENT - One embodiment of the present invention provides a system for helping a consumer solve a problem associated with a product. During operation, the system receives, via a web portal, a description of the problem from the consumer; generates a list of experts based on the description of the problem; receives the consumer's selection of an expert from the list; and establishes an online session between the consumer and the selected expert, thereby facilitating the expert in helping the consumer solve the problem. | 06-05-2014 |
20140320529 | VIEW STEERING IN A COMBINED VIRTUAL AUGMENTED REALITY SYSTEM - One embodiment of the present invention provides a system for assisting view-steering from a remote client machine. During operation, the system receives, at a local client from a collaboration server, a view-synchronization request for synchronizing a local scene displayed on the local client with a remote scene displayed on the remote client; generates, at the local client, a view-steering widget based on the view-synchronization request; and displays the view-steering widget on top of the local scene, thereby enabling a local user to update the local scene so that it matches at least a portion of the remote scene displayed on the remote client machine. | 10-30-2014 |
20140324751 | GENERALIZED CONTEXTUAL INTELLIGENCE PLATFORM - One embodiment of the present invention provides a system for providing user information to a recommender. During operation, the system receives, from the recommender, a registration for notification of changes to a context graph. The context graph includes information about user behavior and/or user interests. Next, the system receives, from a mobile device, event data derived from contextual data collected using detectors that detect the mobile device's physical surroundings. The system modifies the context graph based on the event data. The system then determines that the modification to the context graph matches the registration, and sends a notification of context graph change to the recommender. | 10-30-2014 |
20140344709 | RULE-BASED MESSAGING AND DIALOG ENGINE - One embodiment of the present invention provides a system for generating a message. During operation, the system receives user interaction event data, which describes explicit or implicit interactions of a user with a web application and/or mobile application. Next, the system modifies a context graph associated with the user based on an analysis of the user interaction event data, as interpreted by the system's learning from previous processing of such data. The context graph includes information about the user's state, behavior, and interests, and some or all portions of the context graph may be shared between users. The system determines a set of rules associated with a group of users that includes the user, and then applies the determined set of rules to any context graph associated with the user to generate the message. | 11-20-2014 |
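The last abstract (20140344709) describes a flow in which interaction events update a per-user context graph and group-level rules are then applied to that graph to generate a message. A minimal sketch of that flow is shown below; every class, field, and rule format here is a hypothetical illustration, not the patent's actual implementation.

```python
# Illustrative sketch only: the ContextGraph structure, the event and rule
# shapes, and apply_rules are all assumptions, not the patented design.

class ContextGraph:
    """Stores a user's state, behavior, and interest attributes."""
    def __init__(self):
        self.attributes = {}

    def update(self, event):
        # Interpret one interaction event and record it in the graph.
        self.attributes[event["attribute"]] = event["value"]

def apply_rules(rules, graph):
    """Apply the first group rule whose conditions all match the graph."""
    for rule in rules:
        if all(graph.attributes.get(k) == v for k, v in rule["when"].items()):
            return rule["message"]
    return None

# Example: a click event marks interest in "printers"; a rule associated
# with the user's group then generates a targeted message.
graph = ContextGraph()
graph.update({"attribute": "interest", "value": "printers"})
rules = [{"when": {"interest": "printers"},
          "message": "New printer guide available"}]
print(apply_rules(rules, graph))  # -> New printer guide available
```

The rule engine here is deliberately simple (first match wins); the abstract itself does not specify how conflicting rules are resolved.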