Patent application number | Description | Published |
20130257692 | METHOD AND APPARATUS FOR EGO-CENTRIC 3D HUMAN COMPUTER INTERFACE - In the method, a processor generates a three dimensional interface with at least one virtual object, defines a stimulus of the interface, and defines a response to the stimulus. The stimulus is an approach to the virtual object with a finger or other end-effector to within a threshold of the virtual object. When the stimulus is sensed, the response is executed. Stimuli may include touch, click, double-click, peg, scale, and swipe gestures. The apparatus includes a processor that generates a three dimensional interface with at least one virtual object, and defines a stimulus for the virtual object and a response to the stimulus. A display outputs the interface and object. A camera or other sensor detects the stimulus, e.g. a gesture with a finger or other end-effector, whereupon the processor executes the response. The apparatus may be part of a head mounted display. | 10-03-2013 |
20130336528 | METHOD AND APPARATUS FOR IDENTIFYING INPUT FEATURES FOR LATER RECOGNITION - Disclosed are methods and apparatuses to recognize actors during normal system operation. The method includes defining actor input such as hand gestures, executing and detecting input, and identifying salient features of the actor therein. A model is defined from the salient features, and a data set of the salient features and/or the model is retained and may be used to identify actors for other inputs. A command such as “unlock” may be executed in response to actor input. Parameters may be applied to further define where, when, how, etc. actor input is executed, such as defining a region for a gesture. The apparatus includes a processor and sensor, the processor defining actor input, identifying salient features, defining a model therefrom, and retaining a data set. A display may also be used to show actor input, a defined region, relevant information, and/or an environment. A stylus or other non-human actor may be used. | 12-19-2013 |
20130336529 | METHOD AND APPARATUS FOR SEARCHING IMAGES - Disclosed are methods and apparatuses for searching images. An image is received and a first search path is defined for the image. The first search path may be a straight line, horizontal, and/or near the bottom of the image, and/or may begin at one edge and move toward the other. A transition is defined for the image, distinguishing a feature to be found. The image is searched for the transition along the first search path. When the transition is detected, the image is searched along a second search path that follows the transition. The apparatus includes an image sensor and a processor. The sensor is adapted to obtain images. The processor is adapted to define a first search path and a transition for the image, to search for the transition along the first search path, and to search along a second search path upon detecting the transition, following the transition. | 12-19-2013 |
20140067768 | METHOD AND APPARATUS FOR CONTENT ASSOCIATION AND HISTORY TRACKING IN VIRTUAL AND AUGMENTED REALITY - A machine-implemented method includes establishing a virtual or augmented reality entity, and establishing a state for the entity having a state time and state properties including a state spatial arrangement. The data entity and state are stored, and are subsequently received and outputted at a time other than the state time so as to exhibit a “virtual time machine” functionality. An apparatus includes a processor, a data store, and an output. A data entity establisher, a state establisher, a storer, a data entity receiver, a state receiver, and an outputter are instantiated on the processor. | 03-06-2014 |
20140067869 | METHOD AND APPARATUS FOR CONTENT ASSOCIATION AND HISTORY TRACKING IN VIRTUAL AND AUGMENTED REALITY - A machine-implemented method includes establishing a virtual or augmented reality entity, and establishing a state for the entity having a state time and state properties including a state spatial arrangement. The data entity and state are stored, and are subsequently received and outputted at a time other than the state time so as to exhibit a “virtual time machine” functionality. An apparatus includes a processor, a data store, and an output. A data entity establisher, a state establisher, a storer, a data entity receiver, a state receiver, and an outputter are instantiated on the processor. | 03-06-2014 |
20140115520 | METHOD AND APPARATUS FOR SECURE DATA ENTRY USING A VIRTUAL INTERFACE - Method and apparatus for secure data entry. In the method, a virtual data entry interface is generated, and is outputted so as to be readable only by the user. The user then enters data using the interface. The apparatus includes at least one display, or optionally a pair of displays that output a 3D stereo image. It also includes a data processor, and at least one sensor, or optionally a pair of sensors that capture 3D stereo data. The data processor generates a virtual data entry interface, and communicates it to the display or displays. The displays output the virtual interface such that it is only readable by the user. The sensor or sensors receive data entered by the user's actions and send signals representing those actions to the processor. The processor then detects the data from the signals. | 04-24-2014 |
20140118570 | METHOD AND APPARATUS FOR BACKGROUND SUBTRACTION USING FOCUS DIFFERENCES - First and second images are captured at first and second focal lengths, the second focal length being longer than the first focal length. Element sets are defined with a first element of the first image and a corresponding second element of the second image. Element sets are identified as background if the second element thereof is at least as in-focus as the first element. Background elements are excluded from further analysis. Comparisons are based on relative focus, e.g. whether image elements are more or less in-focus. Measurement of absolute focus is not necessary, nor is measurement of absolute focus change; images need not be in-focus. More than two images, multiple element sets, and/or multiple categories and relative focus relationships also may be used. | 05-01-2014 |
20140125557 | METHOD AND APPARATUS FOR A THREE DIMENSIONAL INTERFACE - Method and apparatus for interacting with a three dimensional interface. In the method, a three dimensional interface with at least one virtual object is generated. An interaction zone is defined and generated, enclosing some or all of the object. A stimulus of the interaction zone, e.g. approach/contact with a finger/stylus, is defined, and a response to the stimulus is defined, e.g. changes to the object, system actions, feedback, etc. When the stimulus is sensed, the response is executed. The apparatus includes a processor that generates a three dimensional interface with at least one virtual object, defines an interaction zone for the object, and defines a stimulus and a response. A display outputs the interface and object. A camera or other sensor detects stimulus of the interaction zone, whereupon the processor generates a response signal. The apparatus may be part of a head mounted display. | 05-08-2014 |
20140139340 | METHOD AND APPARATUS FOR POSITION AND MOTION INSTRUCTION - World data is established, including real-world position and/or real-world motion of an entity. Target data is established, including planned or ideal position and/or motion for the entity. Guide data is established, including information for guiding a person or other subject in bringing world data into match with target data. The guide data is outputted to the subject as virtual and/or augmented reality data. Evaluation data may be established, including a comparison of world data with target data. World data, target data, guide data, and/or evaluation data may be dynamically updated. Subjects may be instructed in positions and motions by using guide data to bring world data into match with target data, and by receiving evaluation data. Instruction includes physical therapy, sports, recreation, medical treatment, fabrication, diagnostics, repair of mechanical systems, etc. | 05-22-2014 |
20140237586 | APPARATUS FOR PROCESSING WITH A SECURE SYSTEM MANAGER - Method and apparatus for secure processing. The method includes detecting communication among secure and non-secure data entities, prohibiting execution of non-secure executable instructions on secure data entities unless the non-secure executable instructions are recorded in a permitted instruction record, and prohibiting execution of non-secure executable instructions if the non-secure executable instructions are recorded in a prohibited instruction record. The apparatus includes a processor, at least one non-secure data entity, and secure data entities including: a communication monitor adapted to detect communication among secure and non-secure data entities; a permitted instruction record; a first prohibitor adapted to prohibit execution of non-secure executable instructions on secure data entities unless the non-secure executable instructions are recorded in the permitted instruction record; a prohibited instruction record; and a second prohibitor adapted to prohibit execution of non-secure executable instructions if the non-secure executable instructions are recorded in the prohibited instruction record. | 08-21-2014 |
20140357246 | METHOD AND APPARATUS FOR OPT-IN COMPLIANCE WITH REGULATIONS - Disclosed are methods, systems and paradigms for opt-in compliance with regulations. A region in physical space is defined. A condition for the region is defined, the condition being a capability of a communicator such as video recording, still image recording, audio recording, audio output, text messaging, audio communication, or remote connection. The presence and location of a communicator with the capability is detected in the region, and a message is sent to the communicator with a request for a response accepting or rejecting remote deactivation of the capability of the communicator. If an acceptance response is received, the communicator capability is deactivated. If an acceptance response is not received, a notification is generated reporting the absence of an acceptance response and the location of the communicator. | 12-04-2014 |
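Of the applications listed, the relative-focus comparison in 20140118570 (background subtraction using focus differences) is concrete enough to illustrate with a short sketch. This is a minimal illustration only, not the patented method: it assumes grayscale NumPy arrays, uses the magnitude of a discrete Laplacian as a stand-in per-pixel focus measure, and the function names (`local_sharpness`, `background_mask`) are hypothetical.

```python
import numpy as np

def local_sharpness(img):
    # Approximate per-pixel focus with the magnitude of the discrete
    # Laplacian: a sharp (in-focus) region has strong local variation.
    lap = (np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0) +
           np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1) - 4 * img)
    return np.abs(lap)

def background_mask(near_img, far_img):
    """Mark an element as background when it is at least as in-focus in the
    longer-focal-length image as in the shorter-focal-length one.

    Only the *relative* comparison matters, mirroring the abstract's point
    that no absolute focus measurement is required.
    """
    return local_sharpness(far_img) >= local_sharpness(near_img)

# Tiny synthetic example: a sharp vertical edge in the near-focus image,
# a flat (zero-variation) far-focus image.
near = np.zeros((8, 8))
near[:, :4] = 1.0
far = np.full((8, 8), 0.5)
mask = background_mask(near, far)  # True where the edge does not dominate
```

Note the sketch operates per pixel; the abstract's "element sets" could equally be patches or other image regions, and a real implementation would typically smooth the focus measure over a neighborhood.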