Patent application number | Description | Published |
20090285484 | PORTABLE IMAGE PROCESSING AND MULTIMEDIA INTERFACE - A portable device configured to provide enhanced shopping information is provided. The portable device has a display screen and an image capture device and the portable device is configured to access databases through a wireless network. The portable device includes image recognition logic that is configured to perform analysis of an image of an object that includes a bar code associated with a product. The analysis determines if the graphics found on the object correspond to a bar code and a portion of an image with the bar code is communicated through the wireless network to the databases to identify the product. The portable device further includes image generation logic that is configured to obtain product information for the identified product from the databases and present the product information on the display screen of the portable device. | 11-19-2009 |
20100033427 | Computer Image and Audio Processing of Intensity and Input Devices for Interfacing with a Computer Program - A method and system for determining an intensity value of an interaction with a computer program is described. The method includes capturing an image of a capture zone, identifying an input object in the image, identifying a first value of a parameter of the input object, capturing a second image of the capture zone, and identifying a second value of the parameter of the input object. The parameter identifies one or more of a shape, color, or brightness of the input object and is affected by human manipulation of the input object. The extent of change in the parameter, the difference between the second value and the first value, is calculated. An activity input is provided to the computer program, the activity input including an intensity value representing the extent of change of the parameter. A method for detecting an intensity value from sound generating input objects, and a computer video game, are also described. | 02-11-2010 |
20100056277 | METHODS FOR DIRECTING POINTING DETECTION CONVEYED BY USER WHEN INTERFACING WITH A COMPUTER PROGRAM - A method for detecting direction conveyed by a user when interfacing with a computer program executed on a computing device, the computing device being interfaced with an image capture device, is provided. The method includes detecting a human head of a person in an image taken with the image capture device and assigning the human head a head location, and detecting an object held by the person in the image and assigning the object an object location. The method determines a relative position in space between the head location and the object location as viewed from the capture location, such that the relative position is used to identify a pointing direction of the object. | 03-04-2010 |
20100097476 | Method and Apparatus for Optimizing Capture Device Settings Through Depth Information - A method for adjusting image capture settings for an image capture device is provided. The method initiates with capturing depth information of a scene at the image capture device. Depth regions are identified based on the captured depth information. Then, an image capture setting is adjusted independently for each of the depth regions. An image of the scene is captured with the image capture device, wherein the image capture setting is applied to each of the depth regions when the image of the scene is captured. | 04-22-2010 |
20100173710 | PATTERN CODES USED FOR INTERACTIVE CONTROL OF COMPUTER APPLICATIONS - Methods for determining input to be supplied to a computer program are provided. One of the methods includes processing a first video frame having a pattern code before light is applied to the pattern code. The first video frame defines a first characteristic of the pattern code, and the pattern code is defined by at least two tags. The method further includes processing a second video frame having the pattern code when light is applied to the pattern code, such that the second video frame defines a second characteristic of the pattern code. The first characteristic and the second characteristic of the pattern code are then decoded to produce decoded information, and an interactive command is initiated to the computer program. A type of the interactive command is defined by the decoded information, wherein one of the tags has a reflective surface and one of the tags has a non-reflective surface. | 07-08-2010 |
20100303298 | SELECTIVE SOUND SOURCE LISTENING IN CONJUNCTION WITH COMPUTER INTERACTIVE PROCESSING - A method and apparatus for capturing image and sound during interactivity with a computer program is provided. The apparatus includes an image capture unit that is configured to capture one or more image frames. Also provided is a sound capture unit. The sound capture unit is configured to identify one or more sound sources. The sound capture unit generates data capable of being analyzed to determine a zone of focus at which to process sound to the substantial exclusion of sounds outside of the zone of focus. In this manner, sound that is captured and processed for the zone of focus is used for interactivity with the computer program. | 12-02-2010 |
20110014981 | TRACKING DEVICE WITH SOUND EMITTER FOR USE IN OBTAINING INFORMATION FOR CONTROLLING GAME PROGRAM EXECUTION - A tracking device may include a body, a sound emitter operable to emit a sound, an array of two or more microphones adapted to produce discrete time domain input signals at a runtime, one or more processors coupled to the array of two or more microphones; and a memory coupled to the microphones and the processor. The memory has a set of processor readable instructions embodied therein. The instructions include one or more instructions for: | 01-20-2011 |
20110034244 | METHODS AND SYSTEMS FOR ENABLING DEPTH AND DIRECTION DETECTION WHEN INTERFACING WITH A COMPUTER PROGRAM - Detecting pointing direction when interfacing with a computer program is described. Two or more stereo images presented in front of two or more corresponding image capture devices can be captured, each image capture device having a capture location in a coordinate space. The image capture devices can be synchronized with a strobe signal that is visible to each image capture device. When a person is captured in the image, first and second body parts of the person in the image can be identified and assigned first and second locations in the coordinate space. A relative position that includes a dimension of depth can be identified in the coordinate space between the first location and the second location when viewed from the capture location. | 02-10-2011 |
20110086708 | SYSTEM FOR TRACKING USER MANIPULATIONS WITHIN AN ENVIRONMENT - Systems and methods for analyzing game control input data are disclosed. A machine-readable medium having embodied thereon instructions for analyzing game control input data is also disclosed. | 04-14-2011 |
20110118031 | CONTROLLER FOR PROVIDING INPUTS TO CONTROL EXECUTION OF A PROGRAM WHEN INPUTS ARE COMBINED - A controller provides inputs to control execution of a program by combining controller input information and supplementary information indicating a three-dimensional motion of the controller. The combined input can be obtained by assigning the value of the controller input information as coarse control information and assigning the value of the supplementary input information as fine control information or vice versa. | 05-19-2011 |
20110294579 | Peripheral Device Having Light Emitting Objects for Interfacing With a Computer Gaming System - A peripheral device for communicating with a computer gaming system having an image capture device associated therewith is provided. The image capture device is configured to capture image data of the peripheral device and the computer system is configured to exchange wireless communication data with the peripheral device. The peripheral device includes a body having a first location and a second location, where the first location is defined for a first light emitting object and the second location is defined for a second light emitting object. The first and second light emitting objects have a size that is identifiable in captured image data. The first location on the body is at a fixed predetermined distance from the second location. The peripheral device includes one or more buttons, and circuitry interfaced with the first and second light emitting objects. Also included is circuitry interfaced with a motion sensing device, and circuitry for the exchange of wireless communication data between the peripheral device and the computer gaming system. The wireless communication data includes data associated with the light emitting objects, the one or more buttons, and the motion sensing device. | 12-01-2011 |
20120046101 | APPARATUS FOR IMAGE AND SOUND CAPTURE IN A GAME ENVIRONMENT - An apparatus for capturing image and sound during interactivity with a computer game in a game environment is provided. The apparatus includes a housing and a base stand for supporting the housing. An image capture device is defined along a front portion of the housing. Also, an array of microphones is defined along the front portion of the housing. The array of microphones is defined by a single microphone positioned on a first lateral side of the image capture device and two or more microphones positioned on a second lateral side of the image capture device opposite the first side. The apparatus also includes a connector for connecting to a computing device. | 02-23-2012 |
20120319950 | METHODS AND SYSTEMS FOR ENABLING DEPTH AND DIRECTION DETECTION WHEN INTERFACING WITH A COMPUTER PROGRAM - One or more images can be captured with a depth camera having a capture location in a coordinate space. First and second objects in the one or more images can be identified and assigned corresponding first and second object locations in the coordinate space. A relative position can be identified in the coordinate space between the first object location and the second object location when viewed from the capture location by computing an azimuth angle and an altitude angle between the first object location and the second object location in relation to the capture location. The relative position includes a dimension of depth with respect to the coordinate space. The dimension of depth is determined from analysis of the one or more images. A state of a computer program is changed based on the relative position. | 12-20-2012 |
20130084981 | CONTROLLER FOR PROVIDING INPUTS TO CONTROL EXECUTION OF A PROGRAM WHEN INPUTS ARE COMBINED - A controller provides inputs to control execution of a program by combining controller input information and supplementary information, utilizing a value of a first one of the controller input information or the supplementary input information as a modifying input to the game program for modifying a mapping or gearing of an input controlling a still active function activated in accordance with at least one of a second one of the controller input information or the supplementary input information, or vice versa. | 04-04-2013 |
20140059484 | VOICE AND VIDEO CONTROL OF INTERACTIVE ELECTRONICALLY SIMULATED ENVIRONMENT - A method of moving objects in a graphical user interface, includes obtaining a video image of a user of the interface; displaying the video image on a display such that the video image is superposed with one or more objects displayed on the display; and moving one or more objects displayed on the display based on recognition of motions of the video image of the user. Recognition of motions of the video image may include recognition of motions of an image of the user's hand. | 02-27-2014 |
20140342827 | INERTIALLY TRACKABLE HAND-HELD CONTROLLER - A game controller includes an image capture unit; a body; at least one input device assembled with the body, the input device manipulable by a user to register an input from the user; an inertial sensor operable to produce information for quantifying a movement of said body through space; and at least one light source assembled with the body; and a processor coupled to the image capture unit and the inertial sensor. The processor is configured to track the body by analyzing a signal from the inertial sensor and analyzing an image of the light source from the image capture unit. The processor is configured to establish a gearing between movement of the body and actions to be applied by a computer program. | 11-20-2014 |
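The relative-position geometry described in application 20120319950 above can be illustrated with a minimal sketch: given the capture location and two tracked object locations, compute the azimuth (horizontal) and altitude (vertical) angles of the direction from the first object to the second, plus the depth difference along the camera axis. This is only an interpretation of the abstract, not the patent's claimed method; the function name, coordinate conventions (x right, y up, z away from the camera), and angle definitions are assumptions.

```python
import math

def relative_position(capture, first, second):
    """Azimuth/altitude angles and relative depth of the direction
    from `first` to `second`, expressed relative to `capture`.
    Coordinates are (x, y, z) tuples; conventions are assumed:
    x right, y up, z pointing away from the capture device."""
    # Translate both object locations into a frame whose origin
    # is the capture location.
    fx, fy, fz = (first[i] - capture[i] for i in range(3))
    sx, sy, sz = (second[i] - capture[i] for i in range(3))
    # Direction vector from the first object to the second.
    dx, dy, dz = sx - fx, sy - fy, sz - fz
    azimuth = math.atan2(dx, dz)                    # horizontal angle, radians
    altitude = math.atan2(dy, math.hypot(dx, dz))   # vertical angle, radians
    depth = sz - fz                                 # relative depth along camera axis
    return azimuth, altitude, depth
```

For example, a second object offset one unit to the right of the first at the same distance from the camera yields an azimuth of pi/2, zero altitude, and zero relative depth; a computer program could map such angles onto a pointing direction, as the abstract describes.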