Patent application number | Description | Published |
20130342671 | SYSTEMS AND METHODS FOR TRACKING HUMAN HANDS USING PARTS BASED TEMPLATE MATCHING WITHIN BOUNDED REGIONS - Systems and methods for tracking human hands using parts based template matching within bounded regions are described. One embodiment of the invention includes a processor; an image capture system configured to capture multiple images of a scene; and memory containing a plurality of templates that are rotated and scaled versions of a finger template. A hand tracking application configures the processor to: obtain a reference frame of video data and an alternate frame of video data from the image capture system; identify corresponding pixels within the reference and alternate frames of video data; identify at least one bounded region within the reference frame of video data containing pixels having corresponding pixels in the alternate frame of video data satisfying a predetermined criterion; and detect at least one candidate finger within the at least one bounded region in the reference frame of video data. | 12-26-2013 |
20130343605 | SYSTEMS AND METHODS FOR TRACKING HUMAN HANDS USING PARTS BASED TEMPLATE MATCHING - Systems and methods for tracking human hands using parts based template matching are described. One embodiment of the invention includes a processor, a reference camera and memory containing: a hand tracking application; and a finger template including an edge features template. In addition, the hand tracking application configures the processor to: detect at least one candidate finger in a frame of video data received from the reference camera, where each candidate finger is a grouping of pixels identified by searching the frame of video data for a grouping of pixels that have image gradient orientations that match the edge features of the finger template accounting for rotation and scaling differences; and verify the correct detection of a candidate finger by confirming that the colors of the pixels within the grouping of pixels identified as a candidate finger satisfy a skin color criterion. | 12-26-2013 |
20130343606 | SYSTEMS AND METHODS FOR TRACKING HUMAN HANDS BY PERFORMING PARTS BASED TEMPLATE MATCHING USING IMAGES FROM MULTIPLE VIEWPOINTS - Systems and methods for tracking human hands by performing parts based template matching using images captured from multiple viewpoints are described. One embodiment of the invention includes a processor, a reference camera, an alternate view camera, and memory containing: a hand tracking application; and a plurality of edge feature templates that are rotated and scaled versions of a finger template that includes an edge features template. In addition, the hand tracking application configures the processor to: detect at least one candidate finger in a reference frame, where each candidate finger is a grouping of pixels identified by searching the reference frame for a grouping of pixels that have image gradient orientations that match one of the plurality of edge feature templates; and verify the correct detection of a candidate finger in the reference frame by locating a grouping of pixels in an alternate view frame that correspond to the candidate finger. | 12-26-2013 |
20130343610 | SYSTEMS AND METHODS FOR TRACKING HUMAN HANDS BY PERFORMING PARTS BASED TEMPLATE MATCHING USING IMAGES FROM MULTIPLE VIEWPOINTS - Systems and methods for tracking human hands by performing parts based template matching using images captured from multiple viewpoints are described. One embodiment includes a processor, a reference camera, an alternate view camera, and memory containing: a hand tracking application; and a plurality of edge feature templates that are rotated and scaled versions of a finger template that includes an edge features template. In addition, the hand tracking application configures the processor to: detect at least one candidate finger in a reference frame, where each candidate finger is a grouping of pixels identified by searching the reference frame for a grouping of pixels that have image gradient orientations that match one of the plurality of edge feature templates; and verify the correct detection of a candidate finger in the reference frame by locating a grouping of pixels in an alternate view frame that correspond to the candidate finger. | 12-26-2013 |
20140085625 | SKIN AND OTHER SURFACE CLASSIFICATION USING ALBEDO - A system and method are disclosed relating to a pipeline for generating a computer model of a target user, including a hand model of the user's hands and fingers, captured by an image sensor in a NUI system. The computer model represents a best estimate of the position and orientation of a user's hand or hands. The generated hand model may be used by a gaming or other application to determine such things as user gestures and control actions. | 03-27-2014 |
20140119599 | SYSTEMS AND METHODS FOR TRACKING HUMAN HANDS USING PARTS BASED TEMPLATE MATCHING WITHIN BOUNDED REGIONS - Systems and methods for tracking human hands using parts based template matching within bounded regions are described. One embodiment of the invention includes a processor; an image capture system configured to capture multiple images of a scene; and memory containing a plurality of templates that are rotated and scaled versions of a finger template. A hand tracking application configures the processor to: obtain a reference frame of video data and an alternate frame of video data from the image capture system; identify corresponding pixels within the reference and alternate frames of video data; identify at least one bounded region within the reference frame of video data containing pixels having corresponding pixels in the alternate frame of video data satisfying a predetermined criterion; and detect at least one candidate finger within the at least one bounded region in the reference frame of video data. | 05-01-2014 |
20140211991 | SYSTEMS AND METHODS FOR INITIALIZING MOTION TRACKING OF HUMAN HANDS - Systems and methods for initializing motion tracking of human hands are disclosed. One embodiment includes a processor; a reference camera; and memory containing: a hand tracking application; and a plurality of edge feature templates that are rotated and scaled versions of a base template. The hand tracking application configures the processor to: determine whether any pixels in a frame of video are part of a human hand, where a part of a human hand is identified by searching the frame of video data for a grouping of pixels that have image gradient orientations that match the edge features of one of the plurality of edge feature templates; track the motion of the part of the human hand visible in a sequence of frames of video; confirm that the tracked motion corresponds to an initialization gesture; and commence tracking the human hand as part of a gesture based interactive session. | 07-31-2014 |
20140211992 | SYSTEMS AND METHODS FOR INITIALIZING MOTION TRACKING OF HUMAN HANDS USING TEMPLATE MATCHING WITHIN BOUNDED REGIONS - Systems and methods for initializing motion tracking of human hands within bounded regions are disclosed. One embodiment includes: a processor; reference and alternate view cameras; and memory containing a plurality of templates that are rotated and scaled versions of a base template. In addition, a hand tracking application configures the processor to: obtain reference and alternate view frames of video data; generate a depth map; identify at least one bounded region within the reference frame of video data containing pixels having distances from the reference camera that are within a specific range of distances; determine whether any of the pixels within the at least one bounded region are part of a human hand; track the motion of the part of the human hand in a sequence of frames of video data obtained from the reference camera; and confirm that the tracked motion corresponds to a predetermined initialization gesture. | 07-31-2014 |
20150089453 | Systems and Methods for Interacting with a Projected User Interface - A system and method for providing a 3D gesture based interaction system for a projected 3D user interface is disclosed. A user interface display is projected onto a user surface. Image data of the user interface display and an interaction medium are captured. The image data includes visible light data and IR data. The visible light data is used to register the user interface display on the projected surface with the Field of View (FOV) of at least one camera capturing the image data. The IR data is used to determine gesture recognition information for the interaction medium. The registration information and gesture recognition information are then used to identify interactions. | 03-26-2015 |
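Several of the abstracts above (e.g. 20130343605, 20140211991) describe detecting a finger part by searching a frame for a grouping of pixels whose image gradient orientations match the edge features of one of a plurality of rotated and scaled templates. The sketch below illustrates that idea in minimal form: it is an assumption-laden illustration, not the applications' actual implementation. The scoring function (cosine of the doubled angular difference), the dense sliding-window search, and the `threshold` parameter are all illustrative choices.

```python
import numpy as np

def gradient_orientations(img):
    # Finite-difference gradients; returns the gradient angle at each pixel.
    gy, gx = np.gradient(img.astype(float))
    return np.arctan2(gy, gx)

def orientation_match_score(patch, template_orients, mask):
    # Mean cosine of the doubled angular difference over the template's
    # edge pixels, so opposite gradient polarities still count as a match.
    diff = gradient_orientations(patch)[mask] - template_orients[mask]
    return float(np.mean(np.cos(2.0 * diff)))

def detect_candidates(frame, templates, threshold=0.9):
    """Slide each (rotated/scaled) template over the frame; return
    (row, col, template_index, score) for windows scoring >= threshold.
    `templates` is a list of (orientation_map, edge_mask) pairs."""
    candidates = []
    for idx, (t_orients, mask) in enumerate(templates):
        th, tw = t_orients.shape
        for r in range(frame.shape[0] - th + 1):
            for c in range(frame.shape[1] - tw + 1):
                patch = frame[r:r + th, c:c + tw]
                score = orientation_match_score(patch, t_orients, mask)
                if score >= threshold:
                    candidates.append((r, c, idx, score))
    return candidates
```

In practice the rotated and scaled template variants named in the abstracts would be precomputed once and passed in as the `templates` list; a real detector would also use a coarse-to-fine search rather than this exhaustive scan.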
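Application 20130343605 verifies a candidate finger by confirming that the colors of its pixels satisfy a skin color criterion. The abstract does not specify the criterion, so the following is a generic sketch using normalized-rg chromaticity bounds; the thresholds and the `min_fraction` parameter are assumed values for illustration only.

```python
import numpy as np

def skin_color_fraction(pixels_rgb):
    # pixels_rgb: (N, 3) uint8 array of a candidate finger's pixels.
    p = pixels_rgb.astype(float)
    total = p.sum(axis=1) + 1e-6        # avoid division by zero on black pixels
    r = p[:, 0] / total
    g = p[:, 1] / total
    # Illustrative normalized-rg chromaticity bounds (assumed, not from
    # the application): skin tones cluster in a compact rg region.
    skin = (r > 0.36) & (r < 0.55) & (g > 0.28) & (g < 0.36)
    return float(skin.mean())

def satisfies_skin_criterion(pixels_rgb, min_fraction=0.7):
    # Accept the candidate if enough of its pixels look skin-colored.
    return skin_color_fraction(pixels_rgb) >= min_fraction
```

A chromaticity-based rule like this is popular because normalizing by overall intensity gives some robustness to illumination changes, but any real system would tune the bounds per sensor.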
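Applications 20130342671, 20140119599, and 20140211992 restrict the search to a bounded region of the reference frame containing pixels whose correspondences in the alternate frame satisfy a predetermined criterion, such as distances within a specific range. One common reading is a stereo disparity band: keep pixels whose disparity falls in a range and bound them with a box. The sketch below assumes a precomputed disparity map and returns a single bounding box around all in-range pixels, which is a simplification of whatever region logic the applications actually use.

```python
import numpy as np

def bounded_regions_by_disparity(disparity, lo, hi):
    """Return bounding boxes (rmin, cmin, rmax, cmax) of pixels whose
    disparity between the reference and alternate frames lies in [lo, hi].
    Simplification: all in-range pixels are wrapped in one box."""
    mask = (disparity >= lo) & (disparity <= hi)
    if not mask.any():
        return []
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    rmin, rmax = np.where(rows)[0][[0, -1]]
    cmin, cmax = np.where(cols)[0][[0, -1]]
    return [(int(rmin), int(cmin), int(rmax), int(cmax))]
```

Because disparity is inversely proportional to depth for a calibrated stereo pair, a disparity band corresponds to the "specific range of distances" the abstracts mention; template matching then runs only inside the returned box, which is the point of bounding the region.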