PRIMESENSE LTD. Patent applications
Patent application number | Title | Published |
20140313519 | DEPTH SCANNING WITH MULTIPLE EMITTERS - Mapping apparatus includes a transmitter, which is configured to emit, in alternation, at least two beams comprising pulses of light along respective beam axes that are mutually offset transversely relative to a scan line direction of a raster pattern. A scanner is configured to scan the two or more beams in the raster pattern over a scene. A receiver is configured to receive the light reflected from the scene and to generate an output indicative of a time of flight of the pulses to and from points in the scene. A processor is coupled to process the output of the receiver so as to generate a 3D map of the scene. | 10-23-2014 |
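The time-of-flight principle underlying this entry can be sketched in a few lines: depth at each scanned point is half the round-trip path of the pulse. This is a minimal illustrative sketch, not the claimed apparatus; the function names are assumptions.

```python
# Illustrative time-of-flight depth computation (names are assumptions,
# not taken from the patent application).

C_M_PER_S = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_depth(round_trip_s):
    """Depth is half the round-trip path length traveled by the pulse."""
    return C_M_PER_S * round_trip_s / 2.0

def scan_to_depth_map(times_grid):
    """Convert a raster grid of per-point round-trip times into depths (m)."""
    return [[tof_to_depth(t) for t in row] for row in times_grid]
```

For example, a pulse returning after about 6.67 ns corresponds to a point roughly 1 m away.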
20140211215 | Projectors of structured light - Optical apparatus includes a beam source, which is configured to generate an optical beam having a pattern imposed thereon. A projection lens is configured to receive and project the optical beam so as to cast the pattern onto a first area in space having a first angular extent. A field multiplier is interposed between the projection lens and the first area and is configured to expand the projected optical beam so as to cast the pattern onto a second area in space having a second angular extent that is at least 50% greater than the first angular extent. | 07-31-2014 |
20140211084 | INTEGRATED PHOTONICS MODULE FOR OPTICAL PROJECTION - Optical apparatus includes a semiconductor substrate and an edge-emitting radiation source, mounted on a surface of the substrate so as to emit optical radiation along an axis that is parallel to the surface. A reflector is fixed to the substrate in a location on the axis and is configured to reflect the optical radiation in a direction that is angled away from the surface. One or more optical elements are mounted on the substrate so as to receive and transmit the optical radiation reflected by the reflector. | 07-31-2014 |
20140211073 | OBJECTIVE OPTICS WITH INTERFERENCE FILTER - Optical apparatus includes an image sensor and an optical assembly, which is configured to focus optical radiation via an aperture stop onto the image sensor. The optical assembly includes a plurality of optical surfaces, consisting of a first, curved surface through which the optical radiation enters the assembly, a final surface through which the rays exit the assembly toward the image sensor, and at least two intermediate surfaces between the first and final surfaces. An interference filter, which has a center wavelength and a passband no greater than 4% of the center wavelength, includes a coating formed on one of the optical surfaces. All rays of the optical radiation passing through the aperture stop are incident on the coating over a range of incidence angles with a half-width that is no greater than three fourths of the numerical aperture of the optical assembly. | 07-31-2014 |
20140153001 | GIMBALED SCANNING MIRROR ARRAY - An optical scanning device includes a substrate, which is etched to define an array of two or more parallel micromirrors and a support surrounding the micromirrors. Respective spindles connect the micromirrors to the support, thereby defining respective parallel axes of rotation of the micromirrors relative to the support. One or more flexible coupling members are connected to the micromirrors so as to synchronize an oscillation of the micromirrors about the respective axes. | 06-05-2014 |
20140152552 | DETECTING USER INTENT TO REMOVE A PLUGGABLE PERIPHERAL DEVICE - A method includes receiving, from a three-dimensional (3D) sensing device coupled to a computer, a sequence of 3D maps including at least part of a hand of a user positioned in proximity to the computer. In embodiments of the present invention, the computer is coupled to one or more peripheral devices, and upon identifying, in the sequence of 3D maps, a movement of the hand toward a given peripheral device, an action preparatory to disengaging the given peripheral device is initiated. | 06-05-2014 |
20140118493 | THREE-DIMENSIONAL MAPPING AND IMAGING - Imaging apparatus includes an illumination subassembly, which is configured to project onto an object a pattern of monochromatic optical radiation in a given wavelength band. An imaging subassembly includes an image sensor, which is configured both to capture a first, monochromatic image of the pattern on the object by receiving the monochromatic optical radiation reflected from the object and to capture a second, color image of the object by receiving polychromatic optical radiation, and to output first and second image signals responsively to the first and second images, respectively. A processor is configured to process the first and second signals so as to generate and output a depth map of the object in registration with the color image. | 05-01-2014 |
20140118335 | Depth mapping with enhanced resolution - A method for depth mapping includes receiving an image of a pattern of spots that has been projected onto a scene, which includes a feature having a set of elongate appendages, which have respective transverse dimensions that are less than twice an average distance between the spots in the pattern that is projected onto the feature. The image is processed in order to segment and find a three-dimensional (3D) location of the feature. The spots appearing on the feature in the 3D location are connected in order to extract separate, respective contours of the appendages. | 05-01-2014 |
20140110566 | OVERLOAD PROTECTION FOR AMPLIFIER OF PHOTODIODE SIGNAL - A method for sensing includes connecting an input of a trans-impedance amplifier (TIA) to a first terminal of a sensor, which generates a current in response to an input signal that is incident on the sensor, the signal comprising pulses of a characteristic duration. A resistor is connected in series between a second terminal of the sensor and a power supply, which is set to drive the sensor at a selected voltage. A capacitor is connected to the second terminal in parallel with the sensor and the TIA and in series with the resistor. An upper limit is set on the current that is to be input to the TIA from the sensor, and respective values of the resistor and the capacitor are chosen, responsively to the characteristic duration of the pulses and the selected voltage, to prevent the current input to the TIA from exceeding the upper limit. | 04-24-2014 |
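The component-selection idea in this entry (a series resistor that caps the sensor current, a parallel capacitor that holds the bias up during pulses) can be illustrated with a simplified first-order sizing. This is a hedged sketch under assumed constraints, not the claimed design procedure; the droop-fraction parameter is invented for illustration.

```python
def choose_rc(v_supply, i_max, pulse_s, droop_frac=0.05):
    """
    First-order sizing sketch (assumptions, not the patented method):
    - R is chosen so the steady-state current drawn through the resistor
      cannot exceed i_max at the selected supply voltage.
    - C is chosen so that supplying i_max for one pulse of duration pulse_s
      droops the bias voltage by at most droop_frac of v_supply.
    Returns (R in ohms, C in farads).
    """
    r = v_supply / i_max                          # series resistor caps DC current
    c = (i_max * pulse_s) / (droop_frac * v_supply)  # Q = C * dV during one pulse
    return r, c
```

For a 5 V supply, a 1 mA limit, and 1 µs pulses, this yields 5 kΩ and about 4 nF.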
20140043230 | Three-Dimensional User Interface Session Control - A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state. | 02-13-2014 |
20140037191 | LEARNING-BASED POSE ESTIMATION FROM DEPTH MAPS - A method for processing data includes receiving a depth map of a scene containing a humanoid form. Respective descriptors are extracted from the depth map based on the depth values in a plurality of patches distributed in respective positions over the humanoid form. The extracted descriptors are matched to previously-stored descriptors in a database. A pose of the humanoid form is estimated based on stored information associated with the matched descriptors. | 02-06-2014 |
20140028548 | GAZE DETECTION IN A 3D MAPPING ENVIRONMENT - A method, including receiving a three-dimensional (3D) map of at least a part of a body of a user. | 01-30-2014 |
20140022348 | DEPTH MAPPING USING TIME-CODED ILLUMINATION - A method for depth mapping includes illuminating an object with a time-coded pattern and capturing images of the time-coded pattern on the object using a matrix of detector elements. The time-coded pattern in the captured images is decoded using processing circuitry embedded in each of the detector elements so as to generate respective digital shift values, which are converted into depth coordinates. | 01-23-2014 |
20130321271 | POINTING-BASED DISPLAY INTERACTION - A method includes receiving and segmenting a first sequence of three-dimensional (3D) maps over time of at least a part of a body of a user of a computerized system in order to extract 3D coordinates of a first point and a second point of the user, the 3D maps indicating a motion of the second point with respect to a display coupled to the computerized system. A line segment that intersects the first point and the second point is calculated, and a target point is identified where the line segment intersects the display. An interactive item presented on the display in proximity to the target point is engaged. | 12-05-2013 |
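The target-point computation this entry describes (intersecting the line through two body points with the display) reduces to a simple ray-plane intersection when the display is taken as the plane z = 0. A minimal sketch, with the coordinate convention as an assumption:

```python
def target_on_display(p1, p2):
    """
    Intersect the line through p1 (e.g. an eye or shoulder point) and p2
    (e.g. a fingertip) with the display plane z = 0.
    Points are (x, y, z) tuples, z being the distance from the display.
    Returns the (x, y) target point, or None if the line never reaches
    the display plane.
    """
    (x1, y1, z1), (x2, y2, z2) = p1, p2
    if z1 == z2:
        return None                  # line parallel to the display plane
    t = z1 / (z1 - z2)               # parameter at which z reaches 0
    return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
```

For example, an eye at (0, 0, 2) and a fingertip at (1, 1, 1) point at display coordinate (2, 2).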
20130321265 | Gaze-Based Display Control - A method includes receiving an image including an eye of a user of a computerized system and identifying, based on the image of the eye, a direction of a gaze performed by the user. Based on the direction of the gaze, a region on a display coupled to the computerized system is identified, and an operation is performed on content presented in the region. | 12-05-2013 |
20130293679 | Upper-Body Skeleton Extraction from Depth Maps - A method for processing data includes receiving a depth map of a scene containing at least an upper body of a humanoid form. The depth map is processed so as to identify a head and at least one arm of the humanoid form in the depth map. Based on the identified head and at least one arm, and without reference to a lower body of the humanoid form, an upper-body pose, including at least three-dimensional (3D) coordinates of shoulder joints of the humanoid form, is extracted from the depth map. | 11-07-2013 |
20130263036 | Gesture-based interface with enhanced features - A method includes presenting, on a display coupled to a computer, an image of a keyboard comprising multiple keys, and receiving a sequence of three-dimensional (3D) maps including a hand of a user positioned in proximity to the display. An initial portion of the sequence of 3D maps is processed to detect a transverse gesture performed by the hand, and a cursor is presented on the display at a position indicated by the transverse gesture. While the cursor is presented in proximity to one of the multiple keys, that key is selected upon detecting a grab gesture followed by a pull gesture followed by a release gesture in a subsequent portion of the sequence of 3D maps. | 10-03-2013 |
20130250387 | Diffraction-based sensing of mirror position - Scanning apparatus includes a transmitter, which is configured to emit a beam comprising pulses of light, and a scanning mirror, which is configured to scan the beam over a scene. A receiver is configured to receive the light reflected from the scene and to generate an output indicative of the pulses returned from the scene. A grating is formed on an optical surface in the apparatus and is configured to diffract a portion of the beam at a predetermined angle, so as to cause the diffracted portion to be returned from the scanning mirror to the receiver. A controller is coupled to process the output of the receiver so as to detect the diffracted portion and to monitor a scan of the mirror responsively thereto. | 09-26-2013 |
20130241824 | INTEGRATED PROCESSOR FOR 3D MAPPING - A device for processing data includes a first input port for receiving color image data from a first image sensor and a second input port for receiving depth-related image data from a second image sensor. Processing circuitry generates a depth map using the depth-related image data. At least one output port conveys the depth map and the color image data to a host computer. | 09-19-2013 |
20130236089 | LEARNING-BASED ESTIMATION OF HAND AND FINGER POSE - A method for processing data includes receiving a depth map of a scene containing a human hand, the depth map consisting of a matrix of pixels having respective pixel depth values. The method continues by extracting from the depth map respective descriptors based on the depth values in a plurality of patches distributed in respective positions over the human hand, and matching the extracted descriptors to previously-stored descriptors in a database. A pose of the human hand is estimated based on stored information associated with the matched descriptors. | 09-12-2013 |
20130230234 | ANALYSIS OF THREE-DIMENSIONAL SCENES WITH A SURFACE MODEL - A method for processing data includes receiving a depth map of a scene containing a humanoid form. The depth map is processed so as to identify three-dimensional (3D) connected components in the scene, each connected component including a set of the pixels that are mutually adjacent and have mutually-adjacent depth values. Separate, first and second connected components are identified as both belonging to the humanoid form, and a representation of the humanoid form is generated including both of the first and second connected components. | 09-05-2013 |
20130230215 | IDENTIFYING COMPONENTS OF A HUMANOID FORM IN THREE-DIMENSIONAL SCENES - A method for processing data includes receiving a depth map of a scene containing a humanoid form. The depth map is processed so as to identify three-dimensional (3D) connected components in the scene, each connected component including a set of the pixels that are mutually adjacent and have mutually-adjacent depth values. Separate, first and second connected components are identified as both belonging to the humanoid form, and a representation of the humanoid form is generated including both of the first and second connected components. | 09-05-2013 |
20130222239 | ASYMMETRIC MAPPING FOR TACTILE AND NON-TACTILE USER INTERFACES - A method, including receiving, by a computer, a sequence of signals indicating a motion of a hand of a user within a predefined area, and segmenting the area into multiple regions. Responsively to the signals, a region is identified in which the hand is located, and a mapping ratio is assigned to the motion of the hand based on a direction of the motion and the region in which the hand is located. Using the assigned mapping ratio, a cursor on a display is presented responsively to the indicated motion of the hand. | 08-29-2013 |
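The asymmetric-mapping idea above amounts to a gain on hand motion that depends on both the region the hand occupies and the direction of the motion. A hedged sketch; the region names and ratio values here are invented for illustration, not taken from the application:

```python
# Region/direction-dependent cursor mapping (all ratios are illustrative
# assumptions, not values from the patent application).
RATIOS = {
    ("center", "horizontal"): 1.0,
    ("center", "vertical"): 1.0,
    ("edge", "horizontal"): 2.0,   # amplify motion near the edges
    ("edge", "vertical"): 1.5,
}

def move_cursor(cursor_x, hand_dx, region, direction):
    """Advance the cursor by the hand motion scaled by the assigned ratio."""
    return cursor_x + RATIOS[(region, direction)] * hand_dx
```

So the same 10-unit hand motion moves the cursor 10 units in the center region but 20 units near an edge.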
20130207970 | SCANNING DEPTH ENGINE - Mapping apparatus includes a transmitter, which emits a beam comprising pulses of light, and a scanner, which is configured to scan the beam, within a predefined scan range, over a scene. A receiver receives the light reflected from the scene and generates an output indicative of a time of flight of the pulses to and from points in the scene. A processor is coupled to control the scanner so as to cause the beam to scan over a selected window within the scan range and to process the output of the receiver so as to generate a 3D map of a part of the scene that is within the selected window. | 08-15-2013 |
20130206967 | INTEGRATED OPTOELECTRONIC MODULES - An optoelectronic module includes a micro-optical substrate and a beam transmitter, including a laser die mounted on the micro-optical substrate and configured to emit at least one laser beam along a beam axis. A receiver includes a detector die mounted on the micro-optical substrate and configured to sense light received by the module along a collection axis of the receiver. Beam-combining optics are configured to direct the laser beam and the received light so that the beam axis is aligned with the collection axis outside the module. | 08-15-2013 |
20130202161 | Enhanced face detection using depth information - A method for face detection includes capturing a depth map and an image of a scene and selecting one or more locations in the image to test for presence of human faces. At each selected location, a respective face detection window is defined, having a size that is scaled according to a depth coordinate of the location that is indicated by the depth map. A part of the image that is contained within each face detection window is processed to determine whether the face detection window contains a human face. Similar methods may also be applied in identifying other object types. | 08-08-2013 |
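The depth-scaled window size in this entry follows from the pinhole camera model: apparent size shrinks inversely with depth. A minimal sketch of that scaling, with the focal length and assumed physical face width as illustrative parameters:

```python
def face_window_px(focal_px, face_width_m, depth_m):
    """
    Pinhole-model window sizing: a face of physical width face_width_m at
    depth depth_m spans focal_px * face_width_m / depth_m pixels, so the
    detection window is scaled accordingly. Parameter values are
    illustrative assumptions.
    """
    return focal_px * face_width_m / depth_m
```

With a 500 px focal length and a nominal 0.16 m face width, the window is 80 px at 1 m and 40 px at 2 m, so the detector only tests the plausible scale at each location.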
20130155195 | Method and system for object reconstruction - A system and method are presented for use in object reconstruction. The system comprises an illuminating unit and an imaging unit. | 06-20-2013 |
20130147921 | Generation of patterned radiation - Imaging apparatus includes an illumination assembly, including a plurality of radiation sources and projection optics, which are configured to project radiation from the radiation sources onto different, respective regions of a scene. An imaging assembly includes an image sensor and objective optics configured to form an optical image of the scene on the image sensor, which includes an array of sensor elements arranged in multiple groups, which are triggered by a rolling shutter to capture the radiation from the scene in successive, respective exposure periods from different, respective areas of the scene so as to form an electronic image of the scene. A controller is coupled to actuate the radiation sources sequentially in a pulsed mode so that the illumination assembly illuminates the different, respective areas of the scene in synchronization with the rolling shutter. | 06-13-2013 |
20130136305 | Pattern generation using diffractive optical elements | 05-30-2013 |
20130127854 | Scanning Projectors And Image Capture Modules For 3D Mapping | 05-23-2013 |
20130120841 | Optical pattern projection - Optical apparatus includes first and second diffractive optical elements (DOEs) arranged in series to diffract an input beam of radiation. The first DOE is configured to apply to the input beam a pattern with a specified divergence angle, while the second DOE is configured to split the input beam into a matrix of output beams with a specified fan-out angle. The divergence and fan-out angles are chosen so as to project the radiation onto a region in space in multiple adjacent instances of the pattern. | 05-16-2013 |
20130107021 | Interactive Reality Augmentation for Natural Interaction | 05-02-2013 |
20130106692 | Adaptive Projector | 05-02-2013 |
20130055150 | VISUAL FEEDBACK FOR TACTILE AND NON-TACTILE USER INTERFACES - A method, including presenting, by a computer, a scrollable list of interactive items on a display driven by the computer, and receiving an input from a user of the computer. The list is scrolled at a speed indicated by the input, and the list is zoomed in response to the speed of the scrolling. | 02-28-2013 |
20130055120 | SESSIONLESS POINTING USER INTERFACE - A method, including receiving, by a computer, a sequence of three-dimensional maps containing at least a hand of a user of the computer, and identifying, in the maps, a device coupled to the computer. The maps are analyzed to detect a gesture performed by the user toward the device, and the device is actuated responsively to the gesture. | 02-28-2013 |
20130044053 | Combining Explicit Select Gestures And Timeclick In A Non-Tactile Three Dimensional User Interface - A method including presenting, by a computer, multiple interactive items on a display coupled to the computer, and receiving, from a depth sensor, a sequence of three-dimensional (3D) maps containing at least a hand of a user of the computer. An explicit select gesture performed by the user toward one of the interactive items is detected in the maps, and the one of the interactive items is selected responsively to the explicit select gesture. Subsequent to selecting the one of the interactive items, a TimeClick functionality is actuated for subsequent interactive item selections to be made by the user. | 02-21-2013 |
20130038941 | Lens Array Projector - Optical apparatus includes a matrix of light sources arranged on a substrate with a predetermined, uniform spacing between the light sources. A beam homogenizer includes a first optical surface, including a first microlens array, which has a first pitch equal to the spacing between the light sources and which is aligned with the matrix so that a respective optical axis of each microlens in the array intercepts a corresponding light source in the matrix and transmits light emitted by the corresponding light source. A second optical surface, including a second microlens array, is positioned to receive and focus the light transmitted by the first microlens array and has a second pitch that is different from the first pitch. | 02-14-2013 |
20130038881 | Projectors of Structured Light - Optical apparatus includes a beam source, which is configured to generate an optical beam having a pattern imposed thereon. A projection lens is configured to receive and project the optical beam so as to cast the pattern onto a first area in space having a first angular extent. A field multiplier is interposed between the projection lens and the first area and is configured to expand the projected optical beam so as to cast the pattern onto a second area in space having a second angular extent that is at least 50% greater than the first angular extent. | 02-14-2013 |
20130014052 | ZOOM-BASED GESTURE USER INTERFACE - A user interface method, including presenting by a computer executing a user interface, multiple interactive items on a display. A first sequence of images is captured indicating a position in space of a hand of a user in proximity to the display, and responsively to the position, one of the interactive items is associated with the hand. After associating the item, a second sequence of images is captured indicating a movement of the hand, and responsively to the movement, a size of the one of the items is changed on the display. | 01-10-2013 |
20120313848 | Three Dimensional User Interface Session Control - A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state. | 12-13-2012 |
20120281240 | Error Compensation in Three-Dimensional Mapping - A method for forming a three-dimensional (3D) map of an object, including illuminating the object from a light source so as to project a pattern onto the object, capturing an image of the pattern using an array of detector elements, and processing the captured image so as to measure respective offsets of elements of the pattern in the captured image relative to a reference pattern, the offsets including at least a first offset of a first element of the pattern and a second offset of a second element of the pattern, measured respectively in first and second, mutually-perpendicular directions in a plane of the array. The method further includes computing a correction factor in response to the first offset, applying the correction factor to the second offset so as to find a corrected offset, and computing depth coordinates of the object in response to the corrected offset. | 11-08-2012 |
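The correction scheme in this entry can be illustrated in miniature: the offset along the axis that should carry no disparity signals drift or distortion, and a correction derived from it is removed from the disparity offset before triangulation. The proportional correction model and the baseline gain below are assumptions for illustration only:

```python
def corrected_depth(dx, dy, k_correction, baseline_gain):
    """
    dx: measured offset of a pattern element along the direction that
        should carry no disparity (treated here as an error indicator).
    dy: measured offset along the disparity direction.
    A correction proportional to dx (proportionality is an illustrative
    assumption) is removed from dy, and depth is taken as inversely
    proportional to the corrected disparity.
    """
    dy_corrected = dy - k_correction * dx
    return baseline_gain / dy_corrected
```

For instance, with dx = 0.5, dy = 2.5, unit correction factor, and gain 4.0, the uncorrected depth would be 1.6 while the corrected depth is 2.0.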
20120249744 | Multi-Zone Imaging Sensor and Lens Array - An imaging module includes a matrix of detector elements formed on a single semiconductor substrate and configured to output electrical signals in response to optical radiation that is incident on the detector elements. A filter layer is disposed over the detector elements and includes multiple filter zones overlying different, respective, convex regions of the matrix and having different, respective passbands. | 10-04-2012 |
20120223882 | Three Dimensional User Interface Cursor Control - A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a first set of multiple 3D coordinates representing a gesture performed by a user positioned within a field of view of a sensing device coupled to the computer, the first set of 3D coordinates comprising multiple points in a fixed 3D coordinate system local to the sensing device. The first set of multiple 3D coordinates are transformed to a second set of corresponding multiple 3D coordinates in a subjective 3D coordinate system local to the user. | 09-06-2012 |
20120204133 | Gesture-Based User Interface - A user interface method, including capturing, by a computer, a sequence of images over time of at least a part of a body of a human subject, and processing the images in order to detect a gesture, selected from a group of gestures consisting of a grab gesture, a push gesture, a pull gesture, and a circular hand motion. A software application is controlled responsively to the detected gesture. | 08-09-2012 |
20120202569 | Three-Dimensional User Interface for Game Applications - A method, including defining an interaction surface containing an interaction region in space, capturing a sequence of depth maps over time of at least a part of a body of a human subject, and processing the depth maps in order to detect a direction and speed of movement of the part of the body as the part of the body passes through the interaction region. A software application, selected from a group of software applications consisting of a flight simulation and an interactive tennis game, is controlled responsively to the detected direction and speed. | 08-09-2012 |
20120182464 | Objective optics with interference filter - Optical apparatus includes an image sensor and an optical assembly, which is configured to focus optical radiation via an aperture stop onto the image sensor. The optical assembly includes a plurality of optical surfaces, consisting of a first, curved surface through which the optical radiation enters the assembly, a final surface through which the rays exit the assembly toward the image sensor, and at least two intermediate surfaces between the first and final surfaces. An interference filter, which has a center wavelength and a passband no greater than 4% of the center wavelength, includes a coating formed on one of the optical surfaces. All rays of the optical radiation passing through the aperture stop are incident on the coating over a range of incidence angles with a half-width that is no greater than three fourths of the numerical aperture of the optical assembly. | 07-19-2012 |
20120169583 | SCENE PROFILES FOR NON-TACTILE USER INTERFACES - A method, including capturing an image of a scene including one or more users in proximity to a display coupled to a computer executing a non-tactile interface, and processing the image to generate a profile of the one or more users. Content is then selected for presentation on the display responsively to the profile. | 07-05-2012 |
20120140109 | Lens Arrays for Pattern Projection and Imaging - A method for imaging includes focusing optical radiation so as to form respective first and second optical images of a scene on different, respective first and second regions of an array of detector elements. The focused optical radiation is filtered with different, respective first and second passbands for the first and second regions. A difference is taken between respective first and second input signals provided by the detector elements in the first and second regions so as to generate an output signal indicative of the difference. | 06-07-2012 |
20120140094 | Pattern projection and imaging using lens arrays - A method for projection includes generating a pattern of illumination, and positioning an array of lenses so as to project different, respective parts of the pattern onto a scene. | 06-07-2012 |
20120078614 | Virtual keyboard for a non-tactile three dimensional user interface - A method, including presenting, by a computer system executing a non-tactile three dimensional user interface, a virtual keyboard on a display, the virtual keyboard including multiple virtual keys, and capturing a sequence of depth maps over time of a body part of a human subject. On the display, a cursor is presented at positions indicated by the body part in the captured sequence of depth maps, and one of the multiple virtual keys is selected in response to an interruption of a motion of the presented cursor in proximity to the one of the multiple virtual keys. | 03-29-2012 |
20120070070 | LEARNING-BASED POSE ESTIMATION FROM DEPTH MAPS - A method for processing data includes receiving a depth map of a scene containing a humanoid form. Respective descriptors are extracted from the depth map based on the depth values in a plurality of patches distributed in respective positions over the humanoid form. The extracted descriptors are matched to previously-stored descriptors in a database. A pose of the humanoid form is estimated based on stored information associated with the matched descriptors. | 03-22-2012 |
20120042150 | MULTIPROCESSOR SYSTEM-ON-A-CHIP FOR MACHINE VISION ALGORITHMS - A multiprocessor system includes a main memory and multiple processing cores that are configured to execute software that uses data stored in the main memory. In some embodiments, the multiprocessor system includes a data streaming unit, which is connected between the processing cores and the main memory and is configured to pre-fetch the data from the main memory for use by the multiple processing cores. In some embodiments, the multiprocessor system includes a scratch-pad processing unit, which is connected to the processing cores and is configured to execute, on behalf of the multiple processing cores, a selected part of the software that causes two or more of the processing cores to access concurrently a given item of data. | 02-16-2012 |
20120038986 | PATTERN PROJECTOR - A pattern projector comprises a light source configured to emit a beam of light. A transparent substrate, which has a pair of mutually-opposed planar surfaces, is configured to receive and propagate the beam within the substrate by total internal reflection between the planar surfaces. The transparent substrate comprises a diffractive structure that is formed on one of the planar surfaces and is configured to direct at least a part of the beam to propagate out of the substrate in a direction that is angled away from the surface and to create a pattern comprising multiple interleaved light and dark areas. | 02-16-2012 |
20110310010 | GESTURE BASED USER INTERFACE - A gesture based user interface includes a movement monitor configured to monitor a user's hand and to provide a signal based on movements of the hand. A processor is configured to provide at least one interface state in which a cursor is confined to movement within a single dimension region responsive to the signal from the movement monitor, and to actuate different commands responsive to the signal from the movement monitor and the location of the cursor in the single dimension region. | 12-22-2011 |
20110293137 | ANALYSIS OF THREE-DIMENSIONAL SCENES - A method for processing data includes receiving a depth map of a scene containing a humanoid form. The depth map is processed so as to identify three-dimensional (3D) connected components in the scene, each connected component including a set of the pixels that are mutually adjacent and have mutually-adjacent depth values. Separate, first and second connected components are identified as both belonging to the humanoid form, and a representation of the humanoid form is generated including both of the first and second connected components. | 12-01-2011 |
20110292036 | DEPTH SENSOR WITH APPLICATION INTERFACE - A method for processing data includes receiving a depth map of a scene containing a body of a humanoid subject. The depth map includes a matrix of pixels, each pixel corresponding to a respective location in the scene and having a respective pixel depth value indicative of a distance from a reference plane to the respective location. The depth map is processed in a digital processor to extract a skeleton of at least a part of the body, the skeleton including multiple joints having respective coordinates. An application program interface (API) indicates at least the coordinates of the joints. | 12-01-2011 |
20110254765 | Remote text input using handwriting - A method for user input includes capturing a sequence of positions of at least a part of a body, including a hand, of a user of a computerized system, independently of any object held by or attached to the hand, while the hand delineates textual characters by moving freely in a 3D space. The positions are processed to extract a trajectory of motion of the hand. Features of the trajectory are analyzed in order to identify the characters delineated by the hand. | 10-20-2011 |
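The pipeline above (positions → trajectory → trajectory features → characters) can be sketched at the feature-extraction step with a classic 8-direction chain code, which reduces a hand trajectory to a short symbol string that a template matcher could compare against stored character models. The chain code is a stand-in illustration, not the feature analysis the application actually claims.

```python
import math

def trajectory_directions(points, n_bins=8):
    """Quantize a hand trajectory (a sequence of (x, y) positions) into
    chain-code direction symbols: 0 = +x, 2 = +y, 4 = -x, 6 = -y, with
    odd codes for the diagonals. n_bins=8 is an illustrative choice."""
    step = 2 * math.pi / n_bins
    codes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        angle = math.atan2(y1 - y0, x1 - x0) % (2 * math.pi)
        codes.append(int(round(angle / step)) % n_bins)
    return codes
```

A trajectory that moves right and then up, for instance, yields the code sequence `[0, 2]`; consecutive repeated codes would typically be collapsed before matching.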
20110211754 | Tracking body parts by combined color image and depth processing - A method for image processing includes receiving a depth image of a scene containing a human subject and receiving a color image of the scene containing the human subject. A part of a body of the subject is identified in at least one of the images. A quality of both the depth image and the color image is evaluated, and responsively to the quality, one of the images is selected to be dominant in processing of the part of the body in the images. The identified part is localized in the dominant one of the images, while using supporting data from the other one of the images. | 09-01-2011 |
20110211044 | Non-Uniform Spatial Resource Allocation for Depth Mapping - A method for depth mapping includes providing depth mapping resources including an illumination module, which is configured to project patterned optical radiation into a volume of interest containing an object, and an image capture module, which is configured to capture an image of the pattern reflected from the object. A depth map of the object is generated using the resources while applying at least one of the resources non-uniformly over the volume of interest. | 09-01-2011 |
20110205421 | WIDEBAND AMBIENT LIGHT REJECTION - Optical apparatus includes an image sensor and objective optics, which are configured to collect and focus optical radiation over a range of wavelengths along a common optical axis toward a plane of the image sensor. A dispersive element is positioned to spread the optical radiation collected by the objective optics so that different wavelengths in the range are focused along different, respective optical axes toward the plane. | 08-25-2011 |
20110188054 | INTEGRATED PHOTONICS MODULE FOR OPTICAL PROJECTION - Optical apparatus includes a semiconductor substrate and an edge-emitting radiation source, mounted on a surface of the substrate so as to emit optical radiation along an axis that is parallel to the surface. A reflector is fixed to the substrate in a location on the axis and is configured to reflect the optical radiation in a direction that is angled away from the surface. One or more optical elements are mounted on the substrate so as to receive and transmit the optical radiation reflected by the reflector. | 08-04-2011 |
20110187878 | Synchronization of projected illumination with rolling shutter of image sensor - Imaging apparatus includes an illumination assembly, including a plurality of radiation sources and projection optics, which are configured to project radiation from the radiation sources onto different, respective regions of a scene. An imaging assembly includes an image sensor and objective optics configured to form an optical image of the scene on the image sensor, which includes an array of sensor elements arranged in multiple groups, which are triggered by a rolling shutter to capture the radiation from the scene in successive, respective exposure periods from different, respective areas of the scene so as to form an electronic image of the scene. A controller is coupled to actuate the radiation sources sequentially in a pulsed mode so that the illumination assembly illuminates the different, respective areas of the scene in synchronization with the rolling shutter. | 08-04-2011 |
20110158508 | DEPTH-VARYING LIGHT FIELDS FOR THREE DIMENSIONAL SENSING - A method for mapping includes projecting onto an object a pattern of multiple spots having respective positions and shapes, such that the positions of the spots in the pattern are uncorrelated, while the shapes share a common characteristic. An image of the spots on the object is captured and processed so as to derive a three-dimensional (3D) map of the object. | 06-30-2011 |
20110134114 | DEPTH-BASED GAIN CONTROL - A method for depth mapping includes capturing an electronic image of a scene using an imaging device. The electronic image is processed to generate depth data with respect to the scene. The gain of the imaging device is set responsively to the depth data. | 06-09-2011 |
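The control loop described above (capture → depth data → gain setting) can be sketched as a simple mapping from scene depth to sensor gain: reflected light falls off with distance, so a farther scene warrants higher gain. The median statistic, the gain range, and the `far` normalization constant are all illustrative assumptions, not values from the application.

```python
def set_gain_from_depth(depth_map, min_gain=1.0, max_gain=8.0, far=4000.0):
    """Choose an imaging gain responsively to depth data: scale gain
    linearly with the median valid depth (in mm), clamped at `far`.
    Zero depth values are treated as invalid pixels and ignored."""
    valid = sorted(d for row in depth_map for d in row if d > 0)
    if not valid:
        return max_gain  # no depth data: assume a distant, dim scene
    median = valid[len(valid) // 2]
    return min_gain + (max_gain - min_gain) * min(median / far, 1.0)
```

A real implementation would smooth the gain over successive frames to avoid oscillation between the depth estimate and the exposure it feeds back into.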
20110114857 | OPTICAL PROJECTOR WITH BEAM MONITOR - Optical apparatus includes a device package, with a radiation source contained in the package and configured to emit a beam of coherent radiation. A diffractive optical element (DOE) is mounted in the package so as to receive and diffract the radiation from the radiation source into a predefined pattern comprising multiple diffraction orders. An optical detector is positioned in the package so as to receive and sense an intensity of a selected diffraction order of the DOE. | 05-19-2011 |
20110075259 | OPTICAL DESIGNS FOR ZERO ORDER REDUCTION - Apparatus for projecting a pattern includes a first diffractive optical element (DOE) configured to diffract an input beam so as to generate a first diffraction pattern on a first region of a surface, the first diffraction pattern including a zero order beam. A second DOE is configured to diffract the zero order beam so as to generate a second diffraction pattern on a second region of the surface such that the first and the second regions together at least partially cover the surface. | 03-31-2011 |
20110069389 | OPTICAL DESIGNS FOR ZERO ORDER REDUCTION - Apparatus for projecting a pattern includes a first diffractive optical element (DOE) configured to diffract an input beam so as to generate a first diffraction pattern on a first region of a surface, the first diffraction pattern including a zero order beam. A second DOE is configured to diffract the zero order beam so as to generate a second diffraction pattern on a second region of the surface such that the first and the second regions together at least partially cover the surface. | 03-24-2011 |
20110052006 | EXTRACTION OF SKELETONS FROM 3D MAPS - A method for processing data includes receiving a temporal sequence of depth maps of a scene containing a humanoid form having a head. The depth maps include a matrix of pixels having respective pixel depth values. A digital processor processes at least one of the depth maps so as to find a location of the head and estimates dimensions of the humanoid form based on the location. The processor tracks movements of the humanoid form over the sequence using the estimated dimensions. | 03-03-2011 |
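The dimension-estimation step above (find the head, then infer the figure's overall size) can be sketched with the pinhole camera model plus a rough anthropometric ratio: the head's apparent pixel height and its depth give its physical height, and a typical adult body spans roughly 7.5 head-heights. The focal length and the ratio are illustrative assumptions, not parameters from the application.

```python
def estimate_body_dimensions(head_depth_mm, head_pixel_height, focal_px=500.0):
    """Estimate humanoid dimensions from the head's location in a depth
    map: back-project the head's pixel height to millimeters via the
    pinhole model (size = pixels * depth / focal length), then scale to
    full body height with an assumed ~7.5 head-heights-per-body ratio."""
    head_height_mm = head_pixel_height * head_depth_mm / focal_px
    return {"head_mm": head_height_mm, "height_mm": head_height_mm * 7.5}
```

For example, a head spanning 60 pixels at a depth of 2 m works out to a 240 mm head and an estimated 1.8 m figure, a plausible scale prior for tracking the form through the rest of the sequence.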
20110025827 | Depth Mapping Based on Pattern Matching and Stereoscopic Information - A method for depth mapping includes projecting a pattern of optical radiation onto an object. A first image of the pattern on the object is captured using a first image sensor, and this image is processed to generate pattern-based depth data with respect to the object. A second image of the object is captured using a second image sensor, and the second image is processed together with another image to generate stereoscopic depth data with respect to the object. The pattern-based depth data is combined with the stereoscopic depth data to create a depth map of the object. | 02-03-2011 |
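The final combination step above (pattern-based depth merged with stereoscopic depth) can be sketched as a per-pixel confidence-weighted average: where one source has no confidence (e.g. the pattern decorrelates in sunlight, or stereo fails on a textureless wall), the other takes over. The confidence maps and the weighting scheme are illustrative assumptions; the application does not specify this rule.

```python
def fuse_depth(pattern_depth, stereo_depth, pattern_conf, stereo_conf):
    """Combine pattern-based and stereoscopic depth maps pixel by pixel,
    weighting each estimate by its (hypothetical) confidence score.
    Pixels where both confidences are zero are left at depth 0."""
    fused = []
    for pd_row, sd_row, pc_row, sc_row in zip(pattern_depth, stereo_depth,
                                              pattern_conf, stereo_conf):
        row = []
        for pd, sd, pc, sc in zip(pd_row, sd_row, pc_row, sc_row):
            total = pc + sc
            row.append((pd * pc + sd * sc) / total if total else 0.0)
        fused.append(row)
    return fused
```

The two modalities fail in complementary conditions, which is the motivation for carrying both: the fused map keeps whichever estimate is trustworthy at each pixel instead of committing to a single cue for the whole scene.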
20100284082 | OPTICAL PATTERN PROJECTION - Optical apparatus includes first and second diffractive optical elements (DOEs) arranged in series to diffract an input beam of radiation. The first DOE is configured to apply to the input beam a pattern with a specified divergence angle, while the second DOE is configured to split the input beam into a matrix of output beams with a specified fan-out angle. The divergence and fan-out angles are chosen so as to project the radiation onto a region in space in multiple adjacent instances of the pattern. | 11-11-2010 |
20100265316 | THREE-DIMENSIONAL MAPPING AND IMAGING - Imaging apparatus includes an illumination subassembly, which is configured to project onto an object a pattern of monochromatic optical radiation in a given wavelength band. An imaging subassembly includes an image sensor, which is configured both to capture a first, monochromatic image of the pattern on the object by receiving the monochromatic optical radiation reflected from the object and to capture a second, color image of the object by receiving polychromatic optical radiation, and to output first and second image signals responsively to the first and second images, respectively. A processor is configured to process the first and second signals so as to generate and output a depth map of the object in registration with the color image. | 10-21-2010 |
20100235786 | ENHANCED 3D INTERFACING FOR REMOTE DEVICES - Operating a computerized system includes presenting user interface elements on a display screen. A first gesture made in a three-dimensional space by a part of a body of a user is detected. In response to the first gesture, an area of the display screen selected by the user is identified, and a magnification level of one or more of the user interface elements appearing in the selected area on the display screen is increased. After increasing the magnification level, a second gesture made by the part of the body of the user is detected so as to select one of the user interface elements that appear in the selected area. | 09-16-2010 |