
Range or distance measuring

Subclass of:

382 - Image analysis

382100000 - APPLICATIONS


Deeper subclasses:

Entries
Document | Title | Date
20110176709METHOD AND APPARATUS FOR CALCULATING A DISTANCE BETWEEN AN OPTICAL APPARATUS AND AN OBJECT - A method and apparatus for determining a distance between an optical apparatus and an object by considering a measured nonlinear waveform, as opposed to a mathematically ideal waveform. The method and apparatus may accurately calculate distance information without being affected by a type of waveform projected onto the object and may not require an expensive light source or a light modulator for generating a light with little distortion and nonlinearity. Further, since the method may be able to use a general light source, a general light modulator, and a general optical apparatus, additional costs do not arise. Furthermore, a lookup table, in which previously calculated distance information is stored, may be used, and thus the amount of computation required to be performed to calculate the distance is small, thereby allowing for quick calculation of the distance information in real time.07-21-2011
20130028485IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND COMPUTER READABLE RECORDING DEVICE - An image processing apparatus includes a distance information calculator that calculates distance information corresponding to a distance to an imaging object at each of portions in an image; a feature data calculator that calculates feature data at each portion in the image; a feature data distribution calculator that calculates a distribution of the feature data in each of regions that are classified according to the distance information in the image; a reliability determining unit that determines the reliability of the distribution of the feature data in each of the regions; and a discrimination criterion generator that generates, for each of the regions, a discrimination criterion for discriminating a specific region in the image based on a determination result of the reliability and the distribution of the feature data in each of the regions.01-31-2013
20130028482Method and System for Thinning a Point Cloud - A system and method for thinning a point cloud. In one aspect of the method, the point cloud is generated by an imaging device and data points are thinned out of the point cloud based upon their distance from the imaging device.01-31-2013
20130028483IMAGE EDITING PROGRAM, IMAGE EDITING METHOD, IMAGE EDITING APPARATUS AND STORAGE MEDIUM - Provided is an image editing program that causes an arithmetic processing unit to execute setting a recognition target image from an image included in image data as well as setting attribute information of the recognition target image, searching a frame image for the recognition target image by using the attribute information with respect to each of frames constituting the image data, and assigning image processing information regarding a processing target region determined in accordance with an image range, within a frame, of the recognition target image retrieved in the step of searching.01-31-2013
20120163672Depth Estimate Determination, Systems and Methods - Systems and methods for generating pixel based depth estimates are disclosed. An image processing system operating as a depth analysis engine generates an estimated depth associated with a pixel based on a reference image and other related images. A current depth estimate is refined based on neighboring pixels and calculated consistency scores. Further, depth estimates can be leveraged in object or scene recognition to trigger or initiate an action taken by a computing device.06-28-2012
20130028484IMAGE PROCESSING APPARATUS FOR FUNDUS IMAGE, IMAGE PROCESSING METHOD FOR FUNDUS IMAGE, AND PROGRAM MEDIUM - An image processing apparatus includes a selection unit configured to select either first color tone conversion processing or second color tone conversion processing having a ratio of red wavelength component set lower than either blue wavelength or green wavelength components with respect to the first color tone conversion processing, and a color tone conversion unit configured to convert a color of a fundus image by the selected color tone conversion processing.01-31-2013
20120170815SYSTEM AND METHOD FOR RANGE AND VELOCITY ESTIMATION IN VIDEO DATA AS A FUNCTION OF ANTHROPOMETRIC MEASURES - A system and method calculate a range and velocity of an object in image data. The range calculation includes detecting a contour of the object from the image data, forming a template from the image data based on the contour; and calculating a range to the object using pixel resolution and dimension statistics of the object. A three-dimensional velocity of the object is determined by calculating a radial component and an angular component of the velocity. The radial velocity component is calculated by determining the range of the object in two or more image frames, determining a time differential between the two or more image frames, and calculating the radial velocity as a function of the range of the object in the two or more image frames and the time differential between the two or more image frames. The angular component is calculated using spatial-temporal derivatives as a function of a motion constraint equation.07-05-2012
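
As a rough illustration of the radial-velocity step described in the abstract above, the calculation reduces to dividing a range change by a frame-time difference; the sketch below uses invented numbers and function names, not anything taken from the application.

    # Illustrative sketch only: radial velocity from ranges measured in two image frames.
    def radial_velocity(range1_m, range2_m, t1_s, t2_s):
        """Radial velocity (m/s): change in range divided by the time between frames."""
        return (range2_m - range1_m) / (t2_s - t1_s)

    # Example: object at 42.0 m in the first frame and 40.5 m half a second later.
    print(radial_velocity(42.0, 40.5, 0.0, 0.5))  # -3.0 m/s, i.e. closing at 3 m/s
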
20100158319METHOD AND APPARATUS FOR FAKE-FACE DETECTION USING RANGE INFORMATION - A fake-face detection method using range information includes: detecting face range information and face features from an input face image; matching the face image with the range information; and distinguishing a fake face by analyzing the matched range information.06-24-2010
20090123032MEASUREMENT OF GAPS BETWEEN VALVE SEATS AND ATTACHMENT PARTS - An apparatus for measuring gaps that form between attachment parts (05-14-2009
20090290758Rectangular Table Detection Using Hybrid RGB and Depth Camera Sensors - Objects having a flat surface such as a table are detected by processing a depth image and a color image. A mask indicating an area likely to include an object having the flat surface is generated by processing a depth image including the depth information. A color image corresponding to the depth image is then cropped using the mask to detect a portion of the color image that likely includes the object having the flat surface. Geometric features of the cropped color image such as lines are then detected to determine the location and orientation of the object having the flat surface. A subset of the detected geometric features is selected as outlines of the flat surface.11-26-2009
20100119115IMAGING SYSTEM FOR IMAGING AN OBJECT IN AN EXAMINATION ZONE - The invention relates to an imaging system for imaging an object (05-13-2010
20100074474Device and Method for Analyzing an Organic Sample - A device and method for analyzing an organic sample provide high spatial resolution. A focused ion beam is directed onto the organic sample. Fragments detached from the sample are examined using mass spectroscopy.03-25-2010
20100074473SYSTEM AND METHOD OF EXTRACTING PLANE FEATURES - A navigation system comprises an image sensor operable to obtain range data for a first scene, and a processing unit coupled to the image sensor. The processing unit is operable to identify one or more plane features, based on the range data, using each of a plurality of scales. The processing unit is further operable to combine each of the one or more plane features with a corresponding plane feature from each of the plurality of scales and to project the one or more combined plane features to a reference orientation.03-25-2010
20090169057METHOD FOR PRODUCING IMAGE WITH DEPTH BY USING 2D IMAGES - A method for producing an image with depth by using 2D images includes obtaining a set of internal parameters of a camera. The camera takes at least a first and a second 2D image with a small shift. The first 2D image has N depths, and N≧2. Several sets of external parameters of the camera corresponding to the 2D images are estimated. 3D information respectively corresponding to the N depths of the first 2D image at each pixel or block is calculated. A proper depth of each pixel or image block is determined. Through the internal parameters, the external parameters, and the N depths, each pixel or image block of the first 2D image is projected onto N positions of the second 2D image, so as to perform a matching comparison analysis with the second 2D image, thereby determining the proper depth from the N depths.07-02-2009
20100046803Apparatus, method, and program for processing image - A reference image and a synthesis image covering an area outside the viewfield of the reference image are synthesized as one piece of display image data in a manner reflecting the positional relationship of the images at the image capturing. The display image data is displayed with a partial area thereof presented on a display screen, and the partial area is moved in response to a shift instruction.02-25-2010
20100046802DISTANCE ESTIMATION APPARATUS, DISTANCE ESTIMATION METHOD, STORAGE MEDIUM STORING PROGRAM, INTEGRATED CIRCUIT, AND CAMERA - Attempts to achieve a higher resolution and a higher frame rate of a distance image when a distance to an object within a target space is estimated using the TOF method would cause CCD saturation due to shot noise or environment light, and lower distance precision. A distance estimation apparatus illuminates an object with illumination light for distance estimation emitted from a light source that can emit light (electromagnetic wave) having a predetermined illumination frequency, receives reflected light of the illumination light, obtains information about the distance from the apparatus to the object, generates distance image data based on the distance information, extracts edge information of a color image formed using a visible light component obtained in synchronization with the reflected light, and corrects distance information of a target part of the distance image using distance information of a neighboring part of the target part based on the edge information.02-25-2010
20100046801APPARATUS, METHOD AND PROGRAM FOR DISTANCE MEASUREMENT - A depth distance of a target point for measurement can be found by exploiting the equality of a cross ratio of phases of reference data points and a target point for measurement to a cross ratio of depth distances of these points. The depth distance of the target point for measurement can also be determined by exploiting the equality of a cross ratio of the distances among a set of projection points corresponding to projection of reference data points and a target point on a reference plane; of a cross ratio of distances among a set of image projection points corresponding to projection of a set of projection points on an image; and of a cross ratio of distances of a set of points which are equiphase to a set of image projection points on an arbitrary straight line on an image, respectively, relative to the cross ratio of the depth distances.02-25-2010
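
The cross-ratio equality used in the abstract above can be made concrete with a small numeric sketch. All values below are invented for illustration: given the cross ratio of four measured values and three known reference depths, the one unknown depth follows by solving the cross-ratio equation.

    # Illustrative sketch only: recover an unknown depth from equality of cross ratios.
    def cross_ratio(a, b, c, d):
        """Cross ratio (a, b; c, d) = ((c - a) * (d - b)) / ((c - b) * (d - a))."""
        return ((c - a) * (d - b)) / ((c - b) * (d - a))

    def solve_fourth_depth(cr, z1, z2, z3):
        """Solve cr = ((z3 - z1)*(z4 - z2)) / ((z3 - z2)*(z4 - z1)) for the unknown z4."""
        a, b = z3 - z1, z3 - z2
        return (cr * b * z1 - a * z2) / (cr * b - a)

    # Self-check: values sharing the cross ratio of depths 1, 2, 3, 4 m give back 4 m.
    cr = cross_ratio(1.0, 2.0, 3.0, 4.0)
    print(solve_fourth_depth(cr, 1.0, 2.0, 3.0))  # 4.0
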
20100046800THREE DIMENSIONAL SCANNING ARRANGEMENT INCLUDING DYNAMIC UPDATING - A three dimensional machine scanning arrangement for a machine traveling over a worksite includes a pair of scanners that are mounted on the machine. Each of the pair of scanners measures distances to a number of points on the ground at the worksite. One of the pair of scanners faces rearward and the other of the pair of scanners faces forward. A control is responsive to the pair of scanners. The control determines the contour of the worksite. A display, mounted on the machine, is responsive to the control for displaying the contour of the worksite.02-25-2010
20130034270METHOD AND DEVICE FOR SHAPE EXTRACTION, AND SIZE MEASURING DEVICE AND DISTANCE MEASURING DEVICE - When an image is captured in a wood lumber measuring mode, a distance to wood lumber is detected. An area extracting unit 02-07-2013
20130044917METHOD FOR DRIVING SEMICONDUCTOR DEVICE - A method for driving a semiconductor device which enables three-dimensional imaging is provided. The method for driving the semiconductor device also enables a reduction in the size of a pixel, two-dimensional imaging concurrently with the three-dimensional imaging, and/or accurate three-dimensional imaging of a fast-moving object. The distance from a light source to an object is measured by performing a first imaging and a second imaging with respect to the timings of the first irradiation and the second irradiation, respectively. A first photosensor absorbing visible light and a second photosensor absorbing infrared light are overlapped with each other and enable the two-dimensional imaging and the three-dimensional imaging, respectively, to be performed concurrently. Adjacent photosensors detect light reflected off substantially the same point of an object, preventing a reduction in the accuracy of the three-dimensional imaging of a fast-moving object.02-21-2013
20100104139METHOD AND SYSTEM FOR VIDEO-BASED ROAD LANE CURVATURE MEASUREMENT - A method and system for video-based road lane curvature measurement is provided. An image processing system receives roadway scene images from a vehicle-mounted video camera to measure road curvature. Road boundary indicators, such as lane markings, are used for identifying the road boundaries. The road curvature is approximated using a relation between the slope of a line connecting two points on a lane marking, and the average longitudinal distance from the camera to the line.04-29-2010
20100104138VEHICLE PERIPHERY MONITORING DEVICE, VEHICLE PERIPHERY MONITORING PROGRAM, AND VEHICLE PERIPHERY MONITORING METHOD - A vehicle periphery monitoring device is provided with an image processing target area setting portion for setting an image processing target area (04-29-2010
20100092042MANEUVERING ASSISTING APPARATUS - A maneuvering assisting apparatus includes a plurality of cameras. Each camera is arranged in a downward attitude on a side surface of a ship hull, and captures surroundings of the ship. A CPU creates a whole-circumference bird's eye view image representing in an aerially viewed manner the surroundings of the ship, based on outputs of these cameras. Also, the CPU transparently multiplexes a graphic image representing at least an extension of the aerially viewed ship, onto the whole-circumference bird's eye view image. Moreover, the CPU non-transparently multiplexes a graphic image representing one portion of the aerially viewed ship, onto the whole-circumference bird's eye view image.04-15-2010
20130089239METHOD OF DETERMINING SOLDER PASTE HEIGHT AND DEVICE FOR DETERMINING SOLDER PASTE HEIGHT - A method of determining a solder paste height of solder paste printed on a circuit board, the method including obtaining a two-dimensional image of the circuit board which is captured from above a solder printed surface, and determining the solder paste height corresponding to a pixel value of each of pixels of the two-dimensional image, based on height information which defines a relationship between the pixel value and the solder paste height, the pixel value being a value representing at least one of luminance of red in a RGB color model, luminance of green in the RGB color model, luminance of blue in the RGB color model, hue in a HSI color model, saturation in the HSI color model, and intensity in the HSI color model.04-11-2013
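
The pixel-value-to-height mapping described above can be sketched as a simple interpolation over calibration pairs; the channel choice and the calibration numbers below are assumptions made for illustration, not values from the application.

    # Illustrative sketch only: map a channel value to a solder paste height via calibration pairs.
    import numpy as np

    # Assumed calibration: green-channel luminance vs. height in micrometres.
    pixel_values = np.array([40.0, 80.0, 120.0, 160.0, 200.0])
    heights_um = np.array([20.0, 60.0, 100.0, 140.0, 180.0])

    def paste_height(pixel_value):
        """Piecewise-linear interpolation of height from the measured pixel value."""
        return float(np.interp(pixel_value, pixel_values, heights_um))

    print(paste_height(100.0))  # 80.0 um under this assumed calibration
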
20130089238SYSTEM AND METHOD FOR MEASURING IMAGES OF OBJECT - A computing device reads an entire image of an object. The entire image is spliced by a plurality of part images. A user selects an area on the entire image. The computing device determines a first number of first pixel points between a center point of the selected area and a center point of each covered image. The covered images are part images that the selected area covers. The coordinate values of the center point of the selected area are calculated according to the first number of pixel points and a size of each pixel point of the entire image. The computing device calculates coordinate values of each point of a selected area according to the size of each pixel point and the coordinate values of the center point of the selected area.04-11-2013
20130051626Method And Apparatus For Object Pose Estimation - A pose of an object is estimated from an input image and an object pose estimation is then stored by: inputting an image containing an object; creating a binary mask of the input image; extracting a set of singlets from the binary mask of the input image, each singlet representing points in an inner and outer contour of the object in the input image; connecting the set of singlets into a mesh represented as a duplex matrix; comparing two duplex matrices to produce a set of candidate poses; and producing an object pose estimate, and storing the object pose estimate. The estimated pose of the object is refined by: inputting an image of an object in an estimated pose, a model of the object, and parameters of a camera used to take the image of the object in the estimated pose; projecting the model of the object into a virtual image of the object using the parameters of the camera and initial pose parameters to obtain a binary mask image and image depth information; and updating the initial pose parameters to new pose parameters using the binary mask image and image depth information and updating the new pose parameters iteratively to minimize an energy function or until a maximum number of iterations is reached.02-28-2013
20110311104Multi-Stage Linear Structure from Motion - Described is a linear structure from motion technique that is scalable, parallelizable, treats images equally, and is robust to outliers, without requiring intermediate bundle adjustment. Camera rotations for images are estimated using feature point correspondence and vanishing points matched across the images. The camera rotation data is fed into a linear system for structure and translation estimation that removes outliers and provides output data corresponding to structure from motion parameters. The data may be used in further optimization e.g. with a final non-linear optimization stage referred to as bundle adjustment to provide final refined structure from motion parameters.12-22-2011
20090304235METHOD AND DEVICE FOR OBSERVING AN OBJECT - The invention relates to a method of analysing or observing an object (12-10-2009
20100272318ENDOSCOPIC MEASUREMENT TECHNIQUES - Apparatus for use in a lumen is provided, including a light source, configured to illuminate a vicinity of an object of interest of a wall of the lumen, and an optical system (10-28-2010
20120219191LOCAL METRIC LEARNING FOR TAG RECOMMENDATION IN SOCIAL NETWORKS - A tag recommendation for an item to be tagged is generated by: selecting a set of candidate neighboring items in an electronic social network based on context of items in the electronic social network respective to an owner of the item to be tagged; selecting a set of nearest neighboring items from the set of candidate neighboring items based on distances of the candidate neighboring items from the item to be tagged as measured by an item comparison metric; and selecting at least one tag recommendation based on tags of the items of the set of nearest neighboring items. The item comparison metric may comprise a Mahalanobis distance metric trained on the set of candidate neighboring items to correlate the trained Mahalanobis distance between pairs of items of the set of candidate neighboring items with an overlap metric indicative of overlap of the tag sets of the two items.08-30-2012
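
A minimal sketch of the neighbor-selection and tag-voting idea follows, assuming item feature vectors, a trained Mahalanobis matrix M, and per-item tag sets; the data and function names are illustrative, not the application's interface.

    # Illustrative sketch only: recommend tags from nearest neighbors under a Mahalanobis metric.
    import numpy as np
    from collections import Counter

    def mahalanobis(x, y, M):
        d = x - y
        return float(np.sqrt(d @ M @ d))

    def recommend_tags(item_vec, neighbor_vecs, neighbor_tags, M, k=2, n_tags=2):
        """Pick the k nearest candidate items and return the most frequent tags among them."""
        dists = [mahalanobis(item_vec, v, M) for v in neighbor_vecs]
        nearest = np.argsort(dists)[:k]
        votes = Counter(tag for i in nearest for tag in neighbor_tags[i])
        return [tag for tag, _ in votes.most_common(n_tags)]

    M = np.eye(2)  # identity metric stands in for a trained Mahalanobis matrix
    vecs = [np.array([0.0, 0.0]), np.array([0.1, 0.1]), np.array([5.0, 5.0])]
    tags = [{"beach", "sunset"}, {"beach"}, {"office"}]
    print(recommend_tags(np.array([0.05, 0.0]), vecs, tags, M))  # ['beach', 'sunset']
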
20090092288IMAGE POSITION MEASURING APPARATUS AND EXPOSURE APPARATUS - An image position measuring apparatus is provided with a photographing unit including an image pickup device and/or a lens to measure a position of a reference mark formed on a work and a correcting unit for correcting distortion(s) of the image pickup device and/or the lens. An exposure apparatus is provided with an image position measuring apparatus and an exposure unit for exposing the work based on image data corrected based on positional information of the reference mark photographed by the image position measuring apparatus. Influences of the distortions of the image pickup device and the lens can be eliminated and accuracy of measurement of the position of the reference mark provided to the work can be improved.04-09-2009
20130058539METHOD AND A SYSTEM TO DETECT AND TO DETERMINE GEOMETRICAL, DIMENSIONAL AND POSITIONAL FEATURES OF PRODUCTS TRANSPORTED BY A CONTINUOUS CONVEYOR, PARTICULARLY OF RAW, ROUGHLY SHAPED, ROUGHED OR HALF-FINISHED STEEL PRODUCTS - A method to detect and to determine geometrical, dimensional and positional features of products (P) transported by a continuous conveyor (03-07-2013
20130058540METHOD FOR MEASURING AND CONTROLLING DISTANCE BETWEEN LOWER END SURFACE OF HEAT SHIELDING MEMBER AND SURFACE OF RAW MATERIAL MELT AND METHOD FOR MANUFACTURING SILICON SINGLE CRYSTAL - A method for measuring a distance between a lower end surface of a heat shielding member including a criterion reflector inside a concavity on the lower end surface and a surface of a raw material melt includes: pulling a silicon single crystal by the Czochralski method while a magnetic field is applied to the raw material melt in a crucible; measuring the distance between the lower end surface of the heat shielding member and the surface of the raw material melt and observing a position of a mirror image of the criterion reflector with a fixed point observation apparatus; and measuring a movement distance of the mirror image with the apparatus and calculating the distance between the lower end surface of the heat shielding member and the surface of the raw material melt from the movement distance of the image and the measured distance.03-07-2013
20130058538CONTACT STATE ESTIMATING APPARATUS AND TRAJECTORY GENERATION APPARATUS - A state where at least a part of the virtual object (e.g., a foot of a robot) enters into the actual object (e.g., floor), i.e., a state where at least a part of a plurality of virtual points located on the surface of the virtual object is inside the actual object, can be assumed. At each of the inside and the outside of the actual object, as a virtual point is located at a deeper position inside and away from the surface or the skin part of the actual object and as a coordinate value difference ΔZ03-07-2013
20130058537SYSTEM AND METHOD FOR IDENTIFYING A REGION OF INTEREST IN A DIGITAL IMAGE - A system and method for identifying a region of interest in a digital image. First and second images of a scene may be obtained from respective first and second points of view. Following an acquisition of a first image from a first point of view, a subsequent image may be automatically acquired upon determining that a second view point is achieved. Based on two or more images of a scene, a background object may be removed from an image to produce an image that only includes a foreground object or a region of interest.03-07-2013
20120224748METHOD FOR INTELLIGENTLY DISPLAYING SPORTS GAME VIDEO FOR MULTIMEDIA MOBILE TERMINAL - Disclosed is a method for intelligently displaying a sports game video for a multimedia mobile terminal, the method including the steps of: determining if a camera shot existing in the sports game video is a long-shot; when the camera shot is determined as a long-shot, determining a ROI within an image frame of the sports game video; and when the camera shot is determined as a non-long-shot, displaying the image frame on the mobile terminal, and, when the camera shot is determined as a long-shot, enlarging and displaying the ROI on the mobile terminal.09-06-2012
20090232359METHOD FOR DETERMINING DISTANCE BETWEEN REFERENCE MEMBER AND MELT SURFACE, METHOD FOR CONTROLLING LOCATION OF MELT SURFACE USING THE SAME, AND APPARATUS FOR PRODUCING SILICON SINGLE CRYSTAL - The present invention is a method for determining a relative distance between a reference member placed above a melt surface and the melt surface upon pulling a silicon single crystal out of a raw material melt in a crucible by a CZ method characterized by at least: pulling the silicon single crystal applying a magnetic field; taking a picture of a real image of the reference member and a mirror image of the reference member reflected on the melt surface with a detector; processing the picture taken of the real image and the mirror image of the reference member as different pictures by separating the picture taken; and calculating the relative distance between the real image and the mirror image of the reference member from the processed pictures to determine the relative distance between the reference member and the melt surface.09-17-2009
20090010496IMAGE INFORMATION PROCESSING APPARATUS, JUDGING METHOD, AND COMPUTER PROGRAM - Image information is input by an image input unit, a marker is extracted from an image of the input image information by a marker detector, a position of the extracted marker in the image is detected by a position/posture detector, and the difference between indicators extracted from images is judged. At this time, the position/posture detector is provided with a judgment condition based on the position of each marker, as a judgment condition that is at least selectively applied.01-08-2009
20090010495Vulnerable Road User Protection System - A range map of a visual scene is generated by a stereo vision and associated image processing system, and is filtered to remove objects beyond a region of interest and for which a collision is not possible, and to remove an associated road surface. Objects clustered in range bins are separated by segmentation. A composite range map is generated using principal components analysis and processed with a connected components sieve filter. Objects are identified using one or more of a harmonic profile and other features using an object recognition processor using a combination of inclusive, exclusive and harmonic networks to generate a classification metric.01-08-2009
20120114182METHOD AND APPARATUS FOR MEASURING DEPTH OF FIELD - A method and an apparatus for measuring a depth of field (DOF) are disclosed. The DOF measuring method includes the following steps. Firstly, an image is captured at each of a plurality of focus scales, wherein each image includes an image region corresponding to the same image area. Next, one of the image regions is selected as the best DOF region. Then, a DOF value corresponding to the focus scale corresponding to the best DOF image region is determined according to a lookup table.05-10-2012
20120288158Method and Apparatus for Capturing, Geolocating and Measuring Oblique Images - A computerized system for displaying and making measurements based upon captured oblique images. The system includes a computer system executing image display and analysis software reading an oblique image having corresponding geo-location data and a data table storing ground plane data, the ground plane data comprising a plurality of facets within an area depicted within the oblique image, the facets having a plurality of elevation data that conforms to at least a portion of terrain depicted within the oblique image; wherein the computer system displays the oblique image, receives a starting point and an end point selected by the user, where one or both points may be above the terrain, and calculates a height difference between the starting and end points dependent upon the geo-location data and the elevation data of a facet of the ground plane data.11-15-2012
20090310824SYSTEM FOR READING AND AUTHENTICATING A COMPOSITE IMAGE IN A SHEETING - A system for reading and authenticating a composite image in a sheeting. An exemplary embodiment of the invention provides a system for reading and authenticating a sheeting including a composite image that appears to the unaided eye to be floating above or below the sheeting or both. The present invention also relates to methods of reading and authenticating a composite image that appears to the unaided eye to be floating above or below the sheeting or both.12-17-2009
20110299736APPARATUS FOR LOCALIZATION USING IMAGE AND RANGE DATA AND METHOD THEREOF - A localization apparatus and method for localizing by use of image information and range data are provided. The localization apparatus includes an image sensor which captures an image and outputs image data, a range sensor which senses a distance and outputs a range sensing signal, an image feature point information processing unit which extracts image feature points from the sensed image and generates image feature point information about the extracted image feature points, a range feature point information processing unit which generates range feature point information about range feature points from the range sensing signal, and a state estimation unit which estimates a state variable including a position of the localization apparatus by using the image feature point information and the range feature point information as an observation value.12-08-2011
20130163824Method and Device for Detecting Distance, Identifying Positions of Targets, and Identifying Current Position in Smart Portable Device - An electronic device for recognizing a position of a target object in a smart portable device includes a distance detection device for determining a distance between the smart portable device and the target object according to an image of the target object, a direction determination unit for acquiring a direction from the smart portable device to the target object, a positioning unit for acquiring coordinate information of a current position of the smart portable device, and a determination unit for determining the position of the target object according to the distance between the smart portable device and the target object, the direction from the smart portable device to the target object and the coordinate information of the current position of the smart portable device.06-27-2013
20110286634IMAGING DEVICE AND DISTANCE-MEASURING DEVICE USING SAME - An image capture device according to the present invention includes: a first optical system 10 that has a longitudinal chromatic aberration to cause first, second and third colors to form images at mutually different positions on an optical axis; a first image capturing region Na for generating an image that has a component in at least one of the first, second and third colors by using light that has been transmitted through the first optical system 10; a second optical system 20 that has a different longitudinal chromatic aberration from that of the first optical system 10; a second image capturing region Nb for generating an image that has a component in the same color as the at least one color by using light that has been transmitted through the second optical system 20; and an arithmetic processing section C for generating an output image by using one of the two images that has been generated in the first or second image capturing region Na or Nb so as to have the component in the at least one color apiece and that has the component with the higher degree of sharpness.11-24-2011
20120014565IMAGE PROCESSING METHOD, IMAGE PROCESSING APPARATUS AND NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM THEREFOR - When a feature point in a first image is employed to search for corresponding points in a second image, there is a case wherein an appropriate motion vector cannot be determined, such as a case wherein multiple corresponding point choices are present, or no corresponding points are found. Therefore, when a corresponding point that has a high similarity to the feature point in the first image is not specified in the second image, a motion vector that is to be determined for a feature point, other than the feature point in the first image, is examined, and a motion vector for this feature point is determined.01-19-2012
20110293146Methods for Estimating Peak Location on a Sampled Surface with Improved Accuracy and Applications to Image Correlation and Registration - Methods and systems for estimating peak location on a sampled surface (e.g., a correlation surface generated from pixilated images) utilize one or more processing techniques to determine multiple peak location estimates for at least one sampled data set at a resolution smaller than the spacing of the data elements. Estimates selected from the multiple peak location estimates are combined (e.g., a group of estimates is combined by determining a weighted average of the estimates selected for the group) to provide one or more refined estimates. In example embodiments, multiple refined estimates are combined to provide an estimate of overall displacement (e.g., of an image or other sampled data representation of an object).12-01-2011
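
The combining step described above, a weighted average of several sub-sample peak-location estimates, can be sketched as follows; the estimates and quality weights are invented for illustration.

    # Illustrative sketch only: combine sub-pixel peak-location estimates by weighted average.
    import numpy as np

    def combine_estimates(estimates_xy, weights):
        """estimates_xy: (N, 2) array of (x, y) estimates; weights: length-N quality scores."""
        return tuple(np.average(np.asarray(estimates_xy, float), axis=0, weights=weights))

    # Three estimates of the same correlation peak with assumed quality weights.
    print(combine_estimates([(10.2, 5.1), (10.4, 5.0), (10.3, 5.3)], [0.5, 0.3, 0.2]))
    # approximately (10.28, 5.11)
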
20090003654SINGLE-APERATURE PASSIVE RANGEFINDER AND METHOD OF DETERMINING A RANGE - A single-aperture passive rangefinder and a method of determining a range. In one embodiment, the single-aperture passive rangefinder includes: (1) an imaging system configured to form a first image that includes a point of interest at a first position and a second image at a second position that includes the point of interest and (2) a processor associated with the imaging system and configured to acquire and store the first image and the second image and determine a range to the point of interest based on a separation between the first position and the second position and a position of the point of interest relative to virtual axes of the imaging system at the first position and at the second position.01-01-2009
20090103782Method and apparatus for obtaining depth information - A method and apparatus for obtaining depth information are provided. The method includes calculating a relative depth value between a first color pixel and a second color pixel based on values of color pixels of a color image, and calculating a depth value of a second depth pixel that belongs to a depth image corresponding to the color image, matches the second color pixel, and does not have a depth value, based on the calculated relative depth value and a depth value of a first depth pixel that belongs to the depth image, matches the first color pixel, and has a depth value.04-23-2009
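
The filling step above amounts to adding a color-derived relative depth to a depth value that is already known; the toy sketch below assumes a NaN-marked depth array and a precomputed relative depth, neither of which comes from the application.

    # Illustrative sketch only: fill a missing depth pixel from a known one plus a relative depth.
    import numpy as np

    def fill_depth(depth, known_yx, missing_yx, relative_depth):
        """Copy the known depth value, offset by the relative depth estimated from the colors."""
        filled = depth.copy()
        filled[missing_yx] = depth[known_yx] + relative_depth
        return filled

    depth = np.array([[1.20, np.nan],
                      [1.25, 1.30]])   # metres; NaN marks the pixel without a depth value
    # Assume the color comparison says the missing pixel is 0.05 m farther than its neighbor.
    print(fill_depth(depth, (0, 0), (0, 1), 0.05))
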
20100284571OBJECT POSITION ESTIMATING SYSTEM, OBJECT POSITION ESTIMATING APPARATUS, OBJECT POSITION ESTIMATING METHOD, AND OBJECT POSITION ESTIMATING PROGRAM - A plurality of pieces of observation raw data observed by an observation unit are subjected to an object identifying process, and by using a parameter determined by a parameter determination unit in accordance with a period of time between a point of object identifying process completion time of one piece of data and process completion scheduled time of another piece of data, a position estimating process of the object is carried out from the data that has been subjected to the object identifying process, and the position of the object relating to the object ID is estimated based upon the object ID and positional candidates acquired by an object identifying unit.11-11-2010
20100226542METHOD OF BIOIMAGE DATA PROCESSING FOR REVEALING MORE MEANINGFUL ANATOMIC FEATURES OF DISEASED TISSUES - The present invention discloses a method for generating elevation maps or images of a tissue layer/boundary with respect to a fitted reference surface, comprising the steps of finding and segmenting a desired tissue layer/boundary; fitting a smooth reference surface to the segmented tissue layer/boundary; calculating elevations of the same or other tissue layer/boundary relative to the fitted reference surface; and generating maps of elevation relative to the fitted surface. The elevation can be displayed in various ways including three-dimensional surface renderings, topographical contour maps, contour maps, en-face color maps, and en-face grayscale maps. The elevation can also be combined and simultaneously displayed with another tissue layer/boundary dependent set of image data to provide additional information for diagnostics.09-09-2010
20080310682System and Method for Real-Time Calculating Location - Provided is a system and method for providing location information of a robot in real time using an artificial mark. The system includes: an image processing module for obtaining an image signal by photographing artificial marks installed at a predetermined space with a space coordinate and detecting an image coordinate of the artificial mark from the obtained image signal; a location calculating module for calculating a current location by comparing the image coordinate of the detected artificial mark and a pre-stored space coordinate of the artificial mark; and an artificial mark identifying module for updating current location information by selectively using one of an artificial mark tracing process and an image coordinate estimating process.12-18-2008
20100034427TARGET ORIENTATION ESTIMATION USING DEPTH SENSING - A system for estimating orientation of a target based on real-time video data uses depth data included in the video to determine the estimated orientation. The system includes a time-of-flight camera capable of depth sensing within a depth window. The camera outputs hybrid image data (color and depth). Segmentation is performed to determine the location of the target within the image. Tracking is used to follow the target location from frame to frame. During a training mode, a target-specific training image set is collected with a corresponding orientation associated with each frame. During an estimation mode, a classifier compares new images with the stored training set to determine an estimated orientation. A motion estimation approach uses an accumulated rotation/translation parameter calculation based on optical flow and depth constrains. The parameters are reset to a reference value each time the image corresponds to a dominant orientation.02-11-2010
20090087031THREE-DIMENSIONAL MEASUREMENT INSTRUMENT, IMAGE PICK-UP APPARATUS AND ADJUSTING METHOD FOR SUCH AN IMAGE PICKUP APPARATUS - An image pickup apparatus is provided with cameras and string-shaped members. The user is allowed to know working distances based upon the lengths of the string-shaped members. The optical axis direction of the camera is adjusted so that the leading edge of the string-shaped member is included in the viewing field of the camera, and the optical axis direction of the camera is adjusted so that the leading edge of the string-shaped member is included in the viewing field of the camera. Even in the case when no image-pickup object is present, by presuming the position of the image-pickup portion of the object, the working distance from the presumed position can be found by the string-shaped member. Moreover, since the string-shaped members are coupled to each other at the image-pickup position, the directions of the optical axes of the cameras can be determined based upon the extending directions of the string members.04-02-2009
20090147998Image processing system, image processing method, and computer readable medium - There is provided an image processing system configured to correct an image of an object inside a physical body. The image processing system includes an object image obtaining section that obtains an object image formed by light from the object, a depth identifying section that identifies a depth from a surface of the physical body to the object, a distance information identifying section that identifies distance information indicating a distance from an image capturing section capturing the object image to the surface of the physical body, and an image correcting section that corrects the object image according to the distance information and the depth.06-11-2009
20090147999IMAGE PROCESSING SYSTEM, IMAGE PROCESSING METHOD, AND COMPUTER READABLE MEDIUM - Provided is an image processing system, including a depth calculating section that calculates a depth of an object from a surface of a body, the object existing inside the body; a light receiving section that receives light from the object; and a substance amount calculating section that calculates an amount of a substance, which generates the light received by the light receiving section, inside the object based on the depth of the object calculated by the depth calculating section and an amount of light received by the light receiving section.06-11-2009
20090147997METHOD AND SYSTEM FOR CORRECTING DISTANCE USING LINEAR REGRESSION AND SMOOTHING IN AMBIENT INTELLIGENCE DISPLAY - The invention provides a method of correcting a distance between an ambient intelligence display and a user using a linear regression and a smoothing, by which distance information of a user who approaches the display can be accurately output even in an unanticipated condition using a passive infrared (PIR) sensor and an ultrasonic device.06-11-2009
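
A minimal sketch of the "linear regression plus smoothing" idea, assuming a set of calibration pairs of raw ultrasonic readings against ground-truth distances and an exponential smoothing factor; none of the numbers or names below come from the patent.

    # Illustrative sketch only: correct raw distance readings with a fitted line, then smooth them.
    import numpy as np

    raw = np.array([30.0, 60.0, 90.0, 120.0, 150.0])    # assumed raw sensor readings (cm)
    true = np.array([33.0, 61.0, 92.0, 118.0, 151.0])   # assumed ground-truth distances (cm)
    slope, intercept = np.polyfit(raw, true, 1)          # least-squares linear regression

    def correct(reading):
        return slope * reading + intercept

    def smooth(readings, alpha=0.3):
        """Exponential smoothing applied to a stream of corrected readings."""
        out, s = [], correct(readings[0])
        for r in readings:
            s = alpha * correct(r) + (1 - alpha) * s
            out.append(s)
        return out

    print(smooth([90.0, 95.0, 400.0, 100.0]))  # the spurious 400 cm reading is damped
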
20090147996SAFE FOLLOWING DISTANCE WARNING SYSTEM AND METHOD FOR A VEHICLE - A safe following distance warning system for a vehicle includes a forward-looking image capturing unit, a speed sensor, a processor, and a warning output device. The forward-looking image capturing unit is for capturing images forward of the vehicle. The forward-looking image capturing unit includes a taking lens, an image sensor disposed behind the taking lens, an image processing unit, and a lens driving unit. The image processing unit is for calculating an in-focus position of the taking lens. The speed sensor is for sensing a current speed of the vehicle. The processor is for calculating a following distance from the vehicle to a vehicle directly forward of the vehicle and comparing the following distance to an applicable safe distance parameter. The warning output device is for outputting an alarm to the driver of the vehicle when the following distance is less than the applicable safe distance parameter.06-11-2009
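
The final comparison can be illustrated with a speed-dependent rule; the two-second-gap heuristic below is a common convention chosen for the example, and the application does not state which safe distance parameter it applies.

    # Illustrative sketch only: warn when the following distance drops below a speed-dependent
    # safe distance (a two-second gap is assumed here purely for illustration).
    def safe_distance_m(speed_kmh, gap_seconds=2.0):
        return (speed_kmh / 3.6) * gap_seconds

    def needs_warning(following_distance_m, speed_kmh):
        return following_distance_m < safe_distance_m(speed_kmh)

    print(needs_warning(30.0, 90.0))  # True: 30 m is below the 50 m implied at 90 km/h
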
20100278390Task-based imaging systems - A task-based imaging system for obtaining data regarding a scene for use in a task includes an image data capturing arrangement for (a) imaging a wavefront of electromagnetic energy from the scene to an intermediate image over a range of spatial frequencies, (b) modifying phase of the wavefront, (c) detecting the intermediate image, and (d) generating image data over the range of spatial frequencies. The task-based imaging system also includes an image data processing arrangement for processing the image data and performing the task. The image data capturing and image data processing arrangements cooperate so that signal-to-noise ratio (SNR) of the task-based imaging system is greater than SNR of the task-based imaging system without phase modification of the wavefront over the range of spatial frequencies.11-04-2010
20100080420Image classification program, image classification device, and electronic camera - An image classification program that is executed by a computer includes a first processing step of calculating feature values on the basis of the pixel densities of images, a second processing step of performing clustering in a space defined by the feature values, and a third processing step of grouping images that correspond to feature values that have been divided by the clustering.04-01-2010
20120269399ABOVE-WATER MONITORING OF SWIMMING POOLS - An above-water system provides automatic alerting for possible drowning victims in swimming pools or the like. One or more electro-optical sensors are placed above the pool surface. Sequences of images are digitized and analyzed electronically to determine whether there are humans within the image, and whether such humans are moving in a manner that would suggest drowning. Effects due to glint, refraction, and variations in light, are offset automatically by the system. If a potential drowning incident is detected, the system produces an alarm sound, and/or a warning display, so that an operator can determine whether action must be taken.10-25-2012
20090046895METHOD AND MEASUREMENT SYSTEM FOR CONTACTLESS COORDINATE MEASUREMENT ON AN OBJECT SURFACE - The invention relates to a method and a surveying system for noncontact coordinate measurement on the object surface (02-19-2009
20090169058METHOD AND DEVICE FOR ADJUSTING OUTPUT FRAME - A method and a device for adjusting an output frame are provided. The present disclosure is suitable for an electronic device with a display. In the present disclosure, a relative position between a user and the display is obtained first. Then, an output frame which will be outputted by the display later is adjusted according to the relative position. As a result, the output frame is adjusted in advance to fit the present position of the user, so that the user views the output frame more easily and enjoys the best display result.07-02-2009
20090141942Non-contact passive ranging system - A non-contact passive ranging system wherein a first imager on a platform is focused on a first object and a second imager on the platform is also focused on the first object. The optical path from the first object to the first imager is configured to be shorter than the optical path from the object to the second imager. Processing circuitry is responsive to an output of the first imager and an output of the second imager as relative motion is provided between the platform and the first object and is configured to calculate the distance from the platform to the object.06-04-2009
20110170748VEHICLE PERIPHERY MONITORING DEVICE - A vehicle periphery monitoring device includes: a parallax calculating unit which extracts a first image section that contains a target object in real space from a first image imaged by a first imaging unit at a predetermined time and extracts a second image section correlated to the first image section from a second image imaged by a second imaging unit at the predetermined time, and then calculates the parallax between the first image section and the second image section; a parallax gradient calculating unit for calculating a parallax gradient based on a time series calculation of the parallax of the identical target object in real space by the parallax calculating unit; and a first distance calculating unit for calculating the distance from the vehicle to the target object on the basis of the parallax gradient and the velocity of the vehicle.07-14-2011
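
One way to read the last step is via the stereo relation Z = f·B/d: for a stationary target approached at vehicle speed v, dZ/dt = -(Z/d)·(dd/dt) = -v, so Z = v·d / (dd/dt) and the focal length and baseline drop out. The sketch below implements that reading; it is a plausible interpretation of the abstract, not the patent's stated equations.

    # Illustrative sketch only: distance from parallax, parallax gradient and vehicle speed,
    # assuming a stationary object and the stereo relation Z = f*B/d.
    def distance_from_parallax_gradient(parallax_px, parallax_rate_px_s, speed_m_s):
        return speed_m_s * parallax_px / parallax_rate_px_s

    # Example: an 8 px parallax growing at 2 px/s while driving at 15 m/s -> 60 m away.
    print(distance_from_parallax_gradient(8.0, 2.0, 15.0))
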
20090279740DISTANCE MEASURING DEVICE AND METHOD, AND COMPUTER PROGRAM - A distance measuring device (11-12-2009
20090279739POINT SUBSELECTION FOR FAST DEFORMABLE POINT-BASED IMAGING - A method for selecting vertices for performing deformable registration of imaged objects is provided. The selected vertices form corresponding pairs, each pair including a vertex from a first imaged object and a vertex from a second imaged object. The corresponding vertex pairs are sorted in order of distance between the vertices making up the corresponding vertex pair. The corresponding vertex pair with the greatest distance is given top priority. Corresponding vertex pairs that lie within a selected distance from the selected corresponding vertex pair are discarded. In this manner, the number of vertex pairs used for deformable registration of the imaged objects is reduced and therefore allows for processing times that are clinically acceptable.11-12-2009
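
A compact sketch of the prioritize-and-prune selection described above; the array layout, the pruning radius, and the choice to measure proximity between first-object vertices are assumptions made for illustration.

    # Illustrative sketch only: keep the corresponding vertex pairs with the largest separation
    # first, and discard pairs whose first-object vertex lies near an already-kept pair.
    import numpy as np

    def subselect_pairs(src_pts, dst_pts, radius):
        """src_pts, dst_pts: (N, 3) arrays of corresponding vertices. Returns kept indices."""
        src_pts, dst_pts = np.asarray(src_pts, float), np.asarray(dst_pts, float)
        gaps = np.linalg.norm(dst_pts - src_pts, axis=1)
        order = np.argsort(gaps)[::-1]   # the pair with the greatest distance gets top priority
        kept = []
        for i in order:
            if all(np.linalg.norm(src_pts[i] - src_pts[j]) > radius for j in kept):
                kept.append(i)
        return kept

    src = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (5.0, 0.0, 0.0)]
    dst = [(0.0, 0.0, 2.0), (0.1, 0.0, 0.5), (5.0, 0.0, 1.0)]
    print(subselect_pairs(src, dst, radius=1.0))  # [0, 2]: the pair at index 1 is discarded
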
20090285451DISPLACEMENT SENSOR HAVING A DISPLAY DATA OUTPUT - A method of displaying sensed displacement collects single-dimension light distribution data using a single-dimensional imaging device, transmits the light distribution data to a processor, and displays the light distribution data as a line bright waveform on a display using software implemented by the processor.11-19-2009
20090290759STEREOSCOPIC MEASUREMENT SYSTEM AND METHOD - A stereoscopic measurement system captures stereo images and determines measurement information for user-designated points within stereo images. The system comprises an image capture device for capturing stereo images of an object. A processing system communicates with the capture device to receive stereo images. The processing system displays the stereo images and allows a user to select one or more points within the stereo image. The processing system processes the designated points within the stereo images to determine measurement information for the designated points.11-26-2009
20090296990EVALUATING DRIVER WALK DISTANCES AND BUILDING TYPES USING OVERHEAD IMAGERY - A method of determining a distance to be walked by a delivery vehicle driver including providing a satellite image that has an image of a building to which an item is to be delivered and an image of a street adjacent to the building. The method further includes defining a path, within the image, that corresponds to a path that the delivery vehicle driver will walk when delivering the item to the building. The method also includes the step of determining a length of the path.12-03-2009
20110200231DEVICE TO ANALYZE AND DETERMINE THE MOVEMENT CHARACTERISTICS OF PRODUCTS, IN PARTICULAR IN A CASTING LINE, AND RELATIVE METHOD - A device to analyze and determine the movement characteristics of products moving in a determinate direction of feed (F) and emitting radiations, in particular products exiting from a casting line, comprises a camera to continuously acquire images of the product moving in the direction of feed (F) in at least two successive instants of time, and an electronic processing unit by means of which the comparison between at least two successive images acquired is carried out, using mathematical algorithms based on the image correlation principle, in order to determine the spatial displacement of the images and then the movement characteristics of the product moving in the direction of feed (F) are analyzed and determined.08-18-2011
20100278391Apparatus for behavior analysis and method thereof - In the present invention, an apparatus for behavior analysis and method thereof is provided. In this apparatus, each behavior is analyzed and has its corresponding posture sequence through a triangulation-based method of triangulating the different triangle meshes. The two important posture features, the skeleton feature and the centroid context, are extracted and complementary to each other. The outstanding ability of posture classification can generate a set of key postures for coding a behavior sequence to a set of symbols. Then, based on the string representation, a novel string matching scheme is proposed to analyze different human behaviors even though they have different scaling changes. The proposed method of the present invention has been proved robust, accurate, and powerful especially in human behavior analysis.11-04-2010
20100278392VEHICLE PERIPHERY MONITORING DEVICE, VEHICLE, AND VEHICLE PERIPHERY MONITORING PROGRAM - A vehicle periphery monitoring device which determines the type of an object with high accuracy, wherein the sizes of object regions that are set in each of the images representing the peripheral condition of the vehicle at each of two different points in time and that include the identical object are aligned on the basis of the distance from the vehicle to the object at each of the two different points in time. Further, local regions with the same arrangement pattern are set taking each of the object regions with aligned size as reference. Still further, the object is classified into the object class which corresponds to the arrangement pattern in the case where the degree of correlation between the local regions becomes equal to or larger than a threshold value.11-04-2010
20090169056SYSTEM AND METHOD FOR DETERMINING CUMULATIVE TOW GAP WIDTH - A system for determining cumulative tow gap width includes an in-process vision system having at least one camera adapted to record images of a composite material and a data analysis computer communicating with and adapted to receive image data from the in-process vision system. The data analysis computer may be adapted to calculate a cumulative gap width of tow gaps in the composite material. A user interface may communicate with and be adapted to receive data analysis results from the data analysis computer. A method for determining cumulative tow width gap of tow gaps in a composite structure is also disclosed.07-02-2009
20080292141METHOD AND SYSTEM FOR TRIGGERING A DEVICE WITH A RANGE FINDER BASED ON AIMING PATTERN - Described is a method and system for triggering a device with a range finder based on aiming pattern. The system includes a processing device acquiring and processing data; an imager providing an image of an object; a range finder determining a distance from the imager to the object; a timer which is activated by the processing device; and an application acquiring data from the object within a measuring range if the processing device determines that the distance remains within a stabilization range for a period of time.11-27-2008
20130011017METHOD OF OBTAINING SPATIAL IMAGES ON DEMAND - Provided is a method of obtaining spatial images on demand. The server segments a spatial image into a plurality of regions to detect an isolated region whose continuity with peripheral regions is less than a threshold value, generates geographic information corresponding to the detected isolated region, and generates a capture request message including the generated geographic information. In addition, the capture request message may be transmitted to at least one terminal using a unicast scheme, a multicast scheme, an anycast scheme, or a broadcast scheme. The capture request message may comprise a capture condition value such as a capture direction, a capture angle, a capture time, zoom information and an exposure value. At least one terminal which meets the capture condition value within the capture request message transmits a captured image to the server.01-10-2013
20110007947SUBJECT POSITION DETERMINATION METHOD, PROGRAM PRODUCT FOR DETERMINING SUBJECT POSITION, AND CAMERA - A subject position determination method includes: generating a plurality of binarized images of a target image based upon color information or brightness information of the target image; calculating an evaluation value used to determine a subject position in the target image for each of the plurality of binarized images; and determining a subject position in the target image based upon the evaluation value.01-13-2011
20080205708RANGING APPARATUS AND RANGING METHOD - A first ranging apparatus includes a light-emitting unit for emitting a series of first through fourth modulated lights which have respective different time lengths from a reference time to respective time points at which the first through fourth modulated lights start being emitted, a light-detecting unit for detecting reflected lights from an object that is irradiated with the first through fourth modulated lights, and a calculating unit for calculating the distance up to the object based on the phase difference between the first through fourth modulated lights and the reflected lights. The light-emitting unit comprises a start time controller for controlling the time lengths. The light-detecting unit samples the amounts of the reflected lights in exposure periods established at a constant cycle length from the reference time.08-28-2008
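
For context, the usual continuous-wave time-of-flight relation between phase shift and distance is d = c·Δφ / (4π·f_mod); the sketch below applies that textbook relation with invented numbers and does not claim to reproduce the exact computation of the application.

    # Illustrative sketch only: distance from the measured phase shift of modulated light.
    import math

    C = 299792458.0  # speed of light, m/s

    def tof_distance(phase_shift_rad, modulation_freq_hz):
        """d = c * phase / (4 * pi * f); the 4*pi accounts for the round trip of the light."""
        return C * phase_shift_rad / (4 * math.pi * modulation_freq_hz)

    # A phase shift of pi/2 at 20 MHz modulation corresponds to roughly 1.87 m.
    print(tof_distance(math.pi / 2, 20e6))
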
20100135533DETERMINATION DEVICE AND DETERMINATION METHOD - A determination device includes a region information recording unit that records therein region information regarding a closed region corresponding to a data distribution shape of a same category within a feature space, the closed region being formed by a plurality of nodes and line segments connecting the plurality of nodes. The determination device also includes a category deciding unit that decides a category of a determination target based on the region information and a position of the determination target within the feature space.06-03-2010
20100135534NON-CONTACT PROBE - A non-contact measurement apparatus and method. A probe is provided for mounting on a coordinate positioning apparatus, comprising at least one imaging device for capturing an image of an object to be measured. Also provided is an image analyser configured to analyse at least one first image of an object obtained by the probe from a first perspective and at least one second image of the object obtained by the probe from a second perspective so as to identify at least one target feature on the object to be measured. The image analyser is further configured to obtain topographical data regarding a surface of the object via analysis of an image, obtained by the probe, of the object on which an optical pattern is projected.06-03-2010
20090123033SYSTEM AND METHOD FOR ADJUSTING LUMINANCE OF A LIGHT-EMITTING DEVICE ON AN IMAGE MEASURING MACHINE - A computer-based method for adjusting luminance of a light-emitting device on an image measuring machine is provided. The method includes reading a model definition curve and model coordinates of an object and a charge-coupled device (CCD). The method further includes locating the object and the CCD to positions on the image measuring machine, and capturing a digital image of the object. Furthermore, the method includes adjusting a resistance of the light-emitting device to ensure an ordinate deviation corresponding to each abscissa value between a new definition curve and the model definition curve falls in an allowable deviation range. A related system is also provided.05-14-2009
20090161914Visibility Range Estimation Method and System - The present disclosure provides methods and systems for estimating a visibility range in a visibility-degraded environment, e.g., fog. The methods and systems involve digital image processing.06-25-2009
20090136091DATA PROCESSING SYSTEM AND METHOD - A powerful, scaleable, and reconfigurable image processing system and method of processing data therein is described. This general purpose, reconfigurable engine with toroidal topology, distributed memory, and wide bandwidth I/O is capable of solving real applications at real-time speeds. The reconfigurable image processing system can be optimized to efficiently perform specialized computations, such as real-time video and audio processing. This reconfigurable image processing system provides high performance via high computational density, high memory bandwidth, and high I/O bandwidth. Generally, the reconfigurable image processing system and its control structure include a homogeneous array of 16 field programmable gate arrays (FPGA) and 16 static random access memories (SRAM) arranged in a partial torus configuration. The reconfigurable image processing system also includes a PCI bus interface chip, a clock control chip, and a datapath chip. It can be implemented in a single board. It receives data from its external environment, computes correspondence, and uses the results of the correspondence computations for various post-processing industrial applications. The reconfigurable image processing system determines correspondence by using non-parametric local transforms followed by correlation. These non-parametric local transforms include the census and rank transforms. Other embodiments involve a combination of correspondence, rectification, a left-right consistency check, and the application of an interest operator.05-28-2009
20090185720Weighted average image blending based on relative pixel position - A method of processing image data is provided. The method includes determining an overlap area based on image data from a first image sensor and image data from a second image sensor; computing a first weight and a second weight based on a relative position of the image data in the overlap area; and generating a final image by blending the image data from the first image sensor and the second image sensor based on the first weight and the second weight.07-23-2009
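The relative-position weighting in the entry above amounts to linear feathering across the overlap between the two sensors. A minimal sketch of that idea, assuming single-channel strips of equal size (illustrative only, not the patented implementation):

    import numpy as np

    def blend_overlap(strip_a, strip_b):
        # Weight for sensor A falls linearly from 1 to 0 across the overlap,
        # so pixels near A's side of the seam take most of their value from A.
        h, w = strip_a.shape
        w_a = np.linspace(1.0, 0.0, w)[None, :]
        return strip_a * w_a + strip_b * (1.0 - w_a)

    a = np.full((4, 5), 100.0)  # overlap as seen by the first sensor
    b = np.full((4, 5), 200.0)  # same overlap as seen by the second sensor
    print(blend_overlap(a, b)[0])  # [100. 125. 150. 175. 200.]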
20090074251Method and System for Determining a Force and/or Torque Applied to an Orthodontic Bracket - A method and system are disclosed for analyzing images of fiducials that provide an indication of the forces and/or torques applied to an orthodontic bracket. The mass of each of the pixels forming the fiducial image is multiplied by the square of the distance of the pixel along the image's x axis, the product of the distances of the pixel along the image's x and y axes, and the square of the distance of the pixel along the image's y axis. The second order moments of inertia about the centroid of the fiducial are calculated and used to form a first matrix. The first matrix is diagonalized into a second matrix, from which the fiducial's first and second principal axes are determined. These axes are then used to measure the size of the fiducial in the image.03-19-2009
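For reference, the second-order central moments described above form a 2x2 inertia matrix whose eigenvectors are the principal axes. The sketch below is a generic illustration of that computation on a pixel blob (intensity used as mass; names and the test image are assumed), not the patented procedure:

    import numpy as np

    def principal_axes(image):
        ys, xs = np.indices(image.shape)
        m = image.astype(float)
        total = m.sum()
        cx, cy = (m * xs).sum() / total, (m * ys).sum() / total
        dx, dy = xs - cx, ys - cy
        # Second-order central moments form the inertia matrix about the centroid.
        inertia = np.array([[(m * dx * dx).sum(), (m * dx * dy).sum()],
                            [(m * dx * dy).sum(), (m * dy * dy).sum()]])
        # Diagonalizing it yields the principal axes (eigenvectors) and their extents.
        eigvals, eigvecs = np.linalg.eigh(inertia)
        return eigvals, eigvecs

    blob = np.zeros((11, 11))
    blob[5, 2:9] = 1.0              # an elongated mark lying along the x axis
    print(principal_axes(blob)[1])  # columns are the principal directions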
20110142287ALGORITHMS FOR ESTIMATING PRECISE AND RELATIVE OBJECT DISTANCES IN A SCENE - Two-picture matching curve information can be used to determine a precise object distance or a relative object distance in a scene. Acquiring two images with different blur information in addition to the curve information enables a device to determine distance information of objects in a scene. The distance information can be used in image processing, including generating a depth map, which can then be used in many imaging applications.06-16-2011
20110222735IMAGE PROCESSING PROGRAM, IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD AND IMAGE PROCESSING SYSTEM - As to each section making up an original image, a CPU of an image processing apparatus calculates a distance from a predetermined position, for example, the center of the original image, to the center of the section, sets a probability in accordance with the calculated distance, determines whether or not the section is a section where an object is to be drawn according to the set probability, and draws an object in each section which is determined as a section to be drawn.09-15-2011
20110222736ENHANCING STEREO DEPTH MEASUREMENTS WITH PROJECTED TEXTURE - A system for distance calculation is disclosed. The system includes an illuminator unit, one or more camera units, and a distance processor. The illuminator unit illuminates a scene in a target area using a textured pattern creator and wherein the textured pattern creator includes a diffractive optical element. The one or more camera units captures two or more images of the target area from two or more physical locations. A textured pattern illumination is visible in each of the two or more images of the target area. The images are used to calculate distances to one or more points in the scene in the target area.09-15-2011
20130121537IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD - This image processing apparatus is an image processing apparatus for measuring a subject distance using a plurality of captured images acquired by capturing the same subject in a plurality of imaging states in succession of time, and includes a target motion amount estimation unit that estimates a target motion amount representing an amount of shift in subject position between first and second images among the captured images, the first image being captured in a first imaging state and a second image being captured in a second imaging state different from the first imaging state, a corrected image generation unit that generates a corrected image by performing motion compensation on the second image based on the target motion amount, and an image processing unit that performs image processing such as measuring a subject distance or generating an HDR image, using the first image and the corrected image.05-16-2013
20090252377MOBILE OBJECT RECOGNIZING DEVICE, MOBILE OBJECT RECOGNIZING METHOD, AND COMPUTER PROGRAM THEREOF - A mobile object recognizing device comprises a camera …10-08-2009
20090245584IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM - In an image processing apparatus that generates a stereo image data set by calculating a distance to a subject according to a pair of image data sets each comprising grayscale images of a plurality of colors obtained by photographing the subject from two viewpoints, the pair of image data sets is inputted, and the grayscale image of one of the colors that is most appropriate for calculating the distance is selected among the grayscale images of the plurality of colors constituting at least one of the inputted image data sets. The distance is calculated based on the grayscale image of the selected color and the grayscale image of the color in the other image data set.10-01-2009
20120195471Moving Object Segmentation Using Depth Images - Moving object segmentation using depth images is described. In an example, a moving object is segmented from the background of a depth image of a scene received from a mobile depth camera. A previous depth image of the scene is retrieved, and compared to the current depth image using an iterative closest point algorithm. The iterative closest point algorithm includes a determination of a set of points that correspond between the current depth image and the previous depth image. During the determination of the set of points, one or more outlying points are detected that do not correspond between the two depth images, and the image elements at these outlying points are labeled as belonging to the moving object. In examples, the iterative closest point algorithm is executed as part of an algorithm for tracking the mobile depth camera, and hence the segmentation does not add substantial additional computational complexity.08-02-2012
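The segmentation step above labels the outliers of the ICP correspondence search as the moving object. The sketch below illustrates only that labelling step on already-aligned 3-D points (the ICP registration itself and the depth-image back-projection are omitted; the threshold and point sets are assumed):

    import numpy as np

    def label_moving_points(prev_pts, curr_pts, threshold=0.05):
        # Brute-force nearest-neighbour residual; a KD-tree would be used in practice.
        d = np.linalg.norm(curr_pts[:, None, :] - prev_pts[None, :, :], axis=2)
        residual = d.min(axis=1)
        # Points with no close counterpart in the previous frame are outliers
        # of the correspondence step and are treated as the moving object.
        return residual > threshold

    rng = np.random.default_rng(0)
    prev_pts = rng.random((200, 3))
    curr_pts = prev_pts.copy()
    curr_pts[:20] += 0.5   # displace a cluster to mimic a moving object
    print(label_moving_points(prev_pts, curr_pts).sum(), "points labelled as moving")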
20100150402METHOD FOR MEASURING A CURVED SURFACE OF AN OBJECT - A method for measuring a curved surface of an object is provided. The method aligns a point-cloud of an object and a triangulated curved surface of the object, obtains an original deviation value for each triangle on the triangulated curved surface by measuring a distance between each triangle and the nearest point in the point-cloud, and assigns a color to each triangle according to a color assigned to a deviation range in which each original deviation value falls. The method further balances the assigned colors of all triangles on the triangulated curved surface, and generates a report according to data on the triangulated curved surface with the balanced colors.06-17-2010
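A rough sketch of the deviation-to-colour mapping described above, with each triangle reduced to its centroid and the colour bands assumed; it is an illustration of the idea, not the patented method:

    import numpy as np

    def color_code_deviation(tri_centroids, cloud, bands):
        colors = []
        for c in tri_centroids:
            # Deviation of this triangle: distance to the nearest cloud point.
            deviation = np.linalg.norm(cloud - c, axis=1).min()
            for upper, color in bands:
                if deviation <= upper:
                    colors.append(color)
                    break
            else:
                colors.append("red")  # outside every tolerance band
        return colors

    cloud = np.random.rand(500, 3)
    centroids = np.random.rand(10, 3)
    print(color_code_deviation(centroids, cloud, [(0.05, "green"), (0.15, "yellow")]))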
20090116694DEVICE FOR MEASURING AN AERIAL IMAGE PRODUCED BY AN OPTICAL LITHOGRAPHY SYSTEM - An image measuring device that measures an aerial image, with relatively small or no dependence on the incident angle and polarization state of the beams projected onto the measuring device. The aerial image measuring device includes a substrate in which there are photo-luminescent nanoparticles that isotropically emit a photo-luminescent wavelength in response to an illuminated wavelength of the aerial image, a filter that blocks the illuminated wavelength and is transparent to the photo-luminescent wavelength, and a light detector that is sensitive to light of the photo-luminescent wavelength. The substrate is transparent to light of both the illuminated and the photo-luminescent wavelength, and the aerial image passes through the substrate and illuminates the nanoparticles. The photoluminescent light emitted by the nanoparticles passes through the filter and enters the light detector, which measures the aerial image. The aerial image is scanned by the aerial image measuring device.05-07-2009
20100189307IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD AND COMPUTER READABLE MEDIUM - An image processing apparatus includes a measuring unit that measures lengths of segments in an image; a first evaluating unit that evaluates each segment based on the length of the segment measured by the measuring unit and a position of the segment; a second evaluating unit that evaluates each segment based on a similarity of the length of the segment measured by the measuring unit and the position of the segment; and a determining unit that determines whether each segment is a ruled line based on an evaluation result obtained by the first evaluating unit and an evaluation result obtained by the second evaluating unit.07-29-2010
20090116695SYSTEM AND METHOD FOR PROCESSING DIGITAL MEDIA - A method for processing digital media is described. In one example embodiment, the method may include detecting an unknown object in a video frame, receiving inputs representing probable identities of the unknown object in the video frame from various sources, and associating each input with the unknown object detected in the video frame. The received inputs may be processed, compared with reference data and, based on the comparison, probable identities of the object associated with the input derived. The method may further include retrieving a likelihood of the input matching the unknown object from historical data and producing weights corresponding to the inputs, fusing the inputs and the relative weight associated with each input, and identifying the unknown object based on a comparison of the weighted distances from the unknown identity to a reference identity. The relative weights are chosen from the historical data to maximize the correct recognition rate based on the history of recognitions and manual verification results.05-07-2009
20090046896LENGTH MEASUREMENT SYSTEM - Disclosed herewith is a length measurement system which obtains a value closer to the true one when figuring out the size and edge roughness of a pattern from a pattern image that includes noise. Among plural band-like regions each representing a portion around an edge in an image, the system calculates the dependency of the edge point position on the image processing parameter at each of a narrow band-like portion and a wide band-like portion, so as to derive an image processing condition that yields measured values closer to their true values or to estimate the true value itself.02-19-2009
20100189308Image Measuring Apparatus and Computer Program - An image of a measurement object is displayed, and specification of a feature image and a measurement position is received on the displayed image. The feature image for which the specification has been received, and information on relative positions for the feature image, which represent the measurement position and a display position of a dimension line, are stored. A newly acquired image of the measurement object is compared with the feature image to identify information on the attitude and the position of the image of the measurement object. A measurement position is set for the image of the measurement object with the identified attitude and position, and then predetermined physical quantities are measured. Based on the stored information on the relative positions for the feature image, a dimension line indicating a measurement position and a measurement result are displayed at predetermined positions.07-29-2010
20100215220MEASUREMENT DEVICE, MEASUREMENT METHOD, PROGRAM, AND COMPUTER READABLE MEDIUM - While a view angle is switched between wide and narrow view angles, images with the wide view angle and images with the narrow view angle are alternately taken. Based on the images taken with the narrow view angle, movements of corresponding points between the narrow-angle images are detected. Based on the images taken with the wide view angle, a translational vector and a rotation matrix that represent changes in the position/posture between the wide-angle images are calculated. By linearly interpolating the translational vector and the rotation matrix between the wide-angle images, a translational vector and a rotation matrix that represent changes in the position/posture between the narrow-angle images are estimated. Based on the movements of the corresponding points in the images and the translational vector and the rotation matrix between the narrow-angle images, three-dimensional coordinates of the corresponding points on the measurement object are measured with high accuracy.08-26-2010
20100239126APPARATUS AND METHOD FOR MEASURING A DISTANCE TO AN EARDRUM - An apparatus for measuring a distance to an eardrum has an optical sensor and a signal-processing device. The optical sensor acquires an image of the eardrum and the signal-processing device determines an area or separation extending perpendicular to the spacing distance to be measured in the image and determines the spacing distance to the eardrum as a function of the determined area or separation. A corresponding method includes the steps of acquiring an image of the eardrum, determining an area or separation extending perpendicular to the distance to be measured in the image, and determining the distance as a function of the determined area or separation. A projection image may be projected onto or into a vicinity of the eardrum and then the distance can be determined from the separation between the projection image and the eardrum or the image center of the image.09-23-2010
20100226541SYSTEM AND METHOD FOR DETECTING POSITION OF UNDERWATER VEHICLE - An object of the invention is to provide a system for detecting the position of an underwater vehicle that enables an improvement in the accuracy of detecting the position of the underwater vehicle.09-09-2010
20100226540SYSTEM AND METHOD FOR MEASURING GAPS BETWEEN OBJECT PARTS - A detecting method for measuring gaps between two parts of an object is provided. The detecting method selects a reference image and a measured image, merges the reference image and the measured image to form an image of the object, grids the reference image and the measured image to obtain triangle mesh surfaces, obtains boundary points and outline points of the measured image, and obtains triangles on a joint portion of the reference image and measured points on a corresponding joint portion of the measured image. The method further compares each measured point with an obtained triangle to obtain gap values of the joint portion between the reference image and the measured image, and outputs an analysis report of the gap values on a display device.09-09-2010
20120140989ENDOSCOPE APPARATUS AND METHOD OF MEASURING OBJECT - An endoscope apparatus including: an image pickup portion that picks up an image of an object; and a measurement portion that measures the object based on the image of the object obtained by the image pickup portion, in which the measurement portion includes: a specification portion that specifies three base points on the image; a composing point calculation portion that calculates composing points forming an object region of the object, based on an image region that is based on a plurality of points, the points set on a line determined by the three base points that are specified by the specification portion; and a size calculation portion that calculates a size of the object based on the composing points.06-07-2012
20100119114DETERMINING RELATIVE DEPTH OF POINTS IN MULTIPLE VIDEOS - A relative depth of points captured by at least two recording sources is determined. A first sequence of image frames acquired from a first source and a second sequence of image frames acquired from a second source are received by a data processing system. The data processing system identifies a plurality of points-of-interest, each point-of-interest being present in both the first sequence and the second sequence. The points-of-interest are clustered into common depth planes at least by comparing motion across the sequences of different points-of-interest. Results of the clustering are stored in a processor-accessible memory system.05-13-2010
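One way to picture the clustering above: the ratio of a point's apparent motion between the two sources acts as a depth proxy, and points with similar ratios are grouped into one depth plane. The greedy one-dimensional clustering and the tolerance below are assumed stand-ins, not the patented algorithm:

    import numpy as np

    def cluster_depth_planes(motion_a, motion_b, tol=0.3):
        # Depth proxy: how much each point appears to move in source A
        # relative to how much it moves in source B.
        ratio = np.linalg.norm(motion_a, axis=1) / np.linalg.norm(motion_b, axis=1)
        labels = np.empty(len(ratio), dtype=int)
        current, last = 0, None
        for i in np.argsort(ratio):
            if last is not None and ratio[i] - last > tol:
                current += 1          # gap in the sorted ratios: new depth plane
            labels[i] = current
            last = ratio[i]
        return labels

    motion_a = np.array([[2.0, 0.0], [2.1, 0.0], [4.0, 0.0], [4.2, 0.0]])
    motion_b = np.ones((4, 2)) * [1.0, 0.0]
    print(cluster_depth_planes(motion_a, motion_b))  # [0 0 1 1]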
20100040257IMAGE DISPLAYING SYSTEM AND APPARATUS FOR DISPLAYING IMAGES BY CHANGING THE DISPLAYED IMAGES BASED ON DIRECTION OR DIRECTION CHANGES OF A DISPLAYING UNIT - An image displaying system for displaying an image operable to change a displayed image according to a direction of an image displaying apparatus, including: an image storing unit storing thereon the image; a displaying unit operable to display the image, the displaying unit being carried by a user; a photographing unit operable to photograph the exterior as a plurality of photographed images, the photographing unit being carried together with the displaying unit; a travel distance computing unit operable to compute a travel distance of the displaying unit by analyzing the plurality of photographed images which are photographed by the photographing unit at different times; and a display control unit operable to change the image, which is different from the photographed images, based on the travel distance computed by the travel distance computing unit and to cause the displaying unit to display the image.02-18-2010
20110026774CONTROLLING AN IMAGING APPARATUS OVER A DELAYED COMMUNICATION LINK - Method that includes: enabling the user to track a user-identified target on a currently presented image of periodically transmitted images from an imaging apparatus; calculating a distance between the estimated location of the user-identified target in view of the user's tracking and the estimated location of the pointing point of the imaging apparatus at said future time, wherein the estimations relate to a future time by which a control command currently transmitted by the user reaches the imaging apparatus; and calculating a control command required for directing the pointing point of the imaging apparatus onto the user-identified target, based on said calculated distance, the estimated average velocity of the user-identified target, and further based on all previous control commands that had already been transmitted by the user but have not yet affected the currently presented image due to the delay in the communication link.02-03-2011
20100220893Method and System of Mono-View Depth Estimation - A method and system of mono-view depth estimation are disclosed. A two-dimensional (2D) image is first segmented into a number of objects. A depth diffusion region (DDR), such as the ground or a floor, is then detected among the objects. The DDR generally includes a horizontal plane. The DDR is assigned the depth, and each object connected to the DDR is assigned depth according to the depth of the DDR at the connected site.09-02-2010
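A toy illustration of the depth-assignment idea above: the depth diffusion region (the floor) gets a depth that grows toward the horizon, and every object segment touching it inherits the depth at the row of contact. The segment representation and the linear row-to-depth mapping are assumptions made for the example, not the patented method:

    def mono_view_depth(segments, ddr_label, height):
        # Relative depth of the DDR at a given row: small near the bottom of
        # the image, larger toward the horizon.
        ddr_depth = lambda row: (height - row) / height
        depths = {}
        for label, pixels in segments.items():
            if label == ddr_label:
                continue
            contact_row = max(r for r, _ in pixels)  # row where it meets the DDR
            depths[label] = ddr_depth(contact_row)
        return depths

    segments = {
        "floor": [(9, c) for c in range(10)],
        "near_box": [(8, 2), (7, 2), (8, 3)],
        "far_box": [(4, 7), (3, 7)],
    }
    print(mono_view_depth(segments, "floor", height=10))  # far_box gets the larger depth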
20110129123IMAGE SENSORS FOR SENSING OBJECT DISTANCE INFORMATION - An image sensor includes a clock signal generator configured to generate and output at least first and second clock signals, a plurality of pixels configured to generate associated distance signals based on corresponding clock signals from among the at least first and second clock signals and light reflected by an object, and a distance information deciding unit configured to determine distance information with respect to the object by using the associated distance signals. At least one first pixel from among the plurality of pixels is configured to generate the associated distance signal based on at least the first clock signal, and at least one second pixel from among the plurality of pixels, which is adjacent to the at least one first pixel, is configured to generate the associated distance signal based on at least the second clock signal.06-02-2011
20100296705METHOD OF AND ARRANGEMENT FOR MAPPING RANGE SENSOR DATA ON IMAGE SENSOR DATA - A method of and arrangement for mapping first range sensor data from a first range sensor to image data from a camera are disclosed. In at least one embodiment, the method includes: receiving time and position data from a position determination device on board a mobile system, as well as the first range sensor data from the first range sensor on board the mobile system and the image data from the camera on board the mobile system; identifying a first points cloud within the first range sensor data, relating to at least one object; producing a mask relating to the object based on the first points cloud; mapping the mask on object image data relating to the same object as present in the image data from the at least one camera; and performing a predetermined image processing technique on at least a portion of the object image data.11-25-2010
20100303299Three dimensional image sensor - A depth sensor includes a light source, a detector, and a signal processor. The light source transmits a source signal to the target according to a transmit control signal having reference time points. The detector receives a reflected signal from the source signal being reflected from the target. The signal processor generates a plurality of sensed values by measuring respective portions of the reflected signal during respective time periods with different time delays from the reference time points. The signal processor determines a respective delay time for a maximum/minimum of the sensed values for determining the distance of the target.12-02-2010
20100303300LOCALIZING A SURVEY INSTRUMENT IN RELATION TO A GROUND MARK - A method is disclosed for localizing, in relation to a mark located at a ground level, a surveying instrument having a housing including at least one camera. In at least one embodiment, the method includes aligning the vertical rotational axis of the surveying instrument with the mark using a pointing device; capturing an image of the ground below the housing with the camera arranged in a known camera position and orientation, wherein the camera position is eccentric to the rotation center of the surveying instrument; identifying an object point corresponding to the mark in the captured image; measuring image coordinates of the object point in the captured image; and determining the height of the rotation center of said instrument above the ground based on the image coordinates and camera calibration data. Furthermore, a surveying instrument for performing at least one embodiment of the method is disclosed.12-02-2010
20100322482THREE-DIMENSIONAL MEASUREMENT SYSTEM AND METHOD OF THE SAME, AND COLOR-CODED MARK - This invention provides a three-dimensional measurement system that improves the efficiency of and enables the automation of non-contact three-dimensional measurement over a wide range using a coded target. A measuring object …12-23-2010
20100322481IMAGE PROCESSING APPARATUS AND METHOD - Subject-distance information is acquired by a light-weight, small-size arrangement without a significant change in the structure of a conventional image processing apparatus. The apparatus acquires diffraction images of a subject, sensed by an image sensing unit via a diffraction grating and an imaging optical system, detects a real image from luminance gradients of the diffraction images, calculates the distance between the detected real image and a virtual image corresponding to this real image in the diffraction images, and calculates the depth distance between the subject and the diffraction grating using the distance calculated by the first calculating unit.12-23-2010
20090129633Method for repositioning a numerically controlled device - The invention relates to a method for repositioning a numerically controlled device by using an image taken of an object as an aid. According to the method, the system is taught in such a way that a child image is defined for the camera's image, either its own set of coordinates is formed for the element or a set of coordinates is retrieved from elsewhere, and the image thus obtained, together with its coordinates, is stored in the data system, and, in the repositioning situation, the real-time image is compared with the child image stored in the data system, in order to determine the real-time position of the imaging device relative to the stored image. The set of coordinates used is a set of coordinates retrieved using satellite positioning, or the device's own internal set of coordinates. In the repositioning situation, the image stored in the teaching situation is sought as the coordinate point of the image stored in the memory is approached.05-21-2009
20110038509DETERMINING MAIN OBJECTS USING RANGE INFORMATION - A system and method for identifying a main object in a digital image using range information includes receiving the digital image representing a scene; identifying range information associated with the digital image and including distances of pixels in the scene from a known reference location; identifying the main object in the digital image based at least upon an analysis of the range information and the digital image; and storing an indication of the identified main object in a processor-accessible memory system.02-17-2011
20110038510IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM - An image processing apparatus includes an outline distance estimating unit that acquires distance information on an outline portion of an area included in an image; an area dividing unit that divides an area included in an image on the basis of pixel attribute values; an area-plane estimating unit that estimates an area plane composing each area by using the distance information on the outline portion of each area resulting from the division to calculate an area-plane definitional equation; an abnormal data determining unit that compares the area plane with each coordinate position on a three-dimensional space indicated by the distance information to determine the distance information having a high degree of shift from the area plane to be abnormal data; and an area interpolating unit that estimates the distances inside the area by using the distance information resulting from removal of the abnormal data from the distance information.02-17-2011
20110007948SYSTEM AND METHOD FOR AUTOMATIC STEREO MEASUREMENT OF A POINT OF INTEREST IN A SCENE - A system for performing a three dimensional stereo measurement that uses a sensor for obtaining a sensor image of a scene, and a database for providing first and second reference images of the scene that are a stereo pair of images. At least one processing system is responsive to an output of the sensor and in communication with the database. The processing system registers the sensor image with the first reference image, and also selects a point of interest from one of the sensor image and the first reference image. The processing system performs a stereo point measurement from the selected point of interest and the first reference image to determine a point in the second reference image that represents a stereo mate of the selected point in the first reference image.01-13-2011
20110026773IMAGE PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, AND IMAGE PROCESSING METHOD - A technique is provided which can improve the precision of a matching point search with a plurality of images taking the same object where distant and near views coexist. A plurality of first images obtained by time-sequentially imaging an object from a first viewpoint, and a plurality of second images obtained by time-sequentially imaging the object from a second viewpoint, are obtained. Reference regions including a reference point are set respectively in the first images with the same arrangement, and comparison regions corresponding to the form of the reference regions are set respectively in the second images with the same arrangement. One reference distribution of pixel values about two-or-more-dimensional space is generated from the distributions of pixel values about the plurality of reference regions, and one comparison distribution of pixel values about two-or-more-dimensional space is generated from the distributions of pixel values about the plurality of comparison regions. Then, a matching point in the plurality of second images that corresponds to the reference point is detected by using the reference distribution of pixel values and the comparison distribution of pixel values.02-03-2011
20100183197APPARATUS FOR INSPECTING AND MEASURING OBJECT TO BE MEASURED - An apparatus for inspecting and measuring an object to be measured includes: a distance measurement device having a light projector that projects a two-dimensional optical pattern onto a measurement target of the measurement object, imaging devices disposed in a stereoscopic arrangement that image the measurement object, and a driving device that rotates a posture of at least one of the imaging devices to control a parallax angle between the imaging devices; a working distance control device that controls the driving device and adjusts a position at which optical axes of the imaging devices intersect; and a distance calculation device having a correspondence position calculation device that determines a correspondence position at which the same region is imaged among images of the imaging devices, and a distance calculation device that calculates a distance to the measurement target of the measurement object based on a calculation result of the correspondence position calculation device.07-22-2010
20110019876Systems And Methods For Detecting Alignment Errors - A method of detecting an alignment error includes the steps of controlling a first portion of one or more imaging units to image on a substrate a first plurality of substantially parallel lines extending along a first direction and a second plurality of substantially parallel lines extending along a second direction and controlling a second portion of one or more imaging units to image a third plurality of substantially parallel lines extending along the first direction and a fourth plurality of substantially parallel lines extending along the second direction. One or more distances between adjacent lines of the second plurality of lines are varied and one or more distances between adjacent lines of the fourth plurality of lines are varied. Further, the lines imaged by the first and second portions form an alignment pattern. The method further includes the steps of collecting data relating to the alignment pattern and analyzing the collected data to determine an alignment error between the first and second portions of the one or more imaging units.01-27-2011
20100172546METHODS AND APPARATUS FOR PERFORMING ANGULAR MEASUREMENTS - A method of determining an azimuth and elevation of a point in an image is provided. The method comprises positioning an imaging device at a first position and acquiring a first image. The method also comprises rotating the imaging device and acquiring a second image at the first position. The first image includes the point, and a portion of the first image overlaps a portion of the second image. The method also includes determining correspondences between features in overlapping portions of the images, determining a first transformation between coordinates of the first image and coordinates of the second image based on the correspondences, and determining a second transformation between the coordinates of the second image and a local coordinate frame. The method also includes computing the azimuth and elevation of the point based on the first transformation and the second transformation.07-08-2010
20110091076METHOD AND APPARATUS FOR CAPTURING, GEOLOCATING AND MEASURING OBLIQUE IMAGES - A computerized system having a computer system storing a database of captured oblique images having corresponding geo-location data. The computerized system also has a data table storing ground plane data that approximates at least a portion of the terrain depicted within the captured oblique images. The computer system further has computer executable logic that when executed by a processor causes the computer system to receive a selection of a geographic point from a user, search the database to find images that contain the selected point, and make the images available to the user.04-21-2011
20090214082Image management apparatus - An image management apparatus capable of managing a plurality of images captured within a geographical range included in a designated map linked with that map. The image management apparatus has a period setting unit setting a predetermined period, an imaging unit acquiring an image, a position information acquisition unit acquiring capturing position information, a map information acquisition unit acquiring map information representing a predetermined geographical range, a judgment unit judging whether or not the imaging location corresponding to each of a plurality of images is included within the predetermined geographical range, a display position determination unit determining the display position of the imaging location judged to be included within the predetermined geographical range by the judgment unit among imaging locations, and a display unit displaying the imaging location judged to be included within the predetermined geographical range according to the display position.08-27-2009
20120243746IMAGE PROCESSOR, IMAGE PROCESSING METHOD, AND PROGRAM - An image processor includes a synthesis processing unit adapted to generate a panoramic image by clipping images from a plurality of captured images and connecting the clipped images together and a processing unit adapted to correct image distortions of a subject in the panoramic image due to varying distances to the subject by modifying the panoramic image on the basis of distance information obtained by measuring the distances to a plurality of positions of the subject.09-27-2012
20110129122MAP MATCHING FOR SECURITY APPLICATIONS - An apparatus for map matching between a measured position for an object and information on a digital map is specified. This involves a computation unit being used to perform a first selection of cartography elements on the map on the basis of the measured position and on the basis of a predefined error. In addition, the computation unit is designed to provide the selected cartography elements for a first and a second secondary computation unit. In addition, values from the secondary computation units are converted into the same unit of measurement.06-02-2011
20100220894METHOD AND APPARATUS FOR GEOMETRICAL MEASUREMENT USING AN OPTICAL DEVICE SUCH AS A BARCODE AND/OR RFID SCANNER - A system and method are described for determining the dimensions of items using an optical device, such as a barcode and/or RFID tag reader, for example. The dimensions of the field of view of the optical device are established, indexed by distance, and the dimensions of items in the field of view of the optical device are determined as a percentage of the full field of view of the optical device at the appropriate distance.09-02-2010
20110211731APPARATUS, METHOD, AND MEDIUM FOR DIVIDING REGIONS BY USING FEATURE POINTS AND MOBILE ROBOT USING THE SAME - An apparatus, method, and medium for dividing regions by using feature points and a mobile robot cleaner using the same are provided. A method includes forming a grid map by using a plurality of grid points that are obtained by detecting distances of a mobile robot from obstacles; extracting feature points from the grid map; extracting candidate pairs of feature points, which are in the range of a region division element, from the feature points; extracting a final pair of feature points, which satisfies the requirements of the region division element, from the candidate pair of feature points; forming a critical line by connecting the final pair of feature points; and forming a final region in accordance with the size relationship between regions formed of a closed curve which connects the critical line and the grid map.09-01-2011
20100008543SHAPE MEASURING DEVICE AND SHAPE MEASURING METHOD - A shape measuring device includes a slit pattern projection unit …01-14-2010
20110081049DETECTOR AND METHOD FOR IDENTIFYING A ROAD LANE BOUNDARY - A detector for identifying a road lane boundary using a digitalized optical image of the region in front of the vehicle. The detector includes a correlator configured to select the edges which are to be used for road lane estimation by searching for extreme values of the convolution response, and to weight each convolution response by a weighting factor. It also includes a histogram analysis unit configured to group the extracted edges into pairs, to determine a frequency distribution of the distances between two paired edges as a histogram, and to use said frequency distribution to determine the distances between two grouped edges forming a frequency peak or a frequency plateau in the histogram as nominal edge widths. Finally, a weighting factor determination unit is configured to determine, for an edge, the weighting factor of the convolution response in the correlator in such a manner that the weight determined by the weighting factor for an edge of the paired edges is higher the smaller the deviation of the distance between the grouped edges from the nominal edge width.04-07-2011
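The histogram analysis above can be pictured as follows: pair up edges along a scan line, histogram the pair spacings, and read the well-populated bins as nominal marking widths. The pairing of consecutive edges, the bin width, and the peak criterion below are simplifications assumed for the example:

    import numpy as np

    def nominal_edge_widths(edge_positions, bin_width=2, min_count=3):
        edge_positions = np.sort(np.asarray(edge_positions))
        # Pair consecutive edges (rising/falling) and measure their spacing.
        widths = edge_positions[1::2] - edge_positions[0::2]
        bins = np.arange(0, widths.max() + 2 * bin_width, bin_width)
        counts, edges = np.histogram(widths, bins=bins)
        # Bins with enough support are taken as nominal edge widths.
        return [(edges[i] + edges[i + 1]) / 2.0
                for i, c in enumerate(counts) if c >= min_count]

    # Lane markings roughly 12 px wide plus one spurious narrow pair.
    positions = [10, 22, 100, 111, 205, 218, 300, 313, 400, 405]
    print(nominal_edge_widths(positions))  # [13.0], i.e. a nominal width of ~12-13 px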
20110103652IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD - A feature amount cluster holding unit extracts respective feature amounts from N images sensed under N respective types of image sensing conditions, and manages the feature amounts extracted from each sensed image as a feature amount cluster in association with corresponding one of the image sensing conditions. A feature space distance calculation unit specifies the second feature amount cluster containing a feature amount similar to the feature amount of interest in the first feature amount cluster. An image sensing condition setting unit specifies sets of feature amounts associated with the N respective types of image sensing conditions from the first and second feature amount clusters. The image sensing condition setting unit specifies a set having a largest distance between feature amounts in the set among the specified sets. An image sensing condition associated with the specified set is output as an image sensing condition for discriminating the feature amount clusters.05-05-2011
20110075886GRAPHICS-AIDED REMOTE POSITION MEASUREMENT WITH HANDHELD GEODESIC DEVICE - A graphics-aided geodesic device is provided. The device may include a display, camera, distance meter, GNSS (Global Navigation Satellite System, including GPS, GLONASS, and Galileo) receiver and antenna, and horizon sensors. Data from the camera and horizon sensors may be displayed to assist the user in positioning the device over a point of interest. In one example, the distance meter may be used to determine the position of the point of interest. In another example, images of the point of interest taken from multiple locations may be used to determine the position of the point of interest.03-31-2011
20110075885VOLUMETRIC IMAGE DATA PROCESSING - A method, an apparatus, and a computer readable medium storing computer readable instructions are disclosed for processing volumetric image data. According to the method, 3-dimensional data points are collected. A plurality of 2-dimensional image maps is obtained from the 3-dimensional data points. At least one of the plurality of 2D image maps is extracted to form at least one image frame. A frame gallery is created from the at least one image frame.03-31-2011
20110069870SCREEN SPACE PLANE IDENTIFICATION - A method of finding and defining a plane includes screen-space scanning a plurality of rows of a depth image and interpolating a straight depth line through at least two depth values for each row. A pair of straight boundary lines are then fit to the endpoints of the straight depth lines, and a plane is defined to include these straight boundary lines.03-24-2011
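The row-scanning step above can be sketched as follows: for each scanned row of the depth image, take the outermost valid depth samples and interpolate a straight depth line between them; the subsequent fitting of the two boundary lines to the collected endpoints is omitted here, and the synthetic floor and step size are assumed:

    import numpy as np

    def fit_row_lines(depth, step=4):
        lines = []
        for y in range(0, depth.shape[0], step):
            row = depth[y]
            valid = np.flatnonzero(row > 0)
            if valid.size < 2:
                continue
            x0, x1 = valid[0], valid[-1]
            # Straight depth line through the two outermost valid samples.
            slope = (row[x1] - row[x0]) / (x1 - x0)
            lines.append((y, (x0, row[x0]), (x1, row[x1]), slope))
        return lines

    # Synthetic floor: depth grows linearly across every row.
    depth = np.tile(np.linspace(1.0, 3.0, 64), (48, 1))
    print(len(fit_row_lines(depth)), "row depth lines interpolated")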
20120201424ENVIRONMENTAL MODIFICATIONS TO MITIGATE ENVIRONMENTAL FACTORS - A method of depth imaging includes acquiring a depth image from a depth camera, identifying an environmental factor invalidating depth information in one or more portions of the depth image, and outputting an environmental modification to mitigate the environmental factor.08-09-2012
20120148108IMAGE PROCESSING APPARATUS AND METHOD THEREFOR - Image capturing data captured by using an imaging optical system including an iris with an aperture having no point symmetry is input. An imaging parameter for the imaging optical system at the time the image capturing data was captured is acquired. A spectrum of the input image capturing data is calculated. Optical characteristic information corresponding to an imaging parameter and an object distance and a spectrum model are obtained. A predictive model is generated as a spectrum model corresponding to the input image capturing data by using the imaging parameter, optical characteristic information, and spectrum model. An evaluation function is generated by using the spectrum of the image capturing data and the predictive model. The actual distance of the object included in an image represented by the image capturing data is estimated by using the evaluation function and a statistical method.06-14-2012
20110150286ESTIMATION APPARATUS, CONTROL METHOD THEREOF, AND PROGRAM - An estimation apparatus for estimating a position and orientation of an object, includes: a capturing unit adapted to capture an object targeted for position and orientation estimation, and generate a range image representing distance information from the capturing unit to the target object; a general estimation unit adapted to analyze the range image and estimate a general position and orientation of the target object; a plurality of identification units each adapted to estimate a detailed position and orientation of an object within a predetermined position and orientation range; a determination unit adapted to determine a priority order of the plurality of identification units, based on the general position and orientation estimated by the general estimation unit; and a detailed estimation unit adapted to estimate a detailed position and orientation of the target object, using the plurality of identification units in the priority order determined by the determination unit.06-23-2011
20100260383 CALIPER FOR MEASURING OBJECTS IN AN IMAGE - The invention relates to a system …10-14-2010
20100260382Object Detection And Ranging Method - The present invention relates to a method for object detection and ranging within a vehicle's rearward field of interest which includes an algorithm to translate images provided by an imaging device. An imaging device provides images to a processor which divides the images into groups of rows of pixels. The rows are processed by the algorithm which includes assigning each pixel in the rows to an object. The translation of the image from outside of the vehicle is provided to the vehicle operator and includes the tracking of location and dimensions of multiple objects within the viewing range.10-14-2010
20090022369POSITION/ORIENTATION MEASUREMENT METHOD AND APPARATUS - This invention relates to a position/orientation measurement apparatus which can measure a position and orientation while achieving both high stability and precision. An image including indices laid out on a space is captured, and the indices are detected from the captured image. When a plurality of indices are detected, their distribution range is calculated, and an algorithm to be applied in position/orientation calculations is selected according to the size of the range …01-22-2009
20120201425Method and Apparatus for Visualizing Multi-Dimensional Well Logging Data with Shapelets - A method for visualizing parametric logging data includes interpreting logging data sets, each logging data set corresponding to a distinct value of a progression parameter, calculating a geometric image including a representation of data from each of the logging data sets corresponding to a wellbore measured depth, and displaying the geometric image(s) at a position along a well trajectory corresponding to the wellbore measured depth. The progression parameter includes time, a resistivity measurement depth, differing tool modes that are sampling different volumes of investigation, and/or sampling different physical properties. The geometric images include a number of parallel lines having lengths determined according to the logging data and/or an azimuthal projection of the logging data, a number of concentric axial projections, and/or shapelets determined from parallel lines and/or concentric axial projections. The method includes dynamically determining a selected measured depth, measured depth interval, and/or azimuthal projection angle.08-09-2012
20090028390Image Processing for Estimating Subject Distance - An image processing method includes acquiring first information that indicates a size of an image of a specific subject in a target image relative to a size of the target image, acquiring second information that indicates a size of the specific subject, acquiring third information with which an angle of view of the target image can be determined, and estimating a subject distance, which is a distance from an image pickup apparatus that has generated the target image to the specific subject on the basis of the first information, the second information and the third information.01-29-2009
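Under a simple pinhole model, the three pieces of information above combine into a closed-form estimate: the frame spans 2*d*tan(fov/2) at distance d, so the distance follows from the subject's real size and the fraction of the frame it occupies. The sketch below is that textbook relation, not necessarily the patented estimator, and the example numbers are assumed:

    import math

    def subject_distance(relative_size, subject_size_m, fov_deg):
        # subject_size = relative_size * 2 * d * tan(fov / 2)  =>  solve for d.
        return subject_size_m / (2.0 * relative_size * math.tan(math.radians(fov_deg) / 2.0))

    # A 1.7 m tall person filling 25% of the frame height with a 45-degree view.
    print(round(subject_distance(0.25, 1.7, 45.0), 2), "m")  # about 8.21 m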
20080279422IMAGE PROCESSING APPARATUS - An image processing apparatus includes a distance measurement section which measures a distance, on the basis of a plurality of images photographed by an imaging device at different visual point positions, from the imaging device to a subject for each pixel. A threshold setting section sets information indicative of a distance between an obstacle present between the imaging device and a main subject, and the imaging device. An image formation section compares the distance measured by the distance measurement section with the threshold information thereby to form an image from which an image of the obstacle is removed.11-13-2008
20080267454MEASUREMENT APPARATUS AND CONTROL METHOD - A measurement apparatus …10-30-2008
20080253617Method and Apparatus for Determining the Shot Type of an Image - A method and apparatus for determining the type of shot of an image are disclosed. The method comprises the steps of assigning …10-16-2008
20080205709RANGING APPARATUS AND RANGING METHOD - A first ranging apparatus includes a light-emitting unit for emitting a series of first through fourth modulated lights which have respective different start phases at which the first through fourth modulated lights start being emitted, a light-detecting unit for detecting reflected lights from an object that is irradiated with the first through fourth modulated lights, and a calculating unit for calculating the distance up to the object based on the phase difference between the first through fourth modulated lights and the reflected lights. The light-emitting unit comprises a start phase controller for controlling the start phases. The light-detecting unit samples the amounts of the reflected lights in exposure periods established at a constant cycle length from the time when the modulated lights start being emitted.08-28-2008
20080205707Method And Geodetic Device For Surveying At Least One Target - The invention relates to a method for surveying at least one target using a geodetic device. According to said method, a camera of the device captures a visual image and surveys an angle and/or a distance to the target with geodetic precision, the angle and/or distance surveying being supported or controlled by the visual image. At the same time as the capture of the visual image, at least two distance points of a distance image are captured as the spatial distribution of discrete distance points in the area of detection. When the visual image and the distance image are correlated with each other, the target is recognized or the measuring process is controlled.08-28-2008
20100246898IMAGE RECOGNITION DEVICE AND IMAGE RECOGNITION METHOD - The present invention is aimed at detecting an operator's height and position of each body part with a certain range of accuracy, which is used for calculation of a BMI value or operation of a game. An exercise assist device …09-30-2010
20100246897Method for Producing a Known Fixed Spatial Relationship Between a Laser Scanner and a Digital Camera for Traffic Monitoring - Method for producing a known fixed spatial relationship between a laser scanner and a digital camera for monitoring traffic, wherein the laser scanner axis and the optical axis of the digital camera are aligned relative to one another only roughly, and the spatial relationship between a scanner coordinate system …09-30-2010
20100246896Image processing device - An image processing device for dividing an image imaged by imaging means into multiple regions includes: processing means for grouping, when the difference of the pieces of image data between a single pixel within the image and a pixel adjacent thereto is less than a predetermined threshold, the single pixel and the adjacent pixel, and dividing the image into multiple regions with each finally obtained group as each region of the image; and average-value calculating means for calculating the average value of the image data within the group including the single pixel; with the processing means comparing the image data of the single pixel and the average value calculated at the average-value calculating means regarding the group to which the adjacent pixel belongs, and when the difference thereof is equal to or greater than a predetermined second threshold, not grouping the single pixel and the adjacent pixel.09-30-2010
20100246895POSITION/ATTITUDE RECOGNIZING METHOD, PART HOLDING METHOD, PART ARRANGING METHOD, PART ASSEMBLING METHOD, POSITION/ATTITUDE RECOGNIZING APPARATUS, PART HOLDING APPARATUS, PART ARRANGING APPARATUS AND PART ASSEMBLING APPARATUS - A group of light spots, dispersed and disposed three-dimensionally so as not to lie in one plane on an object-to-be-measured, is shot by a camera. A position and an attitude of the object-to-be-measured are recognized based on an optical image representing each of the light spots included in the image shot by the camera.09-30-2010
20100246894COMPONENT ASSEMBLY INSPECTION METHOD AND COMPONENT ASSEMBLY INSPECTION APPARATUS - A component assembly inspection method includes taking an image of a first light spot group possessed by a first component with a camera after the first component is assembled in a second component, the first light spot group including plural light spots. The method further includes recognizing a position and an attitude of the first component based on a light image that is on the image taken with the camera and represents each of the light spots of the first light spot group; and determining quality of an assembly state of the first component in the second component based on the position and attitude of the first component.09-30-2010
20100246893Method and Apparatus for Nonlinear Dynamic Estimation of Feature Depth Using Calibrated Moving Cameras - A method and apparatus estimate depths of features observed in a sequence of images acquired of a scene by a moving camera by first locating features, estimating coordinates of the features, and generating a sequence of perspective feature images. A set of differential equations is applied to the sequence of perspective feature images to form a nonlinear dynamic state estimator for the depths using only a vector of linear and angular velocities of the camera and the focal length of the camera. The camera can be mounted on a robot manipulator end effector. The velocity of the camera is determined by robot joint encoder measurements and known robot kinematics. An acceleration of the camera is obtained by differentiating the velocity, and the acceleration is combined with other signals.09-30-2010
20100246892COMPOUND EYE TYPE IMAGING APPARATUS WITH DISTANCE MEASURING CAPABILITY - Multiple imaging areas …09-30-2010
20100246891Method for Generating a Distance Field of an Object Represented By Outlines - A method generates a distance field of an object, where the distance field includes a set of cells and the object includes a set of outlines. A processor is included for performing steps of the method. A first cell of the set of cells enclosing the object is determined. An outside reconstruction method is associated with the first cell. A set of boundary cells of the set of cells is determined, where each boundary cell encloses a portion of a particular outline in the set of outlines. A boundary reconstruction method is associated with each boundary cell. A final cell of the set of cells is determined enclosing the object. An inside reconstruction method is associated with the final cell. The outside and boundary reconstruction methods are used to determine combined distances, which are further processed by the inside reconstruction method to generate the distance field of the object.09-30-2010
20110135157APPARATUS AND METHOD FOR ESTIMATING DISTANCE AND POSITION OF OBJECT BASED ON IMAGE OF SINGLE CAMERA - Disclosed are an apparatus and a method for estimating a distance and a position between a photographing unit and a predetermined object based on an image of the photographing unit using a single camera. An apparatus for estimating a distance of an object based on a single camera image includes: a region detector detecting an object region box including the predetermined object in an image photographed by a photographing unit; a distance estimator measuring the size of the detected object region box and estimating a distance between the predetermined object and the photographing unit on the basis of an interpolating function that interpolates the relationship between the size of the object region box and the distance from the photographing unit to the predetermined object; and a position estimator generating object position information estimating the position of the predetermined object.06-09-2011
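The interpolating function above can be pictured as a calibration table of bounding-box size versus known distance that is interpolated at run time. A minimal sketch under that assumption (the calibration values and the use of box height are made up for the example):

    import numpy as np

    def estimate_distance(box_height_px, calib_heights_px, calib_distances_m):
        # np.interp needs ascending x values, and box height shrinks with
        # distance, so interpolate over the sorted calibration table.
        order = np.argsort(calib_heights_px)
        return float(np.interp(box_height_px,
                               np.asarray(calib_heights_px)[order],
                               np.asarray(calib_distances_m)[order]))

    heights = [240, 120, 80, 60]      # reference object heights in pixels
    distances = [1.0, 2.0, 3.0, 4.0]  # measured at these distances (metres)
    print(estimate_distance(100, heights, distances), "m")  # 2.5 m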
20120148109DISTANCE ESTIMATION DEVICE, DISTANCE ESTIMATION METHOD, INTEGRATED CIRCUIT, AND COMPUTER PROGRAM - A distance estimation device … 06-14-2012
20120148107IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD - Disclosed is an image processing apparatus including: a representative image creating section which creates a representative image on the basis of images of respective screens which form a predetermined scene; and a symbol drawing section which draws, on the basis of a movement of a target portion of each screen in the predetermined scene, a symbol indicating the movement of the target portion in a peripheral section of the representative image created by the representative image creating section.06-14-2012
20120148106TERMINAL AND METHOD FOR PROVIDING AUGMENTED REALITY - A method for providing augmented reality includes acquiring a real-world image including an object; transmitting terminal information, in which the terminal information includes location information of a terminal and an original retrieval distance; receiving object information corresponding to the object, in which the object information is based on the transmitted terminal information; and overlapping the received object information over the corresponding object in the real-world image. A terminal to perform the methods described herein includes a location information providing unit, an information transmitting/receiving unit, an image processing unit, and a user view analyzing unit.06-14-2012
20110255749METHOD AND APPARATUS FOR DISPLAYING A CALCULATED GEOMETRIC ENTITY WITHIN ONE OR MORE 3D RANGEFINDER DATA SETS - A method, computer program product, and apparatus for displaying a calculated geometric entity within at least one 3D range data set obtained using a 3D rangefinder device. At least a first 3D range data set is provided. Each 3D range data set is displayed as at least one displayed image. A calculated geometric entity that represents a non-physical entity is specified. The calculated geometric entity is displayed merged within at least one displayed image, where the calculated geometric entity represents something other than the physical objects represented by the first 3D range data set.10-20-2011
20100034426MEASUREMENT APPARATUS, MEASUREMENT METHOD, AND FEATURE IDENTIFICATION APPARATUS - It is an object to measure a position of a feature around a road. An image memory unit stores images in which the neighborhood of the road is captured. Further, a three-dimensional point cloud model memory unit … 02-11-2010
20110096958ULTRASOUND DIAGNOSTIC APPARATUS - The ultrasonic diagnostic apparatus of this invention includes: a transmitting section … 04-28-2011
20110096957POSITION MEASUREMENT METHOD, POSITION MEASUREMENT DEVICE, AND PROGRAM - A position measurement method includes an exterior orientation parameter correcting step S… 04-28-2011
20100215219METHOD AND APPARATUS FOR EXTRACTING SCENERY DEPTH INFORMATION - A method and optical apparatus are utilized not only to increase the degree of the similarity of the point spread functions of different fields of view, but also to maintain the degree of the difference of point spread functions which are along the optical axis of the optical apparatus. Using the degree of the difference of on-axis point spread functions, the depth information of scenery can be extracted.08-26-2010
20110052009UNCONSTRAINED SPATIALLY ALIGNED HEAD-UP DISPLAY - A system provides a spatially aligned head-up display to a user viewing a scene through a display. The system includes: an image sensing system including at least one image sensor deployed to sample images of the user's face, the image sensing system generating data derived at least in part from the images; a display for displaying visible indications to the user superimposed on the scene; and a control system associated with the image sensing system and the display, the control system being configured to: process data from the image sensing system to determine a position and attitude of the user's face, determine a viewing direction from at least one eye of the user to a point of interest within the scene, and actuate the display to display a visible indication aligned with the viewing direction that provides a spatially aligned head-up display to the user.03-03-2011
20100166263ELECTRONIC DEVICE AND MEASURING METHOD USING THE SAME - An electronic device includes an image capturing device, a touch sensitive display, a distance measuring device, and a processor. The image capturing device picks up an image of the object and determines a focal length at which the image capturing device captures the image. The touch sensitive display displays the image and receives input touches on the image to determine a touching region. The distance measuring device determines a distance between the electronic device and the object. The processor extracts the dimension of the touching region to determine the dimension of the image of the object, and determines the dimension of the object based on the dimension of the image of the object, the distance, and a preset proportional relationship among the dimension of the image of the object, the distance between the electronic device and the object, and the focal length.07-01-2010
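The proportional relationship referred to in the entry above is essentially the pinhole-camera similar-triangle relation. A rough sketch follows, with illustrative pixel pitch, focal length and distance values that are not taken from the patent.

    def object_dimension_mm(region_px, pixel_pitch_mm, distance_mm, focal_length_mm):
        # Similar triangles: real size = (size on the sensor) * distance / focal length.
        size_on_sensor_mm = region_px * pixel_pitch_mm
        return size_on_sensor_mm * distance_mm / focal_length_mm

    # Example: a 300-pixel-wide touched region, 1.4 um pixels, object 500 mm away, 4 mm lens.
    print(object_dimension_mm(300, 0.0014, 500.0, 4.0))  # 52.5 mm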
20100158317METHOD AND SYSTEM FOR DETERMINING A DEFECT DURING SAMPLE INSPECTION INVOLVING CHARGED PARTICLE BEAM IMAGING - A method for determining a defect during sample inspection involving charged particle beam imaging transforms a target charged particle microscopic image and its corresponding reference charged particle microscopic images each into a plurality of feature images, and then compares the feature images against each other. Each feature image captures and stresses a specific feature which is common to both the target and reference images. The feature images produced by the same operator correspond to each other. A distance between corresponding feature images is evaluated. Comparison between the target and reference images is made based on the evaluated distances to determine the presence of a defect within the target charged particle microscopic image.06-24-2010
20100158318FOCAL SPOT SIZE MEASUREMENT WITH A MOVABLE EDGE LOCATED IN A BEAM-SHAPING DEVICE - A method is described for measuring the sharpness in an X-ray system … 06-24-2010
20100195872SYSTEM AND METHOD FOR IDENTIFYING OBJECTS IN AN IMAGE USING POSITIONAL INFORMATION - A computer-implemented method is provided for identifying objects in an image. The method includes: capturing a series of images of a scene using a camera; receiving a topographical map for the scene that defines distances between objects in the scene; determining distances between objects in the scene from a given image; approximating identities of objects in the given image by comparing the distances between objects as determined from the given image in relation to the distances between objects from the map. The identities of objects can be re-estimated using features of the objects extracted from the other images.08-05-2010
20100195873QUANTITATIVE DIFFERENTIAL INTERFERENCE CONTRAST (DIC) DEVICES FOR COMPUTED DEPTH SECTIONING - Embodiments of the present invention relate to a method for computing depth sectioning of an object using a quantitative differential interference contrast device having a wavefront sensor with one or more structured apertures, a light detector and a transparent layer between the structured apertures and the light detector. The method comprises receiving light, by the light detector, through the one or more structured apertures. The method also measures the amplitude of an image wavefront, and measures the phase gradient in two orthogonal directions of the image wavefront based on the light. The method can then reconstruct the image wavefront using the amplitude and phase gradient. The method can then propagate the reconstructed wavefront to a first plane intersecting an object at a first depth. In one embodiment, the method propagates the reconstructed wavefront to additional planes and generates a three-dimensional image based on the propagated wavefronts.08-05-2010
20100067745SYSTEM AND METHOD FOR OBJECT CLUSTERING AND IDENTIFICATION IN VIDEO - Embodiments of computer implemented methods and systems for object clustering and identification are described. One example embodiment includes receiving an unclustered video object, determining a first distance between the unclustered video object and an arbitrary representative video object, the arbitrary representative video object being selected from representative video objects, estimating distances between the unclustered video object and the representative video objects based on the first distance and precalculated distances between the arbitrary representative video object and the representative video objects, and, based on the estimated distances, selectively associating the unclustered video object with a video cluster, thereby producing a clustered video object.03-18-2010
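The distance estimation described in the entry above can be read as a triangle-inequality bound: one computed distance from the unclustered object to an arbitrary representative, together with precalculated distances between that representative and the others, brackets every remaining distance without computing it. The sketch below illustrates that reading only; the patent's actual estimator may differ, and the distances used are placeholders.

    import numpy as np

    def bound_distances(d_x_to_a, d_a_to_reps):
        # Triangle inequality: |d(a, r) - d(x, a)| <= d(x, r) <= d(a, r) + d(x, a).
        lower = np.abs(d_a_to_reps - d_x_to_a)
        upper = d_a_to_reps + d_x_to_a
        return lower, upper

    d_x_to_a = 2.0                                  # the one distance actually computed
    d_a_to_reps = np.array([0.0, 1.5, 5.0, 9.0])    # precalculated; entry 0 is a itself
    lower, upper = bound_distances(d_x_to_a, d_a_to_reps)

    # Representatives whose lower bound exceeds the best upper bound cannot be the
    # nearest one, so the unclustered object only needs comparing against the rest.
    candidates = np.where(lower <= upper.min())[0]
    print(lower, upper, candidates)                 # candidates -> [0 1]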
20110188708THREE-DIMENSIONAL EDGE EXTRACTION METHOD, APPARATUS AND COMPUTER-READABLE MEDIUM USING TIME OF FLIGHT CAMERA - A method of extracting a three-dimensional (3D) edge is based on a two-dimensional (2D) intensity image and a depth image acquired using a time of flight (TOF) camera. The 3D edge extraction method includes acquiring a 2D intensity image and a depth image using a TOF camera, acquiring a 2D edge image from the 2D intensity image, and extracting a 3D edge using a matched image obtained by matching the 2D intensity image and the depth image.08-04-2011
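One plausible (but not necessarily the patented) way to realize the combination described above is to detect edges in the 2D intensity image and lift each edge pixel into 3D through the co-registered depth image; the gradient threshold and pinhole intrinsics below are placeholder assumptions.

    import numpy as np

    def edges_2d(intensity, threshold=30.0):
        # Gradient-magnitude edges in the 2D intensity image.
        gy, gx = np.gradient(intensity.astype(float))
        return np.hypot(gx, gy) > threshold

    def lift_to_3d(edge_mask, depth, fx=525.0, fy=525.0, cx=160.0, cy=120.0):
        # Back-project every edge pixel with its depth value (pinhole model).
        v, u = np.nonzero(edge_mask)
        z = depth[v, u]
        x = (u - cx) * z / fx
        y = (v - cy) * z / fy
        return np.column_stack([x, y, z])            # N x 3 array of 3D edge points

    intensity = np.random.rand(240, 320) * 255.0     # stand-ins for the TOF outputs
    depth = np.full((240, 320), 1.5)                 # metres
    print(lift_to_3d(edges_2d(intensity), depth).shape)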
20090175503Surface Position Measuring Method and Apparatus - A measuring apparatus for measuring a position of a surface of an object while the object is scanned in a scanning direction in an X-Y plane. A detecting unit detects the position of the surface of the object in a Z direction perpendicular to the X-Y plane, a stage scans the object relative to the detecting unit in the scanning direction, and a controller causes the stage to pre-scan the object relative to the detecting unit in two scanning directions, in the X-Y plane, opposite to each other, to detect, using the detecting unit, with respect to each of the two scanning directions, a position of the surface in the Z-direction for each of the same detection points on the surface, to determine, with respect to each of the two scanning directions, a reference surface based on the detected positions of the surface, to calculate an offset value, which is a difference between the detected position and a position of the reference surface in the Z-direction for each of the same detection points with respect to each of the two scanning directions, to calculate a correction value for correcting the calculated offset value in accordance with a corresponding one of the two scanning directions based on a difference, in the Z-direction, between positions of the determined reference surfaces obtained with respect to the two scanning directions.07-09-2009
20110216946INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, PROGRAM, AND INFORMATION STORAGE MEDIUM - An image processing device is provided that reduces user operations in producing motion data by simulating a 3D motion of a moving object. A base data obtaining unit obtains a plurality of base data each containing an image and distance data indicating a distance between an object shown in the image and the image capturing unit, the distance data being based on a measured result. An in-image position specifying unit, based on the image obtained, specifies a position in the image where the moving object is shown. A distance specifying unit specifies a distance between the moving object and the image capturing unit, based on the specified in-image position and the distance data. A position coordinate calculating unit calculates the 3D position coordinates of the moving object, based on the specified in-image position and the specified distance. A motion data producing unit produces motion data describing a motion of the moving object in the 3D space, based on the calculated 3D position coordinates.09-08-2011
20110305370Apparatus and method for depth unfolding based on multiple depth images - Provided is a depth image unfolding apparatus and method that may remove a depth fold from a depth image to restore a three-dimensional (3D) image. The depth image unfolding apparatus may include an input unit to receive inputted multiple depth images with respect to the same scene, the multiple depth images being photographed based on different modulation frequencies of a fixed photographing device, a depth fold estimator to estimate a number of depth folds based on a distance between multiple three-dimensional (3D) points of multiple pixels indicating the same location of the scene in the multiple depth images, and an output unit to output the multiple depth images from which depth folds are removed based on the estimated number of depth folds.12-15-2011
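The fold-count estimation described above can be sketched for a single pixel as a small search over integer fold counts that makes the two unfolded depths agree. The modulation frequencies below are illustrative, and the patent's distance between 3D points is simplified here to a plain depth difference.

    import itertools

    C = 299_792_458.0                                # speed of light, m/s

    def unfold(d1, d2, f1, f2, max_folds=5):
        # Each wrapped depth d_i equals the true depth modulo R_i = c / (2 * f_i).
        r1, r2 = C / (2 * f1), C / (2 * f2)
        best = None
        for k1, k2 in itertools.product(range(max_folds), repeat=2):
            z1, z2 = d1 + k1 * r1, d2 + k2 * r2
            err = abs(z1 - z2)                       # the two hypotheses should agree
            if best is None or err < best[0]:
                best = (err, 0.5 * (z1 + z2))
        return best[1]

    # A point 11.0 m away seen at 20 MHz (range ~7.49 m) and 16 MHz (range ~9.37 m).
    print(unfold(11.0 - 7.495, 11.0 - 9.369, 20e6, 16e6))   # about 11.0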
20080273758APPARATUS AND METHOD FOR MONITORING A SPATIAL AREA, IN PARTICULAR FOR SAFEGUARDING A HAZARDOUS AREA OF AN AUTOMATICALLY OPERATED INSTALLATION - An apparatus for monitoring a spatial area, in particular for safeguarding a hazardous area of an automatically operated installation, comprises an illumination device which at least temporarily emits light signals into the spatial area. A first image recording unit records a first image of the spatial area. The first image recording unit comprises an image sensor having a plurality of pixels. An evaluation unit determines a distance value for at least one spatial area point, which is located in the spatial area and is imaged on at least one pixel, by means of a propagation type measurement. The propagation type measurement suffers from a limited unambiguity range. Therefore, a test device is designed to check the distance value by means of a reference distance value determined from a second image of said spatial area.11-06-2008
20110075887Lens shift measuring apparatus, lens shift measuring method, and optical module manufacturing method - The lens shift measuring apparatus calculates a position of a predetermined portion of a frame body, on the basis of an imaging result obtained by imaging reflected light from the frame body, in a state in which light is irradiated onto a lens-attached member having a lens and the frame body to hold the lens, such that the reflected light from the frame body is generated, and calculates a position of a focusing spot, on the basis of an imaging result obtained by imaging light transmitted through the lens, in a state in which the light is irradiated onto the lens-attached member, such that the focusing spot formed by focusing the light transmitted through the lens by the lens is generated, and calculates a shift amount of the position of the focusing spot with respect to the predetermined portion as the shift of the lens.03-31-2011
20110064275VEHICLE PERIPHERY MONITORING APPARATUS - Provided is an apparatus capable of measuring the position of an object with high accuracy even when the object is moving. A vehicle periphery monitoring apparatus … 03-17-2011
20110091077DEVICE AND METHOD FOR PRODUCTION OF A LOCATION SIGNAL - A method generates a locating signal which indicates the location of a vehicle, in particular that of a track-bound vehicle. A previously stored reference object in the surroundings of the vehicle is identified, the reference object is subjected to an intersection-image or mixed image-distance measurement, and the locating signal is generated by evaluating that measurement.04-21-2011
20110091075METHOD AND APPARATUS FOR CAPTURING, GEOLOCATING AND MEASURING OBLIQUE IMAGES - A computerized system for displaying and making measurements based upon captured oblique images. The system includes a computer system executing image display and analysis software. The software reads a plurality of captured oblique images having corresponding geo-location data and a data table storing ground plane data that approximates at least a portion of the terrain depicted within the captured oblique images. The executed software causes the computer system to receive a starting point selected by a user, receive an end point selected by the user and calculate a desired measurement between the starting and end points dependent upon the geo-location data and ground plane data. The desired measurement is selected from a group consisting of a distance measuring mode, a height measuring mode, and a relative elevation measuring mode.04-21-2011
20110317879Measurement of Positional Information for a Robot Arm - Positional measurements for a robot arm are made using a light ray projector … 12-29-2011
20110317880METHOD AND MEASUREMENT SYSTEM FOR CONTACTLESS COORDINATE MEASUREMENT ON AN OBJECT SURFACE - The invention relates to a method and a surveying system for noncontact coordinate measurement on the object surface of an object to be surveyed in an object coordinate system. With a 3D image recording unit, a first three-dimensional image of a first area section of the object surface is electronically recorded in a first position and first orientation, the first three-dimensional image being composed of a multiplicity of first pixels, with each of which a piece of depth information is associated. The first position and first orientation of the 3D image recording unit in the object coordinate system are determined by a measuring apparatus coupled to the object coordinate system. First 3D object coordinates in the object coordinate system are associated with the first pixels based on knowledge of the first 3D image coordinates and of the first position and first orientation of the 3D image recording unit.12-29-2011
20110317878Apparatus and method for generating depth image - Provided is a depth image generating apparatus and method. The light-receiving unit may include a first gate and a second gate and a depth calculator to calculate a depth based on first light information through fourth light information. The first gate may obtain the first light information from a reflected light of a light emitted based on a first pulse and the second gate may obtain the second light information from a reflected light of a light emitted based on a second pulse, and then the first gate may obtain the third light information from a reflected light of a light emitted based on a third pulse and the second gate may obtain the fourth light information from a reflected light of a light emitted based on a fourth pulse.12-29-2011
20120045101METHOD OF BIOIMAGE DATA PROCESSING FOR REVEALING MORE MEANINGFUL ANATOMIC FEATURES OF DISEASED TISSUES - The present invention discloses a method for generating elevation maps or images of a tissue layer/boundary with respect to a fitted reference surface, comprising the steps of finding and segmenting a desired tissue layer/boundary; fitting a smooth reference surface to the segmented tissue layer/boundary; calculating elevations of the same or other tissue layer/boundary relative to the fitted reference surface; and generating maps of elevation relative to the fitted surface. The elevation can be displayed in various ways including three-dimensional surface renderings, topographical contour maps, contour maps, en-face color maps, and en-face grayscale maps. The elevation can also be combined and simultaneously displayed with another tissue layer/boundary dependent set of image data to provide additional information for diagnostics.02-23-2012
20120045099INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, PROGRAM AND ELECTRONIC APPARATUS - An information processing device, method, program, and electronic apparatus are provided for operating an information processing device without a sense of discomfort regardless of the distance between the information processing device and a user. The information processing device, method, program, and electronic apparatus detect a subject from an image and extract a feature point from the subject. The feature point is converted into a coordinate point based on a movement distance of the subject and an image distance, the image distance being the distance between the subject and the position where the image is captured.02-23-2012
20120002844Computer product, information display apparatus, and information display method - A 3-dimensional model of an antigen is regarded as a display subject and a 3-dimensional model of an antibody is regarded as a comparison subject. A portion of the molecular surface of the display subject at a distance enabling binding with the comparison subject is cut out as a display surface. The 3-dimensional model of the antigen, which is the display subject, is displayed in a rotated state, where the normal of the display surface is rotated to point in a counter viewing direction, whereby the 3-dimensional model is rotated in a viewing coordinate system. The display surface alone is displayed in color, whereas other portions of the molecular surface are not, thereby enabling the display surface of the antigen that is at a distance enabling binding with the antibody to be displayed at a position easily viewed by the user.01-05-2012
20100080421Apparatus and method for eye margin calculating, and computer-readable recording medium recording program thereof - A center location of an eye pattern generated by superimposing waveform signal pieces cut out from a waveform signal generated by a simulator is calculated, and an arrangement of a mask as a quality evaluation criterion of the eye pattern on the center location is envisaged to calculate time coordinate values and voltage coordinate values of feature points included in the mask. First feature points not on a time axis are set as processing objects, and a margin in the voltage axis direction is calculated based on the voltage coordinate values of the first feature points and the voltage coordinate values of waveform signal piece parts associated with the first feature points. Second feature points on the time axis are set as processing objects, and a margin in the time axis direction is calculated based on the time coordinate values of the second feature points and the time coordinate values of waveform signal piece parts associated with the second feature points.04-01-2010
20120008833SYSTEM AND METHOD FOR CENTER CURVE DISPLACEMENT MAPPING - A system and method for center point trajectory mapping includes a computer readable storage medium having stored thereon a computer program comprising instructions, which when executed by a computer, cause the computer to obtain a first plurality of images, each image comprising an unmasked portion and a masked portion, and to calculate a center curve in the unmasked portion in each of the first plurality of images. The instructions further cause the computer to calculate a displacement of the center curve in each of the first plurality of images from a reference center curve, plot a map based on the calculated displacements, and display the map on a display.01-12-2012
20120045100POSITION MEASURING APPARATUS, POSITION MEASURING METHOD, IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD - A position measuring apparatus measures a position of a position measurement target easily and accurately, without using any camera-internal parameters in the measurement, based on images captured from mutually different viewpoints. The position measuring apparatus measures the three-dimensional position of the position measurement target based on the input images captured from the mutually different viewpoints, and includes: a ray information storage unit configured to store ray information in which each of ray vectors is associated with a corresponding one of pixels in one of the input images, each of the ray vectors indicating a forward direction of a light incident onto an optical system for a corresponding one of the input images; and a position measuring unit configured to measure the three-dimensional position of the position measurement target, using the ray information stored in the ray information storage unit.02-23-2012
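One standard way to turn per-pixel ray vectors from two viewpoints into a 3D position, without any camera-internal parameters, is to take the midpoint of the two rays' closest approach. The sketch below shows that textbook construction with made-up ray origins and directions; it is not claimed to be the patent's own procedure.

    import numpy as np

    def triangulate(o1, d1, o2, d2):
        # Midpoint of the closest approach of two (non-parallel) rays o + t * d.
        d1 = d1 / np.linalg.norm(d1)
        d2 = d2 / np.linalg.norm(d2)
        w = o1 - o2
        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
        d, e = d1 @ w, d2 @ w
        denom = a * c - b * b                        # zero only for parallel rays
        t1 = (b * e - c * d) / denom                 # parameter along ray 1
        t2 = (a * e - b * d) / denom                 # parameter along ray 2
        return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))

    o1, d1 = np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 1.0])
    o2, d2 = np.array([1.0, 0.0, 0.0]), np.array([-1.0, 0.0, 1.0])
    print(triangulate(o1, d1, o2, d2))               # [0.5 0.  0.5]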
20120014564GEODETIC MEASURING DEVICE - Geodetic measuring device that has an angle and distance measuring functionality for determining a position of a target object. For this purpose, the measuring device comprises a sighting device having a lens that magnifies multiplicatively, a camera sensor comprising a plurality of image recording points for recording a camera image of a field of view, a focusing optical system arranged in front of the camera sensor—wherein a first optical path is defined between the lens and the camera sensor—and an ocular. The camera sensor is connected to an electronic graphics processor for generating a display image from the camera image. The sighting device comprises an electronic graphical display component arranged in front of the ocular for visually presenting the generated display image, wherein a second optical path separated from the first optical path by the display image is defined between the display component and the ocular.01-19-2012
20120014563METHOD OF STRUCTURED LIGHT-BASED MEASUREMENT - A method of determining the distance to an object can use a video inspection device comprising a first light emitter and a second light emitter, wherein the first light emitter can emit light through an opening with at least one shadow-forming element. The method can comprise capturing at least one first emitter image with the first light emitter activated and the second light emitter deactivated, capturing at least one second emitter image with the second light emitter activated and the first light emitter deactivated, determining a first plurality of luminance values of the pixels in the at least one first emitter image, determining a second plurality of luminance values of the pixels in the at least one second emitter image, determining the brightness ratios of the second plurality of luminance values to the first plurality of luminance values, and determining an object distance using the brightness ratios.01-19-2012
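The final step above, turning brightness ratios into an object distance, can be sketched with a calibrated ratio-to-distance table; the table values here are placeholders that a real system would measure against targets at known distances.

    import numpy as np

    # Hypothetical calibration: luminance ratio (second / first emitter) vs distance.
    calib_ratio = np.array([0.6, 0.8, 1.0, 1.2, 1.4])
    calib_distance_mm = np.array([10.0, 20.0, 35.0, 55.0, 80.0])

    def object_distance(first_img, second_img, eps=1e-6):
        # Per-pixel brightness ratio, then a table lookup by interpolation.
        ratio = second_img.astype(float) / (first_img.astype(float) + eps)
        return np.interp(ratio, calib_ratio, calib_distance_mm)

    first = np.full((4, 4), 100.0)
    second = np.full((4, 4), 110.0)
    print(object_distance(first, second)[0, 0])      # about 45 mm with this table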
20130011018INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - An information processing apparatus which estimates a three-dimensional position-and-orientation of a measuring object using an imaging apparatus capable of capturing a two-dimensional image and a range image, includes a data storing unit configured to store verification data for estimating a position-and-orientation of a measuring object, a two-dimensional image input unit configured to input a two-dimensional image captured by the imaging apparatus in a first position-and-orientation, a range image input unit configured to input a range image captured by the imaging apparatus in a second position-and-orientation, a position-and-orientation information input unit configured to acquire position-and-orientation difference information which is relative position-and-orientation information in the first position-and-orientation and the second position-and-orientation, and a calculation unit configured to calculate, based on the position-and-orientation difference information, a position-and-orientation of the measuring object so that the verification data for estimating the position-and-orientation matches the two-dimensional image and the range image.01-10-2013
20120057758DISTANCE MEASUREMENT METHOD AND SYSTEM, AND STORAGE MEDIA - A distance measurement system provided by the present invention comprises a light source module for projecting a light beam having a speckle pattern onto a plurality of reference flat surfaces and an object which are located at different position points, so as to show an image of the speckle pattern on each of the reference flat surfaces and the object. The speckle pattern contains a plurality of speckles. The invention generates a plurality of reference image information through capturing the image of the speckle pattern on each of the plurality of reference flat surfaces and generates an object image information through capturing the image of the speckle pattern on the object. The invention further generates a plurality of comparison results through comparing the object image information with the plurality of reference image information, and computes the position of the object through performing an interpolation operation on the plurality of comparison results.03-08-2012
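A simplified reading of the comparison-and-interpolation scheme above: score the object's speckle image against each reference-plane image, then interpolate between the best-matching planes. The scoring function, parabolic interpolation and synthetic data below are illustrative choices, not the patented algorithm, and the reference planes are assumed to be uniformly spaced.

    import numpy as np

    def similarity(a, b):
        # Zero-mean normalized correlation between two speckle images.
        a = a - a.mean()
        b = b - b.mean()
        return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    def estimate_position(obj_img, ref_imgs, ref_dists):
        scores = np.array([similarity(obj_img, r) for r in ref_imgs])
        i = int(scores.argmax())
        if 0 < i < len(scores) - 1:
            # Parabolic interpolation around the best-matching reference plane.
            s0, s1, s2 = scores[i - 1], scores[i], scores[i + 1]
            offset = 0.5 * (s0 - s2) / (s0 - 2 * s1 + s2 + 1e-12)
            return ref_dists[i] + offset * (ref_dists[i + 1] - ref_dists[i])
        return ref_dists[i]

    rng = np.random.default_rng(0)
    refs = [rng.random((64, 64)) for _ in range(5)]
    dists = np.array([100.0, 150.0, 200.0, 250.0, 300.0])
    obj = 0.5 * refs[2] + 0.5 * refs[3]              # object lies between two planes
    print(estimate_position(obj, refs, dists))       # close to 225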
20120020528IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM - An image processing apparatus includes an obtaining unit obtaining an image including a closed curve input which encloses an object in an input image, a generation unit generating a distance image having pixel values of individual pixels corresponding to distances from the input closed curve in accordance with a shape of the curve, a calculation unit calculating an input-image energy of the input image including a distance energy changed based on the distances of the pixels or a likelihood energy changed based on likelihoods of the pixels based on color distribution models of an object region and a non-object region in the distance image and a color energy changed in accordance with color differences between adjacent pixels in the distance image, and a generation unit generating a mask image by minimizing the input-image energy and assigning an attribute representing the object region or an attribute representing the non-object region.01-26-2012
20120063645CONTROLLING AN IMAGE ELEMENT IN A REFLECTED ENERGY MEASUREMENT SYSTEM - Controlling an image element associated with a distance from an energy measuring device, in a reflected energy measurement system involves producing at least one signal for controlling a common visible characteristic of the image element, in response to a plurality of signals representing respective intensities of reflected energy at respective different frequencies, measured at the energy measuring device at a time corresponding to the distance.03-15-2012
20120155713METHOD AND SYSTEM FOR POSITION DETERMINATION USING IMAGE DEFORMATION - A method and system of position determination using image deformation is provided. One implementation involves storing an actual tag in a reference data module, receiving an image of a visual tag, the image captured by an image capturing device, comparing properties of the visual tag with properties of the actual tag; and based on the comparison, determining a position of the image capturing device relative to the visual tag.06-21-2012
20100172545ABSOLUTE TRACKING IN A SUB-PIXEL RANGE - An optical navigation device for absolute tracking in a sub-pixel range. The optical navigation device includes an image sensor and a tracking engine. The image sensor includes a pixel array to generate a plurality of tracking images. The tracking images correspond to incident light at the pixel array. The tracking engine determines a sub-pixel displacement value of a tracking surface based on a comparison of at least two of the tracking images. The tracking engine includes a sub-pixel approximation engine and a linear approximation engine. The sub-pixel approximation engine generates an intermediate sub-pixel displacement value based on a sub-pixel approximation according to a non-linear sub-pixel distribution. The linear approximation engine generates a final sub-pixel displacement value from the intermediate sub-pixel displacement value.07-08-2010
20100172544METHOD OF MEASURING, ON THE FLY, THE HEIGHT OF AN ELECTROLYSIS ANODE - An on-the-fly method for measuring the length, along an axis (z′z), of an anode … 07-08-2010
20110103651COMPUTER ARRANGEMENT AND METHOD FOR DISPLAYING NAVIGATION DATA IN 3D - A computer arrangement includes a processor and memory accessible for the processor. In at least one embodiment, the memory includes a computer program including data and instructions arranged to allow the processor to: a) obtain navigation information, b) obtain an image corresponding to the navigation information, c) display the image and at least part of the navigation information, whereby the at least part of the navigation information is superimposed upon the image. The processor is further allowed to b1) obtain depth information corresponding to the image and use the depth information to perform action c).05-05-2011
20080310681METHOD AND SYSTEM FOR THREE-DIMENSIONAL FEATURE ATTRIBUTION THROUGH SYNERGY OF RATIONAL POLYNOMIAL COEFFICIENTS AND PROJECTIVE GEOMETRY - A method is provided for generating height information for an image point on a rectified image from first and second aerial images having respective first and second sets of rational polynomial coefficients (RPCs), the first and second aerial images and the rectified image including overlapping image locations. The method comprises steps for generating epipolar lines and RPC lines and intersection points of epipolar lines and RPC lines. The method also comprises steps for equating line and sample coordinates to cubic polynomial equations and simultaneously solving the cubic polynomial equations to generate a height of the image point.12-18-2008
201002906743D image processing apparatus improving depth accuracy of region of interest and method - Example embodiments relate to a three-dimensional (3D) image processing apparatus and method that may determine a Region Of Interest (ROI) in an entire sensing region of the 3D image processing apparatus, and may obtain a depth image of the ROI. Also, example embodiments may be applied to a surveillance camera system, a motion detection system, a virtual reality simulation system, a distance recognition system for vehicles, a robotic system, a background separation system based on depth information, etc.11-18-2010
20100092040OPTICAL COMPUTERIZED METHOD FOR THE 3D MEASUREMENT OF AN OBJECT BY FRINGE PROJECTION AND USE OF A PHASE-SHIFT METHOD, CORRESPONDING SYSTEM - An optical computerized method and system for the 3D measurement of the external surface of an object in relief by projection of fringes onto the object and use of a phase-shifting method, wherein four projection axes of the fringes onto the object are implemented, the origin of each projection axis being considered as an illumination point located substantially at each of the four vertices of a virtual tetrahedron, the object being placed substantially at the centre of the tetrahedron, and the images are captured from four shooting points located substantially along four shooting axes, each of the shooting axes being the median of one of the four trihedrons formed by the four triplets of projection axes, the four shooting points being located at such a distance from the object that, at each shooting point, each image includes at least one portion of each of the three surfaces of the object that can be lighted by the three illumination points of the triplet of projection axes, the median of which defines the shooting axis of the shooting point, and a set of images of each of the six surfaces that can be illuminated and defined by six couples of illumination points is acquired by the computer equipment.04-15-2010
20100092041METHOD OF MEASURING A THREE-DIMENSIONAL SHAPE - In order to measure a three-dimensional shape, feature information is read from a database. A board is transferred to a measurement position. A measurement head is transferred to an inspection area of the board. Light of a first lighting device for three-dimensional measurement and light of a second lighting device for two-dimensional measurement are illuminated onto the inspection area to photograph a first reflection image and a second reflection image that are reflected from the inspection area. The inspection area is realigned by comparing the feature information with at least one of the photographed first and second reflection images to inspect distortion of the inspection area. The realigned inspection area is inspected. Thus, the three-dimensional shape may be precisely measured.04-15-2010
20120121137IMAGE PROCESSING APPARATUS - An image processing program causes a computer to execute processing of obtaining an image, photographed with a camera, of markers disposed in a real space, creating vectors from the camera to the markers, selecting a reference marker from the markers, calculating an inner product of the vectors, canceling use of a negative sign included in an equation that obtains a distance between the camera and a remaining marker, creating sign patterns based on the cancelled remaining markers, setting a first distance between the reference marker and the camera, calculating candidates of a distance between the camera and the remaining markers, calculating error between an inter-marker distance in a real space and the sign patterns, calculating other error when a second distance is set, determining the distance according to the error and the other error, and calculating a position and pose of the camera according to the determined distance.05-17-2012
20120163673METHODS, APPARATUSES, AND SYSTEMS FOR IMAGE-BASED MEASUREMENT AND INSPECTION OF PRE-ENGINEERED STRUCTURAL COMPONENTS - Methods, apparatuses, and systems for image-based measurement and inspection of pre-engineered structural components, such as building trusses and wall panels. A system can include: a light source; a camera; a first memory storage; a second memory storage; and a processing unit configured to (i) detect a characteristic of the structural component, (ii) compare the characteristic to a corresponding characteristic of at least one reference data, and (iii) indicate a result of the comparison. A method can include: causing a light source to illuminate a portion of the structural component, receiving a reflection of the light source from the illuminated portion of the structural component, and storing data corresponding to the intensity of the reflection; comparing the stored data to at least one reference data; and indicating a result of the comparison.06-28-2012
20100208943PERSPECTIVE IMPROVEMENT FOR IMAGE AND VIDEO APPLICATIONS - A system and method for reducing distance-based distortion in a camera image of an object, where the distance-based distortion is due to differences in distance from the camera to different parts of the object. In one approach, the distortion is reduced by estimating distances to different parts of the object and then generating a reprojected image of the object dependent upon the estimated distances and upon a virtual viewpoint that is more distant than the camera from the object. In a further approach, the image is warped such that points in the image match corresponding points in one or more stored templates. In a still further approach, if excessive distortion is present in the image, the camera zoom is increased and a magnified image is displayed, prompting a person to move farther from the camera thereby reducing the distortion.08-19-2010
20100208942IMAGE PROCESSING DEVICE AND METHOD - An image processing device comprises receiving means operable to receive, from a camera, a captured image corresponding to an image of a scene captured by the camera. The scene contains at least one object. The device comprises determining means operable to determine a distance between the object within the scene and a reference position defined with respect to the camera, and generating means operable to detect a position of the object within the captured image, and to generate a modified image from the captured image based on image features within the captured image which correspond to the object in the scene. The generating means is operable to generate the modified image by displacing the position of the captured object within the modified image with respect to the determined position of the object within the captured image by an object offset amount which is dependent on the distance between the reference position and the object in the scene so that, when the modified image and the captured image are viewed together as a pair of images on a display, the captured object appears to be positioned at a predetermined distance from the display.08-19-2010
20110182476Apparatus and method with composite sensor calibration - An apparatus and method capable of calculating a coordinate transformation parameter without having to utilize a rig are provided. The apparatus and method extract a first feature point based on a plane of first data, project the first feature point onto second data and then extract a second feature point from a part of the second data onto which the first feature point is projected. Then, calibration is performed based on the extracted feature points. Therefore, it is possible to perform calibration immediately as necessary without having to utilize a separate device such as a rig.07-28-2011
20120314908IMAGE PROCESSING DEVICE, METHOD OF CONTROLLING IMAGE PROCESSING DEVICE, AND PROGRAM FOR CAUSING COMPUTER TO EXECUTE THE SAME METHOD - There is provided an image processing device including a depth acquisition unit and a smoothing processing unit. The depth acquisition unit acquires a depth to a subject in correlation with a pixel in a captured image of the subject, and the smoothing processing unit designates a pixel in a region excluding a predetermined region in the image as a target pixel, and performs, on the pixel value of the target pixel in a predetermined direction, a smoothing process whose degree depends on the depth corresponding to the target pixel. This causes a pixel in a region excluding the predetermined region in the image to be blurred, thereby generating a panning image.12-13-2012
20120134542Photo-Mask Acceptance Technique - A technique for calculating a second aerial image associated with a photo-mask that can be used to determine whether or not the photo-mask (which may include defects) is acceptable for use in a photolithographic process is described. In particular, using a first aerial image produced by the photo-mask when illuminated using a source pattern and an inspection image of the photo-mask, a mask pattern corresponding to the photo-mask is determined. For example, the first aerial image may be obtained using an aerial image measurement system, and the inspection image may be a critical-dimension scanning-electron-microscope image of the photo-mask. This image, which has a higher resolution than the first aerial image, may indicate spatial-variations of a magnitude of the transmittance of the photo-mask. Then, the second aerial image may be calculated based on the determined mask pattern using a different source pattern than the source pattern.05-31-2012
20120076363COMPONENT CONCENTRICITY - Systems and methods for determining a concentricity of a component mounted within an aperture of an electronic device housing. In particular, a method for determining concentricity of a camera includes obtaining an image of a camera mounted within the aperture of the housing and identifying a plurality of circular shapes within the image using a processor. One of the circular shapes represents the aperture in the housing and the other circular shape represents the camera. An offset between two of the plurality of circular shapes is calculated.03-29-2012
20120076362CODED APERTURE CAMERA WITH ADAPTIVE IMAGE PROCESSING - A method uses an image capture device to identify object range information, and includes: providing an image capture device with an image sensor, a coded aperture, and a lens; and using the image capture device to capture a digital image of the scene from light passing through the lens and the coded aperture, the scene having a plurality of objects. The method further includes: dividing the digital image into a set of blocks; assigning a point spread function (psf) value to each of the blocks; combining contiguous blocks in accordance with their psf values; producing a set of blur parameters based upon the psf values of the combined blocks and the psf values of the remaining blocks; producing a set of deblurred images based upon the captured image and each of the blur parameters; and using the set of deblurred images to determine the range information for the objects in the scene.03-29-2012
20120177252DISTANCE MEASURING APPARATUS, DISTANCE MEASURING METHOD, AND PROGRAM - Disclosed are a distance measuring apparatus which can measure the distance to an object with high accuracy, a distance measuring method, and a program. The distance measuring apparatus is provided with: a light source which emits light; an image pickup unit, which picks up an image of light which has been emitted from the light source and reflected by the object; an image distance data converting unit, which converts image data obtained by the image pickup performed by the image pickup unit into image distance data which indicates the distance to the object; an image pickup condition setting unit, which sets first image pickup conditions on the basis of the image distance data; and an image pickup control unit, which controls the light source and the image pickup unit and has an image of the object picked up under the first image pickup conditions.07-12-2012
20100272321MEASURING APPARATUS, METHOD, AND PROGRAM - The present invention relates to a measuring apparatus, method and program with which a distance between two points that are not within one photographing range can be easily measured. In an image in which a start point is set, a specified position address calculation circuit … 10-28-2010
20100272320Laser Pointing System - A system for pointing a laser beam is provided. The system comprises at least one processing laser source for emitting a processing laser beam toward a target, said processing beam being transmitted through a non-reflective zone of a first mirror, said mirror allowing return to an imaging system receiving an illumination beam reflected by the target, said low reflection coefficient zone of the first mirror inducing a shadow zone toward the imaging system; a second mirror receiving said processing beam and intended to orient it and reflect it toward the target; an illumination source for illuminating said target with the aid of the illumination beam, a first control circuit for controlling the orientation of said pointing system toward the target, a second control circuit for angularly displacing the processing beam by a determined angle, measuring the distance separating the position of a zone of the target from the position of the spot of the processing beam on the basis of an image obtained by the imaging system, then displacing the illumination beam in the opposite sense by an angle corresponding to said measured distance, the angular displacement of the processing beam having an amplitude such that the measurement of the position of the target is not perturbed by the shadow zone.10-28-2010
20100272319TOMOGRAPHIC APPARATUS - The tomographic apparatus of this invention has an attenuation quantity calculating and adjusting unit, a filter length calculating unit, an attenuation correction function calculating unit and a transmission length calculating unit in an arithmetic processing unit. Thus, sectional images with a beam hardening correction can be acquired. The beam hardening correction can be made without using an iterative method. The attenuation correction function calculating unit calculates a correction function by approximating and expressing the correction function by a linear function linking, between two transmission lengths, ratios between measured attenuation quantities for two transmission lengths measured by an X-ray detector, and calculated attenuation physical quantities calculated and adjusted for these transmission lengths.10-28-2010
20090060281Object Distance Deriving Device - An object distance deriving device comprises a compound-eye imaging unit for capturing n unit images and a microprocessor for calculating an object distance of an object from the imaging unit based on the unit images. The microprocessor sets a first temporary distance D… 03-05-2009
20090060280STEREO VISION SYSTEM AND STEREO VISION PROCESSING METHOD - A stereo vision system includes an image pre-processing unit for pre-processing the right and left images, and a stereo matching unit for carrying out stereo matching of the right and left images to acquire low-resolution distance information of the right and left images and high-resolution distance information of the right and left images upon detection of an object within a distance range through the low-resolution distance information.03-05-2009
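The distance information in a stereo system like the one above ultimately rests on the textbook relation Z = f·B/d between depth, focal length in pixels, baseline and disparity; coarse disparities from downsampled images give the low-resolution range used to gate objects, and full-resolution matching refines it. This is a general sketch, not the patent's specific pipeline, and the numbers are illustrative.

    def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.12):
        # Rectified stereo: Z = f * B / d (focal length in pixels, baseline in metres).
        if disparity_px <= 0:
            return float("inf")                      # no disparity -> effectively at infinity
        return focal_px * baseline_m / disparity_px

    # Low-resolution matching on half-size images halves the disparity, so it is
    # rescaled before conversion; full-resolution matching then refines the value.
    coarse_disparity_halfres = 10.5
    print(depth_from_disparity(coarse_disparity_halfres * 2))   # 4.0 m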
20120257795MOBILE TERMINAL AND IMAGE DEPTH CONTROL METHOD THEREOF - An image depth control method is provided that includes displaying a perceived 3-dimensional (3D) content on a display screen, recognizing a shape (or object) that faces the displayed 3D content, obtaining information about the recognized shape, determining a distance from the mobile terminal to the shape, and automatically changing a depth of the displayed 3D content based on the determined distance of the shape and the obtained information of the recognized shape.10-11-2012
20120263353IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, COMPUTER PROGRAM, AND MOVABLE BODY - An image processing apparatus includes an edge extracting section which extracts an edge in a range image and a removing section which removes a distance value of a pixel of an edge part extracted by the edge extracting section in the range image.10-18-2012
20120082346TIME-OF-FLIGHT DEPTH IMAGING - Techniques are provided for determining depth to objects. A depth image may be determined based on two light intensity images. This technique may compensate for differences in reflectivity of objects in the field of view. However, there may be some misalignment between pixels in the two light intensity images. An iterative process may be used to relax a requirement for an exact match between the light intensity images. The iterative process may involve modifying one of the light intensity images based on a smoothed version of a depth image that is generated from the two light intensity images. Then, new values may be determined for the depth image based on the modified image and the other light intensity image. Thus, pixel misalignment between the two light intensity images may be compensated.04-05-2012
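The reflectivity compensation mentioned above can be illustrated with a common gated time-of-flight formulation (not necessarily the one used in this entry): with two gated intensity images of the same reflected pulse, their ratio cancels per-pixel reflectivity and maps linearly to depth across the gate window. The pulse width and pixel values are placeholders, and the iterative misalignment handling from the abstract is not reproduced.

    import numpy as np

    C = 299_792_458.0                                # m/s

    def depth_from_gates(i1, i2, pulse_s=40e-9):
        # Reflectivity appears in both I1 and I2 and cancels in the ratio.
        ratio = i2 / np.maximum(i1 + i2, 1e-9)
        return 0.5 * C * pulse_s * ratio             # spans 0 .. c*T/2 across the gate

    i1 = np.array([[600.0, 300.0]])
    i2 = np.array([[200.0, 100.0]])                  # same ratio, different reflectivity
    print(depth_from_gates(i1, i2))                  # both pixels about 1.5 m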
20120082345ATTITUDE ESTIMATION IN COMPRESSED DOMAIN - In general, in one embodiment, a starfield image as seen by an object is analyzed. Compressive samples are taken of the starfield image and, in the compressed domain, processed to remove noise. Stars in the starfield image are identified and used to determine an attitude of the object.04-05-2012
20090016572System and technique for retrieving depth information about a surface by projecting a composite image of modulated light patterns - A technique, associated system and program code, for retrieving depth information about at least one surface of an object, such as an anatomical feature. Core features include: projecting a composite image comprising a plurality of modulated structured light patterns, at the anatomical feature; capturing an image reflected from the surface; and recovering pattern information from the reflected image, for each of the modulated structured light patterns. Pattern information is preferably recovered for each modulated structured light pattern used to create the composite, by performing a demodulation of the reflected image. Reconstruction of the surface can be accomplished by using depth information from the recovered patterns to produce a depth map/mapping thereof. Each signal waveform used for the modulation of a respective structured light pattern, is distinct from each of the other signal waveforms used for the modulation of other structured light patterns of a composite image; these signal waveforms may be selected from suitable types in any combination of distinct signal waveforms, provided the waveforms used are uncorrelated with respect to each other. The depth map/mapping to be utilized in a host of applications, for example: displaying a 3-D view of the object; virtual reality user-interaction interface with a computerized device; face—or other animal feature or inanimate object—recognition and comparison techniques for security or identification purposes; and 3-D video teleconferencing/telecollaboration.01-15-2009
20110123070METHOD FOR X-RAY MARKER LOCALIZATION IN 3D SPACE IN THE PRESENCE OF MOTION - A method and system of determining a radial distance (R), an angular position (φ), and an axial position (Z) of a marker identified in a sequence of projection images. Embodiments of the invention allow a marker three-dimensional localization module executable by the electronic processing unit to obtain a sequence of images based on image data generated by a scanner. Each image in the sequence of images represents an angle of rotation by the scanner and includes a marker point position. The behavior of first values and second values of the marker point positions is analyzed through the sequence of images to determine the radial distance of the marker, the angular position of the marker, and the axial position of the marker. Thus, embodiments of the invention allow for rapidly detecting and localizing external markers placed on a patient in projection images.05-26-2011
20110123069Mapping Property Values Onto Target Pixels Of An Image - A computer implemented method of mapping values of source pixels (c(x) … 05-26-2011
20120230549IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD AND RECORDING MEDIUM - The aim is to enhance the accuracy of the search for a corresponding point in a plurality of images photographed by a camera array and to acquire more pieces of color information on a subject. First color information is calculated from pixel values of the plurality of pieces of photographed image data photographed by the camera array, and the first color information is used to calculate a corresponding point between images indicated by the plurality of pieces of photographed image data. The calculated corresponding point is used to calculate information on the depth of the subject in the image indicated by the photographed image data. Second color information that is used for reproducing the color of the subject faithfully is calculated from the pixel values of the plurality of pieces of photographed image data. The calculated depth information and second color information are used to combine the plurality of photographed images.09-13-2012
20080298638MAP CHANGE DETECTION DEVICE, MAP CHANGE DETECTION METHOD, AND PROGRAM - Changes in houses and buildings on a two-dimensional map are detected using three-dimensional data obtained from stereo images. A change detection device that detects changes in features that are targets described on a map has a stereo processor, a feature height calculator, and a demolition and/or new building detector. The stereo processor is inputted with a plurality of images taken of predetermined regions from a plurality of different positions, and extracts digital surface model data representing surfaces of the regions in three-dimensional coordinates. The feature height calculator extracts feature heights where the elevation of ground level is subtracted from the digital surface model data extracted by the stereo processor. The demolition and/or new building detector detects changes in the features that are the targets described on the map by comparing feature height data and map data. An elevation region extractor extracts an elevation region that is a set of points having a height greater than or equal to a predetermined value, compares the elevation region and the map data, and detects changes in the features constituting the targets.12-04-2008
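The height-comparison step above reduces, in the simplest reading, to subtracting a ground elevation model from the digital surface model and comparing the thresholded result with the rasterized building footprints of the map; the grids, mask and threshold below are illustrative assumptions, not the patented procedure.

    import numpy as np

    def detect_changes(dsm, ground, map_building_mask, min_height=2.5):
        feature_height = dsm - ground                 # height above ground level
        elevated = feature_height >= min_height       # points likely on buildings
        new_building = elevated & ~map_building_mask  # elevation where the map is empty
        demolished = ~elevated & map_building_mask    # mapped building with no elevation
        return new_building, demolished

    dsm = np.array([[12.0, 3.0], [3.0, 3.0]])
    ground = np.array([[3.0, 3.0], [3.0, 3.0]])
    mask = np.array([[False, True], [False, False]])  # building footprints from the map
    print(detect_changes(dsm, ground, mask))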
20120093372DISTANCE MEASURING DEVICE AND DISTANCE MEASURING METHOD - Provided are a distance measuring device and a distance measuring method which sufficiently suppress distance detection accuracy degradation caused by detection errors pertaining to a measurement subject, thereby making high-accuracy measurement of the distance to the measurement subject that is imaged. First to third region detection units … 04-19-2012
20120093371GENERATING SEARCH REQUESTS FROM MULTIMODAL QUERIES - A method and system for generating a search request from a multimodal query that includes a query image and query text is provided. The multimodal query system identifies images of a collection that are textually related to the query image based on similarity between words associated with each image and the query text. The multimodal query system then selects those images of the identified images that are visually related to the query image. The multimodal query system may formulate a search request based on keywords of web pages that contain the selected images and submit that search request to a search engine service.04-19-2012
20120093370EVENT DETERMINATION BY ALIGNMENT OF VISUAL AND TRANSACTION DATA - Determination of human behavior from an alignment of data streams includes acquiring visual image primitives from a video input comprising visual information relevant to a human activity. The primitives are temporally aligned to an optimally hypothesized sequence of primitives transformed from a sequence of transactions, as a function of a distance metric between the observed primitive sequence and the transformed primitive sequence. More particularly, transforming includes comparing the distance metric costs and choosing and performing the lowest-cost option among temporally matching the observed primitives to one or more transactions, deleting a primitive, or associating a primitive with a pseudo transaction marker. Alerts are then issued based on analysis of the transformation of primitives.04-19-2012
20110235865ADJUSTABLE RANGE FINDER AND THE METHOD THEREOF - An adjustable range finder and the method thereof are disclosed. The method comprises: projecting a first beam containing information of an object onto a refractive optical element, which comprises a liquid-crystal layer electrically connected to a voltage device and a transmission blazed grating, so as to generate a second beam; and enabling the voltage device to provide a first voltage to the liquid-crystal layer for forming an energy-concentrated M09-29-2011
20100232650MEASUREMENT APPARATUS - In a measurement apparatus, higher-quality measurement is realized when measuring the displacement of a measurement object or capturing a two-dimensional image. In a controller, a light receiving signal of a photodiode is supplied to a displacement measuring unit of a sensor head, and the height of the surface of the measurement object is measured based on the light receiving signal. The controller then determines the image obtaining timing based on the height of the measurement object. Specifically, a focus adjustment value corresponding to the computed height of the measurement object is obtained from a table, and an image obtaining signal is transmitted to an imaging device at the timing at which the focus adjustment value is realized. A length between two points on the measurement object is then computed from the image thus obtained, based on the height of the measurement object.09-16-2010
20120328159SYSTEM AND METHOD FOR DETERMINING CUMULATIVE TOW GAP WIDTH - A system for determining cumulative tow gap width includes an in-process vision system having at least one camera adapted to record images of a composite material and a data analysis computer communicating with and adapted to receive image data from the in-process vision system. The data analysis computer may be adapted to calculate a cumulative gap width of tow gaps in the composite material. A user interface may communicate with and be adapted to receive data analysis results from the data analysis computer. A method for determining the cumulative tow gap width of tow gaps in a composite structure is also disclosed.12-27-2012
20120288157IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND IMAGE PROCESSING PROGRAM - An image processing apparatus includes: a depth calculation portion configured to calculate, based on a first blurred image of an observation area of at least a part of a sample, that is obtained by relatively moving an objective lens of a microscope and the sample along an optical-axis direction of the objective lens and a second blurred image of the observation area that is obtained by relatively moving the objective lens and the sample along a direction that includes a component of the optical-axis direction but is different from the optical-axis direction, a depth of a bright spot in the sample in the optical-axis direction, the bright spot being obtained by coloring a target included in the observation area; and a correction portion configured to correct the first blurred image based on information on the calculated depth of the bright spot.11-15-2012
20110158481DISTANCE MEASURING SYSTEM - A distance measuring system includes a light source module, an image capturing device, a signal processing unit, a selecting unit, and a distance calculating unit. The light source module is configured for emitting first and second infrared light beams in an asynchronous manner. The first infrared light beam has the same intensity as the second infrared light beam. The image capturing device is configured for capturing images of the first and second infrared light beams reflected by a measurement object. The signal processing unit is configured for analyzing the images of the first and second infrared light beams reflected by the measurement object to determine their intensities. The selecting unit is configured for selecting whichever of the first and second reflected infrared light beams has the greater intensity. The distance calculating unit is configured for calculating a distance from the object to the distance measuring system.06-30-2011
20080247602System and Method for Providing Mobile Range Sensing - The present invention provides an improved system and method for estimating the range of objects in images taken from various distances. The method comprises receiving a set of images of a scene having multiple objects from at least one camera in motion. Due to the motion of the camera, each of the images is obtained at a different camera location. An object visible in multiple images is then selected. Data related to the approximate camera positions and orientations, together with the images of the visible object, are used to estimate the location of the object relative to a reference coordinate system. Based on the computed data, a projected location of the visible object is computed and the orientation angle of the camera for each image is refined. Additionally, pairs of cameras at various locations can then be chosen to obtain dense stereo for regions of the image at various ranges. The process is further structured so that, as new images arrive, they are incorporated into the pose adjustment so that the dense stereo results can be updated.10-09-2008
20130170712METHOD AND SYSTEM FOR MEASURING BUMPS BASED ON PHASE AND AMPLITUDE INFORMATION - A device for measuring a height of a microscopic structure, the device may include: a storage circuit arranged to store information that comprises amplitude information and phase information, wherein the information is indicative of a shape and a size of the microscopic structure; a mask generation circuit arranged to threshold pixels of the amplitude information to provide a mask that comprises masked amplitude pixels; a phase information circuit arranged to apply the mask on the phase information to provide masked phase pixels, select, out of the masked phase pixels, selected phase pixels that correspond to a phase criterion, the selected phase pixels having selected phase pixel attribute values, and find, out of the phase information, elected phase pixels that have the selected phase pixel attribute values; and a height calculation circuit arranged to generate a height measurement result based on the elected phase pixels.07-04-2013
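The masking idea above can be made concrete with a short, hedged sketch: threshold the amplitude map into a mask, apply it to the phase map, pick phase pixels meeting a criterion, and convert phase to height. The phase-to-height factor (wavelength over 4*pi, as in typical interferometric measurement) and the percentile criterion are assumptions for illustration, not details taken from the application.

    import numpy as np

    def bump_heights(amplitude, phase, wavelength_nm=532.0, amp_thresh=0.5):
        mask = amplitude >= amp_thresh * amplitude.max()   # masked amplitude pixels
        masked_phase = np.where(mask, phase, np.nan)        # apply mask to the phase map
        crit = np.nanpercentile(masked_phase, 75)           # a simple stand-in phase criterion
        selected = masked_phase >= crit                     # selected phase pixels
        heights = phase * wavelength_nm / (4.0 * np.pi)     # assumed phase-to-height conversion
        return np.where(selected, heights, np.nan)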
20080226130Automated Location Estimation Using Image Analysis - An implementation of automated location estimation using image analysis is described. In this implementation, an image of a place is obtained and matched with previously stored images. The matching may be achieved by employing methods based on a key feature extraction algorithm, color histogram analysis, pattern matching, or other image comparison techniques. Upon determining a match, the location information associated with the matched previously stored images provides the location. The location information may be in the form of location tags or location keywords, and it may be used by the user or by other applications for the purposes of location determination. The technique also allows the user to enter location information to improve accuracy. The location information may also be assigned to previously stored images residing in local and remote databases, so that information or keywords can be assigned to images automatically for users and applications.09-18-2008
20130094714SHAPE MEASURING DEVICE, SHAPE MEASURING METHOD, AND METHOD FOR MANUFACTURING GLASS PLATE - The present invention provides a technology capable of measuring three-dimensional shapes by applying a stereo method even when an object has a specular surface. A shape measuring apparatus 04-18-2013
20130094713STEREO IMAGE PROCESSING APPARATUS AND METHOD OF PROCESSING STEREO IMAGE - Provided are a stereo image processing apparatus and a method of processing a stereo image, wherein the calculation precision of disparity is improved while maintaining a processing amount equal to that of the SAD method. In the stereo image processing apparatus (04-18-2013
20130114861DEVICE AND METHOD FOR RECOGNIZING THREE-DIMENSIONAL POSITION AND ORIENTATION OF ARTICLE - A recognition device and method capable of recognizing the 3D position and orientation of an article at low calculation cost. A 2D image of a region where articles are randomly located is obtained by a camera, and 3D information of generally the same region is obtained by a range sensor. A space in which an article to be taken out is considered to exist is roughly delimited. Based on this limited space, a search condition for finding the article by 2D image processing is set, and 2D positional information of the article on the image is obtained. Then, the 3D point data used to recognize the 3D position and orientation of the article is selected, and a view line in 3D space, extending from the camera to the article, is calculated, whereby the 3D position and orientation of the article is calculated.05-09-2013
20130101175Reimaging Based on Depthmap Information - One or more systems, devices, and/or methods for emphasizing objects in an image, such as a panoramic image, are disclosed. For example, a method includes receiving a depthmap generated from an optical distancing system, wherein the depthmap includes position data and depth data for each of a plurality of points. The optical distancing system measures physical data. The depthmap is overlaid on the panoramic image according to the position data. Data is received that indicates a location on the panoramic image and, accordingly, a first point of the plurality of points that is associated with the location. The depth data of the first point is compared to depth data of surrounding points to identify an area on the panoramic image corresponding to a subset of the surrounding points. The panoramic image is altered with a graphical effect that indicates the location.04-25-2013
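A minimal way to picture the depth comparison described above is a flood fill that grows an area around the selected point wherever neighbouring depths stay close to the seed depth; the tolerance value and the four-connected neighbourhood are assumptions, and the actual comparison used by the application may differ.

    from collections import deque
    import numpy as np

    def area_at(depthmap, seed, tol=0.5):
        """depthmap: 2D array; seed: (row, col); returns a bool mask of the area."""
        h, w = depthmap.shape
        d0 = depthmap[seed]
        mask = np.zeros((h, w), dtype=bool)
        queue = deque([seed])
        while queue:
            r, c = queue.popleft()
            if not (0 <= r < h and 0 <= c < w) or mask[r, c]:
                continue
            if abs(depthmap[r, c] - d0) > tol:   # depth differs too much: outside the area
                continue
            mask[r, c] = True
            queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
        return mask   # region of the panoramic image to alter with a graphical effect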
201301011763D IMAGE ACQUISITION APPARATUS AND METHOD OF CALCULATING DEPTH INFORMATION IN THE 3D IMAGE ACQUISITION APPARATUS - A 3-dimensional (3D) image acquisition apparatus and a method of calculating depth information in the 3D image acquisition apparatus, the 3D image acquisition apparatus including: an optical modulator for modulating light reflected from a subject by N sequentially projected light beams (where N is a natural number of 3 or greater); an image sensor for generating N sub-images by capturing the light modulated by the optical modulator; and a signal processor for calculating depth information regarding the distance to the subject by using the N sub-images.04-25-2013
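For context, the widely used four-phase time-of-flight formulation (a special case of N >= 3) recovers depth from four sub-images captured at 0, 90, 180, and 270 degree phase offsets; the abstract above is not limited to this particular formula, and the modulation frequency below is an arbitrary example value.

    import numpy as np

    C = 299_792_458.0  # speed of light, m/s

    def tof_depth(i0, i90, i180, i270, mod_freq_hz=20e6):
        """Each argument is a sub-image captured at one modulation phase offset."""
        phase = np.arctan2(i90 - i270, i0 - i180)        # phase delay of the reflected light
        phase = np.mod(phase, 2.0 * np.pi)               # wrap into [0, 2*pi)
        return C * phase / (4.0 * np.pi * mod_freq_hz)   # distance to the subject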
20110268322ESTABLISHING COORDINATE SYSTEMS FOR MEASUREMENT - In one aspect, in general, a measurement system includes a projector for illuminating a pattern on a surface of the object, at least two imaging devices for obtaining images of a portion of an object, wherein at least some of the images include representations of one or more illuminated reference markers, an instrument for identifying a predetermined feature of the object, and a computing device for determining first position information associated with the illuminated reference markers represented in the images, determining second position information associated with the instrument, and based on the first position information and the second position information, assigning a predetermined coordinate system of the object to the object.11-03-2011
20130129153IMAGE PROCESSING DEVICE, INFORMATION STORAGE DEVICE, AND IMAGE PROCESSING METHOD - An image processing device includes an information acquisition section that acquires a photographing position of a photographed image or a position of an imaging device as coordinate information, a distribution state acquisition section that acquires a distribution state of a plurality of pieces of coordinate information acquired as the coordinate information, and a keyword assignment section that assigns a keyword that corresponds to the acquired distribution state to the photographed image.05-23-2013
20130136311THREE DIMENSIONAL POSITION OBSERVATION METHOD AND APPARATUS - A three-dimensional position observation apparatus is provided with a lens system, having focusing and diaphragm mechanisms, for forming an image on an imaging plane from light from an observation object. The apparatus includes a beam steering member, disposed in the light path extending from the observation object to the imaging plane, for changing the traveling direction of the observation light into a plurality of different directions, and an image analyzing unit for analyzing the position of the observation object based on the positional relation between the plurality of images formed on the imaging plane by light passing through the beam steering member.05-30-2013
20130142394System And Method For Performing Depth Estimation Utilizing Defocused Pillbox Images - A system and method for performing a depth estimation procedure utilizing defocused pillbox images includes a camera device with a sensor device for capturing pillbox blur images of a photographic target. The camera utilizes a depth estimator for performing a Gaussianization procedure that transforms the pillbox blur images into corresponding Gaussian blur images. The Gaussianization procedure is performed by convolving the pillbox blur images with a Gaussianization kernel to generate the corresponding Gaussian blur images. The depth estimator then utilizes the Gaussian blur images for effectively performing the depth estimation procedure.06-06-2013
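A hedged stand-in for the convolution step described above: convolving a pillbox-blurred image with a Gaussian kernel produces an image whose combined blur is closer to Gaussian, which is the spirit of the Gaussianization procedure. The kernel and sigma below are arbitrary assumptions, not the kernel defined by the application.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def gaussianize(pillbox_blur_image, kernel_sigma=2.0):
        # Convolve with a Gaussian so the total blur better approximates a Gaussian blur.
        return gaussian_filter(pillbox_blur_image.astype(float), sigma=kernel_sigma)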
20130142395DISTANCE MEASUREMENT APPARATUS AND METHOD - A distance measurement apparatus and a distance measurement method are provided. The apparatus includes a line-shaped laser transmitter, an image sensing device, and a processing unit. The line-shaped laser transmitter transmits a line-shaped laser, and the image sensing device senses the line-shaped laser to output a line-shaped laser image. The processing unit receives the line-shaped laser image and segments it into several sub-line-shaped laser images. The processing unit further calculates the vertical position of the laser line in each sub-line-shaped laser image, and outputs distance information for each sub-line-shaped laser image according to a transformation relation.06-06-2013
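A toy version of the processing chain above: split the laser image into vertical sub-images, locate the laser line's vertical position in each, and convert that position to a distance. The linear pixel-to-distance mapping stands in for whatever calibrated transformation relation a real device would use; its coefficients are invented.

    import numpy as np

    def distances_from_laser_image(image, n_segments=8, a=-0.05, b=50.0):
        """image: 2D intensity array containing a bright laser line."""
        results = []
        for sub in np.array_split(image, n_segments, axis=1):   # sub-line-shaped laser images
            row_profile = sub.sum(axis=1)                       # brightness accumulated per row
            laser_row = float(np.argmax(row_profile))           # vertical position of the laser line
            results.append(a * laser_row + b)                   # assumed transformation relation
        return results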
20130142396ESTIMATION OF SHIFT AND SMALL IMAGE DISTORTION - Disclosed is a method of measuring the displacement between a reference region of a reference image and a sample region of a sample image. The method spatially varies the reference region using a one-dimensional filter having complex kernel values, wherein the length (radius) and direction (angle or tangent segment) of the filter are a function of position in the reference region. The method then measures the displacement between the reference region and the sample region by comparing the spatially varied reference region and the sample region.06-06-2013
20110211730IMAGE MEASURING DEVICE FOR CALIBRATION TEST AND METHOD THEREOF - An image measuring device comprises a storage, a processor, an acquiring module, a positioning module, and a determining module. The acquiring module acquires an image of a production object by scanning the production object. The positioning module positions the image of the production object in a coordinate plane according to predefined parameters and acquires the edge of the image of the production object. The determining module determines whether the difference between the positioned image and the predefined parameters exceeds a tolerance. The acquiring module, the positioning module, and the determining module are stored in the storage and controlled by the processor.09-01-2011
20110222734METHODS FOR EVALUATING DISTANCES IN A SCENE AND APPARATUS AND MACHINE READABLE MEDIUM USING THE SAME - A distance evaluation method for evaluating distances from an observation point to objects within an arbitrary detectable range in a scene is disclosed. The method includes the following steps. First, a focus distance is set to correspond to a lower or higher limit of a chosen detection range. Next, an image is captured with an image acquisition system whose transfer function depends on the focus distance. The captured image of the scene is segmented, and a blur metric is computed for each image segment of the captured image. The blur metric is associated with the distance of the objects from the observation point in each image segment.09-15-2011
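One possible blur metric, evaluated per segment, is the variance of the Laplacian (sharp segments have high variance, defocused ones low); the abstract above does not specify this metric, so the snippet is only meant to make the per-segment idea concrete.

    import numpy as np
    from scipy.ndimage import laplace

    def blur_metric_per_segment(image, labels):
        """image: 2D grayscale array; labels: 2D int array of segment ids."""
        lap = laplace(image.astype(float))
        metrics = {}
        for seg_id in np.unique(labels):
            metrics[seg_id] = float(lap[labels == seg_id].var())  # low variance suggests a more blurred segment
        return metrics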
20080199047Indication position calculation system, indicator for indication position calculation system, game system, and indication position calculation method - When the user of a system directs an indicator toward a light-emitting section, an imaging section provided in the indicator acquires an image PA in a given area including the light-emitting section. A determination section determines whether or not each pixel individually satisfies a first condition based on light-reception information relating to each pixel of the image PA, and a calculation section calculates an indication position of the indicator based on the positions of effective pixels in the image using the effective pixels which satisfy the first condition as pixels corresponding to the light-emitting section. The calculation section calculates a representative value of the effective pixels based on identification information relating to the effective pixels while changing weighting on the identification information relating to the effective pixels depending on whether or not the effective pixels satisfy a second condition.08-21-2008
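A simplified reading of the weighting scheme above: pixels brighter than a first threshold count as effective pixels, and their centroid is computed with extra weight given to pixels that also pass a second, stricter threshold. Both thresholds and the weights are placeholders invented for the example.

    import numpy as np

    def indication_position(image, first_thresh=200, second_thresh=240, w_low=1.0, w_high=2.0):
        ys, xs = np.nonzero(image >= first_thresh)                 # effective pixels (first condition)
        if xs.size == 0:
            return None
        vals = image[ys, xs]
        weights = np.where(vals >= second_thresh, w_high, w_low)   # second condition => larger weight
        cx = float(np.sum(xs * weights) / np.sum(weights))
        cy = float(np.sum(ys * weights) / np.sum(weights))
        return cx, cy                                              # representative indication position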
20080199046METHOD OF AND APPARATUS FOR TAKING SOLID IMAGE AND COMPUTER PROGRAM FOR CAUSING COMPUTER TO EXECUTE THE METHOD - A solid image taking apparatus includes a plurality of image taking portions which obtain a plurality of images at each sight point by taking a plurality of images of objects from different sight points, a distance measuring portion which measures object distances which are distances to the objects from the plurality of image taking portions, and a classifying portion which classifies the objects included in the images into a plurality of groups according to the object distances and outputs the result of classification.08-21-2008
20130148859IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, DISPLAY APPARATUS, DISPLAY METHOD, AND COMPUTER READABLE RECORDING MEDIUM - The perspective in the captured image is enhanced. Provided is an image processing apparatus including: a detecting section that detects a subject distance in each region of a captured image; and an image processing section that changes a distance perspective of an image of the region depending on the subject distance detected for each region of the captured image. In an example, the image processing section may increase the blue component ratio of a region whose detected subject distance is larger than a reference value. In an example, the image processing section may blur an image of a region whose detected subject distance is larger than a reference value.06-13-2013
20120275654POSITION AND ORIENTATION MEASUREMENT APPARATUS, POSITION AND ORIENTATION MEASUREMENT METHOD, AND PROGRAM - A position and orientation measurement apparatus for measuring the position and orientation of a target object includes a first search unit which searches a geometric model for a lost model region corresponding to a lost image region in a range image, a determination unit which determines whether or not a point on a geometric model corresponding to a pixel on the range image of the target object falls within the lost model region, a correction unit which corrects combinations of pixels on the range image and corresponding points which are determined to fall within the lost model region, and a calculation unit which calculates the position and orientation of the target object based on the corrected combinations of the pixels on the range image and points on the geometric model.11-01-2012
20100310130FOURIER TRANSFORM DEFLECTOMETRY SYSTEM AND METHOD - The present invention relates to a Fourier transform deflectometry system (12-09-2010
20130156268IMAGE INFORMATION PROCESSING APPARATUS AND METHOD FOR CONTROLLING THE SAME - An image information processing apparatus performs three-dimensional measurement of an object using a captured image obtained by projecting onto the object a projection pattern containing a two-dimensional symbol sequence, obtained by assigning a predetermined symbol to each code in a projection code string in which a plurality of types of codes are arranged two-dimensionally, and capturing an image of the object. The apparatus obtains an imaging pattern by extracting a symbol sequence from the captured image and converts the symbol dots in the imaging pattern into corresponding codes, thereby obtaining an imaging code string. The apparatus obtains a predetermined number of codes according to one sampling feature selected from a plurality of types of sampling features, generates an information code string by arranging the obtained codes, and determines the correspondence between the information code string and a part of the projection code string, thereby performing three-dimensional measurement.06-20-2013
20110311108METHOD FOR DETECTING OBJECTS - A method for detecting objects, wherein two images of a surrounding (12-22-2011
20110311107ACQUISITION OF 3D TOPOGRAPHIC IMAGES OF TOOL MARKS USING NON-LINEAR PHOTOMETRIC STEREO METHOD - There is described a method and 3D image acquisition system for addressing the specular nature of metallic surfaces in general, and ballistic pieces of evidence in particular, using photometric stereo by identifying and solving a plurality of sets of non-linear equations comprising a diffusive term and a specular term to determine a surface normal vector field N(x, y), and using N(x, y) to determine a 3D topography Z(x, y).12-22-2011
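For comparison, the classic linear (Lambertian-only) photometric stereo solve is shown below; the abstract above goes further by solving non-linear equations that add a specular term for metallic surfaces, which this short sketch deliberately omits.

    import numpy as np

    def lambertian_normals(intensities, light_dirs):
        """intensities: K x H x W image stack; light_dirs: K x 3 unit lighting vectors."""
        k, h, w = intensities.shape
        I = intensities.reshape(k, -1)                            # K x (H*W)
        G, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)        # 3 x (H*W); G = albedo * normal
        albedo = np.linalg.norm(G, axis=0)
        normals = G / np.maximum(albedo, 1e-9)                    # unit surface normal field N(x, y)
        return normals.reshape(3, h, w), albedo.reshape(h, w)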
20110311106Method for The Identification of Objects - A method for the identification of objects in a predetermined target area involves recording a first and a second height profile of the target area, wherein the two height profiles are recorded at a predeterminable time interval. A height difference profile is determined from the first and the second height profile. The height difference profile is subdivided into equidistant horizontal height sections. The positions of the centroids of the surface areas enclosed by the respective contour lines of the horizontal height sections are calculated, and the determined height difference profile and the calculated centroids of the surface areas are supplied to a system for classifying objects.12-22-2011
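A bare-bones rendering of the steps above, assuming the two height profiles are 2D arrays on the same grid and using an arbitrary 0.5 m section spacing: difference the profiles, cut the difference into equidistant horizontal sections, and compute the centroid of the area covered by each section.

    import numpy as np

    def section_centroids(height_t0, height_t1, step=0.5):
        diff = height_t1 - height_t0                              # height difference profile
        centroids = []
        for lo in np.arange(0.0, diff.max(), step):               # equidistant height sections
            ys, xs = np.nonzero((diff >= lo) & (diff < lo + step))
            if ys.size:
                centroids.append((lo, float(xs.mean()), float(ys.mean())))  # centroid of the section area
        return centroids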
20110311105METHOD FOR 3D MEASUREMENT OF THE SURFACE OF AN OBJECT, IN PARTICULAR FOR DENTAL PURPOSES - For 3D scanning of the surface of an object by optical double triangulation using the phase-shifting method, more particularly for dental purposes, at least two 3D scans of the same object (12-22-2011
20120020527METHODS FOR MAPPING DEPTH AND SURFACE CURRENT - Systems and methods for acquiring accurate maps of near-shore depth and surface currents are disclosed. An imaging platform is provided which is able to obtain a time series of overhead images of an area of a body of water, with pixel intensity correlated with wave height. The platform may be on a tower, or may be airborne, space-borne, or ship-borne. The imaging modality may be optical, radar, or LIDAR. Image processing corrects the images as needed, so that they are mapped onto a grid of fixed coordinates and the pixel intensities have a nearly linear relationship to wave height. A two-dimensional Fourier transform of each image is obtained, and then the extremum of an objective function is found, wherein the objective function is a function of the depth and the surface current (velocity) vector at each pixel location, and the extremum is sharply peaked at a particular depth value and a particular set of surface current vector values. A pixel-by-pixel map of depths and/or currents can thus be produced.01-26-2012
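The physics the abstract leans on is the linear dispersion relation for surface gravity waves, omega^2 = g * k * tanh(k * h): once the Fourier transform yields a dominant wavenumber k and frequency omega at a location, the depth h can be inverted numerically. The sketch below shows only that inversion, assumes zero surface current, and uses a brute-force scan purely for clarity.

    import numpy as np

    G = 9.81  # gravitational acceleration, m/s^2

    def depth_from_dispersion(k, omega, h_max=100.0, n=10000):
        """Solve omega^2 = G * k * tanh(k * h) for depth h (zero-current assumption)."""
        h = np.linspace(0.01, h_max, n)
        residual = np.abs(omega**2 - G * k * np.tanh(k * h))
        return float(h[np.argmin(residual)])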
20130202157SYSTEMS AND METHODS FOR ESTIMATION OF BUILDING WALL AREA - A wall area estimation system generates an estimated wall area measurement of a building based on received roof measurements (e.g., those generated by, received from, or found in a three-dimensional model of the roof) and a reference distance. The reference distance is a measurement indicative of the distance between a reference point on the roof and a ground surface. This reference distance may be used to determine how far down to extend the walls of the building (e.g., to ground level) when building a three-dimensional digital model of the building, to aid in generating wall area measurements. The resulting wall measurements, roof measurements, and measurements of areas missing from the wall may be used to generate a wall estimate report, or a combined roof and wall estimate report, including various identifiers indicating the different features and measurements based on the three-dimensional model.08-08-2013
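The geometry above boils down to extending each eave edge of the roof model down by the reference distance and summing the resulting rectangles, minus any openings; the function and field names below are invented for the example and are not the system's data model.

    def estimated_wall_area(eave_edge_lengths, reference_distance, opening_areas=()):
        gross = sum(length * reference_distance for length in eave_edge_lengths)  # one rectangle per eave edge
        return gross - sum(opening_areas)                                         # subtract windows, doors, etc.

    # Example: a 10 m x 8 m rectangular footprint, 3 m eave height, 6 m^2 of openings:
    # estimated_wall_area([10, 8, 10, 8], 3.0, [6.0]) -> 102.0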

Patent applications in class Range or distance measuring