OBJECT TRACKING

Subclass of:

348 - Television

Patent class list (only non-empty classes are listed)

Deeper subclasses:

Class / Patent application number | Description | Number of patent applications / Date published
348172000 Centroidal tracking 2
20090015677Beyond Field-of-View Tracked Object Positional Indicators for Television Event Directors and Camera Operators - A system and method for implementing beyond field-of-view tracked object positional indicators for television event directors and camera operators. The present invention includes a camera having a field-of-view. The camera tracks an off-screen object. A coordinate manager blends an on-screen indication of distance that the object is away from said field-of-view. The camera is positioned to avoid the object in the field-of-view.01-15-2009
20110007167High-Update Rate Estimation of Attitude and Angular Rates of a Spacecraft - System and method are provided for estimating attitude and angular rate of a spacecraft with greater accuracy by obtaining star field image data at smaller exposure times and processing the data using algorithms that allow attitude and angular rate to be calculated during the short exposure times.01-13-2011
348170000 Using tracking gate 1
20100321505TARGET TRACKING APPARATUS, IMAGE TRACKING APPARATUS, METHODS OF CONTROLLING OPERATION OF SAME, AND DIGITAL CAMERA - A detection area is decided in a case where a target has gone out-of-frame. If a target is being imaged by a camera continuously, it is determined whether the target has gone out-of-frame. If the target has gone out-of-frame, then the magnitude and direction of motion of the camera are detected. If camera motion is large, it can be concluded that the camera user is imaging the target while tracking it. Accordingly, it can be concluded that the target will again be imaged at the center of the imaging zone. An area defined as a region in which the target will be detected is set at the center of the imaging zone. If camera motion is small in a case where the target goes out-of-frame, it can be concluded that the user is waiting for the target to re-enter the imaging zone and therefore the edge of the imaging zone is set as the detection area.12-23-2010
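
As a rough illustration of the re-detection logic described in the abstract above (not the patented implementation), the following Python sketch places the detection area at the frame centre when camera motion is large and at the exit edge when it is small; the threshold, area size and function name are assumptions:

    def choose_detection_area(camera_motion, exit_edge, frame_w, frame_h,
                              motion_threshold=20.0, area_size=120):
        """Return (x, y, w, h) of the region in which to search for the lost target.

        camera_motion: (dx, dy) estimated camera movement in pixels per frame.
        exit_edge: 'left', 'right', 'top' or 'bottom' side where the target left.
        """
        dx, dy = camera_motion
        magnitude = (dx * dx + dy * dy) ** 0.5

        if magnitude >= motion_threshold:
            # Large camera motion: the user is panning after the target, so it is
            # expected to reappear near the centre of the imaging zone.
            cx, cy = frame_w // 2, frame_h // 2
            return (cx - area_size // 2, cy - area_size // 2, area_size, area_size)

        # Small camera motion: the user is waiting for the target to re-enter,
        # so watch the edge of the imaging zone where it disappeared.
        if exit_edge == 'left':
            return (0, 0, area_size, frame_h)
        if exit_edge == 'right':
            return (frame_w - area_size, 0, area_size, frame_h)
        if exit_edge == 'top':
            return (0, 0, frame_w, area_size)
        return (0, frame_h - area_size, frame_w, area_size)
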
Entries
Document | Title | Date
20100165114DIGITAL PHOTOGRAPHING APPARATUS AND METHOD OF CONTROLLING THE SAME - Provided is a method of controlling a digital photographing apparatus. The method includes: determining a target subject; determining whether the target subject exists in a moving image photographing domain; determining whether a condition of temporarily stopping recording of a moving image based on whether the target subject exists is satisfied; and temporarily stopping the recording of a moving image when the condition is satisfied.07-01-2010
20100165113SUBJECT TRACKING COMPUTER PROGRAM PRODUCT, SUBJECT TRACKING DEVICE AND CAMERA - A computer performs a template-matching step of template-matching an input image with a plurality of template images with different magnifications by a program stored in an object tracking computer program product, selecting the template image with the highest similarity to an image in a predetermined region of the input image among the plurality of template images and extracting the predetermined region of the input image as a matched region, a determination step of determining whether or not the matching results of the template-matching step satisfy update conditions for updating the plurality of template images, and an updating step of updating at least one of the plurality of template images when the update conditions are determined to be satisfied by the determination step.07-01-2010
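
A minimal sketch of this style of multi-magnification template matching, written here with OpenCV purely for illustration; the update rule and confidence threshold are assumptions, not the claimed update conditions:

    import cv2

    def match_multi_scale(frame_gray, templates, update_threshold=0.85):
        """templates: grayscale template images of the subject at different
        magnifications. Returns (best_score, best_rect, updated_templates)."""
        best_score, best_loc, best_tpl = -1.0, None, None
        for tpl in templates:
            result = cv2.matchTemplate(frame_gray, tpl, cv2.TM_CCOEFF_NORMED)
            _, score, _, loc = cv2.minMaxLoc(result)
            if score > best_score:
                best_score, best_loc, best_tpl = score, loc, tpl

        h, w = best_tpl.shape
        best_rect = (best_loc[0], best_loc[1], w, h)

        # Assumed update condition: refresh the templates only when the match is
        # confident, so the templates do not drift onto the background.
        if best_score >= update_threshold:
            x, y = best_loc
            matched = frame_gray[y:y + h, x:x + w]
            templates = [cv2.resize(matched, None, fx=s, fy=s) for s in (0.9, 1.0, 1.1)]
        return best_score, best_rect, templates
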
20100165112AUTOMATIC EXTRACTION OF SECONDARY VIDEO STREAMS - The automatic generation (...)07-01-2010
20090195656INTERACTIVE TRANSCRIPTION SYSTEM AND METHOD - A method and system which seamlessly combines natural way of handwriting (real world) with interactive digital media and technologies (virtual world) for providing a mixed or augmented reality perception to the user is disclosed.08-06-2009
20100053333METHOD FOR DETECTING A MOVING OBJECT IN A SEQUENCE OF IMAGES CAPTURED BY A MOVING CAMERA, COMPUTER SYSTEM AND COMPUTER PROGRAM PRODUCT - The invention relates to a method for detecting a moving object in a sequence of images captured by a moving camera. The method comprises the step of constructing a multiple number of difference images by subtracting image values in corresponding pixels of multiple pairs of images being based on captured images. Further, the method comprises the step of retrieving a moving object by extracting spatial information of pixels in the multiple number of constructed difference images having relatively large image values. In addition, from a pair of images in the construction step an image is a representation of a high resolution image having a higher spatial resolution than original captured images on which the high resolution image is based.03-04-2010
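
The differencing step described above can be illustrated with a short numpy sketch; it assumes the images have already been registered to compensate for camera motion, which the method itself handles separately, and the threshold and voting rule are assumptions:

    import numpy as np

    def moving_object_mask(frames, diff_threshold=25):
        """frames: registered grayscale images (2-D uint8 arrays) based on the
        captured images. Returns a boolean mask of consistently changing pixels."""
        diffs = [np.abs(frames[i].astype(np.int16) - frames[i + 1].astype(np.int16))
                 for i in range(len(frames) - 1)]
        # A pixel is attributed to a moving object when most of the difference
        # images have a relatively large value there.
        votes = np.sum([d > diff_threshold for d in diffs], axis=0)
        return votes >= max(1, len(diffs) // 2)

    def object_bounding_box(mask):
        """Spatial extent of the pixels retained by the mask, or None."""
        ys, xs = np.nonzero(mask)
        if len(xs) == 0:
            return None
        return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())
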
20110193969OBJECT-DETECTING SYSTEM AND METHOD BY USE OF NON-COINCIDENT FIELDS OF LIGHT - The invention provides an object-detecting system and method for detecting information of an object located in an indicating space. In particular, the invention is to capture images relative to the indicating space by use of non-coincident fields of light, and further to determine the information of the object located in the indicating space. The invention also preferably sets the operation times of image-capturing units and the exposure times of light-emitting units in the object-detecting system to improve the quality of the captured images.08-11-2011
20130076913SYSTEM AND METHOD FOR OBJECT IDENTIFICATION AND TRACKING - What is disclosed is a system and method for identifying materials comprising an object captured in a video and for using the identified materials to track that object as it moves across the captured video scene. In one embodiment, a multi-spectral or hyper-spectral sensor is used to capture a spectral image of an object in an area of interest. Pixels in the spectral planes of the spectral images are analyzed to identify a material comprising objects in that area of interest. A location of each of the identified objects is provided to an imaging sensor which then proceeds to track the objects as they move through a scene. Various embodiments are disclosed.03-28-2013
20090122146METHOD AND APPARATUS FOR TRACKING THREE-DIMENSIONAL MOVEMENTS OF AN OBJECT USING A DEPTH SENSING CAMERA - A controller (...)05-14-2009
20130208126COOPERATIVE OPTICAL-IMAGING SENSOR ARRAY - An apparatus and method for providing image primitives, such as edge polarity, edge magnitude, edge orientation, and edge displacement, and derivatives thereof, for an object are described. The data are obtained substantially simultaneously and processed in parallel such that multiple objects can be distinguished from one another in real time.08-15-2013
20100073484Display control apparatus, display control method, and program - A display control apparatus includes a receiving unit that receives a television broadcast signal containing at least remote broadcast image information, a display unit that displays image information contained in the television broadcast signal, a player information acquiring unit that acquires, from the remote broadcast image information, information regarding players in a sports game included in a broadcast image signal, a field information acquiring unit that acquires field information from the remote broadcast image information, a player position information acquiring unit that acquires player position information from the image signal using the player information and the field information, a player information providing unit that provides the acquired player information by displaying the player information on the display unit, and a cursor control function unit that sets, using the player position information, a cursor on one of the players selected using the provided player information and displayed on the display unit.03-25-2010
20130083202Method, System and Computer Program Product for Reducing a Delay From Panning a Camera System - For reducing a delay from panning a camera system, an estimate is received of a physical movement of the camera system. In response to the estimate, a determination is made of whether the camera system is being panned. In response to determining that the camera system is not being panned, most effects of the physical movement are counteracted in a video sequence from the camera system. In response to determining that the camera system is being panned, most effects of the panning are preserved in the video sequence, while concurrently the video sequence is shifted toward a position that balances flexibility in counteracting effects of a subsequent physical movement of the camera system.04-04-2013
20130083201METHODS AND APPARATUS FOR DETERMINING MISALIGNMENT OF FIRST AND SECOND SENSORS - Method and apparatus of the invention determine the spatial and roll mis-alignment of first and second sensors using a common image scene from sensor overlapping Fields of View (FOV). The sensors are dissimilar in one or more of the following respects: Field-Of-View (FOV) size, detector array size, spectral response, gain and level, pixel resolution, dynamic range, and thermal sensitivity.04-04-2013
20100045800Method and Device for Controlling Auto Focusing of a Video Camera by Tracking a Region-of-Interest - The invention concerns an electronic device equipped with a video imaging process capability, which device includes a camera unit arranged to produce image frames from an imaging view which includes a region-of-interest ROI, an adjustable optics arranged in connection with the camera unit in order to focus the ROI on the camera unit, an identifier unit in order to identify a ROI from the image frame, a tracking unit in order to track the ROI from the image frames during the video imaging process and an auto-focus unit arranged to analyze the ROI on the basis of the tracking results provided by the tracking unit in order to adjust the optics. The device is arranged to determine the spatial position of the ROI in the produced image frame without any estimation measures.02-25-2010
20100045799Classifying an Object in a Video Frame - In a digital video surveillance system, a number of processing stages are employed to identify foreground regions representing moving objects in a video sequence. An object tracking stage (...)02-25-2010
20100321504INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD, PROGRAM, AND RECORDING MEDIUM - An information processing system, apparatus and method is disclosed wherein an image of a predetermined region and an image of moving bodies in the region can be picked up and any of images obtained by such image pickup which is desired by a user can be reproduced readily. Sensor images of the predetermined region are stored, and an image of the moving bodies in the region is picked up separately and stored together with reproduction information relating to reproduction of the sensor image from which the moving bodies are detected. When an instruction to reproduce the sensor image is issued, the reproduction information corresponding to the moving body is read out, and the sensor image is reproduced based on the read out reproduction information. The invention can be applied, for example, to a monitoring system.12-23-2010
20130033607METHOD OF AUTOMATICALLY TRACKING AND PHOTOGRAPHING CELESTIAL OBJECTS, AND CAMERA EMPLOYING THIS METHOD - A method of automatically tracking and photographing celestial objects which captures a still image of a celestial object(s) where each celestial object appears stationary simply by making an exposure with a camera directed toward an arbitrary-selected celestial object and fixed with respect to the ground and without using an equatorial, and also a camera that employs this method. The method includes inputting latitude information at a photographic site, photographing azimuth angle information, photographing elevation angle information, attitude information of a photographic apparatus and focal length information of a photographing optical system; calculating movement amounts of the celestial object image relative to the photographic apparatus, for fixing the celestial object image with respect to the predetermined imaging area of an image pickup device, using all of the input information; and obtaining a photographic image by moving at least one of the predetermined imaging area and the celestial object image.02-07-2013
20130070105TRACKING DEVICE, TRACKING METHOD, AND COMPUTER PROGRAM PRODUCT - According to an embodiment, a tracking device includes an acquiring unit, a first calculator, a second calculator, and a setting unit. The acquiring unit images a tracking target object to time-sequentially acquire an image. The first calculator calculates a first likelihood representing a degree of coincidence between a pixel value of each pixel included in a search region within the image and a reference value. The second calculator calculates a difference value between the pixel value of each pixel in the search region and the pixel value of a corresponding pixel in an image in a past frame. The setting unit sets weights of the first likelihood and the difference value so that as a distance between each pixel in the search region and a position of the tracking target object in the past increases, the weight of the first likelihood decreases and the weight of the difference value increases.03-21-2013
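
A compact numpy sketch of the distance-dependent weighting idea follows; the exact weighting curve, the scalar reference value and the names are assumptions rather than the claimed formulation:

    import numpy as np

    def combined_score(search_patch, prev_patch, reference_value, prev_pos, sigma=30.0):
        """search_patch, prev_patch: 2-D pixel arrays over the search region in the
        current and past frame. reference_value: scalar reference pixel value of the
        target. prev_pos: (row, col) of the target in the past frame, in patch
        coordinates. Returns a per-pixel score map; the tracker takes its argmax."""
        h, w = search_patch.shape
        rows, cols = np.mgrid[0:h, 0:w]
        dist = np.hypot(rows - prev_pos[0], cols - prev_pos[1])

        # As distance from the past target position grows, weight the likelihood
        # term less and the frame-difference term more (exponential decay assumed).
        w_like = np.exp(-dist / sigma)
        w_diff = 1.0 - w_like

        likelihood = 1.0 / (1.0 + np.abs(search_patch.astype(float) - reference_value))
        difference = np.abs(search_patch.astype(float) - prev_patch.astype(float)) / 255.0

        return w_like * likelihood + w_diff * difference
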
20130070103SUPER RESOLUTION BINARY IMAGING AND TRACKING SYSTEM - In one aspect, the invention provides an imaging system including an optical system adapted to receive light from a field of view and direct the received light to two image planes. A fixed image detector is optically coupled to one of the image planes to detect at least a portion of the received light and generate image data corresponding to at least a portion of the field of view. A movable (e.g., rotatable) image detector is optically coupled to the other image plane to sample the received light at different locations thereof to generate another set of image data at a higher resolution than the image data obtained by the fixed detector. The system can include a processor for receiving the two sets of image data to generate two images of the field of view.03-21-2013
20130070104SOUND SOURCE MONITORING SYSTEM AND METHOD THEREOF - A sound source monitoring system includes a sound receiving module, a sound detection module, a sound source localization module, and a camera module. The sound receiving module is configured to receive a plurality of sound signals. The sound detection module is for dividing an integrated signal formed by adding the sound signals received by the sound receiving module and normalizing the sum of the sound signals or dividing each of the sound signals into a plurality of sub-bands, calculating a signal-to-noise ratio (SNR) of each sub-band and a background noise, and accordingly determining whether to output the sound signals received by the sound receiving module to the sound source localization module. The sound source localization module is for outputting a sound source location by using the sound signals received by the sound receiving module. The camera module is for shooting an image corresponding to the sound source location.03-21-2013
20130050499INDIRECT TRACKING - A mobile platform in a multi-user system tracks its own position with respect to an object and tracks remote mobile platforms despite them being unknown moving objects. The mobile platform captures images of the object and tracks its position with respect to the object as the position changes over time using the multiple images. The mobile platform receives the position of a remote mobile platform with respect to the object as the second position changes over time. The mobile platform tracks the position of the mobile platform with respect to the remote mobile platform using the position of the mobile platform and the received position of the remote mobile platform. The mobile platform may render virtual content with respect to the object and the remote mobile platform based on the tracked positions or may detect or control interactions or trigger events based on the tracked positions.02-28-2013
20130050500INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING APPARATUS, AND INFORMATION PROCESSING METHOD, UTILIZING AUGMENTED REALITY TECHNIQUE - An exemplary embodiment provides an information processing program. The information processing program includes image obtaining instructions, search target detection instructions, shape recognition instructions, event occurrence instructions, virtual image generation instructions, and display control instructions. The search target detection instructions cause a computer to detect a search target from an image of a subject obtained in accordance with the image obtaining instructions. The shape recognition instructions cause the computer to recognize a shape of the search target. The event occurrence instructions cause the computer to cause an event to occur in a virtual space in accordance with the shape of the search target. The virtual image generation instructions cause the computer to generate a virtual image by shooting the event that occurs in accordance with the event occurrence instructions with a virtual camera arranged in the virtual space.02-28-2013
20130050501Metrology Method and Apparatus, and Device Manufacturing Method - A target structure including a periodic structure is formed on a substrate. An image of the target structure is detected while illuminating the target structure with a beam of radiation, the image being formed using a first part of non-zero order diffracted radiation while excluding zero order diffracted radiation. Intensity values extracted from a region of interest within the image are used to determine a property of the periodic structure. A processing unit recognizes locations of a plurality of boundary features in the image of the target structure to identify regions of interest. The number of boundary features in each direction is at least twice a number of boundaries of periodic structures within the target structure. The accuracy of locating the region is greater than by recognizing only the boundaries of the periodic structure(s).02-28-2013
20130050502MOVING OBJECT TRACKING SYSTEM AND MOVING OBJECT TRACKING METHOD - A moving object tracking system includes an input unit, a detection unit, a creating unit, a weight calculating unit, a calculating unit, and an output unit. The detection unit detects all tracking target moving objects from each of input images input. The creating unit creates a combination of a path that links each moving object detected in a first image to each moving object detected in a second image, a path that links each moving object detected in the first image to an unsuccessful detection in the second image, and a path that links an unsuccessful detection in the first image to each moving object detected in the second image. The calculating unit calculates a value for the combination of the paths to which weights are allocated. The output unit outputs a tracking result.02-28-2013
20100134631APPARATUS AND METHOD FOR REAL TIME IMAGE COMPRESSION FOR PARTICLE TRACKING - A real-time image compression method includes identifying pixels in a set of image data that have a brightness value past a predetermined threshold; determining a position of each identified pixel in the image data; and for each of the identified pixels, defining a vector that includes the brightness value and the position of the identified pixel in the image data.06-03-2010
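
The compression scheme above reduces each frame to a list of (position, brightness) records for bright pixels; a minimal numpy sketch (threshold value arbitrary) might look like this:

    import numpy as np

    def compress_frame(image, threshold=200):
        """image: 2-D uint8 array. Returns an (N, 3) array of (row, col, brightness)
        records for every pixel brighter than the threshold."""
        rows, cols = np.nonzero(image > threshold)
        brightness = image[rows, cols]
        return np.column_stack((rows, cols, brightness)).astype(np.int32)

    def decompress_frame(records, shape):
        """Rebuild a sparse image of the given shape from the stored records."""
        image = np.zeros(shape, dtype=np.uint8)
        image[records[:, 0], records[:, 1]] = records[:, 2]
        return image
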
20130208127AUTO BURST IMAGE CAPTURE METHOD APPLIED TO A MOBILE DEVICE, METHOD FOR TRACKING AN OBJECT APPLIED TO A MOBILE DEVICE, AND RELATED MOBILE DEVICE - The present invention discloses a mobile device, where the mobile device includes an image sensing unit, a touch screen, and a processor. The image sensing unit is configured to receive at least an image of a scene comprising at least an object. The touch screen is configured to display at least an image of a scene and received at least one user input. The processor is configured to identify the object in response to a first user input corresponding to the object is received, determine characteristics of the object, track the object in the scene according to the characteristics of the object, and capture a number of images of the scene according to a motion state of the object. The motion state is determined according to variance of the characteristics of the object in consecutive images received by the image sensing unit.08-15-2013
20130057701DEVICE AND METHOD FOR IMAGE PROCESSING - An image processing device includes: an extractor configured to extract a region of interest which includes a point of interest and satisfies a specified condition in a first image frame; a divider configured to divide the region of interest into a first subregion including the point of interest and a second subregion not including the point of interest at a narrow portion of the region of interest; and a specifying unit configured to specify a specified pixel in the first subregion as a point of interest of a second image frame.03-07-2013
20130057702OBJECT RECOGNITION AND TRACKING BASED APPARATUS AND METHOD - Object recognition and tracking methods, devices and systems are disclosed. One embodiment of the present invention pertains to a method for associating an object with an event and a condition triggering the event in response to a receipt of a representation of the object to be recognized and tracked. The method also comprises tracking a movement of the object and storing information associated with the movement. The method further comprises, generating data associated with the object based on the information associated with the movement of the object in response to occurrence of the condition triggering the event.03-07-2013
20130057700LINE TRACKING WITH AUTOMATIC MODEL INITIALIZATION BY GRAPH MATCHING AND CYCLE DETECTION - A vision based tracking system in a mobile platform tracks objects using groups of detected lines. The tracking system detects lines in a captured image of the object to be tracked. Groups of lines are formed from the detected lines. The groups of lines may be formed by computing intersection points of the detected lines and using intersection points to identified connected lines, where the groups of lines are formed using connected lines. A graph of the detected lines may be constructed and intersection points identified. Interesting subgraphs are generated using the connections and the group of lines is formed with the interesting subgraphs. Once the groups of lines are formed, the groups of lines are used to track the object, e.g., by comparing the groups of lines in a current image of the object to groups of lines in a previous image of the object.03-07-2013
20100097476Method and Apparatus for Optimizing Capture Device Settings Through Depth Information - A method for adjusting image capture settings for an image capture device is provided. The method initiates with capturing depth information of a scene at the image capture device. Depth regions are identified based on the captured depth information. Then, an image capture setting is adjusted independently for each of the depth regions. An image of the scene is captured with the image capture device, wherein the image capture setting is applied to each of the depth regions when the image of the scene is captured.04-22-2010
20090268033Method for estimating connection relation among wide-area distributed camera and program for estimating connection relation - An estimating method and program for estimating the connection relation among distributed cameras to use the estimated relation for monitoring and tracking many objects in a wide area. The feature of the invention is that it does not need any object association among cameras by recognition of a camera image. Each camera independently detects and tracks objects entering/exiting an observation image, and the image coordinates and times of the moments when each object is detected first and last in the image are acquired. All the acquired data observed by each camera is tentatively associated with all the acquired data observed by all the cameras before the detection of all the acquired data observed by that camera, and the associated items of the data associated for each elapsed time are counted. By using the fact that the elapsed time of correctly associated data with the movement of the same object has a significant peak in the histogram showing the relation between the elapsed time and the number of observations, the connection relation among the fields of view of cameras (presence/absence of overlap between fields of view, image coordinates at which entrance/exit occurs, elapsed time, and pass probability) is acquired according to the peak detection result.10-29-2009
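
The elapsed-time histogram idea can be illustrated with a short sketch: every exit event at one camera is tentatively paired with every later entry event at another camera, and a clear peak in the histogram of elapsed times suggests the two fields of view are connected. The bin width and the peak test below are assumptions:

    import numpy as np

    def transit_time_peak(exit_times_a, entry_times_b, max_gap=60.0, bin_width=1.0):
        """exit_times_a, entry_times_b: event timestamps (seconds) from two cameras.
        Returns (peak_elapsed_time, count) when a clear peak exists, else None."""
        gaps = [t_b - t_a
                for t_a in exit_times_a
                for t_b in entry_times_b
                if 0.0 < t_b - t_a <= max_gap]
        if not gaps:
            return None

        edges = np.arange(0.0, max_gap + bin_width, bin_width)
        counts, edges = np.histogram(gaps, bins=edges)
        peak = int(np.argmax(counts))

        # Require the peak to stand well above the average bin count; otherwise
        # the tentative associations are probably coincidental.
        if counts[peak] < 3 * max(1.0, float(counts.mean())):
            return None
        return float(edges[peak] + bin_width / 2.0), int(counts[peak])
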
20130063605IMAGING APPARATUS, IMAGE PROCESSING METHOD, AND RECORDING MEDIUM FOR RECORDING PROGRAM THEREON - An imaging apparatus includes: a setting processor which sets an image area including a tracking shooting object image in a pan-blur shooting as a tracking object image as a tracking image area, with respect to a specific frame image; a searching processor, by respectively setting a scanning area for a plurality of frame images following the frame image used to set the tracking image area, and respectively moving the scanning area on each corresponding frame image of the frame images, and makes a comparison of a characteristic amount of an image between the tracking image area and the scanning area, which respectively obtains a scanning area where the characteristic amount of the image is similar to the image in the tracking image area as a tracking object existing area including the tracking object image, with respect to the frame images; a measuring processor, by dividing a difference between a coordinate of the tracking object existing area obtained with respect to one frame image of the frame images and that obtained with respect to a next one by a certain time interval, which measures a moving speed of the tracking object image on a monitor screen; and a displaying processor which displays a speed display mark corresponding to the moving speed of the tracking object image on the monitor screen.03-14-2013
20130162839TRACKING DEVICE AND TRACKING METHOD FOR PROHIBITING A TRACKING OPERATION WHEN A TRACKED SUBJECT IS OBSTRUCTED - A tracking device includes an imaging part for repeatedly acquiring image data on a subject image, a tracking processing part for setting a tracking position based on first image data to perform tracking processing on a subject in the tracking position based on second image data, and a relative distance information calculating part for calculating relative distance information using (1) a determined information about a distance to the tracking position and (2) a determined information about a distance to a surrounding area around the tracking position. When the tracking processing part determines that another subject in the surrounding area is located at the closer range than the subject in the tracking position, the tracking processing part prohibits tracking processing.06-27-2013
20130162837Image Capture - An apparatus including a processor configured to change automatically which pixels are used to define a target captured image in response to relative movement of a sensor frame of reference defined by a camera sensor and an image frame of reference defined by the image.06-27-2013
20130162838Transformation between Image and Map Coordinates - Systems and methods for transformations between image and map coordinates, such as those associated with a video surveillance system, are described herein. An example of a method described herein includes selecting a reference point within the image with known image coordinates and map coordinates, computing at least one transformation parameter with respect to a location and a height of the camera and the reference point, detecting a target location to be tracked within the image, determining image coordinates of the target location, and computing map coordinates of the target location based on the image coordinates of the target location and the at least one transformation parameter.06-27-2013
20090278937Video data processing - An embodiment of the present invention relates to systems and methods for dynamically detecting and visualizing actions and/or events in video data streams. In one embodiment, a method involves dynamically detecting and extracting objects and attributes relating to the objects from a video data stream by using action recognition filtering for attribute detection and time series analysis for relation detection among the extracted objects. In addition, the method may involve dynamically generating a multi-field video visualization along a time axis by depicting the video data stream as a series of frames at a relatively sparse or dense interval, and by continuously rendering the attributes relating to the objects with substantially continuous abstract illustrations. Finally, a method may also involve dynamically combining detection, and extraction of objects and combining with multi-field visualization in a video perpetuo gram (VPG), which may show a video stream in parallel, and which allows for real-time display and interaction.11-12-2009
20090009606Tracking device, focus adjustment device, image-capturing device, and tracking method - A tracking device includes: a first tracking control unit that tracks an object based upon focus adjustment states that are detected by a focus detection unit in a plurality of focus detection positions; a second tracking control unit that tracks the object based upon image information that is outputted by an image-capturing unit and reference image information that has been set as a reference; a setting unit that sets a degree to which focus adjustment based upon a focus adjustment state detected by the focus detection unit is temporarily prohibited; and a control unit that selects one of the first tracking control unit and the second tracking control unit to be used for tracking the object, based upon the degree that has been set by the setting unit.01-08-2009
20120113267IMAGE CAPTURING APPARATUS, METHOD, AND RECORDING MEDIUM CAPABLE OF CONTINUOUSLY CAPTURING OBJECT - An image capturing apparatus includes an image capturing unit, a designating unit configured to designate a main-area or a position to which a moving object is to reach, in each captured image captured by the image capturing unit, an image capturing control unit configured to control the image capturing unit to continuously capture the moving object at a predetermined frame rate, a position specifying unit configured to specify a position of the moving object in the image captured by the image capturing unit, and a frame rate control unit configured to control the predetermined frame rate, based on the specified position of the moving object, and either the main-area or position.05-10-2012
20120113268IMAGE GENERATION APPARATUS, IMAGE GENERATION METHOD AND STORAGE MEDIUM - An image generation apparatus comprises: a first-image obtaining unit adapted to obtain an image obtained by causing an image capturing unit to capture the grip unit controlled so as to place the target object in one predetermined orientation of a plurality of predetermined orientations with respect to the image capturing unit and the target object in the one predetermined orientation as a grip-state image; a second-image obtaining unit adapted to obtain, as a non-grip-state image corresponding to the one predetermined orientation, an image of the grip unit that does not grip the target object and is placed in a predetermined orientation coincident with the orientation controlled to place the target object in the one predetermined orientation; and an image generation unit adapted to generate a target object image including only the target object for the one predetermined orientation based on a difference between the grip-state image and the non-grip-state image.05-10-2012
20110001831Video Camera - A video camera includes an imager. An imager repeatedly outputs an object scene image captured on an imaging surface. A determiner repeatedly determines whether or not one or at least two dynamic objects exist in the object scene by referring to the object scene image outputted from the imager. A first searcher searches a specific dynamic object that satisfies a predetermined condition from the one or at least two dynamic objects when a determination result of the determiner is updated from a negative result to an affirmative result. An adjuster adjusts an imaging condition by tracking the specific dynamic object discovered by the first searcher.01-06-2011
20110279683Automatic Motion Triggered Camera with Improved Triggering - An automatic motion triggered camera with improved triggering to produce images with the target animal substantially centered in the field of view. A single motion detector, a camera, and image memory are controlled by a processor that detects changes in the motion detector output and generates a trigger signal for the camera. The trigger follows a first stage of waiting until a predetermined minimum threshold of movement is detected and a second stage of waiting until a further change in the output signal indicative of an animal being substantially centered in the field of view of the camera is detected. Compensation for camera capture delay time can be included by sampling a plurality of motion detector signals, computing an estimated time at which the animal will be centered, and triggering the camera prior to that time.11-17-2011
20110279682Methods for Target Tracking, Classification and Identification by Using Foveal Sensors - A method of operating a sensor system may include the steps of sensing a predetermined area including a first object to obtain first sensor data at a first predetermined time, sensing the substantially same predetermined area including the first object to obtain second sensor data at a second predetermined time, determining a difference between the first sensor data and the second sensor data, identifying a target based upon the difference between the first sensor data and the second sensor data, identifying a material of the target and determining a target of interest to track based upon the material of the target.11-17-2011
20110285855METHOD OF AUTOMATICALLY TRACKING AND PHOTOGRAPHING CELESTIAL OBJECTS AND CAMERA EMPLOYING THIS METHOD - A method of automatically tracking and photographing a celestial object so that the celestial object image, which is formed on an imaging surface of an image sensor via a photographing optical system, becomes stationary relative to a predetermined imaging area of the imaging surface of the image sensor during a tracking and photographing operation. The method includes performing a preliminary photographing operation at a predetermined preliminary-photographing exposure time with the photographic apparatus directed toward the celestial object and with a celestial-body auto tracking action suspended to obtain a preliminary image before automatically tracking and photographing the celestial object, calculating a moving direction and a moving speed of the celestial object image from the preliminary image that is obtained by the preliminary photographing operation, and automatically tracking and photographing the celestial object based on the moving direction and the moving speed of the celestial object image.11-24-2011
20110285854System and method for theatrical followspot control interface - There is provided a system and method for controlling a tracking device to follow a location of a performer on a stage. The tracking device may comprise a lighting fixture such as a high output video projector or an automated mechanical moving light such as a DMX lighting fixture to provide a followspot, or a microphone to record audio from a particular performer. The method comprises capturing a video feed of the stage using a camera, presenting the video feed on a display of a control device, receiving input data indicating a position of the performer in the video feed, translating the input data into the location of the performer on the stage, and adjusting the tracking device to follow the location of the performer on the stage. The control device thereby provides lighting operators with an intuitive interface readily implemented using cost effective commodity hardware.11-24-2011
20110292217METHOD OF AUTOMATICALLY TRACKING AND PHOTOGRAPHING CELESTIAL OBJECTS AND PHOTOGRAPHIC APPARATUS EMPLOYING THIS METHOD - A method of automatically tracking and photographing a celestial object, is provided, which moves relative to a photographic apparatus due to diurnal motion so that the celestial object image formed on an image sensor becomes stationary during a celestial-object auto-tracking photographing operation. The method includes inputting photographing azimuth angle and elevation angle information of the photographic apparatus; calculating preliminary-tracking drive control data based on the photographing azimuth angle and elevation angle information; obtaining first and second preliminary images corresponding to commencement and termination points of the preliminary tracking operation; calculating a deviation amount between a celestial object image in the first preliminary image and a corresponding celestial object image in the second preliminary image; calculating, from the deviation amount, actual-tracking drive control data with the deviation amount cancelled; and performing the celestial-object auto-tracking photographing operation based on the actual-tracking drive control data.12-01-2011
20100097474SMOKE DETECTING METHOD AND SYSTEM - A smoke detecting method and system are provided. The smoke detecting method and system capture a plurality of images; determine whether a moving object exists in the plurality of images; select the images having the moving object to be analyzed; analyze whether the moving object is moving toward a specific direction and a displacement of a base point of the moving object; and determine the moving object as a smoke when the moving object is moving toward the specific direction and the displacement is less than a threshold value.04-22-2010
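
An illustrative check corresponding to the two conditions in the abstract (drift in a specific direction, here assumed to be upward, and a nearly fixed base point) could look like the following; thresholds are assumptions:

    import numpy as np

    def looks_like_smoke(centroids, base_points, displacement_threshold=5.0):
        """centroids, base_points: per-frame (x, y) positions of the candidate
        moving region and of its base point (lowest point of the region)."""
        centroids = np.asarray(centroids, dtype=float)
        base_points = np.asarray(base_points, dtype=float)

        # Condition 1: the region drifts in the specific direction, assumed here
        # to be upward (y decreases in image coordinates).
        moving_up = centroids[-1, 1] < centroids[0, 1]

        # Condition 2: the base point barely moves, since smoke spreads from a
        # roughly fixed source.
        base_displacement = float(np.linalg.norm(base_points[-1] - base_points[0]))

        return moving_up and base_displacement < displacement_threshold
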
20100091110SINGLE CAMERA TRACKER - A camera tracker, in which an image captured by a camera oriented to capture images across a surface is accessed. A region in which an object detected within the accessed image is positioned is determined from among multiple defined regions within a field of view of the camera. User input is determined based on the determined region and an application is controlled based on the determined user input.04-15-2010
20100007740Statistical modeling and performance characterization of a real-time dual camera surveillance system - The present invention relates to a method for visually detecting and tracking an object through a space. The method chooses modules for restricting a search function within the space to regions with a high probability of significant change, the search function operating on images supplied by a camera. The method also derives statistical models for errors, including quantifying an indexing step performed by an indexing module, and tuning system parameters. Further, the method applies a likelihood model for candidate hypothesis evaluation and object parameters estimation for locating the object.01-14-2010
20100033579Image Shooting Device And Image Playback Device - An image shooting device includes: an image sensing device that, by sequential shooting, outputs a signal representing a series of shot images; a tracking processing portion that, based on the output signal of the image sensing device, sequentially detects the position of a tracking target on the series of shot images and thereby tracks the tracking target on the series of shot images; a clipping processing portion that, for each shot image, based on the detected position, sets a clipping region in the shot image and extracts the image within the clipping region as a clipped image or outputs clipping information indicating the position and extent of the clipping region; and a tracking evaluation portion that, based on the output signal of the image sensing device, evaluates the degree of reliability or ease of tracking by the tracking processing portion. The clipping processing portion varies the extent of the clipping region in accordance with the evaluated degree.02-11-2010
20090096871TRACKING DEVICE, TRACKING METHOD, TRACKING DEVICE CONTROL PROGRAM, AND COMPUTER-READABLE RECORDING MEDIUM - A tracking device includes feature information detection means for detecting feature information from a photographic image and tracking object matching means for comparing the feature information with tracking object information in which feature information of a plurality of figures are registered so that the feature information corresponds to a priority indicating tracking order of the feature information and for determining whether or not the feature information is information of the tracking object. The tracking device also includes priority acquisition means for acquiring the priority of the feature information detected from the tracking object information where it is determined that the feature information detected is the information of the tracking object and control means for controlling the photographing section, based on the priority acquired, so as to continuously include, in the photographic image from which the feature information is detected, feature information that has a highest priority in the photographic image.04-16-2009
20090153666TRACKING DEVICE, AUTOMATIC FOCUSING DEVICE, AND CAMERA - A tracking device includes: a light measurement device for light measuring an object; a focus detection device for performing focus detection of the object by an optical system; and a tracking control device for tracking the object based on light measurement information from the light measurement device and focus detection information from the focus detection device corresponding to the light measurement information.06-18-2009
20100302377IMAGE SENSOR, APPARATUS AND METHOD FOR DETECTING MOVEMENT DIRECTION OF OBJECT - An image sensor includes a base, at least a plurality of first sensing elements and second sensing elements formed on the base. The first sensing elements and the second sensing elements are arranged in an alternate fashion. The first sensing elements cooperatively form a first noncontinuous planar sensing surface facing toward an object, and the second sensing elements cooperatively form a second noncontinuous planar sensing surface facing toward the object. The first noncontinuous planar sensing surface is lower than the second noncontinuous planar sensing surface.12-02-2010
20090278936METHOD FOR AUTOSTEREOSCOPICALLY PRODUCING THREE-DIMENSIONAL IMAGE INFORMATION FROM SCANNED SUB-PIXEL EXTRACTS AND DEVICE FOR CARRYING OUT SAID METHOD - A method for autostereoscopically producing three-dimensional image information from scanned subpixel extracts uses a multiplex track method (MTV) having a separating raster (TR) obliquely extended with respect to a matrix screen (MB) and an electronic tracking (TS) of viewing areas based on two separated image views (L, R), that adjacently disposes two or three subpixels (SP) of each pixel (P) of the two image views (L, R) in the actual subpixel extraction (SPA), continuously and alternatingly preserving each subpixel address and disposes said subpixels (SP) in an overlapping manner on each other with an offset, whereby the resolution loss affects the subpixels (SP) only. The crosstalk resulting from the inclination of the separating raster (TR) is reduced by a special structure of the subpixel extraction (SPA), wherein the resolution homogenisation in two directions of the screen is simultaneously preserved. The formation of the actual subpixel extraction (SPA) is carried out according to multiplex schemes (MUX...)11-12-2009
20090295926IMAGE PICKUP APPARATUS - An image pickup apparatus includes a first detection unit configured to detect movement information of an object which is a tracking target from a movie generated at an image pickup element; a second detection unit configured to detect movement information of an apparatus main body; and a determination unit configured to determine that tracking cannot be continued when the difference between a motion vector of the apparatus main body and a motion vector of the object is larger than a certain threshold.12-03-2009
20090278938Cognitive Change Detection System - A method of detecting a changed condition within a geographical space from a moving vehicle. Images of that geographic space are memorialized together with their GPS coordinates. The same geographic space is traversed from the moving vehicle while accessing the route's GPS coordinates. The memorialized images are played back by coordinating the GPS data of the memorialized images with that of the traversed geographic space such that the memorialized images are viewed simultaneously with the geographic space being traversed. An observer traveling within the moving vehicle can compare the memorialized images with those being traversed in order to identify changed conditions.11-12-2009
20090262197MOVING OBJECT IMAGE TRACKING APPARATUS AND METHOD - An apparatus includes a computation unit computing a moving velocity of a moving object (MO) by differentiation on a first angle of a first-rotation unit and a second angle of a second-rotation unit, a setting unit setting a first-angular velocity of the first-rotation unit and a second-angular velocity of the second-rotation unit as angular-velocity-instruction values when the MO falls outside a correction range, and setting the second-angular velocity and a third-angular velocity as the angular-velocity-instruction values when the MO falls within the correction range, a detection unit detecting a fourth-angular velocity and a fifth-angular velocity of the first-rotation unit and the second-rotation unit, and a control unit controlling a driving unit to eliminate a difference between the fourth-angular velocity and an angular velocity corresponding to the first-rotation unit, and controlling the driving unit to eliminate a difference between the fifth-angular velocity and an angular velocity corresponding to the second-rotation unit.10-22-2009
20100220194Image processing device, image processing system, camera device, image processing method, and program - An image processing device includes: a detecting section configured to detect a plurality of objects by type from one input image; a generating section configured to generate image data on each of the objects detected by the detecting section as images of respective different picture frames by types of the objects; and a processing section configured to subject the images of the different picture frames, the images of the different picture frames being generated by the generating section, to processing according to one of a setting and a request.09-02-2010
20100277596AUTOMATIC TRACKING APPARATUS AND AUTOMATIC TRACKING METHOD - An automatic tracking apparatus controls a direction of an imager and adjusts an imaging region so as to include a tracked object in imaged video by a pan/tilt/zoom controller and a pan/tilt driver. Here, a tracking speed is acquired from control information for controlling the imager by a tracking speed decision portion. Then, when a tracking speed is a predetermined value or less, a zoom scaling factor is calculated according to the tracking speed by a zoom scaling factor calculator and a zoom scaling factor of the imager is changed by the pan/tilt/zoom controller and a zoom lens driver.11-04-2010
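
A toy version of the speed-dependent zoom rule might look like the sketch below; the mapping from tracking speed to zoom factor is an assumption, since the abstract only states that the factor is calculated according to the tracking speed:

    def zoom_factor_for_speed(tracking_speed, speed_limit=10.0, zoom_min=1.0, zoom_max=4.0):
        """tracking_speed: current pan/tilt tracking speed (e.g. degrees per second).
        Above the limit the widest zoom is kept so a fast target is not lost; at or
        below it the zoom is interpolated from the speed."""
        if tracking_speed > speed_limit:
            return zoom_min
        ratio = 1.0 - (tracking_speed / speed_limit)
        return zoom_min + ratio * (zoom_max - zoom_min)
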
20120293665IMAGING DEVICE INCLUDING TARGET TRACKING FUNCTION - An imaging device includes an imaging unit capturing an image of a subject, and tracks, through images captured in time series, an area in which a specific target appears. The device includes a parameter acquiring unit acquiring a photographic parameter from the imaging unit, a target area determining unit determining an area of a captured image including the specific target as a target area, a track area adjusting unit setting a track frame for a track area to track the target area including the specific target and adjusting a size of the track frame based on the photographic parameter, and a track area searching unit searching the captured image for the track area, while moving the size-adjusted track frame, based on a similarity between a characteristic amount of the track area of a current captured image and that of the target area of a previous captured image.11-22-2012
20120293664CHARACTER RECOGNITION SYSTEM AND METHOD FOR RAIL CONTAINERS - A system and method, which enables precise identification of characters contained in vehicle license plates, container I.D, chassis I.D, aircraft serial number and other such identification markings. The system can process these identified characters and operate devices, such as access control operations, traffic systems and vehicle and container tracking and management systems, and provide records of all markings together with their images.11-22-2012
20120293663DEVICE FOR DETERMINING DISAPPEARING DIRECTION AND METHOD THEREOF, APPARATUS FOR VIDEO CAMERA CALIBRATION AND METHOD THEREOF - A disappearing direction determination device and method, a video camera calibration apparatus and method, a video camera and a computer program product are provided. The device comprises: a moving target detecting unit for detecting in the video image a moving target area where a moving object locates; a feature point extracting unit for extracting at least one feature point on the moving object in the detected moving target area; a moving trajectory obtaining unit for tracking a movement of the feature point in a predetermined number of video image frames to obtain a movement trajectory of the feature point; and a disappearing direction determining unit for determining, according to the movement trajectories of one or more moving objects in the video image, a disappearing direction pointed by a major moving direction of the moving objects. Thus, a disappearing direction and video camera gesture parameters can be determined accurately.11-22-2012
20100103269DETERMINING ORIENTATION IN AN EXTERNAL REFERENCE FRAME - Orientation in an external reference is determined. An external-frame acceleration for a device is determined, the external-frame acceleration being in an external reference frame relative to the device. An internal-frame acceleration for the device is determined, the internal-frame acceleration being in an internal reference frame relative to the device. An orientation of the device is determined based on a comparison between a direction of the external-frame acceleration and a direction of the internal-frame acceleration.04-29-2010
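
Reduced to two dimensions for illustration, the comparison of acceleration directions described above amounts to taking the angle between the two vectors; the following sketch (names assumed) returns that rotation as a heading:

    import math

    def heading_from_accelerations(external_acc, internal_acc):
        """external_acc, internal_acc: (x, y) components of the same acceleration
        expressed in the external and internal reference frames. Returns the
        rotation (radians) from the internal frame to the external frame, or None
        when either vector is too small to define a direction."""
        ex, ey = external_acc
        ix, iy = internal_acc
        if math.hypot(ex, ey) < 1e-6 or math.hypot(ix, iy) < 1e-6:
            return None
        angle = math.atan2(ey, ex) - math.atan2(iy, ix)
        # Normalise to the range (-pi, pi].
        return math.atan2(math.sin(angle), math.cos(angle))
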
20100141773DEVICE FOR RECOGNIZING MOTION AND METHOD OF RECOGNIZING MOTION USING THE SAME - The present invention provides a device for recognizing a motion. The device for recognizing a motion includes: an input device that includes a light source and an inertial sensor; and a motion recognition mechanism that extracts the trajectory of a user's motion by detecting position change of the light source for a user's motion section that is determined in response to a sensing signal of the inertial sensor.06-10-2010
20100141772IMAGE PROCESSING DEVICE AND METHOD, IMAGE PROCESSING SYSTEM, AND IMAGE PROCESSING PROGRAM - An image processing device includes: an entire image display control portion that performs control to display an entire image of a predetermined region in an entire image display window; and a cutout image display control portion that performs control to enlarge a plurality of tracking subjects included in the entire image and display the tracking subjects in a cutout image display window. The cutout image display control portion performs the control in such a manner that one cutout image including the tracking subjects is displayed in the cutout image display window in a case where relative distances among the tracking subjects are equal to or smaller than a predetermined value, and that two cutout images including the respective tracking subjects are displayed in the cutout image display window in a case where the relative distances among the tracking subjects are larger than the predetermined value.06-10-2010
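
A simple sketch of the one-cutout/two-cutout rule for the case of two tracked subjects (margin and distance threshold assumed):

    def cutout_regions(subject_boxes, distance_threshold=200.0, margin=40):
        """subject_boxes: (x, y, w, h) boxes of two tracked subjects in the entire
        image. Returns one combined cutout or one cutout per subject."""
        centers = [(x + w / 2.0, y + h / 2.0) for x, y, w, h in subject_boxes]
        (x0, y0), (x1, y1) = centers[0], centers[1]
        distance = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5

        if distance <= distance_threshold:
            # Relative distance small: one cutout that covers both subjects.
            x_min = min(x for x, _, _, _ in subject_boxes) - margin
            y_min = min(y for _, y, _, _ in subject_boxes) - margin
            x_max = max(x + w for x, _, w, _ in subject_boxes) + margin
            y_max = max(y + h for _, y, _, h in subject_boxes) + margin
            return [(x_min, y_min, x_max - x_min, y_max - y_min)]

        # Relative distance large: a separate cutout for each subject.
        return [(x - margin, y - margin, w + 2 * margin, h + 2 * margin)
                for x, y, w, h in subject_boxes]
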
20080239081SYSTEM AND METHOD FOR TRACKING AN INPUT DEVICE USING A DISPLAY SCREEN IN CAPTURED FRAMES OF IMAGE DATA - A system and method for tracking an input device uses positional information of a display screen in frames of image data to determine the relative position of the input device with respect to the display screen.10-02-2008
20090315997IMAGE PROCESSING METHOD, IMAGING APPARATUS, AND STORAGE MEDIUM STORING CONTROL PROGRAM OF IMAGE PROCESSING METHOD EXECUTABLE BY COMPUTER - An image processing method is provided for detecting the position of a specific subject from a movie and combining a display of detection result indicating the detected position with the movie. The image processing method includes a step of determining, depending on a display time of the detection result, whether the detection result should be continuously displayed, when the subject cannot be detected during the display of the detection result combined with the movie.12-24-2009
20100271484Object tracking using momentum and acceleration vectors in a motion estimation system - There is provided a method and apparatus for motion estimation in a sequence of video images. The method comprises a) subdividing each field or frame of a sequence of video images into a plurality of blocks, b) assigning to each block in each video field or frame a respective set of candidate motion vectors, c) determining for each block in a current video field or frame, which of its respective candidate motion vectors produces a best match to a block in a previous video field or frame, d) forming a motion vector field for the current video field or frame using the thus determined best match vectors for each block, and e) forming a further motion vector field by storing a candidate motion vector derived from the best match vector at a block location offset by a distance derived from the candidate motion vector. Finally, steps a) to e) are repeated for a video field or frame following the current video field or frame. The set of candidate motion vectors assigned at step b) to a block in the following video field or frame includes the candidates stored at that block location at step e) during the current video field or frame. The method enables a block- or tile-based motion estimator to improve its accuracy by introducing true motion vector candidates derived from the physical behaviour of real world objects.10-28-2010
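
A bare-bones sketch of the candidate-testing step is shown below: each block evaluates a small candidate set by sum of absolute differences and keeps the best vector; the momentum candidates of step e) would be fed in as extra entries of the candidate list for the next frame. Block size and data layout are assumptions:

    import numpy as np

    def best_candidate(prev_frame, cur_frame, block_xy, candidates, block=16):
        """prev_frame, cur_frame: 2-D uint8 arrays. block_xy: top-left (x, y) of the
        block in the current frame. candidates: list of (dx, dy) motion vectors.
        Returns the candidate with the lowest sum of absolute differences (SAD)."""
        x, y = block_xy
        cur_block = cur_frame[y:y + block, x:x + block].astype(np.int32)
        h, w = prev_frame.shape

        best_vec, best_sad = (0, 0), None
        for dx, dy in candidates:
            px, py = x + dx, y + dy
            if px < 0 or py < 0 or px + block > w or py + block > h:
                continue  # candidate points outside the previous frame
            ref = prev_frame[py:py + block, px:px + block].astype(np.int32)
            sad = int(np.abs(cur_block - ref).sum())
            if best_sad is None or sad < best_sad:
                best_vec, best_sad = (dx, dy), sad
        return best_vec
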
20090079833TECHNIQUE FOR ALLOWING THE MODIFICATION OF THE AUDIO CHARACTERISTICS OF ITEMS APPEARING IN AN INTERACTIVE VIDEO USING RFID TAGS - The present solution can include a method for allowing the selective modification of audio characteristics of items appearing in a video. In this method, a RFID tag can be loaded with audio characteristics specific to a sound-producing element. The RFID tag can then be attached to an item that corresponds to the sound-producing element. The video and audio of the area including the item can be recorded. The audio characteristics can be automatically obtained by scanning the RFID tag. The audio characteristics can then be embedded within the video so that the audio characteristics are available when the item appears in the video.03-26-2009
20090160942Tagging And Path Reconstruction Method Utilizing Unique Identification And The System Thereof - Disclosed is a tagging and path reconstruction method utilizing unique identification characteristics and the system thereof. The tagging and path reconstruction system comprises a plurality of readers for reading identification information having unique IDs, a plurality of cameras for taking object's image data, and a server. The server includes an identifying and tagging module, an interlaced fusion and identification module, and a path reconstruction module. The identifying and tagging module identifies and tags the object image data with unique IDs. The interlaced fusion and identification module filters, checks and updates the tagged object image data. The path reconstruction module recovers the tagged object image data, lets them regress to their original identity data, and reconstructs the motion path of each object.06-25-2009
20130120585AUTOMATIC TRACKING CAMERA SYSTEM - An automatic tracking camera system includes: an image pickup unit; a driving unit for rotating the image pickup unit in panning or tilting direction; a signal receiver for receiving an object information signal; an image signal processor for recognizing an object in an image and detecting motion of the object in the image; a controller for controlling the image pickup unit, the driving unit, and the image signal processor; and a memory for storing, for each passageway, standby positions at which the image signal processor detects the object. The controller calculates an approaching passageway and angle of the object; selects a corresponding standby position from the standby positions stored in the memory; drives the driving unit and a lens apparatus to the standby position; and controls, when the image signal processor recognizes the object, the driving unit and the lens apparatus to automatically track the object based on detected information.05-16-2013
20130120586AUTOMATIC TRACKING CAMERA SYSTEM - An automatic tracking camera system includes: a rotating unit for panning and tilting an image pickup unit including a lens apparatus and an image pickup apparatus; a tracking object detector; a motion vector detector for detecting a motion vector of the object to be tracked; a capture position setting unit for setting a capture position of the object to be tracked in the picked up image; and a controller for controlling drive of the rotating unit. The controller controls the rotating unit in a capture mode to capture the object to be tracked at the capture position based on the motion vector detected by the motion vector detector after the tracking object detector has detected the object to be tracked in the picked up image, and a maintenance mode to continuously capture the object to be tracked at the capture position after the capture mode.05-16-2013
20090128632CAMERA AND IMAGE PROCESSOR - A video signal produced through a shooting operation of a camera is sent to a motion detecting unit, which detects motion included in the video signal, sets the area of the motion as a space motion area, and inputs information of the area to a distance determination circuit. The distance determination circuit calculates the distance between the motion area and the camera by use of a parallax signal produced from stereo cameras disposed in the camera, and sends the information of the space motion area and the distance to a mask determination circuit. The mask determination circuit conducts a comparison between the space motion area information and the space mask area information, and between the distance information of the motion area and that of the space mask area, and thereby compares the three-dimensional positions of the detected motion and the mask to determine the positional relationship between them.05-21-2009
20090015676Recognition and Tracking Using Invisible Junctions - The present invention uses invisible junctions which are a set of local features unique to every page of the electronic document to match the captured image to a part of an electronic document. The present invention includes: an image capture device, a feature extraction and recognition system, and a database. When an electronic document is printed, the feature extraction and recognition system captures an image of the document page. The features in the captured image are then extracted, indexed and stored in the database. Given a query image, usually a small patch of some document page captured by a low resolution image capture device, the features in the query image are extracted and compared against those stored in the database to identify the query image. The present invention also includes methods for recognizing and tracking the viewing region and look-at point corresponding to the input query image. This information is combined with a rendering of the original input document to generate a new graphical user interface for the user. This user interface can be displayed on a conventional browser or even on the display of an image capture device.01-15-2009
20090079834CAMERA - A camera of the present invention is provided with a gapless prism which obtains the wavelength spectrum of an optical image of a targeted object, an IR-cut filter which is inserted into and retracted from an optical path, and a Bch image sensor, a Gch image sensor, and an Rch+IR image sensor which are arranged for the respective spectra separated by the gapless prism. An infrared ray separator conducts separation into a specific spectrum signal and an infrared ray signal from the infrared-ray-mixture image data output from the Rch+IR image sensor when the IR-cut filter is retracted from the optical path.03-26-2009
20090086027Method And System For Providing Images And Graphics - A system detects a subject(s) or recipient(s), or an object(s) associated therewith, and sends a communication, for example, a message, to the subject, or object associated therewith, collectively, the “subject.” The communication is projected as an image proximate to the subject, and is projected to the subject when stationary or in motion, and in some cases, the communication is selected based on the direction of travel of the subject.04-02-2009
20110141288Object Tracking Method And Apparatus For A Non-Overlapping-Sensor Network - An object tracking method for a non-overlapping-sensor network works in a sensor network. The method may comprise a training phase and a detection phase. In the training phase, a plurality of sensor information measured by the sensors in the sensor network is used as training samples. At least one entrance/exit is marked out within the measurement range of each sensor. At least three characteristic functions, including the spatial relation among the sensors in the sensor network, the difference of movement time, and the similarity in appearance, are estimated by an automatic learning method. The at least three characteristic functions are used as the principles for object tracking and relationship linking in the detection phase.06-16-2011
20110228100OBJECT TRACKING DEVICE AND METHOD OF CONTROLLING OPERATION OF THE SAME - Disclosed is a technique for accurately tracking a tracking target object. A disparity map image in which the disparity of each pixel of an object is shown is generated from a three-dimensional object image. A detection range is determined such that an object disposed on the front side of a pedestrian, which is a tracking target object, in the depth direction is excluded from the generated disparity map image. The pedestrian, which is a tracking target object, is detected in the determined detection range. In this way, it is possible to prevent a bike driver disposed on the front side of the pedestrian in the depth direction from being tracked.09-22-2011
20110228099SYSTEM AND METHOD FOR TRACKING COOPERATIVE, NON-INCANDESCENT SOURCES - A system and method for tracking a cooperative, non-incandescent source may include collecting scene images of a scene that includes the cooperative, non-incandescent source and background clutter. First and second scene images of the scene may be generated over distinct spectral bands. The first and second scene images may be imaged onto respective first and second focal plane arrays. In one embodiment, the imaging may be substantially simultaneous. The first and second scene image frame data respectively generated by the first and second focal plane arrays may be processed to produce resultant scene image frame data. The scene image frame data may result in reducing magnitude of scene image frame data representative of the background clutter more than magnitude of scene image frame data representative of the cooperative, non-incandescent source.09-22-2011
20090213222SYSTEM FOR TRACKING A MOVING OBJECT, BY USING PARTICLE FILTERING - In a tracking system for tracking a moving object by using a particle filter, the particle filter is configured to arrange particles initially, in a standby state, in a given background region provided in the screen of a camera and to rearrange the particles with respect to the moving object in accordance with a change in likelihood that the object has with respect to the particles.08-27-2009
20090231436Method and apparatus for tracking with identification - Combining and fusing the tracking of people and objects with image processing and the identification of the people and objects being tracked. Also, conditions of a person, object, area or facility can be detected, evaluated and monitored.09-17-2009
20100013935MULTIPLE TARGET TRACKING SYSTEM INCORPORATING MERGE, SPLIT AND REACQUISITION HYPOTHESES - A tracking system having a video detector for associating observations of blobs and objects and deriving objects' or blobs' paths. Hypotheses may be computed by the system for merging, splitting and reacquisition of the observations. There may be objects tracked among the observations, and best paths selected as trajectories of corresponding objects. The observations may be placed in a sliding window containing a series of observations inferred from a collection of frames for improving the accuracy of the tracking (or data association). The processed observations and data may be represented graphically.01-21-2010
20120105647CONTROL DEVICE, CONTROL METHOD, PROGRAM, AND CONTROL SYSTEM - A control device, a control method, a program, and a control system are provided whereby a more intelligent imaging operation, more useful for a user, can be realized, particularly in the event of performing an automatic imaging operation using a subject detection result within an imaged image. Control relating to the imaging operation is performed according to a positional relation between an edge region, set as an edge portion region of an image frame, and a subject detected within the image frame. Control relating to the automatic imaging operation can thus be performed based on a determination reference that is the positional relation between the edge region and a detected subject, and a more intelligent automatic imaging operation, more useful for a user, can be realized.05-03-2012
20090315996Video tracking systems and methods employing cognitive vision - Video tracking systems and methods include a peripheral master tracking process integrated with one or more tunnel tracking processes. The video tracking systems and methods utilize video data to detect and/or track separately several stationary or moving objects in a manner of tunnel vision. The video tracking system includes a master peripheral tracker for monitoring a scene and detecting an object, and a first tunnel tracker initiated by the master peripheral tracker, wherein the first tunnel tracker is dedicated to track one detected object.12-24-2009
20100149342IMAGING APPARATUS - An imaging apparatus has an imaging section that creates video data from an optical image of a subject field; a feature acquiring section that acquires features of a main subject in the subject field; a feature holding section that holds the acquired features; a tracking processing section that performs a predetermined process for tracking the main subject using the created video data and the held features; and a controlling section that validates or invalidates an operation of the feature acquiring section, and the controlling section invalidates the operation of the feature acquiring section when the imaging apparatus satisfies a predetermined condition.06-17-2010
20100149341CORRECTING ANGLE ERROR IN A TRACKING SYSTEM - To correct an angle error, acceleration data is received corresponding to a tracked object in a reference frame of the tracked object. Positional data of the tracked object is received from a positional sensor, and positional sensor acceleration data is computed from the received positional data. The acceleration data is transformed into a positional sensor reference frame using a rotation estimate. An amount of error between the transformed acceleration data and the positional sensor acceleration data is determined. The rotation estimate is updated responsive to the determined amount of error.06-17-2010
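As a rough sketch of this style of correction (the planar rotation, the fixed correction gain, and the synthetic sensor data are assumptions, not the math of the application above), the following Python snippet transforms object-frame acceleration into the positional sensor's frame with a rotation estimate, measures the residual angle against acceleration derived from positional data, and nudges the estimate toward zero error.

```python
import numpy as np

def rot2d(theta):
    """2-D rotation matrix for angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def update_rotation_estimate(theta_est, accel_obj, accel_sensor, gain=0.1):
    """Rotate object-frame acceleration into the sensor frame using the
    current estimate, measure the residual angular error against the
    acceleration derived from positional data, and correct the estimate."""
    accel_in_sensor = rot2d(theta_est) @ accel_obj
    error = (np.arctan2(accel_sensor[1], accel_sensor[0])
             - np.arctan2(accel_in_sensor[1], accel_in_sensor[0]))
    error = (error + np.pi) % (2 * np.pi) - np.pi   # wrap into [-pi, pi]
    return theta_est + gain * error

# Toy usage: the sensor-frame acceleration is the object-frame acceleration
# rotated by an unknown 0.3 rad; the estimate converges toward that value.
# In practice accel_sensor would come from double-differencing the sampled
# positions reported by the positional sensor.
accel_obj = np.array([1.0, 2.0])
accel_sensor = rot2d(0.3) @ accel_obj
theta_est = 0.0
for _ in range(50):
    theta_est = update_rotation_estimate(theta_est, accel_obj, accel_sensor)
print(round(theta_est, 3))   # ~0.298
```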
20100149343PHOTOGRAPHING METHOD AND APPARATUS USING FACE POSE ESTIMATION OF FACE - Provided are a photographing method and apparatus using face pose estimation. The photographing method includes: detecting a face area from an input image; estimating pose information in the detected face area; and determining a face direction based on the estimated pose information and recording the input image according to the face direction. Accordingly, when a face of a subject looks at something other than a camera, a picture is not taken, and thus a failed photograph is prevented.06-17-2010
20100149340COMPENSATING FOR BLOOMING OF A SHAPE IN AN IMAGE - A number of brightness samples are taken outside a shape to compensate for blooming of the shape in an image generated by a digital camera. The brightness of each of the samples is determined and averaged, and the size of the shape is adjusted based on the difference between the brightness of the shape and the average of the brightness samples.06-17-2010
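A toy rendition of the brightness-sampling idea follows; the circular shape, the sample count, and the linear radius adjustment are illustrative assumptions rather than the method claimed above.

```python
import numpy as np

def sample_ring_brightness(image, cx, cy, radius, n_samples=16, offset=3):
    """Average brightness of points sampled just outside a circular shape."""
    angles = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    h, w = image.shape
    vals = []
    for a in angles:
        x = int(round(cx + (radius + offset) * np.cos(a)))
        y = int(round(cy + (radius + offset) * np.sin(a)))
        if 0 <= x < w and 0 <= y < h:
            vals.append(float(image[y, x]))
    return float(np.mean(vals)) if vals else 0.0

def compensate_blooming(image, cx, cy, radius, shape_brightness, k=0.02):
    """Shrink (or grow) the detected radius in proportion to how much the
    shape out-shines its immediate background."""
    background = sample_ring_brightness(image, cx, cy, radius)
    delta = shape_brightness - background
    # The brighter the shape relative to its surroundings, the more the
    # imaged disc has bloomed, so reduce the measured radius accordingly.
    return max(1.0, radius - k * delta)

# Toy usage: a bright 255-level disc of nominal radius 20 on a dim background.
img = np.full((100, 100), 30, dtype=np.uint8)
yy, xx = np.ogrid[:100, :100]
img[(xx - 50) ** 2 + (yy - 50) ** 2 <= 20 ** 2] = 255
print(compensate_blooming(img, 50, 50, 20, shape_brightness=255.0))  # 15.5
```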
20100157065AUTOFOCUS SYSTEM - An autofocus system includes: an image pickup unit; an autofocus unit performing focus adjustment such that a subject in an AF area is to be in focus in the photographing image; a tracking unit moving the AF area to follow the movement of the subject in the range of the photographing image; a reference pattern registering unit registering the subject image in focus as a reference pattern; and a matched image detecting unit detecting a subject which is most closely matched with the reference pattern in the photographing image. When the amount of movement of the detected subject in a screen is less than a given value, the reference pattern is updated with the image of the subject, and an AF frame is updated to follow the moved subject. When the amount of movement is equal to or more than the given value, a tracking operation stops.06-24-2010
20100157063SYSTEM AND METHOD FOR CREATING AND MANIPULATING SYNTHETIC ENVIRONMENTS - Disclosed herein are systems, computer-implemented methods, and tangible computer-readable media for synthesizing a virtual window. The method includes receiving an environment feed, selecting video elements of the environment feed, displaying the selected video elements on a virtual window in a window casing, selecting non-video elements of the environment feed, and outputting the selected non-video elements coordinated with the displayed video elements. Environment feeds can include synthetic and natural elements. The method can further toggle the virtual window between displaying the selected elements and being transparent. The method can track user motion and adapt the displayed selected elements on the virtual window based on the tracked user motion. The method can further detect a user in close proximity to the virtual window, receive an interaction from the detected user, and adapt the displayed selected elements on the virtual window based on the received interaction.06-24-2010
20100157064OBJECT TRACKING SYSTEM, METHOD AND SMART NODE USING ACTIVE CAMERA HANDOFF - If an active smart node detects that an object leaves a center region of an FOV for a boundary region, the active smart node predicts a possible path of the object. When the object gets out of the FOV, the active smart node predicts that the object will appear in the FOV of another smart node according to the possible path and a spatial relation between cameras. The active smart node notifies another smart node to become a semi-active smart node which determines an image characteristic similarity between the object and a new object and returns to the active smart node if a condition is satisfied. The active smart node compares the returned characteristic similarity, an object discovery time at the semi-active smart node, and a distance between the active smart node and the semi-active smart node to calculate a possibility.06-24-2010
20100171836IMAGE CAPTURING APPARATUS, CONTROL METHOD THEREOF, AND PROGRAM - An image capturing apparatus comprising an object detection unit which detects a specific object from an image signal, and a control unit which performs first control corresponding to the specific object when the object detection unit detects the specific object, and performs second control different from the first control when the object detection unit does not detect the specific object, wherein when a state in which the specific object is detected by the object detection unit transits to a state in which the specific object becomes undetectable, the control unit changes, based on information before the specific object becomes undetectable, at least either of a time for which the first control is held and a transition speed when transiting from the first control to the second control.07-08-2010
20100002083MOVING OBJECT AUTOMATIC TRACKING APPARATUS - Even when a moving object intrudes into an area other than an initially registered preset position, movement of the moving object can be detected and tracked.01-07-2010
20100238296MOBILE OBJECT IMAGE TRACKING APPARATUS - A mobile object image tracking apparatus includes: a base; a first gimbal; a second gimbal; an image guiding passage configured to guide an image received through an input opening portion of the second gimbal to the base; an image capturing device; an angle sensor; a tracking error detector configured to detect a first tracking error of image data; a delay circuit; a tracking error calculator configured to calculate a second tracking error based on the first tracking error, a delayed first rotation angle, and a delayed second rotation angle; an angular velocity processor configured to generate a first target angular velocity and a second target angular velocity based on the first rotation angle, the second rotation angle, and the second tracking error; and an actuator controller configured to control the first gimbal and the second gimbal based on the first and second target angular velocities.09-23-2010
20130128054System and Method for Controlling Fixtures Based on Tracking Data - Systems and methods are provided for using tracking data to control the functions of an automated fixture. Examples of automated fixtures include light fixtures and camera fixtures. A method includes obtaining a first position of a tracking unit. The tracking unit includes an inertial measurement unit and a visual indicator configured to be tracked by a camera. A first distance is computed between the automated fixture and the first position and it is used to set a function of the automated fixture to a first setting. A second position of the tracking unit is obtained. A second distance between the automated fixture and the second position is computed, and the second distance is used to set the function of the automated fixture to a second setting.05-23-2013
20090115850MOBILE OBJECT IMAGE TRACKING APPARATUS AND METHOD - A mobile object image tracking apparatus includes at least one unit rotating about at least one axis, a camera sensor photographing a mobile object to acquire image data, a unit detecting a tracking error as a tracking error detection value, a unit detecting an angle of the rotary unit, a unit estimating the tracking error as a tracking error estimation value, a unit selecting the tracking error detection value when the mobile object falls within the field of view, and selecting the tracking error estimation value when the mobile object falls outside the field of view, a unit computing an angular velocity instruction value used to drive the rotating unit to track the mobile object, a unit detecting an angular velocity of the rotary unit, and a unit controlling the rotating unit to make zero a difference between the angular velocity instruction value and the angular velocity.05-07-2009
20100302378TRACKING SYSTEM CALIBRATION USING OBJECT POSITION AND ORIENTATION - To calibrate a tracking system, a computing device receives positional data of a tracked object from an optical sensor as the object is pointed approximately toward the optical sensor. The computing device computes a first angle of the object with respect to an optical axis of the optical sensor using the received positional data. The computing device receives inertial data corresponding to the object, wherein a second angle of the object with respect to a plane normal to gravity can be computed from the inertial data. The computing device determines a pitch of the optical sensor using the first angle and the second angle.12-02-2010
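The geometric core of such a calibration fits in a few lines; in the sketch below the sign conventions and the simple subtraction are assumptions chosen for illustration, not the application's actual formula.

```python
import math

def estimate_sensor_pitch(obj_position, obj_tilt_down):
    """Estimate the optical sensor's pitch from (1) where the tracked object
    appears relative to the optical axis and (2) the object's tilt relative
    to horizontal, taken from inertial data.

    Assumed conventions (illustrative only): obj_position = (x, y, z) with z
    the depth along the optical axis and y positive above the axis;
    obj_tilt_down is positive when the object points downward; the returned
    pitch is positive when the sensor is tilted down.
    """
    x, y, z = obj_position
    theta_cam = math.atan2(y, z)        # first angle: object vs. optical axis
    return theta_cam - obj_tilt_down    # pitch-down of the sensor

# Toy usage: the object is held level but appears 5 units above the axis at a
# depth of 20 units, so the sensor must be pitched down by about 14 degrees.
print(math.degrees(estimate_sensor_pitch((0.0, 5.0, 20.0), 0.0)))
```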
20110128387SYSTEMS AND METHODS FOR MAINTAINING MULTIPLE OBJECTS WITHIN A CAMERA FIELD-OF-VIEW - In one embodiment, a system and method for maintaining objects within a camera field of view include identifying constraints to be enforced, each constraint relating to an attribute of the viewed objects, identifying a priority rank for the constraints such that more important constraints have a higher priority than less important constraints, and determining the set of solutions that satisfy the constraints relative to the order of their priority rank such that solutions that satisfy lower ranking constraints are only considered viable if they also satisfy any higher ranking constraints, each solution providing an indication as to how to control the camera to maintain the objects within the camera field of view.06-02-2011
20110043639Image Sensing Apparatus And Image Processing Apparatus - An image sensing apparatus includes an imaging unit which outputs image data of images obtained by photography, and a photography control unit which controls the imaging unit to perform sequential photography of a plurality of target images including a specific object as a subject. The photography control unit sets a photography interval of the plurality of target images in accordance with a moving speed of the specific object.02-24-2011
20110025854CONTROL DEVICE, OPERATION SETTING METHOD, AND PROGRAM - A control device includes an operation decision unit which inputs the information on image data and a subject detected in an image of the image data and decides the operations to be executed based on the position of the subject in the image in the case of a predetermined limitation position state.02-03-2011
20110115920MULTI-STATE TARGET TRACKING METHOD AND SYSTEM - A multi-state target tracking method and a multi-state target tracking system are provided. The method detects a crowd density of a plurality of images in a video stream and compares the detected crowd density with a threshold when receiving the video stream, so as to determine a tracking mode used for detecting the targets in the images. When the detected crowd density is less than the threshold, a background model is used to track the targets in the images. When the detected crowd density is greater than or equal to the threshold, a non-background model is used to track the targets in the images.05-19-2011
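The mode switch described above reduces to a density estimate and a comparison; in the sketch below the foreground-pixel ratio used as the density measure, the threshold value, and the placeholder tracker classes are all assumptions for illustration.

```python
import numpy as np

DENSITY_THRESHOLD = 0.25  # assumed fraction of moving pixels

def crowd_density(frame, background, diff_threshold=30):
    """Fraction of pixels that differ noticeably from the background model,
    used here as a crude stand-in for crowd density."""
    moving = np.abs(frame.astype(np.int16) - background.astype(np.int16)) > diff_threshold
    return float(moving.mean())

class BackgroundModelTracker:
    def track(self, frame):
        return "tracked with background subtraction"

class NonBackgroundModelTracker:
    def track(self, frame):
        return "tracked with an appearance/detection model"

def select_tracker(frame, background):
    """Low density -> background-model tracking; high density -> non-background model."""
    if crowd_density(frame, background) < DENSITY_THRESHOLD:
        return BackgroundModelTracker()
    return NonBackgroundModelTracker()

# Toy usage: a frame where 40% of pixels changed picks the non-background tracker.
bg = np.zeros((10, 10), dtype=np.uint8)
frame = bg.copy()
frame[:4, :] = 200
print(type(select_tracker(frame, bg)).__name__)
```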
20090322885IMAGE PROCESSING METHOD, IMAGING APPARATUS, AND STORAGE MEDIUM STORING CONTROL PROGRAM OF IMAGE PROCESSING METHOD EXECUTABLE BY COMPUTER - An image processing method is provided for detecting the position of a specific subject from a movie and combining a display of detection result indicating the detected position with the movie. The image processing method includes a step of determining, depending on a display time of the detection result, whether the detection result should be continuously displayed, when the subject cannot be detected during the display of the detection result combined with the movie.12-31-2009
20110141287VIDEO PROCESSING SYSTEM PROVIDING ENHANCED TRACKING FEATURES FOR MOVING OBJECTS OUTSIDE OF A VIEWABLE WINDOW AND RELATED METHODS - A video processing system may include a display and a video processor coupled to the display. The video processor may be configured to display a georeferenced video feed on the display defining a viewable area, determine actual geospatial location data for a selected moving object within the viewable area, and generate estimated geospatial location data along a predicted path for the moving object when the moving object is no longer within the viewable area and based upon the actual geospatial location data. The video processor may be further configured to define a successively expanding search area for the moving object when the moving object is no longer within the viewable window and based upon the estimated geospatial location data, and search within the successively expanding search area for the moving object when the successively expanding search area is within the viewable area.06-16-2011
20100188511IMAGING APPARATUS, SUBJECT TRACKING METHOD AND STORAGE MEDIUM - If there is no movement of the imaging apparatus and no subject presence estimation region, normal tracking setting is accomplished (step S…).07-29-2010
20100134632APPARATUS FOR TRACKING AN OBJECT USING A MOVING CAMERA AND METHOD THEREOF - Provided is an apparatus for tracking an object, including: a dynamic area extracting unit that extracts an object to be tracked from an image frame collected through an image collecting apparatus; an object modeling unit that models the object to be tracked extracted through the dynamic area extracting unit to calculate the color distribution of the object to be tracked; and an object tracking unit that calculates the color distribution of a next image frame collected through the image collecting apparatus, after calculating the color distribution of the object to be tracked, and calculates a posterior probability that each pixel belongs to the object to be tracked in the next image frame based on the calculated color distribution of the next image frame and the color distribution of the object to be tracked to track the object to be tracked.06-03-2010
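The per-pixel posterior described above is essentially Bayes' rule over color distributions; the sketch below uses grayscale intensity histograms, 32 bins, and equal priors as illustrative assumptions.

```python
import numpy as np

N_BINS = 32  # assumed histogram resolution over 8-bit intensities

def color_histogram(pixels):
    """Normalized histogram of pixel intensities -> p(color | class)."""
    hist, _ = np.histogram(pixels, bins=N_BINS, range=(0, 256))
    return hist.astype(np.float64) / max(hist.sum(), 1)

def posterior_map(frame, object_hist, background_hist, prior_object=0.5):
    """Per-pixel posterior p(object | color) via Bayes' rule."""
    bins = np.clip(frame.astype(np.int32) * N_BINS // 256, 0, N_BINS - 1)
    p_obj = object_hist[bins] * prior_object
    p_bg = background_hist[bins] * (1.0 - prior_object)
    return p_obj / np.maximum(p_obj + p_bg, 1e-12)

# Toy usage: model a bright object on a dark background, then locate it.
object_pixels = np.random.randint(180, 256, size=1000)
background_pixels = np.random.randint(0, 80, size=1000)
obj_hist = color_histogram(object_pixels)
bg_hist = color_histogram(background_pixels)

frame = np.random.randint(0, 80, size=(60, 60))
frame[20:40, 20:40] = np.random.randint(180, 256, size=(20, 20))
post = posterior_map(frame, obj_hist, bg_hist)
ys, xs = np.nonzero(post > 0.5)
print(int(xs.mean()), int(ys.mean()))  # roughly the object's center (~29, ~29)
```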
20120120249CONTROL APPARATUS, IMAGING SYSTEM, CONTROL METHOD, AND PROGRAM - A control apparatus, an imaging system, a control method, and a program are provided in which, when performing automatic image-recording, subjects which seem to be present around an imaging apparatus can be recorded as evenly as possible. An automatic recording operation for recording, upon detection of a subject from an image obtained by imaging, data representing an image containing the subject is performed. On that basis, if it is determined, on the basis of image-recording history information, that the transition to a subject configuration different from that used in the last image-recording is to be performed, a movable mechanism unit is moved to change an imaging field-of-view range, thereby obtaining a different subject configuration.05-17-2012
20120120247Image sensing module and optical signal processing device - An image sensing module includes an image sensor and a sensing controller. The image sensor captures an image with a first frequency. The sensing controller determines whether the image sensor has detected an object according to the image. When the sensing controller determines that the image sensor has detected an object, the sensing controller switches the image sensor to capture the image with a second frequency for the image sensor to continuously detect the location of the object. By setting the first frequency higher than the second frequency, the image-sensing module detects the object in real time and the power consumption of the image-sensing module continuously detecting the location of the object is saved as well.05-17-2012
20100118149SYSTEM AND METHOD FOR TRACKING AND MONITORING PERSONNEL AND EQUIPMENT - A system and method are described for using RFID tags to track and monitor personnel and equipment in large environments and environments that are prone to multipath fading. The system scans the environment by selecting local interrogation zones where RFID tags may be located. Multiple antennae are used, each transmitting a portion of an activation signal, such that the activation signal will be formed in the selected local interrogation zone. Different subsets of the antennae are successively selected, each targeting the selected local interrogation zone, to repeat the activation signal for each subset of antennae. RFID tags in the local interrogation zone will receive the portions of the activation signals and process them to determine whether the full activation signal was destined for that local interrogation zone for each of the subsets of antennae. An activated RFID tag will transmit its tag information, including any data collected from sensors connected to the tag, back to the system. The system and method use the location information of the various RFID tags in the global environment and combine that with data received through cameras and other sensors to provide a display with the RFID tag location information superimposed. The data collected about various regions of the environment may be transmitted back to the RFID tags to provide the personnel with information about their surroundings.05-13-2010
20100271485IMAGE PHOTOGRAPHING APPARATUS AND METHOD OF CONTROLLING THE SAME - A method of controlling an image photographing apparatus to track a subject using a non-viewable pixel region of an image sensor includes detecting a motion vector of a subject when a moving image is photographed, determining whether a non-viewable pixel region of an image sensor is present in the direction of the detected motion vector of the subject, and moving a photographing region to the non-viewable pixel region so as to track the motion of the subject. Accordingly, the moving subject may be tracked within a range of a predetermined Field of View (FOV) of the image photographing apparatus without an additional hardware system.10-28-2010
20100097475INTER-CAMERA LINK RELATION INFORMATION GENERATING APPARATUS - The apparatus comprises a feature quantity extraction part for extracting a feature quantity of a subject from video captured by plural cameras, an In/Out point extraction part for extracting In/Out points indicating points in which a subject appears and disappears in each video captured, an In/Out region formation part for forming In/Out regions based on the In/Out points extracted, a correlation value calculation part for calculating a correlation value by obtaining the total sum of similarities every feature quantity of the subject in each of the plural combinations of In/Out points included in the In/Out regions, a frequency histogram creation part for creating a frequency histogram based on the correlation value, and a link relation information generation part for extracting a peak of the frequency histogram and estimating the presence or absence of a link relation between the plural cameras and generating link relation information.04-22-2010
20120200715IMAGING APPARATUS, CONTROL METHOD THEREOF, AND STORAGE MEDIUM - An imaging apparatus includes an imaging unit configured to acquire image data, a positioning unit configured to perform positioning processing for acquiring positional information, a first control unit configured to control the positioning unit to perform the positioning processing at a first time interval, and to control an association unit to associate the positional information with the image data, and a second control unit configured to control the positioning unit to perform the positioning processing at a second time interval, and to control a generation unit to generate the log data based on the positional information, wherein the second control unit changes a time interval based on the acquisition status of the positional information.08-09-2012
20080278584Moving Object Detection Apparatus And Method By Using Optical Flow Analysis - Disclosed is a moving object detection apparatus and method by using optical flow analysis. The apparatus includes four modules of image capturing, image aligning, pixel matching, and moving object detection. Plural images are successively inputted under a camera. Based on neighboring images, the frame relationship between the neighboring images is estimated. With the frame relationship, a set of warping parameters is further estimated. Based on the warping parameters, the background areas of the neighboring images are aligned to obtain an aligned previous image. After the alignment, a corresponding motion vector for each pixel on the neighboring images is traced. The location in the scene of the moving object can be correctly determined by analyzing all the information generated from the optical flow.11-13-2008
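A rough Python/OpenCV sketch of this align-then-flow pipeline follows; ORB feature matching with an affine model stands in for whatever warping-parameter estimation the original uses, and the flow-magnitude threshold is an assumption.

```python
import cv2
import numpy as np

def align_background(prev_gray, curr_gray):
    """Estimate a global (camera) motion between frames with ORB feature
    matches and warp the previous frame onto the current one."""
    orb = cv2.ORB_create(500)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    src = np.float32([kp1[m.queryIdx].pt for m in matches])
    dst = np.float32([kp2[m.trainIdx].pt for m in matches])
    warp, _ = cv2.estimateAffinePartial2D(src, dst)
    h, w = curr_gray.shape
    return cv2.warpAffine(prev_gray, warp, (w, h))

def detect_moving_object(prev_gray, curr_gray, mag_threshold=2.0):
    """After background alignment, residual optical flow should belong to
    independently moving objects; return a binary motion mask."""
    aligned_prev = align_background(prev_gray, curr_gray)
    flow = cv2.calcOpticalFlowFarneback(aligned_prev, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    return (magnitude > mag_threshold).astype(np.uint8)

# Usage (assuming two consecutive grayscale frames are available):
# mask = detect_moving_object(prev_frame, curr_frame)
# ys, xs = np.nonzero(mask)   # pixel locations of the detected moving object
```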
20100321503IMAGE CAPTURING APPARATUS AND IMAGE CAPTURING METHOD - There is provided an image capturing apparatus and method capable of preventing frame-out of a moving object even when a fast-moving object is photographed. The solution comprises: an image capturing unit that obtains image data by capturing an object; a moving object detecting unit that detects a moving object to be tracked based on the image data obtained with the image capturing unit; a tracking frame setting unit that sets an area portion which includes the moving object in the image data as a tracking frame when the moving object is detected by the moving object detecting unit; a display frame setting unit that sets an area which is shifted from the tracking frame in the image data in the direction opposite to the moving direction of the moving object as a display frame; and a display processing unit that displays the image data which is included in the display frame on a display unit.12-23-2010
20130169820CAMERA DEVICE TO CAPTURE AND GENERATE TARGET LEAD AND SHOOTING TECHNIQUE DATA AND IMAGES - This invention relates to processes for the capturing of the images of a target, and/or the shooter, at the time around the discharge of a gun, bow, or shooting device and the display of the images prior to discharge, around point of discharge, and post discharge in a manner that allows the shooter to analyze the images and data. More particularly, the present invention relates to the process in shooting where a moving target must be led in order that the projectile (or projectiles) arrives on target after the point in time where the shoot decision is made and the projectile reaches the target area. This invention will aid the shooter by letting them see images and sight pictures of successful and unsuccessful shots and how much lead, if any, they had given the targets at the point in time they decided to shoot. It also allows for the shooter's technique to be recorded and analyzed.07-04-2013
20100123782AUTOFOCUS SYSTEM - By performing a tap operation on any one of the buttons on a screen of a liquid crystal display equipped with a touch panel for performing an operation input on the AF frame auto-tracking, it is possible to select a desired mode among a fixation mode, an object tracking mode, a face detection tracking mode, and a face recognition tracking mode. The fixation mode is suitable to set a position of the AF frame by means of manual operation. The object tracking mode is suitable to allow the AF frame to automatically track a desired subject other than a face. The face detection tracking mode is suitable to allow the AF frame to track a face of an arbitrary person previously not registered. The face recognition tracking mode is suitable to allow the AF frame to track a face of a specific person previously registered.05-20-2010
20110187869TRACKING-FRAME INITIAL-POSITION SETTING APPARATUS AND METHOD OF CONTROLLING OPERATION OF SAME - Automatic tracking of a target image is made comparatively easy. Specifically, an image obtained by imaging a subject is displayed on a display screen and a tracking frame is displayed at a reference position located at the central portion of the display screen. A target area for setting the initial position of the tracking frame is set surrounding the tracking frame and a high-frequency-component image representing high-frequency components of the image within this area is generated. The amount of high-frequency component is calculated while a moving frame is moved within the high-frequency-component image. The position of the moving frame where the calculated amount of high-frequency component is largest is decided upon as the initial position of the tracking frame.08-04-2011
20090027501DETECTING AN OBJECT IN AN IMAGE USING CAMERA REGISTRATION DATA INDEXED TO LOCATION OR CAMERA SENSORS - An object is detected in images of a live event by storing and indexing camera registration-related data from previous images. For example, the object may be a vehicle which repeatedly traverses a course. A first set of images of the live event is captured when the object is at different locations in the live event. The camera registration-related data for each image is obtained and stored. When the object again traverses the course, for each location, the stored camera registration-related data which is indexed to the location can be retrieved for use in estimating a position of a representation of the object in a current image, such as by defining a search area in the image. An actual position of the object in the image is determined, in response to which the camera registration-related data may be updated, such as for use in a subsequent traversal of the course.01-29-2009
20090167867CAMERA CONTROL SYSTEM CAPABLE OF POSITIONING AND TRACKING OBJECT IN SPACE AND METHOD THEREOF - A space position device capable of generating position signals according to its position in space is used in the camera control system for tracking an object. The space position device generates and transmits its position signals to a control unit at every predetermined time interval. The control unit then generates a control command for controlling a camera to rotate upward/downward or leftward/rightward, or to zoom in or zoom out, according to the generated position signals, such that the camera adjusts its focus on the space position device for tracking an object automatically.07-02-2009
20110304737GIMBAL POSITIONING WITH TARGET VELOCITY COMPENSATION - A gimbal system, including components and methods of use, configured to track moving targets. More specifically, the disclosed system may be configured to orient and maintain the line of sight (“los”) of the gimbal system toward a target, as the target and, in some cases, the platform supporting the gimbal system are moving. The system also may be configured to calculate an estimated target velocity based on user input, and to compute subsequent target positions from previous positions by integrating the estimated target velocity over time. A variety of filters and other mechanisms may be used to enable a user to input information regarding a target velocity into a gimbal controller.12-15-2011
20110304736GIMBAL POSITIONING WITH TARGET VELOCITY COMPENSATION - A gimbal system, including components and methods of use, configured to track moving targets. More specifically, the disclosed system may be configured to orient and maintain the line of sight (“los”) of the gimbal system toward a target, as the target and, in some cases, the platform supporting the gimbal system are moving. The system also may be configured to calculate an estimated target velocity based on user input, and to compute subsequent target positions from previous positions by integrating the estimated target velocity over time. A variety of filters and other mechanisms may be used to enable a user to input information regarding a target velocity into a gimbal controller.12-15-2011
20080231709SYSTEM AND METHOD FOR MANAGING THE INTERACTION OF OBJECT DETECTION AND TRACKING SYSTEMS IN VIDEO SURVEILLANCE - A system, method and program product for providing a video surveillance system that enhances object detection by utilizing feedback from a tracking system to an object detection system. A system is provided that includes: a moving object detection system for detecting moving objects in a video input; an object tracking system for tracking a detected moving object in successive time instants; and a tracker feedback system for feeding tracking information from the object tracking system to the moving object detection system to enhance object detection.09-25-2008
20120002055Correlation of position data that are acquired by means of a video tracking system with a second localization system - The invention relates to a video tracking system which determines the position of objects, e.g. players in a football match, by evaluating video image information. According to the present invention, the video tracking system is additionally provided with an independent positioning system which is preferably used in instances when the video-based position detection is faulty or does not work at all. According to one embodiment, the independent positioning system is GPS based, each football player carrying a portable GPS module which transmits the position data to the system by radio. Additional data, such as heart rate or acceleration, can also be transmitted.01-05-2012
20120002056APPARATUS AND METHOD FOR ACTIVELY TRACKING MULTIPLE MOVING OBJECTS USING A MONITORING CAMERA - An apparatus for actively tracking an object is provided. The apparatus includes a camera unit; a motor drive for changing a shooting direction of the camera unit; and a controller for acquiring a first comparative image and a second comparative image in sequence using the camera unit, comparing the first comparative image with the second comparative image, detecting a moving direction and a speed of an identical object existing in the first and second comparative images, determining an estimated location of the object after receipt of the second comparative image based on the detected moving direction and speed, and enlarging and capturing the object in the estimated location of the object.01-05-2012
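The estimation step above amounts to a constant-velocity extrapolation from two timestamped detections; in the sketch below the centroid-based detection, the frame interval, and the lead time are assumptions for illustration.

```python
import numpy as np

def object_centroid(frame, background, diff_threshold=30):
    """Centroid of pixels that changed relative to a background image."""
    moving = np.abs(frame.astype(np.int16) - background.astype(np.int16)) > diff_threshold
    ys, xs = np.nonzero(moving)
    if len(xs) == 0:
        return None
    return np.array([xs.mean(), ys.mean()])

def predict_location(p1, p2, dt, lead_time):
    """Constant-velocity estimate of where the object will be `lead_time`
    seconds after the second comparative image was taken."""
    velocity = (p2 - p1) / dt
    return p2 + velocity * lead_time, velocity

# Toy usage: the object moved 12 px right and 4 px down in 0.1 s.
p1, p2 = np.array([100.0, 80.0]), np.array([112.0, 84.0])
predicted, velocity = predict_location(p1, p2, dt=0.1, lead_time=0.2)
print(predicted)   # [136. 92.] -> aim the pan/tilt drive at this point
```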
20120057029CAPTURE OF VIDEO WITH MOTION-SPEED DETERMINATION AND VARIABLE CAPTURE RATE - A method of capturing a video of a scene depending on the speed of motion in the scene, includes capturing a video of the scene; determining the relative speed of motion within a first region of the video of the scene with respect to the speed of motion within a second region of the video of the scene; and causing a capture rate of the first region of the video of the scene to be greater than a capture rate of the second region of the video of the scene, or causing an exposure time of the first region to be less than exposure time of the second region.03-08-2012
20090135254IMAGE CAPTURING DEVICE AND METHOD FOR TRACKING OBJECT OF SAME - An image capturing device includes a main body with an image capturing unit received therein, a base, a pan and tilt mechanism for rotating the main body relative to the base, a viewfinder display window, and an object tracking system. The object tracking system includes an object obtaining unit, a storing unit, a detecting unit, a calculating unit, and a driving unit. The object obtaining unit is used for selecting and obtaining an object for tracking in the viewfinder display window. The storing unit is used for storing the information of the object. The detecting unit is used for detecting the position of the object image displayed in the viewfinder display window. The calculating unit is used for calculating an amount and direction of rotation for the main body. The driving unit is used for driving the pan and tilt mechanism to rotate the main body.05-28-2009
20120154599ZOOMING FACTOR COMPUTATION - Systems, methods, and devices are disclosed for determining a zooming factor for a camera in a pan, tilt, and zoom (PTZ) camera tracking system to enable a camera to keep an object at a constant size within the camera's viewing area, despite changes in the object's distance from the camera. This provides a complement to a camera's pan and tilt tracking of the moving object. For example, a PTZ camera tracking system that determines an object to track, utilizes information regarding images of the object of interest are used to determine a zooming factor (or other zooming value) for a camera in the PTZ camera tracking system. This information includes variables such as tilt angles of one or more cameras and a reference zooming factor.06-21-2012
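One plausible reading of the tilt-based zoom computation, assuming a single camera at a known height and apparent size falling off roughly linearly with distance (the system's exact relationship is not given here):

```python
import math

def distance_from_tilt(camera_height, tilt_down_rad):
    """Ground distance to the tracked object, assuming the camera looks
    down at it from a known height with the given downward tilt angle."""
    return camera_height / math.tan(tilt_down_rad)

def zooming_factor(reference_zoom, reference_tilt, current_tilt, camera_height=5.0):
    """Scale the reference zoom so the object keeps the same apparent size:
    apparent size shrinks roughly linearly with distance, so the zoom must
    grow in proportion to the distance."""
    d_ref = distance_from_tilt(camera_height, reference_tilt)
    d_now = distance_from_tilt(camera_height, current_tilt)
    return reference_zoom * d_now / d_ref

# Toy usage: the object walks away, the tilt flattens from 30 to 15 degrees,
# and the zoom roughly doubles to keep the object the same size in the frame.
z = zooming_factor(2.0, math.radians(30), math.radians(15))
print(round(z, 2))  # ~4.31
```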
20120026340SYSTEMS AND METHODS FOR PRESENTING VIDEO DATA - Described herein are systems and methods for presenting video data to a user. In overview, video data originates from a source, such as a capture device in the form of a camera. This video data is defined by a plurality of sequential frames, having a common geometric size. This size is referred to as the “geometric bounds of captured video data”. Analytics software is used to track objects in the captured video data, and provide position data indicative of the location of a tracked object relative to the geometric bounds of captured video data. Video data is presented to a user via a “view port”. By default, this view port is configured to display video data corresponding to geometric bounds of captured video data. That is, the view port displays the full scope of video data, as captured. Embodiments of the present invention use the position data to selectively adjust the view port to display a geometrically reduced portion of the geometric bounds of captured video data, thereby to assist the user in following a tracked object.02-02-2012
20090153667Surveying instrument - The present invention provides a surveying instrument provided with a tracking function, said surveying instrument comprising a first image pickup means ….06-18-2009
20120154600IMAGE PICKUP DEVICE - An image pickup device may include an image pickup unit that captures a subject, an acquiring unit that acquires an absolute position of the image pickup device, an estimating unit that estimates an image capturing direction of the image pickup device, a receiving unit that receives first information or second information from an external terminal, the first information representing an absolute position of the subject, the second information representing an absolute position of the external terminal, an image capturing direction of the external terminal, and a distance from the external terminal to the subject, and a control unit that controls the image capturing direction or notifies an operator of the image pickup device of information instructing that the image capturing direction be changed, based on the absolute position, the image capturing direction, and at least one of the first information and the second information.06-21-2012
20110090345DIGITAL CAMERA, IMAGE PROCESSING APPARATUS, AND IMAGE PROCESSING METHOD - An image processing apparatus is provided which is capable of stably tracking a target even when an amount of characteristics of an image of the target varies on input pictures due to an abrupt movement of the target or other causes. An image processing apparatus (…).04-21-2011
20110090344Object Trail-Based Analysis and Control of Video - Systems and methods for analyzing scenes from cameras imaging an event, such as a sporting event broadcast, are provided. Systems and methods include detecting and tracking patterns and trails. This may be performed with intra-frame processing and without knowledge of camera parameters. A system for analyzing a scene may include an object characterizer, a foreground detector, an object tracker, a trail updater, and a video annotator. Systems and methods may provide information regarding centers and spans of activity based on object locations and trails, which may be used to control camera field of views such as a camera pose and zoom level. A magnification may be determined for images in a video sequence based on the size of an object in the images. Measurements may be determined from object trails in a video sequence based on an effective magnification of images in the video sequence.04-21-2011
20120120248IMAGE PHOTOGRAPHING DEVICE AND SECURITY MANAGEMENT DEVICE OF OBJECT TRACKING SYSTEM AND OBJECT TRACKING METHOD - An image photographing device of an object tracking system includes: an image recognizing module for collecting image information within a field of view (FOV) region in real time, recognizing occurrence of an event from the collected image information to extract an object contributing to the occurrence of the event, and sensing whether the extracted object is out of the FOV region or not. The device further includes an object tracking module for extracting property of the object from the extracted object to generate metadata, storing the metadata in a database, and providing the metadata stored in the database to ambient image photographing devices based on the sensing result of the image recognizing module.05-17-2012
20120249802DISTRIBUTED TARGET TRACKING USING SELF LOCALIZING SMART CAMERA NETWORKS - A plurality of camera devices are configured to localize one another based on visibility of each neighboring camera in an image plane. Each camera device captures images and identifies sightings of candidate targets. The camera devices share information about sightings and triangulate the positions of targets. Targets are matched to known tracks based on prior images, allowing targets to be tracked in a 3D environment.10-04-2012
20100245589CAMERA CONTROL SYSTEM TO FOLLOW MOVING OBJECTS - The present invention is directed to an image tracking system that tracks the motion of an object. The image processing system tracks the motion of an object with an image recording device that records a first image of an object to be tracked and shortly thereafter records a second image of the object to be tracked. The system analyzes data from the first and the second images to provide a difference image of the object, defined by a bit map of pixels. The system processes the difference image to determine a threshold and calculates a centroid of the pixels in the difference image above the threshold. The system then determines the center of the difference image and determines a motion vector defined by the displacement from the center to the centroid and determines a pan tilt vector based on the motion vector and outputs the pan tilt vector to the image recording device to automatically track the object.09-30-2010
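The difference-threshold-centroid pipeline above maps almost directly onto array operations; the adaptive threshold rule and the pan/tilt gains in the sketch below are assumptions.

```python
import numpy as np

def pan_tilt_from_frames(frame1, frame2, gain=0.05):
    """Difference two frames, threshold, take the centroid of the changed
    pixels, and convert the centroid's offset from the image center into a
    pan/tilt command (degrees, under an assumed linear gain)."""
    diff = np.abs(frame2.astype(np.int16) - frame1.astype(np.int16))
    threshold = diff.mean() + 2 * diff.std()      # assumed adaptive threshold
    ys, xs = np.nonzero(diff > threshold)
    if len(xs) == 0:
        return 0.0, 0.0                           # nothing moved
    centroid = np.array([xs.mean(), ys.mean()])
    h, w = diff.shape
    center = np.array([w / 2.0, h / 2.0])
    motion_vector = centroid - center             # pixels right/down of center
    pan = gain * motion_vector[0]                 # pan right for positive x offset
    tilt = -gain * motion_vector[1]               # tilt up for negative y offset
    return pan, tilt

# Toy usage: a small bright blob appears right of center -> pan right slightly.
f1 = np.zeros((120, 160), dtype=np.uint8)
f2 = f1.copy()
f2[50:60, 120:130] = 255
print(pan_tilt_from_frames(f1, f2))
```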
20100245587Automatic tracking method and surveying device - The present invention provides an automatic tracking method, comprising a light spot detecting step of detecting a light spot ….09-30-2010
20100245588TAG TRACKING SYSTEM - A new system is described that integrates real time item location data from electronically tagged items with a video system to allow cameras in proximity to a tagged item to automatically be enabled or recorded and to move and follow the movement of the tagged item. The methodology to implement such a system is described and involves computerized coordinate system scaling and conversion to automatically select and command movement if available of the most appropriate cameras. The system can follow a moving tagged item and hand off the item from one camera to another, and also command other facility assets, such as lights and door locks.09-30-2010
20120300083IMAGE PICKUP APPARATUS AND METHOD FOR CONTROLLING THE SAME - An image pickup apparatus is provided which realizes an improvement in the accuracy of a subject tracking function of the image pickup apparatus during continuous shooting. This image pickup apparatus includes an image pickup unit configured to capture a plurality of auxiliary images during an interval between capturing of a main image and capturing of a next main image; a first subject tracking processing unit configured to detect a region where a subject that is the same as a main subject exists, from a first region that is a part of a first auxiliary image among the plurality of auxiliary images; and a second subject tracking processing unit configured to detect a region where a subject that is the same as the main subject exists, from a second region of a second auxiliary image among the plurality of auxiliary images, the second region being larger than the first region.11-29-2012
20120300082Automatic Device Alignment Mechanism - An alignment suite includes first and second targeting devices and an optical coupler. The first targeting device is configured to perform a positional determination regarding a downrange target. The first targeting device includes an image processor. The second targeting device is configured to perform a targeting function relative to the downrange target and is affixable to the first targeting device. The optical coupler enables the image processor to capture an image of a reference object at the second targeting device responsive to the first and second targeting devices being affixed together. The image processor employs processing circuitry that determines pose information indicative of an alignment relationship between the first and second targeting devices relative to the downrange target based on the image captured.11-29-2012
20120127319METHODS AND APPARATUS FOR CONTROLLING A NETWORKED CAMERA - An apparatus for controlling a remote camera is described. The apparatus includes a housing and a processor positioned within the housing. A transceiver coupled to the processor communicates with a remote server. The remote server is coupled to the remote camera. A motion tracking component is mechanically coupled to the housing and electrically coupled to the processor. The motion tracking component generates a motion signal. The remote server controls a parameter of the remote camera in response to the motion signal. A display is coupled to the processor for displaying the output signal from the remote camera. The output signal is associated with the parameter of the remote camera.05-24-2012
20100208078HORIZONTAL GAZE ESTIMATION FOR VIDEO CONFERENCING - Techniques are provided to determine the horizontal gaze of a person from a video signal generated from viewing the person with at least one video camera. From the video signal, a head region of the person is detected and tracked. The dimensions and location of a sub-region within the head region are also detected and tracked from the video signal. An estimate of the horizontal gaze of the person is computed from a relative position of the sub-region within the head region.08-19-2010
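A very rough reading of that idea, with the linear mapping and the maximum gaze angle as assumptions: the farther the tracked facial sub-region sits from the center of the head region, the farther the gaze is turned.

```python
def horizontal_gaze(head_box, face_box, max_gaze_deg=60.0):
    """Estimate horizontal gaze from the position of a facial sub-region
    (e.g. the detected face/eye area) inside the tracked head region.

    Boxes are (x, y, width, height) in image coordinates.  Returns degrees;
    negative = looking to the camera's left, positive = to the right.
    """
    hx, _, hw, _ = head_box
    fx, _, fw, _ = face_box
    head_center = hx + hw / 2.0
    face_center = fx + fw / 2.0
    # Normalized offset in [-1, 1]: when the head turns, the visible facial
    # sub-region shifts toward one side of the head region.
    offset = (face_center - head_center) / (hw / 2.0)
    offset = max(-1.0, min(1.0, offset))
    return offset * max_gaze_deg

# Toy usage: the face region sits well to the right of the head center.
print(horizontal_gaze(head_box=(100, 40, 80, 100), face_box=(150, 60, 30, 40)))  # 37.5
```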
20120212623SYSTEM AND METHOD OF CONTROLLING VISION DEVICE FOR TRACKING TARGET BASED ON MOTION COMMANDS - A system of controlling a vision device based on motion commands may include: a movable body; a vision device driven by being connected to the body and receiving image information; a driving unit driving the body in accordance with a motion command; and a control unit which calculates motion information of the body using the motion command and drives the vision device so as to compensate for an influence caused by the motion of the body using the calculated motion information. By using the system, reliable image information may be obtained in a manner such that a vision device included in a subject such as a locomobile robot is controlled to watch a predetermined target even when the subject moves.08-23-2012
20120133777CAMERA TRACKING WITH USER SCRIPT CONTROL - The present application provides increased flexibility and control to a user by providing a camera and camera controller system that is responsive to a user-defined script. The user-defined script can allow a user to choose a subject and have the camera follow the subject automatically. In one embodiment, a camera is provided for taking still or video images. Movement of the camera is automatically controlled using a camera controller coupled to the camera. A user script is provided that describes a desired tracking of an object. The camera controller is responsive to the script for controlling the camera in order to track the object.05-31-2012
20120133778TRACKING SYSTEM AND METHOD FOR IMAGE OBJECT REGION AND COMPUTER PROGRAM PRODUCT THEREOF - In one exemplary embodiment, an object region tracking and picturing module is constructed on a moving platform of a mobile end and a remote control module is constructed on another platform for an image object region tracking system. The two modules communicate with each other via a digital network for delivering required information. The object region tracking and picturing module uses a real-time image backward search technology to store at least one image frame previously captured on the moving platform into a frame buffer, and start tracking an object region from the position pointed out by the remote control module to a newest image frame captured on the moving platform, then find out a relative position on the newest image frame for the tracked object region.05-31-2012
20120212622MOVING OBJECT IMAGE TRACKING APPARATUS AND METHOD - According to one embodiment, a moving object image tracking apparatus includes two drivers, a camera sensor, a tracking error detector, angle sensors, angular velocity sensors, a first calculator, a second calculator, a corrected tracking error detector, a generator, and a controller. The tracking error detector detects, from the image data, tracking errors, namely deviation amounts of a moving object from the visual field center, as tracking error detection values. The corrected tracking error detector calculates corrected tracking errors, for each period shorter than the sampling period during which the tracking error detection values remain constant, from a velocity vector and a relationship between a visual axis vector and a position vector. The generator generates angular velocity command values required to drive the drivers to track the moving object using the corrected tracking errors. The controller controls the drivers so that differences between the angular velocity command values and angular velocities become zero.08-23-2012
20090059009OBJECT TRACKABILITY VIA PARAMETRIC CAMERA TUNING - A method and apparatus are described for improving object trackability via parametric camera tuning. In one embodiment, a determination is made as to whether the loaded camera settings cause saturation of a video image and hue differences between objects and between the objects and a background of the video image. If the saturation and hue differences do not exceed a threshold, a search of camera settings is performed to increase saturation and hue differences between objects and between the objects and the background of the video image.03-05-2009
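One way to read the search described above is as a loop over candidate camera settings that scores each capture by how well object and background separate in hue and saturation. The sketch below assumes a hypothetical capture_with_settings callable, a small settings grid, and a simple separation metric; none of these come from the patent.

```python
import colorsys
from itertools import product

def separation_score(object_pixels, background_pixels):
    """Score how far apart object and background are in mean hue/saturation.
    Pixels are (r, g, b) tuples in [0, 1]; the metric is an illustrative choice."""
    def mean_hs(pixels):
        hs = [colorsys.rgb_to_hsv(*p)[:2] for p in pixels]
        n = len(hs)
        return sum(h for h, _ in hs) / n, sum(s for _, s in hs) / n
    oh, osat = mean_hs(object_pixels)
    bh, bsat = mean_hs(background_pixels)
    return abs(oh - bh) + abs(osat - bsat)

def tune_camera(capture_with_settings, threshold=0.2):
    """capture_with_settings(gain, exposure) -> (object_pixels, background_pixels)
    is a hypothetical stand-in for the real camera interface."""
    best = None
    for gain, exposure in product([1.0, 2.0, 4.0], [10, 20, 40]):
        obj, bg = capture_with_settings(gain, exposure)
        score = separation_score(obj, bg)
        if best is None or score > best[0]:
            best = (score, gain, exposure)
        if score >= threshold:
            break  # separation is good enough; stop searching
    return best  # (best score, gain, exposure)
```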
20090059008DATA PROCESSING APPARATUS, DATA PROCESSING METHOD, AND DATA PROCESSING PROGRAM - A data processing apparatus includes: a detection section detecting an image of an object from moving image data; a table creation section recording position information indicating a position on the moving image data in a table on the basis of a detection result by the detection section; a dubbing processing section performing dubbing processing on the moving image data; and a control section controlling the dubbing processing section so as to extract a portion of the moving image data recorded on a first recording medium on the basis of the position information recorded in the table, and to perform the dubbing processing on the extracted portion onto a second recording medium.03-05-2009
20120224068DYNAMIC TEMPLATE TRACKING - Various arrangements for tracking a target within a series of images are presented. The target may be detected within a first image at least partially based on a correspondence between the target and a stored reference template. A tracking template may be created for tracking the target using the first image. The target may be located within a second image using the tracking template.09-06-2012
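The two-stage idea above (detect against a stored reference template, then cut a fresh tracking template out of the live image) maps naturally onto template matching. The OpenCV calls below are one plausible way to sketch it, not the arrangement actually claimed.

```python
import cv2

def detect_then_build_tracking_template(first_image, reference_template):
    """Find the target with the stored reference template, then crop a fresh
    tracking template from the first live image around the detected location."""
    res = cv2.matchTemplate(first_image, reference_template, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(res)
    h, w = reference_template.shape[:2]
    x, y = top_left
    tracking_template = first_image[y:y + h, x:x + w].copy()
    return (x, y, w, h), score, tracking_template

def locate_in_next_image(second_image, tracking_template):
    """Locate the target in a later frame using the live tracking template."""
    res = cv2.matchTemplate(second_image, tracking_template, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(res)
    return top_left, score
```

Using a template cropped from the live image rather than the stored reference tends to be more tolerant of lighting and pose differences between the reference imagery and the current scene, which appears to be the point of the arrangement.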
20120188379AUTOMATIC-TRACKING CAMERA APPARATUS - An automatic-tracking camera apparatus which is capable of realizing continuous and smooth driving and obtaining an image with little position variation of a tracking target from a target position within the image and with little blur. The position of a camera body is changed by a gimbal device. The speed of a tracking target object at the next-after-next start timing of image acquisition by the camera body is predicted. The gimbal device is controlled so that the camera body reaches the position indicated by a position instruction value generated for the next-after-next start timing of image acquisition by the camera body, at the next-after-next start timing, and the speed of the camera body at the next-after-next start timing of image acquisition by the camera body corresponds to the speed predicted for the next-after-next timing of image acquisition.07-26-2012
20090027502Portable Apparatuses Having Devices for Tracking Object's Head, and Methods of Tracking Object's Head in Portable Apparatus - The portable apparatus includes a camera section, a head tracking section, an image processor, a video codec section, and a camera controller. The camera section obtains an image of an object. The head tracking section receives the image from the camera section, detects a head area from the image, models the head area using an ellipse, and calculates a shape similarity, which represents the similarity between the shape of the gradients of pixels at the boundary of the ellipse and the shape of the ellipse, and a color histogram similarity between the internal area of a candidate figure and the internal area of the modeling figure. To obtain the position of the candidate ellipse whose color histogram similarity has the maximum value, mean shift is used, which requires a small amount of calculation with respect to a first number of samples in the internal area of the candidate ellipse. The image processing section performs image processing on the image based on quality information of the detected head area. The video codec section performs differential encoding on the detected head area based on the location of the detected head area. The camera controller controls rotation of the camera section on the basis of the location of the detected head area. Because a robust head tracking algorithm with a small quantity of calculation is modified to be adapted to the portable device, the user's head area may be tracked appropriately on the portable device.01-29-2009
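The color-histogram half of the tracker described above can be sketched with OpenCV's histogram back-projection and mean shift; the window size, the hue-only histogram, and the termination criteria are assumptions, and the ellipse gradient/shape term from the abstract is omitted entirely.

```python
import cv2

def build_head_histogram(frame_bgr, head_window):
    """head_window: (x, y, w, h) around the detected head area."""
    x, y, w, h = head_window
    hsv = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], None, [32], [0, 180])  # hue-only histogram
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    return hist

def mean_shift_step(frame_bgr, hist, window):
    """One mean-shift update of the head window toward the histogram mode."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    _, new_window = cv2.meanShift(backproj, window, criteria)
    return new_window
```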
20120081552VIDEO TRACKING SYSTEM AND METHOD - A video tracking system and method are provided for tracking a target object with a video camera.04-05-2012
20120229651IMAGE PICKUP APPARATUS WITH TRACKING FUNCTION AND TRACKING IMAGE PICKUP METHOD - An image pickup apparatus with a tracking function includes a lens apparatus having aperture and focus adjustment functions and a camera apparatus connected to the lens apparatus and having an image pickup element. The apparatus has a movement detector that detects movement of an object to be tracked in a picked-up image, and a stop controller that changes the aperture value of the aperture stop in the opening direction when movement of the object to be tracked is detected by the movement detector.09-13-2012
20110122254DIGITAL CAMERA, IMAGE PROCESSING APPARATUS, AND IMAGE PROCESSING METHOD - Images of a target are stably tracked even when an image-capture environment changes. An image processing apparatus (05-26-2011
20110122253IMAGING APPARATUS - An imaging apparatus includes an imaging unit, a field angle change unit, and a movement detection unit. The imaging unit includes a lens that forms an image of a subject and acquires a picture image by taking the image formed by the lens. The field angle change unit changes a field angle of the picture image acquired by the imaging unit. The movement detection unit detects a movement of the imaging apparatus. The field angle change unit changes the field angle of the picture image in accordance with a moving direction of the imaging apparatus when the movement detection unit detects the movement of the imaging apparatus.05-26-2011
20100328467Movable-mechanical-section controlling device, method of controlling movable mechanical section, and program - A movable-mechanical-section controlling device includes a driving controlling unit and an angular position range setting unit. The driving controlling unit performs control so that a movable mechanical section performs a unit pan operation that is performed in a predetermined angular position range in a pan direction, the movable mechanical section having a structure that moves so that an image pickup direction of an image pickup section changes in the pan direction, the predetermined angular position range being set on the basis of a previously set pan-direction movable angular range. The angular position range setting unit sets the angular position range so that, after the unit pan operation has been performed in a first angular position range, the unit pan operation is performed in a second angular position range differing from the first angular position range.12-30-2010
20120320219IMAGE GATED CAMERA FOR DETECTING OBJECTS IN A MARINE ENVIRONMENT - System for detecting objects protruding from the surface of a body of water in a marine environment under low illumination conditions. The system comprises a gated light source generating light pulses toward the body of water, illuminating substantially an entire field of view; a gated camera sensitive at least to the wavelengths of light generated by the gated light source, the gated camera receiving light reflected from at least one object within the field of view protruding from the surface of the body of water and acquiring a gated image of the reflected light; and a processor coupled with the gated light source and with the gated camera. The processor gates the gated camera to be 'OFF' for at least the time it takes the gated light source to produce a light pulse in its substantial entirety, plus the time it takes the end of the light pulse to traverse a determined distance from the system and back to the gated camera. The processor further sets the gated camera 'ON', for each pulse, for an 'ON' time duration until the light pulse reflecting back from the object is received by the gated camera.12-20-2012
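The gating rule above reduces to a small timing computation per pulse: stay OFF while the pulse is emitted and travels out to a minimum range and back, then stay ON long enough for returns from the range of interest. The sketch below uses made-up ranges and treats the leading/trailing-edge bookkeeping as an assumption; only the speed of light is fixed.

```python
C = 299_792_458.0  # speed of light, m/s

def gate_timing(pulse_duration_s, min_range_m, max_range_m):
    """Return (off_time_s, on_time_s) for one pulse.

    OFF until the end of the pulse has travelled to the minimum range and back;
    ON long enough for reflections from out to the maximum range to arrive."""
    off_time = pulse_duration_s + 2.0 * min_range_m / C
    on_time = 2.0 * (max_range_m - min_range_m) / C + pulse_duration_s
    return off_time, on_time

# Example: a 100 ns pulse, objects of interest between 150 m and 600 m.
off_t, on_t = gate_timing(100e-9, 150.0, 600.0)
print(f"gate off for {off_t * 1e9:.0f} ns, then on for {on_t * 1e9:.0f} ns")
```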
20120268608AUTOMATIC TRACKING CONTROL APPARATUS FOR CAMERA APPARATUS AND AUTOMATIC TRACKING CAMERA SYSTEM HAVING SAME - An automatic tracking control apparatus for a camera apparatus having a panning or tilting function, comprising an object recognition unit to recognize an object in a picked-up image, a tracking object setting unit to set the recognized object as the tracking object, an output position setting unit to set a position in the image at which the image of the tracking object is to be output, a control computing unit to output a drive signal to locate the tracked object at the output position, and an image output unit to output an image on which an indication of the tracking object is superimposed. Automatic tracking is suspended while the position of the tracking object in the image is changed by the output position setting unit, and automatic tracking is restarted by outputting a drive signal for driving the camera apparatus to locate the tracked object at the output position.10-25-2012
20100231723APPARATUS AND METHOD FOR INFERENCING TOPOLOGY OF MULTIPLE CAMERAS NETWORK BY TRACKING MOVEMENT - Provided are an apparatus and a method for tracking movements of objects to infer a topology of a network of multiple cameras. The apparatus infers the topology of the network formed of the multiple cameras that sequentially obtain images and includes an object extractor, a haunting data generator, a haunting database (DB), and a topology inferrer. The object extractor extracts at least one moving object from each of the obtained images, for the multiple cameras. The haunting data generator generates appearing cameras and appearing times at which the moving objects appear, and disappearing cameras and disappearing times at which the moving objects disappear, for the multiple cameras. The haunting DB stores the appearing cameras and appearing times and the disappearing cameras and disappearing times of the moving objects, for the multiple cameras. The topology inferrer infers the topology of the network using the appearing cameras and appearing times and the disappearing cameras and disappearing times of the moving objects. Therefore, the apparatus accurately infers topologies and distances among the multiple cameras in the network using the cameras and the appearing and disappearing times at which the moving objects appear and disappear. As a result, the apparatus accurately tracks the moving objects in the network.09-16-2010
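The topology inference above hinges on pairing each object's disappearance from one camera with its next appearance at another and looking at the time gaps. A toy version of that bookkeeping is sketched below; the record format, the 60-second hand-off window, and the mean-transit-time summary are all assumptions for illustration.

```python
from collections import defaultdict

def infer_topology(haunting_records, max_gap_s=60.0):
    """haunting_records: dicts like
    {"object": id, "camera": name, "appear": t_in, "disappear": t_out}.
    Returns {(camera_from, camera_to): mean transit time in seconds}."""
    by_object = defaultdict(list)
    for rec in haunting_records:
        by_object[rec["object"]].append(rec)

    transit = defaultdict(list)
    for recs in by_object.values():
        recs.sort(key=lambda r: r["appear"])
        for prev, nxt in zip(recs, recs[1:]):
            gap = nxt["appear"] - prev["disappear"]
            if 0.0 <= gap <= max_gap_s:  # plausible direct hand-off
                transit[(prev["camera"], nxt["camera"])].append(gap)

    return {pair: sum(gaps) / len(gaps) for pair, gaps in transit.items()}

records = [
    {"object": 1, "camera": "A", "appear": 0.0, "disappear": 10.0},
    {"object": 1, "camera": "B", "appear": 14.0, "disappear": 25.0},
    {"object": 2, "camera": "A", "appear": 5.0, "disappear": 12.0},
    {"object": 2, "camera": "B", "appear": 17.0, "disappear": 30.0},
]
print(infer_topology(records))  # {('A', 'B'): 4.5}
```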
20120327249AUGMENTED REALITY METHOD AND DEVICES USING A REAL TIME AUTOMATIC TRACKING OF MARKER-FREE TEXTURED PLANAR GEOMETRICAL OBJECTS IN A VIDEO STREAM - Methods and devices for the real-time tracking of an object in a video stream for an augmented-reality application are disclosed herein.12-27-2012
20110157370TAGGING PRODUCT INFORMATION - A system for enabling the tagging of items appearing in a moving or paused image includes the use of an identification device on or within the item to be tagged. The method includes capturing moving image footage containing the image of the item to be tagged, detecting the presence and the position of the identification device within each frame of the moving image footage, and hence determining the position of the item to be tagged in each frame of the moving image. By automatically determining the position of the identification device, a suitable tag can then be automatically associated with the item which has the identification device provided thereon or therein when saving or transmitting the moving image.06-30-2011
20130021477METHOD AND CAMERA FOR DETERMINING AN IMAGE ADJUSTMENT PARAMETER - The present invention relates to a method and a camera for determining an image adjustment parameter. The method includes receiving a plurality of images representing an image view, detecting from the plurality of images events of a specific event type, identifying a location within the image view where the event of the specific type is present, determining a presence value of each of the identified locations, and determining an image adjustment parameter based on data from an adjustment location within the image view. The adjustment location is determined based on the presence value in each location of a plurality of locations within the image view.01-24-2013
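The method above accumulates, per location in the image view, how often events of the chosen type occur, and then derives the adjustment location from those presence values. A small sketch of that accumulation on a coarse grid; the grid size, the exponential decay, and the argmax selection rule are assumptions.

```python
import numpy as np

class PresenceMap:
    """Coarse grid of presence values for events detected within the image view."""

    def __init__(self, grid_shape=(9, 16), decay=0.99):
        self.values = np.zeros(grid_shape)
        self.decay = decay

    def add_event(self, x_norm, y_norm):
        """x_norm, y_norm in [0, 1): normalized event location in the view."""
        self.values *= self.decay  # older events count for less
        gy = int(y_norm * self.values.shape[0])
        gx = int(x_norm * self.values.shape[1])
        self.values[gy, gx] += 1.0

    def adjustment_cell(self):
        """Cell with the highest presence value: where to measure for adjustment."""
        return np.unravel_index(np.argmax(self.values), self.values.shape)

pm = PresenceMap()
for x, y in [(0.7, 0.4), (0.72, 0.41), (0.1, 0.9)]:
    pm.add_event(x, y)
print(pm.adjustment_cell())  # the grid cell where events cluster
```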
20120242838IMAGE CAPTURING APPARATUS AND METHOD FOR CONTROLLING THE SAME - An image capturing apparatus is provided that is capable of performing both object detection using image recognition and object detection using movement detection on successively captured images. In the image capturing apparatus, the reliability of the result of the object detection using image recognition is evaluated based on the previous detection results. If it is determined that the reliability is high, execution of the object detection using movement detection is determined. If it is determined that the reliability is low, non-execution of the object detection using movement detection is determined. With this configuration, the object region can be tracked appropriately.09-27-2012
20130169821Detecting Orientation of Digital Images Using Face Detection Information - A method of automatically establishing the correct orientation of an image using facial information. This method is based on exploiting an inherent property of image recognition algorithms in general, and face detection in particular, namely that recognition is based on criteria that are highly orientation sensitive. By applying a detection algorithm to images in various orientations, or alternatively by rotating the classifiers, and comparing the number of faces successfully detected in each orientation, one may infer the most likely correct orientation. Such a method can be implemented as an automated method or a semi-automatic method to guide users in viewing, capturing or printing of images.07-04-2013
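The orientation test above (run the face detector on each candidate rotation and keep the rotation that yields the most detections) fits in a few lines with an off-the-shelf cascade detector. The sketch assumes the opencv-python wheel, which bundles the Haar cascade files under cv2.data.haarcascades, and the detector parameters are illustrative.

```python
import cv2

def likely_orientation(image_bgr):
    """Return (best rotation in degrees, {rotation: face count})."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    rotations = {
        0: gray,
        90: cv2.rotate(gray, cv2.ROTATE_90_CLOCKWISE),
        180: cv2.rotate(gray, cv2.ROTATE_180),
        270: cv2.rotate(gray, cv2.ROTATE_90_COUNTERCLOCKWISE),
    }
    counts = {angle: len(cascade.detectMultiScale(img, 1.1, 5))
              for angle, img in rotations.items()}
    return max(counts, key=counts.get), counts
```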
20130100295INFORMATION PROCESSING APPARATUS AND METHOD - An information processing apparatus includes an acquirement section, a detection section, a selection section, a display control section and a commodity recognition section. The acquirement section acquires an image captured by an image capturing section. The detection section detects all or some of the targets included in the image acquired by the acquirement section. The selection section selects one target when the detection section detects a plurality of targets. The display control section displays the target selected by the selection section from among the plurality of targets on the image acquired by the acquirement section. The commodity recognition section recognizes a commodity captured by the image capturing section based on a similarity indicating the degree to which all or part of the image of the target selected by the selection section is similar to the reference image of each commodity.04-25-2013
20130113941TRACKING APPARATUS AND TRACKING METHOD - The tracking target subject specifying unit specifies a tracking target subject in image data. The first tracking position detection unit detects first characteristic information from the image data and sets a first candidate tracking position based on the first characteristic information. The second tracking position detection unit detects second characteristic information from the image data and detects a second candidate tracking position based on the second characteristic information. The reference information acquisition unit acquires reference information. The control unit decides a true tracking position based on the two determinations.05-09-2013
20130113940IMAGING DEVICE AND SUBJECT DETECTION METHOD - An imaging device includes a first detection part which detects one or more subjects in an image captured by an image capturing part that captures images continuously; a second detection part which follows the one or more detected subjects; and a system control part which includes a setting part that sets a part of the image as a limited region, and which causes, after the first detection part detects the one or more subjects in the captured image, the second detection part to follow and detect a subject in an image captured subsequently to the captured image, and causes the first detection part to detect a subject in the limited region.05-09-2013
20080259163METHOD AND SYSTEM FOR DISTRIBUTED MULTIPLE TARGET TRACKING - A method and system for distributed tracking of multiple targets is disclosed. Multiple targets to be tracked by a plurality of trackers are detected in a frame. The motion state variable of each of the plurality of trackers is calculated in the E-step of a variational Expectation-Maximization algorithm. Further, the data association variable of each of the plurality of trackers is calculated in the M-step of the algorithm. Depending on the motion state variable and the data association variable, the multiple targets are tracked.10-23-2008
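One common way to realize the alternation the abstract describes is a soft-assignment loop: compute responsibilities of detections for trackers, then update each tracker's state from its responsibility-weighted detections. The sketch below is that generic loop with Gaussian weights, not the specific variational E-step/M-step split claimed in the application.

```python
import numpy as np

def soft_assignment_track_step(tracker_states, detections, n_iters=5, sigma=5.0):
    """tracker_states: (K, 2) predicted target positions.
    detections: (N, 2) detected positions in the current frame.
    Returns updated states and the soft data-association matrix (K x N)."""
    states = tracker_states.astype(float).copy()
    for _ in range(n_iters):
        # Soft data association: responsibility of each tracker for each detection.
        d2 = ((states[:, None, :] - detections[None, :, :]) ** 2).sum(axis=2)
        resp = np.exp(-d2 / (2.0 * sigma ** 2))
        resp /= resp.sum(axis=0, keepdims=True) + 1e-12
        # State update: move each tracker to its responsibility-weighted mean.
        weights = resp / (resp.sum(axis=1, keepdims=True) + 1e-12)
        states = weights @ detections
    return states, resp

states = np.array([[0.0, 0.0], [10.0, 10.0]])
dets = np.array([[1.0, 0.5], [9.0, 10.5], [50.0, 50.0]])
print(soft_assignment_track_step(states, dets)[0])  # trackers drawn to nearby detections
```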
20130141591HUMAN-MACHINE-INTERFACE AND METHOD FOR MANIPULATING DATA IN A MACHINE VISION SYSTEM - This invention provides a Graphical User Interface (GUI) that operates in connection with a machine vision detector or other machine vision system, which provides a highly intuitive and industrial machine-like appearance and layout. The GUI includes a centralized image frame window surrounded by panes having buttons and specific interface components that the user employs in each step of a machine vision system set up and run procedure. One pane allows the user to view and manipulate a recorded filmstrip of image thumbnails taken in a sequence, and provides the filmstrip with specialized highlighting (colors or patterns) that indicate useful information about the underlying images. The programming of logic is performed using a programming window that includes a ladder logic arrangement.06-06-2013
20120274780IMAGE APPARATUS, IMAGE DISPLAY APPARATUS AND IMAGE DISPLAY METHOD - An image apparatus, comprising an image capture device that captures an image, a display device that displays the image, a display control block that enlarges and displays a first area, included in a first image, on the display device, a target setting block that sets an object in the first area as a target for tracking, and a detecting block that detects a second area, including the object, in a second image which is specified to be displayed next on the display device in place of the first image, the display control block enlarging and displaying the second area on the display device.11-01-2012
20120274785SUBJECT TRACKING APPARATUS, CAMERA HAVING THE SUBJECT TRACKING APPARATUS, AND METHOD FOR TRACKING SUBJECT - A subject tracking apparatus includes a light detector, a focus detector, and a tracking controller. The focus detector is configured to detect focus information at a plurality of focus detection regions in a view of an image. The focus information includes at least focus information detected at a subject region of a subject in the view. The tracking controller is configured to determine at least one first region having substantially the same light information as the light information detected at the subject region from among a plurality of light measurement regions, to determine at least one second region having substantially the same focus information as the focus information detected at the subject region from among the plurality of focus detection regions, and to determine reference information for tracking the subject in the view based on the at least one first region and the at least one second region.11-01-2012
20120274784SYSTEM AND METHOD FOR RECOGNIZING A UNIT LOAD DEVICE (ULD) NUMBER MARKED ON AN AIR CARGO UNIT - The present invention provides a system and method for recognizing a Unit Load Device (ULD) number marked on an air cargo unit. The system includes at least one camera configured to acquire images of the ULD number. It also includes a presence sensing module configured to detect a presence status of the air cargo unit in a scanning zone of the system, the presence status having a value of either present or absent, and a recognition processor coupled to the presence sensing module and to the at least one camera. The recognition processor is configured to obtain from the presence sensing module information relating to the presence status of said air cargo unit, to trigger the at least one camera to acquire the images upon a change in the value of the presence status, and to process the images for recognizing the ULD number.11-01-2012
20120274783IMAGING WITH REAL-TIME TRACKING USING OPTICAL COHERENCE TOMOGRAPHY - An optical coherence tomography system is provided. The system includes an OCT imager; a two-dimensional transverse scanner coupled to the OCT imager, the two-dimensional transverse scanner receiving light from the light source and coupling reflected light from a sample into the OCT imager; optics that couple light between the two-dimensional transverse scanner and the sample; a video camera coupled to the optics and acquiring images of the sample; and a computer coupled to receive images of the sample from the video camera, the computer processing the images and providing a motion offset signal based on the images to the two-dimensional transverse scanner.11-01-2012
20120274782Object Recognition Method and Recognition Apparatus - A target fish-eye image for object recognition is split into regions in accordance with the distortion direction. An object recognition portion performs object recognition by using databases prepared in accordance with the respective regions. The same process is applied to a plurality of target fish-eye images created by rotation of the target fish-eye image. A detected coordinate transformation portion restores the obtained position of an object to its original position by inverse rotation, and outputs the result as a detection result. Detection accuracy in object recognition is improved, and the quantity of data in the databases is reduced. Rotated object images required for database creation are used to create the databases for object recognition in the fish-eye image. Thus, a distorted fish-eye image is used directly for object recognition, without projective conversion or corrective image conversion of the fish-eye image.11-01-2012
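The outer loop the abstract describes (split the fish-eye image into distortion regions, recognize per region, repeat on rotated copies, and rotate detections back) can be sketched as below. The per-region recognizers, the image-rotation callable, and the 90-degree rotation set are hypothetical stand-ins for the databases and processing in the application.

```python
import numpy as np

def rotate_point_back(x, y, angle_deg, center):
    """Map a detection found in an image rotated by angle_deg about its center
    back into the coordinates of the original (unrotated) image."""
    theta = np.deg2rad(-angle_deg)
    cx, cy = center
    dx, dy = x - cx, y - cy
    return (cx + dx * np.cos(theta) - dy * np.sin(theta),
            cy + dx * np.sin(theta) + dy * np.cos(theta))

def detect_in_fisheye(image, recognizers, rotate_image, angles=(0, 90, 180, 270)):
    """recognizers: {region_name: callable(rotated_image) -> [(x, y), ...]}, each
    looking only at its own distortion region and returning full-image coordinates.
    rotate_image(image, angle) -> rotated image. Both callables are hypothetical."""
    h, w = image.shape[:2]
    center = (w / 2.0, h / 2.0)
    detections = []
    for angle in angles:
        rotated = rotate_image(image, angle)
        for recognize in recognizers.values():
            for (x, y) in recognize(rotated):
                detections.append(rotate_point_back(x, y, angle, center))
    return detections
```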
20120274781MARGINAL SPACE LEARNING FOR MULTI-PERSON TRACKING OVER MEGA PIXEL IMAGERY - A method for tracking pedestrians in a video sequence, where each image frame of the video sequence corresponds to a time step, includes using marginal space learning to sample a prior probability distribution p(x11-01-2012
20130182122INFORMATION PROCESSING APPARATUS AND METHOD - According to one embodiment, an information processing apparatus includes an acquirement unit configured to acquire a color image captured by an image capturing unit, an image conversion unit configured to convert the acquired color image into a monochrome image, a first recognition unit configured to specify a commodity included in the image captured by the image capturing unit based on the monochrome image, a second recognition unit configured to specify the commodity included in the image captured by the image capturing unit based on the acquired color image if the commodity cannot be specified by the first recognition unit, and an output unit configured to output information showing the commodity specified by the first recognition unit or the second recognition unit.07-18-2013
20130182123IMAGE PICKUP APPARATUS, CONTROL METHOD FOR THE SAME, AND PROGRAM THEREOF - An image pickup apparatus including an image pickup device configured to obtain a captured image corresponding to an object image, an object detecting unit configured to detect a specific object in the captured image, a position obtaining unit configured to obtain the position at which the specific object exists in the captured image, a composition specifying unit configured to specify a recommended composition on the basis of the existence position of the specific object when the specific object is viewed as a main object, and an instruction control unit configured to instruct the execution of a predetermined operation which notifies a user of the recommended composition.07-18-2013
20130188059Automated System and Method for Tracking and Detecting Discrepancies on a Target Object - A detection system including a target object having a target object coordinate system, a tracking unit configured to monitor a position and/or an orientation of the target object and generate a target object position signal indicative of the position and/or the orientation of the target object, a camera positioned to capture an image of the target object, an orienting mechanism connected to the camera to control an orientation of the camera relative to the target object, and a processor configured to analyze the image to detect a discrepancy in the image and, when the discrepancy is present in the image, determine a location of the discrepancy relative to the target object coordinate system based at least upon the target object position signal and the orientation of the camera, and then orient the camera and laser to aim at and point out the discrepancy.07-25-2013
20120019665AUTONOMOUS CAMERA TRACKING APPARATUS, SYSTEM AND METHOD - An autonomous camera tracking system. The system can include a tracking apparatus, and a beacon. The tracking apparatus can include an infrared sensor, a camera, a processor, and a panning stepper motor. The beacon can include an infrared emitter. The infrared sensor can receive an infrared signal from the beacon and, upon detection of a shift in the position of the infrared signal, the stepper motor can pan the camera and the infrared sensor so as to place the beacon in the field of view of the camera. The tracking apparatus can further include a tilting stepper motor, and, upon detection of a shift in the position of the infrared signal, the stepper motor can tilt the camera and the infrared sensor so as to place the beacon in the field of view of the camera.01-26-2012
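The follower described above reduces to a loop: read where the beacon's IR spot falls on the sensor, and pulse the pan and tilt steppers until the spot is re-centered. A bare-bones sketch with a hypothetical sensor/motor interface; the deadband and the one-step-per-iteration policy are made up.

```python
def follow_beacon(read_beacon_position, step_pan, step_tilt,
                  deadband_px=10, max_iterations=1000):
    """read_beacon_position() -> (x_offset_px, y_offset_px) of the IR spot
    relative to the sensor center, or None if the beacon is not currently seen.
    step_pan(direction) / step_tilt(direction): pulse one stepper motor by one
    step in direction +1 or -1. All three callables are hypothetical."""
    for _ in range(max_iterations):
        pos = read_beacon_position()
        if pos is None:
            continue  # beacon temporarily lost; a real system might start a search
        x_off, y_off = pos
        if abs(x_off) <= deadband_px and abs(y_off) <= deadband_px:
            continue  # beacon already near the center of the field of view
        if abs(x_off) > deadband_px:
            step_pan(+1 if x_off > 0 else -1)
        if abs(y_off) > deadband_px:
            step_tilt(+1 if y_off > 0 else -1)
```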
20120019664CONTROL APPARATUS FOR AUTO-TRACKING CAMERA SYSTEM AND AUTO-TRACKING CAMERA SYSTEM EQUIPPED WITH SAME - A control apparatus for a camera-monitor system having an auto-tracking function has a mode switcher switching the camera between a normal shooting mode and a tracking mode, a display position selector selecting a display position at which the selected specific object is to be displayed on the monitor screen, and a signal outputting unit outputting a drive signal for driving the auto-tracking camera so that an image of the tracked specific object is displayed at the display position. When the tracking mode is selected by the mode switcher, the signal outputting unit outputs to the camera a drive signal responsive to the amount of operation of the operation part, so that the direction in which the selected specific object shifts on the monitor screen coincides with the direction indicated during setting performed before the camera starts auto tracking of the selected specific object.01-26-2012
20130194433IMAGING PROCESSING SYSTEM AND METHOD AND MANAGEMENT APPARATUS - An imaging processing system includes one or more image capturing apparatuses, a reading unit configured to read biometric information from an authentication object person, a similarity calculation unit configured to calculate similarity based on a result of comparing biometric information read by the reading unit with true biometric information of the authentication object person, an authentication unit configured to perform authentication based on a comparison between the similarity calculated by the similarity calculation unit and a preliminarily set threshold, and a control unit configured to control, if the authentication performed by the authentication unit is successful, imaging processing, which is performed by the image capturing apparatus, based on the similarity calculated by the similarity calculation unit.08-01-2013
20120057028IMAGING SYSTEM AND PIXEL SIGNAL READOUT METHOD - An imaging system is provided that includes a target detector, a readout area determiner and a readout processor. The target detector detects a target subject from an effective pixel area of an image sensor. The readout area determiner defines a readout area within the effective pixel area, the readout area corresponding to a detected target. The readout processor reads out only pixel signals within the readout area. A partial area within the readout area is redefined as the readout area when the size of the original readout area is greater than a predetermined size.03-08-2012
20130201347PRESENCE DETECTION DEVICE - A user presence detection device includes a camera module with a silicon-based image sensor adapted to capture an image and a processing device configured to process the image to detect the presence of a user. The camera module further includes a light filter having a lower cut-off wavelength of between 550 nm and 700 nm and a higher cut-off wavelength of between 900 nm and 1100 nm.08-08-2013
20130201344SMART CAMERA FOR TAKING PICTURES AUTOMATICALLY - Methods, apparatuses, systems, and computer-readable media for taking great pictures at an event or an occasion. The techniques described in embodiments of the invention are particularly useful for tracking an object, such as a person dancing or a soccer ball in a soccer game, and automatically taking pictures of the object during the event. The user may switch the device to an Event Mode that allows the user to delegate some of the picture-taking responsibilities to the device during an event. In the Event Mode, the device identifies objects of interest for the event. Also, the user may select the objects of interest from the view displayed by the display unit. The device may also have pre-programmed objects, including objects that the device detects. In addition, the device may also detect people from the user's social networks by retrieving images from social networks like Facebook® and LinkedIn®.08-08-2013
20130201345METHOD AND APPARATUS FOR CONTROLLING VIDEO DEVICE AND VIDEO SYSTEM - Embodiments of the present invention disclose a method and an apparatus for controlling a video device as well as a video system. The video device includes a monitor and a camera that are relatively fixed, face the same direction, and are connected to a moving mechanism. The method includes: obtaining a facial image of a participant identified from a conference site image, where the conference site image is shot and provided by the camera; analyzing the facial image and, after determining from the analysis result that the facial position of the participant has deviated from the direction facing the monitor and the camera, determining the deviation direction; and controlling the moving mechanism to drive the monitor and the camera to move, according to the deviation direction, to a position facing the facial position of the participant.08-08-2013
20130201346IMAGE PROCESSING DEVICE WITH FUNCTION FOR AUTOMATICALLY ADJUSTING SEARCH WINDOW - An image processing device according to the present invention includes a means (08-08-2013
20130208128METHOD AND APPARATUS FOR USING GESTURES TO CONTROL A LASER TRACKER - A laser measurement system includes a laser tracker having a structure rotatable about first and second axes, a first light source that launches a light beam from the structure, a distance meter, first and second angular encoders that measure first and second angles of rotation about the first and second axes, respectively, a processor, and a camera system. Also, a communication device that includes a second light source and an operator-controlled device that controls emission of a light from the second light source; a retroreflector target not disposed on the communication device. Also, the camera system is operable to receive the second light and to convert it into a digital image, and the processor is operable to determine a command to control operation of the tracker based on a pattern of movement of the second light source between first and second times and the digital image.08-15-2013
