39th week of 2012 patent application highlights part 35
Patent application number | Title | Published |
20120243705 | Systems And Methods For Reconstructing An Audio Signal From Transformed Audio Information - A system and method may be configured to reconstruct an audio signal from transformed audio information. The audio signal may be resynthesized based on individual harmonics and corresponding pitches determined from the transformed audio information. Noise may be subtracted from the transformed audio information by interpolating across peak points and across trough points of harmonic pitch paths through the transformed audio information, and subtracting values associated with the trough point interpolations from values associated with the peak point interpolations. Noise between harmonics of the sound may be suppressed in the transformed audio information by centering functions at individual harmonics in the transformed audio information, the functions serving to suppress noise between the harmonics. | 2012-09-27 |
20120243706 | Method and Arrangement for Processing of Audio Signals - Method and arrangement in an audio handling entity, for damping of dominant frequencies in a time segment of an audio signal. A time segment of an audio signal is obtained, and an estimate of the spectral density or “spectrum” of the time segment is derived. An approximation of the estimate is derived by smoothing the estimate, and a frequency mask is derived by inverting the approximation. Frequencies comprised in the audio time segment are then damped based on the frequency mask. The method and arrangement involves no multi-band filtering or selection of attack and release times. | 2012-09-27 |
20120243707 | SYSTEM AND METHOD FOR PROCESSING SOUND SIGNALS IMPLEMENTING A SPECTRAL MOTION TRANSFORM - A system and method are provided for processing sound signals. The processing may include identifying individual harmonic sounds represented in sound signals, determining sound parameters of harmonic sounds, classifying harmonic sounds according to source, and/or other processing. The processing may include transforming the sound signals (or portions thereof) into a space which expresses a transform coefficient as a function of frequency and chirp rate. This may facilitate leveraging of the fact that the individual harmonics of a single harmonic sound may have a common pitch velocity (which is related to the chirp rate) across all of its harmonics in order to distinguish the harmonic sound from other sounds (harmonic and/or non-harmonic) and/or noise. | 2012-09-27 |
20120243708 | Modulation of Audio Signals in a Parametric Speaker - Methods and systems for amplitude modulation in a parametric speaker system are provided that perform truncated double sideband (TDSB) modulation of an audio signal in which most of the processing is performed in the frequency domain, thus permitting use of fast processing techniques for amplitude modulation (AM) and filtering and reducing computation cost over time domain processing. A maximum envelope value of the time domain audio signal may be applied to the carrier signal in the frequency domain, which avoids emitting the carrier signal when the input signal level is low or muted. The application of the envelope value may be smoothed to reduce discontinuity at input block boundaries. | 2012-09-27 |
20120243709 | INTEGRATED AUDIO VIDEO SIGNAL PROCESSING SYSTEM USING CENTRALIZED PROCESSING OF SIGNALS - Integrated processing of audio/video signals can eliminate unnecessary signal processors and converters without losing the functionality of typical home entertainment system components. The integrated system includes a main player that captures and processes signals digitally, a dummy display, and a dummy speaker. The dummy display may only have a display panel and a panel driver. The dummy speaker may only have a driving unit and no crossover logic. The main player may have a PC architecture and process all signals digitally for outputting signals tailored for the display device and the individual driving units of the dummy speaker. The integrated system may also provide dynamic signal adjustments based on the surrounding environment. The main player may include a storage device and can process media content stored therein to produce supplemental information to provide an optimal audiovisual experience. This supplemental information can be shared among users over a network connection. | 2012-09-27 |
20120243710 | Methods and Systems Using a Compensation Signal to Reduce Audio Decoding Errors at Block Boundaries - Methods, systems and computer-readable medium reduce and/or eliminate errors in coding/decoding of streamed audio due to resetting of decoder state values based on playback buffer access. An inaudible compensation signal is included with the audio signal. The compensation signal is generated having a characteristic selected so that the encoded streamed audio signal substantially matches the reset state values at block boundaries. In an ADPCM example, the compensation signal is chosen such that the sum of the compensation signal and the original audio signal (=the compensated audio signal) has the characteristic that, at the block boundaries, the compensated audio signal matches the initial predictor value. | 2012-09-27 |
20120243711 | MIXING APPARATUS - In an automatic correction process, automatic correction processing portions | 2012-09-27 |
20120243712 | SWITCH AND SWITCH CIRCUIT USING THE SAME - A fourth n-channel MOSFET has a source terminal and a back-gate terminal connected to each other. A switch element is connected between the source terminal of the fourth n-channel MOSFET and a ground potential, and the source terminal of the fourth n-channel MOSFET is made to become the ground potential when the fourth n-channel MOSFET is OFF. A protection circuit is provided between a connection node of the source terminal of the fourth n-channel MOSFET and an input terminal of the switch element, and the ground potential, so that a negative inflow current from the drain terminal of the fourth n-channel MOSFET caused by electrostatic discharge flows to the ground potential. | 2012-09-27 |
20120243713 | SPATIALLY CONSTANT SURROUND SOUND SYSTEM - An audio processing system may modify an input surround sound signal to generate a spatially equilibrated output surround sound signal that is perceived by a user as spatially constant for different sound pressures of the surround sound signal. The audio processing system may determine based on a psychoacoustic model of human hearing, a loudness and a localisation for a combined sound signal. The loudness and the localisation may be determined by the system for a virtual user located between the front and the rear loudspeakers that has a predetermined head position in which one ear of the virtual user is directed towards one of front or rear loudspeakers and the other ear of the virtual user being directed towards the other of the front or rear loudspeakers. The audio processing system may adapt the front and/or rear audio signal channels based on the determined loudness and localisation. | 2012-09-27 |
20120243714 | MICROPHONE PLACEMENT FOR ORAL APPLICATIONS - Microphone placement for oral applications is disclosed herein. The assembly may be attached, adhered, or otherwise embedded into or upon a removable oral appliance to form a hearing aid assembly. Such an oral appliance may be a custom-made device which can enhance and/or optimize received audio signals for vibrational conduction to the user. Received audio signals may be processed to cancel acoustic echo such that undesired sounds received by one or more intra-buccal and/or extra-buccal microphones are eliminated or mitigated. Multiple microphones may be positioned throughout the user's mouth to enhance reception of audio signals from outside sources as well as from the user's own voice. For instance, one or more microphones may be placed in contact with the inner surface of the user's cheeks to detect outside audio signals as well as in direct contact with the user's tooth or teeth to receive the user's voice through vibrational detection. | 2012-09-27 |
20120243715 | AUDIO PROCESSING DEVICE, SYSTEM, USE AND METHOD - An audio processing device includes a) an input unit for converting a time domain input signal to a number N | 2012-09-27 |
20120243716 | HEARING APPARATUS WITH FEEDBACK CANCELER AND METHOD FOR OPERATING THE HEARING APPARATUS - A hearing apparatus has artifact-free, fast feedback cancelation properties. The hearing apparatus has a first microphone coupled by way of a pre-whitening filter to a feedback canceler in a first hearing device. The hearing apparatus is configured to set a frequency response of the pre-whitening filter in dependence on a signal of a second microphone of the hearing apparatus. | 2012-09-27 |
20120243717 | HEARING AID WITH A REPLACEABLE INSERTION CAP - A hearing aid has a coupling structure, a first hearing aid housing portion and a second hearing aid housing portion including a flexible material so that the second hearing aid housing portion is adaptable to different ear canal sizes. The second hearing aid housing portion is detachably coupled to the first hearing aid housing portion via the coupling structure. | 2012-09-27 |
20120243718 | SPEAKER BASE DEVICE FOR DISPLAY - A speaker base device for a display comprises: a base, whose interior has at least one PCB and which extends outwardly to form at least one set of speaker-like structures; and a support column section, which extends upwardly from the base and connects with a display. The external cables of the display pass through an accommodating room of the support column section and electrically connect with the PCB of the base, so that the display can produce sound through the set of speaker-like structures of the base, and hence the prior sounding structure mounted on the display can be replaced for better sound performance. | 2012-09-27 |
20120243719 | Display-Based Speaker Structures for Electronic Devices - Electronic devices that contain flexible displays and one or more display-based speaker structures may be provided. The speaker structures may be positioned under the flexible display. Portions of the flexible display may be used as speaker membranes for the speaker structures. The speaker structures may be driven by transducers that convert electrical audio signal input into sound. Piezoelectric transducers or transducers formed from coils and magnets may be used to drive the speaker structures. Speaker membranes may be formed from active display areas of the flexible display. Some, all, or substantially all of the flexible display may be used as a speaker membrane for one or more display-based speaker structures. An optional cover layer may be provided with speaker openings so that sound may pass from the display-based speaker structures to the exterior of the device. | 2012-09-27 |
20120243720 | Auto-Play Audible Publication - A novel auto-play audible publication, which enables the contents and voice to be played simultaneously without external players; the audible publication is any of a book, newspaper, magazine, booklet, leaflet, calendar, card, CD album, notebook, or religious scripture/journal; in particular, the audible publication comprises: a thin speaker, a play panel, a control circuit board, a speaker-driving flexible circuit board and a power supply, of which the control circuit board is provided with a memory for storing all audio data in the audible publication; the play panel can be controlled to select the contents and send signals to the control circuit board, and make the speaker-driving flexible circuit board output audio to the thin speaker for playback. While simultaneously reading and listening, the audible publication also allows learning the contents acoustically and listening to the correct voice for a better learning effect. | 2012-09-27 |
20120243721 | Differential Microphone Unit and Mobile Apparatus - Disclosed is a differential microphone unit which can improve the characteristics of a microphone unit and widen the directional range thereof. The disclosed differential microphone unit | 2012-09-27 |
20120243722 | Speaker - A speaker includes a ferrite yoke plate, an inner magnet, a first pole member, an outer magnet, a second pole member, a diaphragm, and a voice coil. Both the inner and the outer magnets are disposed on the ferrite yoke plate, and are spaced apart from each other. The first pole member is disposed on a top of the inner magnet, while the second pole member is disposed on a top of the outer magnet. Besides, a circumference of the first pole member and a circumference of the second pole member are spaced apart from each other and define a gap. The voice coil is connected with the diaphragm, and extends axially into the gap between the two pole members. Thereby, the magnetic lines of force in the speaker can be more regularly concentrated, and the driving efficiency of the voice coil can be promoted, making the present invention particularly suitable for micro-speakers. | 2012-09-27 |
20120243723 | EARPHONE WITH A SUPPORT ELEMENT - An earphone including a mounting element for mounting the earphone to an auricle is provided. The mounting element is attachable to a body of the earphone. The earphone further includes an adjusting element for adjusting the position of the mounting element with respect to the body of the earphone. A mounting element for mounting the earphone to an auricle is provided. The mounting element is attachable to a body of an earphone, and the mounting element includes an adjusting element for adjusting the position of the mounting element with respect to the body of the earphone. | 2012-09-27 |
20120243724 | EAR PAD AND EARPHONE HAVING THE SAME - An ear pad is made of a deformable material and mounted on an earphone, at least a tip of which is insertable into an ear. The ear pad comprises a cylindrical body portion and an elastically deformable portion. The cylindrical body portion has a sound guide hole formed therein, being adapted to cause a sound to impinge on the ear. The deformable portion is connected to a tip of the body portion and spreads from the tip toward a rear end of the body portion, and includes a plurality of slit-shaped openings radially extending therefrom. The body portion includes grooves axially extending therefrom, corresponding to the openings in the outer peripheral surface of the body portion. Each opening overlaps with a corresponding groove in cases where the elastically deformable portion is deformed and pressed against the outer peripheral surface of the body portion. | 2012-09-27 |
20120243725 | EARPHONE - An earphone is disclosed. The earphone comprises an external auditory meatus insertion member, a casing and a driver unit. At least a part of the external auditory meatus insertion member is insertable into an external auditory meatus. The external auditory meatus insertion member is attached to the casing. The driver unit is disposed within the casing to generate sounds. The casing includes a front space formed in front of the driver unit, a sound guide hole in communication with the front end of the front space to cause sounds to impinge on the external auditory meatus, and a rear space formed in the rear of the driver unit. | 2012-09-27 |
20120243726 | EARPHONE - An earphone is disclosed. The earphone comprises an external auditory meatus insertion member, a casing and a driver unit. The external auditory meatus insertion member has a part which is insertable into an external auditory meatus. The external auditory meatus insertion member is detachable from the casing. The driver unit is disposed within the casing to generate sounds. The external auditory meatus insertion member further includes an elastically deformable soft insertion member and a highly rigid hard insertion member. | 2012-09-27 |
20120243727 | DEVICE AND METHOD FOR EMBEDDING WATERMARKS IN CONTENTS AND FOR DETECTING EMBEDDED WATERMARKS - Provided are a device and method for detecting watermarks in content carrying watermarks. One method for watermark detection according to the present invention comprises the steps of: extracting the label of a frequency component of the content; generating a bit sequence by making bit values correspond on the basis of changes in the label being extracted; checking correlation while shifting the phase of the band-spreading code (pn sequence) with respect to the generated bit sequence; and checking the amount of phase shift in the band-spreading code at times when the correlation checked in this way falls into the category of autocorrelation, and then determining a bit group having a value corresponding to the amount of phase shift checked in this manner. The bit group determined in this way constitutes part of watermark data. | 2012-09-27 |
20120243728 | Bulk Region of Interest Learning - A system and method for mail processing. A method includes receiving an image of a mail piece, and identifying multiple regions of interest of the image. The method includes determining a classification key for the image based on a plurality of relationships between the multiple regions of interest and identifying a most-changing region of interest of the multiple regions of interest. The method includes processing the mail piece using the identified most-changing region of interest as the recipient address block. | 2012-09-27 |
20120243729 | LOGIN METHOD BASED ON DIRECTION OF GAZE - A method of authenticating a user of a computing device is proposed, together with computing device on which the method is implemented. A plurality of objects is displayed on a display screen. The plurality of objects includes at least objects that make up a sequence of objects pre-selected as the user's passcode. In response to a trigger signal an image of the user's face is captured while looking at one of the objects on the display screen. A determination of which object is in the direction of the user's gaze is made from the photograph and whether or not the gaze is on the correct object in the sequence of the passcode. This is repeated for each object in the sequence of the passcode. | 2012-09-27 |
20120243730 | COLLABORATIVE CAMERA SERVICES FOR DISTRIBUTED REAL-TIME OBJECT ANALYSIS - A collaborative object analysis capability is depicted and described herein. The collaborative object analysis capability enables a group of cameras to collaboratively analyze an object, even when the object is in motion. The analysis of an object may include one or more of identification of the object, tracking of the object while the object is in motion, analysis of one or more characteristics of the object, and the like. In general, a camera is configured to discover the camera capability information for one or more neighboring cameras, and to generate, on the basis of such camera capability information, one or more actions to be performed by one or more neighboring cameras to facilitate object analysis. The collaborative object analysis capability also enables additional functions related to object analysis, such as alerting functions, archiving functions (e.g., storing captured video, object tracking information, object recognition information, and so on), and the like. | 2012-09-27 |
20120243731 | IMAGE PROCESSING METHOD AND IMAGE PROCESSING APPARATUS FOR DETECTING AN OBJECT - An image processing method and an image processing apparatus for detecting an object are provided. The image processing method includes the following steps: partitioning an image into at least a first sub-image covering a first zone and a second sub-image covering a second zone according to a designed trait; and performing an image detection process upon the first sub-image for checking whether the object is within the first zone to generate a first detecting result. The object is a human face, and the image detection process is a face detection process. | 2012-09-27 |
20120243732 | Adaptable Framework for Cloud Assisted Augmented Reality - A mobile platform efficiently processes sensor data, including image data, using distributed processing in which latency sensitive operations are performed on the mobile platform, while latency insensitive, but computationally intensive operations are performed on a remote server. The mobile platform acquires sensor data, such as image data, and determines whether there is a trigger event to transmit the sensor data to the server. The trigger event may be a change in the sensor data relative to previously acquired sensor data, e.g., a scene change in an image. When a change is present, the sensor data may be transmitted to the server for processing. The server processes the sensor data and returns information related to the sensor data, such as identification of an object in an image or a reference image or model. The mobile platform may then perform reference based tracking using the identified object or reference image or model. | 2012-09-27 |
20120243733 | MOVING OBJECT DETECTING DEVICE, MOVING OBJECT DETECTING METHOD, MOVING OBJECT DETECTION PROGRAM, MOVING OBJECT TRACKING DEVICE, MOVING OBJECT TRACKING METHOD, AND MOVING OBJECT TRACKING PROGRAM - A moving object detecting device | 2012-09-27 |
20120243734 | Determining Detection Certainty In A Cascade Classifier - Disclosed are embodiments for determining detection certainty in a cascade classifier | 2012-09-27 |
20120243735 | ADJUSTING DISPLAY FORMAT IN ELECTRONIC DEVICE - A display format adjustment system includes a receiving module, a visual condition determination module, a display format determination module, and a display control module. The receiving module receives content for display in a first display format. The visual condition determination module determines a visual condition of a viewer in front of a display. The display format determination module determines a second display format based on the first display format and the visual condition of the viewer. The display control module displays the content in the second display format on the display. | 2012-09-27 |
20120243736 | ADJUSTING PRINT FORMAT IN ELECTRONIC DEVICE - A print format adjustment system includes a receiving module, a visual condition determination module, a print format determination module, and a print control module. The receiving module receives content for printing in a first print format. The visual condition determination module establishes the sharpness of vision of a viewer in front of a display, at a predetermined view distance. The print format determination module determines a second print format based on both the first print format and the visual condition of the viewer. The print control module prints the content in the second print format. | 2012-09-27 |
20120243737 | IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, RECORDING MEDIUM, AND PROGRAM - An image processing apparatus includes: a calculating unit that calculates an evaluation value, which is expressed as a sum of confidence degrees obtained by mixing, at a predetermined mixing ratio, a matching degree of a first feature quantity and a matching degree of a second feature quantity between a target image containing an object to be tracked and a comparison image which is an image of a comparison region compared to the target image of a first frame, when the mixing ratio is varied and obtaining the mixing ratio when the evaluation value is maximum; and a detecting unit that detects an image corresponding to the target image of a second frame based on the confidence degrees in which the mixing ratio is set when the evaluation value is the maximum. | 2012-09-27 |
20120243738 | IMAGE PROCESSING DEVICE AND IMAGE PROCESSING METHOD - An image processing device comprises: a tracking area setting unit that sets a tracking area in an input moving image obtained by photographing an object; a following feature point setting unit that detects a feature point that exhibits a motion in correlation with the motion of the tracking area and sets the detected feature point as a following feature point; a motion detection unit that detects movement over time of the following feature point within the input image; and a clip area setting unit that sets a clip area of an image to be employed when a partial image including the tracking area is clipped out of the input image for either recording or displaying or both recording and displaying, and that sets a size and a position of the clip area on the basis of a motion detection result obtained by the motion detection unit. | 2012-09-27 |
20120243739 | INFORMATION PROCESSING DEVICE, OBJECT RECOGNITION METHOD, PROGRAM, AND TERMINAL DEVICE - There is provided an information processing device including a database that stores feature quantities of two or more images, the database being configured such that identification information for identifying an object in each image and an attribute value related to a lighting condition under which the object was imaged are associated with a feature quantity of each image, an acquisition unit configured to acquire an input image captured by an imaging device, and a recognition unit configured to recognize an object in the input image by checking a feature quantity determined from the input image against the feature quantity of each image stored in the database. The feature quantities stored in the database include feature quantities of a plurality of images of an identical object captured under different lighting conditions. | 2012-09-27 |
20120243740 | Scene Determination and Prediction - A system and method for scene determination is disclosed. The system comprises a communication interface, an object detector, a temporal pattern module and a scene determination module. The communication interface receives a video including at least one frame. The at least one frame includes information describing a scene. The object detector detects a presence of an object in the at least one frame and generates at least one detection result based at least in part on the detection. The temporal pattern module generates a temporal pattern associated with the object based at least in part on the at least one detection result. The scene determination module determines a type of the scene based at least in part on the temporal pattern. | 2012-09-27 |
20120243741 | Object Recognition For Security Screening and Long Range Video Surveillance - A method of detecting an object in image data that is deemed to be a threat includes annotating sections of at least one training image to indicate whether each section is a component of the object, encoding a pattern grammar describing the object using a plurality of first order logic based predicate rules, training distinct component detectors to each identify a corresponding one of the components based on the annotated training images, processing image data with the component detectors to identify at least one of the components, and executing the rules to detect the object based on the identified components. | 2012-09-27 |
20120243742 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND PROGRAM - An information processing device includes a conversion unit that performs conversion such that an area including a feature point and the periphery thereof in a specific target is set as a first area and, when one pixel in the first area is set as a reference pixel, an area including the reference pixel and pixels in the periphery thereof is set as a second area, and, based on a comparison result of the feature amount of the reference pixel and the feature amount of another pixel in the second area, the feature amount of another pixel is converted for each pixel in the second area, and a calculation unit that calculates a feature amount to be used in an identification process for identifying the specific target by performing computation for the value of each pixel in the second area which is obtained from the conversion for each reference pixel. | 2012-09-27 |
20120243743 | DEVICE FOR INTERACTION WITH AN AUGMENTED OBJECT - A device for interacting with at least one augmented object | 2012-09-27 |
20120243744 | SECURITY ELEMENT COMPRISING A SUBSTRATE BEARING AN OPTICAL STRUCTURE AND A REFERENCE PATTERN, AND ASSOCIATED METHOD - The invention relates to a security element | 2012-09-27 |
20120243745 | Methods and Apparatus for Automatic Testing of a Graphical User Interface - Methods and apparatus in a computer for automatically testing computer programs involve opening a predefined graphical user interface (GUI) on a screen of the computer; loading a set of program script instructions, associated with the predefined GUI, from a script database in communication with the computer; reading the loaded set of program script instructions; retrieving, based on the loaded set, data and at least one image object corresponding to the predefined GUI from a data and image object database in communication with the computer; taking a screenshot of the predefined GUI that includes at least one image object of the predefined GUI; determining whether an image object in the screenshot matches an image object retrieved from the data and image object database; and, if so, determining a target position on the screen of the matching image object based on data retrieved from the data and image object database, and activating a control function adapted to control the predefined GUI based on the loaded set of program script instructions and the target position. | 2012-09-27 |
20120243746 | IMAGE PROCESSOR, IMAGE PROCESSING METHOD, AND PROGRAM - An image processor includes a synthesis processing unit adapted to generate a panoramic image by clipping images from a plurality of captured images and connecting the clipped images together and a processing unit adapted to correct image distortions of a subject in the panoramic image due to varying distances to the subject by modifying the panoramic image on the basis of distance information obtained by measuring the distances to a plurality of positions of the subject. | 2012-09-27 |
20120243747 | System and Method for Precision Measurement of Position, Motion and Resonances - A non-contact sensing system for measuring and analyzing an object's position, motion, and/or resonance utilizes optical capturing of image features, data extraction, and signal processing to determine changes in the object's motion or position according to changes in signals, which are associated with the excitation of photons due to the object's motion. | 2012-09-27 |
20120243748 | Image Capture and Manipulation - The present disclosure includes, among other things, systems, methods and program products for image capture and manipulation. | 2012-09-27 |
20120243749 | MIRROR SYSTEM AND METHOD FOR ACQUIRING BIOMETRIC DATA - A system and method for obtaining biometric imagery such as iris imagery from large capture volumes is disclosed wherein a substantially rotationally symmetric mirror such as a cone or sphere is rotated at a constant velocity about a central axis. | 2012-09-27 |
20120243750 | METHOD, APPARATUS AND SYSTEM FOR OUTPUTTING A GROUP OF IMAGES - There is described an apparatus for outputting a group of images for display, the group being taken from a plurality of images, each image in the plurality of images having a face located therein, the apparatus comprising a processor configured to: retrieve a set of images from a storage medium, the set of images containing at least the plurality of images from which the group of images to be displayed is selected; identify the face in each of the plurality of images; identify variable features on the face in each of the plurality of images; establish the group of images in accordance with a measure of the dissimilarity between the variable features in the plurality of images; and output the group of images for display. | 2012-09-27 |
20120243751 | BASELINE FACE ANALYSIS - Facial information is collected on a person and used to analyze affect. Facial information can be used to determine a baseline face which characterizes the default expression that a person has on their face. Deviations from this baseline face can be used to evaluate affect and further be used to infer mental states. Facial images can be automatically scored for various expressions including smiles, frowns, and squints. Image descriptors and image classifiers can be used during this baseline face analysis. | 2012-09-27 |
20120243752 | IMAGE PICK-UP APPARATUS HAVING A FUNCTION OF RECOGNIZING A FACE AND METHOD OF CONTROLLING THE APPARATUS - It is judged whether or not a human face detecting mode is set. When it is determined that the human face detecting mode is set, a two-dimensional face detecting process is performed to detect a human face. When it is determined that a human face has not been detected in the two-dimensional face detecting process, a three-dimensional face detecting process is performed to detect a human face. In addition, when an animal face detecting mode is set, a three-dimensional face detecting process is performed to detect a face of an animal corresponding to the set detecting mode. | 2012-09-27 |
20120243753 | System and Method for Assessing Image Interpretability in Anatomic Pathology - A portion of imagery data is obtained from a digital slide and a protocol of image analysis/diagnostic tasks is performed on the portion of imagery data by a pathologist or an image analysis module. The result of each task (e.g., success or no success) is recorded and a score is determined for the portion of the imagery data. Multiple portions of imagery data from the digital slide are analyzed and scored and the various scores from the multiple portions of imagery data are calculated to determine an overall score for the digital slide. Regions of the digital slide can be scored separately. Multiple rounds of scoring (by different pathologists and/or different image analysis algorithms) may be employed to increase the accuracy of the score for a digital slide or region thereof. | 2012-09-27 |
20120243754 | SYSTEM AND METHOD FOR MANAGEMENT AND DISTRIBUTION OF DIAGNOSTIC IMAGING - A method of distributing an image study to a chosen image reader is disclosed having steps of receiving an image study from an image producer at a third party communication module, sending a receive notification message to a messaging layer, sending a study available notification message from the messaging layer to a workload distribution engine wherein the available notification message includes extracted image study information pulled from study headers of the image study, identifying image study rules from the extracted image study information, applying an image study complexity to the image study based on the image study rules, calculating image reader complexities for a plurality of image readers subscribed to receive image studies from the image producer, each of the image reader complexities calculated using the image study complexity and an image reader profile assigned to each of the plurality of accredited image readers, selecting the chosen image reader from the plurality of image readers based on the image reader complexities, assigning the image study to the chosen image reader, and displaying the image study on a user interface to the chosen image reader. | 2012-09-27 |
20120243755 | METHOD FOR AUTOMATICALLY SEEDING PREVIOUSLY-CLASSIFIED IMAGES AMONG IMAGES OF OBJECTS OF INTEREST FROM A SPECIMEN - A computer-assisted method of classifying cytological samples, includes using a processor to analyze images of cytological samples and identify cytological objects of interest within the sample images, wherein the processor (i) displays images of identified cytological objects of interest from the sample images to a reviewer, (ii) accesses a database of images of previously classified cytological objects, and (iii) displays to the reviewer, interspersed with the displayed images of the identified objects of interest from the sample images, one or more images obtained from the database of images of previously-classified objects. | 2012-09-27 |
20120243756 | Method for Reconstructing Motion-Compensated Magnetic Resonance Images From Non-Cartesian K-Space Data - A method for reconstructing a motion-compensated image depicting a subject with a magnetic resonance imaging (MRI) system is provided. An MRI system is used to acquire a time series of k-space data from the subject by sampling k-space along non-Cartesian trajectories, such as radial, spiral, or other trajectories at a plurality of time frames. Those time frames at which motion occurred are identified and this information used to segment the time series into a plurality of k-space data subsets. For example, the k-space data subsets contain k-space data acquired at temporally adjacent time frames that occur between those identified time frames at which motion occurred. Motion correction parameters are determined from the k-space data subsets. Using the determined motion correction parameters, the k-space data subsets are corrected for motion. The corrected data subsets are combined to form a corrected k-space data set, from which a motion-compensated image is reconstructed. | 2012-09-27 |
20120243757 | SYSTEM AND METHOD FOR DETECTION OF ACOUSTIC SHADOWS AND AUTOMATIC ASSESSMENT OF IMAGE USABILITY IN 3D ULTRASOUND IMAGES - A method for automatically assessing medical ultrasound (US) image usability, includes extracting one or more features from at least one part of a medical ultrasound image, calculating for each feature a feature score for each pixel of the at least one part of the ultrasound image, and classifying one or more image pixels of the at least one part as either usable or unusable, based on a combination of feature scores for each pixel, where usable pixels have intensity values substantially representative of one or more anatomical structures. | 2012-09-27 |
20120243758 | METHODS AND SYSTEMS FOR FUNCTIONAL IMAGING OF CARDIAC TISSUE - One embodiment of these teachings includes an imaging modality, based on the ability of imaging technologies to detect wave-induced tissue deformation at depth, that allows viewing the propagation of action potentials deep within myocardial tissue, thereby helping to clarify clinical and physiological dynamical issues. | 2012-09-27 |
20120243759 | IMAGE PROCESSING APPARATUS, X-RAY CT APPARATUS, AND IMAGE PROCESSING METHOD - An image processing apparatus includes a contrast side obtaining unit, an estimating unit, a simple side obtaining unit, a core area computing unit, a synthesizing unit and a display control unit. The contrast side obtaining unit obtains a contrast area and a high CT value area around the contrast area. The estimating unit estimates a contrast area in the non-contrast data corresponding to the obtained contrast area. The simple side obtaining unit obtains a high CT value area around the estimated contrast area. The core area computing unit computes a core area included in the high CT value area of the contrast data and the non-contrast data. The synthesizing unit aligns the contrast data with the non-contrast data and generates superimposed data by superimposing the high CT value area of the contrast data on the non-contrast data. The display control unit displays the superimposed data on a display device. | 2012-09-27 |
20120243760 | PLAQUE REGION EXTRACTING METHOD AND APPARATUS THEREFOR - According to one embodiment, a plaque region extracting apparatus includes a blood vessel wall data extracting unit, an intermediate image data generating unit, an enhancement processing unit, and a plaque extracting unit. The blood vessel wall data extracting unit extracts first image data including a blood vessel wall from image data acquired by imaging a subject including blood vessels. The intermediate image data generating unit filters an intermediate region in the first image data to generate intermediate second image data. The enhancement processing unit processes the difference between the first image data and the second image data to generate third image data. The plaque extracting unit extracts a plaque in the blood vessel on the basis of the third image data. | 2012-09-27 |
20120243761 | SYSTEM AND METHOD FOR ESTIMATING VASCULAR FLOW USING CT IMAGING - A system and method for estimating vascular flow using CT imaging include a computer readable storage medium having stored thereon a computer program comprising instructions, which, when executed by a computer, cause the computer to acquire a first set of data comprising anatomical information of an imaging subject, the anatomical information comprises information of at least one vessel. The instructions further cause the computer to process the anatomical information to generate an image volume comprising the at least one vessel, generate hemodynamic information based on the image volume, and acquire a second set of data of the imaging subject. The computer is also caused to generate an image comprising the hemodynamic information in combination with a visualization based on the second set of data. | 2012-09-27 |
20120243762 | ODONTOLOGICAL IMAGING APPARATUS - A dental CT apparatus includes a control system arranged to move a radiation source and an imaging sensor on opposite sides of an imaging station. The control system includes at least a first imaging mode designed for imaging patients and means for selecting at least one second imaging mode in which the source of radiation and the imaging sensor are driven during an exposure at an angular velocity of less than 4 degrees/second. Digital three-dimensional models of teeth are generated by imaging impressions or models of the teeth with a CT apparatus provided with at least one imaging mode specific to that purpose. | 2012-09-27 |
20120243763 | SIGNAL-TO-NOISE ENHANCEMENT IN IMAGING APPLICATIONS USING A TIME-SERIES OF IMAGES - An apparatus and method are disclosed for improving imaging based on a time-series of images. In one embodiment, a time-series of images is acquired using the same imaging protocol on the same subject area, but the images are spaced in time by one or more time intervals (e.g., 1, 2, 3, ... seconds apart). A sub-region is projected across all of the images to perform a localized analysis (corresponding X-Y pixels or X-Y-Z voxels are analyzed across all images) that identifies temporal components within each sub-region. In some of the sub-regions, the temporal components are removed when the amplitude of the component is below a predetermined amplitude threshold. The images are then combined using the sub-regions with reduced components in order to obtain a single image with reduced noise. | 2012-09-27 |
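The temporal-component removal described in this entry can be sketched as a per-pixel Fourier decomposition along the time axis, zeroing weak components before combining the frames. The FFT as the decomposition, the uniform threshold, and keeping the DC term are illustrative assumptions, not the claimed method.

```python
import numpy as np

def denoise_time_series(images, amp_threshold):
    """Suppress weak temporal components per pixel, then average.

    images: array of shape (T, H, W) -- co-registered frames taken
    with the same protocol at successive time intervals. A sketch
    of the idea only; transform and threshold are assumptions."""
    stack = np.asarray(images, dtype=float)
    # Decompose each pixel's time series into temporal components.
    spectrum = np.fft.fft(stack, axis=0)
    # Drop components whose amplitude falls below the threshold,
    # but always keep the DC bin so the mean signal survives.
    mask = np.abs(spectrum) >= amp_threshold
    mask[0] = True
    filtered = np.fft.ifft(spectrum * mask, axis=0).real
    # Combine the filtered frames into one lower-noise image.
    return filtered.mean(axis=0)
```

On a stack of frames whose differences are pure low-amplitude noise, this returns an image close to the common underlying signal.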
20120243764 | METHOD AND SYSTEM FOR PLAQUE CHARACTERIZATION - A method of quantifying plaques imaged by cardiac computed tomography angiography (“CCTA”) scan data. Calcified and non-calcified component thresholds are determined based at least in part on attenuation values of a pool of blood in the CCTA scan data. An epicardial fat threshold is determined and used to classify epicardial fat in the CCTA scan data. A portion of CCTA scan data positioned between a detected outer boundary of the coronary artery and a portion classified as lumen is classified as arterial wall. NCP and CP seeds are identified in the arterial wall portion. Portions of the CCTA scan data continuous with a NCP seed and having attenuation values greater than an artery wall value and less than the NCP threshold are classified as NCP, and portions of the CCTA scan data continuous with the CP seed and having attenuation values greater than the CP threshold are classified as CP. | 2012-09-27 |
20120243765 | ASSOCIATING ACQUIRED IMAGES WITH OBJECTS - A system for associating acquired images with objects is disclosed. It comprises an image selector. | 2012-09-27 |
20120243766 | IMAGE EVALUATION METHOD AND SYSTEM - A method and system for evaluating images, such as x-ray images, to provide feedback that can be used for subsequent image acquisition. The feedback may be used to adjust the positioning of a patient with respect to an image capture device, or a setting of the image capture device. The image capture device may be part of a dental x-ray imaging system that generates an x-ray image of the patient and provides the image to a collector service. The image is evaluated by the collector service for positioning errors and other operator correctable issues that may have had an impact on the image quality. A report that includes feedback regarding the operator correctable issues, as well as suggested corrective action, is generated and provided to the imaging system operator. | 2012-09-27 |
20120243767 | METHOD FOR TRACKING CELLS - Simple, high-precision cell tracking is realized. Provided is a method for tracking cells, comprising an image acquisition step. | 2012-09-27 |
20120243768 | METHOD OF ANALYZING CELL STRUCTURES AND THEIR COMPONENTS - A cell is provided that contains a plurality of virus particles. A first image of a first virus particle and a second image of a second virus particle are taken by electron microscopy technology. The first virus particle is characterized as being in a first maturity stage and the second virus particle as being in a second maturity stage. The first image and the second image are transformed to first and second gray scale profiles, respectively, based on pixel data. The first and second gray scale profiles are then saved as first and second templates, respectively. A third virus particle in a third image is identified. The third image is transformed into a third gray scale profile. The third gray scale profile is compared to the first and second templates to determine a maturity stage of the third virus particle. | 2012-09-27 |
20120243769 | QUANTIFYING CELL DEATH - The invention relates to methods of diagnosis, particularly methods of staging and diagnosing neurodegenerative diseases using images of cell death in the eye. | 2012-09-27 |
20120243770 | PATTERN INSPECTION APPARATUS AND PATTERN INSPECTION METHOD - In accordance with an embodiment, a pattern inspection method includes: applying a light generated from a light source to the same region of a substrate in which an inspection target pattern is formed; guiding, imaging and then detecting a reflected light from the substrate, and acquiring a detection signal for each of a plurality of different wavelengths; and adding the detection signals of the different wavelengths in association with an incident position of an imaging surface to generate added image data including information on a wavelength and signal intensity, judging, by the added image data, whether the inspection target pattern has any defect, and when judging that the inspection target pattern has a defect, detecting the position of the defect in a direction perpendicular to the substrate. | 2012-09-27 |
20120243771 | THREE-DIMENSIONAL ULTRASONIC INSPECTION APPARATUS - A three-dimensional ultrasonic inspection apparatus | 2012-09-27 |
20120243772 | METHOD FOR EXTRACTING CONTOUR OF PATTERN ON PHOTO MASK, CONTOUR EXTRACTION APPARATUS, METHOD FOR GUARANTEEING PHOTO MASK, AND METHOD FOR MANUFACTURING SEMICONDUCTOR DEVICE - According to one embodiment, a method includes acquiring information about a two-dimensional distribution of secondary electron intensity for a measurement target pattern, extracting, by a first method, an edge position of an edge for correction value acquisition, extracting, by a second method, an edge position of the edge for correction value acquisition, acquiring a difference between the edge positions extracted by the first and second methods, as a correction value, extracting, by the second method, an edge position of a desired edge based on the information about the two-dimensional distribution, and correcting the edge position of the desired edge based on the correction value. | 2012-09-27 |
20120243773 | Design-Based Inspection Using Repeating Structures - Systems and methods for design-based inspection using repeating structures are provided. | 2012-09-27 |
20120243774 | METHOD FOR RECONSTRUCTION OF URBAN SCENES - An urban scenes reconstruction method includes: acquiring digital data of a three-dimensional subject, the digital data comprising a 2D photograph and a 3D scan; fusing the 3D scan and the 2D photograph to create a depth-augmented photograph; decomposing the depth-augmented photograph into a plurality of constant-depth layers; detecting repetition patterns of each constant-depth layer; and using the repetitions to enhance the 3D scan to generate a polygon-level 3D reconstruction. | 2012-09-27 |
20120243775 | WIDE BASELINE FEATURE MATCHING USING COLLOBRATIVE NAVIGATION AND DIGITAL TERRAIN ELEVATION DATA CONSTRAINTS - A method for wide baseline feature matching comprises capturing one or more images from an image sensor on each of two or more platforms when the image sensors have overlapping fields of view, performing a 2-D feature extraction on each of the captured images in each platform using local 2-D image feature descriptors, and calculating 3-D feature locations on the ellipsoid of the Earth surface from the extracted features using a position and attitude of the platform and a model of the image sensor. The 3-D feature locations are updated using digital terrain elevation data (DTED) as a constraint, and the extracted features are matched using the updated 3-D feature locations to create a common feature zone. A subset of features from the common feature zone is selected, and the subset of features is inputted into a collaborative filter in each platform. A convergence test is then performed on other subsets in the common feature zone, and falsely matched features are pruned from the common feature zone. | 2012-09-27 |
20120243776 | IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND PROGRAM - An image processing apparatus includes a noise removal unit that corrects a geometric mismatch between the optical noise of a left eye image and that of a right eye image by performing a noise removal process to remove the separately generated optical noise from the left eye image and the right eye image, which are captured and obtained by a two-lens type stereoscopic image capturing camera. | 2012-09-27 |
20120243777 | SYSTEM AND METHOD FOR SEGMENTATION OF THREE-DIMENSIONAL IMAGE DATA - In one embodiment, a system for computing class identifiers for three-dimensional pixel data has been developed. The system comprises a plurality of class identifying processors, and a data grouper operatively connected to a first memory. Each class identifying processor has a plurality of inputs for at least one pixel value and a plurality of class identifiers for pixel values neighboring the at least one pixel value and each class identifying processor is configured to generate a class identifier for the at least one pixel value input with reference to the class identifiers for the neighboring pixel values. The data grouper is configured to retrieve a plurality of pixel values from the first memory and a plurality of class identifiers for pixel values neighboring the retrieved pixel values. | 2012-09-27 |
20120243778 | IMAGE RECOGNIZING APPARATUS, METHOD FOR RECOGNIZING IMAGE AND NON-TRANSITORY COMPUTER READABLE MEDIUM - An image recognizing apparatus includes a dictionary memory, a block determining module and a recognizing module. The dictionary memory stores dictionary data. The block determining module determines that a target block comprising a target pixel to be processed of a plurality of pixels in image data is a shared block to which the dictionary data is used or a mirror block to which the dictionary data to the shared block is used, based on a position of the target block. The recognizing module uses common dictionary data for the shared block and the mirror block, and recognizes a characteristic portion of the image expressed by the image data. | 2012-09-27 |
20120243779 | RECOGNITION DEVICE, RECOGNITION METHOD, AND COMPUTER PROGRAM PRODUCT - According to an embodiment, a recognition device includes a generation unit to select, plural times, groups each including learning samples from a storage unit, learn a classification metric for classifying the groups selected in each selection, and generate an evaluation metric including the classification metrics; a transformation unit to transform a first feature value of an image including an object into a second feature value using the evaluation metric; a calculation unit to calculate similarities of the object to categories in a table using the second feature value and reference feature values; and a registration unit to register the second feature value as the reference feature value in the table associated with the category of the object and register the first feature value as the learning sample belonging to the category of the object in the storage unit. The generation unit performs the generation again. | 2012-09-27 |
20120243780 | Red-Eye Removal Using Multiple Recognition Channels - This disclosure pertains to apparatuses, methods, and computer readable media for red-eye removal techniques using multiple recognition channels. In the following examples, red, golden, and white recognition channels are used. A recognition channel is the monochrome extraction from a color photograph in a manner designed to make one kind of red-eye artifact glow with maximum contrast. Once the red-eye artifact has been characterized by, e.g., size and location, the techniques disclosed herein may then discern whether the red-eye artifact is, for example, a red-, golden-, or white-eye case by examining the configuration and characteristics of prominence bitmasks created for the various recognition channels. Once the type of red-eye case has been discerned, the techniques disclosed herein may then replace the artifact with a photographically reasonable result based on the type of red-eye case being repaired. Specular reflection may also be re-added to the photograph. | 2012-09-27 |
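The "recognition channel" idea in the red-eye entry above, a monochrome extraction that makes one artifact type glow with maximum contrast, can be sketched with simple per-pixel channel arithmetic. The exact channel formulas and the mean-prominence classification rule below are illustrative guesses, not the formulas from this application.

```python
import numpy as np

def recognition_channels(rgb):
    """Monochrome extractions that make each red-eye artifact type
    stand out. rgb: float array (H, W, 3), values in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return {
        # Red artifacts: red strongly dominates the other primaries.
        "red": np.clip(r - np.maximum(g, b), 0.0, 1.0),
        # Golden artifacts: red and green high, blue low.
        "golden": np.clip(np.minimum(r, g) - b, 0.0, 1.0),
        # White artifacts: all three channels saturated together.
        "white": np.minimum(np.minimum(r, g), b),
    }

def classify_artifact(rgb, threshold=0.5):
    """Pick the channel with the largest prominent-pixel fraction
    over the artifact region, standing in for the 'examine the
    prominence bitmasks' step of the abstract."""
    scores = {name: float((chan > threshold).mean())
              for name, chan in recognition_channels(rgb).items()}
    return max(scores, key=scores.get)
```

Once the case is labeled, a repair routine specific to that case (not shown) would replace the artifact pixels.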
20120243781 | IMAGE PROCESSING DEVICE, IMAGE PROCESSING METHOD, AND PROGRAM - An image processing device including a decoding unit that decodes compressed image data that is an encoded image and generates a decoded image together with parameters relating to the encoding that are calculated during encoding; an image processing unit that applies image processing including at least a color adjustment process on the decoded image; a control unit that controls whether to encode the decoded image on which image processing has been applied by the image processing unit using the parameters or to encode the decoded image without using the parameters according to processing of the image processing unit; and an encoding unit that encodes the decoded image according to the control of the control unit. | 2012-09-27 |
20120243782 | SIGNAL PROCESSING APPARATUS AND METHOD, AND PROGRAM - Signals are provided which represent colors in a wider color range than predetermined standards allow, yet can be handled by apparatus conforming to such predetermined standards. A primary color converter converts first color signals having primary color points in a wider color range than the primary color points according to BT.709 into second color signals based on the primary colors according to BT.709. A photoelectric transducer converts the second color signals into third color signals according to photoelectric transducer characteristics defined in a numerical range wider than the range from 0 to 1.0 of color signals corresponding to a luminance signal and color difference signals according to BT.709. A color signal converter converts the third color signals into a luminance signal and color difference signals. A corrector incorporated in the color signal converter corrects the color difference signals. | 2012-09-27 |
20120243783 | Red-Eye Removal Using Multiple Recognition Channels - This disclosure pertains to apparatuses, methods, and computer readable media for red-eye removal techniques using multiple recognition channels. In the following examples, red, golden, and white recognition channels are used. A recognition channel is the monochrome extraction from a color photograph in a manner designed to make one kind of red-eye artifact glow with maximum contrast. Once the red-eye artifact has been characterized by, e.g., size and location, the techniques disclosed herein may then discern whether the red-eye artifact is, for example, a red-, golden-, or white-eye case by examining the configuration and characteristics of prominence bitmasks created for the various recognition channels. Once the type of red-eye case has been discerned, the techniques disclosed herein may then replace the artifact with a photographically reasonable result based on the type of red-eye case being repaired. Specular reflection may also be re-added to the photograph. | 2012-09-27 |
20120243784 | IMAGE PROCESSING DEVICE AND METHOD - There is provided an image processing device which transforms an image with a bit depth of N into an upper layer with a bit depth of M and a remaining lower layer, including a histogram unit that generates a histogram indicating the occurrence frequency of each pixel value of an image with a bit depth of N, a table unit that generates a table listing pixel values whose occurrence frequency in the histogram generated by the histogram unit is one or more, a reordering unit that reorders an arrangement of values in the histogram using the table generated by the table unit, an update unit that updates the table generated by the table unit and the histogram reordered by the reordering unit, and an index image unit that generates an index image with a bit depth of N using the updated table and the updated histogram. | 2012-09-27 |
20120243785 | METHOD OF DETECTION DOCUMENT ALTERATION BY COMPARING CHARACTERS USING SHAPE FEATURES OF CHARACTERS - A document alteration detection method compares a target image with an original image by comparing character shape features without actually recognizing the characters. Bounding boxes for the characters are generated for both images, each enclosing one or more connected groups of pixels of one character. The bounding boxes in the original and target images are matched into pairs. Addition and deletion of text is detected if a bounding box in one image does not have a matching one in the other image. Each pair of bounding boxes is processed to compare their shape features. The shape features include the Euler numbers of the characters, the aspect ratio of the bounding boxes, the pixel density of the bounding boxes, and the Hausdorff distance between the two characters. The two characters are determined to be the same or different based on the shape feature comparisons. | 2012-09-27 |
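The shape-feature comparison in the document-alteration entry above can be sketched directly: compute per-character features (two of the four named, aspect ratio and pixel density, plus the Hausdorff distance; the Euler number is omitted for brevity) and call two characters "same" when all differences are small. The tolerance values are illustrative assumptions, not thresholds from the application.

```python
import numpy as np

def shape_features(char):
    """Aspect ratio and pixel density of a binary character image
    (True = ink), cropped to its bounding box by the caller."""
    h, w = char.shape
    return {"aspect": w / h, "density": float(char.mean())}

def hausdorff(a, b):
    """Symmetric Hausdorff distance between the ink pixels of two
    binary images -- a simple O(n*m) illustration."""
    pa = np.argwhere(a).astype(float)
    pb = np.argwhere(b).astype(float)
    d = np.sqrt(((pa[:, None, :] - pb[None, :, :]) ** 2).sum(-1))
    return max(d.min(axis=1).max(), d.min(axis=0).max())

def same_character(a, b, tol_aspect=0.2, tol_density=0.1, tol_haus=2.0):
    """Declare two characters 'same' only when every feature
    difference is within tolerance (tolerances are assumptions)."""
    fa, fb = shape_features(a), shape_features(b)
    return (abs(fa["aspect"] - fb["aspect"]) <= tol_aspect
            and abs(fa["density"] - fb["density"]) <= tol_density
            and hausdorff(a, b) <= tol_haus)
```

Because no OCR is performed, the comparison works on any script; it only asks whether the two glyph shapes plausibly match.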
20120243786 | IMAGE PROCESSING METHOD AND DISPLAY DEVICE - An image processing method to obtain a high sense of depth or high stereoscopic effect for an image, and a display device utilizing the method, are provided. Image data of an image is separated into image data of a plurality of objects and a background. A feature amount is obtained from the image data of each object, so that the objects are identified. The relative distance between the viewer's eye and each object is determined from the sizes of the objects in the image and the sizes of the objects stored in a database. The image data of each object is processed so that an object with a shorter relative distance is enlarged. The image data of each object after image processing is combined with the image data of the background, so that the sense of depth or stereoscopic effect of the image is increased. | 2012-09-27 |
20120243787 | SYSTEM, METHOD AND COMPUTER PROGRAM PRODUCT FOR DOCUMENT IMAGE ANALYSIS USING FEATURE EXTRACTION FUNCTIONS - Methods, systems and computer program products to improve the efficiency and computational speed of an image enhancement process. In an embodiment, information that is generated as interim results during feature extraction may be used in a segmentation and classification process and in a content adaptive enhancement process. In particular, a cleaner image that is generated during a noise removal phase of feature extraction may be used in a content adaptive enhancement process. This saves the content adaptive enhancement process from having to generate a cleaner image on its own. In addition, low-level segmentation information that is generated during a neighborhood analysis and cleanup phase of feature extraction may be used in a segmentation and classification process. This saves the segmentation and classification process from having to generate low-level segmentation information on its own. | 2012-09-27 |
20120243788 | DYNAMIC RADIAL CONTOUR EXTRACTION BY SPLITTING HOMOGENEOUS AREAS - Systems and methods for extracting a radial contour around a given point in an image includes providing an image including a point about which a radial contour is to be extracted around. A plurality of directions around the point and a plurality of radius lengths for each direction are provided. Local costs are determined for all radius lengths for each direction by comparing texture variances at each radius length with the texture variance at a further radius length. A radius length is determined, using a processor, for each direction based on the accumulated value of the local costs to provide a radial contour. | 2012-09-27 |
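The radial-contour entry above samples a set of directions around a point and scores candidate radii by comparing texture at one radius against the next one out. The sketch below simplifies the texture-variance cost to a plain intensity difference between consecutive radial samples; that simplification, and the nearest-pixel sampling, are assumptions rather than the claimed cost.

```python
import numpy as np

def radial_contour(image, center, n_dirs=8, max_radius=10):
    """For each of n_dirs directions around `center`, choose the
    radius with the largest intensity jump to the next sample out.
    Returns one radius per direction (the radial contour)."""
    cy, cx = center
    h, w = image.shape
    contour = []
    for k in range(n_dirs):
        theta = 2.0 * np.pi * k / n_dirs
        dy, dx = np.sin(theta), np.cos(theta)
        best_r, best_cost = 1, -1.0
        for r in range(1, max_radius):
            # Nearest-pixel samples at radius r and r + 1.
            y1 = min(max(int(round(cy + dy * r)), 0), h - 1)
            x1 = min(max(int(round(cx + dx * r)), 0), w - 1)
            y2 = min(max(int(round(cy + dy * (r + 1))), 0), h - 1)
            x2 = min(max(int(round(cx + dx * (r + 1))), 0), w - 1)
            cost = abs(float(image[y1, x1]) - float(image[y2, x2]))
            if cost > best_cost:
                best_r, best_cost = r, cost
        contour.append(best_r)
    return contour
```

On a bright disk against a dark background, each direction's chosen radius lands on the disk boundary.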
20120243789 | FAST IMAGE CLASSIFICATION BY VOCABULARY TREE BASED IMAGE RETRIEVAL - Systems and methods are disclosed to categorize images by detecting local features for each image; applying a tree structure to index the local features in the images; and extracting a ranked list of candidate images with category tags, based on the tree indexing structure, to estimate the label of a query image. | 2012-09-27 |
20120243790 | EDGE LOCATION MEASUREMENT CORRECTION FOR COAXIAL LIGHT IMAGES - A method for correcting coaxial light image edge location errors in a precision machine vision inspection system is disclosed. The method comprises comparing an edge position measurement of a workpiece edge feature using coaxial light and stage light. Edge position measurements using stage light have a lower uncertainty than that of coaxial light. Position correction factors may be determined from the difference between the two edge position measurements. The position correction factors may be stored for correcting subsequent edge position measurements that are based on images acquired using coaxial light. In some embodiments, position correction factors may be determined based on comparing edge position measurements for a plurality of edges. | 2012-09-27 |
20120243791 | METHOD AND APPARATUS FOR CLASSIFYING IMAGE PIXELS - A method of classifying pixels in an image is described that includes calculating for each target pixel in the image, a functional value based on a median value of a block of pixels including the target pixel and storing the functional value for each pixel. Pixels in the image are then analyzed to determine if they correspond to edges in the image and if so, are classified as edge pixels. Next the stored functional values are analyzed to define a flat area delimiting function for the image. The stored functional values that do not correspond to edge pixels are then analyzed to define an image detail delimiting function and the non-flat area pixels are classified as being either flat area pixels or detail pixels based on the flat area delimiting function and the detail delimiting function. | 2012-09-27 |
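The pixel-classification entry above computes, for each pixel, a functional value based on the median of a surrounding block, then separates edge, flat, and detail pixels. A minimal sketch follows; the 3x3 block size and the fixed thresholds stand in for the delimiting functions the method derives from the stored functional values, so they are assumptions.

```python
import numpy as np

def classify_pixels(image, edge_thresh=0.5, flat_thresh=0.05):
    """Label each pixel 'edge', 'flat', or 'detail' from its
    deviation relative to the local 3x3 median (clipped at the
    image borders). Thresholds here are illustrative."""
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    labels = np.empty((h, w), dtype=object)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(y - 1, 0), min(y + 2, h)
            x0, x1 = max(x - 1, 0), min(x + 2, w)
            # Functional value: deviation from the block median.
            f = abs(img[y, x] - np.median(img[y0:y1, x0:x1]))
            if f > edge_thresh:
                labels[y, x] = "edge"
            elif f <= flat_thresh:
                labels[y, x] = "flat"
            else:
                labels[y, x] = "detail"
    return labels
```

An isolated bright pixel deviates strongly from its block median and is labeled an edge, while uniform regions come out flat.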
20120243792 | Detecting and Correcting Blur and Defocusing - Detecting blur and defocusing in images is described. After detection, correction algorithms are applied. Detection provides an image processing system with parameters related to a blur (e.g., direction, strength) and noise levels, or may trigger a message to a user to re-take a photograph. Detection involves finding and analyzing edges of objects rather than an entire image. The disclosed detector may be used for OCR purposes, blur and defocusing detection in photographic and scanning devices, video cameras, print quality control systems, and computer vision. Detection of blur and defocusing of an image involves second derivatives of image brightness. Object edges are detected. For points on edges, profiles of the second derivative are obtained in the direction of the gradient. Statistics are gathered about the parameters of the profiles in various directions. By analyzing these statistics, image distortions are detected along with their type (e.g., blur or defocusing), the strength of the distortion, and the direction of the blur. | 2012-09-27 |
20120243793 | METHOD AND SYSTEM FOR EDGE DETECTION - A method executed by a computer system for detecting edges comprises receiving an image comprising a plurality of pixels, determining a phase congruency value for a pixel, where the phase congruency value comprises a plurality of phase congruency components, and determining if the phase congruency value satisfies a phase congruency criterion. If the phase congruency value satisfies the phase congruency criterion, the computer system categorizes the pixel as an edge pixel. If the phase congruency value does not satisfy the phase congruency criterion, the computer system compares a first phase congruency component of the plurality of phase congruency components to a phase congruency component criterion. If the first phase congruency component satisfies the phase congruency component criterion, the computer system categorizes the pixel as an edge pixel, and if the first phase congruency component does not satisfy the phase congruency component criterion, categorizes the pixel as a non-edge pixel. | 2012-09-27 |
20120243794 | IMAGE PROCESSING APPARATUS, IMAGE ENCODING SYSTEM AND IMAGE DECODING SYSTEM - According to one embodiment, an image processing apparatus includes a feature based detector, an optimum area generator, and an output data generator. The feature based detector detects a feature portion of input image data and generates feature area information showing a position of a feature area comprising the detected feature portion. The optimum area generator generates optimum area information showing a position of an optimum area in accordance with a size of the feature area based on the feature area information. The output data generator extracts pixels of the optimum area among the input image data based on the optimum area information and generates output image data based on the extracted pixels. | 2012-09-27 |
20120243795 | SCALABLE IMAGE DISTRIBUTION IN VIRTUALIZED SERVER ENVIRONMENTS - A method and system include replicating an image representing a sequence of bytes on a local storage medium on a target device by determining a similarity between images and reconstructing a new image using equivalent blocks from one or more similar images locally available on the target device or available on donor devices to reduce network link usage and transfer time in replicating the image. | 2012-09-27 |
20120243796 | IMAGE PROCESSING APPARATUS, COMPUTER READABLE MEDIUM STORING PROGRAM, AND IMAGE PROCESSING METHOD - An image processing apparatus includes the following elements. An image receiving unit receives a first image and a second image. A first specified area specifying unit allows a user to specify a first specified area in the first image. A first comparison area setting unit sets a first comparison area in the first image on the basis of the first specified area. A second comparison area setting unit sets a second comparison area in the second image. A geometric difference correction unit corrects a difference in geometric properties between a first comparison image in the first comparison area and a second comparison image in the second comparison area. An image output unit outputs an image in which the second comparison image is superimposed on the first comparison image or a difference image between the first comparison image and the second comparison image. | 2012-09-27 |
20120243797 | MEANS FOR USING MICROSTRUCTURE OF MATERIALS SURFACE AS A UNIQUE IDENTIFIER - The present application concerns the visual identification of materials or documents for tracking or authentication purposes. | 2012-09-27 |
20120243798 | IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND NON-TRANSITORY COMPUTER READABLE MEDIUM STORING IMAGE PROCESSING PROGRAM - An image processing apparatus includes an image receiving unit receiving an image, a conversion unit converting the received image, a separation unit separating the converted image into pixel synchronization information and pixel asynchronization information, a first encoding unit encoding the pixel synchronization information, a second encoding unit encoding the pixel asynchronization information, a first decoding unit decoding a code encoded by the first encoding unit to generate the pixel synchronization information, a second decoding unit decoding a code encoded by the second encoding unit to generate the pixel asynchronization information, a synthesis unit synthesizing the decoded pixel synchronization information with the decoded pixel asynchronization information on the basis of the pixel synchronization information, a reverse conversion unit performing a conversion process reverse to the conversion process of the conversion unit on the synthesized information, and an output unit outputting the image converted by the reverse conversion unit. | 2012-09-27 |
20120243799 | METHOD AND APPARATUS FOR ENCODING AND DECODING CODING UNIT OF PICTURE BOUNDARY - A method and apparatus for encoding an image is provided. An image coding unit, including a region that deviates from a boundary of a current picture, is divided to obtain a coding unit having a smaller size than the size of the image coding unit, and encoding is performed only in a region that does not deviate from the boundary of the current picture. A method and apparatus for decoding an image encoded by the method and apparatus for encoding an image is also provided. | 2012-09-27 |
20120243800 | IMAGE INFORMATION ENCODING METHOD AND ENCODER, AND IMAGE INFORMATION DECODING METHOD AND DECODER - An image decoding method includes decoding encoded image data to generate a decoded image signal including a luma signal and a chroma signal. The method further includes scaling, when a reference field is a top field while a current field is a bottom field for motion estimation and when the decoded image signal is in a format in which the number of chroma pixels is vertically different from the number of luma pixels, a chroma motion vector of the chroma signal by mv/2+¼, where mv is a vertical component in a luma motion vector of the luma signal. The method also includes performing motion compensation of the decoded image signal using the scaled chroma motion vector. | 2012-09-27 |
20120243801 | IMAGE PROCESSING APPARATUS, IMAGING APPARATUS, STORAGE MEDIUM STORING IMAGE PROCESSING PROGRAM, AND IMAGE PROCESSING METHOD - An image processing apparatus includes an image reducing section generating at least one reduced image by generating a reduced pixel with reference to pixel values in each predetermined area of a target image to be processed, a noise extracting section extracting a noise component at a frequency band according to a reduction ratio of the reduced image for each reduced pixel based on the reduced image, a noise subtraction section subtracting the noise component of the reduced pixel from each of the pixels in the predetermined area referred to in generating the reduced pixel, and an area smoothing section smoothing the pixels with each other in the predetermined area referred to in generating the reduced pixel, based on the noise component of the reduced pixel. | 2012-09-27 |
20120243802 | COMPOSITE IMAGE FORMED FROM AN IMAGE SEQUENCE - A method for forming a composite image from a sequence of digital images, comprising: receiving a sequence of digital images of a scene, each digital image being captured at a different time, wherein the scene includes a moving object; using a data processor to automatically analyze two or more of the digital images in the sequence of digital images to determine a rate of motion for the moving object; determining a frame rate responsive to the rate of motion for the moving object; selecting a subset of the digital images from the sequence of digital images corresponding to the determined frame rate; and forming the composite image by combining the selected subset of digital images from the sequence of digital images. | 2012-09-27 |
20120243803 | Laying Out Multiple Images - Systems, methods, and apparatuses, including computer program products, are provided for re-layout of composite images. In some implementations, a method includes identifying geometric transformations corresponding to multiple images from a collection of images, where a geometric transformation reorients a corresponding image in relation to a common reference frame when applied and identifying a reference image for the multiple images in the collection of images. The method also includes determining overlapping image regions for the multiple images starting from the reference image, the determining based on the identified geometric transformations, determining additional transformations of a specified type for the multiple images based on the overlapping image regions, where an additional transformation lays out a corresponding image in relation to the reference image when applied, and making the additional transformations available for further processing and output with respect to the collection of images. | 2012-09-27 |
20120243804 | SYSTEMS AND METHODS FOR PHOTOGRAPH MAPPING - Systems and methods for photograph mapping are disclosed herein. In one embodiment, a first digital image and at least one user-generated datum are received from at least one user. The first image is geographically organized according to the at least one datum. The first image is associated with at least one location and at least one direction. The first image is provided from a first person perspective to a user in response to a request. | 2012-09-27 |