
Including orientation sensors (e.g., infrared, ultrasonic, remotely controlled)

Subclass of:

345 - Computer graphics processing and selective visual display systems

345156000 - DISPLAY PERIPHERAL INTERFACE INPUT DEVICE

345157000 - Cursor mark position control device

Patent class list (only non-empty classes are listed)

Deeper subclasses:

Entries
Document - Title - Date
20110175810 - Recognizing User Intent In Motion Capture System - Techniques for facilitating interaction with an application in a motion capture system allow a person to easily begin interacting without manual setup. A depth camera system tracks a person in physical space and evaluates the person's intent to engage with the application. Factors such as location, stance, movement and voice data can be evaluated. Absolute location in a field of view of the depth camera, and location relative to another person, can be evaluated. Stance can include facing a depth camera, indicating a willingness to interact. Movements can include moving toward or away from a central area in the physical space, walking through the field of view, and movements which occur while standing generally in one location, such as moving one's arms around, gesturing, or shifting weight from one foot to another. Voice data can include volume as well as words which are detected by speech recognition. - 07-21-2011
20130044053 - Combining Explicit Select Gestures And Timeclick In A Non-Tactile Three Dimensional User Interface - A method including presenting, by a computer, multiple interactive items on a display coupled to the computer, and receiving, from a depth sensor, a sequence of three-dimensional (3D) maps containing at least a hand of a user of the computer. An explicit select gesture performed by the user toward one of the interactive items is detected in the maps, and the one of the interactive items is selected responsively to the explicit select gesture. Subsequent to selecting the one of the interactive items, a TimeClick functionality is actuated for subsequent interactive item selections to be made by the user. - 02-21-2013
20130044055 - METHOD AND SYSTEM OF USER AUTHENTICATION WITH BIORESPONSE DATA - In one exemplary embodiment, a computer-implemented method includes the step of providing an image to a user. The image is provided with a computer display. Eye-tracking data is obtained from the user when the user views the image. The eye-tracking data is obtained with an eye-tracking system. A user attribute is determined based on the eye-tracking data. The user is enabled to access a digital resource when the user attribute is associated with a permission to access the digital resource. The user attribute can be a personhood state. The digital resource can be a web page document. An instruction can be provided to the user regarding a pattern of viewing the image. The pattern of viewing the image can include instructing the user to gaze on a specified sequence of image elements. - 02-21-2013
20090109176 - DIGITAL, DATA, AND MULTIMEDIA USER INTERFACE WITH A KEYBOARD - A system and corresponding method for providing a 3-dimensional (3-D) user interface to display images in a 3-D coordinate system. The 3-D interface generates and displays one type of holographic keyboard in response to a user's desired selection. The holographic keyboard provides versatility and ergonomic benefits to the user. Sensors are configured to sense user interaction within the 3-D coordinate system, so that a processor may receive user interaction information from the sensors. The sensors are able to provide information to the processor that enables the processor to correlate user interaction with images in the 3-D coordinate system. The system may be used for interconnecting or communicating between two or more components connected to an interconnection medium (e.g., a bus) within a single computer or digital data processing system. - 04-30-2009
20120162074 - USER INTERFACE APPARATUS AND METHOD USING TWO-DIMENSIONAL IMAGE SENSOR - There are provided a user interface apparatus and method using a two-dimensional (2D) image sensor capable of setting an object for an interface from a user image detected by the 2D image sensor and performing a user input accordingly. The user interface apparatus includes: a two-dimensional (2D) image sensor; and a user interface unit detecting a movement of a user captured by the 2D image sensor, setting the detected movement as an object, and outputting the movement of the object detected by the 2D image sensor to a system in which the user intends to establish an interface. - 06-28-2012
20100020011 - MAPPING DETECTED MOVEMENT OF AN INTERFERENCE PATTERN OF A COHERENT LIGHT BEAM TO CURSOR MOVEMENT TO EFFECT NAVIGATION OF A USER INTERFACE - A method, system and apparatus provide that movement of an interference pattern of a coherent light beam is mapped to cursor movement to effect navigation of a user interface. A remote controller operable to emit a coherent light beam is in cooperative arrangement with a display device operable to display a user interface, navigable by means of a cursor. A laser diode element and coupled diffuser element of the remote controller generate the coherent light beam. Movement of the remote controller causes movement of an interference pattern of the coherent light beam impinging upon a sensor of the display device; movement of the interference pattern is sensed by the display device and mapped to corresponding movement of the cursor in the user interface. Thus, the remote controller may be used to navigate an on-screen user interface by movement of the remote controller itself. - 01-28-2010
20130044054 - METHOD AND APPARATUS FOR PROVIDING BARE-HAND INTERACTION - An apparatus for providing bare-hand interaction includes a pattern image projecting unit for projecting a pattern image of structured light onto a projection zone and an image projection unit for projecting an image of digital contents onto the projection zone. The pattern image is captured from the projection zone by a pattern image capturing unit and processed by an image recognizing unit in order to recognize a user input interacting with the projected contents image. The apparatus then generates a system event corresponding to the recognized user input to control an application program in accordance with the event. - 02-21-2013
20120169594 - ELECTRONIC DEVICE AND METHOD FOR BACKLIGHT CONTROL - An electronic device includes a display screen, a backlight for illuminating the display screen, and a timer for indicating the expiration of a given time period, wherein when the backlight is first turned on, the timer is initialized to indicate if the expiration of a first predetermined illumination time period occurs. An image capture component is operable to acquire one or more images of a scene in front of the display screen; and a processor is operable to analyze the one or more images to determine whether a user is present in front of the display screen, to control the timer if a user is determined to be present such that the timer is reset to indicate if the expiration of a second predetermined illumination time period occurs, and to control the backlight to dim it or turn it off if the timer indicates an expiration has occurred. - 07-05-2012
20100141578 - IMAGE DISPLAY CONTROL APPARATUS, IMAGE DISPLAY APPARATUS, REMOTE CONTROLLER, AND IMAGE DISPLAY SYSTEM - An image display control apparatus comprises a menu creating unit that displays an operation menu ME on a liquid crystal display unit of an image display apparatus, a camera with an infrared filter capable of recognizing an infrared signal coming from a remote controller, a remote controller position identifying unit that identifies the position which the remote controller occupies during image capturing on the basis of the recognition result, a remote controller position signal creating unit that displays the identified position of the remote controller on the liquid crystal display unit, and a user operation judging unit that determines the operable specification object in the operation menu displayed on the liquid crystal display unit. - 06-10-2010
20080259030 - Pre-assembled part with an associated surface convertible to a transcription apparatus - An apparatus including a part board having an associated surface or a frame for a surface, such as a whiteboard, that is pre-assembled to include components that when connected to an external module convert the surface to an electronic transcription apparatus. In one version, the components include a set of sensors and electronics therefor, with wiring and a connector. - 10-23-2008
20080259031 - CONTROL APPARATUS, METHOD, AND PROGRAM - There is provided an intuitive, easy-to-use operation interface that is less liable to erroneous operations and is operated by a motion of a user. A motion operation mode is entered in response to recognition of a particular motion (preliminary motion) of a particular object in a video image and, after that, operation of any of various devices is controlled in accordance with various command motions recognized in a motion area being locked on. When an end command motion is recognized or after the motion area is unable to be recognized for a predetermined period of time, the lock-on is canceled to exit the motion operation mode. - 10-23-2008
20080259029 - Input Device Having the Function of Recognizing Hybrid Coordinates and Operating Method of the Same - The present invention relates to a coordinate input device to input a variety of job commands, diagrams or characters, and to store or output the input data. More particularly, the present invention provides an input device for recognizing hybrid coordinates and a method of operating the device. The input device uses an absolute coordinate recognition method and a relative coordinate recognition method in a combined fashion as a coordinate recognition method for inputting the track of the characters or diagrams. By doing so, input coordinates are converted into absolute coordinates, and the tracks of handwritten characters and diagrams are stored as the converted absolute coordinates such that the tracks are displayed on a display of the input device or on a monitor of an information terminal connected to the input device. Accordingly, problems of respective conventional input devices for recognizing coordinates can be solved, the structure of hardware can be simplified, and characters or diagrams can be input with accurate recognition of coordinates. - 10-23-2008
20080259028 - Hand glove mouse - A hand glove mouse, including first, second and third buttons or sensors and a cursor positioner that is not a motion or tilt sensor (e.g., a roller ball, a light sensor, etc.) for adjusting the position of a cursor on a computer monitor. In one embodiment the positioner and buttons are located on a user's body, but not on the user's fingertips. In another embodiment, the first, second and third sensors are load sensors on the user's fingertips with an actuation threshold above the load generated by typing. The hand glove mouse also includes a communication means communicating signals generated by the first, second and third buttons or sensors, and by the cursor positioner, to the computer. - 10-23-2008
20110193778 - DEVICE AND METHOD FOR CONTROLLING MOUSE POINTER - Disclosed is a device for controlling a mouse pointer, including a display unit; an image photographing unit for photographing images of a first object and a second object; and a controller for setting a point between the first object and the second object detected from the photographed images as a position of a mouse pointer on the display unit, and, when a distance between the first object and the second object is less than a predetermined distance, determining that a user selection instruction for the point has been input. The device detects movement of fingers using differential images according to the movement of the fingers, so that even when continuously changing surrounding lighting or a user's face having a similar color to the fingers is included in the background, it is possible to accurately identify the movement of the fingers. - 08-11-2011
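The differential-image idea in the abstract above can be sketched with plain frame differencing. This is a minimal illustration, not the patent's actual method: the frame size, threshold value, and the centroid step are all illustrative assumptions.

```python
def motion_mask(prev_frame, curr_frame, threshold=25):
    """Differential image: mark pixels whose absolute change between two
    grayscale frames exceeds a threshold. Static background pixels (and
    skin-colored but motionless regions) are suppressed, which is the
    robustness property the abstract describes."""
    return [[abs(c - p) > threshold for p, c in zip(prow, crow)]
            for prow, crow in zip(prev_frame, curr_frame)]

def motion_centroid(mask):
    """Centroid (row, col) of the moving pixels, or None if nothing moved;
    a real tracker would feed this point to the pointer position."""
    pts = [(r, c) for r, row in enumerate(mask)
           for c, hit in enumerate(row) if hit]
    if not pts:
        return None
    return (sum(r for r, _ in pts) / len(pts),
            sum(c for _, c in pts) / len(pts))

# A static 8x8 background; a bright 2x2 "fingertip" appears in the new frame.
prev = [[0] * 8 for _ in range(8)]
curr = [[0] * 8 for _ in range(8)]
for r in (2, 3):
    for c in (5, 6):
        curr[r][c] = 200
print(motion_centroid(motion_mask(prev, curr)))  # (2.5, 5.5)
```

Because only changed pixels survive the difference, a face or other skin-colored region that stays still contributes nothing to the mask, which is why differencing sidesteps the color-similarity problem the abstract mentions.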
20090122010 - Apparatus for operating objects and a method for identifying markers from digital image frame data - An input apparatus other than a mouse is provided that supports input to a computer by the fingers or mouth. An apparatus for operating objects according to the present invention operates an object on a screen based on an imaged operating element. The apparatus includes a computing apparatus; a display apparatus connected to the computing apparatus; and an imaging apparatus connected to the computing apparatus. The imaging apparatus images a predetermined operating element. The computing apparatus displays, as a marker, an image of the imaged operating element on the display apparatus. The object on the screen of the display apparatus is operated by movement of the marker. - 05-14-2009
20100117958 - INFORMATION INPUT PANEL USING LIGHT EMITTED DIODE MATRIX - The present invention relates to an information input panel using the light emitting diode (LED) matrix. The panel includes the LED matrix and a control circuit. The LED matrix includes N×M LEDs. The control circuit includes N first terminals and M second terminals, wherein the i - 05-13-2010
20130082926 - IMAGE DISPLAY - There is provided an image display including a first light source, a second light source and a modulation unit. The first light source has a plurality of first point lights arranged to form a first shape for generating a predetermined spectrum signal. The second light source has a plurality of second point lights arranged to form a second shape for generating a predetermined spectrum signal. The modulation unit is for simultaneously modulating the first point lights of the first light source with a first predetermined modulation frequency and for simultaneously modulating the second point lights of the second light source with a second predetermined modulation frequency to generate a modulated predetermined spectrum. - 04-04-2013
20100073289 - 3D CONTROL OF DATA PROCESSING THROUGH HANDHELD POINTING DEVICE - A handheld pointer device provides radiation with a directional characteristic relative to its main axis. The radiation is detected by at least two detectors at different positions. This enables determining the orientation of the main axis as well as the displacement of the device along its main axis for 3D control of an object rendered on the screen of a display monitor. - 03-25-2010
20100033431 - SELECTION DEVICE AND METHOD - A selection device for selecting an icon in an image area is provided including a motion-sensing unit and a processing unit. The motion-sensing unit senses a first motion and converts the first motion into a first signal. The processing unit converts the first signal into a first locus in the image area, determines a first area in the image area according to the first locus, and determines whether the icon is to be selected according to the first area and a second area where the icon is to be displayed in the image area. - 02-11-2010
20130033428 - ELECTRONIC APPARATUS USING MOTION RECOGNITION AND METHOD FOR CONTROLLING ELECTRONIC APPARATUS THEREOF - An electronic apparatus and a controlling method thereof are disclosed. The method for controlling the electronic apparatus includes photographing an object using motion recognition, and changing and displaying a screen based on a movement direction of the object when it is determined that the photographed object has moved while maintaining a first shape. By this method, the user is able to perform zoom-in and zoom-out operations more easily and intuitively by using motion recognition. - 02-07-2013
20130038532 - INFORMATION STORAGE MEDIUM, INFORMATION INPUT DEVICE, AND CONTROL METHOD OF SAME - Provided is a program capable of facilitating an operation input made by a user. A computer-readable information storage medium has stored thereon a program for controlling a computer connected to an operation input device to function so as to obtain, for each time unit, an input value indicating specifics of the operation input of the user received by the operation input device, and to calculate, as an output value of a parameter to be operated, a value obtained by changing a reference value by a change amount, the reference value being determined by one of a plurality of obtained input values, the change amount being determined by each of the plurality of obtained input values. - 02-14-2013
20130038531 - CURSOR CONTROLLING SYSTEM AND APPARATUS - A cursor-controlling system or apparatus includes a sensing module, a displacement controlling module and a driving module. The sensing module is for receiving a force to generate an input signal. The displacement controlling module includes a first signal amplifying circuit for amplifying the input signal to generate a first amplified signal; a comparator for receiving the first amplified signal to generate a selected signal; a second signal amplifying circuit for amplifying the selected signal to generate a second amplified signal; a filtering circuit for filtering the second amplified signal to generate a filtered signal; a third signal amplifying circuit for amplifying the filtered signal to generate a displacement signal. The driving module is for receiving the displacement signal to control movement of a cursor. - 02-14-2013
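The amplify / compare / amplify / filter / amplify chain above is an analog circuit, but the same pipeline can be sketched in software to show the data flow. The gains, the comparator threshold, and the choice of a 3-tap moving average as the filter are all illustrative assumptions, not the patent's circuit values.

```python
def displacement_signal(samples, gain1=10.0, threshold=0.5,
                        gain2=2.0, gain3=4.0):
    """Software sketch of the abstract's chain: amplify the force-sensor
    input, gate it through a comparator-style threshold (producing the
    'selected signal'), amplify again, low-pass filter with a 3-tap
    moving average, and amplify once more to yield the displacement
    signal that would drive the cursor."""
    out = []
    history = []
    for s in samples:
        a = s * gain1                           # first amplifying circuit
        sel = a if abs(a) > threshold else 0.0  # comparator -> selected signal
        b = sel * gain2                         # second amplifying circuit
        history.append(b)
        window = history[-3:]                   # filtering circuit (moving avg)
        f = sum(window) / len(window)
        out.append(f * gain3)                   # third amplifying circuit
    return out

# A brief force pulse surrounded by sub-threshold noise: the comparator
# zeroes the noise, and the filter spreads the pulse over adjacent samples.
print(displacement_signal([0.01, 0.2, 0.0]))  # [0.0, 8.0, 5.33...]
```

The comparator stage is what keeps small sensor noise from nudging the cursor; only forces strong enough to clear the threshold propagate down the chain.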
20100045599 - INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, AND CONTROL METHOD - An input apparatus includes a sensor, a calculation section, and a transmission section. The sensor detects a movement of the input apparatus and outputs a detection signal corresponding to the movement of the input apparatus. The calculation section calculates a corresponding value that corresponds to a movement of an image displayed on a screen in a predetermined calculation cycle, the corresponding value corresponding to the detection signal. The transmission section transmits the corresponding value in a transmission cycle shorter than the calculation cycle. - 02-25-2010
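The timing relationship in the abstract above (a transmission cycle shorter than the calculation cycle) can be simulated in a few lines. The millisecond values are illustrative assumptions; the point is only that the most recent computed value is retransmitted between calculations, so the receiver never waits a full calculation cycle for fresh data.

```python
def schedule_events(duration_ms, calc_cycle_ms, tx_cycle_ms):
    """Simulate the two cycles: a new pointer-movement value is computed
    every calc_cycle_ms, while the latest available value is transmitted
    every tx_cycle_ms (tx_cycle_ms < calc_cycle_ms). Returns a list of
    (transmit_time, value) pairs; here the 'value' is just the time at
    which it was calculated, standing in for the real movement value."""
    events = []
    latest = None
    for t in range(duration_ms):
        if t % calc_cycle_ms == 0:
            latest = t                      # calculation section fires
        if t % tx_cycle_ms == 0 and latest is not None:
            events.append((t, latest))      # transmission section fires
    return events

# 10 ms calculation cycle, 4 ms transmission cycle: each computed value
# is sent multiple times before the next one is ready.
print(schedule_events(20, 10, 4))
# [(0, 0), (4, 0), (8, 0), (12, 10), (16, 10)]
```

Transmitting faster than calculating trades redundant packets for lower worst-case latency at the receiver, which is the design point the abstract claims.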
20100045598 - APPARATUS FOR CONTROLLING THE MOVEMENT OF AN OBJECT ON A PLANE - An apparatus is provided for controlling the movement of an object on a plane. The apparatus comprises a basin, a movable object positioned within the basin, and a sensor coupled to the apparatus for detecting the movement of the movable object within the basin, wherein the movement of the object on the plane is related to movement of the object within the basin. - 02-25-2010
20120206348 - DISPLAY DEVICE AND METHOD OF CONTROLLING THE SAME - A display device and a method of controlling the same are provided. The display device comprises: a camera for acquiring an image comprising a gesture taken by a user; and a controller for extracting the gesture from the image acquired by the camera and for setting a specific point of an object on which the gesture is performed as a reference point when the gesture corresponding to acquisition of a control right is included in the extracted gesture. Therefore, by setting a specific point of an object on which the gesture corresponding to acquisition of a control right is performed as a reference point, a gesture taken by a user can be accurately and effectively recognized. - 08-16-2012
20100097317 - METHODS AND APPARATUS TO PROVIDE A HANDHELD POINTER-BASED USER INTERFACE - Methods and apparatus to provide a handheld pointer-based user interface are described herein. An example apparatus includes a wireless pointer component and one or more base components. The wireless pointer component is configured to transmit one or more human-computer interaction (HCI) signals associated with an HCI event via a first communication link. One or more base components are operatively coupled to a screen of a display to receive the one or more HCI signals from the wireless pointer component via the first communication link. Further, the one or more base components are configured to generate at least one of operating information and position information of the wireless pointer component based on the one or more HCI signals, and to transmit the at least one of operating information and position information to a processor configured to generate screen information on the screen of the display via a second communication link. - 04-22-2010
20130027303 - Touch Screen Device, Method, and Graphical User Interface for Moving On-Screen Objects Without Using a Cursor - An electronic device with a touch screen display: detects a single finger contact on the touch screen display; creates a touch area that corresponds to the single finger contact; determines a representative point within the touch area; determines if the touch area overlaps an object displayed on the touch screen display, which includes determining if one or more portions of the touch area other than the representative point overlap the object; connects the object with the touch area if the touch area overlaps the object, where connecting maintains the overlap of the object and the touch area; after connecting the object with the touch area, detects movement of the single finger contact; determines movement of the touch area that corresponds to movement of the single finger contact; and moves the object connected with the touch area in accordance with the determined movement of the touch area. - 01-31-2013
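The connect-and-drag behavior described above can be sketched with a simple overlap test. Representing both the touch area and the object as axis-aligned (x, y, w, h) rectangles is a simplifying assumption; the patent's touch area is an arbitrary region with a representative point, not necessarily a rectangle.

```python
def rects_overlap(a, b):
    """Axis-aligned overlap test between two (x, y, w, h) rectangles -
    a stand-in for checking whether the finger's whole touch area (not
    just a single representative point) overlaps a displayed object."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def drag(obj, touch, dx, dy):
    """If the touch area overlaps the object, 'connect' them and move the
    object by the same delta as the touch area, which preserves their
    overlap; otherwise the object stays where it is."""
    if rects_overlap(obj, touch):
        x, y, w, h = obj
        return (x + dx, y + dy, w, h)
    return obj  # not connected: the drag does not affect the object

obj = (10, 10, 5, 5)
touch = (13, 13, 4, 4)          # fingertip area overlapping the object
print(drag(obj, touch, 3, -2))  # (13, 8, 5, 5)
```

Testing the whole touch area instead of one point is what lets a fat-finger contact grab an object even when the computed representative point lands just outside it.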
20130027301 - OPERATION METHOD AND CONTROL SYSTEM FOR MULTI-TOUCH CONTROL - An operation method and a control system for multi-touch control are provided. A map positioner and a map block are set in a display region, and the position of a cursor is set according to an input signal of an operation region. The position of a fast positioner set in the operation region corresponds to the position of the map positioner. The position of the cursor is shifted according to a motion vector inputted by an input device, and the position of the map block is reset. The map block and the cursor are shifted to the map positioner when the input device receives a trigger signal of the fast positioner. An object in the map block is selected, a multi-touch function is enabled, and the operation property of the object is changed according to a relative shift quantity formed by a first and a second control points. - 01-31-2013
20130027300 - RECOGNITION APPARATUS, METHOD, AND COMPUTER PROGRAM PRODUCT - In an embodiment, a recognition apparatus includes: an obtaining unit configured to obtain positions of a specific part in a coordinate system having a first axis to an n-th axis (n ≥ 2); a calculating unit configured to calculate a movement vector of the specific part; a principal axis selecting unit configured to select a principal axis; a turning point setting unit configured to set a position at which there is a change in the principal axis, and set a position at which there is a change; a section setting unit configured to set a determination target section, and set a previous section; a determining unit configured to calculate an evaluation value of the determination target section and an evaluation value of the immediately previous section and determine which of the first axis to the n-th axis is advantageous; and a presenting unit configured to present the determined result. - 01-31-2013
20100110007 - INPUT SYSTEM AND METHOD, AND COMPUTER PROGRAM - An input system comprises: a plurality of indicators having a plurality of displays for displaying an image including coordinate information unique to the displays on the screens, (i) a detecting means for detecting the image displayed on the screens, and (ii) a transmitting means for transmitting transmission information generated according to the coordinate information included in the detected image; and a controller having (i) a receiving means for receiving the transmission information, (ii) a generating means for generating drawing information indicating the trace along which each of the indicators moves on the screen of each of the displays according to the transmission information, and (iii) a control means for controlling each of the displays so as to perform the drawing processing corresponding to the drawing information generated by the generating means. - 05-06-2010
20100110006 - Remote pointing apparatus and method - A remote pointing apparatus is provided. A light receiving unit may receive emitted light. A filtering unit may filter a received signal into a first frequency component, a second frequency component, a third frequency component, and a fourth frequency component. A calculation unit may compare amplitudes of the first frequency component and the second frequency component to calculate a first coordinate axis value of a cursor on a display unit, and compare amplitudes of the third frequency component and the fourth frequency component to calculate a second coordinate axis value of the cursor on the display unit. - 05-06-2010
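The amplitude-comparison step above can be sketched with a normalized difference: one pair of frequency components yields one axis, the other pair yields the second axis. This mapping is one plausible realization under stated assumptions, not the patent's actual formula.

```python
def axis_value(amp_low, amp_high):
    """Map the relative strength of two frequency components to a
    coordinate in [-1, 1]: equal amplitudes give 0 (center), and one
    component dominating pushes the value toward +/-1. The normalized
    difference is a simple way to realize 'compare amplitudes'."""
    total = amp_low + amp_high
    if total == 0:
        return 0.0  # no signal on either component
    return (amp_high - amp_low) / total

def cursor_position(a1, a2, a3, a4):
    """First coordinate axis from components 1 and 2, second coordinate
    axis from components 3 and 4, as in the abstract."""
    return (axis_value(a1, a2), axis_value(a3, a4))

# Component 1 three times stronger than component 2; components 3 and 4
# balanced: the cursor sits left of center, vertically centered.
print(cursor_position(3.0, 1.0, 2.0, 2.0))  # (-0.5, 0.0)
```

Normalizing by the total amplitude makes the coordinate insensitive to overall signal strength, so moving the pointer closer to or farther from the receiver does not shift the computed position.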
20100110005 - INTERACTIVE INPUT SYSTEM WITH MULTI-ANGLE REFLECTOR - An interactive input system comprises a pointer input region; and a multi-angle reflecting structure located along a single side of the pointer input region and operable to reflect radiation from a pointer within the pointer input region from at least two surface locations of the multi-angle reflecting structure, wherein the at least two surface locations each have different respective angles. An imaging system is operable to capture within at least a portion of the pointer input region images of the reflected radiation located within a field of view of the imaging system. Processing structure is provided for determining the location of the pointer relative to the pointer input region based on the at least one image. - 05-06-2010
20090033622 - Smartscope/smartshelf - The SmartScope technology implements perceptual interfaces with a focus on machine vision and establishes a footprint for data collection based on the field of view of the data collecting device. The SmartScope implemented in a retail environment integrates multiple perceptual modalities such as computer vision, speech and sound processing, and haptic (feedback) input/output into the customer's interface. The SmartScope computer vision technology will be used as an effective input modality in human-computer interaction (HCI). - 02-05-2009
20130027302 - ELECTRONIC DEVICE, ELECTRONIC DOCUMENT CONTROL PROGRAM, AND ELECTRONIC DOCUMENT CONTROL METHOD - There are provided an electronic device, an electronic document control program and an electronic document control method for the electronic device. The electronic device includes a display unit configured to display an electronic document, an image taking unit configured to take an image, an eye-gaze position detecting unit configured to detect an eye-gaze position with respect to the display unit based on the image taken by the image taking unit, a determining unit configured to determine whether the electronic document displayed on the display unit has been read based on the eye-gaze position detected by the eye-gaze position detecting unit, and a performing unit configured to perform a predetermined process on the electronic document if the determining unit determines that the electronic document has been read. - 01-31-2013
20100066676 - Gestural Control of Autonomous and Semi-Autonomous Systems - Systems and methods are described for controlling a remote system. The controlling of the remote system comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The controlling comprises translating the gesture to a gesture signal, and controlling a component of the remote system in response to the gesture signal. - 03-18-2010
20130050079 - OPTICAL POINTING SYSTEM AND RELATED METHOD - An optical pointing system includes a plurality of light sources, an image receiver, and an analyzing unit. The plurality of light sources are disposed on multiple locations of an object and configured to provide light having distinct wavelengths. The image receiver is configured to detect optical signals of the plurality of light sources, thereby generating a plurality of corresponding images. The analyzing unit is configured to calculate a relative position or angle of the image receiver with respect to the object according to the images. - 02-28-2013
20130050080 - USER INTERFACES - A first handheld device transmits an acoustic signal. A second handheld device receives a signal derived from the transmitted signal, and performs an action based on the received signal. - 02-28-2013
20130069872 - POSITION DETECTING DEVICE, INFORMATION PROCESSING DEVICE, POSITION DETECTION METHOD, INFORMATION PROCESSING METHOD, AND COMPUTER READABLE MEDIUM - A position detecting device includes a device specifying unit, a motion obtaining unit, and a relative position detecting unit. The device specifying unit specifies plural information processing devices that have been brought into contact with one another. The motion obtaining unit obtains information about a motion of any one of the plural information processing devices. The relative position detecting unit detects, on the basis of a motion produced when any one of the plural information processing devices specified by the device specifying unit is brought into contact with another of the plural information processing devices, relative positions of the plural information processing devices specified by the device specifying unit. - 03-21-2013
20130069873 - Dynamically Varying Classified Image Display System - A dynamically varying image display system including a display, a thematic based sorting algorithm processor, and a display processor. The display is configured to sequentially display a plurality of thematically classified images and the thematic based sorting algorithm processor relies on various recognition systems. - 03-21-2013
20130088430 - ULTRA THIN LIGHT SCANNING APPARATUS FOR PORTABLE INFORMATION DEVICE - Disclosed is an ultra thin optical scanning device for a portable information device, which includes an LED as a light source and totally reflects light from an object-side surface to form an image, thereby increasing a contrast ratio of the image and improving the resolution thereof. The ultra thin optical scanning device includes a light emitting device that emits light for sensing an object, an object-side surface contacting the object and totally reflecting the light emitted from the light emitting device, an image formation part collecting the light totally reflected by the object-side surface and transmitting the light, and a light receiver part forming an image by using the light transmitted by the image formation part. - 04-11-2013
20090322679 - ORIENTATION CALCULATION APPARATUS, STORAGE MEDIUM HAVING ORIENTATION CALCULATION PROGRAM STORED THEREIN, GAME APPARATUS, AND STORAGE MEDIUM HAVING GAME PROGRAM STORED THEREIN - An orientation calculation apparatus obtains data from an input device including at least a gyroscope and an acceleration sensor, and calculates an orientation of the input device in a three-dimensional space. Orientation calculation means calculates the orientation of the input device in accordance with an angular rate detected by the gyroscope. Acceleration vector calculation means calculates an acceleration vector representing an acceleration of the input device in accordance with acceleration data from the acceleration sensor. Correction means corrects the orientation of the input device such that a direction of the acceleration vector in the space approaches a vertically downward direction in the space. Also, the correction means corrects the orientation of the input device such that a directional change before and after the correction is minimized regarding a predetermined axis representing the orientation of the input device. - 12-31-2009
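The gyro-plus-accelerometer correction described above is commonly realized as a complementary-filter-style update: integrate the angular rate, then apply a small correction that pulls the measured acceleration vector toward vertical. The sketch below is a one-axis (tilt) simplification with an illustrative gain; the patent's full method works on a 3D orientation, and the specific constants here are assumptions.

```python
import math

def update_tilt(angle, gyro_rate, accel_x, accel_z, dt, gain=0.02):
    """One estimation step: integrate the gyroscope's angular rate
    (rad/s), then nudge the estimate toward the tilt implied by the
    accelerometer, so the measured acceleration vector approaches the
    vertically downward direction. The small gain keeps the directional
    change per correction minimal, in the spirit of the abstract."""
    # 1. Gyroscope integration (fast, but drifts over time).
    angle += gyro_rate * dt
    # 2. Tilt implied by the accelerometer - valid when the device is
    #    roughly stationary, so the measured acceleration is mostly gravity.
    accel_angle = math.atan2(accel_x, accel_z)
    # 3. Small correction toward the accelerometer's estimate.
    angle += gain * (accel_angle - angle)
    return angle

# A level, stationary device whose gyro reports a constant false rotation:
# without correction the angle would grow without bound; with it, the
# estimate settles at a small bounded offset near zero.
angle = 0.0
for _ in range(500):
    angle = update_tilt(angle, gyro_rate=0.01, accel_x=0.0, accel_z=9.8, dt=0.01)
print(round(angle, 3))  # 0.005
```

The gyro term supplies responsiveness during fast motion, while the low-gain accelerometer term supplies a drift-free long-term reference; the gain sets the crossover between the two.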
20120218184ELECTRONIC FINGER RING AND THE FABRICATION THEREOF - The present invention provides an electronic finger ring.08-30-2012
20090040179GRAPHIC USER INTERFACE DEVICE AND METHOD OF DISPLAYING GRAPHIC OBJECTS - A graphic user interface, an input/output computing apparatus for intuitive interfacing, and a method of interfacing are disclosed. The input/output computing apparatus for intuitive interfacing with a user, includes an input unit to detect one of a plurality of predetermined motions of the user and generate a signal corresponding to the detected predetermined motion, and a controller to carry out an operation corresponding to the signal and generate a control signal to display the result corresponding to the operation.02-12-2009
20090040178Position detecting device - An object of the present invention is to provide a position detecting device that allows simultaneous use by a plurality of users by recognizing positional information about a plurality of pointers and associating the pointers and the users. A position detecting device of the invention includes a rear projector.02-12-2009
20090091533Keyboard - A keyboard includes a control panel and a cursor controller. The cursor controller includes a first key unit provided on the control panel, a second key unit provided on the control panel and a photo-sensor unit provided on the control panel. When a user moves his finger over the photo-sensor unit, the photo-sensor unit senses the movement and causes a cursor to move on a screen of a display of a computer used with the keyboard.04-09-2009
20090091532REMOTELY CONTROLLING COMPUTER OUTPUT DISPLAYED ON A SCREEN USING A SINGLE HAND-HELD DEVICE - A method, hand-held device and system for remotely controlling computer output displayed on a screen. A single hand-held device is used to remotely control the output of the computer displayed on a screen, where the hand-held device includes both a laser pointer and a camera. The camera is aligned with the laser pointer in such a manner as to be able to capture an image on the screen where the light is projected from the laser pointer. Image matching software may then be used to match the captured image with the image of the output of the computer displayed on the screen. User input (e.g., left-click action) may be received which is then used by the computer to perform that action in connection with the location (e.g., print icon) on the image displayed on the screen by the computer that corresponds to the position the point of light is projecting.04-09-2009
20100090949Method and Apparatus for Input Device - Method and apparatus for an input device. In an embodiment, the present invention provides an apparatus that is designed as a hollow-out glove, which can be worn on the wrist of a user. This apparatus has three working modes: touch mode, air-mouse mode, and action-induction mode. In touch mode, it works with a laser-positioning device to identify the user's moving motion; in air-mouse mode, it works with multiple sensors to capture the user's moving motion; in action-induction mode, the apparatus can track all directions of the user's moving motion. This invention also provides a method of operating this apparatus, making it possible to position accurately in space and capture the user's moving motion.04-15-2010
20090015553IMAGE DISPLAYING APPARATUS, IMAGE DISPLAYING METHOD, AND COMMAND INPUTTING METHOD - A disclosed image displaying apparatus comprises: a photographing unit configured to photograph an image on a screen; a projection image generating unit generating the image to be projected on the screen; an image extracting unit extracting identification information from the image photographed by the photographing unit, the identification information concerning object or figure information; an object recognizing unit recognizing attribute information from the identification information concerning the object information extracted by the image extracting unit; a figure recognizing unit recognizing characteristic information from the identification information concerning the figure information extracted by the image extracting unit; and an operation processing unit operating the projection image generating unit based on the attribute information and characteristic information.01-15-2009
20090267898INPUT APPARATUS AND CONTROL SYSTEM - An input apparatus includes: a casing; a sensor module that includes a reference potential and outputs, as a detection signal, a fluctuation of a potential with respect to the reference potential, that corresponds to a movement of the casing; a velocity calculation unit to calculate a pointer velocity value as a velocity value for moving a pointer based on an output of the sensor module; a first execution section to execute a calibration mode as processing for correcting the reference potential; a second execution section to execute an operation mode as processing for moving the pointer on a screen in accordance with the pointer velocity value calculated by the velocity calculation unit; and a switch to switch the execution of the calibration mode to the execution of the operation mode and vice versa in accordance with an input operation from outside.10-29-2009
20110012830STEREO IMAGE INTERACTION SYSTEM - A stereo image interaction system includes a stereo image capturing module, a stereo image processing unit, a system host, and a stereo image display module. When the stereo image display module displays a stereo image, the stereo image capturing module obtains a motion image of an operation body, the stereo image processing unit obtains a motion characteristic from the motion image and transmits the motion characteristic to a central processing unit (CPU), and the CPU calculates a real-time motion of the stereo image under the motion characteristic. At this time, the stereo image displayed by the stereo image display module changes along with the motion characteristic, so that a virtual stereo image is displayed in a physical space, and the operation body is enabled to directly perform a real-time interaction on the stereo image. Furthermore, the displayed stereo image may be a first stereo image captured by the stereo image capturing module or a second stereo image pre-stored by a storage unit in the system host.01-20-2011
20120223885IMMERSIVE DISPLAY EXPERIENCE - A data-holding subsystem holding instructions executable by a logic subsystem is provided. The instructions are configured to output a primary image to a primary display for display by the primary display, and output a peripheral image to an environmental display for projection by the environmental display on an environmental surface of a display environment so that the peripheral image appears as an extension of the primary image.09-06-2012
20120223884SYSTEM AND METHOD TO DISPLAY CONTENT - An apparatus and method for displaying content is disclosed. A particular method includes determining a viewing orientation of a user relative to a display and providing a portion of content to the display based on the viewing orientation. The portion includes at least a first viewable element of the content and does not include at least one second viewable element of the content. The method also includes determining an updated viewing orientation of the user and updating the portion of the content based on the updated viewing orientation. The updated portion includes at least the second viewable element. A display difference between the portion and the updated portion is non-linearly related to an orientation difference between the viewing orientation and the updated viewing orientation.09-06-2012
20110063217DIRECT NAVIGATION OF TWO-DIMENSIONAL CONTROL USING A THREE-DIMENSIONAL POINTING DEVICE - Direct and absolute pointing is provided for with respect to a two-dimensional information display surface, much like how one would point a laser pointer or flashlight at a desired point. The displayed control may be moved by manipulating the pointing device in three dimensions. The translational position of the pointing device may be measured in three dimensions. Also, the three-dimensional orientation of the pointing device may be measured. A computing device may receive this information from the pointing device and determine where the pointing device is pointing to. If the pointing device is pointing at a display, then the computing device may cause the control to be displayed at the position to which the pointing device is pointing. In addition, the control may be displayed at an orientation that depends upon the orientation of the pointing device.03-17-2011
20110063216SYSTEM AND METHOD FOR NAVIGATING A MOBILE DEVICE USER INTERFACE WITH A DIRECTIONAL SENSING DEVICE - An electronic mobile device includes a display, a tilt sensor and a processor. The display is for displaying a graphical element. The tilt sensor is configured to measure a tilt angle of the mobile device. The processor is configured to store the measured tilt angle as a reference tilt angle, subsequently determine a delta tilt angle as the difference between a currently measured tilt angle and the reference tilt angle, compare the delta tilt angle to different thresholds, and alter the position of the displayed element on the display at a rate that is based on the number of the thresholds the delta tilt angle has exceeded.03-17-2011
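The tilt-based navigation in 20110063216 alters the displayed element's position at a rate based on how many thresholds the delta tilt angle has exceeded. That counting logic can be sketched directly; the threshold and rate values below are hypothetical tuning constants, not taken from the application:

```python
def scroll_rate(current_tilt, reference_tilt,
                thresholds=(5.0, 15.0, 30.0), rates=(0, 1, 4, 10)):
    """Map the delta tilt angle (degrees) to a scroll rate.

    The delta is the difference between the currently measured tilt
    and the stored reference tilt; the rate grows with the number of
    thresholds the delta has exceeded. thresholds/rates are
    hypothetical tuning values.
    """
    delta = abs(current_tilt - reference_tilt)
    exceeded = sum(1 for t in thresholds if delta > t)
    return rates[exceeded]
```

A small tilt inside the first threshold leaves the display still; progressively larger tilts step the rate up through the table.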
20110063215REMOTE CONTROL SYSTEM AND REMOTE CONTROL METHOD - A remote control system includes a PC and an MFP capable of remotely controlling an external device. The PC includes a browsing portion to receive an operation screen from the MFP, a display control portion to allow a projector to project the received operation screen onto a projection plane, and a position detection portion to detect a position pointed to by a user in the projected operation screen. The browsing portion transmits to the MFP a command which is included in the operation screen and is related to the detected position in the operation screen. The MFP includes an operation screen transmission portion to transmit to the PC the operation screen, including a command for specifying control, for accepting an operation, a command reception portion to receive a command transmitted from the PC, and a process control portion to control the external device or the device itself in accordance with the received command.03-17-2011
20110063214Display and optical pointer systems and related methods - Display and optical pointer systems and related methods are disclosed that utilize LEDs in a display device to respond to optical signals from optical pointing devices. In part, the disclosed embodiments relate to displays with arrays of LEDs and associated pointing devices that communicate with individual LEDs in the arrays using visible light. The LED arrays can produce images directly as in LED billboards and sports arena scoreboards or can produce the backlight for LCD screens for instance. The pointing devices communicate with individual pixels or groups of pixels using a beam of light that may or may not be modulated with data, which is detected by the LEDs in the array that are exposed to the beam. Periodically, the LEDs in an array stop producing light and are configured with an associated driver device to detect light from the pointing device. Such configurations enable the user to point and click at on screen displays much like a computer mouse.03-17-2011
20110063213REMOTE TOUCHPAD DEVICE FOR VEHICLE AND CONTROL METHOD THEREOF - The present invention features a remote touchpad device for a vehicle, which preferably comprises a circuit board having luminous elements installed at predetermined intervals along the circumference of a circle to irradiate light and at least one light-receiving element to receive the light from the luminous elements, a pad provided on an upper part of the circuit board to make the light from the luminous elements reflected by an approaching or contacting object and incident to the light-receiving element, a controller controlling a user interface by calculating the position of the object with 3D coordinates based on the amount of light incident to the light-receiving element, and a housing forming the exterior of the circuit board, the pad, and the controller. The invention also features a method of controlling the remote touchpad device.03-17-2011
20130063349OPTICAL NAVIGATION DEVICE - An optical navigation device may have an adaptive sleep mode for preventing unwanted scrolling inputs. A motion indicator may move a device between a sleep mode and an active mode. According to the sleep mode, a number of different sleep states are defined which have further reduced frame rates. The device may only be woken from the deeper sleep modes once repeated motion events are detected. This may prevent the device from being woken accidentally, while preserving the user experience.03-14-2013
20130063350SPATIALLY-CORRELATED MULTI-DISPLAY HUMAN-MACHINE INTERFACE - A human-machine interface involves plural spatially-coherent visual presentation surfaces at least some of which are movable by a person. Plural windows or portholes into a virtual space, at least some of which are handheld and movable, are provided by using handheld and other display devices. Aspects of multi-dimensional spatiality of the moveable window (e.g., relative to another window) are determined and used to generate images. As one example, the moveable window can present a first person perspective “porthole” view into the virtual space, this porthole view changing based on aspects of the moveable window's spatiality in multi-dimensional space relative to a stationary window. A display can present an image of a virtual space, and an additional, moveable display can present an additional image of the same virtual space.03-14-2013
20130063348POINTING DEVICE WITH MULTIPLE VIEW ANGLES - A pointing device includes two lenses with wide and narrow view angles respectively. In a short distance range, the pointing device utilizes the lens with wide view angle to increase visible range. In a long distance range, the pointing device utilizes the lens with narrow view angle to increase size of a formed image of a reference point. In addition, the pointing device senses images through both of the lenses with wide and narrow view angles to obtain rotational information. The pointing device can provide not only positional information but also angular information.03-14-2013
20130162537ORIENTATION CALCULATION APPARATUS, STORAGE MEDIUM HAVING ORIENTATION CALCULATION PROGRAM STORED THEREIN, GAME APPARATUS, AND STORAGE MEDIUM HAVING GAME PROGRAM STORED THEREIN - An orientation calculation apparatus obtains data from an input device including at least a gyroscope and an acceleration sensor, and calculates an orientation of the input device in a three-dimensional space. Orientation calculation means calculates the orientation of the input device in accordance with an angular rate detected by the gyroscope. Acceleration vector calculation means calculates an acceleration vector representing an acceleration of the input device in accordance with acceleration data from the acceleration sensor. Correction means corrects the orientation of the input device such that a direction of the acceleration vector in the space approaches a vertically downward direction in the space. Also, the correction means corrects the orientation of the input device such that a directional change before and after the correction is minimized regarding a predetermined axis representing the orientation of the input device.06-27-2013
20120194433IMAGE PROCESSING APPARATUS, IMAGE PROCESSING METHOD, AND STORAGE MEDIUM - There is provided an image processing apparatus comprising: an input unit configured to input object data including reflection characteristics of an object; an acquisition unit configured to acquire observation position data indicating the observation position of an observer and light source data indicating a surrounding light source around the image capture unit, on the basis of image data captured by the image capture unit; and a generation unit configured to generate image data of an image that includes the object placed on the display unit, on the basis of the object data, the observation position data, and the light source data, wherein the image data indicate the image which is observed at the observation position when light from the light source is reflected by the object.08-02-2012
20090128489METHODS AND DEVICES FOR REMOVING UNINTENTIONAL MOVEMENT IN 3D POINTING DEVICES - Systems and methods according to the present invention describe 3D pointing devices and methods which detect movement of the 3D pointing device and remove unintentional movement from the output readings.05-21-2009
20090009471INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, AND CONTROL METHOD - An input apparatus outputting input information for controlling a movement of a user interface displayed on a screen is provided. The input apparatus includes: an angular velocity output unit for outputting a first angular velocity about a first axis, a second angular velocity about a second axis, and a third angular velocity about a third axis; a combination calculation unit calculating a first combined angular velocity as a combination result of two angular velocities obtained by respectively multiplying the second and third angular velocities by two migration coefficients of a predetermined ratio; and an output unit outputting, as the input information, information on the first angular velocity for controlling a movement of the user interface on the screen in an axial direction corresponding to the second axis and information on the first combined angular velocity for controlling the movement of the user interface on the screen in an axial direction corresponding to the first axis.01-08-2009
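The "two migration coefficients of a predetermined ratio" in 20090009471 amount to a weighted sum of the second- and third-axis angular rates. A minimal sketch, with a hypothetical 0.7:0.3 ratio standing in for the predetermined one:

```python
def combined_angular_velocity(omega2, omega3, k2=0.7, k3=0.3):
    """Combine second- and third-axis angular rates by multiplying
    each by a migration coefficient and summing, yielding the
    'first combined angular velocity'. The k2:k3 ratio here is a
    hypothetical choice, not one from the application.
    """
    return k2 * omega2 + k3 * omega3
```

Varying the ratio shifts how much roll versus yaw (for example) contributes to cursor motion along one screen axis.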
20090009470WRISTWATCH-TYPE MOBILE TERMINAL HAVING PRESSURE SENSOR AND METHOD FOR RECEIVING USER INPUTS THEREIN - Disclosed is a wristwatch-type mobile terminal and method for receiving user inputs therein, which can efficiently receive user inputs through a pressure sensor. The method for receiving user inputs in a wristwatch-type mobile terminal includes providing a sensor unit to the wristwatch-type mobile terminal and detecting a user input through the sensor unit, deciding an operation corresponding to the location of the sensor unit depending on the user input, determining whether the decided operation relates to movement of a pointer movable on a display screen, determining whether the pressure of the user input is more than a preset value when the decided operation relates to the movement of the pointer, and moving the pointer a first distance and displaying the pointer when the pressure of the user input is not more than the preset value, or moving the pointer a second distance and displaying the pointer when the pressure of the user input is more than the preset value.01-08-2009
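The pressure-dependent pointer movement in 20090009470 is a simple two-level threshold: light presses move the pointer a short distance, firm presses a longer one. A sketch with hypothetical constants:

```python
def pointer_step(pressure, threshold=0.5, short_step=2, long_step=10):
    """Choose pointer travel distance from the input pressure.

    Below the preset threshold the pointer moves a first (short)
    distance; above it, a second (longer) distance. The threshold
    and step sizes are hypothetical values.
    """
    return long_step if pressure > threshold else short_step
```

This gives coarse/fine pointer control from a single pressure sensor without any extra buttons.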
20090009469Multi-Axis Motion-Based Remote Control - Motion-based control of an electronic device uses an array of at least three reference elements forming a triangle. An image sensor (e.g., a video camera), which may be located on a user-manipulated device, captures an image of the array. The array image has a pattern formed by a nonparallel projection of the reference triangle onto the image sensor. The pattern carries information of the relative position between the image sensor and the reference element array, and changes as the relative position changes. The pattern is identified and used for generating position information, which may express a multidimensional position of the user-manipulated device with respect to three axes describing a translational position, and three rotational axes describing pitch, roll and yaw motions. The control system and method are particularly suitable for videogames.01-08-2009
20120113005DUAL POINTER MANAGEMENT METHOD USING COOPERATING INPUT SOURCES AND EFFICIENT DYNAMIC COORDINATE REMAPPING - The pointer management technology establishes a protocol and method for dual pointer management in both absolute input mode and relative input mode. The method defines a set of properties/constraints for contextual dynamic remapping between input sensor coordinates and target screen coordinates. The remapping of the left pointer (respectively the right pointer) depends on the position of the right pointer (respectively the left pointer) in the target screen space. This inter-dependence enables a more flexible and more powerful interaction as it exploits the contextual layout to re-estimate the remapping transformations at each instant.05-10-2012
20120113004COMPUTER READABLE RECORDING MEDIUM RECORDING IMAGE PROCESSING PROGRAM AND IMAGE PROCESSING APPARATUS - Displayed region size data indicating a size of a screen of a display device, or a size of a region in which an image of a virtual space is displayed on the screen, is obtained. Distance data indicating a distance between a user and the display device is obtained. A position and an angle of view of the virtual camera in the virtual space are set based on the displayed region size data and the distance data.05-10-2012
20120113003DISPLAY SURFACE AND CONTROL DEVICE COMBINED THEREWITH FOR A DATA PROCESSING SYSTEM - A combined display surface and a control device for a data processing system, wherein the position of a light beam hitting the display surface is measured and the measured result is used by the data processing system as a basis for determining a cursor position on the display surface. Several strip-shaped optical position detectors are arranged along the edge of the display surface, the measured signals of which are fed into the data processing system. The cross-sectional shape of the indicator beam is formed by several lines which protrude both the display surface and the position detectors arranged thereon. The optical position detectors are formed by a layered structure made of organic material.05-10-2012
20120113002METHOD AND APPARATUS FOR CONTROLLING AN OUTPUT DEVICE OF A PORTABLE ELECTRONIC DEVICE - According to embodiments described in the specification, a method and apparatus are provided for controlling an output device of a portable electronic device comprising a processor, a first motion sensor, a second motion sensor and an output device. The method comprises: receiving at the processor, from the first motion sensor, first motion data representing movement of an external object relative to the portable electronic device; receiving at the processor, from the second motion sensor, second motion data representing movement of the portable electronic device; generating, at the processor, third motion data based on the first and second motion data, the third motion data representing movement of the external object; and, controlling the output device based on the third motion data.05-10-2012
20120235908MULTI-DIRECTIONAL REMOTE CONTROL SYSTEM AND METHOD WITH AUTOMATIC CURSOR SPEED CONTROL - A multi-directional remote control system and method is adapted for use with an entertainment system of a type including a display such as a monitor or TV and having display functions employing a mouse-type control. The remote controller may be conveniently held in one hand of a user and still provides full mouse-type functionality. The remote control system images the controller to detect relative motion between the controller and the screen. This position information is used for control of a cursor or other GUI interface, with automatic control of cursor speed based on detected controller distance from the screen and characteristic hand movement.09-20-2012
20120235907SENSOR MAPPING - Techniques, systems and computer program products are disclosed for providing sensor mapping. In one aspect, a method includes receiving input from a user. The received input includes at least one of motion, force and contact. In addition, a sensor signal is generated based on the received input. From a choice of data structures a data structure associated with a selected application having one or more functions is identified. The data structure indicates a relationship between the generated sensor signal and the one or more functions of the selected application. The generated sensor signal is selectively mapped into a control signal for controlling the one or more functions of the selected application by using the identified data structure.09-20-2012
20120235906APPARATUS AND METHOD FOR INPUTTING INFORMATION BASED ON EVENTS - Disclosed are an apparatus and a method for inputting events. An embodiment of the present invention can generate left and right click events, in addition to activating and stopping pointers, by sensing a rolling of the wrist, and can calculate and output a coordinate displacement according to the motion of the hand when the pointers are activated by the events. Further, the embodiment of the present invention can be applied to a large-sized display, a contactless spatial input apparatus of an HMD, entertainment such as games, and the like, and can overcome restricted environments with a gesture input scheme under special environments.09-20-2012
20120235905POINTING METHOD, A DEVICE AND SYSTEM FOR THE SAME - The invention shows a method to control a pointing device with an angular rate sensor, which comprises generating an ensemble of orthogonal unit vector associated signals by at least one angular rate sensor to represent angular rates in a dimensional space for each mutually orthogonal unit vector direction of said dimensional space, amplifying at least one of said signals non-linearly for determination of the cursor's (x,y) coordinates on a screen, and applying a decision criterion to determine the state of the pointing device based on said unit vector associated signals. The invention also shows a pointer utilising the method and a system comprising such a pointer.09-20-2012
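The non-linear amplification in 20120235905 is a common air-pointing technique: slow hand motions get fine cursor control while fast motions cover more of the screen. A sketch using a sign-preserving power law; the gain and exponent are hypothetical tuning values, since the abstract does not specify the non-linearity:

```python
def cursor_delta(angular_rate, gain=300.0, power=1.5):
    """Amplify an angular-rate signal non-linearly into a cursor
    displacement (pixels per update). Preserves the sign of the
    motion; gain and power are hypothetical tuning constants.
    """
    sign = 1.0 if angular_rate >= 0 else -1.0
    return sign * gain * abs(angular_rate) ** power
```

With power greater than one, doubling the angular rate more than doubles the cursor displacement, which is the usual feel of such pointer acceleration curves.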
20120235904Method and System for Ergonomic Touch-free Interface - With the advent of touch-free interfaces such as described in the present disclosure, it is no longer necessary for computer interfaces to be in predefined locations (e.g., desktops) or configurations (e.g., rectangular keyboard). The present invention makes use of touch-free interfaces to encourage users to interface with a computer in an ergonomically sound manner. Among other things, the present invention implements a system for localizing human body parts such as hands, arms, shoulders, or even the full body, with a processing device such as a computer along with a computer display to provide visual feedback on the display that encourages a user to maintain an ergonomically preferred position with ergonomically preferred motions. For example, the present invention encourages a user to maintain his motions within an ergonomically preferred range without having to reach out excessively or repetitively.09-20-2012
20120235903APPARATUS AND A METHOD FOR GESTURE RECOGNITION - The present invention relates to a method for gesture recognition and an apparatus for gesture recognition carrying out the method. More specifically, the present invention relates to a method for gesture recognition recognizing a finger gesture by using depth information and an apparatus for gesture recognition carrying out the method.09-20-2012
20130162534Device, Method, and Graphical User Interface for Manipulating a Three-Dimensional Map View Based on a Device Orientation - An electronic device displays on a display a first three-dimensional map view of a respective map location. The first three-dimensional map view is viewed from a first angle while an orientation of the electronic device corresponds to a first orientation. The electronic device detects a rotation of the electronic device with at least one orientation sensor, and determines a respective orientation of the electronic device. The respective orientation is distinct from the first orientation. While detecting the rotation of the electronic device, the electronic device updates the first three-dimensional map view with a respective three-dimensional map view of the respective map location. The respective three-dimensional map view is viewed from a respective angle distinct from the first angle. The respective angle is determined in accordance with the respective orientation of the electronic device.06-27-2013
20130162535MANIPULATION INPUT DEVICE WHICH DETECTS HUMAN HAND MANIPULATIONS FROM CAPTURED MOTION IMAGES - When a vehicle navigation system is manipulated by taking pictures of a user's hand motions and gestures with a camera, the associated hand shapes and hand motions increase as the number of apparatuses and operational objects increases, making manipulation complex for the user. Furthermore, in detecting a hand with the camera, when the image of a face having color tone information similar to that of a hand appears in an image taken with the camera, or when outside light rays such as sun rays or illumination rays vary, detection accuracy is reduced. To overcome such problems, a manipulation input device is provided that includes a limited hand manipulation determination unit and a menu representation unit, whereby a simple manipulation can be achieved and the manipulation can be determined accurately. In addition, detection accuracy can be improved by a unit that selects a single result from results determined by a plurality of determination units, based on images taken with a plurality of cameras.06-27-2013
20130162536SYSTEMS AND METHODS FOR ENABLING OR ACCESSING OR VIEWING COMPONENTS, INVOLVING AN INPUT ASSEMBLY AND A SCREEN - The present invention in a preferred embodiment provides systems and methods for enabling or accessing or viewing components through a graphical user interface, wherein the system comprises an input assembly and a screen.06-27-2013
20130207896Augmented reality display system and method of display - The present invention describes a display system that includes a display, including a display screen; a viewpoint assessment component to determine a viewpoint of a user positioned in front of the display screen; and an object tracking component to track the user's manipulation of an object positioned behind the display screen.08-15-2013
20130207895EYE TRACKING METHOD AND DISPLAY APPARATUS USING THE SAME - A display apparatus employs two tracking units to track the gaze of a user. The display apparatus includes a first tracking unit to generate position information on a user positioned relative to a displayed image; and a second tracking unit to track a gaze of the user upon the displayed image, based on the position information. A method of eye tracking using the display apparatus includes steps of displaying an image; generating position information on a user positioned relative to the displayed image; and tracking a gaze of the user upon the displayed image, based on the position information.08-15-2013
20130207894INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD AND PROGRAM - There is provided an information processing device including a movement information acquisition part acquiring information which is based on movement of an operation device, and a control information generation part generating, based on the information, control information for changing a display status continuously in a non-linear manner according to the movement.08-15-2013
20110025603Spatial, Multi-Modal Control Device For Use With Spatial Operating System - A system comprising an input device includes a detector coupled to a processor. The detector detects an orientation of the input device. The input device has multiple modal orientations corresponding to the orientation. The modal orientations correspond to multiple input modes of a gestural control system. The detector is coupled to the gestural control system and automatically controls selection of an input mode in response to the orientation.02-03-2011
20110304540IMAGE GENERATION SYSTEM, IMAGE GENERATION METHOD, AND INFORMATION STORAGE MEDIUM - An image generation system includes an image information acquisition section that acquires image information from an image sensor, a motion information acquisition section that acquires motion information about an operator based on the image information from the image sensor, an object control section that moves an object in a movement area based on the motion information about the operator, and an image generation section that generates an image displayed on a display section. The object control section limits movement of the object so that the object does not move beyond a movement limiting boundary set in the movement area even when it has been determined that the object has moved beyond the movement limiting boundary based on the motion information.12-15-2011
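The movement-limiting behavior described in this entry amounts to clamping the object's position to the boundary of the movement area. A minimal sketch (the names and the 2D coordinates are illustrative, not the patent's implementation):

```python
def clamp_to_boundary(pos, bounds_min, bounds_max):
    """Keep an object's (x, y) position inside the movement limiting
    boundary even when motion input would carry it beyond."""
    return tuple(min(max(p, lo), hi)
                 for p, lo, hi in zip(pos, bounds_min, bounds_max))
```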
20110141013USER-INTERFACE APPARATUS AND METHOD FOR USER CONTROL - An apparatus comprising at least two sensors, a pointing device and an object-recognition unit. The sensors are at different locations and are capable of detecting a signal from at least a portion of a user. The pointing device is configured to direct a user-controllable signal that is detectable by the sensors. The object-recognition unit is configured to receive output from the sensors and to determine locations of the portion of the user and of the pointing device based on the output. The object-recognition unit is also configured to calculate a target location pointed to by the user with the pointing device, based upon the determined locations of the portion of the user and of the pointing device.06-16-2011
20110141014MOVABLE TOUCHPAD WITH HIGH SENSITIVITY - A highly sensitive movable touchpad is disclosed in the present invention. It is used for laptop computers and has a slidable template for users to move so that a cursor can be controlled by the touchpad. A resistive or capacitive detecting surface can be applied for detecting users' click, double click, drag, or scroll motions on any point of the surface. Additionally, an optical displacement sensor is provided under the slidable template for detecting surface information on the back surface of the slidable template. A sequence of images of surface movement is processed by an image processing unit. Then, relative movement information is calculated and sent to an operating system in the computer. The operating system controls the cursor with the relative movement information. The present invention uses edge detectors for dynamically controlling the cursor and calibrating the location of the cursor so that positioning of the touchpad is synchronous with the cursor.06-16-2011
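The optical displacement sensing in this entry (comparing successive images of the template's back surface to recover relative movement) can be approximated with simple block matching. The function below is an illustrative sketch, not the patented algorithm, and assumes small grayscale frames as NumPy arrays:

```python
import numpy as np

def estimate_shift(prev, curr, max_shift=3):
    """Return the (dx, dy) that best aligns curr with prev, by
    exhaustively testing small shifts and scoring each with the
    mean squared difference over the overlapping region."""
    h, w = prev.shape
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # hypothesis: curr[y + dy, x + dx] == prev[y, x]
            a = prev[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
            b = curr[max(0, dy):h - max(0, -dy), max(0, dx):w - max(0, -dx)]
            err = float(np.mean((a - b) ** 2))
            if err < best_err:
                best_err, best = err, (dx, dy)
    return best
```

The recovered per-frame shifts are what would be accumulated and sent to the operating system as relative cursor movement.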
20110279367PRESENTER - A presenter includes a main body, a light source, a pointing input unit and a wireless transmission unit. The light source can be a laser diode or any other light-emitting element capable of emitting a laser beam. The pointing input unit can be a magnetically sensitive pointing unit or an optical cursor controller. The pointing input unit is operable to output a pointing control signal to a host via the wireless transmission unit. The presenter is capable of projecting a laser beam and controlling a cursor of the host device via the wireless transmission unit.11-17-2011
20110279369HYBRID POINTING DEVICE - The present invention discloses a hybrid pointing device including an optical navigation module and a pointing module. The optical navigation module is configured to replace the conventional buttons of a conventional pointing device, such as an optical mouse or a trackball mouse. The optical navigation module is configured to sense gestures of at least one object operated by a user to activate commands associated with particular programs running on the host. Since the optical navigation module is configured to sense only gestures of the object and not the movement of the hybrid pointing device relative to a surface, its resolution need only be high enough for sensing gestures and need not be especially high.11-17-2011
20110279368INFERRING USER INTENT TO ENGAGE A MOTION CAPTURE SYSTEM - Techniques are provided for inferring a user's intent to interact with an application run by a motion capture system. Deliberate user gestures to interact with the motion capture system are disambiguated from unrelated user motions within the system's field of view. An algorithm may be used to determine the user's aggregated level of intent to engage the system. Parameters in the algorithm may include posture and motion of the user's body, as well as the state of the system. The system may develop a skeletal model to determine the various parameters. If the system determines that the parameters strongly indicate an intent to engage the system, then the system may react quickly. However, if the parameters only weakly indicate an intent to engage the system, it may take longer for the user to engage the system.11-17-2011
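The aggregated level of intent to engage could, for instance, be a weighted combination of cue scores. Everything below (the cues chosen, the weights, and the 3 m fade-out) is illustrative, not the patented algorithm:

```python
def engagement_level(facing_camera, dist_to_center, speed_toward_center,
                     weights=(0.5, 0.3, 0.2)):
    """Aggregate a few engagement cues into a single [0, 1] score.
    facing_camera: 1.0 if the user's stance faces the depth camera.
    dist_to_center: metres from the central interaction area.
    speed_toward_center: m/s moving toward that area (negative = away)."""
    proximity = max(0.0, 1.0 - dist_to_center / 3.0)   # fades out at 3 m
    approach = max(0.0, min(1.0, 0.5 + speed_toward_center))
    w_face, w_prox, w_appr = weights
    return w_face * facing_camera + w_prox * proximity + w_appr * approach
```

A score near 1.0 would let the system engage quickly, while a weak score would require the cues to persist for longer, matching the fast/slow engagement behavior the abstract describes.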
20100171698DIGITAL TELEVISION AND METHOD OF DISPLAYING CONTENTS USING THE SAME - A DTV and a method of displaying content using the same are provided. A DTV includes: a plurality of first and second display units physically isolated from each other; a communication unit configured to communicate with an external remote control; and a control unit configured to move a content between the first and second display units in response to a motion of the remote control sensed by a signal received from the remote control through the communication unit.07-08-2010
20100171697METHOD OF CONTROLLING VIEW OF STEREOSCOPIC IMAGE AND STEREOSCOPIC IMAGE DISPLAY USING THE SAME - A method of controlling a view of a stereoscopic image and a stereoscopic image display using the same are disclosed. The method of controlling a view of a stereoscopic image includes: detecting a position information of a viewer from an output of a sensor; changing parameters for rendering a viewing angle and a depth information according to the position information; generating a left-eye image and a right-eye image in which a viewing angle and a depth information are changed in accordance with the parameters; and displaying the left-eye image and the right-eye image on a stereoscopic image display.07-08-2010
20100171696MOTION ACTUATION SYSTEM AND RELATED MOTION DATABASE - The claimed invention relates to an interactive system for recognizing a single or a series of hand motion of the user to control or create applications used in multimedia. In particular, the system includes a motion sensor detection unit (MSDU) 07-08-2010
20100225582INFORMATION PROCESSING APPARATUS, STORAGE MEDIUM HAVING INFORMATION PROCESSING PROGRAM STORED THEREIN, INFORMATION PROCESSING SYSTEM, AND DISPLAY RANGE CONTROL METHOD - An information processing apparatus inputs an angular rate detected by a gyroscope included in an input device and displays an image on a display device. The information processing apparatus initially calculates an orientation of the input device based on the angular rate. Then, the information processing apparatus calculates a coordinate point at an intersection between a line extending from a predetermined position in a predetermined space toward a vector representing the orientation and a predetermined plane within the predetermined space. A display range of a display target that is to be displayed on the display device is controlled based on the coordinate point.09-09-2010
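The coordinate-point computation this entry describes is a standard ray–plane intersection: follow the orientation vector from a predetermined position until it meets the predetermined plane. A plain-Python sketch (the function and parameter names are mine):

```python
def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Intersect the line origin + t*direction with the plane through
    plane_point having normal plane_normal; returns the 3D point,
    or None when the direction is parallel to the plane."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None  # pointing parallel to the plane
    t = dot([p - o for p, o in zip(plane_point, origin)], plane_normal) / denom
    return tuple(o + t * d for o, d in zip(origin, direction))
```

The resulting in-plane coordinate is what the apparatus would use to pan the display range over the display target.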
20090278799COMPUTER VISION-BASED MULTI-TOUCH SENSING USING INFRARED LASERS - The claimed subject matter provides a system and/or a method that facilitates detecting a plurality of inputs simultaneously. A laser component can be coupled to a line generating (LG) optic that can create a laser line from an infrared (IR) laser spot, wherein the laser component and line generating (LG) optic emit a plane of IR light. A camera device can capture a portion of imagery within an area covered by the plane of light. The camera device can be coupled to an IR-pass filter that can block visible light and pass IR light in order to detect a break in the emitted plane of IR light. An image processing component can ascertain a location of the break within the area covered by the emitted plane of IR light.11-12-2009
20080266253SYSTEM AND METHOD FOR TRACKING A LASER SPOT ON A PROJECTED COMPUTER SCREEN IMAGE - A system and method for tracking a laser spot on a projected computer screen image computes a projective transformation matrix using the projected computer screen image electronically captured in a frame of image data. The projective transformation matrix is computed by fitting a polygon to a contour of each graphical object in the frame of image data and determining whether the polygon for each graphical object satisfies a set of predefined parameters to find a candidate polygon that corresponds to an outline of the projected computer screen image in the frame of image data.10-30-2008
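Once a candidate polygon matching the projected screen outline is found, the projective transformation matrix can be recovered from its four corners. Below is a standard direct-linear-transform sketch (assumes NumPy; not necessarily the patent's exact formulation):

```python
import numpy as np

def homography_from_corners(src, dst):
    """Solve the 3x3 projective transform H (with H[2,2] fixed to 1)
    that maps four source corners onto four destination corners."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def apply_homography(H, pt):
    """Map an image-frame point through H into screen coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)
```

Mapping the detected laser-spot position in the camera frame through H then yields its location in screen coordinates.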
20100007604TOUCH-SENSITIVE CONTROL SYSTEMS AND METHODS - Touch-sensitive control systems and methods are provided. The touch-sensitive control system includes a touch interface and a sensor. The touch interface includes a first zone and a second zone having an icon corresponding to a function. The sensor is disposed under the touch interface for detecting contacts on the physical interface. When a contact on the icon of the second zone is detected by the sensor, the corresponding function is activated. When a movement on the first zone is detected by the sensor, an operation corresponding to the function is performed according to the movement.01-14-2010
20110122062MOTION RECOGNITION APPARATUS AND METHOD - Provided are a motion recognition apparatus and method, and more particularly, a motion recognition apparatus and method that move a pointer only when the user intends it, using a touch sensor included in a pointing device that moves the pointer according to motion sensed by a motion sensor.05-26-2011
20090207134REMOTE CONTROL APPARATUS WITH INTEGRATED POSITIONAL RESPONSIVE ALPHABETIC KEYBOARD - In accordance with one embodiment, a remote control apparatus includes a first transmission means for use when the remote control apparatus is in a horizontal orientation and a second transmission means for use when the remote control apparatus is in a vertical orientation. The remote control apparatus further includes a keypad having a plurality of keys that have a first set of labels for use in the horizontal orientation and a second set of labels for use in the vertical orientation. In addition, a means for determining whether the remote control is in the horizontal orientation or the vertical orientation is provided as part of the remote control apparatus. At least some of the keys have a first functionality when in the horizontal orientation and a second functionality when in the vertical orientation.08-20-2009
20110298708Virtual Touch Interface - A user may issue commands to a computing device by moving a pointer within a light field. Sensors may capture light reflected from the moving pointer. A virtual touch engine may analyze the reflected light captured as light portions in a sequence of images by the sensors to issue a command to a computing device in response to the movements. Analyzing the sequence of images may include finding the light portions in the sequence of images, determining a size of the light portions, and determining a location of the light portions.12-08-2011
20110285623Motion Sensing System - A motion sensing system includes a hand-held device and a receiver device. The hand-held device includes a microcontroller, a G-sensor (one 3-axis accelerometer), only one 2-axis gyroscope, and a wireless transmitter. The receiver device is preferably a dongle and includes a microcontroller and a wireless receiver. A first axis of the 2-axis gyroscope is parallel to the Z axis of the hand-held device and the second axis of the 2-axis gyroscope forms an acute angle α with the X axis of the hand-held device. The acute angle α allows the microcontroller of the receiver device to calculate rotational data around each of the three axes of the hand-held device.11-24-2011
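One way to see why the tilt helps: if the second gyroscope axis lies in the X–Z plane at angle α from X (an assumed geometry; the abstract only states the acute angle), the two gyroscope readings form a solvable pair of linear equations in the X and Z rotation rates:

```python
import math

def recover_rates(g1, g2, alpha_deg):
    """g1: rate about axis 1 (parallel to Z);
    g2: rate about axis 2 (tilted alpha degrees from X toward Z,
        an assumed geometry).
    Returns (wx, wz); the remaining Y rate would have to come from
    the accelerometer data, per the abstract's sensor set."""
    a = math.radians(alpha_deg)
    wz = g1                                 # axis 1 measures Z directly
    wx = (g2 - wz * math.sin(a)) / math.cos(a)
    return wx, wz
```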
20110285626GESTURE RECOGNIZER SYSTEM ARCHITECTURE - Systems, methods and computer readable media are disclosed for a gesture recognizer system architecture. A recognizer engine is provided, which receives user motion data and provides that data to a plurality of filters. A filter corresponds to a gesture that may then be tuned by an application receiving information from the gesture recognizer, so that the specific parameters of the gesture—such as an arm acceleration for a throwing gesture—may be set on a per-application level, or multiple times within a single application. Each filter may output to an application using it a confidence level that the corresponding gesture occurred, as well as further details about the user motion data.11-24-2011
20110285625INFORMATION PROCESSING APPARATUS AND INPUT METHOD - According to one embodiment, an information processing apparatus comprises a touch-screen display, a detection module and an output module. The detection module is configured to detect a number of touch positions on the touch-screen display. The output module is configured to output first data indicative of a touch position on the touch-screen display in order to activate a function associated with a touched display object on the touch-screen display, the output module being configured to output second data indicative of a direction of movement and an amount of movement of a touch position on the touch-screen display, in place of the first data, in order to move a cursor on a screen of a display, if the detection module detects that a plurality of positions on the touch-screen display are touched.11-24-2011
20110285624SCREEN POSITIONING SYSTEM AND METHOD BASED ON LIGHT SOURCE TYPE - A screen positioning system includes a display, a light emitter, an image capture device and a computer. The light emitter emits light to a position on the display in which an object icon is displayed to form a projection point thereon. The image capture device captures whole images of the screen; and the computer determines the position or properties of the projection point, and further determines an operating command of the object icon corresponding to the position or properties of the projection point, and then implements the determined operating command and directs the display to display a result of implementing the operating command.11-24-2011
20110285622RENDITION OF 3D CONTENT ON A HANDHELD DEVICE - A handheld device having a display and a front-facing sensor and a back-facing sensor is able to render 3D content in a realistic and spatially correct manner using position-dependent rendering and view-dependent rendering. In one scenario, the 3D content is only computer-generated content and the display on the device is a typical, non-transparent (opaque) display. The position-dependent rendering is performed using either the back-facing sensor or a front-facing sensor having a wide-angle lens. In another scenario, the 3D content is composed of computer-generated 3D content and images of physical objects and the display is either a transparent or semi-transparent display where physical objects behind the device show through the display. In this case, position-dependent rendering is performed using a back-facing sensor that is actuated (capable of physical panning and tilting) or is wide-angle, thereby enabling virtual panning.11-24-2011
20110199301Sensor-based Pointing Device for Natural Input and Interaction - A pointing or input device is generally cylindrical or puck-shaped, and has various sensors for sensing 2D, 3D, and high degree of freedom motion for more natural user interaction.08-18-2011
20110298710HAND-HELD POINTING DEVICE, SOFTWARE CURSOR CONTROL SYSTEM AND METHOD FOR CONTROLLING A MOVEMENT OF A SOFTWARE CURSOR - A hand-held pointing device for controlling a movement of a software cursor comprises an acceleration detector and an image capturing unit. The acceleration detector determines an inclination parameter, wherein the movement of the software cursor in a vertical direction is controllable based on the inclination parameter. Further, the image capturing unit records images within the visible spectral range, wherein the movement of the software cursor in a horizontal direction is controllable based on the recorded images.12-08-2011
20110298709SYSTEM AND METHOD FOR DIGITAL RECORDING OF HANDPAINTED, HANDDRAWN AND HANDWRITTEN INFORMATION - The present invention provides a method and system for recording hand-painted, hand-drawn and handwritten information defined by a hand and/or finger movement. The system corresponding to the invented method comprises: a computing device with a display; an input device comprising an end-point coupled to a force sensor, additional motion sensors, and an IC circuit for digitizing the information from the sensors and processing the data related to the force and motion vector components; and hardware and software for providing a digital description of how the device has been pressed to the surface and how the device has been moved. Besides the above-mentioned applications, the method and system can also be used for precise cursor navigation on the display, computer gaming, as a universal remote control for electronic equipment and appliances, or as a security device with multi-level authentication. With the addition of several components the input device can be used as a smart cell-phone.12-08-2011
20120098746Optical Position Detection Apparatus - This invention is to provide an optical position detection apparatus including a retroreflective member.04-26-2012
20120098745SYSTEM AND METHOD FOR HIGHLIGHTING MULTIMEDIA PRESENTATION - A projector includes an infrared camera. Projected infrared rays on a screen are detected, and the projector calculates coordinates of a center point of a projection area of the infrared ray projected on the screen. The coordinates of the center point are transformed to coordinates of a pixel point on the screen corresponding to a resolution of a projection lens of the projector. The projector determines the pixel point on the screen according to the transformed coordinates. A light source of the projector is controlled to highlight the pixel point on the screen.04-26-2012
20120098744SYSTEMS, METHODS, AND APPARATUSES FOR SPATIAL INPUT ASSOCIATED WITH A DISPLAY - An exemplary system includes a handheld user input device configured to emit a pointing signal and a selection signal from within a physical user space and directed at a display screen. The exemplary system further includes a spatial input subsystem configured to detect the pointing signal, determine a physical position within the physical user space based on the detected pointing signal, map the determined physical position within the physical user space to a cursor position on the display screen, output data representative of the cursor position for use by a display subsystem associated with the display screen, detect the selection signal, and output, in response to the selection signal, data representative of a selection command for use by the display subsystem. Corresponding systems, methods, and apparatuses are also disclosed.04-26-2012
20090085870Multimedia device - A multimedia device includes a display device, a storage device, an orientation sensor and a processing unit. The storage device has a plurality of programs, each of which is capable of rendering an operation interface on the display device. The orientation sensor is capable of detecting the orientation in which the display device is disposed and sending an orientation signal. The processing unit is electrically connected to the display device, the storage device and the orientation sensor. The operation interface shown on the display device is determined by the processing unit in accordance with the orientation signal emitted from the orientation sensor.04-02-2009
20110291928Multifunctional flexible handwriting board and method for manufacturing the same - The present invention discloses a multifunctional flexible handwriting board and its manufacturing method. The multifunctional flexible handwriting board comprises a name plate, a linear plate, a shock absorbing layer and a circuit board. The name plate is made of a flexible material and has an upper surface used as a mouse pad. The linear plate is made of a flexible material and has a sensing area at the middle. The shock absorbing layer is made of a soft material for protecting the linear plate. The circuit board is electrically coupled to the sensing area of the linear plate for transmitting information. The method of manufacturing the multifunctional flexible handwriting board is to couple the circuit board to an edge of the linear plate and sequentially couple the name plate, the linear plate and the shock absorbing layer.12-01-2011
20110291927Smart Method and Device for Adaptive User Interface Experiences - A wireless communication device.12-01-2011
20120026088HANDHELD DEVICE WITH PROJECTED USER INTERFACE AND INTERACTIVE IMAGE - Systems and methods for a device with a user interactive image projector disposed in a distal end of the device from the user are described. In one aspect, the device is operatively configured to project at least a portion of a user interactive image on a projection surface separate from the device. The device locks at least a portion of the projected user interactive image with respect to the projection surface. Responsive to receiving user input, the device allows the user to navigate the user interactive image in accordance with the user input.02-02-2012
20110134036Touchscreen Display With Plural Cameras - A display system.06-09-2011
20090289896INPUT ARRANGEMENT FOR ELECTRONIC DEVICES - A new form of input arrangement for cursor control devices or other handheld electronic devices in which the activation surfaces are designed to allow the fingers and thumb of the user to effect commands by means of ergonomic and less repetitive motion as compared to current devices. In embodiments, sensors are associated with the user's digits which sense motion not only in the downward direction, but in multiple directions. In embodiments, the cursor control device has resilient pads to further add comfort for a user. The resulting ability of the user to vary and reduce the points of pressure and other stresses on different surfaces of the digits and corresponding nerves and muscles serves to reduce the discomfort and pain resulting from current devices.11-26-2009
20080309619Cursor control method applied to presentation system and computer readable storage medium - The invention discloses a cursor control method applied to a presentation system. The presentation system comprises a computer, an imaging plane, an optical pointer, a camera, and a projector. The projector is a mobile or built-in projector of the computer for projecting output from the computer onto the imaging plane, wherein the output of the computer comprises an internal cursor generated by the computer. The optical pointer is used for projecting an external cursor onto the imaging plane. The camera is a mobile or built-in camera of the computer for capturing an image of the imaging plane. After capturing the image, a processor of the computer detects both a first position of the external cursor and a second position of the internal cursor corresponding to the image, calculates a shift vector between the first and second positions, and moves the internal cursor based on the shift vector.12-18-2008
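The shift-vector correction in this entry reduces to a couple of lines. A sketch with illustrative names (the gain parameter is my addition, not something the patent specifies; a gain below 1.0 would spread the correction over several frames):

```python
def correct_cursor(external_pos, internal_pos, gain=1.0):
    """Move the computer's internal cursor toward the optical pointer's
    external cursor by the shift vector between the detected positions."""
    dx = external_pos[0] - internal_pos[0]
    dy = external_pos[1] - internal_pos[1]
    return (internal_pos[0] + gain * dx, internal_pos[1] + gain * dy)
```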
20090128488OPTICAL NAVIGATION DEVICE WITH CONSOLIDATED PROCESSING FOR SURFACE AND FREE SPACE NAVIGATION - An optical navigation device for operation in a surface navigation mode and a free space navigation mode. The optical navigation device includes a microcontroller, a first navigation sensor, and a second navigation sensor. The first navigation sensor is coupled to the microcontroller, and the second navigation sensor is coupled to the first navigation sensor. The microcontroller processes a movement of the optical navigation device. The first navigation sensor generates a first navigation signal in a first navigation mode. The second navigation sensor generates a second navigation signal in a second navigation mode and sends the second navigation signal to the first navigation sensor. By implementing a navigation sensor to process signals from multiple navigation sensors, the cost and size of the optical navigation device can be controlled, and a small packaging design can be used.05-21-2009
20090295719DTV CAPABLE OF RECEIVING SIGNAL FROM 3D POINTING DEVICE, AND METHOD OF EXECUTING FUNCTION AND ADJUSTING AUDIO PROPERTY OF DTV EMPLOYING 3D POINTING DEVICE - A system and method for controlling a digital TV, the system including the DTV; and a 3D pointing device. The DTV includes a receiver to receive control signals from the 3D pointing device; and a control unit to select one of a plurality of functions provided by the DTV in response to a selection signal received from the 3D pointing device, each function having a corresponding execution profile, and execute the selected function in accordance with the corresponding execution profile and a motion parameter sensed by the 3D pointing device including a direction of movement and one of a distance and velocity corresponding to the direction of movement. The 3D pointing device includes a transmitter to transmit the control signals and motion parameter; and a sensor to sense within the 3D pointing device the motion of the 3D pointing device and generate the corresponding motion parameter.12-03-2009
20110216002Calibration of Portable Devices in a Shared Virtual Space - Methods, systems, and computer programs for generating an interactive space viewable through at least a first and a second device are presented. The method includes an operation for detecting from the first device a location of the second device or vice versa. Further, synchronization information data is exchanged between the first and the second device to identify a reference point in a three-dimensional (3D) space relative to the physical location of the devices in the 3D space. The devices establish the physical location in the 3D space of the other device when setting the reference point. The method further includes an operation for generating views of an interactive scene in the displays of the first and second devices. The interactive scene is tied to the reference point and includes virtual objects. The view in the display shows the interactive scene as observed from the current location of the corresponding device. Moving the device in the 3D space causes the view to change according to the perspective from the current location.09-08-2011
20090189858Gesture Identification Using A Structured Light Pattern - In at least some embodiments, a computer system includes a processor. The computer system also includes a light source. The light source provides a structured light pattern. The computer system also includes a camera coupled to the processor. The camera captures images of the structured light pattern. The processor receives images of the structured light pattern from the camera and identifies a user gesture based on distortions to the structured light pattern.07-30-2009
20090189857TOUCH SENSING FOR CURVED DISPLAYS - Described herein is an apparatus that includes a curved display surface that has an interior and an exterior. The curved display surface is configured to display images thereon. The apparatus also includes an emitter that emits light through the interior of the curved display surface. A detector component analyzes light reflected from the curved display surface to detect a position on the curved display surface where a first member is in physical contact with the exterior of the curved display surface.07-30-2009
20110205157System and Method for Information Handling System Touchpad Enablement - A disabled information handling system integrated pointing device is automatically enabled based upon user inputs detected at the integrated pointing device, such as inputs that indicate a user is unsuccessfully attempting to use the integrated pointing device. For example, a touchpad disposed at a portable information handling system housing in a disabled state automatically transitions to an enabled state when user inputs include rapid movements, movements from end to end or movements of increasing speed. The touchpad automatically transitions to an enabled state based upon movements interpreted as an attempt by the user to make inputs at the touchpad.08-25-2011
20100001952STORAGE MEDIUM STORING INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - A game apparatus includes a CPU, and the CPU controls a moving object within a virtual space on the basis of acceleration data and angular velocity data which are transmitted from a controller. For example, before the angular velocity data is above a predetermined magnitude, a position and an orientation of the moving object are controlled on the basis of the angular velocity data. When the angular velocity data is above the predetermined magnitude, an initial velocity of the moving object is decided on the basis of the acceleration data, and a moving direction (orientation) of the moving object is decided on the basis of the angular velocity data. Thereafter, the moving object moves within the virtual space according to general physical behavior.01-07-2010
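The two-phase control described here (orientation tracking until the angular rate crosses a threshold, then launch with an initial velocity taken from the acceleration) can be sketched as a tiny state machine; the threshold, scaling, and names are illustrative only:

```python
def step(state, accel, ang_rate, threshold=4.0, speed_scale=0.2):
    """One control update.  While 'aimed', the heading follows the
    angular rate; a rate above the threshold launches the object
    with an initial velocity derived from the acceleration vector."""
    if state["mode"] == "aimed":
        state["heading"] += ang_rate
        if abs(ang_rate) > threshold:
            state["mode"] = "launched"
            state["velocity"] = [a * speed_scale for a in accel]
    return state
```

After the launch, the object would be handed over to the general physics simulation the abstract mentions.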
20090315829Multi-User Pointing Apparatus and Method - An apparatus for interaction of a plurality of users with an application, comprising a plurality of pointing devices.12-24-2009
20090153477Computer mouse glove - A computer mouse glove for transferring computer mouse functions to the hand of a computer user. The glove includes: a glove member having finger fittings and a thumb fitting; a computer cursor control system having buttons and a tracking system having an optical tracking device; a computer module; a power module; a connection module; a tracking ball; and a power switch. The glove member encases a user's hand. The computer cursor control system controls functions of a cursor on a computer screen. The buttons provide mouse electrical switching functions. The tracking system controls movement of the cursor on a computer screen. The power module provides energy to the computer mouse glove. The connection module transmits electronic signals from the computer mouse glove to the computer module. The tracking ball controls movement of the computer cursor. The power switch enables a user to select either the tracking ball or the optical tracking device.06-18-2009
20090153479Positioning Device of Pointer and Related Method - A positioning device for positioning an aim point of a pointer on a screen includes a screen, a pointer and a processor. The screen is utilized for displaying a plurality of characteristic points having already-known coordinate values. The pointer is utilized for forming an aim point, and includes an image acquisition unit for acquiring an image and a calculation unit for calculating image coordinate values of the plurality of characteristic points in the image. The processor is coupled to the screen and the pointer, and is utilized for establishing a transformation matrix according to the already-known coordinate values of the plurality of characteristic points and the image coordinate values of the plurality of characteristic points in the image and for deciding the position of the aim point according to the transformation matrix.06-18-2009
20090153478Centering a 3D remote controller in a media system - An electronic device associated with a remote wand controlling the operations of the electronic device is provided. The wand may include a motion detection component operative to provide an output reflecting the motion of the wand to the electronic device, such that the movements of a cursor displayed by the electronic device may be related to the output of the motion detection component. The wand may also include an input mechanism operative to receive user inputs. Using the input mechanism, the wand may detect a user's inputs and direct the electronic device to zoom or scroll displayed objects. The electronic device may display a screen saver by which the user may select particular media items for playback while remaining in the screen saver mode. In some embodiments, the electronic device may display video with a scroll bar that includes a preview window of the video.06-18-2009
20100117959MOTION SENSOR-BASED USER MOTION RECOGNITION METHOD AND PORTABLE TERMINAL USING THE SAME - A motion sensor-based user motion recognition method and a portable terminal having a motion sensor are disclosed. The method recognizes user motions in a portable terminal. At least one parameter value is extracted from at least one user motion applied to the portable terminal. A reference parameter value serving as a user motion recognition reference is established according to the at least one extracted parameter value. The established reference parameter value is stored.05-13-2010
20090015554SYSTEM, CONTROL MODULE, AND METHOD FOR REMOTELY CONTROLLING A COMPUTER WITH OPTICAL TRACKING OF AN OPTICAL POINTER - A control module is used with a projector projecting an image frame having at least one icon from a computer onto a screen, and an optical pointer projecting an optical cursor onto the screen. The control module obtains a closed-loop optical track of the optical cursor based on positions of the optical cursor in a series of images of the projected image frame and the optical cursor captured by an image capturing device, selects an image portion of one captured image, based on the closed-loop optical track of the optical cursor, to serve as a standard sample image, compares the standard sample image with the image frame to determine an image block from the image frame most similar to the standard sample image, and generates a control signal based on the image block to control the computer to execute a command associated with at least one icon contained in the image block.01-15-2009
20120229381PUSH PERSONALIZATION OF INTERFACE CONTROLS - A computing system is configured to receive one or more depth images, from the depth camera, of a world space scene including a human target. The computing system translates a world space position of a hand of the human target to a screen space cursor position of the user interface using a virtual desktop transformation. The computing system also dynamically adjusts the virtual desktop transformation based on a history of button press actions executed by the human target.09-13-2012
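The virtual desktop transformation in the abstract above maps a hand's world-space position inside a calibration box to screen pixels, and the box itself drifts to compensate for systematic press errors. A minimal sketch under assumptions of my own (box extents, adjustment gain, and the use of an average press-error offset are all illustrative, not the patented method):

```python
class VirtualDesktop:
    """Maps a hand's world-space (x, y) inside a reach box to screen
    pixels, and nudges the box using a history of button-press errors
    (offsets between the cursor and the button actually pressed)."""

    def __init__(self, box_min=(-0.5, -0.5), box_max=(0.5, 0.5),
                 screen=(1920, 1080), gain=0.5):
        self.box_min, self.box_max = list(box_min), list(box_max)
        self.screen = screen
        self.gain = gain                  # fraction of the error corrected
        self.press_errors = []

    def to_screen(self, hand):
        coords = []
        for i in range(2):
            span = self.box_max[i] - self.box_min[i]
            n = (hand[i] - self.box_min[i]) / span
            coords.append(min(max(n, 0.0), 1.0) * self.screen[i])
        return tuple(coords)

    def record_press(self, cursor, button_center):
        """Remember how far the cursor was from the button it activated,
        then shift the box by a fraction of the running average error."""
        self.press_errors.append((button_center[0] - cursor[0],
                                  button_center[1] - cursor[1]))
        n = len(self.press_errors)
        avg = [sum(e[i] for e in self.press_errors) / n for i in range(2)]
        for i in range(2):
            # Convert the pixel error back to world units; shifting the
            # box opposite to the error moves future cursors toward it.
            span = self.box_max[i] - self.box_min[i]
            shift = -self.gain * avg[i] * span / self.screen[i]
            self.box_min[i] += shift
            self.box_max[i] += shift
```

After a press that lands short of its button, the box shifts so the same hand position maps closer to the button next time.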
20090309831WIRELESS CONTROL DEVICE AND MULTI-CURSOR CONTROL METHOD - A wireless control device and a multi-cursor control method for use with a computer system are provided. The application program of the computer system is executed to generate at least a first pointer and a second pointer. The wireless control device includes a first pointer controller, a second pointer controller and a wireless transceiver. The multi-cursor control method includes steps of issuing a first wireless control signal containing a first identification code in response to manipulation of a first user, issuing a second wireless control signal containing a second identification code in response to manipulation of a second user, receiving and transmitting the first wireless control signal and the second wireless control signal to the computer system, and controlling corresponding shifts of the first pointer and the second pointer defined in the application program according to the first identification code and the second identification code, respectively.12-17-2009
20110199303DUAL WRIST USER INPUT SYSTEM - A dual wrist user input system includes a first wrist band that conforms to a first wrist of a user. The dual wrist user input system includes a motion tracking sensor that tracks aerial motion of the first wrist of the user as aerial motion data, the motion tracking sensor being adhered to the first wrist band. The aerial motion of the first wrist of the user is performed by the user to move an indicator displayed by a display screen operably connected to a computing device. Further, the dual wrist user input system includes a second wrist band that conforms to a second wrist of the user. In addition, the dual wrist user input system includes a rotational sensor that tracks rotational movement of the second wrist of the user as rotational movement data. The rotational sensor is adhered to the second wrist band.08-18-2011
20130088429APPARATUS AND METHOD FOR RECOGNIZING USER INPUT - An apparatus includes an image sensor to obtain optical image information, a control unit to generate input recognition information based on the optical image information, and to determine a user input based on the input recognition information, and a display unit to display control information corresponding to the user input. A method for recognizing a user input includes obtaining optical image information, generating input recognition information based on the optical image information, the input recognition information including a region corresponding to an input object, and determining a user input based on the input recognition information.04-11-2013
20090066648GUI APPLICATIONS FOR USE WITH 3D REMOTE CONTROLLER - A remote wand for controlling the operations of a media system is provided. The wand may be operative to control the movement of a cursor displayed on screen by the position and orientation at which the wand is held. As the user moves the wand, the on-screen cursor may move. The user may use the wand to control a plurality of operations and applications that may be available from the media system, including for example zoom operations, a keyboard application, an image application, an illustration application, and a media application.03-12-2009
20130021245INTERACTIVE CONTENT CONTROL METHOD AND USER INTERFACE APPARATUS USING THE SAME - An interactive content control method and a user interface apparatus using the same are provided. The interactive content control method detects, on the basis of skeletal information of the user, a reference length that serves as a reference for controlling interactive content according to a movement of the user, detects a comparison length on the basis of the skeletal information, and controls the interactive content according to the result of comparing the reference length and the comparison length. Accordingly, the present invention can provide a highly interactive user interface environment.01-24-2013
20130021246INPUT APPARATUS OF DISPLAY APPARATUS, DISPLAY SYSTEM AND CONTROL METHOD THEREOF - An input apparatus of a display apparatus, a display system, and a control method thereof, are provided herein, the input apparatus including: a communication unit which communicates with the display apparatus; a sensing unit which detects angular speed and acceleration from a motion of the input apparatus; a storage unit which stores position information on a position of the input apparatus; and a controller which calculates the motion information based on the detected angular speed and the position information and transmits the calculated motion information through the communication unit if the input apparatus moves, and updates the position information in the storage unit based on the detected acceleration if the input apparatus does not move.01-24-2013
20090244005INPUT SYSTEM INCLUDING POSITION-DETECTING DEVICE - A position-detecting device detects a position pointed to by a position-pointing instrument and includes an operation panel detecting the position pointed to by the position-pointing instrument, and a manipulation-detecting unit located at the interior and/or the exterior of the operation panel, which detects a manipulation by a second instrument other than the position-pointing instrument, or a manipulation by both the position-pointing instrument and the second instrument.10-01-2009
20110169736INTERACTIVE INPUT SYSTEM AND TOOL TRAY THEREFOR - An interactive input system comprises an interactive surface and a tool tray supporting at least one tool to be used to interact with the interactive surface. The tool tray comprises processing structure for communicating with at least one imaging device and processing data received from the at least one imaging device for locating a pointer positioned in proximity with the interactive surface.07-14-2011
20080211772SUCCESSIVELY LAYERED MODULAR CONSTRUCTION FOR A PORTABLE COMPUTER SYSTEM - A modular portable computer system is described. A top modular layer has a coupled display interface and is adapted to be interconnected with other modular layers. A second modular layer, disposed beneath the top modular layer, is interconnected with the top modular layer and other modular layers and provides a power source to supply operating power to the top modular layer and to those other modular layers present. A third modular layer, disposed beneath the top modular layer, is interconnected with the top modular layer and the second modular layer and provides baseline logic electronics and communication components to the modular portable computer system. A universal interconnect for providing electronic and communicative interconnection of each modular layer is disposed at least once on each modular layer.09-04-2008
20110169737STORAGE MEDIUM HAVING INFORMATION PROCESSING PROGRAM STORED THEREIN, INFORMATION PROCESSING APPARATUS, AND INFORMATION PROCESSING SYSTEM - An information processing apparatus is capable of obtaining operation data according to a tilt of a predetermined object that can be moved by a user. The information processing apparatus calculates tilt information corresponding to the tilt of the object based on the operation data. The information processing apparatus calculates a specified position on a screen of a display device based on the operation data so that the specified position changes according to at least one of a position and the tilt of the object. Selection items are displayed on the screen of the display device. The information processing apparatus switches between sets of the selection items displayed on the screen according to an amount of tilt represented by the tilt information. Moreover, the information processing apparatus selects an item displayed at the specified position from among the selection items to perform an information process according to the selected item.07-14-2011
20090278798Active Fingertip-Mounted Object Digitizer - A finger-mounted implement including a kinesthetic sensor, at least one tactile sensor, and means for securing the kinesthetic sensor and the at least one tactile sensor to a fingertip. The tactile sensor may be a thin-film force transducer, a piezoelectric accelerometer, or a combination thereof. An artificial fingernail may be connected to the accelerometer. The kinesthetic sensor may include a magnetic transducer and may sense an X-Y-Z position and an angular orientation of a fingertip to which the kinesthetic sensor is secured. The securing means may include at least one means selected from the group consisting of adhesive tape, an elastically deformable cover, and detachable adhesive. The implement can be further connected to a computer processing system for, amongst other things, the virtual representation of sensed objects. The implement can also be used as part of a method of haptic sensing of objects.11-12-2009
20090278800METHOD OF LOCATING AN OBJECT IN 3D - Methods and devices for calculating the position of a movable device are disclosed. The device may include multiple optical detectors (ODs) and the movable device may include light sources. Optics may be above the ODs. A controller may calculate the position of the light source based on data from the ODs and properties of the optics. The device may be a game console, and the light source may be a game controller. The roles of the OD and light sources may be interchanged. The rotation of the movable device may be determined using multiple light sources and/or multiple ODs on the movable device. The movable device may calculate its position and transmit it to a console. The light sources may be modulated by time or frequency to distinguish between the light sources. There may be two or more movable devices. There may be two or more consoles.11-12-2009
20090295720METHOD FOR EXECUTING MOUSE FUNCTION OF ELECTRONIC DEVICE AND ELECTRONIC DEVICE THEREOF - A method for executing a mouse function of an electronic device, and an electronic device thereof, are provided. In the present method, an amount and a relative position of input signals are detected by a sensor module. It is then determined whether the amount and the relative position each conform to a predetermined value. If they conform, and a variation of the relative position occurs, it is determined whether the input signal conforms to a specific signal. Finally, if the variation conforms to the specific signal, a corresponding mouse function is executed according to the type of the variation. As a result, a mouse device is no longer needed for a user to perform directional operations on the electronic device, avoiding the inconvenience of carrying a separate mouse.12-03-2009
20090295722INPUT APPARATUS, CONTROL SYSTEM, HANDHELD APPARATUS, AND CALIBRATION METHOD - An input apparatus includes a casing, an output section, a processing section, a judgment section, and a calibration section. The output section includes a reference potential and outputs, as a detection signal, a fluctuation of a potential with respect to the reference potential, that corresponds to a movement of the casing. The processing section executes processing for generating a control command for controlling a screen based on an output value of the output section. The judgment section judges an operational state of the input apparatus. The calibration section executes, when it is judged that the input apparatus is moving, a calibration mode as processing for correcting the reference potential by obtaining the output value over a data obtainment period that is a sampling period of the output value and calculating a correction value of the reference potential based on the obtained output value.12-03-2009
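The calibration mode in the abstract above gathers output values over a data obtainment period and computes a correction value for the reference potential. The abstract does not give the correction formula, so the sketch below assumes a common choice, using the window mean of the detection signal (which should be zero relative to a correct reference) as the drift to fold back into the reference:

```python
def run_calibration_mode(samples, reference):
    """One pass of a calibration mode: obtain the output value over the
    data obtainment period (here, a list of sampled detection-signal
    values, each a fluctuation relative to the reference potential) and
    fold their mean back into the reference.  Using the mean as the
    correction value is an assumption; the abstract leaves it open."""
    if not samples:
        return reference                      # nothing sampled; keep as-is
    correction = sum(samples) / len(samples)  # residual drift vs. reference
    return reference + correction
```

A nonzero mean detection signal while the judgment section reports the appropriate operational state indicates the reference has drifted by that amount.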
20110199305MOUSE CONTROLLED BY MOVEMENTS OF FINGERS IN THE AIR - Provided is a new type of finger mouse capable of increasing user convenience by detecting the fine movement of a finger that moves freely in the air, and by using that fine movement information as coordinate information for a computer mouse. Through the technology of the present disclosure, the user may easily and conveniently control a mouse pointer regardless of the user's posture or location, free of the geometric and spatial limitations of conventional wireless mice.08-18-2011
20110199304Systems and Methods for Providing Enhanced Motion Detection - Provided are systems and methods for providing enhanced motion detection. One system providing enhanced motion detection includes a smart display, an interface subsystem including a human interface device (HID), and a console having a processor configured to form communication links with the smart display and the interface subsystem and to provide motion detection feedback, using the smart display, to a user of the HID, where the HID is configured to sense motion of the HID and utilize a predictive model to characterize the motion of the HID. One interface subsystem includes a camera to sense motion of a user of the HID. One processor is configured to negotiate a reduced response latency with the smart display.08-18-2011
20090284469VIDEO BASED APPARATUS AND METHOD FOR CONTROLLING THE CURSOR - A video-based apparatus and method for controlling the cursor are provided. A video camera is used to acquire an image of a user's hand, and the image is then analyzed and processed to move the cursor and to replace the functions of the left and right mouse buttons. The user may use a “V”-shaped hand gesture to replace the mouse: the index finger corresponds to the left mouse button, the middle finger corresponds to the right mouse button, and the valley point of the “V”-shaped gesture corresponds to the position of the cursor.11-19-2009
20100134415IMAGE PROCESSING APPARATUS, IMAGE DISPLAYING METHOD, AND IMAGE DISPLAYING PROGRAM - An image processing apparatus includes an instructed-position detecting unit configured to receive an instruction operation by a user on a display screen of a display device and detect and output a position where the instruction operation is performed; a storing unit configured to store multiple image data items each including information corresponding to a search key; a search-key display controlling unit configured to cause at least one search key to be selectively displayed on the display screen of the display device; a searching unit configured to, if the search key displayed on the display screen is instructed by the search-key display controlling unit through the instructed-position detecting unit, search the storing unit for the image data corresponding to the search key to extract the image data; and a display controlling unit configured to collectively display images corresponding to the image data in a certain part on the display screen.06-03-2010
20090262074CONTROLLING AND ACCESSING CONTENT USING MOTION PROCESSING ON MOBILE DEVICES - Various embodiments provide systems and methods capable of facilitating interaction with handheld electronics devices based on sensing rotational rate around at least three axes and linear acceleration along at least three axes. In one aspect, a handheld electronic device includes a subsystem providing display capability, a set of motion sensors sensing rotational rate around at least three axes and linear acceleration along at least three axes, and a subsystem which, based on motion data derived from at least one of the motion sensors, is capable of facilitating interaction with the device.10-22-2009
20090262073TOUCH SENSITIVE REMOTE CONTROL SYSTEM THAT DETECTS HAND SIZE CHARACTERISTICS OF USER AND ADAPTS MAPPING TO SCREEN DISPLAY - Sensors around the periphery of the remote control unit detect contact with the user's hand. A trained model-based pattern classification system analyzes the periphery sensor data and makes a probabilistic prediction of the user's hand size. The hand size is then used to control a mapping system that defines how gestures by the user's thumb upon a touchpad of the remote control unit are mapped to the control region upon a separate display screen.10-22-2009
20090295723INTERACTIVE DISPLAY SYSTEM - An interactive display system comprises a white board which communicates with a PC. A projector receives signals from the PC, which are translated into a corresponding projected image on the white board. The image projected onto the white board is the same as that shown on the computer screen. An electronic pen, whose position can be detected electronically by means of a plurality of wires embedded beneath the surface of the white board using methods already known in the art, can function in the same way as a computer mouse. The image projected onto the white board may also be manipulated by means of a remote control device, which uses infrared communication to transmit signals to a transponder built into the white board.12-03-2009
20090295721INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, AND CONTROL METHOD - An input apparatus is provided that includes a main body operated by a user in a first operation form, in which a pointer is used to point to a predetermined position on a screen, and in a second operation form different from the first. The input apparatus includes an operation form detection section to detect whether the operation form of the main body is the first or the second operation form, and a movement detection section to detect a movement of the main body. The input apparatus further includes an operational section to switch between a first operational mode corresponding to the first operation form and a second operational mode corresponding to the second operation form according to the operation form of the main body, and to calculate a movement value for the pointer on the screen corresponding to the detected movement of the main body.12-03-2009
20090295718MULTIPLE INPUT OPTICAL NAVIGATION SYSTEM - An optical navigation system with optical imaging of multiple inputs using a single navigation sensor. The optical navigation system includes a tactile interface device, an image sensor, and a processor. The tactile interface device facilitates a navigation input. The image sensor intermittently generates images of a surface of the tactile interface device and images of a contact navigation surface. The image sensor also generates the images of the surface of the tactile interface device exclusive of the images of the contact navigation surface. The processor is coupled to the image sensor. The processor generates a first navigation signal based on the images of the tactile interface device and generates a second navigation signal based on the images of the contact navigation surface.12-03-2009
20110169735Apparatus and Method for Interacting with Handheld Carrier Hosting Media Content - Improved techniques for interacting with one or more handheld carriers hosting media content are disclosed. The handheld carrier hosting media content may be sensed, and at least a portion of the media content may be integrated into operation of a media activity provided by a computing device, upon recognizing the media activity and the media content. The media activity provided by the computing device may involve creating or editing an electronic document. The integration of the media content into operation of the media activity may involve insertion or importation of the media content into the electronic document.07-14-2011
20130100018ACCELERATION-BASED INTERACTION FOR MULTI-POINTER INDIRECT INPUT DEVICES - An indirect interaction input device, such as but not limited to a touch sensor, can provide multiple points of input. These multiple points are in turn mapped to multiple positions on an output device such as a display. The multiple points of input, however, make the application of pointer ballistics and resolution differences between the input sensor and target display more difficult to manage. Thus, a characteristic of the set of points is identified and used to adjust the mapping of each of the points. For example, one way to solve this problem is to identify the input point with the least displacement from a prior frame, whether from its prior point or from a reference point. This displacement is used to adjust the mapping of the set of input points from the input device to their corresponding display coordinates.04-25-2013
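The abstract above names a concrete heuristic: find the input point with the least displacement since the prior frame and use that displacement to adjust the mapping of the whole set. A short sketch of that idea (function names and the flat pointer-ballistics gain are illustrative assumptions, not the patented algorithm):

```python
def steadiest_displacement(prev_frame, curr_frame):
    """Return the motion vector of the contact that moved least since
    the prior frame (frames are parallel lists of (x, y) contacts)."""
    return min(
        ((c[0] - p[0], c[1] - p[1]) for p, c in zip(prev_frame, curr_frame)),
        key=lambda d: d[0] * d[0] + d[1] * d[1],
    )

def move_mapped_points(screen_points, prev_frame, curr_frame, gain=1.0):
    """Apply the steadiest contact's displacement, scaled by a pointer-
    ballistics gain, to every mapped display coordinate so the set moves
    coherently."""
    dx, dy = steadiest_displacement(prev_frame, curr_frame)
    return [(x + gain * dx, y + gain * dy) for x, y in screen_points]
```

Anchoring on the least-moved contact keeps an intentional drag from being exaggerated by a second finger that is merely settling onto the sensor.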
20120293412METHOD AND SYSTEM FOR TRACKING OF A SUBJECT - Method, computer program and system for tracking movement of a subject. The method includes receiving data from a distributed network of camera sensors employing one or more emitted light sources associated with one or more of the one or more camera sensors to generate a volumetric three-dimensional representation of the subject, identifying a plurality of clusters within the volumetric three-dimensional representation that correspond to motion features indicative of movement of the motion features of the subject, presenting one or more objects on one or more three dimensional display screens, and using the plurality of fixed position sensors to track motion of the motion features of the subject and track manipulation of the motion features of the volumetric three-dimensional representation to determine interaction of one or more of the motion features of the subject and one or more of the one or more objects on the three dimensional display.11-22-2012
20100201621MOVING OBJECT DETECTING APPARATUS, MOVING OBJECT DETECTING METHOD, POINTING DEVICE, AND STORAGE MEDIUM - Even when a user is gazing at one point intentionally while the user's eyeball is actually moving slightly, the slight movement is not reproduced as-is in the position of a cursor; instead, a determination is made that the user is gazing at one point intentionally, that is, that the eyeball is stopping. Thus, when it is determined that the eyeball is stopping, the cursor is displayed still even if the gazing point moves slightly. Furthermore, when it is determined that the cursor is stopped, selection of an object, such as an icon displayed at the position where the cursor is displayed, is identified.08-12-2010
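A fixation judgment like the one described above can be sketched as a dispersion test over a short window of gaze samples: if every sample stays near the window's centroid, the eyeball is judged to be stopping and the cursor is held still. The radius threshold and window shape below are illustrative assumptions, not values from the patent:

```python
def next_cursor(cursor, gaze_window, radius=5.0):
    """Decide the next cursor position from recent gaze samples.
    If every sample in the window lies within `radius` of the window's
    centroid, the slight movement is treated as unintentional (the
    eyeball is 'stopping') and the cursor stays put; otherwise the
    cursor follows the newest gaze point."""
    n = len(gaze_window)
    cx = sum(x for x, _ in gaze_window) / n
    cy = sum(y for _, y in gaze_window) / n
    fixating = all((x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2
                   for x, y in gaze_window)
    return cursor if fixating else gaze_window[-1]
```

In practice the radius would be tuned to the eye tracker's noise floor so that micro-saccades fall inside it while deliberate gaze shifts do not.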
20100201620FIREARM TRAINING SYSTEM - The present invention provides a firearm training system for actual and virtual moving targets comprising a firearm, a trigger-initiated image-capturing device mounted on a firearm, a processor, and a display. The system allows a user to visualize the accuracy of a shot taken when the trigger was pulled or the gun fired by showing the predicted position of the firearm's projectile in relation to the moving targets.08-12-2010
20100201619INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, CONTROL METHOD, AND HANDHELD APPARATUS - An input apparatus, a control apparatus, a control system, a control method, and a handheld apparatus with which a user can easily control a movement and stop of a pointer displayed on a screen are provided. An input apparatus includes a sensor unit for detecting a movement of a casing and a button. An MPU outputs a determination code when a press of the button is released within a first time period. On the other hand, when the button is pressed and held for a time period equal to or longer than a first time period, a movement command is output from after an elapse of the first time period. Accordingly, the button is provided with a function corresponding to a determination input button and a function corresponding to an input button for controlling a movement and stop of a pointer, for example. As a result, a user can easily control the movement and stop of the pointer without mixing up an input operation for moving and stopping the pointer with other input operations.08-12-2010
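The single-button scheme in the abstract above is a small timing state machine: a release within the first time period emits the determination code, while a longer hold switches the button into movement control. A minimal sketch (the 0.3 s hold time and the return-code names are assumptions for illustration):

```python
class DualFunctionButton:
    """One physical button with two roles: released within `hold_time`
    seconds -> emit a determination (select) code; held at least
    `hold_time` -> movement commands flow from then until release."""

    def __init__(self, hold_time=0.3):
        self.hold_time = hold_time
        self.pressed_at = None

    def press(self, now):
        self.pressed_at = now

    def movement_enabled(self, now):
        """True once the press has lasted the full first time period."""
        return (self.pressed_at is not None
                and now - self.pressed_at >= self.hold_time)

    def release(self, now):
        held = now - self.pressed_at
        self.pressed_at = None
        return "determination" if held < self.hold_time else "end_movement"
```

The caller polls `movement_enabled` each frame and forwards pointer movement only while it is true, so a quick click never jiggles the pointer.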
20100103099POINTING DEVICE USING CAMERA AND OUTPUTTING MARK - A pointing device, such as a mouse or joystick, comprises a camera for capturing the display screen and image processing means for recognizing and tracking the pointing cursor icon or mark from the captured image and producing the pointing signal. The pointing device of the present invention can be used with any type of display, without any additional tracking means such as an ultrasonic sensor, infrared sensor, or touch sensor. The pointing device of the present invention includes a mark-outputting portion, a camera portion for capturing the mark-outputting portion, and an image processing portion for recognizing the mark-outputting portion from the captured image and producing the pointing signal.04-29-2010
20110199302CAPTURING SCREEN OBJECTS USING A COLLISION VOLUME - A system is disclosed for providing a user a margin of error in capturing moving screen objects, while creating the illusion that the user is in full control of the onscreen activity. The system may create one or more “collision volumes” attached to and centered around one or more capture objects that may be used to capture a moving onscreen target object. Depending on the vector velocity of the moving target object, the distance between the capture object and target object, and/or the intensity of the collision volume, the course of the target object may be altered to be drawn to and captured by the capture object.08-18-2011
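The collision-volume idea above can be sketched as a per-frame attraction: while the target is inside a sphere around the capture object, its velocity is bent toward the capture object, more strongly near the centre. The linear falloff law and the constants below are my own illustrative assumptions; the patent abstract only says the course "may be altered":

```python
def step_target(pos, vel, capture, radius, intensity, dt=1.0 / 60.0):
    """Advance a moving target one frame.  Outside the capture object's
    collision volume it flies straight; inside, its velocity gains a
    pull toward the capture object that grows as it nears the centre."""
    dx, dy = capture[0] - pos[0], capture[1] - pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if 0.0 < dist < radius:
        pull = intensity * (1.0 - dist / radius)   # 0 at edge, max at centre
        vel = (vel[0] + pull * dx / dist,
               vel[1] + pull * dy / dist)
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt), vel
```

Because the pull only appears near the capture object, a slightly mistimed grab still succeeds while distant misses behave honestly, preserving the illusion of full control.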
20100103105APPARATUS AND METHOD FOR EXECUTING A MENU IN A WIRELESS TERMINAL - A menu execution apparatus and method for conveniently providing a menu in a wireless terminal are provided. The apparatus includes a pointer unit for indicating a specific object in a menu execution recognition mode. A camera photographs the object indicated by the pointer unit in the menu execution recognition mode. A controller controls an operation for recognizing the object indicated by the pointer unit and photographed by the camera and displaying a menu for controlling the recognized object.04-29-2010
20100103104APPARATUS FOR USER INTERFACE BASED ON WEARABLE COMPUTING ENVIRONMENT AND METHOD THEREOF - Provided is a user-interface apparatus based on a wearable computing environment, which includes: a signal measurement unit including a plurality of image measurement units, each having an image sensor that receives optical signals generated from a position indicator worn on the user's fingers or near the user's wrist, and that measures images of the user's foreground; and a signal processor that analyzes each image measured by the signal measurement unit to measure three-dimensional coordinates, recognizes a motion pattern of the user's hand in the three-dimensional coordinates from the optical signals received by the signal measurement unit, and outputs the corresponding instructions.04-29-2010
20100103103Method And Device for Input Of Information Using Visible Touch Sensors - A device and method for manual input of information into computing device using a camera and visible touch sensors. An image of a virtual input device is displayed on a screen, and the positions of the visible touch sensors, recorded by the video camera, are overlaid on the image of the virtual input device, thus allowing the user to see the placement of the touch sensors relative to the keys or buttons on the virtual input device. The touch sensors change their appearance upon contact with a surface, and the camera records their position at the moment of change. This way information about the position of intended touch is recorded. Touch sensors can be binary (ON-OFF) or may have a graded response reflecting the extent of displacement or pressure of the touch sensor relative to the surface of contact.04-29-2010
20100103098User Interface Elements Positioned For Display - User interface elements positioned for display is described. In various embodiment(s), sensor input can be received from one or more sensors that are integrated with a portable device. A device hold position that corresponds to where the portable device is grasped by a user can be determined based at least in part on the sensor input. A display of user interface element(s) can then be initiated for display on an integrated display of the portable device based on the device hold position that corresponds to where the portable device is grasped.04-29-2010
20100103102DISPLAYING METHOD AND DISPLAY CONTROL MODULE - A displaying method for a portable device under a standby condition is provided. The portable device includes a display unit displaying a first standby frame. The method includes steps of sensing a physical motion of the portable device. Then, the physical motion is identified as an operation command. Finally, according to the operation command, a corresponding operation is executed to change the first standby frame displayed on the display unit.04-29-2010
20100103101SPATIALLY-AWARE PROJECTION PEN INTERFACE - One embodiment of the present invention sets forth a technique for providing an end user with a digital pen embedded with a spatially-aware miniature projector for use in a design environment. Paper documents are augmented to allow a user to access additional information and computational tools through projected interfaces. Virtual ink may be managed in single and multi-user environments to enhance collaboration and data management. The spatially-aware projector pen provides end-users with dynamic visual feedback and improved interaction capabilities.04-29-2010
20100103100INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, CONTROL METHOD, AND HANDHELD APPARATUS - An input apparatus, a control apparatus, a control system, and a control method are provided that are capable of correcting an output signal when a hand movement is input to the input apparatus, without the user perceiving a phase delay. An input apparatus includes a velocity calculation section, a filter, a control section, and a memory. The velocity calculation section calculates velocity values of a casing in the X′- and Y′-axis directions based on physical amounts output from a sensor unit, such as acceleration values in the X′- and Y′-axis directions output from an acceleration sensor unit. The filter attenuates, by predetermined scale factors, velocity values of signals in a predetermined frequency range out of the velocity values calculated by the velocity calculation section. Since the filter dynamically attenuates the velocity values of a shake frequency range in accordance with the velocity values, a precise pointing operation with the pointer becomes possible.04-29-2010
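A velocity-dependent attenuation like the one described above can be approximated with a first-order filter whose coefficient depends on the current speed: small, rapid velocity values (the hand-shake range) are smoothed heavily, while large deliberate motions pass nearly untouched. This stands in for the patent's frequency-range filter only as a sketch; all constants and the two-coefficient scheme are illustrative assumptions:

```python
class ShakeSuppressor:
    """First-order low-pass with a speed-dependent coefficient: tremor-
    scale velocities are attenuated strongly, deliberate motion lightly."""

    def __init__(self, slow_alpha=0.1, fast_alpha=0.9, speed_threshold=5.0):
        self.slow_alpha = slow_alpha            # heavy smoothing for tremor
        self.fast_alpha = fast_alpha            # light smoothing for real motion
        self.speed_threshold = speed_threshold  # split between the two regimes
        self.state = 0.0

    def filter(self, velocity):
        alpha = (self.fast_alpha if abs(velocity) >= self.speed_threshold
                 else self.slow_alpha)
        self.state += alpha * (velocity - self.state)
        return self.state
```

Switching the coefficient by speed, rather than using one fixed cutoff, is what keeps fast strokes from feeling laggy while still damping jitter when the hand hovers.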
20100060575COMPUTER READABLE RECORDING MEDIUM RECORDING IMAGE PROCESSING PROGRAM AND IMAGE PROCESSING APPARATUS - Displayed region size data indicating a size of a screen of a display device, or a size of a region in which an image of a virtual space is displayed on the screen, is obtained. Distance data indicating a distance between a user and the display device is obtained. A position and an angle of view of the virtual camera in the virtual space are set based on the displayed region size data and the distance data.03-11-2010
20100060576Control System for Navigating a Principal Dimension of a Data Space - Systems and methods are described for navigating through a data space. The navigating comprises detecting a gesture of a body from gesture data received via a detector. The gesture data is absolute three-space location data of an instantaneous state of the body at a point in time and physical space. The detecting comprises identifying the gesture using the gesture data. The navigating comprises translating the gesture to a gesture signal, and navigating through the data space in response to the gesture signal. The data space is a data-representational space comprising a dataset represented in the physical space.03-11-2010
20100060574OPERATING APPARATUS FOR HAND-HELD ELECTRONIC APPARATUS AND METHOD THEREOF - An operating apparatus for a hand-held electronic apparatus includes a motion sensor, an analog to digital converting module and a processing unit. The motion sensor is utilized for sensing a motion track of the hand-held electronic apparatus to generate an analog sensing signal. The analog to digital converting module is coupled to the motion sensor, and is utilized for converting the analog sensing signal into a digital sensing signal. The processing unit is coupled to the analog to digital converting module, and is utilized for executing a multimedia motion control software to determine whether the motion track of the hand-held electronic apparatus corresponds to a predetermined track according to the digital sensing signal. Furthermore, the processing unit executes at least one predetermined function corresponding to the predetermined track from a plurality of predetermined functions when determining that the motion track corresponds to the predetermined track.03-11-2010
20100060573METHODS AND APPARATUS FOR INCREMENTAL PREDICTION OF INPUT DEVICE MOTION - Methods and apparatus for incremental prediction of input device motion. In one embodiment, the input device comprises one or more sensors adapted to output motional data of the input device as measured at a certain period. A prediction of input device motion is generated based upon the last prediction and a weighted error in estimate determined by the sensory output. According to one embodiment, the weight is calculated as a Kalman gain. In one embodiment, once the prediction has been generated, it is provided to a display update algorithm adapted to orient a navigational object upon an associated display screen.03-11-2010
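The incremental prediction scheme described in this abstract — the new estimate equals the last prediction plus a gain-weighted "error in estimate," with the weight computed as a Kalman gain — is the standard one-dimensional Kalman recursion. A minimal sketch under that reading (the noise variances and sample readings below are illustrative, not values from the patent):

```python
class IncrementalPredictor:
    """1-D Kalman-style predictor: next estimate = last prediction
    plus a Kalman-gain-weighted error between sensor reading and prediction."""

    def __init__(self, process_var=1e-3, sensor_var=1e-1):
        self.q = process_var   # process noise variance (assumed)
        self.r = sensor_var    # sensor noise variance (assumed)
        self.p = 1.0           # error covariance estimate
        self.x = 0.0           # current motion estimate

    def update(self, measurement):
        # Predict step: the estimate carries over, uncertainty grows.
        self.p += self.q
        # The weight on the error in estimate is the Kalman gain.
        k = self.p / (self.p + self.r)
        # Correct step: incremental update from the last prediction.
        self.x += k * (measurement - self.x)
        self.p *= (1.0 - k)
        return self.x

pred = IncrementalPredictor()
readings = [0.0, 1.0, 2.1, 2.9, 4.2]   # raw sensor motion samples
estimates = [pred.update(z) for z in readings]
```

Each `update` output would then be handed to the display-update step that orients the navigational object on screen.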
20110267269HETEROGENEOUS IMAGE SENSOR SYNCHRONIZATION - A computer implemented method for synchronizing information from a scene using two heterogeneous sensing devices. Scene capture information is provided by a first sensor and a second sensor. The information comprises video streams including successive frames provided at different frequencies. Each frame is separated by a vertical blanking interval. A video output comprising a stream of successive frames each separated by a vertical blanking interval is rendered based on information in the scene. The method determines whether an adjustment of the first and second video stream relative to the video output stream is required by reference to the video output stream. A correction is then generated to at least one of said vertical blanking intervals.11-03-2011
20110267268BACKLIGHTING FOR OPTICAL FINGER NAVIGATION - An optical finger navigation (OFN) device includes an OFN sensor module, a light source, and a vertical light guide. The OFN sensor module is coupled to a circuit substrate. The OFN sensor module generates a navigation signal in response to a movement detected at a navigation surface based on light reflected from a user's finger. The light source is also coupled to the circuit substrate. The light source generates light (which is separate from the light generated for the OFN sensor module). The vertical light guide is disposed to circumscribe a perimeter of the OFN sensor module. The vertical light guide receives the light from the light source and guides the light toward a light emission surface at a perimeter surface area circumscribing the navigation surface.11-03-2011
20080238871Display apparatus and method for operating a display apparatus - A display apparatus includes a plurality of scanning lines and data lines and, at each intersection of said data lines and scanning lines, an electrochromic pixel element and a sensor element connected to the pixel element. The sensor element is sensitive to a user's graphical input, which changes a charge state of the respective pixel element. The display apparatus also includes a control means, which is configured to allow the pixel element to be selectively set to a first charge state corresponding to a first display state, or to a second charge state corresponding to a second display state. The second display state reflects the user's graphical input. The second charge state of the pixels may be brought about by exposing the sensor elements to light from a light pen or to a magnetic field from a magnetic pen, the pen being drawn across the display screen of the apparatus. Due to the direct action of the sensor elements in modulating the charge on the pixel elements, it is possible to display an image drawn on the screen instantly, without the need to scan the display to determine which pixels have changed their initial charge. This reduces the power consumption of the apparatus.10-02-2008
20090167683INFORMATION PROCESSING APPARATUS - According to one embodiment, an optical position detection IC outputs movement amount information in accordance with movement of an object on a detection area including a light-transmissive area which is disposed on a top surface of a housing. A control module controls a movement direction and a movement amount of a cursor, which is displayed on a display screen of a display device, based on an attitude signal which indicates in which of two directions the optical position detection IC is disposed, and the movement amount information which is output from the optical position detection IC.07-02-2009
20080278446MULTI-SENSOR USER INPUT DEVICE FOR INTERACTIVE PROGRAMS ON A MACHINE - A user input device is provided for a machine having an interactive program and signals for a humanly-perceptible output for said program, to allow a user in an inflatable physical structure to interact with or perceive the output. The user input device includes one or more physical structures forming an enclosure adapted to accommodate and substantially laterally surround the body of at least one human user of the machine; a plurality of input elements which are disposed on the physical structure in three dimensions, wherein at least one of the input elements does not lie on the same horizontal or vertical plane on which others of said input elements lie, relative to the possible locations of the user, so as to be accessible to the user by unseated body movement of the user; and an interface between the input elements and the machine for providing inputs from the input elements to the interactive program.11-13-2008
20080278445FREE-SPACE MULTI-DIMENSIONAL ABSOLUTE POINTER WITH IMPROVED PERFORMANCE - According to one embodiment, a system includes a handheld device having a pixelated sensor, an optical filter for passing a predetermined frequency band of radiation to the sensor and a transmitter, an electronic equipment having a display, and at least two spaced-apart markers, where each of which are positioned proximate to the display. The markers provide radiation at the frequency band passed by the optical filter. The handheld device includes a processor coupled to receive image data of the markers from the sensor for computing coordinate data from the image data. The coordinate data requires less data than the image data. The processor is coupled to the transmitter to transmit the coordinate data to the electronic equipment. Other methods and apparatuses are also described.11-13-2008
20120293411Methods and apparatus for actuated 3D surface with gestural interactivity - In exemplary implementations of this invention, an array of linear actuators can be used to form a segmented surface. The surface can resemble a low relief sculpture. A user may control the shape of the surface by direct touch manipulation or by making freehand gestures at a distance from the surface. For example, the freehand gestures may comprise input instructions for selecting, translating, and rotating the shape of an object. A projector may augment the rendered shapes by projecting graphics on the surface.11-22-2012
20080211771Approach for Merging Scaled Input of Movable Objects to Control Presentation of Aspects of a Shared Virtual Environment - A system for controlling operation of a computer. The system includes, a sensing apparatus configured to obtain positional data of a sensed object controllable by a first user, such positional data varying in response to movement of the sensed object, and engine software operatively coupled with the sensing apparatus and configured to produce control commands based on the positional data, the control commands being operable to control, in a multi-user software application executable on the computer, presentation of a virtual representation of the sensed object in a virtual environment shared by the first user and a second user, the virtual representation of the sensed object being perceivable by the second user in a rendered scene of the virtual environment, where the engine software is configured so that the movement of the sensed object produces control commands which cause corresponding scaled movement of the virtual representation of the sensed object in the rendered scene that is perceivable by the second user.09-04-2008
20080316171Low-Cost Haptic Mouse Implementations - Low-cost haptic interface device implementations for interfacing a user with a host computer. A haptic feedback device, such as a mouse or other device, includes a housing physically contacted by a user, and an actuator for providing motion that causes haptic sensations on the device housing and/or on a movable portion of the housing. The device may include a sensor for detecting x-y planar motion of the housing. Embodiments include actuators with eccentric rotating masses, buttons having motion influenced by various actuator forces, and housing portions moved by actuators to generate haptic sensations to a user contacting the driven surfaces.12-25-2008
20080278447THREE-DIMENSIONAL MOUSE APPARATUS - A 3D mouse apparatus is disclosed in the present invention. The apparatus is mainly utilized to calculate, recognize, analyze and output 3D gestures, which include physical quantities such as the 3D position coordinates, displacement, velocity and acceleration of a point light source, together with the moving behavior of a human hand, so as to achieve the purpose of a 3D mouse apparatus.11-13-2008
20120293410Flexible Input Device Worn on a Finger - Methods and apparatus are directed to an input device including a wearable ring shaped component that is supported on a finger and located between a finger tip and a knuckle of the wearer. The wearable component comprises a touch pad device that is located on an outward surface of the wearable ring shaped component. The touch pad device is contacted to provide an input command. The wearable component includes a transmitter to transmit the input command.11-22-2012
20080273011Interactive game method and system with sports injury protection - The present invention discloses an interactive game method with sports injury protection, comprising: providing a remote pointing device for a user to swing; and triggering a safety mechanism in one or more of the following conditions: (1) when a user swings the remote pointing device drastically; (2) when a count of swings exceeds a first threshold; and (3) when a count of swings in a predetermined time period exceeds a second threshold.11-06-2008
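The three trigger conditions enumerated in this abstract map naturally onto a small monitor object: one check on swing intensity, one on the total swing count, and one on the count within a sliding time window. A hedged sketch — every threshold and the window length below are hypothetical, not values from the patent:

```python
import time

class SwingSafetyMonitor:
    """Triggers a safety mechanism under any of the abstract's three
    conditions. All limits are illustrative assumptions."""

    def __init__(self, intensity_limit=9.0, total_limit=100,
                 rate_limit=30, window_s=60.0):
        self.intensity_limit = intensity_limit  # (1) "drastic" swing intensity
        self.total_limit = total_limit          # (2) total swing count
        self.rate_limit = rate_limit            # (3) swings per time window
        self.window_s = window_s
        self.total = 0
        self.recent = []                        # timestamps inside the window

    def record_swing(self, intensity, now=None):
        """Record one swing; return a trigger reason, or None if safe."""
        now = time.monotonic() if now is None else now
        self.total += 1
        # Keep only timestamps still inside the sliding window.
        self.recent = [t for t in self.recent if now - t <= self.window_s]
        self.recent.append(now)
        if intensity > self.intensity_limit:
            return "drastic swing"
        if self.total > self.total_limit:
            return "total swing count exceeded"
        if len(self.recent) > self.rate_limit:
            return "swing rate exceeded"
        return None
```

The game loop would feed each detected swing of the remote pointing device into `record_swing` and pause or warn when a non-`None` reason comes back.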
20110006985DISPLAY SURFACE AND CONTROL DEVICE COMBINED THEREWITH - The invention relates to a display surface and a control device combined therewith for a data processing system, wherein a display surface is equipped with photosensitive elements. A photosensitive element is configured as a planar position detector on the basis of a layer made of an organic photoactive material, which on both sides is connected by a planar electrode, wherein at least one electrode inside the circuit thereof has relatively high resistance, wherein the current through an electrode having poor conductivity is measured at several connecting points disposed at a distance from each other and from this conclusions may be drawn about the position of a local conductive connection through the photosensitive layer caused by light absorption. A luminous indicator produces a light spot on the display surface, the spot is detectable by the position detectors and reported to a data processing unit.01-13-2011
20110006984Optical Helmet-Position Detection Device Having a Large Dynamic Range - The general field of the invention is that of optical devices for detecting the position/orientation of a helmet. The device according to the invention comprises an optional stationary light source, a stationary camera associated with an image processing system, and a helmet. The helmet has a scattering coating and includes at least one set of markers, each marker comprising at least a first optical element having a very low reflection coefficient, a very low scattering coefficient and a very high absorption coefficient in the visible range and in that of the light source. In one embodiment, each marker may also include a first optical element having a very high retroreflection coefficient and a very low scattering coefficient in the visible range. The marker may also include a second optical element having a high scattering or phosphorescence coefficient in the emission range of the light source.01-13-2011
20100141580PIEZO-ELECTRIC SENSING UNIT AND DATA INPUT DEVICE USING PIEZO-ELECTRIC SENSING - Disclosed herein is a piezoelectric sensing unit and a data input device using piezoelectric sensing. The data input device of the present invention includes a base, an input unit, first piezoelectric sensing parts, and a control unit. The input unit performs a first directional input in such a way that the input unit moves to one of first direction indicating locations arranged around a base location in radial directions at positions spaced apart from each other within a pre-determined input radius defined on the base. The first piezoelectric sensing parts are provided on respective moving paths of the input unit, so that when the first directional input is performed, the corresponding first piezoelectric sensing part is pressed by the input unit, thus generating a first sensing signal proportional to a pressing force. When the first sensing signal is greater than a preset value, the control unit extracts data, assigned to the corresponding first direction indicating location at which movement of the input unit is sensed, from a memory unit and inputs the data.06-10-2010
20090051653TOY DEVICES AND METHODS FOR PROVIDING AN INTERACTIVE PLAY EXPERIENCE - The invention provides a unique interactive play experience carried out utilizing a toy “wand” and/or other actuation/tracking device. In one embodiment the wand incorporates a wireless transmitter and motion-sensitive circuitry adapted to actuate the transmitter in response to particular learned wand motions. The wand allows play participants to electronically and “magically” interact with their surrounding play environment simply by pointing, touching and/or using their wands in a particular manner to achieve desired goals or produce desired effects. Various wireless receivers or actuators are distributed throughout the play facility to support such wireless interaction and to facilitate full immersion in a fantasy experience in which participants can enjoy the realistic illusion of practicing, performing and mastering “real” magic.02-26-2009
20130215027Evaluating an Input Relative to a Display - Disclosed embodiments relate to evaluating an input relative to a display. A processor may receive information from an optical sensor and a depth sensor. The depth sensor may sense the distance of an input from the display. The processor may evaluate an input to the display based on information from the optical sensor and the depth sensor.08-22-2013
20130120254Two-Stage Swipe Gesture Recognition - Systems, methods and computer program products for facilitating the recognition of user air swipe gestures are disclosed. Such systems, methods and computer program products provide a two-stage gesture recognition approach that combines desirable aspects of object manipulation gestures and symbolic gestures in order to create an interaction that is both reliable and intuitive for users of a computing system. In a first position-based stage, the user moves the cursor into a swipe activation zone. Second, in a motion-based stage, the user swipes their hand from the activation zone past a swipe gate within a certain amount of time to complete the interaction. GUI feedback is provided following the first stage to let the user know that the swipe interaction is available, and after the second stage to let the user know that the swipe is completed.05-16-2013
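The two-stage interaction described here can be sketched as a tiny state machine: stage one arms the recognizer when the hand cursor enters the activation zone, and stage two fires only if the hand crosses the swipe gate within a time budget. The zone extent, gate position, and timeout below are illustrative assumptions, not values from the patent:

```python
class TwoStageSwipeRecognizer:
    """Stage 1 (position-based): cursor enters the activation zone.
    Stage 2 (motion-based): hand crosses the gate within the timeout."""

    def __init__(self, zone_x=(0.8, 1.0), gate_x=0.5, timeout=0.6):
        self.zone_x = zone_x    # normalized x-range of the activation zone
        self.gate_x = gate_x    # gate the hand must cross, moving left
        self.timeout = timeout  # seconds allowed to complete stage 2
        self.armed_at = None    # timestamp when stage 1 completed

    def update(self, x, t):
        """Feed normalized hand x and timestamp; True on a completed swipe."""
        if self.armed_at is None:
            if self.zone_x[0] <= x <= self.zone_x[1]:
                self.armed_at = t  # stage 1 done: GUI feedback would go here
            return False
        if t - self.armed_at > self.timeout:
            self.armed_at = None   # too slow: disarm to avoid false positives
            return False
        if x < self.gate_x:        # stage 2: gate crossed in time
            self.armed_at = None
            return True
        return False
```

Requiring both stages is what makes the interaction reliable: casual hand motion rarely enters the zone and then crosses the gate quickly enough to fire by accident.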
20100141579METHOD AND APPARATUS FOR CONTROLLING A COMPUTING SYSTEM - A handheld computing device is introduced comprising a motion detection sensor(s) and a motion control agent. The motion detection sensor(s) detect motion of the computing device in one or more of six (6) fields of motion and generate an indication of such motion. The motion control agent, responsive to the indications of motion received from the motion sensors, generate control signals to modify, one or more of the operating state and/or the displayed content of the computing device based, at least in part, on the received indications.06-10-2010
20090179858APPARATUS AND METHOD GENERATING INTERACTIVE SIGNAL FOR A MOVING ARTICLE - Apparatus and method generate an interactive signal for a moving article such as an airplane model. The airplane model is provided with a human-sensible interactive signal source, and the moving status of the airplane model, such as its velocity, is detected to generate a movement parameter. The movement parameter is operated on with a frequency-dependent conversion function to obtain first interactive data. Second interactive data are generated when a trace of the moving article matches a default pattern. Third interactive data are generated when the velocity along at least one dimension exceeds a threshold value. The interactive signal source, such as a loudspeaker or lamps, is selectively driven by one of the interactive data to generate a movement-dependent audiovisual effect. The apparatus and method thus provide an enhanced amusement effect for the user.07-16-2009
20090160768Enhanced Presentation Capabilities Using a Pointer Implement - Providing enhanced presentation capabilities using a pointer implement. In an embodiment, a user operates a key on a pointer implement to cause the pointer implement to capture the display image on a screen and send the captured image frame to a digital processing system. The digital processing system examines the image frame to determine the location of a beam spot caused by the pointer implement, which can be used as a basis for several user features. For example, a user may cause the digital processing system to draw a line on the screen or use the pointer implement as a mouse as well.06-25-2009
20120068926DISPLAY DEVICE WITH REVERSIBLE DISPLAY AND DRIVING METHOD THEREOF - A display device includes a control circuit, a data driving circuit, a gate driving circuit, and a display panel comprising a display region. The control circuit determines a located orientation of the display panel, sets a display-mode state parameter according to the located orientation of the display panel, and outputs control signals to the data driving circuit and the gate driving circuit according to the display-mode state parameter. The data driving circuit outputs data signals of an image along a first shift direction according to corresponding control signals to the display region, and the gate driving circuit outputs gate signals along a second shift direction according to corresponding control signals to the display region.03-22-2012
20120068925SYSTEM AND METHOD FOR GESTURE BASED CONTROL - Methods and apparatus are provided for gesture based control of a device. In one embodiment, a method includes detecting a first position sensor signal, the first position sensor signal detected by a first sensor, and detecting a second position sensor signal, the second position sensor signal detected by a second sensor, wherein the second position sensor signal is based on position of the second sensor relative to the first sensor. The method may further include generating a control signal for a device based on the first and second position sensor signals, wherein the first and second position sensor signals are generated based on user positioning of the first and second sensors. The method may further include transmitting the control signal to the device.03-22-2012
20090140981IMAGE SENSOR AND OPTICAL POINTING SYSTEM - Provided is an image sensor and an optical pointing system using the same. The image sensor has a plurality of pixels, each pixel including: a photocell for receiving light and generating an analog signal having a voltage corresponding to the quantity of received light; a comparator for, in response to a shutter control signal, comparing the analog signal of the photocell with the analog signal of an adjacent pixel to generate a digital signal for movement calculation, or comparing the analog signal of the photocell with a reference voltage to generate a digital signal for shutter control; and a switch for transferring the digital signal for movement calculation and the digital signal for shutter control in response to a pixel selection signal. The optical pointing system includes: a reference voltage generation unit for generating the reference voltage; the image sensor; a signal selector for receiving the digital signal for movement calculation and the digital signal for shutter control, and selecting and outputting one of them in response to a shutter control period selection signal; and a movement calculation and shutter control unit for receiving the digital signal for movement calculation to obtain an image of an object and output a movement value of the optical pointing system and the shutter control period selection signal, and for receiving the digital signal for shutter control to compare a high-level count value with a maximum count value and a minimum count value and output the shutter control signal.06-04-2009
20090140980DISPLAY SYSTEM AND METHOD FOR DETECTING POINTED POSITION - A plurality of infrared-light-emitting areas are displayed in a display screen of a liquid crystal display apparatus in a method that allows each of the infrared-light-emitting areas to be distinguished. Then, an image in a direction of a pointed position is captured by an operating device. Based on a result of distinguishing each of the infrared-light-emitting areas and a position of each of the infrared-light-emitting areas, a pointed position on the display screen is calculated. This makes it possible to properly detect the pointed position on the display screen pointed by the operating device, regardless of (i) a distance between the operating device and the display apparatus and (ii) a rotation angle of the operating device around an axis in an image capture direction of the operating device.06-04-2009
20090021480POINTER LIGHT TRACKING METHOD, PROGRAM, AND RECORDING MEDIUM THEREOF - A pointer light tracking method wherein an all-black image with white square images located at its four corners is projected on the display, the display on which the all-black image and the white square images are displayed is shot by a camera, a domain corresponding to each white square image is extracted from the obtained image data, the central coordinates (x, y) of each extracted domain are computed, and a parameter necessary for performing distortion correction, by use of projection conversion, on the coordinates expressing the position of the pointer light on the display is computed from the computed central coordinates (x, y) and the central coordinates (X, Y) of the white square images.01-22-2009
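The "projection conversion" parameter computed from the four (x, y)/(X, Y) corner pairs is, in effect, a planar homography, which four point correspondences determine. A minimal sketch using the direct linear transform in NumPy — the camera-side centroids below are made-up example values, not coordinates from the patent:

```python
import numpy as np

def homography(src, dst):
    """Direct linear transform: 3x3 homography mapping 4 src points to 4 dst points."""
    A = []
    for (x, y), (X, Y) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -X * x, -X * y, -X])
        A.append([0, 0, 0, x, y, 1, -Y * x, -Y * y, -Y])
    # The homography is the null vector of A, i.e. the last right-singular vector.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 3)

def correct_point(H, x, y):
    """Apply the projective correction to a camera-space pointer-light coordinate."""
    v = H @ np.array([x, y, 1.0])
    return v[0] / v[2], v[1] / v[2]

# Hypothetical camera-detected centroids of the four white corner squares...
camera_corners = [(102.0, 95.0), (618.0, 88.0), (640.0, 470.0), (80.0, 455.0)]
# ...mapped to the display's own corner coordinates (e.g. a 1024x768 screen).
display_corners = [(0.0, 0.0), (1024.0, 0.0), (1024.0, 768.0), (0.0, 768.0)]
H = homography(camera_corners, display_corners)
```

Once `H` is known, `correct_point(H, x, y)` converts any pointer-light centroid seen by the camera into display coordinates, compensating for the camera's oblique view of the screen.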
20090015555INPUT DEVICE, STORAGE MEDIUM, INFORMATION INPUT METHOD, AND ELECTRONIC APPARATUS - An optical input device has a display screen, a portion of the display screen being shielded from light by an operating member to implement input processing based on a plurality of input modes. The input device includes an input detection unit including a display unit that displays predetermined input information, and a control unit. The input detection unit detects an area and/or brightness of a light-shielded portion formed on the display screen by the operating member by approaching the display unit. The control unit provides display control of the display unit and input control on the basis of the detected area and/or brightness. The control unit compares the detected area and/or brightness with a predetermined threshold value to select a desired input mode from among the plurality of input modes.01-15-2009
20090058807MOUSE POINTER FUNCTION EXECUTION APPARATUS AND METHOD IN PORTABLE TERMINAL EQUIPPED WITH CAMERA - An apparatus and a method for executing a mouse pointer function in a portable terminal equipped with a camera are disclosed. The method includes: capturing an external image signal by a camera module; sampling the external image signal and converting a sampled image signal into image data; mapping a group of pixels of the image data generated by the sampling to a group of pixels of an image sensor for each unit pixel on a one-to-one basis; detecting coordinate values of image data including a point light source in the group of the mapped pixels; determining if a point light source is actually included in the detected coordinate values of the image data; and displaying the detected coordinate values of the image data determined to include the point light source on a screen.03-05-2009
20090251411Computer input device for automatically scrolling in different speed - A computer input device includes a body and a trace-detecting module coupled to the body. The body has a micro control unit (MCU), and the trace-detecting module has a light pervious area, and a trace-detecting unit. The trace-detecting unit further includes a light source and a sensor. The sensor senses a reflected light beam for a user's digit movement on the light pervious area at a velocity which can be sensed by the sensor. If an automatically scrolling mode is activated and the velocity exceeds a threshold stored in the MCU, then the MCU executes automatic scrolling at a predetermined scrolling speed.10-08-2009
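The mode logic this abstract describes reduces to comparing the sensed digit velocity against a threshold stored in the MCU. A hedged sketch — the threshold, the predetermined scrolling speed, and the proportional gain are invented for illustration:

```python
def scroll_step(velocity, threshold=120.0, auto_speed=40.0, auto_mode=True):
    """One scroll tick: if the auto-scrolling mode is active and the sensed
    digit velocity exceeds the stored threshold, scroll at a fixed
    predetermined speed; otherwise scroll proportionally to the movement.
    All numeric values are illustrative assumptions."""
    if auto_mode and abs(velocity) > threshold:
        # Automatic scrolling at the predetermined speed, preserving direction.
        return auto_speed if velocity > 0 else -auto_speed
    # Ordinary proportional scrolling (hypothetical gain).
    return velocity * 0.25
```

The MCU would call this per sensor frame, so a fast flick latches into steady automatic scrolling while slow digit movement scrolls in proportion.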
20090051650Pass through of remote commands - In one embodiment, a television set having remote control signaling to a controlled device has a data communication interface for communication of data, said data communication interface having a connection reserved for DC power. A remote control interface receives commands from a remote control device. A circuit determines whether a command received at the remote control interface is destined for the television set or for the controlled device. The controlled device is connected to the television set via the data communication interface. A modulator modulates a signal representing a command destined for the controlled device onto the DC power connection in order to convey the command to the controlled device. In another embodiment, a television accessory device that is interconnected to and controllable by the television has a data communication interface for communication of data, said data communication interface having a connection reserved for DC power. The accessory device is connected to the television via the data communication interface. A demodulator is coupled to the DC power connection and demodulates a signal representing a command that is modulated onto the DC power connection in order to receive a command from the television. A processor implements the command in the accessory device. This abstract is not to be considered limiting, since other embodiments may deviate from the features described in this abstract.02-26-2009
20090051652CONTROL APPARATUS AND METHOD - The invention discloses a control apparatus capable of providing a user with a control over an object displayed on a display apparatus by a pointing device. The control apparatus includes an image capturing module, a separating module, a positioning module, a constructing module, and a processing module. The image capturing module is used to record an image sequence included N images. The separating module is applied to capture the pointing device image related to the pointing device from each image. The positioning module is used for calculating a specific point of the pointing device image of each image to generate a first set of specific point information. The constructing module is applied to generate a trajectory in accordance with a pre-defined criterion and the first set of specific point information. Additionally, the processing module is used to analyze the trajectory and generate a control signal to control the object.02-26-2009
20090051651APPARATUS FOR REMOTE POINTING USING IMAGE SENSOR AND METHOD OF THE SAME - Problem: since a remote pointing system using an image sensor and having a communication function through an infrared remote controller is used in various environments, the various environments have to be considered when designing the system. Solution: a signal reception unit outputs a control signal controlled to operate in a mode that corresponds to an infrared signal received from a remote controller among a remote control mode and a remote pointing mode. When receiving a control signal controlled to operate in the remote pointing mode from the signal reception unit, an image reception unit is operated to obtain a background image during a first signal reception section and obtains an optical image that corresponds to an infrared signal received from the remote controller during a second signal reception section. The infrared signal is not received during the first signal reception section and received during the second signal reception section from the remote controller. An image-processing unit creates a corrected optical image according to a difference value between the optical image and the background image. A pointing calculator calculates a distance up to the remote controller according to the size of the corrected optical image inputted from the image-processing unit and calculates a movement amount of the remote controller according to the calculated distance, thereby solving the above problem.02-26-2009
20090167682INPUT DEVICE AND ITS METHOD - In an input device capable of easily operating a touch panel at hand while viewing a forward display screen, the accuracy of sensing a hand shape and the accuracy of sensing a gesture are improved.07-02-2009
20090073117Image Processing Apparatus and Method, and Program Therefor - An image processing apparatus includes an extracting unit configured to extract a feature point from a captured image; a recognizing unit configured to recognize a position of the feature point; a display control unit configured to perform control, based on the position of the feature point, to display a feature-point pointer indicating the feature point; and an issuing unit configured to issue, based on the position of the feature point, a command corresponding to the position of the feature point or a motion of the feature point.03-19-2009
20100013767Methods for Controlling Computers and Devices - One aspect of the invention provides a method for providing input to a first computer and a second computer. In some embodiments, the method includes the steps of emitting light from a registered light source, reflecting light emitted by the registered light source with a reflective element, detecting the movement of the reflective element, translating the movement of the reflective element to movement of a cursor on a viewing system coupled to the first computer, detecting a computer switching input from a reflective element, and translating the movement of the reflective element to movement of a cursor on a viewing system coupled to the second computer.01-21-2010
20110227827Interactive Display System - An interactive display system including a wireless pointing device including a camera or other video capture system. The pointing device captures images displayed by the computer, including one or more human-imperceptible positioning targets. The positioning targets are presented as patterned modulation of the intensity (e.g., variation in pixel intensity) in a display frame of the visual payload, followed by the opposite modulation in a successive frame. At least two captured image frames are subtracted from one another to recover the positioning target in the captured visual data and to remove the displayed image payload. The location, size, and orientation of the recovered positioning target identify the aiming point of the remote pointing device relative to the display. Another embodiment uses temporal sequencing of positioning targets (either human-perceptible or human-imperceptible) to position the pointing device.09-22-2011
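The complementary-modulation trick in the abstract above is easy to see with plain numbers: the positioning pattern is added to the payload intensities in one frame and subtracted in the next, so half the difference of two captured frames cancels the static payload and leaves only the pattern. A toy sketch; all values and names are illustrative:

```python
def embed(payload, target, sign):
    """Display frame with the positioning pattern modulated in (+1)
    or out (-1) of the visual payload intensities."""
    return [p + sign * t for p, t in zip(payload, target)]

def recover(frame_a, frame_b):
    """Payload cancels in the subtraction; halving restores the
    pattern's original amplitude."""
    return [(a - b) / 2 for a, b in zip(frame_a, frame_b)]

payload = [120, 80, 200, 50]   # displayed image intensities
target  = [0, 4, 0, 4]         # human-imperceptible positioning pattern
frame1 = embed(payload, target, +1)
frame2 = embed(payload, target, -1)
recovered = recover(frame1, frame2)
```

Because the pattern amplitude (here ±4) is small relative to the payload, a viewer perceives only the average of the two frames, while the camera-side subtraction recovers the pattern exactly.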
20110141015Storage medium having information processing program stored thereon and information processing apparatus - A motion information obtaining step successively obtains motion information from a motion sensor. An imaging information obtaining step successively obtains imaging information from an imaging means. An invalid information determination step determines whether the imaging information is valid information or invalid information for predetermined processing. A motion value calculation step calculates a motion value representing a magnitude of a motion of the operation apparatus in accordance with the motion information. A processing step executes, when the imaging information is determined as the invalid information in the invalid information determination step and when the motion value calculated in the motion value calculation step is within a predetermined value range, predetermined processing in accordance with the most recent valid imaging information among valid imaging information previously obtained.06-16-2011
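The validity gating described in this abstract can be read as: when the camera-based pointing data drops out but the inertial sensor says the device is nearly still, keep using the last valid camera reading. A minimal sketch under that reading; the value range and all names are assumptions:

```python
def select_imaging_info(imaging, motion_value, last_valid,
                        lo=0.0, hi=2.0):
    """Return (info to use, updated last_valid).

    Valid imaging info is used and remembered. Invalid imaging info
    (None) is replaced by the most recent valid info only while the
    motion value stays within the predetermined range [lo, hi]."""
    if imaging is not None:
        return imaging, imaging
    if lo <= motion_value <= hi and last_valid is not None:
        return last_valid, last_valid
    return None, last_valid

info, last = select_imaging_info((320, 240), 0.5, None)   # valid frame
info2, last = select_imaging_info(None, 0.5, last)        # dropout, still
info3, last = select_imaging_info(None, 9.0, last)        # dropout, moving
```

The motion-value gate is what prevents a stale camera reading from being reused after the device has been swung somewhere else entirely.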
20090231278Gesture Based Control Using Three-Dimensional Information Extracted Over an Extended Depth of Field - Systems and methods are described for gesture-based control using three-dimensional information extracted over an extended depth of field. The system comprises a plurality of optical detectors coupled to at least one processor. The optical detectors image a body. At least two optical detectors of the plurality of optical detectors comprise wavefront coding cameras. The processor automatically detects a gesture of the body, wherein the gesture comprises an instantaneous state of the body. The detecting comprises aggregating gesture data of the gesture at an instant in time. The gesture data includes focus-resolved data of the body within a depth of field of the imaging system. The processor translates the gesture to a gesture signal, and uses the gesture signal to control a component coupled to the processor.09-17-2009
20090027336REMOTE CONTROLLED POSITIONING SYSTEM, CONTROL SYSTEM AND DISPLAY DEVICE THEREOF - The invention relates to a remote controlled positioning system, a control system and a display device thereof. The remote controlled positioning system includes a liquid crystal display (LCD) panel, a backlight source, a plurality of infrared ray (IR) sources and a directional remote controller. The LCD panel includes a plurality of display areas. The plurality of IR sources is disposed behind the LCD panel, wherein the IR sources are correspondingly disposed according to the positions of the display areas to respectively emit infrared rays that pass through the LCD panel. The directional remote controller receives the infrared rays emitted by the IR sources to obtain positional information on the position of the LCD panel to which the directional remote controller points.01-29-2009
20130215028APPARATUS SYSTEM AND METHOD FOR HUMAN-MACHINE-INTERFACE - There is provided a 3D human machine interface (“3D HMI”), which 3D HMI may include: (1) an image acquisition assembly, (2) an initializing module, (3) an image segmentation module, (4) a segmented data processing module, (5) a scoring module, (6) a projection module, (7) a fitting module, (8) a scoring and error detection module, (9) a recovery module, (10) a three dimensional correlation module, (11) a three dimensional skeleton prediction module, (12) an output module and (13) a depth extraction module.08-22-2013
20090115724THREE-DIMENSIONAL OPERATION INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, CONTROL METHOD, METHOD OF PRODUCING A THREE-DIMENSIONAL OPERATION INPUT APPARATUS, AND HANDHELD APPARATUS - A three-dimensional operation input apparatus for controlling a pointer on a screen includes: a casing; a sensor for detecting a movement of the casing; a movement value calculation section for calculating, based on a detection value detected by the sensor, first and second movement values respectively corresponding to the movements of the casing in directions along first and second axes that are mutually orthogonal; and a modification section for calculating first and second modified movement values for respectively moving the pointer in first and second directions on the screen respectively corresponding to the first and second axes, the first modified movement value obtained by multiplying the first movement value by a first modification coefficient, the second modified movement value obtained by multiplying the second movement value by a second modification coefficient different from the first modification coefficient.05-07-2009
20090207135SYSTEM AND METHOD FOR DETERMINING INPUT FROM SPATIAL POSITION OF AN OBJECT - A system and method for determining an input is provided. The system includes an object position determination device and an input determination device. The object position determination device is configured to determine a first position of an object at a first time and a second position of the object at a second time. The object position determination device includes a camera configured to detect light traveling from the object to the camera. The input determination device is configured to determine an input based at least partly upon the first position and the second position. The object position determination device can include a second camera. The object can include a radio frequency emitter. The object can include an infrared emitter. The object can be an electronic device.08-20-2009
20090102788Manipulation input device - When a vehicle navigation system is manipulated by taking pictures of a user's hand motion and gesture with a camera, as the number of apparatuses and operational objects increases, the associated hand shapes and hand motions increase, resulting in complex manipulation for the user. Furthermore, in detecting a hand with the camera, when the image of a face having color tone information similar to that of a hand appears in the captured image, or when outside light rays such as sun rays or illumination rays vary, detection accuracy is reduced. To overcome such problems, a manipulation input device is provided that includes a limited hand manipulation determination unit and a menu representation unit, whereby simple manipulation can be achieved and manipulation can be determined accurately. In addition, detection accuracy can be improved by a unit that selects a single result from results determined by a plurality of determination units, based on images taken with a plurality of cameras.04-23-2009
20090213071CONTROL SYSTEM AND METHOD FOR A CURSOR IN DISPLAY DEVICE - The present invention discloses a control system for a cursor in a display device, in which said display device is connected to a data processing device and said system comprises: an image acquisition device for acquiring user image information and sending a signal; a signal receiving unit for receiving the signal from the image acquisition device; a signal processing unit for parsing said signal and determining whether or not the cursor needs to be shifted and the target region the cursor is to reach; and a cursor control unit for sending a cursor control signal to said display device based on the determination result of the signal processing unit and shifting the cursor to said target region. Accordingly, the present invention further discloses a control method for a cursor in a display device.08-27-2009
20090213072REMOTE INPUT DEVICE - An input device providing users with a pointing capability includes a sender portion and a receiver portion. The sender portion is adapted to be manipulated by a user to specify a target point within a target area. The sender portion projects a light beam including a pattern on to the target area. A receiver portion includes one or more sensor units located in or near the target area. At least some of the sensor units receive a portion of the light beam regardless of the location of the target point within the target area. A processing unit in the receiver portion analyzes the portions of the light beam received by one or more sensor units to determine an attribute of the target point. The attribute can be the location or relative motion of the target point. The receiver portion may be integrated with a display device.08-27-2009
20110227826INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD - There is provided an information processing apparatus including a position detection section which detects a position of an object, and a coordinate calculation section which calculates absolute coordinates based on the position of the object detected by the position detection section, and which calculates relative coordinates, which indicate a display position of the object on a screen, depending on the absolute coordinates and a motion of the object. The coordinate calculation section moves the relative coordinates in order for the relative coordinates to be asymptotic to or correspondent to the absolute coordinates based on a predetermined condition.09-22-2011
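The "asymptotic" behavior described in the abstract above reads like an exponential pull of the displayed relative coordinate toward the detected absolute coordinate, with the strength of the pull depending on the object's motion. A minimal sketch under that interpretation; the gain `alpha` and the speed-based gating are assumptions, not from the patent:

```python
def update_relative(rel, absolute, speed, alpha=0.2, fast=5.0):
    """Move the relative (displayed) coordinate a fraction of the way
    toward the absolute (detected) coordinate. Faster object motion
    increases the gain so the two coordinate systems converge sooner;
    a still object drifts toward correspondence gently."""
    gain = min(1.0, alpha * (1.0 + speed / fast))
    return tuple(r + gain * (a - r) for r, a in zip(rel, absolute))

pos = (0.0, 0.0)
for _ in range(20):
    pos = update_relative(pos, (100.0, 50.0), speed=0.0)
# pos approaches (100, 50) asymptotically without overshooting
```

Because the gain never exceeds 1, the relative coordinate approaches but never oscillates around the absolute one, which keeps the on-screen pointer motion smooth.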
20090251412MOTION SENSING INPUT DEVICE OF COMPUTER SYSTEM - A motion sensing input device of a computer system includes a motion sensor and a receiver. The motion sensor has two gyroscopes for sensing motions in different directions when a user operates the motion sensor. The motion signals are wirelessly transmitted to the receiver connected with the computer system. Furthermore, the motion signals are directly decoded via the receiver, and then they are converted to keyboard input signals. When the user plays a computer game by using the motion sensor to replace a keyboard, the user can experience a lifelike game environment and an intuitive operation mode.10-08-2009
20120194432PORTABLE ELECTRONIC DEVICE AND METHOD THEREFOR - An electronic device includes an object sensor for detecting motion of an object, such as a stylus or finger, relative to the device during a period of contactless object movement. A motion sensor, such as an accelerometer, detects device motion during the period of contactless object movement. A processor determines a gesture that corresponds to the movement of the object and to movement of the device. This device, and the associated method, results in a more accurate determination of an intended gesture, such as a three-dimensional gesture. For example, the processor, or gesture determinator, can compensate for movement of the device when determining the gesture corresponding to detected contactless movement of the object.08-02-2012
20120105326METHOD AND APPARATUS FOR GENERATING MOTION INFORMATION - Provided are a method and an apparatus for generating motion information relating to motion of an object to provide reaction of a user interface to the motion of the object. The method for generating the motion information includes: detecting depth information of the object using an image frame acquired by capturing the object through a sensor; generating the motion information by compensating for the determined size of the motion of the object in the acquired image frame based on the detected depth information; and generating an event corresponding to the generated motion information.05-03-2012
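The depth compensation in this abstract follows from similar triangles: the same physical motion spans fewer pixels the farther the object is from the sensor, so scaling the pixel displacement by depth recovers a depth-invariant motion value. A sketch; the reference depth and all names are illustrative assumptions:

```python
def compensate_motion(pixel_dx, pixel_dy, depth, ref_depth=1.0):
    """Scale pixel displacement by depth so that identical physical
    motions yield identical motion values regardless of distance."""
    s = depth / ref_depth
    return pixel_dx * s, pixel_dy * s

# the same hand sweep seen at 1 m spans 40 px, at 2 m only 20 px
near = compensate_motion(40, 0, depth=1.0)
far  = compensate_motion(20, 0, depth=2.0)
# both compensate to the same motion value
```

An event (say, a swipe) can then be triggered at a fixed threshold on the compensated value, so the user interface reacts the same whether the user stands near or far.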
20120105325CAPACITIVE FINGER NAVIGATION INPUT DEVICE - A capacitive finger navigation input device uses a capacitive sensor array of capacitive sensing cells that includes only two capacitive sensing cells positioned along a linear direction. The capacitive finger navigation input device uses a drive circuit to drive at least one drive electrode of the capacitive sensor array and a sense circuit to sense mutual capacitance at each of the capacitive sensing cells of the capacitive sensor array to produce mutual capacitance signals, which are used to determine at least one of position and movement of a finger of a user with respect to the capacitive sensor array. The capacitive finger navigation input device may be used in a hand-held computing device and in a method for performing finger navigation.05-03-2012
20100013763METHOD AND APPARATUS FOR TOUCHLESS INPUT TO AN INTERACTIVE USER DEVICE - A plurality of light sources is mounted on a housing of an interactive user device. The sources are spaced from each other in a defined spatial relationship, for example in a linear configuration. At least one light sensor is also positioned at the surface of the housing. The light sensor senses light that is reflected from an object placed by the user, such as the user's finger, within an area of the light generated by the light sources. A processor in the user device can recognize the sensed reflected light as a user input command correlated with a predefined operation and respond accordingly to implement the operation.01-21-2010
20100156788INPUT DEVICE AND DATA PROCESSING SYSTEM - An input device includes a main body, a motion sensor unit and a coordinate conversion processing unit. The coordinate conversion processing unit is configured to perform coordinate conversion processing based on a Y-axis acceleration and a Z-axis acceleration detected by the motion sensor unit with a first two-dimensional orthogonal coordinate system being defined by a mutually orthogonal Y-axis and Z-axis in a first plane perpendicular to an X-axis coinciding with a pointing direction of the main body. The coordinate conversion processing unit is configured to convert the Y-axis angular velocity and the Z-axis angular velocity detected by the motion sensor unit to a U-axis angular velocity and a V-axis angular velocity, respectively, in a second two-dimensional orthogonal coordinate system defined by a U-axis corresponding to a horizontal axis in the first plane and a V-axis perpendicular to the U-axis in the first plane.06-24-2010
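The conversion from the body-fixed Y/Z axes to the horizontal U axis and vertical V axis is a rotation by the roll angle of the casing, which the Y/Z accelerations supply via the direction of gravity. A sketch under that reading; the function names and sign conventions are assumptions:

```python
import math

def roll_from_accel(acc_y, acc_z):
    """Gravity's direction in the body Y-Z plane gives the roll angle
    of the casing about its pointing (X) axis."""
    return math.atan2(acc_y, acc_z)

def yz_to_uv(w_y, w_z, roll):
    """Rotate body-frame angular velocities into the U (horizontal) /
    V (vertical) frame so pointer motion ignores how the device is
    tilted in the user's hand."""
    c, s = math.cos(roll), math.sin(roll)
    return (w_y * c - w_z * s, w_y * s + w_z * c)

# device rolled 90 degrees: gravity shows up on the Y axis
roll = roll_from_accel(acc_y=9.8, acc_z=0.0)
u, v = yz_to_uv(1.0, 0.0, roll)  # Y-axis rotation now drives the V axis
```

The practical effect is that the cursor still moves horizontally for a horizontal wrist sweep even when the remote is held rotated.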
20120139838APPARATUS AND METHOD FOR PROVIDING CONTACTLESS GRAPHIC USER INTERFACE - Disclosed herein are an apparatus and method for providing a contactless Graphical User Interface (GUI). The apparatus for providing a contactless GUI includes a basic information management unit, a pointer tracking unit, and a mouse event management unit. The basic information management unit receives finger image information, and generates basic pointer information for a mouse pointer service. The pointer tracking unit analyzes the finger image information based on the basic pointer information, and generates a mouse operation event by tracking a mouse operation in order to control a mouse pointer based on results of the analysis. The mouse event management unit analyzes the mouse operation event, and generates an operation message corresponding to the mouse operation. The pointer tracking unit tracks the mouse operation by calculating the movement distance of the tips of a thumb and an index finger based on the finger image information.06-07-2012
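The thumb-to-index fingertip distance tracked in the abstract above lends itself to a pinch-style click detector. A sketch with assumed pixel thresholds and hysteresis; none of these constants come from the patent:

```python
import math

def pinch_event(thumb, index, pressed,
                press_thresh=20.0, release_thresh=30.0):
    """Emit a mouse-button event from the thumb/index fingertip gap.
    Separate press/release thresholds (hysteresis) avoid jittery
    double events when the gap hovers near a single threshold."""
    gap = math.dist(thumb, index)
    if not pressed and gap < press_thresh:
        return "press", True
    if pressed and gap > release_thresh:
        return "release", False
    return None, pressed

# fingertips 11 px apart while not pressed: a press event fires
event, state = pinch_event((100, 100), (110, 105), pressed=False)
```

A mouse event management unit of the kind described could translate these press/release events into the corresponding operation messages.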
20100182235INPUT DEVICE AND INPUT METHOD, INFORMATION PROCESSING DEVICE AND INFORMATION PROCESSING METHOD, INFORMATION PROCESSING SYSTEM AND PROGRAM - An input device includes an operation unit that is held by a user and operated in a three-dimensional free space to remotely operate an information processing device, a directional button that is provided on the operation unit and operated by the user to point in a direction, and a transmission unit that, when the directional button is operated while the operation unit is being operated in the free space, transmits information corresponding to the operation in the free space and information corresponding to the operated directional button to the information processing device so that an object image linearly moves by only an amount corresponding to a directional component of the directional button out of an operation amount in the free space after the operation of the directional button.07-22-2010
20100188334INPUT DEVICE AND METHOD, INFORMATION PROCESSING APPARATUS AND METHOD, INFORMATION PROCESSING SYSTEM, AND PROGRAM - An input device includes: an operating section which is held by a user and operated in a three-dimensional free space in order to operate an information processing apparatus by remote control; a calculation section which calculates a hand shake related value for controlling selection of an image to be controlled which is displayed on the information processing apparatus, the hand shake related value being relevant to an amount of hand shake of the operating section; and an output section which outputs the hand shake related value as an operation signal for operating the information processing apparatus by remote control.07-29-2010
20100259478METHOD AND DEVICE FOR INPUTTING INFORMATION BY DESCRIPTION OF THE ALLOWABLE CLOSED TRAJECTORIES - A method and a device are provided for inputting information by description of the allowable closed trajectories (ACT), together with a device of the sensors of the characteristic points. The manipulator unit (10-14-2010
20120139839METHODS AND SYSTEMS FOR MEDIA ANNOTATION, SELECTION AND DISPLAY OF ADDITIONAL INFORMATION ASSOCIATED WITH A REGION OF INTEREST IN VIDEO CONTENT - Methods, systems, and processor-readable media for selecting a region within a particular frame of video content to access additional information about an area of interest associated with the region within the particular frame, and displaying the additional information, in response to selecting the region associated with the particular frame of video content to access the additional information about the area of interest associated with the region within the particular frame. A selection packet can be generated, which includes frame selection data associated with the particular frame of video content. The frame selection data can include data that is sufficient to identify the particular frame of video content.06-07-2012
20100271300Multi-Touch Pad Control Method - A multi-touch pad control method is disclosed. The method comprises the following steps. First, a primary cursor is detected. Then a first touch motion is detected; if there is no first touch motion, the primary cursor is re-detected. Next, a secondary cursor is detected if there is a first touch motion. Then a second touch motion is detected. A first function is performed if there is no second touch motion. A first direction of the second touch motion is detected if the second touch motion controlling the secondary cursor is detected. A second function is performed if the second touch motion is toward the first direction. A second direction of the second touch motion is detected if the second touch motion is not toward the first direction. A third function is performed if the second touch motion is toward the second direction. The first function is performed if the second touch motion controlling the secondary cursor is detected and the second touch motion is neither toward the first direction nor the second direction.10-28-2010
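The branching in that method can be sketched as a small decision function. The direction values and the mapping of numbered functions to return strings are illustrative; the abstract does not bind specific actions to them:

```python
def dispatch(second_touch, direction, first_dir="up", second_dir="down"):
    """Choose a function per the claimed flow: no second touch -> first
    function; second touch toward the first direction -> second
    function; toward the second direction -> third function; toward
    neither direction -> fall back to the first function."""
    if not second_touch:
        return "function1"
    if direction == first_dir:
        return "function2"
    if direction == second_dir:
        return "function3"
    return "function1"

result = dispatch(True, "up")  # second touch toward the first direction
```

Tested against each branch, the function mirrors the four outcomes the abstract enumerates.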
20100001953INPUT APPARATUS, CONTROL APPARATUS, CONTROL METHOD, AND HANDHELD APPARATUS - To provide an input apparatus, a control apparatus, a control system, and a control method that are capable of improving an operational feeling when a user uses the input apparatus to input an operation signal via an operation section. An MPU (01-07-2010
20090046062POINTING DEVICE WITH CUSTOMIZATION OPTIONS - A pointing device that can interface with a graphical user interface of a computer or other electronic device. The pointing device includes a body having an upper portion and an underside. Also included is a tracking assembly having at least one sensor to detect movement and output a control signal responsive to the detected movement. The pointing device further includes several customization features. The customization features include mechanical customization features and software customization features. At least some of the mechanical customization features are configured to be replaceable. Such replaceable customization features are releasably mechanically coupled to the pointing device body.02-19-2009
20100013766Methods for Controlling Computers and Devices - One aspect of the invention provides a method for providing input to a computer. In some embodiments, the method includes the steps of emitting light from a registered light source, reflecting a first pattern of light emitted by the registered light source with at least two reflective elements, detecting the movement of at least one of the reflective elements, and translating the movement of the at least one reflective element to movement of a cursor on a viewing system such that there is a first relationship between the movement of the at least one reflective element and the movement of the cursor, detecting a change from the first pattern to a second pattern of light with the at least two reflective elements, and changing the relationship between the movement of the reflective element and the movement of the cursor from the first relationship to a second relationship.01-21-2010
20100013765METHODS FOR CONTROLLING COMPUTERS AND DEVICES - One aspect of the invention provides a method for providing input to a computer. In some embodiments, the method includes the steps of emitting light from a registered light source, reflecting a first pattern of reflected light emitted by the registered light source from at least first and second reflective elements, moving a movable member, coupled to a body, with respect to the body to create a second pattern of reflected light from the at least two reflective elements, and detecting a change from the first pattern to the second pattern to perform at least one of a computer mouse click, a computer mouse scroll, a keyboard input, and a combination thereof.01-21-2010
20100013764Devices for Controlling Computers and Devices - One aspect of the invention provides a device for providing input to a computer. In some embodiments, the device includes a body, first and second reflective elements that have at least a first configuration and a second configuration, and a movable member coupled to the body. The movable member may be configured to move from a first position to a second position under an applied load, such that the reflective elements change from the first configuration to the second configuration, and then return to the first position.01-21-2010
20100182236COMPACT RTD INSTRUMENT PANELS AND COMPUTER INTERFACES - This concerns my RTD invention disclosed in copending applications, particularly but not necessarily for use in the “center stack” region of vehicle instrument panels. Disclosed are novel prism devices to shrink the size of the unit, while increasing resistance to vibration and condensation and providing easier assembly into the vehicle. Also disclosed are methods to improve the efficiency of such projector based systems for delivering display light to the driver of the vehicle, as well as to reduce noise caused by backscatter from the screen and control surface, or sunlight coming through the windshield. RTD versions for home or office use are also disclosed, including a “cushion computer”-like device meant primarily for use on one's lap and optionally having a reconfigurable keyboard. The device can also serve as a TV remote and perform other useful functions in the home, workplace, or car, and may serve as a useful interface accessory to expand the usability and enjoyment of mobile devices.07-22-2010
20110109545Pointer and controller based on spherical coordinates system and system for use - A hand-held pointing device for manipulating an object on a display is disclosed. The device is constructed from at least one accelerometer and at least one linear input element. The accelerometer or accelerometers generate a pitch signal and a roll signal. These pitch and roll signals are used to determine a position on a display.05-12-2011
20100253622POSITION INFORMATION DETECTION DEVICE, POSITION INFORMATION DETECTION METHOD, AND POSITION INFORMATION DETECTION PROGRAM - A position information detection device, method, and program are provided, which are capable of detecting position information with high precision using simple and easily identified discrimination marks. A position information detection device has an image capture portion 10-07-2010
20100253625METHOD AND DEVICE FOR DETERMINATION OF COORDINATES OF A COMPUTER POINTING DEVICE SUCH AS MOUSE OR ELECTRONIC STYLUS - The invention relates to radio engineering, in particular to methods and devices for loading information into computers or game consoles. The inventive method for extending the functionalities and the scope of a mouse- or electronic stylus pen-type manipulator involves emitting electromagnetic wave radiation with the aid of a main transmitter antenna built into the manipulator; receiving said electromagnetic waves with the aid of at least three (one base and two working) spaced antennas of a receiver when the manipulator is moved on a plane, or with the aid of at least four (one base and three working) spaced antennas of the receiver when the manipulator is moved three-dimensionally; measuring the phase difference of the signals for the different pairs of the receiver antennas (the base antenna and each of the working antennas); and calculating the manipulator's planar or three-dimensional coordinates according to the phase differences. The device for determining the coordinates of a mouse- or electronic stylus pen-type manipulator comprises a manipulator transmitter with a base antenna built into the manipulator and a receiving device which contains at least three (one base and two working) spaced antennas when the manipulator is moved on a plane, and at least four (one base and three working) spaced antennas when the manipulator is moved three-dimensionally.10-07-2010
20100253623REMOTE CONTROL, IMAGING DEVICE, METHOD AND SYSTEM FOR THE SAME - A remote control, an imaging device, a method and a system for the same are provided to realize functions such as channel selection of programs, character input, etc. The remote control comprises: an operation means having multiple keys, an ultrasonic and radio signal transmitting means for transmitting radio signals and ultrasonic signals while one of the said multiple keys is operated, to map the position of the remote control into a cursor displayed on a screen, and a control means for controlling the said operation means and the said ultrasonic and radio signal transmitting means.10-07-2010
20100253624SYSTEM FOR DISPLAYING AND CONTROLLING ELECTRONIC OBJECTS - The present invention is directed toward a system and process that controls a group of networked electronic components using a multimodal integration scheme in which inputs from a speech recognition subsystem, gesture recognition subsystem employing a wireless pointing device and pointing analysis subsystem also employing the pointing device, are combined to determine what component a user wants to control and what control action is desired. In this multimodal integration scheme, the desired action concerning an electronic component is decomposed into a command and a referent pair. The referent can be identified using the pointing device to identify the component by pointing at the component or an object associated with it, by using speech recognition, or both. The command may be specified by pressing a button on the pointing device, by a gesture performed with the pointing device, by a speech recognition event, or by any combination of these inputs.10-07-2010
20100117960HANDHELD ELECTRONIC DEVICE WITH MOTION-CONTROLLED CURSOR - A handheld electronic device of the present invention includes a display, a memory, a motion sensor, and a controller. The memory is configured for storing a viewable output of at least one software application. The controller is in communication with the display, the memory, and the motion sensor. The controller includes a first control logic that generates a first image on the display representative of a portion of the viewable output of the software application. The first image has a field of view (FOV), where the viewable output includes an inner region and an outer region. A second control logic adjusts the FOV of the first image based upon movement of the handheld device. A third control logic displays a second image of a cursor in the inner region. A fourth control logic displays the second image of the cursor in the outer region.05-13-2010
20100079374METHOD OF CONTROLLING A SYSTEM - The invention describes a method of controlling a system (04-01-2010
20130127713Input Device - An input device comprising an optical module having a light guide plate, a light source, a scattering layer and a sensor is provided. The light guide plate has a top surface, a bottom surface, and a side. The light source emits a light to the side. The light travels within the light guide plate. The scattering layer changes a path of parts of the light on the bottom surface so that the light is projected out of the top surface to form a penetrating light. When an object approaches or touches the top surface of the light guide plate, at least a part of the penetrating light is reflected by the object to form a reflected light received by the sensor. According to the reflected light, the input device generates a position signal indicating at least one relative position of the object with respect to the top surface.05-23-2013
20130127714USER INTERFACE SYSTEM AND OPTICAL FINGER MOUSE SYSTEM - There is provided a user interface system including a slave device and a master device. The slave device provides light of two different wavelengths to illuminate a finger surface, receives reflected light from the finger surface to generate a plurality of image frames, calculates and outputs an image data associated with a predetermined number of the image frames. The master device calculates a contact status and a displacement of the finger surface and a physiological characteristic of a user according to the image data.05-23-2013
201301277153D Pointing Device With Up-Down-Left-Right Mode Switching and Integrated Swipe Detector - A 3D pointing device for use with a content delivery system is provided. The pointing device can operate in one of at least one of two modes: a first 3D or scrolling mode, and a second non-3D mode that can also be referred to as an up-down-left-right (UDLR) mode. The pointing device can include one or more directional sensors, to provide orientation and movement information. For either of the at least two modes, an optical finger navigation module is provided that can detect movement of a user's finger or object across its screen, and provides a predetermined threshold that must be exceeded before movement information is generated from the OFN module. The pointing device can generate scroll and UDLR commands based on the information from the orientation and movement sensors, as well as the OFN module, or can provide the information from the orientation and movement sensors to a user interface that can generate the appropriate scrolling or UDLR commands for use by the content delivery system.05-23-2013
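The abstract above describes gating the optical finger navigation (OFN) module's output behind a movement threshold. A minimal sketch of that idea, assuming a simple Euclidean-magnitude threshold (the function name and threshold value are illustrative, not from the patent):

```python
def gate_ofn_movement(dx, dy, threshold=3.0):
    """Return (dx, dy) only if the movement magnitude exceeds the
    threshold; otherwise report no movement. Threshold is an assumed
    tunable, analogous to the predetermined threshold in the abstract."""
    magnitude = (dx * dx + dy * dy) ** 0.5
    if magnitude < threshold:
        return (0.0, 0.0)  # below threshold: no movement information generated
    return (dx, dy)
```

Small accidental finger contacts thus produce no scroll or UDLR commands, while deliberate swipes pass through unchanged.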
20130127716Projector - A projector capable of detecting the coordinates of a detection object at a height position away from a projection area to some extent is provided. This projector (05-23-2013
20120242576INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND PROGRAM - An information processing apparatus may include a touchpad, a hardware processor, and a storage medium coupled to the processor. The storage medium may store instructions that, when executed by the processor, cause the information processing apparatus to receive a proximity signal indicative of whether a user is providing input to the touchpad; receive a movement signal indicative of whether the input includes movement of an object relative to the touchpad and/or whether the input includes movement of the object from an outer area surrounding an inner area of the touchpad to the inner area of the touchpad; and select one of a pointing user input mode or a scrolling user input mode based on the signals.09-27-2012
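One plausible decision rule for the mode selection described above (the abstract leaves the exact mapping between signals and modes open; the rule below is an assumption for illustration):

```python
def select_input_mode(in_proximity, entered_from_outer_edge):
    """Pick a user input mode from touchpad signals. Input that begins in
    the outer area and moves into the inner area is treated as scrolling;
    any other in-proximity input is treated as pointing (assumed rule)."""
    if not in_proximity:
        return None  # no input to classify
    return "scrolling" if entered_from_outer_edge else "pointing"
```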
20090146951User Interface Devices - A method and apparatus of user interface having multiple motion dots capable of detecting user inputs are disclosed. In one embodiment, a user interface (“UI”) device includes a first motion dot and a second motion dot. The first motion dot is capable of attaching to a first finger and the second motion dot is configured to attach to a second finger. The first finger, in one example, is a thumb and the second finger is an index finger. The first motion dot includes multiple accelerometers used for identifying the physical location of the first motion dot. The second motion dot, which is logically coupled to the first motion dot via a wireless communications network, is capable of detecting a user input in response to a relative physical position between the first and the second motion dots.06-11-2009
20090140979ELECTRONIC APPARATUS - According to one embodiment, an electronic apparatus includes a housing having an outer surface and an inner surface, a pointing device having a flat input surface and located in the housing with the input surface on the inner surface of the housing, an operation area provided in a position on the outer surface of the housing corresponding to at least a part of the pointing device, and a display section which illuminates at least a part of an outline of the operation area and displays a position of the operation area.06-04-2009
20090115725Input device and method of operation thereof - A generic input device built of an electro-optical camera, sensors, buttons and communication means provides a means for operating, in absolute and/or relative mode, most software applications on many electronic platforms with a display, independent of the screen characteristics.05-07-2009
20090115723Multi-Directional Remote Control System and Method - A multi-directional remote control system and method is adapted for use with an entertainment system of a type including a display (05-07-2009
20100231513POSITION MEASUREMENT SYSTEMS USING POSITION SENSITIVE DETECTORS - Methods and devices for a remote control device for a display device are disclosed. In one embodiment, the remote control device may comprise a plurality of light sources that each has a light profile angled in a predetermined degree different from other light sources. In another embodiment, the remote control device may comprise a controller; and a plurality of optical detectors coupled to the controller. Each optical detector may generate a pair of electrical signals in response to incident light from a plurality of light sources located on a display device and the controller may calculate the position of the remote control device based on the electrical signals.09-16-2010
20100295781Electronic Device with Sensing Assembly and Method for Interpreting Consecutive Gestures - A method for interpreting at least two consecutive gestures includes providing a sensing assembly having at least one photoreceiver and a plurality of phototransmitters, wherein each phototransmitter is positioned to emit infrared light away from the electronic device about a corresponding central transmission axis, wherein each central transmission axis is oriented in a different direction with respect to the others, and controlling emission of infrared light by each of the phototransmitters during each of a plurality of time periods. For each of the plurality of phototransmitters and for each of the plurality of time periods, a corresponding measured signal is generated which is indicative of a respective amount of infrared light which originated from that phototransmitter during that time period and was reflected by the external object prior to being received by a photoreceiver. The measured signals are evaluated to identify a first gesture, and the electronic device is controlled in response to identification of the first gesture according to a first mode of operation. A parameter of a second gesture is also determined, and the electronic device is controlled in response to the determined parameter of the second gesture according to a second mode of operation.11-25-2010
20100302153DEPRESSABLE TOUCH SENSOR - An input device and a method for providing an input device are provided. The input device assembly includes a base, a sensor support, and a scissor mechanism attached to the base and the sensor support. The scissor mechanism allows for only substantially uniform translation of the sensor support towards the base in response to a force biasing the sensor support substantially towards the base.12-02-2010
20100295783GESTURE RECOGNITION SYSTEMS AND RELATED METHODS - A method and apparatus for performing gesture recognition. In one embodiment of the invention, the method includes the steps of receiving one or more raw frames from one or more cameras, each of the one or more raw frames representing a time sequence of images, determining one or more regions of the one or more received raw frames that comprise highly textured regions, segmenting the one or more determined highly textured regions in accordance with textured features thereof to determine one or more segments thereof, determining one or more regions of the one or more received raw frames that comprise other than highly textured regions, and segmenting the one or more determined other than highly textured regions in accordance with color thereof to determine one or more segments thereof. One or more of the segments are then tracked through the one or more raw frames representing the time sequence of images.11-25-2010
20100295782SYSTEM AND METHOD FOR CONTROL BASED ON FACE OR HAND GESTURE DETECTION - System and method for control using face detection or hand gesture detection algorithms in a captured image. Based on the existence of a detected human face or a hand gesture in an image captured by a digital camera, a control signal is generated and provided to a device. The control may provide power or disconnect power supply to the device (or part of the device circuits). The location of the detected face in the image may be used to rotate a display screen to achieve a better line of sight with a viewing person. The difference between the location of the detected face and an optimum is the error to be corrected by rotating the display to the required angular position. A hand gesture detection can be used as a replacement to a remote control for the controlled unit, such as a television set.11-25-2010
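The error term in the abstract above (difference between the detected face location and the optimum position) can be sketched as a normalized quantity to drive the display rotation toward zero error. The normalization and function name are assumptions for illustration, not the patented algorithm:

```python
def rotation_error(face_x, frame_width):
    """Signed error between the detected face's horizontal position and
    the optimum (assumed here to be the frame center), normalized to
    [-1, 1]. A controller would rotate the display to drive this to 0."""
    optimum = frame_width / 2.0
    return (face_x - optimum) / optimum
```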
20100302154MULTI-MODE POINTING DEVICE AND METHOD FOR OPERATING A MULTI-MODE POINTING DEVICE - A method for pairing and operating a multi-mode pointing device is provided. A pointing device may be automatically paired with an image display device. A pairing request signal may be transmitted on a prescribed frequency channel, and a signal may be received indicating information of a plurality of frequency channels. One of the frequency channels may be selected as a pairing frequency channel for operating in a radio frequency mode.12-02-2010
20100134414INPUT APPARATUS WITH BALL - An input apparatus is disclosed. The input apparatus provides a control signal to a host system. It includes a housing that includes an upper portion and a lower portion. A ball is coupled to the upper portion of the housing and can reside within a ring. A first sensor assembly is configured to sense the position of the ball, and a second sensor assembly is configured to sense the position of the input apparatus relative to a work surface. The input apparatus also includes a mode switch, where the mode switch is operatively coupled to the first sensor assembly and the second sensor assembly. The mode switch includes a first mode where the first sensor assembly provides the control signal to the host system and a second mode where the second sensor assembly provides the control signal to the host system.06-03-2010
20100302152DATA PROCESSING DEVICE - A data processing device having a display unit, a position sensor unit and an input control unit is provided. The display unit displays an option for operation. The position sensor unit detects an orientation of a linearly shaped pointer put in a front space of a screen of the display unit, and detects a position of a tip of the pointer. The input control unit identifies, upon the position sensor unit detecting the tip of the pointer as being in front of the option displayed on the screen of the display unit, the option as being selected. The input control unit displays the selected option in a form different from a form in which another option is displayed. The input control unit scrolls, upon the position sensor unit detecting a change of the orientation of the pointer, content displayed on the screen of the display unit.12-02-2010
20100302150PEER LAYERS OVERLAPPING A WHITEBOARD - A method for displaying edits overlapping a whiteboard comprising creating peer layers overlapping the whiteboard for a peer and peers coupled to the peer and sending or receiving metadata of edits for updating one or more of the peer layers on the peer and the coupled peers in response to any of the peer layers being edited.12-02-2010
20110128222INFORMATION PROCESSING APPARATUS AND CONTROL METHOD - According to one embodiment, a switch circuit switches a resonance frequency band of an antenna in a display unit between first and second resonance frequency bands. The second resonance frequency band is overlapped with a part of the first resonance frequency band and is higher than the first resonance frequency band. A wireless communication module wirelessly transmits and receives signals using a transmission frequency band and a reception frequency band which are included in the first resonance frequency band. A screen image orientation control module changes an orientation of a screen image displayed on the display unit. A resonance frequency shift module shifts the resonance frequency band of the antenna from the first resonance frequency band to the second frequency band by controlling the switch circuit when the orientation of the screen image is an orientation in which the antenna is positioned on a downward side of the screen image.06-02-2011
20130141332HYBRID POINTING DEVICE - The present invention discloses a hybrid pointing device including an optical navigation module and a pointing module. The optical navigation module is configured to replace the conventional buttons of a conventional pointing device, such as an optical mouse or a trackball mouse. The optical navigation module is configured to sense gestures of at least one object operated by a user to activate commands associated with particular programs running on the host. Since the optical navigation module is only configured to sense gestures of the object, but not the movement of the hybrid pointing device relative to a surface, its resolution need only be high enough for sensing gestures and need not be relatively high.06-06-2013
20100315337OPTICAL CAPACITIVE THUMB CONTROL WITH PRESSURE SENSOR - A small sensor surface designed to control a smart phone or Mobile Internet Device (MID). The sensor surface may be mounted on the side of the proposed device in a position where a user's thumb or finger naturally falls when holding the device in his/her hand. The sensor surface is simultaneously convex and concave, providing both visual and physical cues for the use of the sensor surface. The sensor may include capacitive sensing, optical sensing and pressure sensing capabilities to interpret thumb gestures into device control.12-16-2010
20110241987INTERACTIVE INPUT SYSTEM AND INFORMATION INPUT METHOD THEREFOR - An interactive input system comprises at least one light source configured for emitting radiation into a region of interest, a bezel at least partially surrounding the region of interest and having a surface in the field of view of the at least one imaging device, where the surface absorbs the emitted radiation and at least one imaging device having a field of view looking through a filter and into the region of interest and capturing image frames. The filter has a passband comprising a wavelength of the emitted radiation.10-06-2011
20110241988INTERACTIVE INPUT SYSTEM AND INFORMATION INPUT METHOD THEREFOR - An interactive input system includes at least one imaging device having a field of view looking into a region of interest and capturing images; at least one pen tool comprising an accelerometer configured to measure acceleration of the pen tool and to generate acceleration data, the pen tool configured to wirelessly transmit the acceleration data; and processing structure configured to process the images and acceleration data to determine the location of at least one pointer in the region of interest.10-06-2011
20110109546ACCELEROMETER-BASED TOUCHSCREEN USER INTERFACE - A CE device for, e.g., displaying the time can incorporate an accelerometer to provide various features and enhancements. For example, tilting of the housing as sensed by the accelerometer may be used for controlling a volume output by an audio display, and/or for controlling a position of a screen cursor relative to underlying presentation on a visual display, and/or for controlling motion of a virtual object presented on the visual display; and/or for rotating a presentation on the visual display to always be oriented up and/or for determining that a person has tapped the housing based on signals from the accelerometer and in response thereto presenting an image of a rotatable object on the display.05-12-2011
20110109547POSITION REMOTE CONTROL SYSTEM FOR WIDGET - A position remote control system for a widget is disclosed. The position remote control system is utilized for controlling positions of a plurality of widgets on a display device, and includes a remote controller for selecting one of the plurality of widgets and a corresponding target position and accordingly generating a remote control signal, and a display module including a first wireless device for receiving the remote control signal, an interpreter coupled to the first wireless device for interpreting the remote control signal to generate a first display signal, and a display unit coupled to the interpreter for displaying the widget window on a display area of the display device according to the first display signal.05-12-2011
20090174657ENTERTAINMENT SYSTEM AND PROCESSING APPARATUS - In a game system 07-09-2009
20090066646Pointing apparatus, pointer control apparatus, pointing method, and pointer control method - Provided are a pointing apparatus, a pointer control apparatus, a pointing method, and a pointer control method capable of recognizing image codes included in an image frame using an image sensor to determine a pointing direction, and continuously updating the gain between the displacement of the motion of the pointing apparatus and the displacement of the motion of a displayed pointer. The pointing apparatus includes an image receiving unit sensing image patterns that exist in a sensed region, among all of the image patterns arranged in a display region; an inertial sensor sensing an input motion using at least one of the acceleration and angular velocity that are generated due to the motion; and a coordinate determining unit determining moving coordinates that are moved from the central coordinates of the sensed image pattern by coordinate displacement corresponding to the sensed motion.03-12-2009
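The continuous gain update between device displacement and pointer displacement described above could take many forms; a minimal sketch using exponential smoothing (the smoothing rule and `alpha` parameter are assumptions, as the abstract does not specify the update law):

```python
def update_gain(prev_gain, device_disp, pointer_disp, alpha=0.1):
    """Blend the previously estimated gain with the newly observed ratio of
    pointer displacement to device displacement (assumed update rule)."""
    if device_disp == 0:
        return prev_gain  # no device motion: nothing to learn from
    observed = pointer_disp / device_disp
    return (1 - alpha) * prev_gain + alpha * observed
```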
20090066645WIRELESS MOUSE WITH ALPHANUMERIC KEYPAD - A wireless mouse for use by a hand of a user with a computer and a display having a cursor that includes a hand-holdable housing and a battery carried within the housing. An integrated circuit is carried within the housing and includes radio communications circuitry adapted for communicating with the computer. At least one sensor is carried by the housing for moving the cursor across the display and at least one button is provided on the housing for clicking on objects on the display. An alphanumeric keypad is provided on the housing for typing on the display.03-12-2009
20110018805LOCATION-DETECTING SYSTEM AND ARRANGEMENT METHOD THEREOF - The invention discloses a location-detecting system including an indication region, one of a camera unit and a light-emitting unit, and an optical device. The indication region is for indication of a target location thereon. One of the camera unit and the light-emitting unit is disposed at a first location of the indication region, and the optical device is disposed at a second location of the indication region and corresponding to one of the camera unit and the light-emitting unit. The optical device is for forming one of a specular reflection camera unit and a specular reflection light-emitting unit, wherein the specular reflection camera unit originates from the camera unit, and the specular reflection light-emitting unit originates from the light-emitting unit.01-27-2011
20110018804Operation control device and operation control method - There is provided an operation control device including a motion detection part which detects an object to be detected, which is moved by motion of a user, a motion determination part which determines motion of the object to be detected based on a detection result, a movable region movement processing part which moves a cursor movable region including a cursor operating an object displayed in a display region, and a cursor movement processing part which moves the cursor. Based on motion of a first detected object, the movable region movement processing part moves the cursor movable region along with the cursor in the display region by a first movement unit. Based on motion of a second detected object, the cursor movement processing part moves only the cursor in the cursor movable region by a second movement unit smaller than the first movement unit.01-27-2011
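The two-tier movement scheme above (a coarse unit that drags the cursor-movable region together with the cursor, and a finer unit that moves only the cursor) can be sketched in one dimension. The unit sizes and object labels are illustrative assumptions:

```python
def apply_motion(cursor, region_origin, which_object, steps,
                 coarse_unit=10.0, fine_unit=1.0):
    """1-D sketch: the first detected object moves the movable region
    (and the cursor with it) in coarse steps; the second detected object
    moves only the cursor, in fine steps, inside that region."""
    if which_object == "first":
        region_origin += steps * coarse_unit
        cursor += steps * coarse_unit
    else:
        cursor += steps * fine_unit
    return cursor, region_origin
```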
20110018802Remote Control Device and Multimedia System - A remote control device for a multimedia device includes a housing, a touch pad placed on a plane of the housing and comprising a dedicated touch area, a signal determination unit placed inside the housing and coupled to the touch pad for generating an indication signal corresponding to an output effect when the dedicated touch area receives a touch signal, and a wireless transmitter placed inside the housing and coupled to the signal determination unit for wirelessly transmitting the indication signal to the multimedia device to control the multimedia device to reach the output effect.01-27-2011
20110018801Visual Input/Output Device with Light Shelter - A visual input/output device with a light shield is disclosed. The device includes an active area for performing visual input/output tasks, and a non-active area surrounding the active area in the peripheral area of the device. The light shield is formed in the non-active area for substantially preventing unwanted light from interfering with the active area. The light shield is formed on a same layer as a color filter in the active area, and the light shield is made of black pigment or carbon.01-27-2011
20110018803Spatial, Multi-Modal Control Device For Use With Spatial Operating System - A system comprising an input device includes a detector coupled to a processor. The detector detects an orientation of the input device. The input device has multiple modal orientations corresponding to the orientation. The modal orientations correspond to multiple input modes of a gestural control system. The detector is coupled to the gestural control system and automatically controls selection of an input mode in response to the orientation.01-27-2011
20110109548Systems and methods for motion recognition with minimum delay - Techniques for performing motion recognition with minimum delay are disclosed. A processing unit is provided to receive motion signals from at least one motion sensing device, where the motion signal describes motions made by a user. The processing unit is configured to access a set of prototypes included in a motion recognizer to generate corresponding recognition signals from the motion signals in response to the motion recognizer without considering one or more of the prototypes completely in the motion recognizer. Movements of at least one of the objects in a virtual interactive environment are responsive to the recognition signals such that feedback from the motions to control the one of the objects is immediate and substantially correct no matter how much of the motion signals have been received.05-12-2011
20110115705POINTING DEVICE AND ELECTRONIC APPARATUS - A pointing device includes: a touching surface on which a fingertip is placed; a light emitting diode for illuminating the touching surface from a side opposite to a side where the fingertip is placed; and an imaging element for receiving light reflected from the fingertip, the pointing device further including first light control means for controlling light which is emitted from the light emitting diode and reaches the touching surface so that the light is evenly incident to the touching surface, the light emitting diode emitting light whose light intensity is deviated with respect to a radiation angle, and the first light control means being positioned on a light path from the light emitting diode to the touching surface. This allows providing a pointing device capable of improving decrease in detection accuracy due to deviation in an output from a light source and thus preventing malfunction, and an electronic apparatus including the pointing device.05-19-2011
20090066647GUI APPLICATIONS FOR USE WITH 3D REMOTE CONTROLLER - A remote wand for controlling the operations of a media system is provided. The wand may be operative to control the movement of a cursor displayed on screen by the position and orientation at which the wand is held. As the user moves the wand, the on-screen cursor may move. The user may use the wand to control a plurality of operations and applications that may be available from the media system, including for example zoom operations, a keyboard application, an image application, an illustration application, and a media application.03-12-2009
20110115706Apparatus and method for providing pointer control function in portable terminal - An apparatus and a method for controlling a pointer output at a peripheral device using a portable terminal. In particular, an apparatus and method for controlling a pointer on a peripheral device screen using the portable terminal and shifting an object selected using the pointer to another peripheral device, including a pointer management unit to select information on a pointer to be used and transmit the information to a peripheral device, and determine a motion position of the pointer and provide the motion position to the peripheral device.05-19-2011
20100149096NETWORK MANAGEMENT USING INTERACTION WITH DISPLAY SURFACE - A computing system is provided to make managing the devices and content on a network easier by making the process intuitive, tactile and gestural. The computing system includes a display surface for graphically displaying the devices connected to a network and the content stored on those devices. A sensor is used to recognize activity on the display surface so that gestures may be used to control a device on the network and transport data between devices on the network. Additionally, new devices can be provided access to communicate on the network based on interaction with the display device.06-17-2010
20090033623THREE-DIMENSIONAL VIRTUAL INPUT AND SIMULATION APPARATUS - The present invention relates to a three-dimensional virtual input and simulation apparatus, and more particularly to an apparatus comprising a plurality of point light sources, a plurality of optical positioning devices with a visual axis tracking function, and a control analysis procedure. The invention is characterized in that the plurality of optical positioning devices with the visual axis tracking function are provided for measuring and analyzing 3D movements of the plurality of point light sources to achieve the effect of a virtual input and simulator.02-05-2009
20090033621Inertial Sensor-Based Pointing Device With Removable Transceiver - An inertial sensor-based pointing device 02-05-2009
20110128223METHOD OF AND SYSTEM FOR DETERMINING A HEAD-MOTION/GAZE RELATIONSHIP FOR A USER, AND AN INTERACTIVE DISPLAY SYSTEM - The invention describes a method of determining a head-motion/gaze relationship for a user (06-02-2011
20110128221Computer Display Pointer Device for a Display - According to one embodiment, a computer display pointer device includes an image processor coupled to a display and a video camera. The display is configured to be worn by a user and display a computer image over a portion of the user's field-of-view. The video camera is operable to be worn by the user and boresighted to a field-of-view of the user. The image processor receives a video signal from the video camera that includes an image of a pointer element configured on a hand of the user, determines a position of the pointer element according to the received video image, and moves a cursor on the display according to the position of the pointer element.06-02-2011
20100164866HANDHELD ELECTRONIC DEVICE, CURSOR POSITIONING SUB-SYSTEM AND METHOD EMPLOYING CURSOR SCALING CONTROL - A track ball cursor positioning sub-system is employed by a handheld electronic device including an operating system and a plurality of applications having a plurality of predetermined scaling values. The cursor positioning sub-system includes a track ball cursor positioning device adapted to output a plurality of device pulses, and a track ball cursor resolution controller adapted to repetitively input the device pulses and to responsively output to the operating system a plurality of cursor movement events. The cursor resolution controller is further adapted to be controlled by the operating system or by the applications to learn which one of the applications is active and to automatically scale a number of the cursor movement events for a corresponding number of the device pulses based upon a corresponding one of the predetermined scaling values of the active one of the applications.07-01-2010
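The per-application cursor scaling described above amounts to a lookup of a predetermined scaling value for the active application, applied to the stream of device pulses. The application names and scaling values below are hypothetical:

```python
# Hypothetical per-application scaling values (the patent only states that
# applications carry predetermined scaling values, not what they are).
SCALING = {"browser": 2, "editor": 1, "map": 4}

def cursor_events(device_pulses, active_app, default_scale=1):
    """Scale the number of cursor movement events emitted for a given
    number of track ball device pulses, based on the active application."""
    return device_pulses * SCALING.get(active_app, default_scale)
```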
20100123661SLIDE PRESENTATION SYSTEM AND METHOD OF PERFORMING THE SAME - A slide presentation system and a method of performing the same which are capable of providing a real-time interaction among a conference presenter and attendees are disclosed. When a projector projects at least one slide to map a screened image generated from a host, an image identifying unit identifies a pointer after an image capturing unit images the content expressed on the projected slide. After the pointer is identified, an orienting unit detects a two-dimension coordinate value with reference to where the pointer is pointed on the projected slide, the same as the screened image of the host. Then, the two-dimension coordinate value is transmitted to the host for determining an action of the pointer according to the two-dimension coordinate value with reference to the screened image of the host. By the present invention, the pointer pointing on the projected slide can directly function as a mouse.05-20-2010
20090146952DISPLAY SYSTEM AND LIQUID CRYSTAL DISPLAY DEVICE - A display system includes a liquid crystal display apparatus and an operating device. The operating device can point a desired position on a display screen of the display apparatus and capture an image of the display apparatus to obtain the captured image including the desired position pointed by the operating device. The display system includes an image analyzing section for analyzing the captured image obtained by the capture to find reference light areas in the display apparatus and to detect the desired position on the display screen based on positions of the reference light areas in the captured image. The display apparatus also includes a blinking controlling section which controls blinking conditions of all the backlight units in the display apparatus to differ a blinking condition of two or more reference light area backlight units from that of the other backlight units. This allows properly detecting the desired position on the display screen pointed by the operating device.06-11-2009
20110241991TRACKING OBJECT SELECTION APPARATUS, METHOD, PROGRAM AND CIRCUIT - Provided is a tracking object selection apparatus (10-06-2011
20100039383DISPLAY CONTROL DEVICE, PROGRAM FOR IMPLEMENTING THE DISPLAY CONTROL DEVICE, AND RECORDING MEDIUM CONTAINING THE PROGRAM - A virtual plane including a display screen is divided into small regions. If calculated coordinates of an intersection is located in one of the small regions outside the display screen, an icon corresponding to the small region is displayed at a predetermined position. If coordinates of the intersection are not calculated, an icon corresponding to the small region in which the preceding intersection coordinates were located in is displayed at a predetermined position.02-18-2010
20110241989Remote touch panel using light sensor and remote touch screen apparatus having the same - A remote touch panel includes a plurality of light sensor cells arranged in two dimensions. Each light sensor cell may include a light-sensitive semiconductor layer and first and second electrodes electrically connected to the light-sensitive semiconductor layer. The remote touch panel may be controlled at a remote distance. For example, a large display apparatus can be easily controlled by using a simple light source device, for example, a laser pointer.10-06-2011
20090046063Coordinate positioning system and method with in-the-air positioning function - A coordinate positioning system with in-the-air positioning function includes an illuminator and an image sensor. The illuminator produces a directional light. The image sensor receives the directional light produced by the illuminator and produces an image corresponding to the directional light to accordingly analyze the image and obtain a rotating angle corresponding to the directional light.02-19-2009
20100039382INFORMATION PROCESSING APPARATUS, INPUT APPARATUS, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND PROGRAM - An information processing apparatus, an input apparatus, an information processing system, an information processing method, and a program that are capable of improving operability when a target object is selected on a screen are provided. When a button is pressed while a pointer is indicating an area around an icon on a screen, a control apparatus receives a signal indicating that the button has been pressed and a signal carrying positional information of the pointer at that time, and performs movement control based on those signals such that the pointer indicates the icon. Therefore, even when the pointer is not directly indicating the icon, the icon can be indicated by indicating the area around it, thus improving operability in selecting the icon on the screen with the pointer.02-18-2010
20090322676GUI APPLICATIONS FOR USE WITH 3D REMOTE CONTROLLER - A remote wand for controlling the operations of a media system is provided. The wand may be operative to control the movement of a cursor displayed on screen by the position and orientation at which the wand is held. As the user moves the wand, the on-screen cursor may move. The user may use the wand to control a plurality of operations and applications that may be available from the media system, including for example zoom operations, a keyboard application, an image application, an illustration application, and a media application.12-31-2009
20090322678PRIVATE SCREENS SELF DISTRIBUTING ALONG THE SHOP WINDOW - An interactive method and system include at least one detector.12-31-2009
20100039381ROTATABLE INPUT DEVICE - In an example embodiment, a computer mouse is provided. This computer mouse includes a surface tracking sensor that detects movement of the computer mouse along the support surface. Additionally included are one or more orientation sensors that detect a movement of the computer mouse relative to a pivot point. The computer mouse also includes a controller that is configured to translate the movement along the support surface into a two-dimensional coordinate and to translate the movement relative to the pivot point into a magnitude of rotation.02-18-2010
20100066672METHOD AND APPARATUS FOR MOBILE COMMUNICATION DEVICE OPTICAL USER INTERFACE - An optical user interface is provided in a mobile communication device. Motion of an input mechanism of the mobile communication device is detected via an optical sensor. In response to detection, one or more properties governing the motion are evaluated based on input from a motion sensor. Control information is generated based on evaluation.03-18-2010
20100053082REMOTE CONTROLS FOR ELECTRONIC DISPLAY BOARD - Techniques for interacting with an electronic display board are disclosed. According to one embodiment, a remote controller includes a laser generator, several motion sensors, a Micro Central Unit (MCU) with data-processing capability and internal memory for storing programs, and a transceiver or transmitter. A laser beam from the laser generator facilitates a writing movement on the electronic display board, the motion sensors detect the movement of the remote controller, the MCU processes the sensor data to derive the movement, and the transmitter transmits the movement from the controller to the electronic display board. In accordance with the detected movement, the movement of the remote controller corresponding to the laser is electronically represented on the electronic display board.03-04-2010
20100053083PORTABLE DEVICES AND CONTROLLING METHOD THEREOF - A portable electronic device including a first input/output unit including a monostable display element, a second input/output unit including a bistable display element, a main setting unit configured to selectively set either one of the first and second input/output units as a main input/output unit and the other one of the first and second input/output units as a sub input/output unit, and a conversion unit configured to convert the sub input/output unit into a touch pad for inputting a command on the main input/output unit.03-04-2010
20110095979Real-Time Dynamic Tracking of Bias - A bias value associated with a sensor, e.g., a time-varying, non-zero value which is output from a sensor when it is motionless, is estimated using at least two, different bias estimating techniques. A resultant combined or selected bias estimate may then be used to compensate the biased output of the sensor in, e.g., a 3D pointing device.04-28-2011
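The two-estimator idea in this abstract can be illustrated with a minimal sketch. The stationarity test, the fixed blend weight, and all function names below are illustrative assumptions, not the patent's actual method:

```python
def stationary_bias(samples, threshold=0.05):
    """Estimate bias as the mean of samples whose spread suggests the
    sensor is motionless (a simple stationarity test); return None when
    the device appears to be moving."""
    if max(samples) - min(samples) < threshold:
        return sum(samples) / len(samples)
    return None

def combined_bias(est_a, est_b, weight_a=0.5):
    """Blend two independent bias estimates; fall back to whichever
    estimator produced a value."""
    if est_a is None:
        return est_b
    if est_b is None:
        return est_a
    return weight_a * est_a + (1 - weight_a) * est_b

# Compensate a biased gyro reading with the combined estimate.
readings = [0.101, 0.099, 0.100, 0.102, 0.098]   # rad/s, device at rest
bias = combined_bias(stationary_bias(readings), 0.104)
compensated = 0.100 - bias
```

A real 3D pointing device would run such estimators per axis and update the blend weight from each estimator's confidence rather than fixing it.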
20100127978POINTING DEVICE HOUSED IN A WRITING DEVICE - A method for controlling a pointing icon in a computer, including the steps of (A) establishing a wireless connection between a pointing device and the computer, (B) generating directional information through one or more three dimensional movements of the pointing device, (C) transmitting the directional information from the pointing device to the computer and (D) translating the directional information into movements of the pointing icon on a screen of the computer using a device driver program stored on the computer.05-27-2010
20110074675METHOD AND APPARATUS FOR INITIATING A FEATURE BASED AT LEAST IN PART ON THE TRACKED MOVEMENT - In accordance with an example embodiment of the present invention, an apparatus comprises a camera configured to capture one or more media frames. Further, the apparatus comprises at least one processor and at least one memory including computer program code. The at least one memory and the computer program code are configured to, with the at least one processor, cause the apparatus to perform at least the following: filter the one or more media frames using one or more shaped filter banks; determine a gesture related to the one or more media frames; track movement of the gesture; and initiate a feature based at least in part on the tracked movement.03-31-2011
20110074676Large Depth of Field Navigation Input Devices and Methods - Disclosed are various embodiments of a navigation input device, and methods, systems and components corresponding thereto. According to some embodiments, the navigation input device has a large depth of field associated therewith and employs time- and/or frequency-domain processing algorithms and techniques. The device is capable of providing accurate and reliable information regarding the (X,Y) position of the device on a navigation surface as it is moved laterally thereatop and thereacross, notwithstanding changes in a vertical position of the device that occur during navigation and that do not exceed the depth of field of an imaging lens incorporated therein. According to one embodiment, the navigation input device is a writing instrument that does not require the use of an underlying touch screen, touch pad or active backplane to accurately and reliably record successive (X,Y) positions of the writing device as it is moved across and atop an underlying writing medium such as paper, a pad or a display.03-31-2011
20090027337ENHANCED CAMERA-BASED INPUT - Enhanced camera-based input, in which a detection region surrounding a user is defined in an image of the user within a scene, and a position of an object (such as a hand) within the detection region is detected. Additionally, a control (such as a key of a virtual keyboard) in a user interface is interacted with based on the detected position of the object.01-29-2009
20090027335Free-Space Pointing and Handwriting - A position detection method using one or more one-dimensional image sensors for detecting a light source.01-29-2009
20120146903GESTURE RECOGNITION APPARATUS, GESTURE RECOGNITION METHOD, CONTROL PROGRAM, AND RECORDING MEDIUM - A gesture recognition apparatus has a temperature sensor in which a plurality of infrared sensors are arranged, a change region specifying unit that specifies a change region where a temperature change is generated as a region indicating a hand based on a temperature detected by each infrared sensor of the temperature sensor, and a gesture recognition unit that specifies a movement locus of the change region specified by the change region specifying unit and recognizes a gesture of the hand.06-14-2012
20110069007POINTING DEVICE - A gyroscopic pointing apparatus for an electronic device, particularly a games console, is provided with means to detect when the mobile component is pointing at the screen on which it controls a cursor; this provides a mechanism to correct for drift in gyroscope readings and an improved method of dynamically recalibrating the zero point of the gyroscopes. The pointing detection mechanism may be provided by the combination of an infra-red LED and an infra-red sensor, in either permutation, on the mobile component and the fixed component respectively.03-24-2011
20110043448OPERATION INPUT SYSTEM, CONTROL APPARATUS, HANDHELD APPARATUS, AND OPERATION INPUT METHOD - An operation input system includes a casing and a motion sensor, housed inside the casing, for detecting a movement of the casing; the system calculates a position of the casing in a predetermined space based on an output of the motion sensor. The operation input system includes a position sensor and a correction section. The position sensor directly detects the position of the casing in the predetermined space. The correction section corrects the output of the motion sensor using an output of the position sensor.02-24-2011
20100302151IMAGE DISPLAY DEVICE AND OPERATION METHOD THEREFOR - An image display device and an operation method thereof are provided. The method may include displaying an image on a display, displaying a pointer that moves in correspondence with an operation of a pointing device on the display, and displaying an object for receiving a command or representing image display device-related information on the display when the pointer moves to a predetermined area of the display.12-02-2010
20110254765Remote text input using handwriting - A method for user input includes capturing a sequence of positions of at least a part of a body, including a hand, of a user of a computerized system, independently of any object held by or attached to the hand, while the hand delineates textual characters by moving freely in a 3D space. The positions are processed to extract a trajectory of motion of the hand. Features of the trajectory are analyzed in order to identify the characters delineated by the hand.10-20-2011
20100097318METHODS AND APPARATUSES FOR OPERATING A PORTABLE DEVICE BASED ON AN ACCELEROMETER - Methods and apparatuses for operating a portable device based on an accelerometer are described. According to one embodiment of the invention, an accelerometer attached to a portable device detects a movement of the portable device. In response, a machine executable code is executed within the portable device to perform one or more predetermined user configurable operations. Other methods and apparatuses are also described.04-22-2010
20100097316System and Method for Determining an Attitude of a Device Undergoing Dynamic Acceleration - A system and a method for determining an attitude of a device undergoing dynamic acceleration is presented. A first attitude measurement is calculated based on a magnetic field measurement received from a magnetometer of the device and a first acceleration measurement received from a first accelerometer of the device. A second attitude measurement is calculated based on the magnetic field measurement received from the magnetometer of the device and a second acceleration measurement received from a second accelerometer of the device. A correction factor is calculated based at least in part on a difference of the first attitude measurement and the second attitude measurement. The correction factor is then applied to the first attitude measurement to produce a corrected attitude measurement for the device.04-22-2010
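The correction-factor step in the abstract above can be sketched in a heavily simplified form. This sketch reduces the problem to pitch in two dimensions and omits the magnetometer entirely; the fixed gain and the function names are assumptions for illustration only:

```python
import math

def pitch_from_accel(ax, az):
    """Tilt angle (radians) implied by a single accelerometer's
    sensed gravity vector."""
    return math.atan2(ax, az)

def corrected_pitch(accel1, accel2, gain=0.5):
    """Compute an attitude estimate per accelerometer, treat their
    difference as an error signal (it becomes nonzero under dynamic
    acceleration), and subtract a fraction of it as the correction."""
    a1 = pitch_from_accel(*accel1)
    a2 = pitch_from_accel(*accel2)
    correction = gain * (a1 - a2)
    return a1 - correction
```

When the device is static the two estimates agree and the correction vanishes; under dynamic acceleration the disagreement drives the estimate back toward consensus.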
20120200500INFORMATION PROCESSING PROGRAM - A game system comprises image obtaining means, direction calculation means, first rotation means, and display control means. The image obtaining means obtains an image taken by an imaging device. The direction calculation means calculates a direction originally determined from the image of an imaging target included in the image taken by the imaging device. The first rotation means rotates an operation target in accordance with the calculated direction. The display control means generates an image in accordance with the rotation of the operation target performed by the first rotation means and displays the generated image on a screen of a display device.08-09-2012
20120169596METHOD AND APPARATUS FOR DETECTING A FIXATION POINT BASED ON FACE DETECTION AND IMAGE MEASUREMENT - The present invention provides an apparatus for detecting a fixation point based on face detection and image measurement, comprising: a camera for capturing a face image of a user; a reference table acquiring unit for acquiring a reference table comprising relations between reference face images and line-of-sight directions of the user; and a calculating unit for performing image measurement based on the face image of the user captured by the camera and looking up the reference table in the reference table acquiring unit, so as to calculate the fixation point of the user on the screen.07-05-2012
20080246726Portable electronic device having a pliable or flexible portion - A portable electronic device or handheld computer is disclosed. The device has a housing and computing electronics supported by the housing. A pliable sensor that is supported by the housing is also disclosed. The pliable sensor provides input from the hand of a user by applying pressure to the pliable sensor.10-09-2008
20110175809Tracking Groups Of Users In Motion Capture System - In a motion capture system, a unitary input is provided to an application based on detected movement and/or location of a group of people. Audio information from the group can also be used as an input. The application can provide real-time feedback to the person or group via a display and audio output. The group can control the movement of an avatar in a virtual space based on the movement of each person in the group, such as in a steering or balancing game. To avoid a discontinuous or confusing output by the application, missing data can be generated for a person who is occluded or partially out of the field of view. A wait time can be set for activating a new person and deactivating a currently-active person. The wait time can be adaptive based on a first detected position or a last detected position of the person.07-21-2011
20080291164COORDINATE INPUT APPARATUS, CONTROL METHOD THEREOF, AND PROGRAM - A plurality of sensors for receiving arrival light detect the change ranges of light amount distributions generated upon the pointing operation of a pointer on a coordinate input region. Coordinate values corresponding to the change ranges are calculated on the basis of the number of change ranges in the respective sensors and the number of pen-down signals obtained from the pointer.11-27-2008
20100321294STRETCH OBJECTS - There is disclosed an interactive display system comprising an interactive surface for displaying an image and for receiving inputs from remote devices, the system being adapted to detect the presence of at least two remote devices proximate the interactive surface.12-23-2010
20100321293COMMAND GENERATION METHOD AND COMPUTER USING THE SAME - A command generation method is suitable for a computer. First, a human body image is captured by an image capturing device in a two-dimensional image form. Then, the shape of the human body image is determined by two-dimensional image recognition to obtain a determination result. A command is generated according to the determination result.12-23-2010
20120200499AR GLASSES WITH EVENT, SENSOR, AND USER ACTION BASED CONTROL OF APPLICATIONS RESIDENT ON EXTERNAL DEVICES WITH FEEDBACK - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes event, sensor, and user action based control of applications resident on external devices with feedback.08-09-2012
20080204410RECOGNIZING A MOTION OF A POINTING DEVICE - The present invention is directed toward a system and process that controls a group of networked electronic components using a multimodal integration scheme in which inputs from a speech recognition subsystem, gesture recognition subsystem employing a wireless pointing device and pointing analysis subsystem also employing the pointing device, are combined to determine what component a user wants to control and what control action is desired. In this multimodal integration scheme, the desired action concerning an electronic component is decomposed into a command and a referent pair. The referent can be identified using the pointing device to identify the component by pointing at the component or an object associated with it, by using speech recognition, or both. The command may be specified by pressing a button on the pointing device, by a gesture performed with the pointing device, by a speech recognition event, or by any combination of these inputs.08-28-2008
20080204411RECOGNIZING A MOVEMENT OF A POINTING DEVICE - The present invention is directed toward a system and process that controls a group of networked electronic components using a multimodal integration scheme in which inputs from a speech recognition subsystem, gesture recognition subsystem employing a wireless pointing device and pointing analysis subsystem also employing the pointing device, are combined to determine what component a user wants to control and what control action is desired. In this multimodal integration scheme, the desired action concerning an electronic component is decomposed into a command and a referent pair. The referent can be identified using the pointing device to identify the component by pointing at the component or an object associated with it, by using speech recognition, or both. The command may be specified by pressing a button on the pointing device, by a gesture performed with the pointing device, by a speech recognition event, or by any combination of these inputs.08-28-2008
20110163956Bimanual Gesture Based Input and Device Control System - A user conveys information to a receiving device with a data input tool which uses combinatorial gesture patterns from the cursor or track point of two single point devices. The input method is independent from hardware and language limitations, improves the user's ability to focus on the data stream being entered and reduces the footprint of the data input tool.07-07-2011
20110163955MOTION SENSING AND PROCESSING ON MOBILE DEVICES - Handheld electronic devices including motion sensing and processing. In one aspect, a handheld electronic device includes a set of motion sensors provided on a single sensor wafer, including at least one gyroscope sensing rotational rate of the device around at least three axes and at least one accelerometer sensing gravity and linear acceleration of the device along the at least three axes. Memory stores sensor data derived from the at least one gyroscope and accelerometer, where the sensor data describes movement of the device including a rotation of the device around at least one of the three axes of the device, the rotation causing interaction with the device. The memory is provided on an electronics wafer positioned vertically with respect to the sensor wafer and substantially parallel to the sensor wafer. The electronics wafer is vertically bonded to and electrically connected to the sensor wafer.07-07-2011
20110163954DISPLAY DEVICE AND CONTROL METHOD THEREOF - A display device and a control method thereof are provided. The display device includes a camera obtaining an image and a controller obtaining the direction of a user included in the obtained image and correcting the image such that the direction of the user is synchronized with the photographing direction of the camera. Even when the direction of the user does not correspond to the photographing direction of the camera, an image of the user can be corrected to correctly recognize a user's gesture.07-07-2011
20100283732EASILY DEPLOYABLE INTERACTIVE DIRECT-POINTING SYSTEM AND PRESENTATION CONTROL SYSTEM AND CALIBRATION METHOD THEREFOR - A method for controlling movement of a computer display cursor based on a point-of-aim of a pointing device within an interaction region includes projecting an image of a computer display to create the interaction region. At least one calibration point having a predetermined relation to said interaction region is established. A pointing line is directed to substantially pass through the calibration point while measuring a position of and an orientation of the pointing device. The pointing line has a predetermined relationship to said pointing device. Movement of the cursor is controlled within the interaction region using measurements of the position of and the orientation of the pointing device.11-11-2010
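The core geometric step described above, intersecting the measured pointing line with the projected display, can be sketched as a ray-plane intersection. Placing the screen in the plane z = 0 and the coordinate conventions below are assumptions for illustration, not the patent's calibration procedure:

```python
def point_of_aim(origin, direction, plane_z=0.0):
    """Intersect the pointing line origin + t*direction with the
    screen plane z = plane_z; return the (x, y) hit point, or None
    when the device points parallel to or away from the plane."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None  # pointing parallel to the screen plane
    t = (plane_z - oz) / dz
    if t <= 0:
        return None  # screen is behind the device
    return (ox + t * dx, oy + t * dy)
```

The calibration points mentioned in the abstract would serve to fix the plane's pose relative to the position/orientation sensor so that this intersection can be computed in a common frame.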
20100283730INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING PROGRAM - An information processing apparatus includes: a control detection block configured to detect a control in a predetermined detection space; a position detection block configured to detect a three-dimensional position of the control detected by the control detection block; a threshold value setting block configured, if the control has approached the control detection block beyond a threshold value set at a predetermined distance from the control detection block, on the basis of a three-dimensional position detected by the position detection block, to set the threshold value farther from the control detection block than the predetermined distance; a setting change block configured, if the control has exceeded the threshold value set by the threshold value setting block, to change setting values for predetermined processing; and a processing execution block configured to execute the processing by use of the setting values set by the setting change block.11-11-2010
20110134034Input Device and Method, and Character Input Method - There is provided an input device capable of detecting a motion of a hand of a user.06-09-2011
20110163953Lapdesk with Retractable Touchpad - A lapdesk for use with a laptop computer includes a housing having a top configured to support the laptop computer. The housing is configured to block heat emitted from the laptop computer from passing through the housing. The lapdesk further includes a tray having a touchpad disposed thereon. The tray is configured to slide into the housing and slide out from the housing. The lapdesk further includes a circuit coupled to the touchpad where the circuit is configured to transmit control signals from the touchpad to the laptop computer.07-07-2011
20120146904APPARATUS AND METHOD FOR CONTROLLING PROJECTION IMAGE - Disclosed are an apparatus and a method related to a user interface that allows a user to easily use services provided by equipment when output images of video equipment are projected onto an external screen by a projector. According to exemplary embodiments of the present invention, a human body may be used as a pointer to form a shadow, and the shadow may be analyzed to control the projection image. Therefore, the projection image can be controlled without using an interface device, which can provide an easy and intuitive user interface to a user.06-14-2012
20110134035Transmitting Apparatus, Display Apparatus, and Remote Signal Input System - Disclosed is a remote signal input system. The remote signal input system comprises a transmitting device for generating signal light and a display panel comprising a plurality of sensors for sensing the signal light. Location signals are input into a display device having the display panel by using the signal light generated from the transmitting device.06-09-2011
20110095978REMOTE CONTROL - In a method for controlling objects, objects to be controlled are arranged in a real space. Said real space is linked to a multi-dimensional representational space by a transformation rule. Representations in the representational space are associated with the controllable objects of the real space by a mapping. Said method comprises steps of determining the position and orientation of a pointer in the real space, determining the position and orientation of a pointer representation associated with the pointer in the representational space using the position and orientation of the pointer in the real space and the transformation rule between the real space and the representational space, determining the representations in the representational space that are intersected by the pointer representation, selecting a representation that is intersected by the pointer representation, and controlling the object in the real space that is associated with the pointer representation in the representational space.04-28-2011
20110095980HANDHELD VISION BASED ABSOLUTE POINTING SYSTEM - A method is described that involves identifying one or more images of respective one or more fixed markers. Each marker is positioned on or proximate to a display. The images appear on a pixelated sensor within a handheld device. The method also involves determining a location on, or proximate to, the display where the handheld device was pointed during the identifying. The method also involves sending from the handheld device information derived from the identifying of the one or more images of respective one or more fixed markers. The method also involves triggering action taken by electronic equipment circuitry in response to the handheld device's sending of a signal to indicate the action is desired.04-28-2011
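The marker-to-screen step above can be sketched under a simple linear-optics assumption: when the device is aimed at the marker, the marker's image sits at the sensor centre, and each pixel of offset corresponds to a fixed screen displacement. The scale factor, sign convention, and function name are hypothetical:

```python
def pointed_location(marker_px, sensor_center, px_per_screen_unit):
    """Map the fixed marker's pixel position on the handheld sensor to
    the screen location the device is aimed at.  An offset of the
    marker image to one side of the sensor means the device is aimed
    the opposite way, hence the negation."""
    dx = marker_px[0] - sensor_center[0]
    dy = marker_px[1] - sensor_center[1]
    return (-dx / px_per_screen_unit, -dy / px_per_screen_unit)
```

A production system would replace the single scale factor with a full homography estimated from two or more markers, which also absorbs device roll and perspective distortion.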
20110095977INTERACTIVE INPUT SYSTEM INCORPORATING MULTI-ANGLE REFLECTING STRUCTURE - An interactive input system comprises at least one imaging device having a field of view looking into a region of interest. At least one radiation source emits radiation into the region of interest. A bezel at least partially surrounds the region of interest. The bezel comprises a multi-angle reflecting structure to reflect emitted radiation from the at least one radiation source towards the at least one imaging device.04-28-2011
20100214215INPUT DEVICE FOR USE WITH A DISPLAY SYSTEM - Methods and systems for using an input device with a display system are disclosed. In an example embodiment, a projection device configured to project a displayed image is provided, where the displayed image includes one or more selectable items. The projection system further includes an input device which may be movable in free space and may be configured to point to the selectable items. The input device may be enabled to provide a double-click input to effect one or more changes in a graphical user interface that correspond with a selection of a particular one of the selectable items at which the input device is pointed. The double-click input may be identified such that movement of the input device after initiation of the double-click input may be ignored by the graphical user interface until completion of the double-click input.08-26-2010
20100214214REMOTE INPUT DEVICE - An input device providing users with a pointing capability includes a sender portion.08-26-2010
20100127979Input device - Provided is an input device that may sense a touch and a motion, generate a sensing signal with respect to the sensed touch and motion, generate an input signal based on the sensing signal, and transmit the input signal to a display device. The display device may control an object displayed on the display device based on the input signal.05-27-2010
20080218475METHOD AND APPARATUS FOR DYNAMICALLY MODIFYING WEB PAGE DISPLAY FOR MOBILE DEVICES - A method of dynamically modifying web page displays used in various mobile devices. The method uses a motion detection mechanism to detect whether the mobile device is moving or in motion and then modifies web page displays sent to the device based upon the sensor readings. As such, the method, system, and apparatus are capable of automatically modifying a display provided to a mobile device based upon a determination that the user and/or device are moving and/or in motion. In another aspect, the method, system, and apparatus are also capable of modifying the complexity of a display provided to a mobile device based upon the degree of movement and/or motion by the user and/or device.09-11-2008
20100066673Laser pointer capable of detecting a gesture associated therewith and representing the gesture with a function - Described is a laser pointer capable of detecting a gesture associated therewith and representing the gesture with a function, comprising a light emitting unit for emitting a laser light; a detection unit for detecting a set of accelerations composed of at least an acceleration of the gesture and outputting an acceleration signal set; a control unit for receiving and outputting the acceleration signal set or outputting a processed acceleration signal set; and a communications unit for receiving and transmitting the acceleration signal set or the processed acceleration signal set through a communications channel to be received by an electronic device in order to determine the gesture and perform the corresponding function according to the acceleration signal set or the processed acceleration signal set.03-18-2010
20100066674Cursor controlling apparatus and the method therefor - The invention relates to a cursor controlling apparatus for controlling a cursor according to input information, comprising an input module, a processing module and a controlling module. The input module is used for generating the input information. The processing module, coupled to the input module, is used for generating processing data according to the input information. The controlling module is coupled to the processing module for comparing the processing data with a predetermined value to generate a comparison result, and further generating a first control signal or a second control signal corresponding to the processing data and the comparison result to control the cursor.03-18-2010
20100194687REMOTE INPUT DEVICE - An input device providing users with a pointing capability includes a sender portion.08-05-2010
20090267897REMOTE CONTROL TRANSMITTER - A remote control transmitter detects motion in a specific direction or in a rotational direction around a specific axis. The transmitter includes a battery placed on the bottom-surface side within a case containing the circuitry of the transmitter. The bottom-surface side has a convex surface whose center of curvature C lies directly above the center of gravity G of the transmitter, in the direction opposite to the force of gravity. When placed on a flat surface, the transmitter assumes a stable orientation, so that when it is grasped in order to perform a motion-based operation, the direction of the line joining the center of curvature C to the center of gravity G can be assumed to be vertical, serving as an absolute reference direction for detecting the motion operation.10-29-2009
20110187642Interaction Terminal - Embodiments of the present invention are directed to systems, apparatuses and methods for using a mobile device with an accelerometer to conduct a financial transaction by making contact with an interaction terminal, thereby generating interaction data that is representative of the physical contact between the mobile device and the interaction terminal. The mobile device may be a mobile phone. The interaction terminal may be a point of sale terminal, access point device, or any other stationary (i.e., in a fixed position) device positioned at a line, door, gate, or entrance. A mobile device with an accelerometer physically contacts the interaction terminal. The interaction terminal flexes, recoils, or moves and generates interaction data (e.g., accelerometer, location, time data, etc.) representative of the physical interaction between the mobile device and the interaction terminal. A server computer determines, based on interaction data, that the mobile device and the interaction terminal made physical contact. After determining that the mobile device and the interaction terminal made contact, communication may be initiated between the devices. Communications may relate to processing a payment transaction using a payment processing network.08-04-2011
20110148763OPERATION INPUT DEVICE AND METHOD, PROGRAM, AND ELECTRONIC APPARATUS - An operation input device includes: angular velocity detecting means for detecting an angular velocity; relative velocity detecting means for contactlessly detecting a relative velocity to a target object; distance detecting means for detecting a distance to the target object; and computing means for computing an amount of movement based on the angular velocity, the relative velocity, and the distance.06-23-2011
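The abstract above names the three inputs (angular velocity, relative velocity, distance) but not the formula. A minimal sketch of how such a computation might combine them, assuming the angular term contributes a tangential component v = ω·r (the formula and all names here are hypothetical, not from the patent):

```python
def movement_amount(angular_velocity, relative_velocity, distance, dt):
    """Hypothetical sketch: combine the tangential velocity implied by
    rotating at `angular_velocity` (rad/s) at `distance` (m) from the
    target with the contactlessly measured `relative_velocity` (m/s),
    integrated over one sampling interval `dt` (s)."""
    tangential = angular_velocity * distance  # arc-length rate, v = w * r
    return (tangential + relative_velocity) * dt

# Rotating at 2 rad/s half a meter from the target, drifting 0.1 m/s.
amount = movement_amount(2.0, 0.1, 0.5, 0.01)
```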
20110148762SYSTEM AND METHOD FOR MULTI-MODE COMMAND INPUT - A controlling device has a moveable touch sensitive panel positioned above a plurality of switches. When the controlling device senses an activation of at least one of the plurality of switches caused by a movement of the touch sensitive panel resulting from an input at an input location upon the touch sensitive surface, the controlling device responds by transmitting a signal to an appliance, wherein the signal is reflective of the input location upon the touch sensitive surface.06-23-2011
20110187643USER INTERFACE SYSTEM BASED ON POINTING DEVICE - The user interaction system comprises a portable pointing device (08-04-2011
20090174658SYSTEM AND METHOD OF ADJUSTING VIEWING ANGLE FOR DISPLAY BASED ON VIEWER POSITIONS AND LIGHTING CONDITIONS - A method for adjusting a viewing angle of a display, includes determining a location of one or more viewers and determining lighting conditions. Additionally, the method includes calculating an optimal viewing position of the display based on the location of the one or more viewers and the lighting conditions and adjusting the display based on the optimal viewing position.07-09-2009
20100171699AUTOMATIC ORIENTATION-BASED USER INTERFACE FOR AN AMBIGUOUS HANDHELD DEVICE - An electronic device is provided that includes a user-interface feature, a detection mechanism and one or more internal components. The user-interface feature is configurable to have a selected orientation about one or more axes. The detection mechanism can detect orientation information about the electronic device. The one or more components may select the orientation of the user-interface feature based on the detected orientation information.07-08-2010
20100066675Compact Interactive Tabletop With Projection-Vision - The subject application relates to a system(s) and/or methodology that facilitate vision-based projection of any image (still or moving) onto any surface. In particular, a front-projected computer vision-based interactive surface system is provided which uses a new commercially available projection technology to obtain a compact, self-contained form factor. The subject configuration addresses installation, calibration, and portability issues that are primary concerns in most vision-based table systems. The subject application also relates to determining whether an object is touching or hovering over an interactive surface based on an analysis of a shadow image.03-18-2010
20110074677Methods for Determining a Cursor Position from a Finger Contact with a Touch Screen Display - A portable device with a touch screen display detects a contact area of a finger with the touch screen display and then determines a first position associated with the contact area. The cursor position of the finger contact is determined, at least in part, based on: the first position, one or more distances between the first position and one or more of the user interface objects; and one or more activation susceptibility numbers, each associated with a respective user interface object in the plurality of user interface objects. If the cursor position falls into the hidden hit region of a virtual push button on the touch screen display, the portable device is activated to perform operations associated with the virtual push button.03-31-2011
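As an illustration of the activation-susceptibility idea described above, a minimal sketch could weight each object's distance from the first touch position by its susceptibility number and pick the best score (the field names and the exact scoring rule are assumptions, not taken from the patent):

```python
def pick_target(touch_pos, objects):
    """Hypothetical sketch: choose the UI object whose distance from the
    touch position, divided by its activation susceptibility number, is
    smallest.  A larger susceptibility makes an object easier to hit."""
    def score(obj):
        dx = touch_pos[0] - obj["x"]
        dy = touch_pos[1] - obj["y"]
        dist = (dx * dx + dy * dy) ** 0.5
        return dist / obj["susceptibility"]
    return min(objects, key=score)

objects = [
    {"name": "delete", "x": 10, "y": 0, "susceptibility": 0.5},  # hard to hit
    {"name": "play",   "x": 14, "y": 0, "susceptibility": 2.0},  # easy to hit
]
# A touch at (11, 0) is nearer "delete", but "play" wins on weighted score.
chosen = pick_target((11, 0), objects)
```

This captures why a destructive button can be given a small susceptibility number: it only wins when the touch lands almost exactly on it.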
20090146950METHOD AND SYSTEM OF VISUALISATION, PROCESSING, AND INTEGRATED ANALYSIS OF MEDICAL IMAGES - The present invention concerns a method and a system for managing a station for the visualisation, processing, and analysis of medical images based on non-manual commands, particularly optical and vocal commands, and able to provide feedback to the user to direct the further exploration of the medical images.06-11-2009
20120146906PROMOTABLE INTELLIGENT DISPLAY DEVICE AND PROMOTING METHOD THEREOF - A promotable intelligent display device includes: an image output means outputting character and promotion images; a camera taking pictures of people; an image recognition means receiving the taken image and outputting contextual information, including the position, moving direction, distance, number, and size of the people; a controlling means specifying a promotion target at time intervals based on the contextual information, determining driving information on the direction and speed of rotation of the image output means based on the information on the moving direction and position of the promotion target, and determining promotion activities by the distance, number, and size of the promotion target; a voice output means outputting promotion and character voices based on the promotion activities; and a driving means connected with the image output means by a connection shaft to enable the rotation of the image output means by controlling the shaft based on the driving information and promotion activities.06-14-2012
20120146905TILT DIRECTION DETECTOR FOR ORIENTING DISPLAY INFORMATION - An electronic apparatus having a display function is able to alter the orientation of an image displayed on a display means for displaying images between a first orientation and a second orientation different from the first orientation. A plurality of operating means are provided at positions symmetrical between disposal positions which take the first orientation as a standard orientation and disposal positions which take the second orientation as a standard orientation.06-14-2012
20120146902ORIENTING THE POSITION OF A SENSOR - Techniques are provided for re-orienting a field of view of a depth camera having one or more sensors. The depth camera may have one or more sensors for generating a depth image and may also have an RGB camera. In some embodiments, the field of view is re-oriented based on the depth image. The position of the sensor(s) may be altered to change the field of view automatically based on an analysis of objects in the depth image. The re-orientation process may be repeated until a desired orientation of the sensor is determined. Input from the RGB camera might be used to validate a final orientation of the depth camera, but is not required during the process of determining a new possible orientation of the field of view.06-14-2012
20120306745MOTION PATTERN CLASSIFICATION AND GESTURE RECOGNITION - Methods, program products, and systems for gesture classification and recognition are disclosed. In general, in one aspect, a system can determine multiple motion patterns for a same user action (e.g., picking up a mobile device from a table) from empirical training data. The system can collect the training data from one or more mobile devices. The training data can include multiple series of motion sensor readings for a specified gesture. Each series of motion sensor readings can correspond to a particular way a user performs the gesture. Using clustering techniques, the system can extract one or more motion patterns from the training data. The system can send the motion patterns to mobile devices as prototypes for gesture recognition.12-06-2012
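A toy sketch of the prototype-matching stage described above. The clustering step is replaced here by simple averaging of pre-grouped traces, and recognition is nearest-prototype by Euclidean distance; all names and data are hypothetical, not from the patent:

```python
def make_prototype(series_list):
    """Average several equal-length sensor traces into one prototype."""
    n = len(series_list)
    return [sum(vals) / n for vals in zip(*series_list)]

def classify(reading, prototypes):
    """Return the gesture label whose prototype is nearest (Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(prototypes, key=lambda label: dist(reading, prototypes[label]))

# Two ways of "picking up the device" and two "shakes" -- toy 4-sample traces.
prototypes = {
    "pickup": make_prototype([[0, 1, 2, 3], [0, 1, 3, 4]]),
    "shake":  make_prototype([[0, 5, -5, 5], [0, 4, -4, 4]]),
}
label = classify([0, 1, 2, 4], prototypes)
```

A real implementation would cluster many traces per gesture (so that several distinct motion patterns for the same user action each get their own prototype), but the recognition step against the resulting prototypes has the same shape.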
20120306746LENS ACCESSORY FOR VIDEO GAME SENSOR DEVICE - A lens accessory for a video game sensor device and a method of adjusting a sensing distance of a video game sensor device. A lens accessory for a video game sensor device includes a first lens configured to cover an infrared light emitter of the video game sensor device, a second lens configured to cover an infrared light receiver of the video game sensor device, and a body portion coupling the first lens and the second lens together, the body portion being removably attachable to the video game sensor device, and the first lens and the second lens having a magnification for adjusting a sensing distance of the video game sensor device.12-06-2012
20110304541METHOD AND SYSTEM FOR DETECTING GESTURES - A method and system for detecting user interface gestures that includes obtaining an image from an imaging unit; identifying object search area of the images; detecting at least a first gesture object in the search area of an image of a first instance; detecting at least a second gesture object in the search area of an image of at least a second instance; and determining an input gesture from an occurrence of the first gesture object and the at least second gesture object.12-15-2011
20110304537AUTO-CORRECTION FOR MOBILE RECEIVER WITH POINTING TECHNOLOGY - A mobile station and an unattached work area are used with an electronic pen, which includes a transmitter, such as an acoustic transmitter. The mobile station includes a receiver that receives signals from the transmitter and orientation sensors that detect movement of the mobile station. The position of the receiver is calibrated with respect to the unattached work area. Data from the orientation sensors is received when the mobile station, and thus the receiver, is moved with respect to the work area. A transformation matrix is generated based on the data from the orientation sensors, which can be used to correct for the movement of the receiver. The position of the transmitter in the electronic pen is calculated and mapped based on received signals and the transformation matrix, and the mapped position is then displayed.12-15-2011
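The abstract does not specify the transformation matrix, but a minimal 2-D sketch of the idea is undoing a sensed rotation of the receiver before mapping the pen position. The reduction to a single yaw angle and all names are assumptions for illustration:

```python
import math

def rotation_matrix_2d(theta):
    """2-D rotation matrix for a yaw of `theta` radians."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def correct_position(raw_xy, theta):
    """Hypothetical sketch: undo a yaw of `theta` reported by the
    orientation sensors by applying the inverse (negative) rotation."""
    m = rotation_matrix_2d(-theta)
    x, y = raw_xy
    return (m[0][0] * x + m[0][1] * y, m[1][0] * x + m[1][1] * y)

# Receiver rotated 90 degrees; a pen heard at (0, 1) is really at (1, 0).
x, y = correct_position((0.0, 1.0), math.pi / 2)
```

In three dimensions the same correction would use a full 3x3 rotation matrix built from all of the orientation-sensor axes.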
20100283731METHOD AND APPARATUS FOR PROVIDING A HAPTIC FEEDBACK SHAPE-CHANGING DISPLAY - A haptic device includes a processor, a communication module coupled to the processor for receiving a shape input, and a housing for housing the communication module and including a deformable portion. The deformable portion includes a deformation actuator, and the processor provides a signal to the deformation actuator in response to the shape input to deform the housing. The shape of other areas of the device may also change in response to the signal. The shape changes may provide haptic effects, provide information, provide ergonomic changes, provide additional functionality, etc., to a user of the device.11-11-2010
20120038552UTILIZATION OF INTERACTIVE DEVICE-ADJACENT AMBIENTLY DISPLAYED IMAGES - A telecommunication device configured to project an ambiently displayed image at a location proximate to the telecommunication device on a surface that is substantially parallel to a plane formed by a body of the telecommunication device is described herein. The telecommunication device is further configured to detect an interaction with the ambiently displayed image and perform an action based on the detected interaction.02-16-2012
20120038553THREE-DIMENSIONAL VIRTUAL INPUT AND SIMULATION APPARATUS - The present invention relates to a three-dimensional virtual input and simulation apparatus, and more particularly to an apparatus comprising a plurality of point light sources, a plurality of optical positioning devices with a visual axis tracking function, and a control analysis procedure. The invention is characterized in that the plurality of optical positioning devices with the visual axis tracking function are provided for measuring and analyzing 3D movements of the plurality of point light sources to achieve the effect of a virtual input and simulator.02-16-2012
20110050570Orientation-Sensitive Signal Output - Orientation-sensitive signal output, in which a neutral position of a device is automatically determined in relation to at least a first axis, an angular displacement of the device is measured about at least the first axis, and shaking of the device is detected. A selection of the first control is received, and an output signal is output based at least upon the selection and the angular displacement or based upon detecting the shaking of the device.03-03-2011
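A minimal sketch of the selection-plus-displacement-or-shake output described above, using sample variance as the shake detector (the threshold, the variance criterion, and all names are hypothetical):

```python
def is_shaking(samples, threshold=3.0):
    """Hypothetical sketch: flag a shake when the spread (variance) of
    recent angular-displacement samples exceeds a threshold."""
    mean = sum(samples) / len(samples)
    variance = sum((s - mean) ** 2 for s in samples) / len(samples)
    return variance > threshold

def output_signal(selection, angular_displacement, samples):
    """Emit a signal based on the selected control plus either the
    angular displacement or a detected shake, as the abstract describes."""
    if is_shaking(samples):
        return (selection, "shake")
    return (selection, angular_displacement)

steady = output_signal("ctrl1", 12.5, [0.1, -0.2, 0.1, 0.0])   # small spread
shaken = output_signal("ctrl1", 12.5, [8.0, -7.5, 9.0, -8.5])  # large spread
```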
20110050569Motion Controlled Remote Controller - A handheld device includes a display having a viewable surface and operable to generate an image indicating a currently controlled remote device and a gesture database maintaining a plurality of remote command gestures. Each remote command gesture is defined by a motion of the device with respect to a first position of the handheld device. The device includes a gesture mapping database comprising a mapping of each of the remote command gestures to an associated command for controlling operation of the remote device and a motion detection module operable to detect motion of the handheld device within three dimensions and to identify components of the motion in relation to the viewable surface. The device includes a control module operable to track movement of the handheld device using the motion detection module, to compare the tracked movement against the remote command gestures to determine a matching gesture, and to identify the one of the commands corresponding to the matching gesture. The device also includes a wireless interface operable to transmit the identified command to a remote receiver for delivery to the remote device.03-03-2011
20090027338Gestural Generation, Sequencing and Recording of Music on Mobile Devices - System and methods for an application that allows users to interactively create, transform and play music using cell phones, iPhones™ and other enabled mobile communication devices communicating with a remote host are disclosed. Using an enabled mobile communication device, users are able to strike the mobile device like a drum to create and record rhythms, enter melodies using the keypads, add voice recordings, and manipulate musical tracks by tilting the mobile device continuously in three dimensions. The musical input is sequenced in multiple tracks and the transformative manipulations are applied in real time, allowing users to create their songs in an expressive motion-based manner.01-29-2009
20120176312POINTING DEVICE USING CAPACITIVE SENSING - A pointing device includes: an operator supported above a substrate to be horizontally movable from an initial position; a conductive layer formed on the substrate and formed with conductive patterns spaced apart from each other to form a ring-shaped structure that is disposed coaxially around the operator when the operator is at the initial position; a conductor plate spaced apart from the conductive patterns, mounted coaxially on a bottom end of the operator, and having a diameter larger than an inner diameter of the ring-shaped structure; and a control unit providing a corresponding electrical signal to each conductive pattern during movement of the operator to measure a capacitance generated between each conductive pattern and the conductor plate so as to generate a sensing output corresponding to the measured capacitances, and generating a pointer signal corresponding to the movement of the operator based on the sensing output and predetermined capacitance information.07-12-2012
20120176311Optical pointing system and method - Optical pointing systems, devices and methods are provided wherein a selected area on a surface is illuminated with a spot of light generated by an optical pointer, the spot being substantially invisible to unassisted human vision. The spot of light is detected and its position determined via an optical sensor, and a visible marker representing the selected area is provided on the surface, under control of an electronic interface. Surfaces of physical objects as well as displayed images are accommodated, and systems, devices and methods are provided for independent operation, as well as for integrated operation with electronic display and presentation systems.07-12-2012
20120176313DISPLAY APPARATUS AND VOICE CONTROL METHOD THEREOF - A voice-controllable display apparatus is provided. The display apparatus includes a display unit which displays a plurality of icons on a screen, a control unit which controls the display unit to display identifiers corresponding to the plurality of icons, the identifiers being assigned to the plurality of icons based on a preset standard when voice recognition starts, and the identifiers being different from each other, and a voice input unit which receives a voice input. If a voice input for an arbitrary identifier is received through the voice input unit, the control unit selects the icon to which that identifier is assigned. Thereby, effective voice control of the apparatus is achieved.07-12-2012
20110316778POINTING DEVICE, CONTROLLING METHOD OF THE SAME, GLASSES FOR 3D IMAGE, AND DISPLAY APPARATUS - A pointing device includes: a communication unit which performs communication with glasses for viewing a 3D image and a display apparatus; and a control unit which controls the communication unit to perform communication with the glasses, detects a reference distance between the glasses and the pointing device and a measurement distance between the glasses and the pointing device, and determines depth information for 3D pointing based on the reference distance and the measurement distance.12-29-2011
20110316777ELECTRONIC DEVICE PROVIDING REGULATION OF BACKLIGHT BRIGHTNESS AND METHOD THEREOF - An electronic device includes a display module, a function key, a motion sensor, a determination module, and a regulation module. The motion sensor is operable to acquire a coordinate of the electronic device. The determination module is operable to determine whether the electronic device is in the upright position based on the coordinate. The regulation module regulates backlight brightness of the display module when the electronic device is in the upright position, the function key is operative, and a specific application is not executed.12-29-2011
20110316776INPUT DEVICE WITH PHOTODETECTOR PAIRS - Input devices configured to provide user interface by detecting three dimensional movement of an external object are disclosed. The input device comprises at least two photodetector pairs, a radiation source and a circuit configurable to detect differential and common mode signals generated in the photodetector pairs. By detecting the common mode and differential signals, movement of an external object may be determined and used to control a pointer, or a cursor.12-29-2011
20120044142DISPLAY HAVING AUTO-OFF FUNCTION BY PERSON INFRARED SENSING - A display having an auto-off function by person infrared sensing includes a display unit, an infrared sensor module, a control unit and a power supply unit. The infrared sensor module senses whether or not a person is within a zone surrounding the display, and generates a sensing trigger signal when sensing no person within the zone. A processor of the control unit stops outputting a control signal when it continues to receive the sensing trigger signal for a predetermined period of time. The power supply unit supplies power to the control unit, or to the display unit and the control unit, when receiving the control signal, and enters a DC off mode to stop supplying power when not receiving the control signal. The present invention can automatically turn off the display when sensing no person within the zone for the predetermined period of time, so as to achieve power saving.02-23-2012
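The timeout logic described above can be sketched in a few lines (the 60-second timeout, the event-timestamp representation, and the function name are hypothetical):

```python
def display_on(sensed_person_times, now, timeout=60.0):
    """Hypothetical sketch: keep the control signal (and thus the
    display) on only if a person was sensed within `timeout` seconds;
    with no sightings at all, the display stays off."""
    if not sensed_person_times:
        return False
    return (now - max(sensed_person_times)) < timeout

on_recent = display_on([0.0, 30.0], now=50.0)    # last seen 20 s ago
on_stale = display_on([0.0, 30.0], now=120.0)    # last seen 90 s ago
```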
20110057880INFORMATION DISPLAY APPARATUS, INFORMATION DISPLAY METHOD AND PROGRAM - An information display apparatus including: a tilt detection unit that detects a basic position of a casing and detects a tilt from the basic position of the casing; a display unit that is mounted on the casing and displays information on a display screen; a touch detection unit that is mounted on the casing and detects a touch of an operating body on the casing; and a control unit that after movement of information displayed on the display screen of the display unit is started based on the tilt of the casing detected by the tilt detection unit and when a touch of an operating body is detected by the touch detection unit, stops the movement of the information displayed on the display screen.03-10-2011
20110057879Image Projection System with Adjustable Cursor Brightness - An image projection system is disclosed. The system comprises a projector, a user input device and a computing device. The micro-mirror based projector projects an image including a cursor on a display plane. The present invention discloses various embodiments for the system and method of adjusting the brightness of the cursor to improve effects of a presentation. According to one embodiment, the brightness of the cursor is adjusted by modifying the on/off time ratio of the mirrors by which the cursor is formed. According to another embodiment, the projector comprises a first and a second micro-mirror array. The second array is dedicated for projecting the cursor image. The brightness of the cursor may be adjusted by changing the number of micro-mirrors by which the cursor is formed from the second array.03-10-2011
20110157017PORTABLE DATA PROCESSING APPARATUS - A portable data processing apparatus is provided. It has at least one data processing function which depends on detected motion of the apparatus. The apparatus comprises a video camera operable to capture successive images of a part of the real environment around the apparatus; a video motion detector operable to detect motion of the apparatus by analysis of image motion between pairs of captured images; a hardware motion detector operable to detect motion of the apparatus, whereby the data processing function depends on motion detected by the hardware motion detector; and a controller operable to adjust the operation of the hardware motion detector if the motion detected by the video motion detector and the motion detected by the hardware motion detector differ by at least a threshold difference.06-30-2011
20110157015METHOD OF GENERATING MULTI-TOUCH SIGNAL, DONGLE FOR GENERATING MULTI-TOUCH SIGNAL, AND RELATED CONTROL SYSTEM - A dongle includes: a receiver, for receiving a control signal which is not generated from a multi-touch panel; a processing unit, coupled to the receiver, for generating a multi-touch output signal corresponding to a multi-touch event according to the control signal; and a data port, coupled to the processing unit, for outputting the multi-touch output signal.06-30-2011
20110157016GESTURE RECOGNITION INPUT DEVICE - A gesture recognition based input device includes a number of finger wear components, a number of image capture modules, and an image recognition module. Each finger wear component dedicatedly reflects light of a unique wavelength. Each image capture module dedicatedly picks up light reflected by a corresponding finger wear component and thereby dedicatedly captures images of the corresponding finger wear component. The image recognition module recognizes movements of the finger wear components from the images and interprets the movements of the finger wear components into control signals.06-30-2011
20120007804INTERACTIVE INPUT SYSTEM AND METHOD - A method of resolving ambiguities between at least two pointers within a region of interest comprises capturing images of the region of interest and at least one reflection thereof from different vantages using a plurality of imaging devices, processing image data to identify a plurality of targets for the at least two pointers, for each image, determining a state for each target and assigning a weight to the image data based on the state, and calculating a pointer location for each of the at least two pointers based on the weighted image data.01-12-2012
20120056809HANDHELD PORTABLE POINTING APPARATUS - A portable pointing apparatus utilized with a computing device such as a personal computer, laptop computer, and/or an Internet connected television. The pointing apparatus generally comprises a spin ball, a right click button, a left click button, and a hold button for performing various cursor movements and cursor operations on a display of the computing device. The hold button can be utilized to highlight data displayed on the computing device. The apparatus also includes a laser pointer for pointing out important data on the display. A tracking device generates a movement signal based on a movement of the pointing apparatus. The computing device receives a movement signal from a tracking system and controls the movement of a cursor displayed via the computing device. The pointing apparatus can be operated from any convenient location and can be of any shape or form for user comfort.03-08-2012
20120056808EVENT TRIGGERING METHOD, SYSTEM, AND COMPUTER PROGRAM PRODUCT - An event triggering method, system, and computer program product are provided, in which an optical spot trajectory tracking is utilized. Firstly, a cursor at a position on the surface of a screen is moved to a preset position through a first optical spot, then, the projecting of the first optical spot is stopped. Next, it is determined whether the preset position is a preset position of a command. Then, a second optical spot is projected to generate a light-triggered signal, and then a processing unit executes the command according to the light-triggered signal.03-08-2012
20120056807POSITION SENSING SYSTEMS FOR USE IN TOUCH SCREENS AND PRISMATIC FILM USED THEREIN - A dual light-source position detecting system for use in touch screens is provided that utilizes parallax to determine the position of an interposing object, and a prismatic film that is brightly retroreflective over a broad entrance angle to enhance the accuracy of the parallax determination of position. The position detecting system includes at least one camera positioned to receive light radiation traversing a detection area and to generate a signal representative of an image; two spaced-apart sources of light radiation, which may be LEDs or IR emitters, positioned adjacent to the camera for outputting light radiation that overlaps over at least a portion of a detection area; and a prismatic film positioned along a periphery of at least a portion of the detection area that retroreflects the light radiation from the two sources to the camera. The prismatic film includes a plurality of triangular cube corner retroreflective elements having dihedral angle errors e03-08-2012
20110080340System And Method For Remote Control Of A Computer - A remote control system for a computer, and a corresponding method include a Web camera having an image capture unit, the image capture unit including one or more devices capable of receiving imagery from multiple, distinct sections of the electromagnetic spectrum; a detection and separation module capable of detecting and separating the imagery into at least one signal capable of cursor control, wherein the signal capable of cursor control is generated by a remote control device; and a processing unit that receives the signal capable of cursor control and generates one or more cursor control signals, wherein the one or more cursor control signals include signals indicative of movement of the remote control device, the movement capable of translation to movement of a cursor displayed on a display of the computer.04-07-2011
20120206352IMAGE-CAPTURING DEVICE FOR OPTICAL POINTING APPARATUS - An image-capturing device configured for an optical pointing apparatus includes a plurality of image-sensing units arranged adjacently. The plurality of image-sensing units are configured to sense images of a surface and generate sensing signals that are used for evaluating the velocity of the optical pointing apparatus. The image-capturing device is configured to use different image-sensing units arranged differently to sense the surface according to the velocity of the optical pointing apparatus. When the optical pointing apparatus moves at a first velocity, the image-capturing device uses the image-sensing units configured to occupy a smaller area to sense the surface. When the optical pointing apparatus moves at a second velocity, the image-capturing device uses the image-sensing units configured to occupy a larger area to sense the surface. The first velocity is lower than the second velocity.08-16-2012
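A minimal sketch of the velocity-based selection described above (the threshold value and the array names are hypothetical; the patent only says a smaller sensing area is used at the lower velocity):

```python
def sensing_area(velocity, threshold=0.2):
    """Hypothetical sketch: a slowly moving pointing apparatus uses the
    smaller block of image-sensing units; a fast one uses the larger
    block, trading resolution for coverage."""
    return "small_array" if velocity < threshold else "large_array"

slow = sensing_area(0.05)  # below threshold
fast = sensing_area(1.50)  # above threshold
```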
20120154277OPTIMIZED FOCAL AREA FOR AUGMENTED REALITY DISPLAYS - A method and system that enhances a user's experience when using a near eye display device, such as a see-through display device or a head mounted display device, is provided. An optimized image for display relative to a field of view of a user in a scene is created. The user's head and eye position and movement are tracked to determine a focal region for the user. A portion of the optimized image is coupled to the user's focal region at the current position of the eyes, a next position of the head and eyes is predicted, and a portion of the optimized image is coupled to the user's focal region in the next position.06-21-2012
20120154276REMOTE CONTROLLER, REMOTE CONTROLLING METHOD AND DISPLAY SYSTEM HAVING THE SAME - Disclosed are a remote controller, a remote controlling method, and a display system having the same. An image displayed on a display apparatus may be converted by using one remote controller. The display system may convert the image by using one remote controller, without implementing a touch screen on the display apparatus or using two or more remote controllers for multi-touch. This may enhance a user's convenience and reduce fabrication costs.06-21-2012
20120206350Device Control of Display Content of a Display - Methods, apparatuses and systems of providing control of display content on a display with a device are disclosed. One method includes establishing a fixed reference on the display. A user input is received indicating that the device is at a user selected position corresponding to the fixed reference, and a position of the device is captured in order to establish a corresponding reference position. The display content on the display is determined based on measured displacement of the device relative to the established reference position.08-16-2012
20120206353HYBRID POINTING DEVICE - The present invention discloses a hybrid pointing device including an optical navigation module and a pointing module. The optical navigation module is configured to replace the conventional buttons of a conventional pointing device, such as an optical mouse or a trackball mouse. The optical navigation module is configured to sense gestures of at least one object operated by a user to activate commands associated with particular programs running on the host. Since the optical navigation module is only configured to sense gestures of the object, and not the movement of the hybrid pointing device relative to a surface, its resolution need only be high enough for sensing gestures and does not need to be relatively high.08-16-2012
20120206351INFORMATION PROCESSING APPARATUS, COMPUTER-READABLE STORAGE MEDIUM HAVING STORED THEREIN INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING SYSTEM - A first housing including an orientation detection section for detecting an orientation, and a second housing including a screen section for displaying a predetermined image are connected to each other such that the relative orientation of the second housing with respect to the first housing can be changed. The relative orientation of the second housing is estimated based on a value obtained by adding a predetermined offset to detected data detected by the orientation detection section, and predetermined display processing is performed for the screen section, based on the relative orientation of the second housing.08-16-2012
20120206349UNIVERSAL STYLUS DEVICE - A stylus device receives light from a display though an optical element that is adapted to increase the field curvature of an image formed on an image sensor of the stylus device. Based on the size and shape of a portion of the image that is in focus, a distance, orientation, and/or azimuth of the stylus device with respect to the display can be determined. In addition, a position corresponding to each pixel, or groups of pixels, is encoded into blue light emitted by each pixel or group of pixels of the display. Upon initialization, or after a loss of synchronization, the stylus device can determine its position with respect to the pixels by decoding the encoded position. After synchronizing its position with the display, the stylus device can determine its subsequent positions by tracking pixels of the display.08-16-2012
20090009468MULTI-FUNCTIONAL WIRELESS MOUSE - A multi-functional wireless mouse (01-08-2009
20080218474Ultra Thin Optical Pointing Device and Personal Portable Device Having the Same - The present invention relates to an ultra thin optical pointing device, and a personal portable device having the optical pointing device. The optical pointing device includes a PCB (09-11-2008
20110074674PORTABLE INPUT DEVICE, METHOD FOR CALIBRATION THEREOF, AND COMPUTER READABLE RECORDING MEDIUM STORING PROGRAM FOR CALIBRATION - Provided are a portable input device for inputting coordinates, a method of calibrating the device, and a computer readable recording medium storing a computer program for making a computer perform the method. The portable input device includes two digital cameras, a calibration tool, a storage section, and a controller for calculating coordinates of an object on an input surface based on images taken by the two digital cameras so as to include the calibration tool. The controller also calibrates positions and widths of a detection band which corresponds to a detection zone defined in a vicinity of the input surface. The positions and the widths of the detection band are stored in the storage section in relationship to positions on the input surface.03-31-2011
20120249422INTERACTIVE INPUT SYSTEM AND METHOD - An interactive input system comprises an interactive surface, an illumination source projecting light onto the interactive surface such that a shadow is cast onto the interactive surface when a gesture is made by an object positioned between the illumination source and the interactive surface, at least one imaging device capturing images of a three-dimensional (3D) space in front of the interactive surface, and processing structure processing captured images to detect the shadow and object therein, and determine therefrom whether the gesture was performed within or beyond a threshold distance from the interactive surface and execute a command associated with the gesture.10-04-2012
20120249423Image Display Apparatus - A screen is provided on a housing in such a manner that the screen protrudes toward its nearer side, and is inclined so that the nearer side becomes lower. This screen displays thereon a first hierarchy menu for indicating an arrangement of genre pictures, a second hierarchy menu for indicating an arrangement of commodity pictures, and a map image for indicating a map of the sales counters. A user brings his or her hand tip nearer to this screen. This operation allows a double-ring pointer to be displayed at a position which corresponds to the spatial position or direction of the user's hand tip relative to the screen. On this screen, the user is permitted to change the position of this double-ring pointer by performing a gesture of changing the spatial position or direction of the user's hand tip.10-04-2012
20110090149METHOD AND APPARATUS FOR ADJUSTING A VIEW OF A SCENE BEING DISPLAYED ACCORDING TO TRACKED HEAD MOTION - A method for controlling a view of a scene is provided. The method initiates with detecting an initial location of a control object. An initial view of the scene is displayed on a virtual window, the initial view defined by a view-frustum based on a projection of the initial location of the control object through outer edges of the virtual window. Movement of the control object to a new location is detected. An updated view of the scene is displayed on the virtual window, the updated view defined by an updated view-frustum based on a projection of the new location of the control object through the outer edges of the virtual window.04-21-2011
20110090148Wearable input device - A wearable input device includes a body and a soft battery. The body is connected to the soft battery to form a collar range, and the soft battery surrounds a hand of a user. The soft battery supplies an electric power to a finger-contact control module and a wireless transmitter in the body, such that the finger-contact control module senses a movement of an object (for example, a finger) on the body, and generates a control signal corresponding to a moving position of the object, and then the wireless transmitter transmits the control signal to a computer host, thus manipulating a cursor on an operating system frame of the computer host.04-21-2011
20110102321IMAGE DISPLAY APPARATUS AND METHOD FOR CONTROLLING THE IMAGE DISPLAY APPARATUS - An image display apparatus including a remote control interface configured to receive a signal from a remote controller; a controller configured to calculate a first pointer position at which a pointer is to be displayed on a display of the image display apparatus based on the received signal, to determine a depth of a three-dimensional (3D) object displayed on the display of the image display apparatus, and to calculate a second position of the pointer based on the determined depth of the 3D object; and a video processor configured to display the pointer at the calculated second pointer position on the display of the image display apparatus.05-05-2011
20110102320INTERACTION ARRANGEMENT FOR INTERACTION BETWEEN A SCREEN AND A POINTER OBJECT - An interaction arrangement for interaction between at least one screen arranged behind a transparent pane and at least one pointer object located in front of the pane, comprising at least two cameras arranged behind the pane, wherein there is associated with each of the cameras a deflection unit by means of which at least one optical path from an interaction area in the vicinity of and in front of the pane can be directed into the camera, and comprising a computing unit connected to all of the cameras for determining a position of the pointer object which is guided so as to be visible for at least two of the cameras, wherein at least the interaction area can be stroboscopically illuminated with infrared light, and the cameras are sensitive to infrared light and can be synchronized with the stroboscopic illumination.05-05-2011
20110102319HYBRID POINTING DEVICE - The present invention discloses a hybrid pointing device including an optical navigation module and a pointing module. The optical navigation module is configured to replace the conventional buttons of a conventional pointing device, such as an optical mouse or a trackball mouse. The optical navigation module is configured to sense gestures of at least one object operated by a user to activate commands associated with particular programs running on the host. Since the optical navigation module is configured to sense only gestures of the object, not the movement of the hybrid pointing device relative to a surface, its resolution need only be high enough for sensing gestures and does not need to be especially high.05-05-2011
20110102318USER INPUT BY POINTING - Presented is an apparatus for capturing user input by pointing at a surface using pointing means. The apparatus comprises: a range camera for producing a depth-image of the pointing means; and a processor. The processor is adapted to determine from the depth-image the position and orientation of a pointing axis of the pointing means; extrapolate from the position and orientation the point of intersection of the axis with the surface; and control an operation based on the location of the point of intersection.05-05-2011
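The extrapolation step in the abstract above reduces to a ray-plane intersection: intersect the pointing axis (a position and direction from the depth image) with the plane of the surface. A minimal sketch, with names and frame conventions assumed for illustration rather than taken from the patent:

```python
import numpy as np

def intersect_pointing_axis(origin, direction, plane_point, plane_normal):
    """Extrapolate where a pointing axis meets a flat surface.

    origin/direction: position and unit direction of the pointing axis
    (e.g., estimated from a depth image); plane_point/plane_normal
    define the target surface. Returns the intersection point, or None
    if there is no intersection in front of the pointer.
    """
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        return None  # axis parallel to the surface
    t = np.dot(plane_normal, plane_point - origin) / denom
    if t < 0:
        return None  # surface is behind the pointer
    return origin + t * direction

# Pointing from 1 m in front of a screen lying in the z = 0 plane:
hit = intersect_pointing_axis(
    origin=np.array([0.2, 0.3, 1.0]),
    direction=np.array([0.0, 0.0, -1.0]),
    plane_point=np.zeros(3),
    plane_normal=np.array([0.0, 0.0, 1.0]),
)
```

The returned point can then be mapped to display coordinates to drive the controlled operation.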
20120119991 3D GESTURE CONTROL METHOD AND APPARATUS - A 3D gesture control method is provided. The method includes the steps of: obtaining a series of images by a stereo camera; recognizing a control article in the images and acquiring 3D coordinates of the control article; determining the speed of the control article according to the 3D coordinates of the control article; and operating a visible object according to the speed.05-17-2012
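The speed-determination step described above amounts to differencing successive 3D coordinates of the control article. A minimal sketch, assuming a fixed frame interval `dt` (the patent does not specify the timing model):

```python
def control_speed(coords, dt):
    """Estimate the speed of a tracked control article from successive
    3D coordinates (one per stereo-camera frame, dt seconds apart).

    coords: list of (x, y, z) tuples; returns one speed per frame gap.
    """
    speeds = []
    for (x0, y0, z0), (x1, y1, z1) in zip(coords, coords[1:]):
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
        speeds.append(dist / dt)
    return speeds

# A hand moving 3 cm per 1/30 s frame corresponds to 0.9 m/s:
speeds = control_speed([(0.0, 0.0, 0.5), (0.03, 0.0, 0.5)], dt=1 / 30)
```

The resulting speed series could then drive the visible object, e.g. scaling cursor velocity.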
20120119992INPUT SYSTEM, INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING PROGRAM, AND SPECIFIED POSITION CALCULATION METHOD - An example input system calculates a specified position on a screen of a display device, the position being specified by an operating device. The input system includes an attitude calculation section, an identification section, and a specified position calculation section. The attitude calculation section calculates an attitude of the operating device. The identification section identifies one of a plurality of display devices toward which the operating device is directed, based on the attitude of the operating device. The specified position calculation section calculates a specified position in accordance with the attitude of the operating device as a position on a screen of the display device identified by the identification section.05-17-2012
20120119990POINTER CONTROL DEVICE, SYSTEM AND METHOD - A pointer control system includes a pointer control device and a pointer sensing device. The pointer control device includes two coils. The two coils respectively transmit a first signal and a second signal to the pointer sensing device, so that the pointer sensing device senses a first coordinate and a second coordinate. The pointer sensing device outputs a control signal corresponding to a distance between the first coordinate and the second coordinate. Therefore, the pointer control system can use an angle formed between the pointer control device and the pointer sensing device to control the operation mode of the pointer.05-17-2012
20100245245SPATIAL INPUT OPERATION DISPLAY APPARATUS - A spatial input operation display apparatus providing a user interface allowing input operations inside a space without requiring hands or fingers to be stopped in space, and not requiring a physical shape or space. The spatial input operation display apparatus is provided with a shape identifying portion (09-30-2010
20100289743LASER POINTER AND GESTURE-BASED INPUT DEVICE - A laser pointer is combined with a gesture-based input system to enable presenters to make a seamless presentation, using the laser pointer to highlight content on a screen and, in addition, as a mount for a motion sensor comprising at least one small sensor, such as a micro-electromechanical systems (MEMS) sensor, that is used as an input device for delivering commands to a host computer.11-18-2010
20100245244CHARACTER INPUTTING DEVICE - The present invention relates to a character inputting device. The character inputting device includes an input unit, a detection unit, and a control unit. The input unit is configured such that a plurality of input areas is radially arranged around a reference location, one or more characters are assigned to each input area, and the respective characters can be selected through different input actions. The detection unit is configured to separately detect different input actions for each of the input areas. The control unit is configured to extract a relevant character from a memory unit in accordance with results of the detection by the detection unit, and input the character.09-30-2010
20120127074SCREEN OPERATION SYSTEM - A screen operation system obtains operation information associated with an operation performed by a user with a pointing object relative to a screen of an image display apparatus controlled by an information processing apparatus and causes the information processing apparatus to execute processing associated with the operation information. The screen operation system includes a mobile information apparatus including a camera capturing an image of the screen of the image display apparatus; a display displaying the image captured by the camera; and a communicator communicating information with the information processing apparatus. The camera of the mobile information apparatus captures an image such that the pointing object is displayed overlapping a predetermined position on the screen of the image display apparatus. Based on captured image information obtained thereby, operation information is obtained.05-24-2012
20120127073DISPLAY APPARATUS AND CONTROL METHOD THEREOF - A display apparatus and a control method thereof are provided. The display apparatus includes: a camera which detects a coordinate of light projected on a screen by a pointing device projecting a light beam; an image processor which processes an image to be displayed on the screen, the image corresponding to the coordinate of the light detected by the camera; an optical filter which is disposed on a path of the light entering the camera to be detected by the camera and transmits light having a first wavelength to the camera by reducing an amount of the light at a preset ratio; and a controller which determines a characteristic of the detected light according to a wavelength based on a brightness level of the light entering through the optical filter and detected by the camera.05-24-2012
20120235909CONTROLLING AND/OR OPERATING A MEDICAL DEVICE BY MEANS OF A LIGHT POINTER - The invention relates to a system for controlling and/or operating a medical device (09-20-2012
20120162075ULTRA THIN OPTICAL POINTING DEVICE AND PERSONAL PORTABLE DEVICE HAVING THE SAME - An optical pointing device including a Printed Circuit Board (PCB); an infrared Light Emitting Diode (LED) provided on a side of the PCB; a cover plate for detecting motion of a subject, which is placed in an upper portion of the optical pointing device; an image forming system lens placed below the cover plate and configured to condense a light reflected from the subject; an optical image sensor for receiving a reflected image of the subject and detecting motion of the subject; and an optical coating unit placed on the optical image sensor. The cover plate and the optical coating unit are made of a material capable of passing a wavelength of infrared rays which cannot be perceived by a user's eye. A first cut off wavelength of the cover plate is shorter than a second cut off wavelength of the optical coating unit.06-28-2012
20120162073APPARATUS FOR REMOTELY CONTROLLING ANOTHER APPARATUS AND HAVING SELF-ORIENTATING CAPABILITY - A remote control apparatus for communicating with a target device includes: a sensing portion for sensing points of user contact with the apparatus, user gestures, and an acceleration value of the apparatus; a transmitting device for sending signals representative of user commands to the target device; a controller; and a memory including instructions for configuring the controller to perform a self-orientation process based upon at least one of the acceleration value and the points of user contact to determine a forward direction of a plane of operation for defining the user gestures. An axis of the determined plane of operation substantially intersects the apparatus at any angle.06-28-2012
20100225583COORDINATE CALCULATION APPARATUS AND STORAGE MEDIUM HAVING COORDINATE CALCULATION PROGRAM STORED THEREIN - A coordinate calculation apparatus calculates a coordinate point representing a position on a display screen based on an orientation of an input device. The coordinate calculation apparatus includes direction acquisition means, orientation calculation means, first coordinate calculation means, and correction means. The direction acquisition means acquires information representing a direction of the input device viewed from a predetermined position in a predetermined space. The orientation calculation means calculates the orientation of the input device in the predetermined space. The first coordinate calculation means calculates a first coordinate point for determining the position on the display screen based on the orientation of the input device. The correction means corrects the first coordinate point such that the first coordinate point calculated when the input device is directed in a predetermined direction takes a predetermined reference value.09-09-2010
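One way to read the first-coordinate-plus-correction scheme above is as an angle-to-screen projection followed by an offset that forces a reference orientation to map to a fixed reference value. The sketch below is an illustrative interpretation under that reading, not the patent's actual method; all names are hypothetical:

```python
import math

def screen_point(yaw, pitch, distance, ref_yaw=0.0, ref_pitch=0.0):
    """Map an input device's orientation to a screen coordinate.

    A first coordinate is computed by projecting the pointing angles
    onto a screen `distance` units away; it is then corrected by an
    offset so that the reference orientation (ref_yaw, ref_pitch)
    maps to the screen origin.
    """
    raw_x = distance * math.tan(yaw)
    raw_y = distance * math.tan(pitch)
    off_x = distance * math.tan(ref_yaw)
    off_y = distance * math.tan(ref_pitch)
    return raw_x - off_x, raw_y - off_y

# Yawed 0.1 rad toward the right of a screen 2 units away:
x, y = screen_point(yaw=0.1, pitch=0.0, distance=2.0)
```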
20110181511METHOD FOR ADJUSTING EXPOSURE CONDITION OF OPTICAL SENSOR MODULE AND OPTICAL SENSOR MODULE THEREOF - A method for adjusting an exposure condition of an optical sensor module includes the following steps: (A) receiving reflected light reflected by a working surface; (B) generating an image signal by exposing the optical sensor module to the reflected light, in which the image signal includes a plurality of luminance signals and an image quality signal; (C) setting an exposure condition of the optical sensor module according to part of the luminance signals; (D) repeating Step (B) and Step (C) under different exposure conditions so that the optical sensor module generates a plurality of image quality signals; and (E) setting an optimal exposure condition corresponding to the working surface according to the image quality signals under the different exposure conditions. The optical sensor module is applicable to a pointing device.07-28-2011
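Steps (B) through (E) above are, in essence, a sweep over exposure settings that keeps the setting producing the best image-quality signal. A minimal sketch, where `capture` is a hypothetical driver hook (not from the patent) returning a quality score for a frame taken at a given exposure:

```python
def best_exposure(candidate_exposures, capture):
    """Sweep candidate exposure settings and return the one whose
    captured frame yields the highest image-quality score.
    """
    best = None
    best_quality = float("-inf")
    for exposure in candidate_exposures:
        quality = capture(exposure)  # steps (B)-(C): expose, score
        if quality > best_quality:  # step (E): keep the optimum
            best, best_quality = exposure, quality
    return best

# Toy quality curve peaking at an 8 ms exposure:
optimal = best_exposure([2, 4, 8, 16], capture=lambda e: -abs(e - 8))
```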
20110181510Gesture Control - An apparatus including one or more radio transmitters configured to transmit radio signals that are at least partially reflected by an object or objects moving as a consequence of a gesture; multiple radio receivers configured to receive the transmitted radio signals after having been at least partially reflected by an object or objects moving as a consequence of a gesture; a detector configured to detect an attribute of the received signals, for each receiver, that varies with the position of the object or objects moving as a consequence of the gesture; and a controller configured to interpret the detected attributes, for the receivers, as a user input associated with the gesture.07-28-2011
20110181509Gesture Control - An apparatus including: a radio transmitter configured to transmit radio signals that are at least partially reflected by a human body; one or more radio receivers configured to receive the transmitted radio signals after having been at least partially reflected by a human body of a user; a gesture detector configured to detect a predetermined time-varying modulation that is present in the received radio signals compared to the transmitted radio signals; and a controller configured to interpret the predetermined time-varying modulation as a predetermined user input command and change the operation of the apparatus.07-28-2011
20120212414AR GLASSES WITH EVENT AND SENSOR TRIGGERED CONTROL OF AR EYEPIECE APPLICATIONS - This disclosure concerns an interactive head-mounted eyepiece with an integrated processor for handling content for display and an integrated image source for introducing the content to an optical assembly through which the user views a surrounding environment and the displayed content, wherein the eyepiece includes an event and sensor triggered control of eyepiece applications.08-23-2012
20120212415INTERACTIVE SYSTEM, METHOD FOR CONVERTING POSITION INFORMATION, AND PROJECTOR - A position information converting device in an interactive system comprising: a conversion control section which determines, if an image formed by an optical signal from an object in the neighborhood of the projection surface is detected within the projection image included in the captured image data, that the predetermined manipulation has been performed, and uses the position conversion information stored in the position conversion information storing section to convert position information representing the position where the predetermined manipulation has been performed into a position on the image based on the image signal.08-23-2012
20120212413Method and System for Touch-Free Control of Devices - The present invention provides a system and computerized method for receiving image information and translating it to computer inputs. In an embodiment of the invention, image information is received for a predetermined action space to identify an active body part. From such image information, depth information is extracted to interpret the actions of the active body part. Predetermined gestures can then be identified to provide input to a computer; for example, gestures can be interpreted to mimic computerized touchscreen operation. Also, touchpad or mouse operations can be mimicked.08-23-2012
20120133586INPUT DEVICE - An input device comprises a rotatable magnet member, a lower casing and an upper casing. The rotatable magnet member has a magnet, a spherical part and a rod-like shaft. The magnet is formed into a ring-shaped disc with a through hole in the center, and magnetized with N and S poles alternately along the circumferential direction. The rod-like shaft having a flange portion near a first end is inserted in the through hole, and both ends protrude from the through hole. The lower casing rotatably supports the rod-like shaft of the rotatable magnet member. There is a space provided between an outer peripheral surface of a part of the rod-like shaft inside the through hole and an inner wall of the through hole over the entire peripheries, and the flange portion is press-fitted and fixed to the spherical part.05-31-2012
20120133585Apparatus and method for controlling object - An apparatus and method for controlling an object are provided. Motions of fingers present in a 3-dimensional (3D) sensing area are detected, and a pointer or an object being displayed is controlled corresponding to the detected motions. Therefore, input of a control signal may be achieved without a direct touch on a display device such as a terminal, thereby preventing marks from being left on the screen of the display. In addition, since the motions of fingers are detected within the 3D sensing area, not on a 2-dimensional (2D) plane, more types of input motions may be used.05-31-2012
20120133584Apparatus and method for calibrating 3D position in 3D position and orientation tracking system - An apparatus and method for calibrating a 3D position in a 3D position and orientation tracking system are provided. The apparatus according to an embodiment may track the 3D position and the 3D orientation of a remote device in response to a detection of a pointing event, may acquire positions pointed to by the laser beams, in response to the detection of the pointing event, may generate a 3D reference position, based on information about the pointed to positions and the tracked 3D orientation, may calculate an error using the reference position and the tracked 3D position, and may calibrate the 3D position to be tracked, using the error.05-31-2012
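The error computation and calibration described above can be sketched as a per-axis offset: compare the tracked 3D position against the reference position derived from the pointed-to laser spots, then apply the resulting correction to subsequent tracked positions. This is an illustrative model only; the patent does not specify the error metric:

```python
def calibrate(tracked_pos, reference_pos):
    """Compute a per-axis error between a tracked 3D position and a
    reference position, and return that error together with a
    correction function for subsequent tracked positions.
    """
    error = tuple(t - r for t, r in zip(tracked_pos, reference_pos))

    def correct(pos):
        # Subtract the calibration error from a newly tracked position.
        return tuple(p - e for p, e in zip(pos, error))

    return error, correct

# Tracker reads 0.1 high on z at the known reference point:
error, correct = calibrate((1.0, 2.0, 3.1), reference_pos=(1.0, 2.0, 3.0))
```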
20120299828METHOD AND APPARATUS FOR CLASSIFYING MULTIPLE DEVICE STATES - Techniques are described herein for classifying multiple device states using separate Bayesian classifiers. An example of a method described herein includes accessing sensor information of a device, wherein at least some of the sensor information is used in a first feature set and at least some of the sensor information is used in a second feature set; processing the first feature set using a first classification algorithm configured to determine a first proposed state of a first state type and a first proposed state of a second state type; processing the second feature set using a second classification algorithm configured to determine a second proposed state of the first state type and a second proposed state of the second state type; and determining a proposed state of the device as the first proposed state of the first state type and the second proposed state of the second state type.11-29-2012
20120299827MULTI-PLATFORM MOTION-BASED COMPUTER INTERACTIONS - Systems and methods for multi-platform motion interactivity are provided. The system includes a motion-sensing subsystem, a display subsystem including a display, a logic subsystem, and a data-holding subsystem containing instructions executable by the logic subsystem. The system is configured to display a displayed scene on the display; receive a dynamically-changing motion input from the motion-sensing subsystem that is generated in response to movement of a tracked object; generate, in real time, a dynamically-changing 3D spatial model of the tracked object based on the motion input; and control, based on the movement of the tracked object and using the 3D spatial model, motion within the displayed scene. The system is further configured to receive, from a secondary computing system, a secondary input, and control the displayed scene in response to the secondary input to visually represent interaction between the motion input and the secondary input.11-29-2012
20120169595Operation control device - An operation control device includes a housing, a control module having a movable operating device carried on a carrier frame in the housing and partially exposed to the outside and rotatable and axially slidable by the user, a circuit module, which includes a rotation sensor module for sensing the direction and amount of rotation of the movable operating device and producing a respective control signal and magnetic sensors for sensing the direction and amount of axial displacement of the movable operating device in a non-contact manner. The human-friendly design of the operation control device facilitates cursor control, assuring high operation stability and comfort.07-05-2012
20120313852SENSOR SYSTEM FOR GENERATING SIGNALS THAT ARE INDICATIVE OF THE POSITION OR CHANGE OF POSITION OF LIMBS - The invention relates to a sensor device for generating electronic signals which as such provide information on a position in space, and/or the movement of limbs, especially of the hand of a user in relation to the sensor device. Said electronic signals can be used to carry out input processes in data processing devices, communication devices and other electric devices. The aim of the invention is to devise ways of generating, in an especially advantageous manner, signals that are indicative of the position and/or the movement of limbs. A sensor device for generating electrical signals which as such provide information on the position or movement of limbs in relation to a reference area comprises a transmitter electrode device (G12-13-2012
20120176314METHOD AND SYSTEM FOR CONTROLLING MOBILE DEVICE BY TRACKING THE FINGER - A method and system control a mobile device by tracking the movement of a finger. A finger mode, in which the mobile device is controlled by tracking the movement of a finger, is activated. The finger is detected and the movement of the detected finger is tracked via a camera. When the tracked movement of the detected finger corresponds to a preset motion pattern, a function corresponding to that pattern is performed. A number of application programs can be controlled respectively by tracking the movement of fingers, via one of a number of input means.07-12-2012
20120075183 3D Pointing Devices with Orientation Compensation and Improved Usability - Systems and methods according to the present invention describe 3D pointing devices which enhance usability by transforming sensed motion data from a first frame of reference (e.g., the body of the 3D pointing device) into a second frame of reference (e.g., a user's frame of reference). One exemplary embodiment of the present invention removes effects associated with a tilt orientation in which the 3D pointing device is held by a user.03-29-2012
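The frame-of-reference transformation described above can be illustrated, for the roll (tilt) axis alone, as a 2D rotation of the sensed motion back into the user's frame. This is a deliberately simplified sketch of the idea, not the patent's full 3D method:

```python
import math

def compensate_tilt(dx, dy, roll):
    """Rotate motion sensed in the device's body frame into the user's
    frame by undoing the roll (tilt) angle, so cursor motion does not
    depend on how the 3D pointing device is held.

    Sign convention for `roll` is an assumption for this sketch.
    """
    c, s = math.cos(roll), math.sin(roll)
    return c * dx - s * dy, s * dx + c * dy

# Device rolled 90 degrees: its local x-motion is the user's y-motion.
ux, uy = compensate_tilt(1.0, 0.0, roll=math.pi / 2)
```

A full implementation would estimate the tilt angle from accelerometer readings of gravity and apply the corresponding 3D rotation to the gyroscope data.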
20120075182MOBILE TERMINAL AND DISPLAYING METHOD THEREOF - A mobile terminal and a control method thereof are provided. The mobile terminal may include a body, at least one display provided to one side of the body, at least one key button provided to the body to receive a user input, a sensing unit to sense a motion of the body, and a controller to cancel a lock screen displayed on the display based on the motion detected by the sensing unit when the user input is acquired through the key button.03-29-2012
20100271301INPUT PROCESSING DEVICE - An input processing device includes an input pad; a detector which detects a position of an indicating object coming into contact with the input pad; and a processor which controls a display state of an image displayed on a display on the basis of an input signal obtained from the detector, wherein an input surface of the input pad is provided with a detection region for detecting a specific input operation, and wherein when the processor receives the input signal corresponding to the specific input operation given from the indicating object onto the detection region, the processor rotates the image.10-28-2010
20120249424Methods and apparatus for accessing peripheral content - In exemplary implementations of this invention, a main content feed is displayed on a main screen. A user may select one or more auxiliary feeds of content to display simultaneously on a second screen. The second screen is located on a handheld device. The user makes the selection by changing the orientation of the handheld device relative to the main screen. For example, the user may select which auxiliary feed to display by pointing the device at different areas that are around the periphery of the main screen. The handheld device includes one or more sensors for gathering data, and one or more processors for (a) processing the sensor data to calculate the orientation of the handheld device relative to the main screen and (b) based at least in part on that orientation, selecting which of the auxiliary feeds to display.10-04-2012
20100007603METHOD AND APPARATUS FOR CONTROLLING DISPLAY ORIENTATION - An approach provides controlling of display orientation in a mobile device. Motion of a mobile device having a plurality of displays is detected, wherein each of the displays is configured to present an image. Orientation of one or more of the images is changed on the displays in response to the detected motion.01-14-2010
20120256834PHYSICAL OBJECT FOR INTUITIVE NAVIGATION IN A THREE-DIMENSIONAL SPACE - A computer-implemented method for manipulating graphics objects within a display viewed by an end-user is disclosed. The method involves: receiving motion information generated in response to the end-user moving an object that is external to the display; determining at least one zone of motion in which the end-user moves the object; determining a first motion type associated with the movement of the object within the at least one zone of motion; and based on the at least one zone of motion and the first motion type, determining at least one change to a viewpoint associated with one or more graphics objects displayed to the end-user within the display. The at least one change to the viewpoint causes an alteration in how the one or more graphics objects are displayed to the end-user within the display.10-11-2012
20090021479Device for Extracting Data by Hand Movement - The invention relates to a hand receiver (01-22-2009
20120262373METHOD AND DISPLAY APPARATUS FOR CALCULATING COORDINATES OF A LIGHT BEAM - A display apparatus is disclosed which includes: a camera which senses a light beam focused on a screen; a video processor which processes at least one of a first image including a reference position for calculating coordinates of the light beam and a second image corresponding to the coordinates of the light beam to be displayed on the screen; and a controller which calculates the coordinates of the light beam on the basis of the reference position changed in accordance with change in a display characteristic of the first image, and transmits the calculated coordinates to the video processor so that the second image corresponding to the calculated coordinates can be displayed on the screen.10-18-2012
20120262372METHOD AND DEVICE FOR GESTURE RECOGNITION DIAGNOSTICS FOR DEVICE ORIENTATION - Systems, circuits, and devices for recognizing gestures are discussed. A mobile device includes a housing, an orientation sensor, a camera implemented on the housing, a memory for storing a lookup table comprising multiple gestures and corresponding commands, and a controller coupled to the orientation sensor, the camera, and the memory. The controller is configured to generate trace data corresponding to a gesture captured by the camera, wherein x, y, and z coordinates of the trace data are applied according to an orientation of the housing during the gesture. The controller is also configured to determine an orientation angle of the housing detected by the orientation sensor. The controller is further configured to recognize the gesture through accessing the lookup table based on the trace data and the orientation angle of the housing.10-18-2012
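The lookup-table step described above, keying gestures on both trace data and the housing's orientation angle, can be sketched as follows. The coarse trace key and the 90-degree orientation quantization are assumptions made for illustration:

```python
def recognize(trace, orientation_angle, lookup):
    """Resolve a gesture to a command using a lookup table keyed by a
    coarse trace shape and the housing's orientation angle, quantized
    to the nearest 90 degrees.
    """
    key = (trace, round(orientation_angle / 90) * 90 % 360)
    return lookup.get(key)  # None if no matching gesture/orientation

table = {
    ("swipe_x", 0): "next_page",
    ("swipe_x", 90): "scroll_down",  # same physical motion, device rotated
}
cmd = recognize("swipe_x", orientation_angle=88, lookup=table)
```

Quantizing the orientation keeps the table small while still letting the same camera-captured trace map to different commands in portrait and landscape.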
20120229383GESTURE SUPPORT FOR CONTROLLING AND/OR OPERATING A MEDICAL DEVICE - The invention relates to a gesture support device (09-13-2012
20120229384DISPLAY DEVICE WITH LOCATION DETECTION FUNCTION AND INPUT LOCATION DETECTION SYSTEM - An input position detection system (09-13-2012
20120229382COMPUTER-READABLE STORAGE MEDIUM, INFORMATION PROCESSING SYSTEM, AND INFORMATION PROCESSING METHOD - A movement direction of an object arranged in a virtual world is set based on attitude data. Further, the object is moved in the movement direction in the virtual world in accordance with data based on a load applied to a load detection device. An image showing the virtual world including at least the object or an image showing the virtual world viewed from the object is displayed as a first image on a portable display device.09-13-2012
20120229380Gyroscope control and input/output device selection in handheld mobile devices - A technique to provide a duplicate I/O device along an adjacent edge of a handheld mobile device to ensure that at least one I/O device is not obscured by a user when the user's hand grasps the handheld mobile device. Depending on whether the device is in portrait or landscape orientation relative to the user, a sensing device senses the orientation and sends a position signal to a control circuit. The control circuit controls a switching device that controls which of the I/O devices is to be activated depending on the orientation.09-13-2012
20090046061Dynamically Controlling a Cursor on a Screen when Using a Video Camera as a Pointing Device - A system provides for controlling a cursor on a screen automatically and dynamically when using a video camera as a pointing device. A computer displays static or dynamic content to a screen. A video camera connected to the computer points at the screen. As the video camera films the screen, frames captured by the video camera are sent to the computer. A target image is displayed by the computer onto the screen and marks the position of the screen cursor of the video camera. Frames captured by the video camera include the target image, and the computer dynamically moves the target image on the screen to ensure that the target image stays in the center of the view of the video camera.02-19-2009
20100328212 SYSTEM AND METHOD FOR PROVIDING ROLL COMPENSATION - The embodiments of the present disclosure are directed towards a method and apparatus for providing roll compensation in a control device, the method and apparatus including acquiring rotational data and linear data indicative of movement of the control device, applying roll compensation to the acquired data, and removing a roll compensation error from the roll compensated data. Inertial sensors such as gyroscope sensors and accelerometer sensor(s) may be used to acquire the rotational and linear data.12-30-2010
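Roll compensation of the kind the entry above describes is often sketched as estimating the controller's roll from the accelerometer's gravity reading and rotating the measured motion back through that angle. A minimal sketch under that assumption (axis conventions and function names are illustrative, not the patent's method):

```python
import math

def roll_from_accel(ax, ay):
    """Estimate the device's roll angle from the gravity components seen
    by the accelerometer (assumes the device is roughly stationary)."""
    return math.atan2(ax, ay)

def roll_compensate(dx, dy, roll):
    """Rotate measured cursor motion through the roll angle so a rolled
    controller still moves the cursor in upright screen coordinates."""
    c, s = math.cos(roll), math.sin(roll)
    return (dx * c - dy * s, dx * s + dy * c)
```

With the device rolled 90 degrees, a purely "rightward" motion in the device frame comes out as vertical screen motion after compensation.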
20080297474Electronic Device and a Method for Controlling the Functions of the Electronic Device as Well as Program Product for Implementing the Method - The invention relates to an electronic device, which includes a display component, in which at least one controllable element is arranged to be visualized in its entirety, the control of which element is arranged to be based on determining a change (M) relating to the attitude or position of the device and camera means arranged to form image frames (IMAGE12-04-2008
20120262374REMOTE CONTROL SYSTEM AND METHOD CAPABLE OF SWITCHING DIFFERENT POINTING MODES - The present invention relates to a remote control method and a remote control system capable of switching different pointing modes. The remote control method includes a first wireless transmission module transmitting a first wireless signal with a first pointing mode code to an electronic device, a second wireless transmission module transmitting a second wireless signal with a second pointing mode code to the electronic device, and the electronic device performing movement of an icon at a first pointing mode or at a second pointing mode according to the received first pointing mode code or the received second pointing mode code and pointing mode relation information.10-18-2012
20110037695ERGONOMIC CONTROL UNIT FOR PROVIDING A POINTING FUNCTION - A control unit disposed in a remote-control pointing device for providing a control and pointing function. The control unit comprises a stationary element, a movable control element with a reflective surface disposed proximate to the stationary element, at least one optical sensor fixedly disposed in close proximity to the movable control element, and at least one pressure sensor. When the optical sensor is activated by movement of the movable control element and when the movable control element is put into pressure contact with the pressure sensor, location and pressure contact data, respectively, are collected by the control unit enabling control and pointing functions associated with the pointing device without the need for an external, stationary reference surface.02-17-2011
20120268373METHOD FOR RECOGNIZING USER'S GESTURE IN ELECTRONIC DEVICE - Provided is a method for recognizing a user's gesture in an electronic device, the method including sensing movement of an object by using a motion sensor, checking a distance between the motion sensor and the movement-sensed object and referring to a preset value which is currently applied in relation to gesture recognition, and adaptively recognizing a gesture corresponding to the movement-sensed object according to the set value and the checked distance.10-25-2012
20120268374METHOD AND APPARATUS FOR PROCESSING TOUCHLESS CONTROL COMMANDS - A method and apparatus of detecting an input gesture command are disclosed. According to one example method of operation, a digital image may be obtained from a digital camera of a pre-defined controlled movement area. The method may also include comparing the digital image to a pre-stored background image previously obtained from the digital camera of the same pre-defined controlled movement area. The method may also include identifying one or more pixel differences between the digital image and the pre-stored background image and designating the digital image as having a detected input gesture command.10-25-2012
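The pixel-difference detection the entry above describes amounts to background subtraction with a change threshold. A minimal sketch on plain nested lists (the threshold and count values are assumptions, not figures from the patent):

```python
def detect_gesture(image, background, threshold=30, min_changed=5):
    """Compare a grayscale frame (rows of pixel values) against a stored
    background image; flag a gesture if enough pixels differ noticeably."""
    changed = sum(
        1
        for row_img, row_bg in zip(image, background)
        for p, b in zip(row_img, row_bg)
        if abs(p - b) > threshold
    )
    return changed >= min_changed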
20120268372METHOD AND ELECTRONIC DEVICE FOR GESTURE RECOGNITION - Provided are a method and an electronic device for gesture recognition capable of dividing a display into a plurality of display regions assigned to a plurality of users, recognizing gestures of the plurality of users, and controlling the display regions assigned to the users who have made a gesture according to the recognized gestures. The method for gesture recognition includes: dividing a display into a plurality of display regions assigned to a plurality of users; recognizing gestures made by the plurality of users, respectively; and controlling the plurality of display regions respectively assigned to the plurality of users who have made the gestures according to the respective recognized gestures.10-25-2012
20120326980COMMUNICATION OF USER INPUT - A keyboard application operably configured to facilitate user inputs through a keyboard displayed within a screen or other user-viewable interface by way of signaling received from a remote control. The application may be configured to facilitate user selection of one or more keys with the use of shortcut buttons on the remote control that allow the user to navigate to different areas of the keyboard in a more efficient manner. The application may be operable with an interactive television (iTV) system to facilitate interfacing user requests with an output device and other systems associated with a multiple system operator (MSO), such as to support interfacing user inputs required to enable product purchasing, navigation of an electronic programming guide (EPG), instigation of video on demand (VOD), textual messaging, web browsing, etc.12-27-2012
201102278253D Pointer Mapping - Systems, devices, methods and software are described for mapping movement or motion of a 3D pointing device into cursor position, e.g., for use in rendering the cursor on a display. Absolute and relative type mapping algorithms are described. Mapping algorithms can be combined to obtain beneficial characteristics from different types of mapping.09-22-2011
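The absolute and relative mapping types the entry above distinguishes can be contrasted in a few lines. These formulas (linear angle-to-screen mapping, velocity-scaled delta, and the gain value) are illustrative assumptions, not the patent's algorithms:

```python
def map_absolute(angle_deg, fov_deg, screen_px):
    """Absolute mapping: a device angle within +/- fov/2 maps linearly
    to a screen coordinate, clamped at the screen edges."""
    frac = (angle_deg + fov_deg / 2.0) / fov_deg
    return min(max(frac, 0.0), 1.0) * screen_px

def map_relative(cursor_px, angular_velocity, gain=10.0):
    """Relative mapping: angular velocity nudges the current cursor
    position by a gain-scaled delta."""
    return cursor_px + gain * angular_velocity
```

Absolute mapping ties cursor position to where the device points; relative mapping only accumulates motion, which is why the two can be blended for different characteristics.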
20100238112INPUT APPARATUS, CONTROL APPARATUS, CONTROL SYSTEM, AND CONTROL METHOD - An input apparatus includes: a casing; a first acceleration detection section to detect a first acceleration value of the casing in a first direction; a first angle-related value detection section to detect a first angle-related value of the casing about an axis in a second direction; a radius gyration calculation section to calculate, based on the first acceleration value and first angle-related value, a first radius gyration of the casing about the axis in the second direction, the first radius gyration being a distance from a rotational center axis to the first acceleration detection section; and a pseudo velocity calculation section to generate a first pseudo radius related to a magnitude of the first radius gyration and calculate a first pseudo velocity value of the casing in the first direction by multiplying the first pseudo radius by a first angular velocity value obtained from the first angle-related value.09-23-2010
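The radius-of-gyration idea in the entry above follows from circular-motion kinematics: tangential acceleration a = r·α and tangential velocity v = r·ω. A minimal sketch of a pseudo velocity computed that way (the guard value and names are assumptions):

```python
def radius_of_gyration(linear_accel, angular_accel, eps=1e-6):
    """Estimate the distance from the rotation axis to the sensor:
    for circular motion a = r * alpha, so r ~= a / alpha."""
    return abs(linear_accel) / max(abs(angular_accel), eps)

def pseudo_velocity(linear_accel, angular_accel, angular_velocity):
    """Pseudo velocity v = r * omega, using the estimated radius."""
    return radius_of_gyration(linear_accel, angular_accel) * angular_velocity
```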
20100231512ADAPTIVE CURSOR SIZING - Disclosed herein are systems and methods for controlling a computing environment with one or more gestures by sizing a virtual screen centered on a user, and by adapting the response of the computing environment to gestures made by a user and modes of use exhibited by a user. The virtual screen may be sized using depth, aspects of the user such as height and/or user profile information such as age and ability. Modes of use by a user may also be considered in determining the size of the virtual screen and the control of the system, the modes being based on profile information and/or information from a capture device.09-16-2010
20120319949POINTING DEVICE OF AUGMENTED REALITY - The present invention relates to a pointing device capable of inputting a particular position of augmented reality to a computer. The invention comprises: a camera which takes a picture of a feature point or a mark used to generate an augmented reality image; and an image processor which recognizes the feature point or the mark taken by the camera, and outputs position information indicative of a particular position in the augmented reality. Mouse cursor images can be synthesized as augmented reality images at the position outputted from the image processor.12-20-2012
20120319948MOBILE TERMINAL AND CONTROLLING METHOD THEREOF - According to one embodiment, a mobile terminal includes: a touchscreen configured to display a first menu and receive a plurality of touch inputs of a first pattern via the first menu and to display a second menu and receive a plurality of touch inputs of a second pattern via the second menu; and a controller configured to: calculate a first moving distance of a pointer for each of the received touch inputs of the first pattern; determine a minimum among the plurality of calculated first moving distances; calculate a second moving distance of the pointer for each of the received touch inputs of the second pattern; determine a maximum among the plurality of calculated second moving distances; and determine a threshold moving distance of the pointer for discriminating the touch input of the first pattern from the touch input of the second pattern by using the minimum and the maximum.12-20-2012
20120319950METHODS AND SYSTEMS FOR ENABLING DEPTH AND DIRECTION DETECTION WHEN INTERFACING WITH A COMPUTER PROGRAM - One or more images can be captured with a depth camera having a capture location in a coordinate space. First and second objects in the one or more images can be identified and assigned corresponding first and second object locations in the coordinate space. A relative position can be identified in the coordinate space between the first object location and the second object location when viewed from the capture location by computing an azimuth angle and an altitude angle between the first object location and the second object location in relation to the capture location. The relative position includes a dimension of depth with respect to the coordinate space. The dimension of depth is determined from analysis of the one or more images. A state of a computer program is changed based on the relative position.12-20-2012
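The azimuth/altitude computation the entry above describes can be sketched as follows. The axis convention (y up, z as depth away from the camera) is an assumption; the patent does not specify one here:

```python
import math

def angles_from(capture, point):
    """Azimuth (in the x-z plane) and altitude (above it) of a point
    as seen from the capture location."""
    dx, dy, dz = (point[i] - capture[i] for i in range(3))
    azimuth = math.atan2(dx, dz)
    altitude = math.atan2(dy, math.hypot(dx, dz))
    return azimuth, altitude

def relative_position(first, second, capture):
    """Angular separation of two tracked objects from the camera's
    capture location, as (azimuth difference, altitude difference)."""
    az1, alt1 = angles_from(capture, first)
    az2, alt2 = angles_from(capture, second)
    return az2 - az1, alt2 - alt1
```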
20110304539REMOTE CONTROLLING APPARATUS AND METHOD FOR CONTROLLING THE SAME - Disclosed are a remote controlling apparatus and a method for controlling the same, the apparatus capable of intuitively and conveniently controlling a display apparatus and capable of simply and precisely performing a text input, by controlling activation of a text input unit based on tilt information of the remote controlling apparatus having a three-dimensional pointing function. The remote controlling apparatus for controlling a pointer displayed on a screen of a display apparatus by a three-dimensional operation includes a housing, a text input unit disposed on one surface of the housing, and configured to receive a text input, a sensing unit configured to detect a tilt and a motion of the remote controlling apparatus, and a controller configured to selectively operate a first mode to transmit the text input to the display apparatus or a second mode to transmit information on the detected motion of the remote controlling apparatus to the display apparatus based on the detected tilt of the remote controlling apparatus.12-15-2011
20110304538OPTICAL POINTING DEVICE AND ELECTRONIC DEVICE INCLUDING THE OPTICAL POINTING DEVICE - An inclined plane (12-15-2011
20120092254PROXIMITY SENSOR WITH MOTION DETECTION - A proximity sensor with movement detection is provided. The proximity sensor may provide a navigation function in response to movement of an object. The proximity sensor includes a driver operable to generate a current to a plurality of light sources in a particular timing sequence, a photo detector configured to receive light and generate an output signal, a controller configured to report the movement of an object near the proximity sensor if the output signal pattern generated matches one of the output signal patterns from among a set of known output signal patterns. The proximity sensor may be configured to provide a navigation operation when an object moves near the proximity sensor.04-19-2012
20100225584SILENT OR LOUD 3D INFRARED FUTURISTIC COMPUTER MICE AND KEYBOARD DESIGN FOR A MICE&KEYBOARD LESS COMPUTER - New futuristic silent or loud 3D computer mice and keyboard design for a mice and keyboard less computer is presented. The current invention uses the fact that each word spoken by humans has a distinctive three dimensional pattern of the mouth and face and unique infrared spectrum. Using this fact, the present invention presents a method where the facial expression, irradiated by an array of infrared diodes, of the spoken word is picked up by infrared sensors installed either in stand alone mode, on top of the computer display or directly into the computer display to translate any spoken word silent or loud into computer commands that will facilitate the interaction of humans and computers without the use of a keyboard or mouse.09-09-2010
20120287043COMPUTER-READABLE STORAGE MEDIUM HAVING MUSIC PERFORMANCE PROGRAM STORED THEREIN, MUSIC PERFORMANCE APPARATUS, MUSIC PERFORMANCE SYSTEM, AND MUSIC PERFORMANCE METHOD - An input device includes a movement and orientation sensor for detecting one of a movement or an orientation of the input device itself. Firstly, information about one of the movement or the orientation of the input device having been detected by this movement and orientation sensor is obtained. Next, a difference between the orientation of the input device having been obtained, and a predetermined reference orientation is calculated. A predetermined sound is produced based on the difference in orientation, thereby executing music performance.11-15-2012
20120287044PROCESSING OF GESTURE-BASED USER INTERACTIONS USING VOLUMETRIC ZONES - Systems and methods for processing gesture-based user interactions within an interactive display area are provided. The display of one or more virtual objects and user interactions with the one or more virtual objects may be further provided. Multiple interactive areas may be created by partitioning an area proximate a display into multiple volumetric spaces or zones. The zones may be associated with respective user interaction capabilities. A representation of a user on the display may change as the ability of the user to interact with one or more virtual object changes.11-15-2012
20100214216MOTION SENSING AND PROCESSING ON MOBILE DEVICES - Display devices including motion sensing and processing. In one aspect, a handheld electronic device includes a subsystem providing display capability and a set of motion sensors provided on a single substrate and including at least one gyroscope sensing rotational rate of the device around three axes of the device and at least one accelerometer sensing gravity and linear acceleration of the device along these axes. A computation unit is capable of determining motion data from the sensor data stored in the memory, the motion data derived from a combination of the sensed rotational rate around at least one of the axes and the sensed gravity and linear acceleration along at least one of the axes. The motion data describes movement of the device including a rotation of the device around at least one of the axes, the rotation causing interaction with the device.08-26-2010
20100207881Apparatus for Remotely Controlling Computers and Other Electronic Appliances/Devices Using a Combination of Voice Commands and Finger Movements - An apparatus for remotely operating a computer using a combination of voice commands and finger movements. The apparatus includes a microphone and a plurality of control elements in the form of touch-sensitive touchpads and/or motion-sensitive elements that are used to operate the computer and to move an on-screen cursor. At least one touchpad is used to selectively switch between a command-mode of operation in which the computer interprets spoken words as commands for operating the computer and any software applications being used, and a text-mode of operation in which the computer interprets spoken words literally as text to be inserted into a software application. The apparatus is ergonomically designed to enable it to be easily worn and to enable a user to operate a computer from a standing, sitting or reclining position. The apparatus can be used to operate a computer for traditional computing purposes such as word processing or browsing the Internet, or for other purposes such as operating electronic devices such as a television and/or other household appliances. The apparatus eliminates the need for a keyboard and a mouse to operate a computer. In addition, the apparatus can be used as a telephone.08-19-2010
20100207880COMPUTER INPUT DEVICES AND ASSOCIATED COMPUTING DEVICES, SOFTWARE, AND METHODS - Computer input devices include a detector adapted to detect relative movement of an input member in x-, y-, and z-dimensions relative to a base point in a base plane, and a controller adapted to send a signal to an associated computing device based at least in part on the relative movement of the input member. Associated computing devices, software, and methods are also disclosed.08-19-2010
20130009872VIRTUAL INPUT SYSTEM - For a user having a user input actuator, a virtual interface device, such as for a gaming machine, for determining actuation of a virtual input by the input actuator is disclosed. The device comprises a position sensing device for determining a location of the user input actuator and a controller coupled to the position sensing device, the controller determining whether a portion of the user input actuator is within a virtual input location in space defining the virtual input.01-10-2013
20100090950Sensing System and Method for Obtaining Position of Pointer thereof - In a sensing system and a method for obtaining a position of a pointer, the sensing system includes a sensing area, a reflective mirror, an image sensor and a processing circuit. The reflective mirror is configured for generating a mirror image of a pointer when the pointer approaches the sensing area. The image sensor is configured for sensing the pointer and the mirror image thereof when the pointer approaches the sensing area. When the pointer approaches the sensing area, the processing circuit calculates a coordinate value of the pointer according to an image sensed by the image sensor and a predetermined size of the pointer. The pointer forms an imaginary orthographic projection in the sensing area, the processing circuit regards the imaginary orthographic projection as a round projection, and a radius of the round projection is the predetermined size.04-15-2010
20090322677LIGHT GUIDE PLATE FOR SYSTEM INPUTTING COORDINATE CONTACTLESSLY, A SYSTEM COMPRISING THE SAME AND A METHOD FOR INPUTTING COORDINATE CONTACTLESSLY USING THE SAME - Disclosed are a light guide plate for a non-contact type coordinate input system, a system including the same, and a non-contact type coordinate input method using the same. More particularly, the present invention relates to a light guide plate for a non-contact type coordinate input system, which eliminates inconvenience of a conventional contact-type coordinate input system inputting coordinates through direct contact, and which can reduce use of sensors and optical loss as much as possible. The present invention also relates to a system including the same, and a non-contact type coordinate input method using the same.12-31-2009
20080231596KEY SHAPED POINTING DEVICE - A key shaped pointing device adapted to be mounted on a keyboard of a notebook computer, a keypad of a mobile phone, or a keyboard of a PC includes a substrate including two opposite top latches; a circuit board releasably secured to the substrate; a chip fixedly mounted on the substrate and including a light source, a photosensor, and a processor; and a casing including a transparent top window and opposite openings releasably secured to the latches. The finger as a reflection member is adapted to contact the window for creating a first optical path from the light source to the photosensor via the finger. A movement of the finger is adapted to create a second optical path. The processor is adapted to calculate a direction and a distance corresponding to the movement by comparing the second optical path with the first optical path for generating a cursor control output.09-25-2008
20130169532System and Method of Moving a Cursor Based on Changes in Pupil Position - Moving a cursor based on changes in pupil position. At least some of the illustrative embodiments are methods including: creating an analog video signal of an eye of a computer user, the analog video signal comprising interlaced video with two fields per frame; calculating a first location of a pupil within at least one field of a frame; calculating a frame location of the pupil based on location of the pupil in the at least one field; and moving a cursor on a display device of the computer system, the moving responsive to a change in the frame location of the pupil with respect to a previous frame location, and the moving in real time with movement of the pupil.07-04-2013
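The cursor update the entry above describes — moving in real time with frame-to-frame changes in pupil location — reduces to a scaled delta per frame. A minimal sketch; the gain value and tuple layout are assumptions, not the patent's parameters:

```python
def move_cursor(cursor, pupil, prev_pupil, gain=8.0):
    """Move the on-screen cursor by the change in pupil position between
    the current and previous frame, scaled by an assumed gain."""
    return (cursor[0] + gain * (pupil[0] - prev_pupil[0]),
            cursor[1] + gain * (pupil[1] - prev_pupil[1]))
```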
20130169533System and Method of Cursor Position Control Based on the Vestibulo-Ocular Reflex - Cursor position control based on the vestibulo-ocular reflex. At least some of the illustrative embodiments are methods including: creating a first video stream, the first video stream depicting an eye of user of a computer system, wherein a pupil of the eye changes position relative to a face of the user during use of the computer system by the user; tracking pupil position relative to the face of the user, the tracking by way of the first video stream; moving a cursor position on the display device, the moving responsive to changes in pupil position relative to the face of the user, and the moving in real time with pupil position changes; and adjusting cursor position based on the vestibulo-ocular reflex.07-04-2013
20130169535MORE USEFUL MAN MACHINE INTERFACES AND APPLICATIONS - A method for enhancing a well-being of a small child or baby utilizes at least one TV camera positioned to observe one or more points on the child or an object associated with the child. Signals from the TV camera are outputted to a computer, which analyzes the output signals to determine a position or movement of the child or child associated object. The determined position or movement is then compared to preprogrammed criteria in the computer to determine a correlation or importance, and thereby to provide data to the child.07-04-2013
20130169537IMAGE PROCESSING APPARATUS AND METHOD, AND PROGRAM THEREFOR - An image processing apparatus includes an extracting unit for extracting a feature point from a captured image; a recognizing unit for recognizing a position of the feature point; a display unit for displaying, based on the position of the feature point, a feature-point pointer indicating the feature point and a mirrored image of the captured image in a translucent manner; and an issuing unit for issuing, based on the position of the feature point, a command corresponding to the position of the feature point or a motion of the feature point.07-04-2013
20080225001Method For Controlling an Interface Using a Camera Equipping a Communication Terminal - The invention concerns a method for controlling a graphic, audio and/or video interface using a camera equipping a communication terminal which consists in acquiring and/or storing a first image, acquiring and storing a new image, computing the apparent movement by matching both images, interpreting, in accordance with a predetermined control mode, the apparent movement, into user commands, storing in a memory of said terminal the user commands, modifying the display or sound of the terminal according to the user commands and optionally inputting a command validating an element or a graphic zone, or menu opening or triggering or scrolling an audio or video file, or triggering a sound superimposition above a sound track, or executing a task or application by the user on the communication terminal and optionally transmitting same to a second terminal.09-18-2008
20080225000Cancellation of Environmental Motion In Handheld Devices - A method, system and computer program product for compensating the environmental motion in handheld devices. A sensor unit is affixed to an object in the environment to detect and measure environmental motion. Upon measuring any detected environmental motion, the sensor unit transmits a value corresponding to the measured environmental motion to one or more handheld devices. Alternatively, the sensor unit may transmit the value corresponding to the measured environmental motion to a unit configured to retransmit the value to one or more handheld devices. Upon receiving the value corresponding to the measured environmental motion, the handheld device cancels this environmental motion from the motion it measured thereby taking into consideration only the motion inputted by the user of the handheld device.09-18-2008
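The cancellation step in the entry above — removing the environment reference sensor's reading from the handheld's own measurement — is a per-axis subtraction. A minimal sketch under that reading of the abstract:

```python
def cancel_environment(device_motion, environment_motion):
    """Subtract the environment sensor's reading (e.g. a moving vehicle)
    from the handheld's reading, leaving only user-induced motion."""
    return tuple(d - e for d, e in zip(device_motion, environment_motion))
```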
201102609683D POINTING DEVICE AND METHOD FOR COMPENSATING ROTATIONS OF THE 3D POINTING DEVICE THEREOF - A 3D pointing device utilizing an orientation sensor, capable of accurately transforming rotations and movements of the 3D pointing device into a movement pattern in the display plane of a display device is provided. The 3D pointing device includes the orientation sensor, a rotation sensor, and a computing processor. The orientation sensor generates an orientation output associated with the orientation of the 3D pointing device associated with three coordinate axes of a global reference frame associated with the Earth. The rotation sensor generates a rotation output associated with the rotation of the 3D pointing device associated with three coordinate axes of a spatial reference frame associated with the 3D pointing device itself. The computing processor uses the orientation output and the rotation output to generate a transformed output associated with a fixed reference frame associated with the display device above. The transformed output represents a segment of the movement pattern.10-27-2011
20110273369ADJUSTMENT OF IMAGING PROPERTY IN VIEW-DEPENDENT RENDERING - An image is displayed by determining a relative position and orientation of a display in relation to a viewer's head, and rendering an image based on the relative position and orientation. The viewer's eye movement relative to the rendered image is tracked, with at least one area of interest in the image to the viewer being determined based on the viewer's eye movement, and an imaging property of the at least one area of interest is adjusted.11-10-2011
20130169536CONTROL OF A WEARABLE DEVICE - A wearable device including a camera and a processor and a control interface between the wearable device and a user of the wearable device. An image frame is captured from the camera. Within the image frame, an image of a finger of the user is recognized. The recognition of the finger by the wearable device controls the wearable device.07-04-2013
20130169531System and Method of Determining Pupil Center Position - Determining pupil center position. At least some illustrative embodiments are methods including: creating a video signal of an eye, the video signal comprising a stream of frames; and finding an indication of pupil position. The finding may include: calculating a set of feature points within a first frame of the video signal; dividing, by the computer system, the first frame of the video signal into a plurality of sections; selecting a plurality of feature points from the first frame, at least one feature point selected from each section; and determining an ellipse from the plurality of feature points. The method may further include moving a cursor on a display device responsive to change in location of a feature of the ellipse with respect to a previous feature of an ellipse from a previous frame.07-04-2013
20130176220TOUCH FREE OPERATION OF ABLATOR WORKSTATION BY USE OF DEPTH SENSORS - An inventive system and method for touch free operation of an ablation workstation is presented. The system can comprise a depth sensor for detecting a movement, motion software to receive the detected movement from the depth sensor, deduce a gesture based on the detected movement, and filter the gesture to accept an applicable gesture, and client software to receive the applicable gesture at a client computer in an ablation workstation for performing a task in accordance with client logic based on the applicable gesture. The system can also comprise hardware for making the detected movement an applicable gesture. The system can also comprise voice recognition providing voice input for enabling the client to perform the task based on the voice input in conjunction with the applicable gesture. The applicable gesture can be a movement authorized using facial recognition.07-11-2013
20130176223DISPLAY APPARATUS, USER INPUT APPARATUS, AND CONTROL METHODS THEREOF - A display apparatus, a user input apparatus, and control methods thereof are provided. The user input apparatus is connected to a display apparatus having a plurality of display modes and includes: a touch pad unit which receives a user command relating to operation of a user interface (UI) screen corresponding to a first display mode; a button unit which receives a user command relating to operation of a UI screen corresponding to a second display mode; and a communicator which transmits a mode change command, which is entered via either the touch pad unit or the button unit, to the display apparatus. The mode change command is a user command which is generated in conjunction with an execution of an operation method corresponding to a display mode which is different from a current display mode.07-11-2013
20130176221SENSING DEVICE HAVING CURSOR AND HYPERLINKING MODES - An optical sensing device for controlling a computer system is disclosed. The device has a nib for receiving a nib force upon the nib being pressed against a substrate and a nib switch coupled to the nib. An optical sensor images optically coded data printed on the substrate. A processor effects a mode change between a cursor control mode and a hyperlinking mode upon the nib force actuating the nib switch, generates cursor control data when the optical sensing device is in the cursor control mode, and generates interaction data when the optical sensing device is in the hyperlinking mode. The interaction data indicates a coordinate position of the optical sensing device relative to the substrate. The cursor control data or the interaction data is then communicated to the computer system, where the cursor control data initiates a cursor control response and the interaction data initiates a hyperlinking response.07-11-2013
20130176222OPERATIONAL DISPLAY DEVICE AND METHOD OF CONTROLLING THE SAME, AND RECORDING MEDIUM - An operational display device includes a display portion for displaying an image based on image data including a specific portion, a detection portion for detecting an orientation in which the display portion is held, and a display control unit for controlling a manner of display on the display portion. The display control unit causes an image of the specific portion to be displayed as being zoomed-in and rotated in accordance with an orientation of holding of the display portion when the orientation in which the display portion is held is changed from a first orientation to a second orientation.07-11-2013
20130135205Display Method And Terminal Device - The embodiments of the present disclosure provide a display method and a terminal device. The method includes detecting the state of the terminal device, producing a first display command when it is detected that the terminal device is in a first state, and displaying first content on the display unit according to the first display command; and producing a second display command when it is detected that the terminal device is in a second state, and displaying second content on the display unit according to the second display command; wherein the objects included in the first content and the objects included in the second content are not exactly the same. The embodiments of the present disclosure produce different display commands according to the detected states of the terminal device and display contents that are not exactly the same through the display unit, so that the terminal device displays different contents in different states, which improves the user's operation and experience.05-30-2013
20130169534COMPUTER INPUT DEVICE - A computer input device is disclosed which comprises a keyboard having a plurality of keys for entering commands and characters into the computer, the keyboard having a designated surface area overlaying the plurality of keys, at least one of the plurality of keys being located outside of the designated surface area, a touch sensor for detecting one or more touches by one or more objects on the designated surface area of the keyboard, and an input processor configured to switch the keyboard into a mouse mode when the touch sensor has detected the designated surface area being touched by a single object, and to switch the keyboard into a keyboard mode when the touch sensor has detected the designated surface area being touched by two or more objects.07-04-2013
20130135203INPUT GESTURES USING DEVICE MOVEMENT - A handheld electronic device has a cursor which is moved by tilting and/or accelerating the device, where the cursor movement correlates to a bubble in a bull's eye level. Gestures include flicking, shaking, and reversing an acceleration or tilting, to control movement of the cursor, and to execute instructions corresponding to a position of the cursor. These gestures may be combined with touch, speech, buttons, or other known methods of communication between users and devices.05-30-2013
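The bubble-in-a-bull's-eye behavior described above can be sketched as tilt-proportional cursor velocity, integrated each frame and clamped to the screen. This is an illustrative reconstruction; the gain, axis mapping, and screen bounds are assumptions, not values from the patent.

```python
def step_cursor(pos, tilt, dt, gain=200.0, bounds=(800, 600)):
    """Advance the cursor like a bubble in a bull's-eye level.

    pos:   (x, y) cursor position in pixels
    tilt:  (pitch, roll) in radians; zero tilt keeps the cursor still
    dt:    time step in seconds
    """
    x = pos[0] + gain * tilt[1] * dt   # roll drifts the cursor horizontally
    y = pos[1] - gain * tilt[0] * dt   # pitch drifts it vertically
    x = min(max(x, 0.0), bounds[0])    # keep the "bubble" inside the level
    y = min(max(y, 0.0), bounds[1])
    return (x, y)
```

A flick or shake gesture would then be detected separately from short, high-magnitude acceleration events rather than from this steady tilt integration.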
20130093674Hybrid Pointing System and Method - A handheld controller which includes at least two disparate sensors, such as a motion sensor and a touchpad sensor. A processor deployed in either the handheld controller or a separate product implements a hybrid pointing and selection method that uses data from the first sensor to adjust the sensitivity to stimulus of the second sensor, and vice versa. The respective sensor data are thus tempered and combined to generate a cursor control signal that includes a large-scale control component to control the size and movement of a rough pointer region, and a fine-scale control component to control the position of a precise pointer within the rough pointer region.04-18-2013
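A hedged sketch of the coarse/fine split described above: motion-sensor deltas move a rough region around the screen while touchpad deltas position a precise pointer inside it, with the motion gain damped while the touchpad is active as a simple stand-in for the mutual sensitivity adjustment. The class name, gains, and region size are invented for illustration.

```python
class HybridPointer:
    def __init__(self, width, height, region=120):
        self.w, self.h, self.region = width, height, region
        self.rx, self.ry = width / 2, height / 2   # rough region center
        self.fx, self.fy = 0.0, 0.0                # fine offset inside region

    def motion_input(self, dx, dy, touch_active=False):
        gain = 0.3 if touch_active else 1.0        # damp while touchpad in use
        self.rx = min(max(self.rx + gain * dx, 0), self.w)
        self.ry = min(max(self.ry + gain * dy, 0), self.h)

    def touch_input(self, dx, dy):
        half = self.region / 2                     # fine pointer stays in region
        self.fx = min(max(self.fx + dx, -half), half)
        self.fy = min(max(self.fy + dy, -half), half)

    def pointer(self):
        return (self.rx + self.fx, self.ry + self.fy)
```

Symmetrically, one could scale the touch gain down while large motion-sensor deltas are arriving; the abstract's "vice versa" suggests the adjustment runs in both directions.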
20130093675Remote controllable image display system, controller, and processing method therefor - The present invention discloses a remote controllable image display system, and a controller and a motion detection method for use in the system. The system includes: an image display showing images generated by a program; a light source generating at least one light beam; a controller controlling a current image according to its displacement or rotation and including at least one image sensor sensing the light beam to obtain a first frame having at least two light spots; and a processor obtaining a first angle between a main operation surface of the controller and a basis plane according to the differences between the coordinates of the two light spots in the first frame.04-18-2013
201300936763D Pointing Devices and Methods - Systems and methods according to the present invention address these needs and others by providing a handheld device, e.g., a 3D device, which uses at least one sensor to detect motion of the handheld device. The detected motion can then be mapped into a desired output, e.g., cursor movement.04-18-2013
20130100020ELECTRONIC DEVICES WITH CAMERA-BASED USER INTERFACES - Electronic devices may include touch-free user input components that include camera modules having overlapping fields-of-view. The overlapping fields-of-view may form a gesture tracking volume in which multi-dimensional user gestures can be tracked using images captured with the camera modules. A camera module may include an image sensor having an array of image pixels and a diffractive element that redirects light onto the array of image pixels. The diffractive element may re-orient the field-of-view of each camera module so that an outer edge of the field-of-view runs along an outer surface of a display for the device. The device may include processing circuitry that operates the device using user input data based on the user gestures in the gesture tracking volume. The processing circuitry may operate the display based on the user gestures by displaying regional markers having a size and a location that depend on the user gestures.04-25-2013
20130120256ELECTRONIC APPARATUS, CONTROL PROGRAM, AND CONTROL METHOD - A control unit changes a display orientation of a display screen according to a tilt direction of a display unit when a display orientation detection unit detects that the display orientation of the display screen coincides with, among orientations in which the display screen is allowed to be displayed, an orientation closest to a vertical downward direction in a state where the detected display orientation of the display screen is not changed according to the tilt direction of the display unit. The display orientation detection unit detects the display orientation of the display screen displayed on the display unit. A tilt detection unit detects the tilt direction of the display unit, which displays information, in relation to the vertical downward direction.05-16-2013
20130120257MOVEMENT SENSING DEVICE USING PROXIMITY SENSOR AND METHOD OF SENSING MOVEMENT - There is disclosed a movement sensing device configured to detect movement of an object on a touch region, comprising: three or more proximity sensors arranged independently and two-dimensionally on one surface adjacent to the touch region, to measure electrical scalars corresponding to their respective distances to the object on the touch region; and a control unit configured to calculate a vector of a second touch point relatively changed with respect to a first touch point, based on a first electrical scalar and a second electrical scalar measured with a predetermined time difference, wherein a relative moving signal with respect to a reference point is generated by calculating the movement of the object as the vector.05-16-2013
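One way to realize the vector calculation above, sketched here as an assumption rather than the patented method: with three proximity sensors at known 2D positions each reporting a distance-like scalar, subtracting the circle equations pairwise gives a 2×2 linear system for the touch point, and two such points measured a time apart give the motion vector.

```python
def locate(sensors, dists):
    """Trilaterate a 2D touch point from three sensor distances."""
    (x1, y1), (x2, y2), (x3, y3) = sensors
    r1, r2, r3 = dists
    # Subtracting circle equations pairwise linearizes the problem.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

def motion_vector(sensors, dists_t0, dists_t1):
    """Vector from the first touch point to the second, per the abstract."""
    x0, y0 = locate(sensors, dists_t0)
    x1, y1 = locate(sensors, dists_t1)
    return (x1 - x0, y1 - y0)
```

In practice the raw electrical scalars would first be converted to distances through a per-sensor calibration curve; that conversion is omitted here.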
20130127717Projector - A projector capable of detecting the position of a detection object while suppressing complication of the structure is provided. This projector is so configured that the optical axes of a laser beam of visible light emitted from a first laser beam generation portion and a laser beam of invisible light emitted from a second laser beam generation portion substantially coincide with each other.05-23-2013
20130127712GESTURE AND VOICE RECOGNITION FOR CONTROL OF A DEVICE - A user interface allows one or more gestures to be devised by a user and mapped or associated with one or more commands or operations of a TV or other device. The user can select the command/operation that is to be associated with each gesture that he/she devised or created. The user is not limited to the use of pre-set gestures that were previously programmed into a system and is not limited to using pre-set commands/operations that were previously associated with pre-set gestures. In alternative embodiments, voice commands or other audible signals are devised by a user and are mapped or associated with commands/operations of a device.05-23-2013
20130127711TOUCH TRACKING OPTICAL INPUT DEVICE - A trackpad has a cover abutting a housing. The cover includes a body that is transparent to infrared light and visible light. A first ink, deposited in a first area on a surface of the cover body, is transmissive to the infrared light and substantially opaque to visible light. A second ink is deposited in at least one second area on the surface of the cover body and is transmissive to visible light. A first emitter, within the housing, produces infrared light that is transmitted through the cover. A second emitter, within the housing, produces visible light that is transmitted through each second area of the cover. An optical sensor is provided within the housing for receiving infrared light that is reflected by an external object back through the cover.05-23-2013
20130135204Unlocking a Screen Using Eye Tracking Information - Methods and systems for unlocking a screen using eye tracking information are described. A computing system may include a display screen. The computing system may be in a locked mode of operation after a period of inactivity by a user. Locked mode of operation may include a locked screen and reduced functionality of the computing system. The user may attempt to unlock the screen. The computing system may generate a display of a moving object on the display screen of the computing system. An eye tracking system may be coupled to the computing system. The eye tracking system may track eye movement of the user. The computing system may determine that a path associated with the eye movement of the user substantially matches a path associated with the moving object on the display and switch to be in an unlocked mode of operation including unlocking the screen.05-30-2013
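The "path substantially matches" test above can be illustrated as follows: both the recorded gaze path and the moving object's path are resampled to the same length, and the screen unlocks when the mean point-to-point distance falls under a threshold. The resampling scheme and tolerance are assumptions made for this sketch, not details from the patent.

```python
import math

def resample(path, n):
    """Pick n evenly spaced samples (by index) from a recorded path."""
    step = (len(path) - 1) / (n - 1)
    return [path[round(i * step)] for i in range(n)]

def paths_match(gaze, target, n=20, tol=30.0):
    """True when the mean pointwise distance between paths is under tol pixels."""
    g, t = resample(gaze, n), resample(target, n)
    err = sum(math.hypot(gx - tx, gy - ty)
              for (gx, gy), (tx, ty) in zip(g, t)) / n
    return err < tol
```

A production system would also align the two paths in time (e.g., by start-of-motion detection) before comparing them; index-based resampling assumes both recordings cover the same interval.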
20130141331METHOD FOR PERFORMING WIRELESS DISPLAY CONTROL, AND ASSOCIATED APPARATUS AND ASSOCIATED COMPUTER PROGRAM PRODUCT - A method and apparatus for performing wireless display control and an associated computer program product are provided, where the method is applied to an electronic device. The method includes: detecting whether a wireless display control agent device corresponding to the electronic device exists, wherein the wireless display control agent device is utilized as an agent for the electronic device to perform wireless display control on a display device electrically connected to the wireless display control agent device; and when it is detected that the wireless display control agent device exists, providing a user with a user interface, allowing the user to utilize a specific operating gesture to start an automatic wireless configuration of the electronic device without performing any manual wireless configuration of the electronic device, wherein based upon the automatic wireless configuration, a wireless connection between the electronic device and the wireless display control agent device is automatically established.06-06-2013
20130100019ENHANCED PROJECTED IMAGE INTERFACE - An interactive display projection system, includes a pointing device which determines a location on the projected display indicated by the pointing device using a combination of a location signal in the display captured by the pointing device and optical mouse circuitry to determine motion of the pointing device when the pointing device is close to the projected display. In another embodiment, the pointing device also includes an inertial sensor and associated circuitry which detects linear accelerations and rotational rates to determine motion and orientation of the pointing device, which are also used to determine the location on the projected display indicated by the pointing device.04-25-2013
20080198131Temperature Feedback PC Pointing peripheral - A new pointing device implementation method uses temperature sensors mounted on or in a personal computing device's pointing device or a game console's game controller to collect and send temperature information about the surface of the peripheral pointing device to the connected personal computing device or game console for use by the software running on the device. The software running on the personal computing device or game console connected to the peripheral pointing device or game controller also sends data instructions specifying the desired temperature for the surface of the pointing peripheral. In response to the instructions sent by the game console or personal computing device, the device heats up or cools down to reflect the context of the video game or software running on the screen or display.08-21-2008
20110241990PROJECTION APPARATUS AND LOCATION METHOD FOR DETERMINING A POSITION OF A LIGHT POINT ON A PROJECTION IMAGE - A projection apparatus and a location method for determining a position of a light point on a projection image of the projection apparatus are provided. The projection apparatus comprises a lens, a light detector, a light guide module and a processing circuit. The light guide module is configured to receive the light point via the lens, and to guide the light point into the light detector. The light detector outputs a detection signal to the processing circuit according to the light point within a detection period. The processing circuit determines the position of the light point on the projection image according to the detection signal.10-06-2011
20110241986INTERACTIVE PROJECTION DEVICE - The subject invention relates to an interactive projection device. The interactive projection device includes a light source configured to emit an input light beam, wherein the light source comprises a visible light emitting device; a first beam splitter configured to split the input light beam into first and second split light beams; a second beam splitter configured to split a scattered light beam received from a surface into third and fourth split light beams; an image forming device configured to produce an image light beam based on the first split light beam and emit the image light beam onto the surface through the first and second splitters, thereby generating a projection image on the surface; and a detector configured to detect the invisible light of the third split light beam, thereby acquiring a scattering image from the surface.10-06-2011
20130147709APPARATUS AND METHOD FOR DETECTING TAP - Provided is an apparatus and method for detecting a tap. The apparatus includes a sensor configured to detect a motion and output a signal corresponding to the motion, a gradient calculating unit connected to the sensor to calculate a gradient of the output signal from the sensor, a similarity determining unit connected to the gradient calculating unit to determine a similarity between a rising gradient and a falling gradient of a curve of the output signal, a tap determining unit connected to the similarity determining unit to determine detection of a tap according to the determination result of the similarity determining unit, and an output unit configured to output the determination result of the tap determining unit.06-13-2013
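The rising/falling gradient comparison in the abstract above can be sketched like this: a tap shows up as a sharp spike whose rising and falling slopes have similar magnitude, while slower motions produce shallow or asymmetric slopes and are rejected. The thresholds and the specific similarity measure are assumptions for illustration.

```python
def detect_tap(signal, min_slope=2.0, symmetry=0.5):
    """Decide whether a sampled sensor signal contains a tap-like spike."""
    grads = [b - a for a, b in zip(signal, signal[1:])]
    peak = max(range(len(signal)), key=lambda i: signal[i])
    if peak == 0 or peak == len(signal) - 1:
        return False                  # spike must rise and fall within the window
    rise = max(grads[:peak])          # steepest climb before the peak
    fall = -min(grads[peak:])         # steepest drop after the peak
    if rise < min_slope or fall < min_slope:
        return False                  # too shallow to be a tap
    # Similarity between rising and falling gradients, as in the abstract.
    return min(rise, fall) / max(rise, fall) >= symmetry
```

Here `min_slope` filters out slow motion and `symmetry` encodes the similarity determination between the two gradients.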
20130147710DISPLACEMENT DETECTING APPARATUS AND DISPLACEMENT DETECTING METHOD - A displacement detecting apparatus, comprising: a first detecting module, for detecting displacement for an object on a detecting surface of the displacement detecting apparatus to generate first location information, and for generating a first control signal according to the first location information; a second detecting module, for detecting a target, and for detecting second location information for the displacement detecting apparatus relative to the target, and for generating a second control signal according to the second location information; and a switch apparatus, for selectively outputting at least one of the first control signal and the second control signal.06-13-2013
20130147711CAMERA-BASED MULTI-TOUCH INTERACTION APPARATUS, SYSTEM AND METHOD - An apparatus, system and method control and interact within an interaction volume within a height over the coordinate plane of a computer, such as a computer screen, interactive whiteboard, horizontal interaction surface, video/web-conference system, document camera, rear-projection screen, digital signage surface, television screen or gaming device, to provide pointing, hovering, selecting, tapping, gesturing, scaling, drawing, writing and erasing, using one or more interacting objects, for example, fingers, hands, feet, and other objects, for example, pens, brushes, wipers and even more specialized tools. The apparatus and method can be used together with, or even be integrated into, data projectors of all types and their fixtures/stands, and can be used together with flat screens to render display systems interactive. The apparatus has a single camera covering the interaction volume from either a very short distance or from a larger distance, to determine the lateral positions and to capture the pose of the interacting object(s).06-13-2013
20130147712Information Processing Device And Control Method Thereof - An information processing device and a control method applied to the information processing device are described. The information processing device includes a display unit configured to display images; an input unit configured to receive inputs from a user; a motion detecting unit configured to detect motion of the information processing device and to generate data related to the motion; and a processing unit connected to the display unit, the input unit and the motion detecting unit. The processing unit is configured to receive the motion-related data from the motion detecting unit, and to enable or disable the display unit and/or the input unit based on the data related to the motion.06-13-2013
20100295784DUAL-PEN: MASTER-SLAVE - There is disclosed an interactive display system comprising an interactive surface for displaying an image and for receiving inputs from remote devices, the system being adapted to detect the presence of at least two remote devices proximate the interactive surface.11-25-2010
20100309124METHOD OF CALIBRATING POSITION OFFSET OF CURSOR - The present invention provides a method of calibrating a position offset of a cursor on a screen such that, when a pointing device has already been moved to a position beyond the screen boundary, virtual coordinates of the pointing device are calculated and recorded to track the physical positions of the pointing device efficiently, and then the position offset between the pointing device and the cursor on the screen is compensated and corrected, thereby greatly reducing the hassle of manually operating the pointing device to control cursor movement and allowing the user to operate the cursor on the screen at will.12-09-2010
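The virtual-coordinate idea above can be sketched minimally: relative motion deltas keep updating a virtual pointer position even past the screen edge, and the on-screen cursor is simply the clamped view of it, so no offset accumulates while the pointer is outside the boundary. The class and its defaults are invented for this sketch.

```python
class CalibratedCursor:
    def __init__(self, width, height):
        self.w, self.h = width, height
        self.vx, self.vy = width / 2, height / 2   # virtual coordinates

    def move(self, dx, dy):
        self.vx += dx                  # virtual position may leave the screen
        self.vy += dy
        cx = min(max(self.vx, 0), self.w)   # cursor itself stays on screen
        cy = min(max(self.vy, 0), self.h)
        return cx, cy
```

Contrast with a naive clamp-only cursor: there, reversing direction at the edge moves the cursor immediately, building up exactly the offset the patent aims to eliminate.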
20120274562Method, Apparatus and Computer Program Product for Displaying Media Content - In accordance with an example embodiment a method and apparatus is provided. The method comprises receiving at least one face as an input. A presence of the at least one face in a media content is determined and a modified display of the media content is generated if the at least one face is determined to be present in the media content.11-01-2012
20120274561OPERATION CONTROL SYSTEM USING INFRARED RAYS AND METHOD THEREOF - An operation control method is provided. The method is applied on an operation control system. The system includes an operation device and an electronic device. The operation device includes an infrared emitter to emit infrared rays. The electronic device includes a display unit and infrared receivers. The infrared ray creates a light/heat spot on the display unit. Infrared receivers receive the infrared ray emitted by the infrared emitter. The infrared receiver receiving one infrared ray generates a corresponding signal. The method includes receiving the signal generated by the infrared receiver, determining which infrared receivers generated the signal to determine the light spot on the display unit, and controlling the electronic device to execute the corresponding function according to signals created by the light spot or spots striking the display unit.11-01-2012
20120274560POINTING DEVICE - An improved air pointing device capable of compensating for the roll angle imparted to the device by its user. The device of the invention includes at least two gyrometers and two accelerometers, each of the latter being used for a different range of roll angles. The correction of the roll angle is effected by using the measurements of the first accelerometer, the output of the second accelerometer being simulated as an approximated function of the measurements of the first accelerometer, using a polynomial approximation of the relation between the outputs of the two accelerometers, the order of the polynomial being chosen as a function of the computing power which is available and the precision which is needed. In an embodiment of the invention, the first and second accelerometers are advantageously swapped at a value of the roll angle which is substantially equal to 45°.11-01-2012
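A hedged sketch of the roll handling just described: the roll angle is estimated from gravity-normalized accelerometer readings, and the gyro-derived cursor deltas are rotated back by that angle. The exact relation a2 = sqrt(1 − a1²) stands in here for the patent's polynomial approximation of the second accelerometer's output; all names are illustrative.

```python
import math

def roll_angle(a1, a2=None):
    """Estimate roll from gravity-normalized accelerometer readings."""
    if a2 is None:
        # Second accelerometer simulated from the first. The patent replaces
        # this exact relation with a polynomial whose order trades precision
        # for computing power.
        a2 = math.sqrt(max(0.0, 1.0 - a1 * a1))
    return math.atan2(a1, a2)

def derolled_deltas(dx, dy, roll):
    """Rotate pointer deltas by -roll so the screen axes stay upright."""
    c, s = math.cos(roll), math.sin(roll)
    return (c * dx + s * dy, -s * dx + c * dy)
```

The 45° swap in the embodiment amounts to choosing, at each instant, whichever accelerometer axis is currently the more sensitive one as the measured input and simulating the other.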
20120274559Diffusing light of a laser - Embodiments disclosed herein relate to diffusing light of a multi-mode laser. In one embodiment, the multi-mode laser projects a plurality of modes of light and a diffuser reflects the plurality of modes of light to output a single lobe of light.11-01-2012
20110234492Gesture processing - Presented is a method and system for processing a gesture performed by a user of an input device. The method comprises detecting the gesture and determining a distance of the input device from a predetermined location. A user command is then determined based on the detected gesture and the determined distance.09-29-2011
20100315338DUPLICATE OBJECTS - There is disclosed an interactive display system comprising an interactive surface for displaying an image and for receiving inputs from remote devices, the system being adapted to detect the presence of at least two remote devices proximate the interactive surface.12-16-2010
20100315336Pointing Device Using Proximity Sensing - A pointing device using proximity sensing is described. In an embodiment, a pointing device comprises a movement sensor and a proximity sensor. The movement sensor generates a first data sequence relating to sensed movement of the pointing device relative to a surface. The proximity sensor generates a second data sequence relating to sensed movement relative to the pointing device of one or more objects in proximity to the pointing device. In embodiments, data from the movement sensor of the pointing device is read and the movement of the pointing device relative to the surface is determined. Data from the proximity sensor is also read, and a sequence of sensor images of one or more objects in proximity to the pointing device are generated. The sensor images are analyzed to determine the movement of the one or more objects relative to the pointing device.12-16-2010
20100315335Pointing Device with Independently Movable Portions - A pointing device with independently movable portions is described. In an embodiment, a pointing device comprises a base unit and a satellite portion. The base unit is arranged to be located under a palm of a user's hand and be movable over a supporting surface. The satellite portion is arranged to be located under a digit of the user's hand and be independently movable over the supporting surface relative to the base unit. In embodiments, data from at least one sensing device is read, and movement of both the base unit and the independently movable satellite portion of the pointing device is calculated from the data. The movement of the base unit and the satellite portion is analyzed to detect a user gesture.12-16-2010
20130154930GESTURE CONTROLLED AUDIO USER INTERFACE - A user interface, methods and article of manufacture each for selecting an audio cue presented in three-dimensional (3D) space are disclosed. The audio cues are audibly perceivable in a space about a user, where each of the audio cues may be perceived by the user as a directional sound at a distinct location from other audio cues in the space. Selection of a specific audio cue is made based on one or more user gestures. A portable electronic device may be configured to present the audio cues perceived by a user and detect certain user gestures to select audio cues. The audio cue selection can be used to control operation of the portable device and/or other associated devices.06-20-2013
20120280910CONTROL SYSTEM AND METHOD FOR CONTROLLING A PLURALITY OF COMPUTER DEVICES - The invention provides a control system for controlling a plurality of computer devices, the computer devices each having at least a processing unit, the control system comprising a pointer device, a tracking system connected to a processing unit of each of the plurality of computer devices, the tracking system comprising at least a tracking unit arranged to determine an actual position and/or orientation of the pointer device, wherein the tracking system is arranged to determine a parameter set representative of at least one of position and orientation of the pointer device and to select one of a plurality of computer devices depending on said parameter set and to send a control signal from the tracking system to a processing unit of the selected computer device wherein the control signal is based on said parameter set.11-08-2012
20120280909IMAGE DISPLAY SYSTEM AND IMAGE DISPLAY METHOD - An image display system includes an image projection unit that projects image information; a light pointer that designates a point in the image information by irradiating pointing light, wherein the irradiation of the pointing light can be turned on and turned off; a photographing unit that photographs an area on which the image information is projected, and that outputs photographed information; an instruction detection unit that detects an irradiation position of the pointing light in the image information based on the photographed information, and that detects whether the irradiation of the pointing light is turned on or turned off; and a control unit that sets additional image information at a timing in which the instruction detection unit detects that the irradiation of the pointing light is turned off, depending on a position at which the irradiation of the pointing light is turned off.11-08-2012
20130181900NON-CONTACT SELECTION DEVICE - A non-contact selection device is disclosed. The non-contact selection device includes a light source emitting light to the outside; a camera unit generating and outputting a video signal corresponding to an external video; a video data generating unit generating video data corresponding to the video signal; and an identity unit detecting, frame by frame, the location of a detected area formed in the video data by the emitted light reflected back from a pointing means, recognizing a moving locus of the detected area by comparing at least two continuous frames, and generating and outputting corresponding change information. With the present invention, function selection can be performed more quickly and easily, making the most use of the device's elements.07-18-2013
20130181899REMOTE CONTROL FOR SENSING MOVEMENT, IMAGE DISPLAY APPARATUS FOR CONTROLLING POINTER BY THE REMOTE CONTROL, AND CONTROLLING METHOD THEREOF - A remote control is provided including a plurality of sensors which sense movement of the remote control, and a control unit which turns on at least one sensor of the plurality of sensors and thereby senses movement of the remote control, and determines whether to turn on or off the remaining sensors according to whether or not the at least one sensor senses movement of the remote control. Consequently, battery consumption is reduced.07-18-2013
20110291929COMPUTER READABLE STORAGE MEDIUM HAVING STORED THEREIN INFORMATION PROCESSING PROGRAM, INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING SYSTEM - Correspondence data representing correspondence between a plurality of selection target objects and the attitude of an input device is stored in an information processing apparatus. In accordance with attitude/motion data acquired from the input device, the attitude of the input device is calculated. In accordance with the correspondence data, a selection target object corresponding to the attitude of the input device is selected thereby to perform a process based on the selection target object having been selected.12-01-2011
20110291926Gesture recognition system using depth perceptive sensors - Acquired three-dimensional positional information is used to identify user-created gesture(s), which gesture(s) are classified to determine appropriate input(s) to an associated electronic device or devices. Preferably, at at least one instance of a time interval, the posture of a portion of a user is recognized, based on at least one factor such as shape, position, orientation, or velocity. Posture over each of the instance(s) is recognized as a combined gesture. Because the acquired information is three-dimensional, two gestures may occur simultaneously.12-01-2011
20130120255IMAGE PROCESSING APPARATUS AND METHOD - An image processing apparatus provided with a display unit for displaying an operation picture, showing control indicia (e.g., icons) corresponding to multiple image processing functions, is operated to display the operation picture on an external display device. The operator selects one of the indicia shown on the external display device by operating an input device, such as touch screen of the apparatus or a remote control, to indicate the position of the desired one of the indicia. The operator can thereby quickly and reliably select an image processing function to be executed, without any need to view an operation picture on the display unit when making the selection.05-16-2013
20110310014IMAGE PICKUP APPARATUS AND PROJECTION TYPE IMAGE DISPLAY APPARATUS - In an image pickup apparatus, a visible light cut filter allows infrared components to pass through, and blocks visible light components. A plurality of image pickup devices receive the light transmitted through the visible light cut filter such that a plurality of color components are received separately from each other. The visible light cut filter allows part of the visible light components to transmit such that the visible light component enters at least one of the plurality of image pickup devices.12-22-2011
20110310013INTERFACE APPARATUS AND METHOD FOR CONTACT-FREE SPACE INPUT/OUTPUT - A space input/output interface apparatus includes: a proximity sensor for sensing a movement of a user's wrist; an inertial sensor for sensing a movement of the user's arm; and a controller for generating user input/output interface recognition information corresponding to a sensing value of the proximity sensor or the inertial sensor. The apparatus is an armband-type space input/output interface apparatus which can be put on the user's wrist.12-22-2011
20130187853DISPLAY SYSTEM - An embodiment of the present invention provides a display system comprising: a light beam emitting device that emits a first light beam for marking an input position and a second light beam for confirming the input position, the first light beam being visible light and the second light beam differing from the first light beam; and a display device comprising a displaying area, photo-sensitive devices distributed within the displaying area, and processing devices coupled with the photo-sensitive devices. The photo-sensitive devices sense the second light beam projected upon the displaying area and produce a sensing result; the processing devices determine the projecting position of the second light beam upon the displaying area according to the sensing result, and perform a corresponding operation based on the sensing result. The present invention is thereby capable of performing remote touch operation on the display device.07-25-2013
20130187852THREE-DIMENSIONAL IMAGE PROCESSING APPARATUS, THREE-DIMENSIONAL IMAGE PROCESSING METHOD, AND PROGRAM - A three-dimensional image processing apparatus includes: an output unit configured to output a plurality of three-dimensional images to a display apparatus; a detection unit configured to detect a pointer in association with a three-dimensional image displayed on the display apparatus; an operation determination unit configured to determine a predetermined operation based on movement of the pointer detected by the detection unit; and an image processing unit configured to perform, on the three-dimensional image associated with the pointer, processing associated with the predetermined operation determined by the operation determination unit, and to cause the output unit to output the processed three-dimensional image.07-25-2013
20130187854Pointing Device Using Camera and Outputting Mark - A pointing device such as a mouse or joystick comprises a camera for capturing the display screen and image processing means for recognizing and tracking the pointing cursor icon or mark from the captured image and producing the pointing signal. The pointing device of the present invention can be used with any type of display without any additional tracking means such as an ultrasonic sensor, infrared sensor, or touch sensor. The pointing device of the present invention includes a mark outputting portion, a camera portion for capturing said mark outputting portion, and an image processing portion for recognizing said mark outputting portion from the captured image and producing the pointing signal.07-25-2013
20120019443TOUCH SYSTEM AND TOUCH SENSING METHOD - A touch system comprises a transparent panel, a first image sensing module, a second image sensing module and a processing circuit. The first image sensing module is at least partially disposed on a first flat surface of the transparent panel for obtaining an image above the first flat surface. The second image sensing module is at least partially disposed under a second flat surface of the transparent panel for obtaining an image of the first flat surface through the second flat surface. When two pointers approach the first flat surface, the processing circuit calculates possible coordinates of the pointers according to the image obtained by the first image sensing module and calculates coordinates of the pointers according to the image obtained by the second image sensing module, so as to compare all of the coordinates to obtain actual coordinates of the pointers from the possible coordinates.01-26-2012
20130194184METHOD AND APPARATUS FOR CONTROLLING MOBILE TERMINAL USING USER INTERACTION - A method and apparatus for controlling a mobile terminal through use of user interaction are provided. The method includes operating in a vision recognition mode that generates a vision recognition image through use of a signal output from a second plurality of pixels designated as vision pixels from among a plurality of pixels of an image sensor included in the mobile terminal; determining whether a predetermined object in the vision recognition image corresponds to a person; determining a gesture of the predetermined object when the predetermined object corresponds to the person; and performing a control function of the mobile terminal corresponding to the gesture of the predetermined object.08-01-2013
20130194183COMPUTER MOUSE PERIPHERAL - A computer pointing device including: a base portion with a lower surface adapted for sliding across a work surface, and a spine portion projecting substantially upward from said base portion and having a thumb-engaging surface on a first lateral side of the spine and at least one index-fingertip and/or middle-fingertip-engaging surface on a second lateral side of the spine opposing said first lateral side. A keyboard with an altered arrangement or function of keys, such as an enlarged, truncated, or absent spacebar, or capable of an altered appearance in accordance with keys being re-mapped to sensors on a pointing device. A keyboard with a virtual screen display, which may be made semi-transparent by activating a sensor on a pointing device. A computer with a recess capable of accommodating a mouse device. A locked scrolling or zooming means, using any pointing device, in which scrolling or zooming in a defined direction is proportional to the distance travelled by the device, irrespective of the direction of movement of the device.08-01-2013
20130194182GAME DEVICE, CONTROL METHOD FOR A GAME DEVICE, AND NON-TRANSITORY INFORMATION STORAGE MEDIUM - A position information acquiring unit acquires position information relating to positions of a plurality of body parts of a player. A determination unit determines whether or not at least one of the plurality of body parts exists within a determination region including a reference position at a time point corresponding to a reference time. A body part information acquiring unit acquires body part information relating to a kind of the at least one of the plurality of body parts determined to exist within the determination region. An evaluation unit evaluates gameplay of the player based on the kind of the at least one of the plurality of body parts acquired by the body part information acquiring unit.08-01-2013
20120044141INPUT SYSTEM, INPUT METHOD, COMPUTER PROGRAM, AND RECORDING MEDIUM - A position of a cursor 02-23-2012
20130201106METHOD FOR CONTROLLING ACTIONS BY USE OF A TOUCH SCREEN - A method for controlling a pointer having a position determined by the position of at least one end of a member on a touch screen. An offset is inserted between the position of the pointer and that of the end of the member for driving movements of the pointer such that the end does not have to cover an object displayed on the screen in order to effectively select the object.08-08-2013
20130201105METHOD FOR CONTROLLING INTERACTIVE DISPLAY SYSTEM - A method for controlling a multi-user interactive display system including a soft-copy display including at least an information display region and a command control region, and a digital image capture system positioned to capture a time sequence of images of users located in a field-of-view of the soft-copy display. A time sequence of images is analyzed to detect a plurality of users, and at least one of the users is designated to be a controlling user. The captured images are displayed in the command control region, wherein the detected users are demarked using graphical elements. The captured time sequence of images is analyzed to detect a gesture made by the controlling user and content displayed in the information display region is updated accordingly.08-08-2013
20130201104MULTI-USER INTERACTIVE DISPLAY SYSTEM - A multi-user interactive display system including a soft-copy display including at least an information display region and a command control region, and a digital image capture system positioned to capture a time sequence of images of users located in a field-of-view of the soft-copy display. A time sequence of images is analyzed to detect a plurality of users, and at least one of the users is designated to be a controlling user. The captured images are displayed in the command control region, wherein the detected users are demarked using graphical elements. The captured time sequence of images is analyzed to detect a gesture made by the controlling user and content displayed in the information display region is updated accordingly.08-08-2013
20130100017Notification Profile Configuration Based on Device Orientation - In one embodiment, a user places a mobile device (e.g., a smart phone) facing downward on a table. A process running on the mobile device determines an orientation of the mobile device (i.e., a facing downward orientation), and determines that the mobile device has been in the facing downward orientation for over a threshold period of time (e.g., 3 seconds); the process then automatically selects a “Quiet” notification profile and turns off the mobile device's display, without additional input from the user.04-25-2013
20120299826Human/Machine Interface for Using the Geometric Degrees of Freedom of the Vocal Tract as an Input Signal - A human/machine (HM) interface that enables a human operator to control a corresponding machine using the geometric degrees of freedom of the operator's vocal tract, for example, using the tongue as a virtual joystick. In one embodiment, the HM interface has an acoustic sensor configured to monitor, in real time, the geometry of the operator's vocal tract using acoustic reflectometry. A signal processor analyzes the reflected acoustic signals detected by the acoustic sensor, e.g., using signal-feature selection and quantification, and translates these signals into commands and/or instructions for the machine. Both continuous changes in the machine's operating parameters and discrete changes in the machine's operating configuration and/or state can advantageously be implemented.11-29-2012
20120086637SYSTEM AND METHOD UTILIZED FOR HUMAN AND MACHINE INTERFACE - The present invention discloses a system for human and machine interface. The system includes a 3-dimensional (3D) image capture device, for capturing a gesture of a motion object in a period of time; a hand-held inertial device (HHID), for transmitting a control signal; and a computing device. The computing device includes a system integration and GUI module, for compensating the control signal according to an image signal corresponding to the motion object, to generate a compensated control signal.04-12-2012
20120086636SENSOR RING AND INTERACTIVE SYSTEM HAVING SENSOR RING - This disclosure provides a sensor ring and an interactive system having the sensor ring. The interactive system includes the sensor ring, an RF receiver, an image-capture device, and a signal processor. The sensor ring is adopted for wear on fingers or toes, has a sensor module to produce a sensing signal, and has an RF transmitter for transmission of sensing signals. The image-capture device has a camera module used to produce a detection signal. The RF receiver receives the sensing signals of the ring. The signal processor processes the sensing signals of the rings and the detection signals of the image-capture device to produce interactive operation.04-12-2012

Patent applications in class Including orientation sensors (e.g., infrared, ultrasonic, remotely controlled)
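Several entries in this class describe holding an orientation reading for a threshold period before acting on it; for example, application 20130100017 switches to a “Quiet” profile after the device has stayed face-down for over a threshold time. A minimal sketch of that kind of debounced orientation check is below; it is illustrative only and not taken from any application's claims, and all names (`OrientationProfileSelector`, `update`, the string labels) are hypothetical:

```python
# Hypothetical sketch of threshold-based orientation debouncing, in the
# spirit of application 20130100017. Not an implementation of the patent.

class OrientationProfileSelector:
    def __init__(self, threshold_s=3.0):
        self.threshold_s = threshold_s   # how long the device must stay face-down
        self._face_down_since = None     # timestamp when face-down began, if any
        self.profile = "Normal"

    def update(self, orientation, now_s):
        """Feed one orientation sample ("face_down" or anything else) with a
        timestamp in seconds; returns the currently selected profile."""
        if orientation == "face_down":
            if self._face_down_since is None:
                self._face_down_since = now_s        # face-down period begins
            elif now_s - self._face_down_since >= self.threshold_s:
                self.profile = "Quiet"               # held long enough: switch
        else:
            self._face_down_since = None             # any other pose resets
            self.profile = "Normal"
        return self.profile
```

Feeding samples `("face_down", 0.0)` then `("face_down", 3.5)` leaves the selector in the "Quiet" profile, while any non-face-down sample resets it to "Normal"; the timestamp comparison rather than a sleep keeps the logic testable and sensor-agnostic.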